TY - JOUR A1 - Laskov, Pavel A1 - Gehl, Christian A1 - Krüger, Stefan A1 - Müller, Klaus-Robert T1 - Incremental support vector learning: analysis, implementation and applications JF - Journal of machine learning research N2 - Incremental Support Vector Machines (SVM) are instrumental in practical applications of online learning. This work focuses on the design and analysis of efficient incremental SVM learning, with the aim of providing a fast, numerically stable and robust implementation. A detailed analysis of convergence and of algorithmic complexity of incremental SVM learning is carried out. Based on this analysis, a new design of storage and numerical operations is proposed, which speeds up the training of an incremental SVM by a factor of 5 to 20. The performance of the new algorithm is demonstrated in two scenarios: learning with limited resources and active learning. Various applications of the algorithm, such as in drug discovery, online monitoring of industrial devices and and surveillance of network traffic, can be foreseen. KW - incremental SVM KW - online learning KW - drug discovery KW - intrusion detection Y1 - 2006 SN - 1532-4435 VL - 7 SP - 1909 EP - 1936 PB - MIT Press CY - Cambridge, Mass. ER - TY - JOUR A1 - Steuer, Ralf A1 - Humburg, Peter A1 - Selbig, Joachim T1 - Validation and functional annotation of expression-based clusters based on gene ontology JF - BMC bioinformatics N2 - Background: The biological interpretation of large-scale gene expression data is one of the paramount challenges in current bioinformatics. In particular, placing the results in the context of other available functional genomics data, such as existing bio-ontologies, has already provided substantial improvement for detecting and categorizing genes of interest. One common approach is to look for functional annotations that are significantly enriched within a group or cluster of genes, as compared to a reference group. Results: In this work, we suggest the information-theoretic concept of mutual information to investigate the relationship between groups of genes, as given by data-driven clustering, and their respective functional categories. Drawing upon related approaches (Gibbons and Roth, Genome Research 12: 1574-1581, 2002), we seek to quantify to what extent individual attributes are sufficient to characterize a given group or cluster of genes. Conclusion: We show that the mutual information provides a systematic framework to assess the relationship between groups or clusters of genes and their functional annotations in a quantitative way. Within this framework, the mutual information allows us to address and incorporate several important issues, such as the interdependence of functional annotations and combinatorial combinations of attributes. It thus supplements and extends the conventional search for overrepresented attributes within a group or cluster of genes. In particular taking combinations of attributes into account, the mutual information opens the way to uncover specific functional descriptions of a group of genes or clustering result. All datasets and functional annotations used in this study are publicly available. All scripts used in the analysis are provided as additional files. 
Y1 - 2006 U6 - https://doi.org/10.1186/1471-2105-7-380 SN - 1471-2105 VL - 7 IS - 380 PB - BioMed Central CY - London ER - TY - JOUR A1 - Linke, Thomas A1 - Tompits, Hans A1 - Woltran, Stefan T1 - On Acyclic and head-cycle free nested logic programs Y1 - 2004 SN - 3-540-22671-01 ER - TY - JOUR A1 - Linke, Thomas A1 - Tompits, Hans A1 - Woltran, Stefan T1 - On acyclic and head-cycle free nested logic programs Y1 - 2004 ER - TY - JOUR A1 - Hafer, Jörg A1 - Ludwig, Joachim A1 - Schumann, Marlen T1 - Fallstudien in medialen Räumen JF - Commentarii informaticae didacticae : (CID) N2 - Ziel dieses Beitrages ist es, das didaktische Konzept Fallstudien und seine lerntheoretisch-didaktische Begründung vorzustellen. Es wird die These begründet, dass mediale Räume für die Bearbeitung von Fallstudien lernunterstützend wirken und sich in besonderer Weise für Prozesse der Lernberatung und Lernbegleitung in der Hochschule eignen. Diese These wird entlang dem lerntheoretischen Konzept der Bedeutungsräume von Studierenden in Verbindung mit den Spezifika medialer Räume entfaltet. Für den daraus entstandenen E-Learning-Ansatz Online-Fallstudien kann hier lediglich ein Ausblick gegeben werden. Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64431 SN - 1868-0844 SN - 2191-1940 IS - 4 SP - 93 EP - 98 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Bordihn, Henning A1 - Fernau, Henning A1 - Holzer, Markus A1 - Manca, Vincenzo A1 - Martin-Vide, Carlos T1 - Iterated sequential transducers as language generating devices JF - Theoretical computer science N2 - Iterated finite state sequential transducers are considered as language generating devices. The hierarchy induced by the size of the state alphabet is proved to collapse to the fourth level. The corresponding language families are related to the families of languages generated by Lindenmayer systems and Chomsky grammars. Finally, some results on deterministic and extended iterated finite state transducers are established. KW - finite state sequential transducers KW - state complexity KW - Lindenmayer systems Y1 - 2006 U6 - https://doi.org/10.1016/j.tcs.2006.07.059 SN - 0304-3975 VL - 369 IS - 1 SP - 67 EP - 81 PB - Elsevier CY - Amsterdam ER - TY - JOUR A1 - Stoffel, Dominik A1 - Kunz, Wolfgang A1 - Gerber, Stefan T1 - And/Or reasoning graphs for determining prime implicants in multi-level combinational networks Y1 - 1997 ER - TY - JOUR A1 - Baier, Thomas A1 - Di Ciccio, Claudio A1 - Mendling, Jan A1 - Weske, Mathias T1 - Matching events and activities by integrating behavioral aspects and label analysis JF - Software and systems modeling N2 - Nowadays, business processes are increasingly supported by IT services that produce massive amounts of event data during the execution of a process. These event data can be used to analyze the process using process mining techniques to discover the real process, measure conformance to a given process model, or to enhance existing models with performance information. Mapping the produced events to activities of a given process model is essential for conformance checking, annotation and understanding of process mining results. In order to accomplish this mapping with low manual effort, we developed a semi-automatic approach that maps events to activities using insights from behavioral analysis and label analysis. The approach extracts Declare constraints from both the log and the model to build matching constraints to efficiently reduce the number of possible mappings. 
These mappings are further reduced using techniques from natural language processing, which allow for a matching based on labels and external knowledge sources. The evaluation with synthetic and real-life data demonstrates the effectiveness of the approach and its robustness toward non-conforming execution logs. KW - Process mining KW - Event mapping KW - Business process intelligence KW - Constraint satisfaction KW - Declare KW - Natural language processing Y1 - 2018 U6 - https://doi.org/10.1007/s10270-017-0603-z SN - 1619-1366 SN - 1619-1374 VL - 17 IS - 2 SP - 573 EP - 598 PB - Springer CY - Heidelberg ER - TY - JOUR A1 - Przybylla, Mareen A1 - Romeike, Ralf T1 - Empowering learners with tools in CS education BT - physical computing in secondary schools JF - it - Information Technology N2 - In computer science, computer systems are both, objects of investigation and tools that enable creative learning and design. Tools for learning have a long tradition in computer science education. Already in the late 1960s, Papert developed a concept which had an immense impact on the development of informal education in the following years: his theory of constructionism understands learning as a creative process of knowledge construction that is most effective when learners create something purposeful that they can try out, show around, discuss, analyse and receive praise for. By now, there are numerous learning and programming environments that are based on the constructionist ideas. Modern tools offer opportunities for students to learn in motivating ways and gain impressive results in programming games, animations, implementing 3D models or developing interactive objects. This article gives an overview of computer science education research related to tools and media to be used in educational settings. We analyse different types of tools with a special focus on the categorization and development of tools for student adequate physical computing activities in the classroom. Research around the development and evaluation of tools and learning resources in the domain of physical computing is illustrated with the example of "My Interactive Garden", a constructionist learning and programming environment. It is explained how the results from empirical studies are integrated in the continuous development of the learning material. KW - tools KW - media KW - resources KW - computer science education KW - physical computing Y1 - 2018 U6 - https://doi.org/10.1515/itit-2017-0032 SN - 1611-2776 SN - 2196-7032 VL - 60 IS - 2 SP - 91 EP - 101 PB - De Gruyter CY - Berlin ER - TY - JOUR A1 - Möring, Sebastian A1 - Leino, Olli Tapio T1 - Die neoliberale Bedingung von Computerspielen JF - Kontrollmaschinen - zur Dispositivtheorie des Computerspiels Y1 - 2022 SN - 978-3-643-14780-6 SP - 41 EP - 61 PB - LiteraturWissenschaft.de CY - Münster ER - TY - JOUR A1 - Ly, Ibrahim A1 - Tarkhanov, Nikolai Nikolaevich T1 - A variational approach to the Cauchy problem for nonlinear elliptic differential equations N2 - We discuss the relaxation of a class of nonlinear elliptic Cauchy problems with data on a piece S of the boundary surface by means of a variational approach known in the optimal control literature as "equation error method". By the Cauchy problem is meant any boundary value problem for an unknown function y in a domain X with the property that the data on S, if combined with the differential equations in X, allow one to determine all derivatives of y on S by means of functional equations. 
In the case of real analytic data of the Cauchy problem, the existence of a local solution near S is guaranteed by the Cauchy-Kovalevskaya theorem. We also admit overdetermined elliptic systems, in which case the set of those Cauchy data on S for which the Cauchy problem is solvable is very "thin". For this reason we discuss a variational setting of the Cauchy problem which always possesses a generalised solution. Y1 - 2009 UR - http://dx.doi.org/10.1515/jiip U6 - https://doi.org/10.1515/Jiip.2009.037 SN - 0928-0219 ER - TY - JOUR A1 - Prasse, Paul A1 - Iversen, Pascal A1 - Lienhard, Matthias A1 - Thedinga, Kristina A1 - Herwig, Ralf A1 - Scheffer, Tobias T1 - Pre-Training on In Vitro and Fine-Tuning on Patient-Derived Data Improves Deep Neural Networks for Anti-Cancer Drug-Sensitivity Prediction JF - MDPI N2 - Large-scale databases that report the inhibitory capacities of many combinations of candidate drug compounds and cultivated cancer cell lines have driven the development of preclinical drug-sensitivity models based on machine learning. However, cultivated cell lines have devolved from human cancer cells over years or even decades under selective pressure in culture conditions. Moreover, models that have been trained on in vitro data cannot account for interactions with other types of cells. Drug-response data that are based on patient-derived cell cultures, xenografts, and organoids, on the other hand, are not available in the quantities that are needed to train high-capacity machine-learning models. We found that pre-training deep neural network models of drug sensitivity on in vitro drug-sensitivity databases before fine-tuning the model parameters on patient-derived data improves the models’ accuracy and improves the biological plausibility of the features, compared to training only on patient-derived data. From our experiments, we can conclude that pre-trained models outperform models that have been trained on the target domains in the vast majority of cases. KW - deep neural networks KW - drug-sensitivity prediction KW - anti-cancer drugs Y1 - 2022 U6 - https://doi.org/10.3390/cancers14163950 SN - 2072-6694 VL - 14 SP - 1 EP - 14 PB - MDPI CY - Basel, Schweiz ET - 16 ER - TY - JOUR A1 - Everardo Pérez, Flavio Omar A1 - Osorio, Mauricio T1 - Towards an answer set programming methodology for constructing programs following a semi-automatic approach BT - extended and revised version JF - Electronic notes in theoretical computer science N2 - Answer Set Programming (ASP) is a successful rule-based formalism for modeling and solving knowledge-intense combinatorial (optimization) problems. Despite its success in both academic and industry, open challenges like automatic source code optimization, and software engineering remains. This is because a problem encoded into an ASP might not have the desired solving performance compared to an equivalent representation. Motivated by these two challenges, this paper has three main contributions. First, we propose a developing process towards a methodology to implement ASP programs, being faithful to existing methods. Second, we present ASP encodings that serve as the basis from the developing process. Third, we demonstrate the use of ASP to reverse the standard solving process. That is, knowing answer sets in advance, and desired strong equivalent properties, “we” exhaustively reconstruct ASP programs if they exist. 
This paper was originally motivated by the search of propositional formulas (if they exist) that represent the semantics of a new aggregate operator. Particularly, a parity aggregate. This aggregate comes as an improvement from the already existing parity (xor) constraints from xorro, where lacks expressiveness, even though these constraints fit perfectly for reasoning modes like sampling or model counting. To this end, this extended version covers the fundaments from parity constraints as well as the xorro system. Hence, we delve a little more in the examples and the proposed methodology over parity constraints. Finally, we discuss our results by showing the only representation available, that satisfies different properties from the classical logic xor operator, which is also consistent with the semantics of parity constraints from xorro. KW - answer set programming KW - combinatorial optimization problems KW - parity aggregate operator Y1 - 2020 U6 - https://doi.org/10.1016/j.entcs.2020.10.004 SN - 1571-0661 VL - 354 SP - 29 EP - 44 PB - Elsevier CY - Amsterdam [u.a.] ER - TY - JOUR A1 - Romeike, Ralf T1 - Output statt Input JF - Commentarii informaticae didacticae : (CID) N2 - Die in der Fachdidaktik Informatik im Zusammenhang mit den Bildungsstandards seit Jahren diskutierte Outputorientierung wird mittelfristig auch für die Hochschullehre verbindlich. Diese Änderung kann als Chance aufgefasst werden, aktuellen Problemen der Informatiklehre gezielt entgegenzuwirken. Basierend auf der Theorie des Constructive Alignment wird vorgeschlagen, im Zusammenhang mit der Outputorientierung eine Abstimmung von intendierter Kompetenz, Lernaktivität und Prüfung vorzunehmen. Zusätzlich profitieren Lehramtsstudenten von den im eigenen Lernprozess erworbenen Erfahrungen im Umgang mit Kompetenzen: wie diese formuliert, erarbeitet und geprüft werden. Anforderungen an die Formulierung von Kompetenzen werden untersucht, mit Beispielen belegt und Möglichkeiten zur Klassifizierung angeregt. Ein Austausch in den Fachbereichen und Fachdidaktiken über die individuell festgelegten Kompetenzen wird vorgeschlagen, um die hochschuldidaktische Diskussion zu bereichern. Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64317 SN - 1868-0844 SN - 2191-1940 IS - 4 SP - 35 EP - 46 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Frenkel, Marcus A1 - Weicker, Karsten T1 - Pseudo BT - eine Programmiersprache auf der Basis von Pseudocode zur Unterstützung der akademischen Lehre JF - Commentarii informaticae didacticae : (CID) N2 - Pseudo ist eine auf Pseudocode basierende Programmiersprache, welche in der akademischen Lehre zum Einsatz kommen und hier die Vermittlung und Untersuchung von Algorithmen und Datenstrukturen unterstützen soll. Dieser Beitrag geht auf die Besonderheiten der Sprache sowie mögliche didaktische Szenarien ein. Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64328 SN - 1868-0844 SN - 2191-1940 IS - 4 SP - 47 EP - 52 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Raimer, Stephan T1 - Aquadrohne, Messdatenerfassung und Co. BT - Interdisziplinäres Projektmanagement als Teil des Wirtschaftsinformatikstudiums JF - Commentarii informaticae didacticae : (CID) N2 - Projektmanagement-Kompetenzen werden von Unternehmen unterschiedlichster Branchen mit wachsender Priorität betrachtet und eingefordert. 
Als Beitrag zu einer kompetenzorientierten Ausbildung werden in diesem Paper interdisziplinäre Studienmodule als Bestandteil des Wirtschaftsinformatik-Studiums vorgestellt. Zielsetzung der Studienmodule ist die Befähigung der Studierenden, konkrete Projekte unter Nutzung von standardisierten Werkzeugen und Methoden nach dem IPMA-Standard planen und durchführen zu können. Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64345 SN - 1868-0844 SN - 2191-1940 IS - 4 SP - 59 EP - 64 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Jahnke, Isa A1 - Haertel, Tobias A1 - Mattik, Volker A1 - Lettow, Karsten T1 - Was ist eine kreative Leistung Studierender? BT - Erfahrungen eines kreativitätsförderlichen Lehrbeispiels JF - Commentarii informaticae didacticae : (CID) N2 - Was ist eine kreative Leistung von Studierenden? Dies ist die Ausgangsfrage, wenn Lehre kreativitätsförderlicher als bislang gestaltet werden soll. In diesem Beitrag wird ein Modell zur Förderung von Kreativität in der Hochschullehre vorgestellt und mit einem Beispiel verdeutlicht. Es wird die veränderte Konzeption der Vorlesung Informatik & Gesellschaft illustriert: Studierende hatten die Aufgabe, eine „e-Infrastruktur für die Universität NeuDoBoDu“ zu entwickeln. Hierzu werden die Ergebnisse der Evaluation und Erfahrungen erläutert. Y1 - 2010 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64386 SN - 1868-0844 SN - 2191-1940 IS - 4 SP - 87 EP - 92 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Abke, Jörg A1 - Schwirtlich, Vincent A1 - Sedelmaier, Yvonne T1 - Kompetenzförderung im Software Engineering durch ein mehrstufiges Lehrkonzept im Studiengang Mechatronik JF - Commentarii informaticae didacticae : (CID) N2 - Dieser Beitrag stellt das Lehr-Lern-Konzept zur Kompetenzförderung im Software Engineering im Studiengang Mechatronik der Hochschule Aschaffenburg dar. Dieses Konzept ist mehrstufig mit Vorlesungs-, Seminar- und Projektsequenzen. Dabei werden Herausforderungen und Verbesserungspotentiale identifiziert und dargestellt. Abschließend wird ein Überblick gegeben, wie im Rahmen eines gerade gestarteten Forschungsprojektes Lehr-Lernkonzepte weiterentwickelt werden können. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64899 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 79 EP - 84 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Dörge, Christina T1 - Entwicklung eines methodologischen Verfahrens zur Ermittlung von informatischen Kompetenzen JF - Commentarii informaticae didacticae : (CID) N2 - Der traditionelle Weg in der Informatik besteht darin, Kompetenzen entweder normativ durch eine Expertengruppe festzulegen oder als Ableitungsergebnis eines Bildungsstandards aus einem externen Feld. Dieser Artikel stellt einen neuartigen und alternativen Ansatz vor, der sich der Methodik der Qualitativen Inhaltsanalyse (QI) bedient. Das Ziel war die Ableitung von informatischen Schlüsselkompetenzen anhand bereits etablierter und erprobter didaktischer Ansätze der Informatikdidaktik. Dazu wurde zunächst aus einer Reihe von Informatikdidaktikbüchern eine Liste mit möglichen Kandidaten für Kompetenzen generiert. Diese Liste wurde als QI-Kategoriensystem verwendet, mit der sechs verschiedene didaktische Ansätze analysiert wurden. 
Ein abschließender Verfeinerungsschritt erfolgte durch die Überprüfung, welche der gefundenen Kompetenzen in allen vier Kernbereichen der Informatik (theoretische, technische, praktische und angewandte Informatik) Anwendung finden. Diese Methode wurde für die informatische Schulausbildung exemplarisch entwickelt und umgesetzt, ist aber ebenfalls ein geeignetes Vorgehen für die Identifizierung von Schlüsselkompetenzen in anderen Gebieten, wie z. B. in der informatischen Hochschulausbildung, und soll deshalb hier kurz vorgestellt werden. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64906 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 85 EP - 90 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Müller, Dorothee A1 - Frommer, Andreas A1 - Humbert, Ludger T1 - Informatik im Alltag BT - Durchblicken statt Rumklicken JF - Commentarii informaticae didacticae : (CID) N2 - Die Fachwissenschaft Informatik stellt Mittel bereit, deren Nutzung für Studierende heutzutage selbstverständlich ist. Diese Tatsache darf uns allerdings nicht dar- über hinwegtäuschen, dass Studierende in der Regel keine Grundlage im Sinne einer informatischen Allgemeinbildung gemäÿ der Bildungsstandards der Gesellschaft für Informatik besitzen. Das Schulfach Informatik hat immer noch keinen durchgängigen Platz in den Stundentafeln der allgemein bildenden Schule gefunden. Zukünftigen Lehrkräften ist im Rahmen der bildungswissenschaftlichen Anteile im Studium eine hinreichende Medienkompetenz zu vermitteln. Mit der überragenden Bedeutung der digitalen Medien kann dies nur auf der Grundlage einer ausreichenden informatischen Grundbildung erfolgen. Damit ist es angezeigt, ein Studienangebot bereitzustellen, das allen Studierenden ein Eintauchen in Elemente (Fachgebiete) der Fachwissenschaft Informatik aus der Sicht des Alltags bietet. An diesen Elementen werden exemplarisch verschiedene Aspekte der Fachwissenschaft beleuchtet, um einen Einblick in die Vielgestaltigkeit der Fragen und Lösungsstrategien der Informatik zu erlauben und so die informatische Grundbildung zu befördern. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64959 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 98 EP - 104 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Ehlenz, Matthias A1 - Bergner, Nadine A1 - Schroeder, Ulrik T1 - Synergieeffekte zwischen Fach- und Lehramtsstudierenden in Softwarepraktika JF - Commentarii informaticae didacticae (CID) N2 - Dieser Beitrag diskutiert die Konzeption eines Software-Projektpraktikums im Bereich E-Learning, welches Lehramts- und Fachstudierenden der Informatik ermöglicht, voneinander zu profitieren und praxisrelevante Ergebnisse generiert. Vorbereitungen, Organisation und Durchführung werden vorgestellt und diskutiert. Den Abschluss bildet ein Ausblick auf die Fortführung des Konzepts und den Ausbau des Forschungsgebietes. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94875 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 99 EP - 102 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Dennert-Möller, Elisabeth A1 - Garmann, Robert T1 - Das „Startprojekt“ BT - Entwicklung überfachlicher Kompetenzen von Anfang an JF - Commentarii informaticae didacticae (CID) N2 - Absolventinnen und Absolventen unserer Informatik-Bachelorstudiengänge benötigen für kompetentes berufliches Handeln sowohl fachliche als auch überfachliche Kompetenzen. 
Vielfach verlangen wir von Erstsemestern in Grundlagen-Lehrveranstaltungen fast ausschließlich den Aufbau von Fachkompetenz und vernachlässigen dabei häufig Selbstkompetenz, Methodenkompetenz und Sozialkompetenz. Gerade die drei letztgenannten sind für ein erfolgreiches Studium unabdingbar und sollten von Anfang an entwickelt werden. Wir stellen unser „Startprojekt“ als einen Beitrag vor, im ersten Semester die eigenverantwortliche, überfachliche Kompetenzentwicklung in einem fachlichen Kontext zu fördern. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94780 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 11 EP - 23 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Kujath, Bertold T1 - Lernwirksamkeits- und Zielgruppenanalyse für ein Lehrvideo zum informatischen Problemlösen JF - Commentarii informaticae didacticae (CID) N2 - Aus einer Vergleichsstudie mit starken und schwachen Problemlösern konnten Erkenntnisse über die effizienten Herangehensweisen von Hochleistern an Informatikprobleme gewonnen werden. Diese Erkenntnisse wurden in einem Lehrvideo zum informatischen Problemlösen didaktisch aufgearbeitet, sodass Lernenden der Einsatz von Baumstrukturen und Rekursion im konkreten Kontext gezeigt werden kann. Nun wurde die tatsächliche Lernwirksamkeit des Videos sowie die Definition der Zielgruppe in einer Vergleichsstudie mit 66 Studienanfängern überprüft. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94797 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 25 EP - 39 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Zscheyge, Oliver A1 - Weicker, Karsten T1 - Werkzeugunterstützung bei der Vermittlung der Grundlagen wissenschaftlichen Schreibens JF - Commentarii informaticae didacticae (CID) N2 - Der Unterricht großer Studierendengruppen im wissenschaftlichen Schreiben birgt vielfältige organisatorische Herausforderungen und eine zeitintensive Betreuung durch die Dozenten. Diese Arbeit stellt ein Lehrkonzept mit Peer-Reviews vor, in dem das Feedback der Peers durch eine automatisierte Analyse ergänzt wird. Die Software Confopy liefert metrik- und strukturbasierte Hinweise für die Verbesserung des wissenschaftlichen Schreibstils. Der Nutzen von Confopy wird an 47 studentischen Arbeiten in Draft- und Final-Version illustriert. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94814 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 57 EP - 68 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Böhne, Sebastian A1 - Kreitz, Christoph A1 - Knobelsdorf, Maria T1 - Mathematisches Argumentieren und Beweisen mit dem Theorembeweiser Coq JF - Commentarii informaticae didacticae (CID) N2 - Informatik-Studierende haben in der Mehrzahl Schwierigkeiten, einen Einstieg in die Theoretische Informatik zu finden und die Leistungsanforderungen in den Endklausuren der zugehörigen Lehrveranstaltungen zu erfüllen. Wir argumentieren, dass dieser Symptomatik mangelnde Kompetenzen im Umgang mit abstrakten und stark formalisierten Themeninhalten zugrunde liegen und schlagen vor, einen Beweisassistenten als interaktives Lernwerkzeug in der Eingangslehre der Theoretischen Informatik zu nutzen, um entsprechende Kompetenzen zu stärken. 
Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94824 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 69 EP - 80 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Steen, Alexander A1 - Wisniewski, Max A1 - Benzmüller, Christoph T1 - Einsatz von Theorembeweisern in der Lehre JF - Commentarii informaticae didacticae (CID) N2 - Dieser Beitrag diskutiert den Einsatz von interaktiven und automatischen Theorembeweisern in der universitären Lehre. Moderne Theorembeweiser scheinen geeignet zur Implementierung des dialogischen Lernens und als E-Assessment-Werkzeug in der Logikausbilding. Exemplarisch skizzieren wir ein innovaties Lehrprojekt zum Thema „Komputationale Metaphysik“, in dem die zuvor genannten Werkzeuge eingesetzt werden. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94853 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 81 EP - 92 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Gebhardt, Kai T1 - Kooperative und kompetenzorientierte Übungen in der Softwaretechnik JF - Commentarii informaticae didacticae (CID) N2 - Die Unterrichtsmethode Stationsarbeit kann verwendet werden, um Individualisierung und Differenzierung im Lernprozess zu ermöglichen. Dieser Beitrag schlägt Aufgabenformate vor, die in einer Stationsarbeit über das Klassendiagramm aus der Unified Modeling Language verwendet werden können. Die Aufgabenformate wurden bereits mit Studierenden erprobt. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94867 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 95 EP - 98 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Teske, Daniel T1 - Geocoder accuracy ranking JF - Process design for natural scientists: an agile model-driven approach N2 - Finding an address on a map is sometimes tricky: the chosen map application may be unfamiliar with the enclosed region. There are several geocoders on the market, they have different databases and algorithms to compute the query. Consequently, the geocoding results differ in their quality. Fortunately the geocoders provide a rich set of metadata. The workflow described in this paper compares this metadata with the aim to find out which geocoder is offering the best-fitting coordinate for a given address. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 161 EP - 174 PB - Springer CY - Berlin ER - TY - JOUR A1 - Sens, Henriette T1 - Web-Based map generalization tools put to the test: a jABC workflow JF - Process Design for Natural Scientists: an agile model-driven approach N2 - Geometric generalization is a fundamental concept in the digital mapping process. An increasing amount of spatial data is provided on the web as well as a range of tools to process it. This jABC workflow is used for the automatic testing of web-based generalization services like mapshaper.org by executing its functionality, overlaying both datasets before and after the transformation and displaying them visually in a .tif file. Mostly Web Services and command line tools are used to build an environment where ESRI shapefiles can be uploaded, processed through a chosen generalization service and finally visualized in Irfanview. 
Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 175 EP - 185 PB - Springer CY - Berlin ER - TY - JOUR A1 - Noack, Franziska T1 - CREADED: Colored-Relief application for digital elevation data JF - Process design for natural scientists: an agile model-driven approach N2 - In the geoinformatics field, remote sensing data is often used for analyzing the characteristics of the current investigation area. This includes DEMs, which are simple raster grids containing grey scales representing the respective elevation values. The project CREADED that is presented in this paper aims at making these monochrome raster images more significant and more intuitively interpretable. For this purpose, an executable interactive model for creating a colored and relief-shaded Digital Elevation Model (DEM) has been designed using the jABC framework. The process is based on standard jABC-SIBs and SIBs that provide specific GIS functions, which are available as Web services, command line tools and scripts. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 186 EP - 199 PB - Springer CY - Berlin ER - TY - JOUR A1 - Respondek, Tobias T1 - A workflow for computing potential areas for wind turbines JF - Process design for natural scientists: an agile model-driven approach N2 - This paper describes the implementation of a workflow model for service-oriented computing of potential areas for wind turbines in jABC. By implementing a re-executable model the manual effort of a multi-criteria site analysis can be reduced. The aim is to determine the shift of typical geoprocessing tools of geographic information systems (GIS) from the desktop to the web. The analysis is based on a vector data set and mainly uses web services of the “Center for Spatial Information Science and Systems” (CSISS). This paper discusses effort, benefits and problems associated with the use of the web services. Y1 - 2014 SN - 978-3-662-45005-5 IS - 500 SP - 200 EP - 215 PB - Springer CY - Berlin ER - TY - JOUR A1 - Scheele, Lasse T1 - Location analysis for placing artificial reefs JF - Process design for natural scientists: an agile model-driven approach N2 - Location analyses are among the most common tasks while working with spatial data and geographic information systems. Automating the most frequently used procedures is therefore an important aspect of improving their usability. In this context, this project aims to design and implement a workflow, providing some basic tools for a location analysis. For the implementation with jABC, the workflow was applied to the problem of finding a suitable location for placing an artificial reef. For this analysis three parameters (bathymetry, slope and grain size of the ground material) were taken into account, processed, and visualized with the The Generic Mapping Tools (GMT), which were integrated into the workflow as jETI-SIBs. The implemented workflow thereby showed that the approach to combine jABC with GMT resulted in an user-centric yet user-friendly tool with high-quality cartographic outputs. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 216 EP - 228 PB - Springer CY - Berlin ER - TY - JOUR A1 - Holler, Robin T1 - GraffDok - a graffiti documentation application JF - Process design for natural scientists: an agile model-driven approach N2 - GraffDok is an application helping to maintain an overview over sprayed images somewhere in a city. At the time of writing it aims at vandalism rather than at beautiful photographic graffiti in an underpass. 
Looking at hundreds of tags and scribbles on monuments, house walls, etc. it would be interesting to not only record them in writing but even make them accessible electronically, including images. GraffDok’s workflow is simple and only requires an EXIF-GPS-tagged photograph of a graffito. It automatically determines its location by using reverse geocoding with the given GPS-coordinates and the Gisgraphy WebService. While asking the user for some more meta data, GraffDok analyses the image in parallel with this and tries to detect fore- and background – before extracting the drawing lines and make them stand alone. The command line based tool ImageMagick is used here as well as for accessing EXIF data. Any meta data is written to csv-files, which will stay easily accessible and can be integrated in TeX-files as well. The latter ones are converted to PDF at the end of the workflow, containing a table about all graffiti and a summary for each – including the generated characteristic graffiti pattern image. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 239 EP - 251 PB - Springer CY - Berlin ER - TY - JOUR A1 - Reso, Judith ED - Lambrecht, Anna-Lena ED - Margaria, Tiziana T1 - Protein Classification Workflow JF - Process Design for Natural Scientists: an agile model-driven approach N2 - The protein classification workflow described in this report enables users to get information about a novel protein sequence automatically. The information is derived by different bioinformatic analysis tools which calculate or predict features of a protein sequence. Also, databases are used to compare the novel sequence with known proteins. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 65 EP - 72 PB - Springer Verlag CY - Berlin ER - TY - JOUR A1 - Schulze, Gunnar T1 - Workflow for rapid metagenome analysis JF - Process design for natural scientists: an agile model-driven approach N2 - Analyses of metagenomes in life sciences present new opportunities as well as challenges to the scientific community and call for advanced computational methods and workflows. The large amount of data collected from samples via next-generation sequencing (NGS) technologies render manual approaches to sequence comparison and annotation unsuitable. Rather, fast and efficient computational pipelines are needed to provide comprehensive statistics and summaries and enable the researcher to choose appropriate tools for more specific analyses. The workflow presented here builds upon previous pipelines designed for automated clustering and annotation of raw sequence reads obtained from next-generation sequencing technologies such as 454 and Illumina. Employing specialized algorithms, the sequence reads are processed at three different levels. First, raw reads are clustered at high similarity cutoff to yield clusters which can be exported as multifasta files for further analyses. Independently, open reading frames (ORFs) are predicted from raw reads and clustered at two strictness levels to yield sets of non-redundant sequences and ORF families. 
Furthermore, single ORFs are annotated by performing searches against the Pfam database Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 88 EP - 100 PB - Springer CY - Berlin ER - TY - JOUR A1 - Vierheller, Janine ED - Lambrecht, Anna-Lena ED - Margaria, Tiziana T1 - Exploratory Data Analysis JF - Process Design for Natural Scientists: an agile model-driven approach N2 - In bioinformatics the term exploratory data analysis refers to different methods to get an overview of large biological data sets. Hence, it helps to create a framework for further analysis and hypothesis testing. The workflow facilitates this first important step of the data analysis created by high-throughput technologies. The results are different plots showing the structure of the measurements. The goal of the workflow is the automatization of the exploratory data analysis, but also the flexibility should be guaranteed. The basic tool is the free software R. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 110 EP - 126 PB - Axel Springer Verlag CY - Berlin ER - TY - JOUR A1 - Schütt, Christine T1 - Identification of differentially expressed genes JF - Process design for natural scientists: an agile model-driven approach N2 - With the jABC it is possible to realize workflows for numerous questions in different fields. The goal of this project was to create a workflow for the identification of differentially expressed genes. This is of special interest in biology, for it gives the opportunity to get a better insight in cellular changes due to exogenous stress, diseases and so on. With the knowledge that can be derived from the differentially expressed genes in diseased tissues, it becomes possible to find new targets for treatment. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 127 EP - 139 PB - Springer CY - Berlin ER - TY - JOUR A1 - Kuntzsch, Christian T1 - Visualization of data transfer paths JF - Process design for natural scientists: an agile model-driven approach N2 - A workflow for visualizing server connections using the Google Maps API was built in the jABC. It makes use of three basic services: An XML-based IP address geolocation web service, a command line tool and the Static Maps API. The result of the workflow is an URL leading to an image file of a map, showing server connections between a client and a target host. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 140 EP - 148 PB - Springer CY - Berlin ER - TY - JOUR A1 - Hibbe, Marcel ED - Lambrecht, Anna-Lena ED - Margaria, Tiziana T1 - Spotlocator - Guess Where the Photo Was Taken! JF - Process Design for Natural Scientists: an agile model-driven approach N2 - Spotlocator is a game wherein people have to guess the spots of where photos were taken. The photos of a defined area for each game are from panoramio.com. They are published at http://spotlocator. drupalgardens.com with an ID. Everyone can guess the photo spots by sending a special tweet via Twitter that contains the hashtag #spotlocator, the guessed coordinates and the ID of the photo. An evaluation is published for all tweets. The players are informed about the distance to the real photo spots and the positions are shown on a map. 
Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 149 EP - 160 PB - Springer Verlag CY - Berlin ER - TY - JOUR A1 - Blaese, Leif T1 - Data mining for unidentified protein squences JF - Process design for natural scientists: an agile model-driven approach N2 - Through the use of next generation sequencing (NGS) technology, a lot of newly sequenced organisms are now available. Annotating those genes is one of the most challenging tasks in sequence biology. Here, we present an automated workflow to find homologue proteins, annotate sequences according to function and create a three-dimensional model. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 73 EP - 87 PB - Springer CY - Berlin ER - TY - JOUR A1 - Lis, Monika ED - Lambrecht, Anna-Lena ED - Margaria, Tiziana T1 - Constructing a Phylogenetic Tree JF - Process Design for Natural Scientists: an agile model-driven approach N2 - In this project I constructed a workflow that takes a DNA sequence as input and provides a phylogenetic tree, consisting of the input sequence and other sequences which were found during a database search. In this phylogenetic tree the sequences are arranged depending on similarities. In bioinformatics, constructing phylogenetic trees is often used to explore the evolutionary relationships of genes or organisms and to understand the mechanisms of evolution itself. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 101 EP - 109 PB - Springer Verlag CY - Berlin ER - TY - JOUR A1 - Froitzheim, Manuel A1 - Bergner, Nadine A1 - Schroeder, Ulrik ED - Schwill, Andreas T1 - Android-Workshop zur Vertiefung der Kenntnisse bezüglich Datenstrukturen und Programmierung in der Studieneingangsphase JF - HDI 2014 : Gestalten von Übergängen N2 - Die Studieneingangsphase stellt für Studierende eine Schlüsselphase des tertiären Ausbildungsabschnitts dar. Fachwissenschaftliches Wissen wird praxisfern vermittelt und die Studierenden können die Zusammenhänge zwischen den Themenfeldern der verschiedenen Vorlesungen nicht erkennen. Zur Verbesserung der Situation wurde ein Workshop entwickelt, der die Verbindung der Programmierung und der Datenstrukturen vertieft. Dabei wird das Spiel Go-Moku1 als Android-App von den Studierenden selbständig entwickelt. Die Kombination aus Software (Java, Android-SDK) und Hardware (Tablet-Computer) für ein kleines realistisches Softwareprojekt stellt für die Studierenden eine neue Erfahrung dar. Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-80247 VL - 2015 IS - 9 SP - 11 EP - 26 ER - TY - JOUR A1 - Shenoy, Pradeep A1 - Krauledat, Matthias A1 - Blankertz, Benjamin A1 - Rao, Rajesh P. N. A1 - Müller, Klaus-Robert T1 - Towards adaptive classification for BCI N2 - Non-stationarities are ubiquitous in EEG signals. They are especially apparent in the use of EEG-based brain- computer interfaces (BCIs): (a) in the differences between the initial calibration measurement and the online operation of a BCI, or (b) caused by changes in the subject's brain processes during an experiment (e.g. due to fatigue, change of task involvement, etc). In this paper, we quantify for the first time such systematic evidence of statistical differences in data recorded during offline and online sessions. Furthermore, we propose novel techniques of investigating and visualizing data distributions, which are particularly useful for the analysis of (non-) stationarities. 
Our study shows that the brain signals used for control can change substantially from the offline calibration sessions to online control, and also within a single session. In addition to this general characterization of the signals, we propose several adaptive classification schemes and study their performance on data recorded during online experiments. An encouraging result of our study is that surprisingly simple adaptive methods in combination with an offline feature selection scheme can significantly increase BCI performance Y1 - 2006 UR - http://iopscience.iop.org/1741-2552/3/1/R02/ U6 - https://doi.org/10.1088/1741-2560/3/1/R02 ER - TY - JOUR A1 - Bobda, Christophe T1 - Special issue on ReCoSoC 2007 : editorial Y1 - 2009 UR - http://www.sciencedirect.com/science/journal/01419331 U6 - https://doi.org/10.1016/j.micpro.2009.01.001 SN - 0141-9331 ER - TY - JOUR A1 - Blankertz, Benjamin A1 - Dornhege, Guido A1 - Krauledat, Matthias A1 - Müller, Klaus-Robert A1 - Kunzmann, Volker A1 - Losch, Florian A1 - Curio, Gabriel T1 - The Berlin brain-computer interface : EEG-based communication without subject training N2 - The Berlin Brain-Computer Interface (BBCI) project develops a noninvasive BCI system whose key features are 1) the use of well-established motor competences as control paradigms, 2) high-dimensional features from 128-channel electroencephalogram (EEG), and 3) advanced machine learning techniques. As reported earlier, our experiments demonstrate that very high information transfer rates can be achieved using the readiness potential (RP) when predicting the laterality of upcoming left-versus right-hand movements in healthy subjects. A more recent study showed that the RP similarily accompanies phantom movements in arm amputees, but the signal strength decreases with longer loss of the limb. In a complementary approach, oscillatory features are used to discriminate imagined movements (left hand versus right hand versus foot). In a recent feedback study with six healthy subjects with no or very little experience with BCI control, three subjects achieved an information transfer rate above 35 bits per minute (bpm), and further two subjects above 24 and 15 bpm, while one subject could not achieve any BCI control. These results are encouraging for an EEG-based BCI system in untrained subjects that is independent of peripheral nervous system activity and does not rely on evoked potentials even when compared to results with very well-trained subjects operating other BCI systems Y1 - 2006 UR - http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7333 U6 - https://doi.org/10.1109/Tnsre.2006.875557 SN - 1534-4320 ER - TY - JOUR A1 - Willig, Andreas A1 - Mitschke, Robert T1 - Results of bit error measurements with sensor nodes and casuistic consequences for design of energy-efficient error control schemes N2 - For the proper design of energy-efficient error control schemes some insight into channel error patterns is needed. This paper presents bit error and packet loss measurements taken with sensor nodes running the popular RFM Y1 - 2006 SN - 978-3-540-32158-3 ER - TY - JOUR A1 - Rozinat, A A1 - Van der Aalst, Wil M. P. T1 - Conformance testing: Measuring the fit and appropriateness of event logs and process models N2 - Most information systems log events (e.g., transaction logs, audit traits) to audit and monitor the processes they support. At the same time, many of these processes have been explicitly modeled. 
For example, SAP R/3 logs events in transaction logs and there are EPCs (Event-driven Process Chains) describing the so-called reference models. These reference models describe how the system should be used. The coexistence of event logs and process models raises an interesting question: "Does the event log conform to the process model and vice versa?". This paper demonstrates that there is not a simple answer to this question. To tackle the problem, we distinguish two dimensions of conformance: fitness (the event log may be the result of the process modeled) and appropriateness (the model is a likely candidate from a structural and behavioral point of view). Different metrics have been defined and a Conformance Checker has been implemented within the ProM Framework Y1 - 2006 ER - TY - JOUR A1 - Konczak, Kathrin T1 - Voting Theory in Answer Set Programming Y1 - 2006 ER - TY - JOUR A1 - Gerbser, Martin A1 - Lee, Joohyung A1 - Lierler, Yuliya T1 - Elementary sets for logic programs Y1 - 2006 SN - 978-1-57735-281-5 ER - TY - JOUR A1 - Konczak, Kathrin T1 - Weak order equivalence for Logic Programs with Prefernces Y1 - 2006 ER - TY - JOUR A1 - Konczak, Kathrin A1 - Vogel, Ralf T1 - Abduction and Preferences in Linguistics Y1 - 2005 UR - http://www.cs.uni-potsdam.de/~konczak/Papers/konvog05a.pdf ER - TY - JOUR A1 - Calude, C. S. A1 - Jurgensen, Helmut T1 - Is complexity a source of incompleteness? N2 - In this paper we prove Chaitin's "heuristic principle," the theorems of a finitely-specified theory cannot be significantly more complex than the theory itself, for an appropriate measure of complexity. We show that the measure is invariant under the change of the Godel numbering. For this measure, the theorems of a finitely-specified, sound, consistent theory strong enough to formalize arithmetic which is arithmetically sound (like Zermelo-Fraenkel set theory with choice or Peano Arithmetic) have bounded complexity, hence every sentence of the theory which is significantly more complex than the theory is unprovable. Previous results showing that incompleteness is not accidental, but ubiquitous are here reinforced in probabilistic terms: the probability that a true sentence of length n is provable in the theory tends to zero when n tends to infinity, while the probability that a sentence of length n is true is strictly positive. (c) 2004 Elsevier Inc. All rights reserved Y1 - 2005 SN - 0196-8858 ER - TY - JOUR A1 - Bordihn, Henning T1 - On the number of components in cooperating distributed grammar systems N2 - It is proved that the number of components in context-free cooperating distributed (CD) grammar systems can be reduced to 3 when they are working in the so-called sf-mode of derivation, which is the cooperation protocol which has been considered first for CD grammar systems. In this derivation mode, a component continues the derivation until and unless there is a nonterminal in the sentential form which cannot be rewritten according to that component. Moreover, it is shown that CD grammar systems in sf-mode with only one component can generate only the context-free languages but they can generate non-context-free languages if two components are used. The sf-mode of derivation is compared with other well-known cooperation protocols with respect to the hierarchies induced by the number of components. (C) 2004 Elsevier B.V. 
All rights reserved Y1 - 2005 SN - 0304-3975 ER - TY - JOUR A1 - Beerenwinkel, Niko A1 - Sing, Tobias A1 - Lengauer, Thomas A1 - Rahnenfuhrer, Joerg A1 - Roomp, Kirsten A1 - Savenkov, Igor A1 - Fischer, Roman A1 - Hoffmann, Daniel A1 - Selbig, Joachim A1 - Korn, Klaus A1 - Walter, Hauke A1 - Berg, Thomas A1 - Braun, Patrick A1 - Faetkenheuer, Gerd A1 - Oette, Mark A1 - Rockstroh, Juergen A1 - Kupfer, Bernd A1 - Kaiser, Rolf A1 - Daeumer, Martin T1 - Computational methods for the design of effective therapies against drug resistant HIV strains N2 - The development of drug resistance is a major obstacle to successful treatment of HIV infection. The extraordinary replication dynamics of HIV facilitates its escape from selective pressure exerted by the human immune system and by combination drug therapy. We have developed several computational methods whose combined use can support the design of optimal antiretroviral therapies based on viral genomic data Y1 - 2005 ER - TY - JOUR A1 - Brzozowski, J. A. A1 - Jürgensen, Helmut T1 - Representation of semiautomata by canonical words and equivalences N2 - We study a novel representation of semiautomata, which is motivated by the method of trace-assertion specifications of software modules. Each state of the semiautomaton is represented by an arbitrary word leading to that state, the canonical word. The transitions of the semiautomaton give rise to a right congruence, the state-equivalence, on the set of input words of the semiautomaton: two words are state-equivalent if and only if they lead to the same state. We present a simple algorithm for finding a set of generators for state-equivalence. Directly from this set of generators, we construct a confluent prefix-rewriting system which permits us to transform any word to its canonical representative. In general, the rewriting system may allow infinite derivations. To address this issue, we impose the condition of prefix-continuity on the set of canonical words. A set is prefix-continuous if, whenever a word w and a prefix u of w axe in the set, then all the prefixes of w longer than u are also in the set. Prefix-continuous sets include prefix-free and prefix-closed sets as special cases. We prove that the rewriting system is Noetherian if and only if the set of canonical words is prefix-continuous. Furthermore, if the set of canonical words is prefix- continuous, then the set of rewriting rules is irredundant. We show that each prefix-continuous canonical set corresponds to a spanning forest of the semiautomaton Y1 - 2005 SN - 0129-0541 ER - TY - JOUR A1 - Bordihn, Henning A1 - Holzer, Markus A1 - Kutrib, Martin T1 - Unsolvability levels of operation problems for subclasses of context-free languages N2 - We investigate the operation problem for linear and deterministic context-free languages: Fix an operation on formal languages. Given linear (deterministic, respectively) context-free languages, is the application of this operation to the given languages still a linear (deterministic, respectively) context-free language? Besides the classical operations, for which the linear and deterministic context-free languages are not closed, we also consider the recently introduced root and power operation. We show non-semidecidability, to be more precise, we show completeness for the second level of the arithmetic hierarchy for all of the aforementioned operations, except for the power operation, if the underlying alphabet contains at least two letters. 
The result for the power opera, tion solves an open problem stated in Theoret. Comput. Sci. 314 (2004) 445-449 Y1 - 2005 SN - 0129-0541 ER - TY - JOUR A1 - Bruggemeier, M. A1 - Dovifat, A. A1 - Kubisch, D. T1 - Micropolitical innovation arenas as a tool for analyzing innovation processes in the context of electronic government N2 - E-Government requires technical and organizational innovation. Research has already shown that the respective innovation process is complex and contingent upon specific organizational structures. Managing such innovation processes successfully is difficult. Drawing on assumptions of micropolitical behavior, a framework of innovation arenas is proposed. It supports the analysis of ongoing E-Government projects as well as the ex post investigation of successful or failed projects. Testing this framework in case studies already demonstrates its usefulness for individual actors making strategic choices about change management. Furthermore, the results indicate that many commonly held assumptions about successful change management have to be reconsidered Y1 - 2005 SN - 0937-6429 ER - TY - JOUR A1 - Meinecke, Frank C. A1 - Ziehe, Andreas A1 - Kurths, Jürgen A1 - Müller, Klaus-Robert T1 - Measuring phase synchronization of superimposed signals N2 - Phase synchronization is an important phenomenon that occurs in a wide variety of complex oscillatory processes. Measuring phase synchronization can therefore help to gain fundamental insight into nature. In this Letter we point out that synchronization analysis techniques can detect spurious synchronization, if they are fed with a superposition of signals such as in electroencephalography or magnetoencephalography data. We show how techniques from blind source separation can help to nevertheless measure the true synchronization and avoid such pitfalls Y1 - 2005 SN - 0031-9007 ER - TY - JOUR A1 - Scholz, Matthias A1 - Kaplan, F. A1 - Guy, C. L. A1 - Kopka, Joachim A1 - Selbig, Joachim T1 - Non-linear PCA : a missing data approach N2 - Motivation: Visualizing and analysing the potential non-linear structure of a dataset is becoming an important task in molecular biology. This is even more challenging when the data have missing values. Results: Here, we propose an inverse model that performs non-linear principal component analysis (NLPCA) from incomplete datasets. Missing values are ignored while optimizing the model, but can be estimated afterwards. Results are shown for both artificial and experimental datasets. In contrast to linear methods, non-linear methods were able to give better missing value estimations for non-linear structured data. Application: We applied this technique to a time course of metabolite data from a cold stress experiment on the model plant Arabidopsis thaliana, and could approximate the mapping function from any time point to the metabolite responses. Thus, the inverse NLPCA provides greatly improved information for better understanding the complex response to cold stress Y1 - 2005 SN - 1367-4803 ER - TY - JOUR A1 - Cordes, Frank A1 - Kaiser, Rolf A1 - Selbig, Joachim T1 - Bioinformatics approach to predicting HIV drug resistance N2 - The emergence of drug resistance remains one of the most challenging issues in the treatment of HIV-1 infection. The extreme replication dynamics of HIV facilitates its escape from the selective pressure exerted by the human immune system and by the applied combination drug therapy. 
This article reviews computational methods whose combined use can support the design of optimal antiretroviral therapies based on viral genotypic and phenotypic data. Genotypic assays are based on the analysis of mutations associated with reduced drug susceptibility, but are difficult to interpret due to the numerous mutations and mutational patterns that confer drug resistance. Phenotypic resistance or susceptibility can be experimentally evaluated by measuring the inhibition of the viral replication in cell culture assays. However, this procedure is expensive and time-consuming Y1 - 2006 UR - http://www.expert-reviews.com/loi/erm U6 - https://doi.org/10.1586/14737159.6.2.207 SN - 1473-7159 ER - TY - JOUR A1 - Lemm, Steven A1 - Curio, Gabriel A1 - Hlushchuk, Yevhen A1 - Müller, Klaus-Robert T1 - Enhancing the signal-to-noise ratio of ICA-based extracted ERPs N2 - When decomposing single trial electroencephalography it is a challenge to incorporate prior physiological knowledge. Here, we develop a method that uses prior information about the phase-locking property of event-related potentials in a regularization framework to bias a blind source separation algorithm toward an improved separation of single-trial phase-locked responses in terms of an increased signal-to-noise ratio. In particular, we suggest a transformation of the data, using a weighted average of the single trial and trial-averaged response, that redirects the focus of source separation methods onto the subspace of event-related potentials. The practical benefit with respect to an improved separation of such components from ongoing background activity and extraneous noise is first illustrated on artificial data and finally verified in a real-world application of extracting single-trial somatosensory evoked potentials from multichannel EEG-recordings Y1 - 2006 UR - http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=10 U6 - https://doi.org/10.1109/Tbme.2006.870258 SN - 0018-9294 ER - TY - JOUR A1 - Laub, Julian A1 - Roth, Volker A1 - Buhmann, Joachim A1 - Müller, Klaus-Robert T1 - On the information and representation of non-Euclidean pairwise data N2 - Two common data representations are mostly used in intelligent data analysis, namely the vectorial and the pairwise representation. Pairwise data which satisfy the restrictive conditions of Euclidean spaces can be faithfully translated into a Euclidean vectorial representation by embedding. Non-metric pairwise data with violations of symmetry, reflexivity or triangle inequality pose a substantial conceptual problem for pattern recognition since the amount of predictive structural information beyond what can be measured by embeddings is unclear. We show by systematic modeling of non-Euclidean pairwise data that there exist metric violations which can carry valuable problem-specific information. Furthermore, Euclidean and non-metric data can be unified on the level of structural information contained in the data. Stable component analysis selects linear subspaces which are particularly insensitive to data fluctuations. Experimental results from different domains support our pattern recognition strategy. Y1 - 2006 UR - http://www.sciencedirect.com/science/journal/00313203 U6 - https://doi.org/10.1016/j.patcog.2006.04.016 SN - 0031-3203 ER - TY - JOUR A1 - Kawanabe, Motoaki A1 - Blanchard, Gilles A1 - Sugiyama, Masashi A1 - Spokoiny, Vladimir G.
A1 - Müller, Klaus-Robert T1 - A novel dimension reduction procedure for searching non-Gaussian subspaces N2 - In this article, we consider high-dimensional data which contains a low-dimensional non-Gaussian structure contaminated with Gaussian noise and propose a new linear method to identify the non-Gaussian subspace. Our method NGCA (Non-Gaussian Component Analysis) is based on a very general semi-parametric framework and has a theoretical guarantee that the estimation error of finding the non-Gaussian components tends to zero at a parametric rate. NGCA can be used not only as preprocessing for ICA, but also for extracting and visualizing more general structures like clusters. A numerical study demonstrates the usefulness of our method Y1 - 2006 UR - http://www.springerlink.com/content/105633/ U6 - https://doi.org/10.1007/11679363_19 SN - 0302-9743 ER - TY - JOUR A1 - Pernici, Barbara A1 - Weske, Mathias T1 - Business process management Y1 - 2006 SN - 0169-023X ER - TY - JOUR A1 - Camales, Renaud T1 - Explicit formulation of the solution of Hamada-Leray-Wagschal's theorem N2 - In this paper, an explicit formula of the solution of Hamada-Leray-Wagschal's theorem is given. For this, only structure's theorem of finite dimensional determination's function and linear algebra techniques developed in [1] are used Y1 - 2005 SN - 0034-5318 ER - TY - JOUR A1 - Häger, Sebastian A1 - Schubert, Wolfgang T1 - Assoziationen in Softwarearchitekturen JF - Preprint / Universität Potsdam, Institut für Informatik Y1 - 2005 SN - 0946-7580 VL - 2005, 2 PB - Univ. CY - Potsdam ER - TY - JOUR A1 - Faber, Wolfgang A1 - Konczak, Kathrin T1 - Strong Equivalence for Logic Programs with Preferences Y1 - 2005 UR - http://www.cs.uni-potsdam.de/~konczak/Papers/fabkon05a.pdf ER - TY - JOUR A1 - Konczak, Kathrin A1 - Lang, Jerome T1 - Voting procedures with incomplete preferences Y1 - 2005 UR - http://koala.ilog.fr/wiki/pub/Preference05/WsProceedings/Pref05.pdf ER - TY - JOUR A1 - Konczak, Kathrin A1 - Vogel, Ralf T1 - Abduction and preferences in linguistics : Extended abstract Y1 - 2005 UR - http://www.cs.uni-potsdam.de/~konczak/Papers/konvog05b.pdf SN - 0302-9743 ER - TY - JOUR A1 - Goessel, Michael A1 - Morozov, A. V. A1 - Sapozhnikov, V. V. A1 - Sapozhaikov, Vl. V. T1 - Checking combinational circuits by the method of logic complement N2 - Design of fully self-testing combinational circuits was considered. A theorem defining the conditions for guaranteed logic complement-based design of fully self-testing circuit was proved. Examples were presented Y1 - 2005 SN - 0005-1179 ER - TY - JOUR A1 - Meinecke, Frank C. A1 - Harmeling, Stefan A1 - Müller, Klaus-Robert T1 - Inlier-based ICA with an application to superimposed images N2 - This paper proposes a new independent component analysis (ICA) method which is able to unmix overcomplete mixtures of sparse or structured signals like speech, music or images. Furthermore, the method is designed to be robust against outliers, which is a favorable feature for ICA algorithms since most of them are extremely sensitive to outliers. Our approach is based on a simple outlier index. However, instead of robustifying an existing algorithm by some outlier rejection technique we show how this index can be used directly to solve the ICA problem for super-Gaussian sources. The resulting inlier-based ICA (IBICA) is outlier-robust by construction and can be used for standard ICA as well as for overcomplete ICA (i.e. more source signals than observed signals).
(c) 2005 Wiley Periodicals, Inc Y1 - 2005 SN - 0899-9457 ER - TY - JOUR A1 - Lemm, Steven A1 - Blankertz, Benjamin A1 - Curio, Gabriel A1 - Müller, Klaus-Robert T1 - Spatio-spectral filters for improving the classification of single trial EEG N2 - Data recorded in electroencephalogram (EEG)-based brain-computer interface experiments is generally very noisy, non-stationary, and contaminated with artifacts that can deteriorate discrimination/classification methods. In this paper, we extend the common spatial pattern (CSP) algorithm with the aim to alleviate these adverse effects. In particular, we suggest an extension of CSP to the state space, which utilizes the method of time delay embedding. As we will show, this allows for individually tuned frequency filters at each electrode position and, thus, yields an improved and more robust machine learning procedure. The advantages of the proposed method over the original CSP method are verified in terms of an improved information transfer rate (bits per trial) on a set of EEG-recordings from experiments of imagined limb movements Y1 - 2005 SN - 0018-9294 ER - TY - JOUR A1 - Willig, Andreas A1 - Matheus, K. A1 - Wolisz, A. T1 - Wireless technology in industrial networks N2 - With the success of wireless technologies in consumer electronics, standard wireless technologies are envisioned for the deployment in industrial environments as well. Industrial applications involving mobile subsystems or just the desire to save cabling make wireless technologies attractive. Nevertheless, these applications often have stringent requirements on reliability and timing. In wired environments, timing and reliability are well catered for by fieldbus systems (which are a mature technology designed to enable communication between digital controllers and the sensors and actuators interfacing to a physical process). When wireless links are included, reliability and timing requirements are significantly more difficult to meet, due to the adverse properties of the radio channels. In this paper we thus discuss some key issues coming up in wireless fieldbus and wireless industrial communication systems: 1) fundamental problems like achieving timely and reliable transmission despite channel errors; 2) the usage of existing wireless technologies for this specific field of applications; and 3) the creation of hybrid systems in which wireless stations are included into existing wired systems Y1 - 2005 SN - 0018-9219 ER - TY - JOUR A1 - Weske, Mathias A1 - van der Aalst, Wil M. P. A1 - Verbeek, H. M. W. T1 - Advances in business process management Y1 - 2004 SN - 0169-023X ER - TY - JOUR A1 - Nicolelis, Miguel Angelo L. A1 - Birbaumer, Niels A1 - Muller, K. R. T1 - Untitled Y1 - 2004 SN - 0018-9294 ER - TY - JOUR A1 - Bordihn, Henning T1 - Context-freeness of the power of context-free languages is undecidable N2 - The power of a language L is the set of all powers of the words in L. In this paper, the following decision problem is investigated. Given a context-free language L, is the power of L context-free? We show that this problem is decidable for languages over unary alphabets, but it is undecidable whenever languages over alphabets with at least two letters are considered. (C) 2003 Elsevier B.V.
All rights reserved Y1 - 2004 SN - 0304-3975 ER - TY - JOUR A1 - Dornhege, Guido A1 - Blankertz, Benjamin A1 - Curio, Gabriel A1 - Müller, Klaus-Robert T1 - Boosting bit rates in noninvasive EEG single-trial classifications by feature combination and multiclass paradigms N2 - Noninvasive electroencephalogram (EEG) recordings provide for easy and safe access to human neocortical processes which can be exploited for a brain-computer interface (BCI). At present, however, the use of BCIs is severely limited by low bit-transfer rates. We systematically analyze and develop two recent concepts, both capable of enhancing the information gain from multichannel scalp EEG recordings: 1) the combination of classifiers, each specifically tailored for different physiological phenomena, e.g., slow cortical potential shifts, such as the premovement Bereitschaftspotential or differences in spatio-spectral distributions of brain activity (i.e., focal event-related desynchronizations) and 2) behavioral paradigms inducing the subjects to generate one out of several brain states (multiclass approach) which all bear a distinctive spatio-temporal signature well discriminable in the standard scalp EEG. We derive information-theoretic predictions and demonstrate their relevance in experimental data. We will show that a suitably arranged interaction between these concepts can significantly boost BCI performances Y1 - 2004 ER - TY - JOUR A1 - Blankertz, Benjamin A1 - Müller, Klaus-Robert A1 - Curio, Gabriel A1 - Vaughan, Theresa M. A1 - Schalk, Gerwin A1 - Wolpaw, Jonathan R. A1 - Schlogl, Alois A1 - Neuper, Christa A1 - Pfurtscheller, Gert A1 - Hinterberger, Thilo A1 - Schroder, Michael A1 - Birbaumer, Niels T1 - The BCI competition 2003 : Progress and perspectives in detection and discrimination of EEG single trials N2 - Interest in developing a new method of man-to-machine communication-a brain-computer interface (BCI)-has grown steadily over the past few decades. BCIs create a new communication channel between the brain and an output device by bypassing conventional motor output pathways of nerves and muscles. These systems use signals recorded from the scalp, the surface of the cortex, or from inside the brain to enable users to control a variety of applications including simple word-processing software and orthotics. BCI technology could therefore provide a new communication and control option for individuals who cannot otherwise express their wishes to the outside world. Signal processing and classification methods are essential tools in the development of improved BCI technology. We organized the BCI Competition 2003 to evaluate the current state of the art of these tools. Four laboratories well versed in EEG-based BCI research provided six data sets in a documented format. We made these data sets (i.e., labeled training sets and unlabeled test sets) and their descriptions available on the Internet. The goal in the competition was to maximize the performance measure for the test labels. Researchers worldwide tested their algorithms and competed for the best classification results. This paper describes the six data sets and the results and function of the most successful algorithms Y1 - 2004 SN - 0018-9294 ER - TY - JOUR A1 - Harmeling, Stefan A1 - Meinecke, Frank C. A1 - Müller, Klaus-Robert T1 - Injecting noise for analysing the stability of ICA components N2 - Usually, noise is considered to be destructive.
We present a new method that constructively injects noise to assess the reliability and the grouping structure of empirical ICA component estimates. Our method can be viewed as a Monte-Carlo-style approximation of the curvature of some performance measure at the solution. Simulations show that the true root-mean-squared angle distances between the real sources and the source estimates can be approximated well by our method. In a toy experiment, we see that we are also able to reveal the underlying grouping structure of the extracted ICA components. Furthermore, an experiment with fetal ECG data demonstrates that our approach is useful for exploratory data analysis of real-world data. (C) 2003 Elsevier B.V. All rights reserved Y1 - 2004 SN - 0165-1684 ER - TY - JOUR A1 - Goessel, Michael A1 - Chakrabarty, Krishnendu A1 - Ocheretnij, V. A1 - Leininger, Andreas T1 - A signature analysis technique for the identification of failing vectors with application to Scan-BIST N2 - We present a new technique for uniquely identifying a single failing vector in an interval of test vectors. This technique is applicable to combinational circuits and for scan-BIST in sequential circuits with multiple scan chains. The proposed method relies on the linearity properties of the MISR and on the use of two test sequences, which are both applied to the circuit under test. The second test sequence is derived from the first in a straightforward manner and the same test pattern source is used for both test sequences. If an interval contains only a single failing vector, the algebraic analysis is guaranteed to identify it. We also show analytically that if an interval contains two failing vectors, the probability that this case is interpreted as one failing vector is very low. We present experimental results for the ISCAS benchmark circuits to demonstrate the use of the proposed method for identifying failing test vectors Y1 - 2004 SN - 0923-8174 ER - TY - JOUR A1 - Müller, Klaus-Robert A1 - Vigario, R. A1 - Meinecke, Frank C. A1 - Ziehe, Andreas T1 - Blind source separation techniques for decomposing event-related brain signals N2 - Recently blind source separation (BSS) methods have been highly successful when applied to biomedical data. This paper reviews the concept of BSS and demonstrates its usefulness in the context of event-related MEG measurements. In a first experiment we apply BSS to artifact identification of raw MEG data and discuss how the quality of the resulting independent component projections can be evaluated. The second part of our study considers averaged data of event-related magnetic fields. Here, it is particularly important to monitor and thus avoid possible overfitting due to limited sample size. A stability assessment of the BSS decomposition allows us to solve this task and an additional grouping of the BSS components reveals interesting structure that could ultimately be used for gaining a better physiological modeling of the data Y1 - 2004 SN - 0218-1274 ER - TY - JOUR A1 - Sugiyama, Masashi A1 - Kawanabe, Motoaki A1 - Müller, Klaus-Robert T1 - Trading variance reduction with unbiasedness : the regularized subspace information criterion for robust model selection in kernel regression N2 - A well-known result by Stein (1956) shows that in particular situations, biased estimators can yield better parameter estimates than their generally preferred unbiased counterparts.
This letter follows the same spirit, as we will stabilize the unbiased generalization error estimates by regularization and finally obtain more robust model selection criteria for learning. We trade a small bias against a larger variance reduction, which has the beneficial effect of being more precise on a single training set. We focus on the subspace information criterion (SIC), which is an unbiased estimator of the expected generalization error measured by the reproducing kernel Hilbert space norm. SIC can be applied to the kernel regression, and it was shown in earlier experiments that a small regularization of SIC has a stabilization effect. However, it remained open how to appropriately determine the degree of regularization in SIC. In this article, we derive an unbiased estimator of the expected squared error between SIC and the expected generalization error, and propose determining the degree of regularization of SIC such that the estimator of the expected squared error is minimized. Computer simulations with artificial and real data sets illustrate that the proposed method works effectively for improving the precision of SIC, especially in the high-noise-level cases. We furthermore compare the proposed method to the original SIC, the cross-validation, and an empirical Bayesian method in ridge parameter selection, with good results Y1 - 2004 SN - 0899-7667 ER - TY - JOUR A1 - Ziehe, Andreas A1 - Kawanabe, Motoaki A1 - Harmeling, Stefan T1 - Blind separation of post-nonlinear mixtures using linearizing transformations and temporal decorrelation N2 - We propose two methods that reduce the post-nonlinear blind source separation problem (PNL-BSS) to a linear BSS problem. The first method is based on the concept of maximal correlation: we apply the alternating conditional expectation (ACE) algorithm-a powerful technique from nonparametric statistics-to approximately invert the componentwise nonlinear functions. The second method is a Gaussianizing transformation, which is motivated by the fact that linearly mixed signals before nonlinear transformation are approximately Gaussian distributed. This heuristic but simple and efficient procedure works as well as the ACE method. Using the framework provided by ACE, convergence can be proven. The optimal transformations obtained by ACE coincide with the sought-after inverse functions of the nonlinearities. After equalizing the nonlinearities, temporal decorrelation separation (TDSEP) allows us to recover the source signals.
Numerical simulations testing "ACE-TD" and "Gauss-TD" on realistic examples are performed with excellent results Y1 - 2004 SN - 1532-4435 ER - TY - JOUR A1 - Linke, Thomas T1 - Suitable graphs for answer set programming Y1 - 2003 UR - http://sunsite.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-78/ SN - 1613-0073 ER - TY - JOUR A1 - Linke, Thomas T1 - Using nested logic programs for answer set programming Y1 - 2003 UR - http://sunsite.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-78/ SN - 1613-0073 ER - TY - JOUR A1 - Anger, Christian A1 - Konczak, Kathrin A1 - Linke, Thomas T1 - NoMoRe: A system for non-monotonic reasoning with logic programs under answer set semantics Y1 - 2002 SN - 3-540-42254-4 ER - TY - JOUR A1 - Linke, Thomas A1 - Anger, Christian A1 - Konczak, Kathrin T1 - More on nomore Y1 - 2002 SN - 3-540-44190-5 ER - TY - JOUR A1 - Arnold, Holger T1 - A linearized DPLL calculus with learning N2 - This paper describes the proof calculus LD for clausal propositional logic, which is a linearized form of the well-known DPLL calculus extended by clause learning. It is motivated by the demand to model how current SAT solvers built on clause learning work, while abstracting from decision heuristics and implementation details. The calculus is proved sound and terminating. Further, it is shown that both the original DPLL calculus and the conflict-directed backtracking calculus with clause learning, as it is implemented in many current SAT solvers, are complete and proof-confluent instances of the LD calculus. N2 - Dieser Artikel beschreibt den Beweiskalkül LD für aussagenlogische Formeln in Klauselform. Dieser Kalkül ist eine um Klausellernen erweiterte linearisierte Variante des bekannten DPLL-Kalküls. Er soll dazu dienen, das Verhalten von auf Klausellernen basierenden SAT-Beweisern zu modellieren, wobei von Entscheidungsheuristiken und Implementierungsdetails abstrahiert werden soll. Es werden Korrektheit und Terminierung des Kalküls bewiesen. Weiterhin wird gezeigt, dass sowohl der ursprüngliche DPLL-Kalkül als auch der konfliktgesteuerte Rücksetzalgorithmus mit Klausellernen, wie er in vielen aktuellen SAT-Beweisern implementiert ist, vollständige und beweiskonfluente Spezialisierungen des LD-Kalküls sind. KW - SAT KW - DPLL KW - Klausellernen KW - Automatisches Beweisen KW - SAT KW - DPLL KW - Clause Learning KW - Automated Theorem Proving Y1 - 2007 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-15421 ER - TY - JOUR A1 - Frank, Mario T1 - Axiom relevance decision engine : technical report N2 - This document presents an axiom selection technique for classical first-order theorem proving based on the relevance of axioms for the proof of a conjecture. It is based on unifiability of predicates and does not need statistical information like symbol frequency. The scope of the technique is the reduction of the set of axioms and the increase of the number of provable conjectures in a given time. Since the technique generates a subset of the axiom set, it can be used as a preprocessor for automated theorem proving. This technical report describes the conception, implementation and evaluation of ARDE. The selection method, which is based on a breadth-first graph search by unifiability of predicates, is a weakened form of the connection calculus and uses specialised variants of unifiability to speed up the selection.
The implementation of the concept is evaluated in comparison with the results of the world championship of theorem provers of the year 2012 (CASC J6). It is shown that both the theorem prover leanCoP, which uses the connection calculus, and E, which uses equality reasoning, can benefit from the selection approach. Also, the evaluation shows that the concept is applicable to theorem-proving problems with thousands of formulae and that the selection is independent of the calculus used by the theorem prover. N2 - Dieser technische Report beschreibt die Konzeption, Implementierung und Evaluation eines Verfahrens zur Auswahl von logischen Formeln bezüglich derer Relevanz für den Beweis einer logischen Formel. Das Verfahren wird ausschließlich für die Prädikatenlogik erster Ordnung angewandt, wenngleich es auch für höherstufige Prädikatenlogiken geeignet ist. Das Verfahren nutzt eine unifikationsbasierte Breitensuche im Graphen wobei jeder Knoten im Graphen ein Prädikat und jede existierende Kante eine Unifizierbarkeitsrelation ist. Ziel des Verfahrens ist die Reduktion einer gegebenen Menge von Formeln auf eine für aktuelle Theorembeweiser handhabbare Größe. Daher ist das Verfahren als Präprozess-Schritt für das automatische Theorembeweisen geeignet. Zur Beschleunigung der Suche wird neben der Standard-Unifikation eine abgeschwächte Unifikation verwendet. Das System wurde während der Weltmeisterschaft der Theorembeweiser im Jahre 2014 (CASC J6) in Manchester zusammen mit dem Theorembeweiser leanCoP eingereicht und konnte leanCoP dabei unterstützen, Probleme zu lösen, die leanCoP alleine nicht handhaben kann. Die Tests mit leanCoP und dem Theorembeweiser E im Nachgang zu der Weltmeisterschaft zeigen, dass das Verfahren unabhängig von dem verwendeten Kalkül ist und bei beiden Theorembeweisern positive Auswirkungen auf die Beweisbarkeit von Problemen mit großen Formelmengen hat. KW - Relevanz KW - Graphensuche KW - Theorembeweisen KW - Preprocessing KW - Unifikation KW - relevance KW - graph-search KW - preprocessing KW - unification KW - theorem Y1 - 2012 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-72128 ER - TY - JOUR A1 - Knobelsdorf, Maria A1 - Kreitz, Christoph T1 - Ein konstruktivistischer Lehransatz für die Einführungsveranstaltung der Theoretischen Informatik JF - Commentarii informaticae didacticae : (CID) N2 - Ausgehend von einem sozial-konstruktivistischen Verständnis von Lernprozessen und unter der besonderen Berücksichtigung der durch die Bologna-Studienreform angeregten Kompetenzorientierung, haben wir in den letzten Jahren einen hochschuldidaktischen Ansatz für die Einführungsveranstaltung im Bereich der Theoretischen Informatik an der Universität Potsdam entwickelt und praktisch erprobt. Nach zahlreichen Experimenten und mit einer Durchfallquote von zuletzt 6% im Wintersemester 2011/2012 haben wir den Eindruck, dass der Ansatz den Studierenden jene Lernumgebung und -anregung bietet, die ihnen hilft, die entsprechenden Fachkompetenzen in der Veranstaltung zu entwickeln. In diesem Artikel stellen wir unseren Ansatz vor und skizzieren abschließend, wie wir diesen im nächsten Wintersemester empirisch evaluieren werden. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64845 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 21 EP - 32 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Göttel, Timo T1 - Schnupperveranstaltungen Informatik in der Hochschullandschaft BT - Angebot vs. Nachfrage?
JF - Commentarii informaticae didacticae : (CID) N2 - Die vorliegende Arbeit erörtert die Frage, wie Nachwuchs für das Informatikstudium nachhaltig gesichert werden kann. Dazu werden Befragungen unter Schülerinnen und Schülern (13-16 Jahre), sowie aktuelle Informatik-Schnupperangebote für Schülerinnen und Schüler an deutschsprachigen Hochschulen vorgestellt und untersucht. Diese Gegenüberstellung zeigt deutlich, dass die Angebote nur bedingt eine breite Zielgruppe ansprechen und dass weitere Formate und Inhalte notwendig sind, um Schülerinnen und Schüler frühzeitig und in voller Breite zu erreichen und für das Informatikstudium zu begeistern. Daraus wird abgeleitet, dass Missverständnisse und Probleme mit der Informatik im Schulkontext aufgegriffen werden müssen. Das vorgestellte Programm Schulbotschafter Informatik stellt einen möglichen Weg dar, um dies zu erreichen und übliche Schnupperangebote zu ergänzen. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64860 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 45 EP - 55 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Götz, Christian A1 - Brinda, Torsten T1 - Sind soziale Netzwerke geeignet, um darin für Informatikstudiengänge zu werben? JF - Commentarii informaticae didacticae : (CID) N2 - Durch den bundesweiten Rückgang der Schülerzahlen und einer steigenden Zahl von Bildungsangeboten geraten Universitäten und Hochschulen in den nächsten Jahren weiter in eine Wettbewerbssituation, weshalb sie effektive Marketingmaßnahmen entwickeln müssen, um Schülerinnen und Schüler möglichst frühzeitig für das jeweilige Angebot (z. B. Informatik- und informatiknahe Studiengänge) zu interessieren. Ein Medium, über das sich potenziell sehr viele Jugendliche erreichen lassen, sind dabei soziale Netzwerke. Diese Arbeit präsentiert Ergebnisse einer Studie unter Informatikstudienanfängerinnen und -anfängern zum Nutzungsverhalten sozialer Netzwerke und zieht Schlussfolgerungen zu deren Eignung als Werbe- und Informationskanal für die Zielgruppe der Informatikinteressierten. Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-65017 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 137 EP - 142 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Apel, Rebecca A1 - Berg, Tobias A1 - Bergner, Nadine A1 - Chatti, Mhamed Amine A1 - Holz, Jan A1 - Leicht-Scholten, Carmen A1 - Schroeder, Ulrike T1 - Ein vierstufiges Förderkonzept für die Studieneingangsphase in der Informatik JF - Commentarii informaticae didacticae : (CID) N2 - Es wird ein vierstufiges Förderkonzept für die Studieneingangsphase im Fach Informatik beschrieben, das derzeit im Rahmen des Projekts IGaDtools4MINT an der RWTH Aachen auf der Basis einer Literaturanalyse und eines daraus abgeleiteten Indikatorenkatalogs entwickelt wird. 
Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-65025 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 143 EP - 148 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Schirmer, Ingrid A1 - Rick, Detlef T1 - Persönlichkeitsbildung und informatische Professionalisierung BT - ethische Kompetenz als Grundlage nachhaltiger Entscheidungen JF - Commentarii informaticae didacticae : (CID) Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-65053 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 160 EP - 169 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Döllner, Jürgen Roland Friedrich T1 - Geospatial digital rights management in geovisualization N2 - Geovisualization offers powerful tools, techniques, and strategies to present, explore, analyze, and manage geoinformation. Interactive geovirtual environments such as virtual 3D maps or virtual 3D city models, however, raise the question of how to control geodata usage and distribution. We present a concept for embedding digital rights in geovisualizations. It is based on geo-documents, an object-oriented scheme to specify a wide range of geovisualizations. Geo-documents are assembled by building blocks categorized into presentation, structure, interaction, animation, and Digital Rights Management (DRM) classes. DRM objects allow for defining permissions and constraints for all objects contained in geo-documents. In this way, authors of geovisualizations can control how their geo-documents are used, personalized, and redistributed by users. The strengths of the presented concept include the ability to integrate heterogeneous 2D and 3D geodata within a compact design scheme and the ability to cope with privacy, security, and copyright issues. Embedded digital rights in geovisualizations can be applied to improve the usability of geodata user interfaces, to implement publisher-subscriber communication systems for geodata, and to establish business models for geodata trading systems Y1 - 2005 SN - 0008-7041 ER - TY - JOUR A1 - Nienhaus, Marc A1 - Döllner, Jürgen Roland Friedrich T1 - Depicting dynamics using principles of visual art and narrations Y1 - 2005 SN - 0272-1716 ER - TY - JOUR A1 - Delikostidis, Ioannis A1 - Engel, Juri A1 - Retsios, Bas A1 - van Elzakker, Corne P. J. M. A1 - Kraak, Menno-Jan A1 - Döllner, Jürgen Roland Friedrich T1 - Increasing the usability of pedestrian navigation interfaces by means of landmark visibility analysis JF - The journal of navigation N2 - Communicating location-specific information to pedestrians is a challenging task which can be aided by user-friendly digital technologies. In this paper, landmark visibility analysis, as a means for developing more usable pedestrian navigation systems, is discussed. Using an algorithmic framework for image-based 3D analysis, this method integrates a 3D city model with identified landmarks and produces raster visibility layers for each one. This output enables an Android phone prototype application to indicate the visibility of landmarks from the user's actual position. Tested in the field, the method achieves sufficient accuracy for the context of use and improves navigation efficiency and effectiveness. KW - Pedestrian navigation KW - Landmark visibility KW - User-centred design KW - Usability testing Y1 - 2013 U6 - https://doi.org/10.1017/S0373463313000209 SN - 0373-4633 VL - 66 IS - 4 SP - 523 EP - 537 PB - Cambridge Univ.
Press CY - New York ER - TY - JOUR A1 - Richter, Rico A1 - Kyprianidis, Jan Eric A1 - Döllner, Jürgen Roland Friedrich T1 - Out-of-core GPU-based change detection in massive 3D point clouds JF - Transactions in GIS N2 - If sites, cities, and landscapes are captured at different points in time using technology such as LiDAR, large collections of 3D point clouds result. Their efficient storage, processing, analysis, and presentation constitute a challenging task because of limited computation, memory, and time resources. In this work, we present an approach to detect changes in massive 3D point clouds based on an out-of-core spatial data structure that is designed to store data acquired at different points in time and to efficiently attribute 3D points with distance information. Based on this data structure, we present and evaluate different processing schemes optimized for performing the calculation on the CPU and GPU. In addition, we present a point-based rendering technique adapted for attributed 3D point clouds, to enable effective out-of-core real-time visualization of the computation results. Our approach enables conclusions to be drawn about temporal changes in large highly accurate 3D geodata sets of a captured area at reasonable preprocessing and rendering times. We evaluate our approach with two data sets from different points in time for the urban area of a city, describe its characteristics, and report on applications. Y1 - 2013 U6 - https://doi.org/10.1111/j.1467-9671.2012.01362.x SN - 1361-1682 VL - 17 IS - 5 SP - 724 EP - 741 PB - Wiley-Blackwell CY - Hoboken ER - TY - JOUR A1 - Paredes, E. G. A1 - Boo, M. A1 - Amor, M. A1 - Bruguera, J. D. A1 - Döllner, Jürgen Roland Friedrich T1 - Extended hybrid meshing algorithm for multiresolution terrain models JF - International journal of geographical information science N2 - Hybrid terrains are a convenient approach for the representation of digital terrain models, integrating heterogeneous data from different sources. In this article, we present a general, efficient scheme for achieving interactive level-of-detail rendering of hybrid terrain models, without the need for a costly preprocessing or resampling of the original data. The presented method works with hybrid digital terrains combining regular grid data and local high-resolution triangulated irregular networks. Since grid and triangulated irregular network data may belong to different datasets, a straightforward combination of both geometries would lead to meshes with holes and overlapping triangles. Our method generates a single multiresolution model integrating the different parts in a coherent way, by performing an adaptive tessellation of the region between their boundaries. Hence, our solution is one of the few existing approaches for integrating different multiresolution algorithms within the same terrain model, achieving a simple interactive rendering of complex hybrid terrains. KW - 3D modeling KW - 3D visualization KW - geovisualization KW - triangulated irregular networks Y1 - 2012 U6 - https://doi.org/10.1080/13658816.2011.615317 SN - 1365-8816 VL - 26 IS - 5 SP - 771 EP - 793 PB - Routledge, Taylor & Francis Group CY - Abingdon ER - TY - JOUR A1 - Hecher, Markus T1 - Treewidth-aware reductions of normal ASP to SAT BT - is normal ASP harder than SAT after all? JF - Artificial intelligence N2 - Answer Set Programming (ASP) is a paradigm for modeling and solving problems for knowledge representation and reasoning. 
There are plenty of results dedicated to studying the hardness of (fragments of) ASP. So far, these studies resulted in characterizations in terms of computational complexity as well as in fine-grained insights presented in the form of dichotomy-style results, lower bounds when translating to other formalisms like propositional satisfiability (SAT), and even detailed parameterized complexity landscapes. A generic parameter in parameterized complexity originating from graph theory is the so-called treewidth, which in a sense captures structural density of a program. Recently, there was an increase in the number of treewidth-based solvers related to SAT. While there are translations from (normal) ASP to SAT, no reduction that preserves treewidth or at least keeps track of the treewidth increase is known. In this paper we propose a novel reduction from normal ASP to SAT that is aware of the treewidth, and guarantees that a slight increase of treewidth is indeed sufficient. Further, we show a new result establishing that, when considering treewidth, already the fragment of normal ASP is slightly harder than SAT (under reasonable assumptions in computational complexity). This also confirms that our reduction probably cannot be significantly improved and that the slight increase of treewidth is unavoidable. Finally, we present an empirical study of our novel reduction from normal ASP to SAT, where we compare treewidth upper bounds that are obtained via known decomposition heuristics. Overall, our reduction works better with these heuristics than existing translations. (c) 2021 Elsevier B.V. All rights reserved. KW - Answer set programming KW - Treewidth KW - Parameterized complexity KW - Complexity analysis KW - Tree decomposition KW - Treewidth-aware reductions Y1 - 2022 U6 - https://doi.org/10.1016/j.artint.2021.103651 SN - 0004-3702 SN - 1872-7921 VL - 304 PB - Elsevier CY - Amsterdam ER -