TY - JOUR
A1 - Schaub, Torsten
T1 - Antwortmengenprogrammierung
Y1 - 2003
ER -
TY - JOUR
A1 - Holz, Jan
A1 - Berger, Nadine
A1 - Schroeder, Ulrike
T1 - Anwendungsorientierte Gestaltung eines Informatik-Vorkurses als Studienmotivator
JF - Commentarii informaticae didacticae : (CID)
N2 - To support students entering university, a novel and motivating introduction to the preparatory computer science course was developed at RWTH Aachen and piloted in the winter semester of 2011/12. It introduced graphical programming with App Inventor, which was used to implement application-oriented projects. This paper describes the motivation for the redesign, the concept, and the evaluation of the trial run. These serve as the basis for a complete redesign of the preparatory course for the winter semester of 2012/2013.
Y1 - 2013
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64871
SN - 1868-0844
SN - 2191-1940
IS - 5
SP - 56
EP - 66
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Steinert, Bastian
A1 - Hirschfeld, Robert
T1 - Applying design knowledge to programming
Y1 - 2012
ER -
TY - JOUR
A1 - Anger, Christian
A1 - Gebser, Martin
A1 - Schaub, Torsten
T1 - Approaching the core of unfounded sets
Y1 - 2006
UR - http://www.cs.uni-potsdam.de/wv/pdfformat/angesc06a.pdf
ER -
TY - JOUR
A1 - Raimer, Stephan
T1 - Aquadrohne, Messdatenerfassung und Co.
BT - Interdisziplinäres Projektmanagement als Teil des Wirtschaftsinformatikstudiums
JF - Commentarii informaticae didacticae : (CID)
N2 - Companies across a wide range of industries regard and demand project management competencies with growing priority. As a contribution to competence-oriented education, this paper presents interdisciplinary study modules as part of the business informatics degree programme. The goal of these modules is to enable students to plan and carry out concrete projects using standardized tools and methods according to the IPMA standard.
Y1 - 2010
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-64345
SN - 1868-0844
SN - 2191-1940
IS - 4
SP - 59
EP - 64
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Schneidenbach, Lars
A1 - Schnor, Bettina
A1 - Petri, Stefan
T1 - Architecture and Implementation of the Socket Interface on Top of GAMMA
Y1 - 2003
SN - 0-7695-2037-5
ER -
TY - JOUR
A1 - Steinert, Fritjof
A1 - Stabernack, Benno
T1 - Architecture of a low latency H.264/AVC video codec for robust ML based image classification how region of interests can minimize the impact of coding artifacts
JF - Journal of Signal Processing Systems for Signal, Image, and Video Technology
N2 - The use of neural networks is considered as the state of the art in the field of image classification. A large number of different networks are available for this purpose, which, appropriately trained, permit a high level of classification accuracy. Typically, these networks are applied to uncompressed image data, since a corresponding training was also carried out using image data of similar high quality. However, if image data contains image errors, the classification accuracy deteriorates drastically. This applies in particular to coding artifacts which occur due to image and video compression.
Typical application scenarios for video compression are narrowband transmission channels for which video coding is required but a subsequent classification is to be carried out on the receiver side. In this paper we present a special H.264/Advanced Video Codec (AVC) based video codec that allows certain regions of a picture to be coded with near constant picture quality in order to allow a reliable classification using neural networks, whereas the remaining image will be coded using constant bit rate. We have combined this feature with the ability to run with lowest latency properties, which is usually also required in remote control application scenarios. The codec has been implemented as a fully hardwired High Definition video capable hardware architecture which is suitable for Field Programmable Gate Arrays.
KW - H.264
KW - Advanced Video Codec (AVC)
KW - Low Latency
KW - Region of Interest
KW - Machine Learning
KW - Inference
KW - FPGA
KW - Hardware accelerator
Y1 - 2022
U6 - https://doi.org/10.1007/s11265-021-01727-2
SN - 1939-8018
SN - 1939-8115
VL - 94
IS - 7
SP - 693
EP - 708
PB - Springer
CY - New York
ER -
TY - JOUR
A1 - Reinke, Thomas
T1 - Architecture-based construction of multiagent systems
Y1 - 2000
SN - 1-58603-013-2
ER -
TY - JOUR
A1 - Wang, Kewen
T1 - Argumentation-based abduction in disjunctive logic programming
Y1 - 2000
ER -
TY - JOUR
A1 - Ziehe, Andreas
A1 - Müller, Klaus-Robert
A1 - Nolte, G.
A1 - Mackert, B.-M.
A1 - Curio, Gabriel
T1 - Artifact reduction in magnetoneurography based on time-delayed second-order correlations
Y1 - 2000
ER -
TY - JOUR
A1 - Ostrowski, Max
A1 - Schaub, Torsten
T1 - ASP modulo CSP: The clingcon system
JF - Theory and practice of logic programming
N2 - We present the hybrid ASP solver clingcon, combining the simple modeling language and the high performance Boolean solving capacities of Answer Set Programming (ASP) with techniques for using non-Boolean constraints from the area of Constraint Programming (CP). The new clingcon system features an extended syntax supporting global constraints and optimize statements for constraint variables. The major technical innovation improves the interaction between ASP and CP solver through elaborated learning techniques based on irreducible inconsistent sets. A broad empirical evaluation shows that these techniques yield a performance improvement of an order of magnitude.
Y1 - 2012
U6 - https://doi.org/10.1017/S1471068412000142
SN - 1471-0684
VL - 12
SP - 485
EP - 503
PB - Cambridge Univ. Press
CY - New York
ER -
TY - JOUR
A1 - Hoos, Holger
A1 - Kaminski, Roland
A1 - Lindauer, Marius
A1 - Schaub, Torsten
T1 - aspeed: Solver scheduling via answer set programming
JF - Theory and practice of logic programming
N2 - Although Boolean Constraint Technology has made tremendous progress over the last decade, the efficacy of state-of-the-art solvers is known to vary considerably across different types of problem instances, and is known to depend strongly on algorithm parameters. This problem was addressed by means of a simple, yet effective approach using handmade, uniform, and unordered schedules of multiple solvers in ppfolio, which showed very impressive performance in the 2011 Satisfiability Testing (SAT) Competition. Inspired by this, we take advantage of the modeling and solving capacities of Answer Set Programming (ASP) to automatically determine more refined, that is, nonuniform and ordered solver schedules from the existing benchmarking data.
We begin by formulating the determination of such schedules as multi-criteria optimization problems and provide corresponding ASP encodings. The resulting encodings are easily customizable for different settings, and the computation of optimum schedules can mostly be done in the blink of an eye, even when dealing with large runtime data sets stemming from many solvers on hundreds to thousands of instances. Also, the fact that our approach can be customized easily enabled us to swiftly adapt it to generate parallel schedules for multi-processor machines.
KW - algorithm schedules
KW - answer set programming
KW - portfolio-based solving
Y1 - 2015
U6 - https://doi.org/10.1017/S1471068414000015
SN - 1471-0684
SN - 1475-3081
VL - 15
SP - 117
EP - 142
PB - Cambridge Univ. Press
CY - New York
ER -
TY - JOUR
A1 - Ohrndorf, Laura
T1 - Assignments in Computer Science Education
BT - Results of an Analysis of Textbooks, Curricula and other Resources
JF - KEYCIT 2014 - Key Competencies in Informatics and ICT
N2 - In this paper we describe the current state of our research project concerning computer science teachers’ knowledge on students’ cognition. We did a comprehensive analysis of textbooks, curricula and other resources, which give teachers guidance to formulate assignments. In comparison to other subjects there are only a few concepts and strategies taught to prospective computer science teachers in university. We summarize them and give an overview of our empirical approach to measure this knowledge.
KW - Pedagogical content knowledge
KW - computer science teachers
KW - students’ knowledge
KW - students’ conceptions
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-82868
SN - 1868-0844
SN - 2191-1940
IS - 7
SP - 327
EP - 333
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Häger, Sebastian
A1 - Schubert, Wolfgang
T1 - Assoziationen in Softwarearchitekturen
JF - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2005
SN - 0946-7580
VL - 2005, 2
PB - Univ.
CY - Potsdam
ER -
TY - JOUR
A1 - Jörges, Sven
A1 - Margaria, Tiziana
A1 - Steffen, Bernhard
T1 - Assuring property conformance of code generators via model checking
JF - Formal aspects of computing : the international journal of formal methods
N2 - Automatic code generation is an essential cornerstone of today's model-driven approaches to software engineering. Thus a key requirement for the success of this technique is the reliability and correctness of code generators. This article describes how we employ standard model checking-based verification to check that code generator models developed within our code generation framework Genesys conform to (temporal) properties. Genesys is a graphical framework for the high-level construction of code generators on the basis of an extensible library of well-defined building blocks along the lines of the Extreme Model-Driven Development paradigm. We will illustrate our verification approach by examining complex constraints for code generators, which even span entire model hierarchies. We also show how this leads to a knowledge base of rules for code generators, which we constantly extend by, e.g., combining constraints into bigger constraints, or by deriving common patterns from structurally similar constraints.
In our experience, the development of code generators with Genesys boils down to re-instantiating patterns or slightly modifying the graphical process model, activities which are strongly supported by verification facilities presented in this article.
KW - Extreme Model-Driven Development
KW - Code generation
KW - Model checking
KW - Verification
Y1 - 2011
U6 - https://doi.org/10.1007/s00165-010-0169-9
SN - 0934-5043
VL - 23
IS - 5
SP - 589
EP - 606
PB - Springer
CY - New York
ER -
TY - JOUR
A1 - Chatterjee, M.
A1 - Pradhan, D. K.
A1 - Kunz, Wolfgang
T1 - ATPG-based Transformations for random-pattern testable logic synthesis
Y1 - 1995
SN - 0-8186-7213-7
SN - 0-8186-7214-5
SN - 0-8186-7215-3
ER -
TY - JOUR
A1 - Bieniusa, Annette
A1 - Degen, Markus
A1 - Heidegger, Phillip
A1 - Thiemann, Peter
A1 - Wehr, Stefan
A1 - Gasbichler, Martin
A1 - Crestani, Marcus
A1 - Klaeren, Herbert
A1 - Knauel, Eric
A1 - Sperber, Michael
T1 - Auf dem Weg zu einer robusten Programmierausbildung
JF - Commentarii informaticae didacticae : (CID)
N2 - Successfully running the course „Informatik I – Einführung in die Programmierung“ (Computer Science I – Introduction to Programming) is difficult, despite a wealth of existing materials and proven didactic methods. Precisely because of this wide variety, no robust concept has yet established itself that guarantees a high success rate independently of the instructors. At the universities of Tübingen and Freiburg, Informatik I was taught from the same teaching materials and under similar conditions in order to test the robustness of the concept used. The course is based on a systematic approach to learning programming developed by the PLT group in the USA, complemented by new approaches to student support, in particular supervised programming, in which students develop a solid basis for their programming skills. This report describes the experiences gathered, explains how the teaching methodology and the selection of content evolved in comparison to previous courses, and presents data on the success of the course.
KW - Informatik
KW - Ausbildung
KW - Didaktik
KW - Hochschuldidaktik
Y1 - 2009
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-29655
SN - 1868-0844
SN - 2191-1940
IS - 1
SP - 67
EP - 79
ER -
TY - JOUR
A1 - Ishebabi, Harold
A1 - Bobda, Christophe
T1 - Automated architecture synthesis for parallel programs on FPGA multiprocessor systems
N2 - This paper presents a concept for automated architecture synthesis for adaptive multiprocessors on chip, in particular for Field-Programmable Gate-Array (FPGA) devices. Given a parallel program, the intent is to simultaneously allocate processor resources and the corresponding communication network, and at the same time, to map the parallel application to get an optimum application-specific architecture. This approach builds upon a previously proposed design platform that automates system integration and FPGA synthesis for such architectures. As a result, the overall concept offers an automated design approach from application mapping to system and FPGA configuration. The automated synthesis is based on combinatorial optimization. Automation is possible because a solvable Integer Linear Programming (ILP) model that captures all necessary design trade-off parameters of such systems has been found.
Experimental results to study the feasibility of the automated synthesis indicate that problems of the sizes encountered in the embedded domain can be readily solved. The results obtained underscore the need for automated synthesis for design space exploration.
Y1 - 2009
UR - http://www.sciencedirect.com/science/journal/01419331
U6 - https://doi.org/10.1016/j.micpro.2008.08.009
SN - 0141-9331
ER -
TY - JOUR
A1 - Lindauer, Marius
A1 - Hoos, Holger
A1 - Leyton-Brown, Kevin
A1 - Schaub, Torsten
T1 - Automatic construction of parallel portfolios via algorithm configuration
JF - Artificial intelligence
N2 - Since 2004, increases in computational power described by Moore's law have substantially been realized in the form of additional cores rather than through faster clock speeds. To make effective use of modern hardware when solving hard computational problems, it is therefore necessary to employ parallel solution strategies. In this work, we demonstrate how effective parallel solvers for propositional satisfiability (SAT), one of the most widely studied NP-complete problems, can be produced automatically from any existing sequential, highly parametric SAT solver. Our Automatic Construction of Parallel Portfolios (ACPP) approach uses an automatic algorithm configuration procedure to identify a set of configurations that perform well when executed in parallel. Applied to two prominent SAT solvers, Lingeling and clasp, our ACPP procedure identified 8-core solvers that significantly outperformed their sequential counterparts on a diverse set of instances from the application and hard combinatorial category of the 2012 SAT Challenge. We further extended our ACPP approach to produce parallel portfolio solvers consisting of several different solvers by combining their configuration spaces. Applied to the component solvers of the 2012 SAT Challenge gold medal winning SAT Solver pfolioUZK, our ACPP procedures produced a significantly better-performing parallel SAT solver.
KW - Algorithm configuration
KW - Parallel SAT solving
KW - Algorithm portfolios
KW - Programming by optimization
KW - Automated parallelization
Y1 - 2016
U6 - https://doi.org/10.1016/j.artint.2016.05.004
SN - 0004-3702
SN - 1872-7921
VL - 244
SP - 272
EP - 290
PB - Elsevier
CY - Amsterdam
ER -
TY - JOUR
A1 - Schmidt, Alexander
T1 - Automatic extraction of locking protocols
Y1 - 2010
SN - 978-3-86956-036-6
ER -
TY - JOUR
A1 - Durzinsky, Markus
A1 - Marwan, Wolfgang
A1 - Ostrowski, Max
A1 - Schaub, Torsten
A1 - Wagler, Annegret
T1 - Automatic network reconstruction using ASP
JF - Theory and practice of logic programming
N2 - Building biological models by inferring functional dependencies from experimental data is an important issue in Molecular Biology. To relieve the biologist from this traditionally manual process, various approaches have been proposed to increase the degree of automation. However, available approaches often yield a single model only, rely on specific assumptions, and/or use dedicated, heuristic algorithms that are intolerant to changing circumstances or requirements in the view of the rapid progress made in Biotechnology. Our aim is to provide a declarative solution to the problem by appeal to Answer Set Programming (ASP) overcoming these difficulties. We build upon an existing approach to Automatic Network Reconstruction proposed by some of the authors.
This approach has firm mathematical foundations and is well suited for ASP due to its combinatorial flavor providing a characterization of all models explaining a set of experiments. The usage of ASP has several benefits over the existing heuristic algorithms. First, it is declarative and thus transparent for biological experts. Second, it is elaboration tolerant and thus allows for an easy exploration and incorporation of biological constraints. Third, it allows for exploring the entire space of possible models. Finally, our approach offers an excellent performance, matching existing, special-purpose systems.
Y1 - 2011
U6 - https://doi.org/10.1017/S1471068411000287
SN - 1471-0684
VL - 11
SP - 749
EP - 766
PB - Cambridge Univ. Press
CY - New York
ER -
TY - JOUR
A1 - Damnik, Gregor
A1 - Gierl, Mark
A1 - Proske, Antje
A1 - Körndle, Hermann
A1 - Narciss, Susanne
T1 - Automatische Erzeugung von Aufgaben als Mittel zur Erhöhung von Interaktivität und Adaptivität in digitalen Lernressourcen
JF - E-Learning Symposium 2018
N2 - So far, digital media mainly contain content in various forms of presentation. By itself, this adds little value over classical learning resources, since the criteria of interactivity and adaptivity are not taken into account. Incorporating them, however, often fails because of the authoring effort involved. This paper shows how the automatic generation of tasks enables high-quality knowledge acquisition with digital media. Furthermore, advantages and disadvantages of the automatic creation of tasks are discussed.
KW - Lernaufgaben
KW - Adaptivität
KW - Interaktivität
KW - digitale Medien
KW - Automatic Item Generation
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-421842
SP - 5
EP - 16
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Brüning, Stefan
A1 - Schaub, Torsten
T1 - Avoiding non-ground variables
Y1 - 1999
SN - 3-540-66131-x
ER -
TY - JOUR
A1 - Frank, Mario
T1 - Axiom relevance decision engine : technical report
N2 - This document presents an axiom selection technique for classic first order theorem proving based on the relevance of axioms for the proof of a conjecture. It is based on unifiability of predicates and does not need statistical information like symbol frequency. The scope of the technique is the reduction of the set of axioms and the increase of the amount of provable conjectures in a given time. Since the technique generates a subset of the axiom set, it can be used as a preprocessor for automated theorem proving. This technical report describes the conception, implementation and evaluation of ARDE. The selection method, which is based on a breadth-first graph search by unifiability of predicates, is a weakened form of the connection calculus and uses specialised variants of unifiability to speed up the selection. The implementation of the concept is evaluated by comparison with the results of the world championship of theorem provers of the year 2012 (CASC J6). It is shown that both the theorem prover leanCoP, which uses the connection calculus, and E, which uses equality reasoning, can benefit from the selection approach. Also, the evaluation shows that the concept is applicable for theorem proving problems with thousands of formulae and that the selection is independent of the calculus used by the theorem prover.
N2 - This technical report describes the conception, implementation and evaluation of a method for selecting logical formulae with respect to their relevance for the proof of a logical formula. The method is applied exclusively to first-order predicate logic, although it is also suitable for higher-order predicate logics. It uses a unification-based breadth-first search on a graph in which every node is a predicate and every existing edge is a unifiability relation. The goal of the method is to reduce a given set of formulae to a size that current theorem provers can handle, which makes it suitable as a preprocessing step for automated theorem proving. To speed up the search, a weakened form of unification is used in addition to standard unification. The system was submitted together with the theorem prover leanCoP to the world championship of theorem provers (CASC J6) in Manchester in 2012 and helped leanCoP to solve problems that leanCoP cannot handle on its own. The tests with leanCoP and the theorem prover E after the championship show that the method is independent of the calculus used by the theorem prover and has a positive effect on the provability of problems with large sets of formulae for both provers.
KW - Relevanz
KW - Graphensuche
KW - Theorembeweisen
KW - Preprocessing
KW - Unifikation
KW - relevance
KW - graph-search
KW - preprocessing
KW - unification
KW - theorem
Y1 - 2012
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-72128
ER -
TY - JOUR
A1 - Fichte, Johannes Klaus
A1 - Szeider, Stefan
T1 - Backdoors to tractable answer set programming
JF - Artificial intelligence
N2 - Answer Set Programming (ASP) is an increasingly popular framework for declarative programming that admits the description of problems by means of rules and constraints that form a disjunctive logic program. In particular, many AI problems such as reasoning in a nonmonotonic setting can be directly formulated in ASP. Although the main problems of ASP are of high computational complexity, complete for the second level of the Polynomial Hierarchy, several restrictions of ASP have been identified in the literature, under which ASP problems become tractable. In this paper we use the concept of backdoors to identify new restrictions that make ASP problems tractable. Small backdoors are sets of atoms that represent "clever reasoning shortcuts" through the search space and represent a hidden structure in the problem input. The concept of backdoors is widely used in theoretical investigations in the areas of propositional satisfiability and constraint satisfaction. We show that it can be fruitfully adapted to ASP. We demonstrate how backdoors can serve as a unifying framework that accommodates several tractable restrictions of ASP known from the literature. Furthermore, we show how backdoors allow us to deploy recent algorithmic results from parameterized complexity theory to the domain of answer set programming. (C) 2015 Elsevier B.V. All rights reserved.
KW - Answer set programming
KW - Backdoors
KW - Computational complexity
KW - Parameterized complexity
KW - Kernelization
Y1 - 2015
U6 - https://doi.org/10.1016/j.artint.2014.12.001
SN - 0004-3702
SN - 1872-7921
VL - 220
SP - 64
EP - 103
PB - Elsevier
CY - Amsterdam
ER -
TY - JOUR
A1 - Haider, Peter
A1 - Scheffer, Tobias
T1 - Bayesian clustering for email campaign detection
Y1 - 2009
SN - 978-1-605-58516-1
ER -
TY - JOUR
A1 - Bergner, Nadine
A1 - Taraschewski, Christian
A1 - Schroeder, Ulrik
ED - Schubert, Sigrid
ED - Schwill, Andreas
T1 - Beispiel eines Schülerwettbewerbs zum Thema Projektmanagement und App-Programmierung
JF - HDI 2014 : Gestalten von Übergängen
N2 - We describe a computer science competition for upper secondary school students that introduces, over several weeks and as realistically as possible, the working world of a computer scientist. In the competition, student teams develop an Android app and organize its development using project management methods based on professional, agile processes. The paper presents the theoretical background on competitions, the organizational and didactic decisions, a first evaluation, as well as a reflection and outlook.
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-84726
VL - 2015
IS - 9
SP - 161
EP - 168
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Lang, Jérôme
A1 - Schaub, Torsten
T1 - Belief change based on global minimisation
Y1 - 2007
ER -
TY - JOUR
A1 - Schwill, Andreas
T1 - Bericht zur Arbeitsgruppe "Modellbildung und fächerübergreifender Unterricht"
Y1 - 2000
SN - 3-88120-314-1
ER -
TY - JOUR
A1 - Schulte, Carsten
T1 - Biographisches Lernen in der Informatik
JF - Commentarii informaticae didacticae : (CID)
N2 - Biographical learning emphasizes in particular the role of individual biographical experiences and their effects on self-image, world view and patterns of behavior. In a nutshell, this perspective can be described as the difference between 'learning computer science' and 'becoming a computer scientist'. The article sketches the perspective of biographical learning using examples from computer science. In computer science, biographical learning is relevant first of all for purely pragmatic reasons: the rapid change of information technologies in everyday life changes the experiential backgrounds of university students (and school students). Accordingly, expectations, interests, prior knowledge, general attitudes or, quite simply, the 'IT equipment' of the learners change as well.
KW - Informatik
KW - Ausbildung
KW - Didaktik
KW - Hochschuldidaktik
Y1 - 2009
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-29649
SN - 1868-0844
SN - 2191-1940
IS - 1
SP - 47
EP - 63
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Cordes, Frank
A1 - Kaiser, Rolf
A1 - Selbig, Joachim
T1 - Bioinformatics approach to predicting HIV drug resistance
N2 - The emergence of drug resistance remains one of the most challenging issues in the treatment of HIV-1 infection. The extreme replication dynamics of HIV facilitates its escape from the selective pressure exerted by the human immune system and by the applied combination drug therapy. This article reviews computational methods whose combined use can support the design of optimal antiretroviral therapies based on viral genotypic and phenotypic data.
Genotypic assays are based on the analysis of mutations associated with reduced drug susceptibility, but are difficult to interpret due to the numerous mutations and mutational patterns that confer drug resistance. Phenotypic resistance or susceptibility can be experimentally evaluated by measuring the inhibition of the viral replication in cell culture assays. However, this procedure is expensive and time-consuming.
Y1 - 2006
UR - http://www.expert-reviews.com/loi/erm
U6 - https://doi.org/10.1586/14737159.6.2.207
SN - 1473-7159
ER -
TY - JOUR
A1 - Bogue, Ted
A1 - Jürgensen, Helmut
A1 - Gössel, Michael
T1 - BIST with negligible aliasing through random cover circuits
Y1 - 1995
ER -
TY - JOUR
A1 - Ziehe, Andreas
A1 - Kawanabe, Motoaki
A1 - Harmeling, Stefan
T1 - Blind separation of post-nonlinear mixtures using linearizing transformations and temporal decorrelation
N2 - We propose two methods that reduce the post-nonlinear blind source separation problem (PNL-BSS) to a linear BSS problem. The first method is based on the concept of maximal correlation: we apply the alternating conditional expectation (ACE) algorithm, a powerful technique from nonparametric statistics, to approximately invert the componentwise nonlinear functions. The second method is a Gaussianizing transformation, which is motivated by the fact that linearly mixed signals before nonlinear transformation are approximately Gaussian distributed. This heuristic but simple and efficient procedure works as well as the ACE method. Using the framework provided by ACE, convergence can be proven. The optimal transformations obtained by ACE coincide with the sought-after inverse functions of the nonlinearities. After equalizing the nonlinearities, temporal decorrelation separation (TDSEP) allows us to recover the source signals. Numerical simulations testing "ACE-TD" and "Gauss-TD" on realistic examples are performed with excellent results.
Y1 - 2004
SN - 1532-4435
ER -
TY - JOUR
A1 - Müller, Klaus-Robert
A1 - Vigario, R.
A1 - Meinecke, Frank C.
A1 - Ziehe, Andreas
T1 - Blind source separation techniques for decomposing event-related brain signals
N2 - Recently blind source separation (BSS) methods have been highly successful when applied to biomedical data. This paper reviews the concept of BSS and demonstrates its usefulness in the context of event-related MEG measurements. In a first experiment we apply BSS to artifact identification of raw MEG data and discuss how the quality of the resulting independent component projections can be evaluated. The second part of our study considers averaged data of event-related magnetic fields. Here, it is particularly important to monitor and thus avoid possible overfitting due to limited sample size. A stability assessment of the BSS decomposition allows us to solve this task, and an additional grouping of the BSS components reveals interesting structure that could ultimately be used for gaining a better physiological modeling of the data.
Y1 - 2004
SN - 0218-1274
ER -
TY - JOUR
A1 - Dornhege, Guido
A1 - Blankertz, Benjamin
A1 - Curio, Gabriel
A1 - Müller, Klaus-Robert
T1 - Boosting bit rates in noninvasive EEG single-trial classifications by feature combination and multiclass paradigms
N2 - Noninvasive electroencephalogram (EEG) recordings provide for easy and safe access to human neocortical processes which can be exploited for a brain-computer interface (BCI). At present, however, the use of BCIs is severely limited by low bit-transfer rates.
We systematically analyze and develop two recent concepts, both capable of enhancing the information gain from multichannel scalp EEG recordings: 1) the combination of classifiers, each specifically tailored for different physiological phenomena, e.g., slow cortical potential shifts, such as the premovement Bereitschaftspotential or differences in spatio-spectral distributions of brain activity (i.e., focal event-related desynchronizations) and 2) behavioral paradigms inducing the subjects to generate one out of several brain states (multiclass approach) which all bear a distinctive spatio-temporal signature well discriminable in the standard scalp EEG. We derive information-theoretic predictions and demonstrate their relevance in experimental data. We will show that a suitably arranged interaction between these concepts can significantly boost BCI performances.
Y1 - 2004
ER -
TY - JOUR
A1 - Tarnick, Steffen
T1 - Bounding error masking in linear output space compression schemes
Y1 - 1994
ER -
TY - JOUR
A1 - Baier, Thomas
A1 - Mendling, Jan
A1 - Weske, Mathias
T1 - Bridging abstraction layers in process mining
JF - Information systems
N2 - While the maturity of process mining algorithms increases and more process mining tools enter the market, process mining projects still face the problem of different levels of abstraction when comparing events with modeled business activities. Current approaches for event log abstraction try to abstract from the events in an automated way that does not capture the required domain knowledge to fit business activities. This can lead to misinterpretation of discovered process models. We developed an approach that aims to abstract an event log to the same abstraction level that is needed by the business. We use domain knowledge extracted from existing process documentation to semi-automatically match events and activities. Our abstraction approach is able to deal with n:m relations between events and activities and also supports concurrency. We evaluated our approach in two case studies with a German IT outsourcing company. (C) 2014 Elsevier Ltd. All rights reserved.
KW - Process mining
KW - Abstraction
KW - Event mapping
Y1 - 2014
U6 - https://doi.org/10.1016/j.is.2014.04.004
SN - 0306-4379
SN - 1873-6076
VL - 46
SP - 123
EP - 139
PB - Elsevier
CY - Oxford
ER -
TY - JOUR
A1 - Giese, Holger
A1 - Hildebrandt, Stephan
A1 - Lambers, Leen
T1 - Bridging the gap between formal semantics and implementation of triple graph grammars
JF - Software and systems modeling
N2 - The correctness of model transformations is a crucial element for model-driven engineering of high-quality software. A prerequisite to verify model transformations at the level of the model transformation specification is that an unambiguous formal semantics exists and that the implementation of the model transformation language adheres to this semantics. However, for existing relational model transformation approaches, it is usually not really clear under which constraints particular implementations really conform to the formal semantics. In this paper, we will bridge this gap for the formal semantics of triple graph grammars (TGG) and an existing efficient implementation. While the formal semantics assumes backtracking and ignores non-determinism, practical implementations do not support backtracking, require rule sets that ensure determinism, and include further optimizations.
Therefore, we capture how the considered TGG implementation realizes the transformation by means of operational rules, define required criteria, and show conformance to the formal semantics if these criteria are fulfilled. We further outline how static and runtime checks can be employed to guarantee these criteria.
Y1 - 2014
U6 - https://doi.org/10.1007/s10270-012-0247-y
SN - 1619-1366
SN - 1619-1374
VL - 13
IS - 1
SP - 273
EP - 299
PB - Springer
CY - Heidelberg
ER -
TY - JOUR
A1 - Luebbe, Alexander
A1 - Weske, Mathias
T1 - Bringing design thinking to business process modeling
Y1 - 2011
SN - 978-3-642-13756-3
ER -
TY - JOUR
A1 - Opel, Simone
A1 - Kramer, Matthias
A1 - Trommen, Michael
A1 - Pottbäcker, Florian
A1 - Ilaghef, Youssef
T1 - BugHunt
BT - A Motivating Approach to Self-Directed Problem-solving in Operating Systems
JF - KEYCIT 2014 - Key Competencies in Informatics and ICT
N2 - Competencies related to operating systems and computer security are usually taught systematically. In this paper we present a different approach, in which students have to remove virus-like behaviour on their respective computers, which has been induced by software developed for this purpose. They have to develop appropriate problem-solving strategies and thereby explore essential elements of the operating system. The approach was implemented exemplarily in two computer science courses at a regional general upper secondary school and showed great motivation and interest in the participating students.
KW - Educational software
KW - operating system
KW - student activation
KW - problem-solving
KW - interactive course
KW - interactive workshop
KW - edutainment
KW - secondary computer science education
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-82693
SN - 1868-0844
SN - 2191-1940
IS - 7
SP - 217
EP - 233
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Meinel, Christoph
A1 - Wang, Long
T1 - Building content clusters based on modelling page pairs
N2 - We give a new view on building content clusters from page pair models. We measure the heuristic importance between every two pages by computing the distance of their accessed positions in usage sessions. We also compare our page pair models with the classical pair models used in information theory and natural language processing, and give different evaluation methods to build reasonable content communities. Finally, we interpret the advantages and disadvantages of our models from detailed experiment results.
Y1 - 2006
UR - http://www.springerlink.com/content/105633/
U6 - https://doi.org/10.1007/11610113_85
ER -
TY - JOUR
A1 - Seuring, Markus
T1 - Built-in self test mit multi-mode scannable memory elementen
Y1 - 1999
ER -
TY - JOUR
A1 - Bogue, Ted
A1 - Gössel, Michael
A1 - Jürgensen, Helmut
A1 - Zorian, Yervant
T1 - Built-in self-Test with an alternating output
Y1 - 1998
SN - 0-8186-8359-7
ER -
TY - JOUR
A1 - Hellebrand, Sybille
A1 - Rajski, Janusz
A1 - Tarnick, Steffen
A1 - Venkatraman, Srikanth
A1 - Courtois, Bernard
T1 - Built-in test for circuits with scan based on reseeding of multiple polynomial linear feedback shift registers
Y1 - 1995
ER -
TY - JOUR
A1 - Cobernuss, M.
T1 - Bused interconnection network for parallel memory with linear storage
Y1 - 1994
SN - 3-05-501602-5
ER -
TY - JOUR
A1 - Pernici, Barbara
A1 - Weske, Mathias
T1 - Business process management
Y1 - 2006
SN - 0169-023X
ER -
TY - JOUR
A1 - Neumann, I.
A1 - Stoffel, Dominik
A1 - Hartje, Hendrik
A1 - Kunz, Wolfgang
T1 - Cell replication and redundancy elimination during placement for cycle time optimization
Y1 - 1999
ER -
TY - JOUR
A1 - Besnard, Philippe
A1 - Schaub, Torsten
T1 - Characterization of non-monotone non-constructive systems
Y1 - 1998
SN - 1012-2443
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Schaub, Torsten
T1 - Characterizing (ASP) inferences by unit propagation
Y1 - 2006
ER -
TY - JOUR
A1 - Lanfermann, Gerd
A1 - Schnor, Bettina
A1 - Seidel, Edward
T1 - Characterizing Grids
N2 - We present a new data model approach to describe the various objects that either represent the Grid infrastructure or make use of it. The data model is based on the experiences and experiments conducted in heterogeneous Grid environments. While very sophisticated data models exist to describe and characterize e.g. compute capacities or web services, we will show that a general description, which combines all of these aspects, is needed to give an adequate representation of objects on a Grid. The Grid Object Description Language (GODsL) is a generic and extensible approach to unify the various aspects that an object on a Grid can have. GODsL provides the content for the XML based communication in Grid migration scenarios, carried out in the GridLab project. We describe the data model architecture on a general level and focus on the Grid application scenarios.
Y1 - 2003
SN - 1-4020-7418-2
ER -
TY - JOUR
A1 - Goessel, Michael
A1 - Morozov, A. V.
A1 - Sapozhnikov, V. V.
A1 - Sapozhnikov, Vl. V.
T1 - Checking combinational circuits by the method of logic complement
N2 - Design of fully self-testing combinational circuits was considered. A theorem defining the conditions for guaranteed logic complement-based design of fully self-testing circuits was proved. Examples were presented.
Y1 - 2005
SN - 0005-1179
ER -
TY - JOUR
A1 - Kiertscher, Simon
A1 - Zinke, Jörg
A1 - Schnor, Bettina
T1 - CHERUB: power consumption aware cluster resource management
JF - Cluster computing : the journal of networks, software tools and applications
N2 - This paper presents an evaluation of ACPI energy saving modes, and deduces the design and implementation of an energy saving daemon for clusters called cherub. The design of the cherub daemon is modular and extensible. Since the only requirement is a central approach for resource management, cherub is suited for Server Load Balancing (SLB) clusters managed by dispatchers like Linux Virtual Server (LVS), as well as for High Performance Computing (HPC) clusters. Our experimental results show that cherub's scheduling algorithm works well, i.e. it will save energy, if possible, and avoids state-flapping.
KW - Green computing
KW - Cluster computing
Y1 - 2013
U6 - https://doi.org/10.1007/s10586-011-0176-5
SN - 1386-7857
VL - 16
IS - 1
SP - 55
EP - 63
PB - Springer
CY - New York
ER -
TY - JOUR
A1 - Besnard, Philippe
A1 - Schaub, Torsten
T1 - Circumscribing inconsistency
Y1 - 1997
SN - 1-558-60480-4
SN - 1045-0823
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Kaufmann, Benjamin
A1 - Neumann, André
A1 - Schaub, Torsten
T1 - Clasp : a conflict-driven answer set solver
Y1 - 2007
SN - 978-3-540-72199-4
ER -
TY - JOUR
A1 - Hoos, Holger
A1 - Lindauer, Marius
A1 - Schaub, Torsten
T1 - claspfolio 2
BT - advances in algorithm selection for answer set programming
JF - Theory and practice of logic programming
N2 - Building on the award-winning, portfolio-based ASP solver claspfolio, we present claspfolio 2, a modular and open solver architecture that integrates several different portfolio-based algorithm selection approaches and techniques. The claspfolio 2 solver framework supports various feature generators, solver selection approaches, solver portfolios, as well as solver-schedule-based pre-solving techniques. The default configuration of claspfolio 2 relies on a light-weight version of the ASP solver clasp to generate static and dynamic instance features. The flexible open design of claspfolio 2 is a distinguishing factor even beyond ASP. As such, it provides a unique framework for comparing and combining existing portfolio-based algorithm selection approaches and techniques in a single, unified framework. Taking advantage of this, we conducted an extensive experimental study to assess the impact of different feature sets, selection approaches and base solver portfolios. In addition to gaining substantial insights into the utility of the various approaches and techniques, we identified a default configuration of claspfolio 2 that achieves substantial performance gains not only over clasp's default configuration and the earlier version of claspfolio, but also over manually tuned configurations of clasp.
Y1 - 2014
U6 - https://doi.org/10.1017/S1471068414000210
SN - 1471-0684
SN - 1475-3081
VL - 14
SP - 569
EP - 585
PB - Cambridge Univ. Press
CY - New York
ER -
TY - JOUR
A1 - Huang, Yizhen
A1 - Richter, Eric
A1 - Kleickmann, Thilo
A1 - Wiepke, Axel
A1 - Richter, Dirk
T1 - Classroom complexity affects student teachers’ behavior in a VR classroom
JF - Computers & education : an international journal
N2 - Student teachers often struggle to keep track of everything that is happening in the classroom, and particularly to notice and respond when students cause disruptions. The complexity of the classroom environment is a potential contributing factor that has not been empirically tested. In this experimental study, we utilized a virtual reality (VR) classroom to examine whether classroom complexity affects the likelihood of student teachers noticing disruptions and how they react after noticing. Classroom complexity was operationalized as the number of disruptions and the existence of overlapping disruptions (multidimensionality) as well as the existence of parallel teaching tasks (simultaneity). Results showed that student teachers (n = 50) were less likely to notice the scripted disruptions, and also less likely to respond to the disruptions in a comprehensive and effortful manner when facing greater complexity. These results may have implications for both teacher training and the design of VR for training or research purposes.
This study contributes to the field in two respects: 1) it revealed how features of the classroom environment can affect student teachers' noticing of and reaction to disruptions; and 2) it extends the functionality of the VR environment: from a teacher training tool to a testbed of fundamental classroom processes that are difficult to manipulate in real life.
KW - Augmented and virtual reality
KW - Simulations
KW - Improving classroom teaching
KW - Media in education
KW - Pedagogical issues
Y1 - 2021
U6 - https://doi.org/10.1016/j.compedu.2020.104100
SN - 0360-1315
SN - 1873-782X
VL - 163
PB - Elsevier
CY - Oxford
ER -
TY - JOUR
A1 - Al Laban, Firas
A1 - Reger, Martin
A1 - Lucke, Ulrike
T1 - Closing the Policy Gap in the Academic Bridge
JF - Education sciences
N2 - The highly structured nature of the educational sector demands effective policy mechanisms close to the needs of the field. That is why evidence-based policy making, endorsed by the European Commission under Erasmus+ Key Action 3, aims to make an alignment between the domains of policy and practice. Against this background, this article addresses two issues: First, that there is a vertical gap in the translation of higher-level policies to local strategies and regulations. Second, that there is a horizontal gap between educational domains regarding the policy awareness of individual players. This was analyzed in quantitative and qualitative studies with domain experts from the fields of virtual mobility and teacher training. From our findings, we argue that the combination of both gaps puts the academic bridge from secondary to tertiary education at risk, including the associated knowledge proficiency levels. We discuss the role of digitalization in the academic bridge by asking the question: which value do the involved stakeholders expect from educational policies? As a theoretical basis, we rely on the model of value co-creation for and by stakeholders. We describe the used instruments along with the obtained results and proposed benefits. Moreover, we reflect on the methodology applied, and we finally derive recommendations for future academic bridge policies.
KW - policy evaluation
KW - higher education
KW - virtual mobility
KW - teacher training
Y1 - 2022
U6 - https://doi.org/10.3390/educsci12120930
SN - 2227-7102
VL - 12
IS - 12
PB - MDPI
CY - Basel
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Liu, Daphne H.
A1 - Schaub, Torsten
A1 - Thiele, Sven
T1 - COBA 2.0 : a consistency-based belief change system
Y1 - 2006
UR - http://www2.in.tu-clausthal.de/~tmbehrens/NMR_Proc_TR4.pdf
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Liu, Daphne H.
A1 - Schaub, Torsten
A1 - Thiele, Sven
T1 - COBA 2.0 : a consistency-based belief change system
Y1 - 2007
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Hunter, Anthony
A1 - Schaub, Torsten
T1 - COBA: a consistency-based belief revision system
Y1 - 2002
SN - 3-540-44190-5
ER -
TY - JOUR
A1 - Hartje, Hendrik
A1 - Sogomonyan, Egor S.
A1 - Gössel, Michael
T1 - Code disjoint circuits for parity codes
Y1 - 1997
ER -
TY - JOUR
A1 - Gössel, Michael
A1 - Sogomonyan, Egor S.
T1 - Code disjoint self-parity combinational circuits for self-testing, concurrent fault detection and parity scan design
Y1 - 1994
ER -
TY - JOUR
A1 - Steinert, Bastian
A1 - Cassou, Damien
A1 - Hirschfeld, Robert
T1 - CoExist: overcoming aversion to change preserving immediate access to source code and run-time information of previous development states
JF - ACM SIGPLAN notices
N2 - Programmers make many changes to the program to eventually find a good solution for a given task. In this course of change, every intermediate development state can be of value when, for example, a promising idea suddenly turns out to be inappropriate or the interplay of objects turns out to be more complex than initially expected before making changes. Programmers would benefit from tool support that provides immediate access to source code and run-time information of previous development states of interest. We present IDE extensions, implemented for Squeak/Smalltalk, to preserve, retrieve, and work with this information. With such tool support, programmers can work without worries because they can rely on tools that help them with whatever their explorations will reveal. They no longer have to follow certain best practices only to avoid undesired consequences of changing code.
KW - Design
KW - Experimentation
KW - Human Factors
KW - Continuous Testing
KW - Continuous Versioning
KW - Debugging
KW - Evolution
KW - Explore-first Programming
KW - Fault Localization
KW - Prototyping
Y1 - 2013
U6 - https://doi.org/10.1145/2480360.2384591
SN - 0362-1340
VL - 48
IS - 2
SP - 107
EP - 117
PB - Association for Computing Machinery
CY - New York
ER -
TY - JOUR
A1 - Dornhege, Guido
A1 - Blankertz, Benjamin
A1 - Krauledat, Matthias
A1 - Losch, Florian
A1 - Curio, Gabriel
A1 - Müller, Klaus-Robert
T1 - Combined optimization of spatial and temporal filters for improving brain-computer interfacing
JF - IEEE transactions on bio-medical electronics
N2 - Brain-computer interface (BCI) systems create a novel communication channel from the brain to an output device by bypassing conventional motor output pathways of nerves and muscles. Therefore they could provide a new communication and control option for paralyzed patients. Modern BCI technology is essentially based on techniques for the classification of single-trial brain signals. Here we present a novel technique that allows the simultaneous optimization of a spatial and a spectral filter enhancing discriminability rates of multichannel EEG single-trials. The evaluation of 60 experiments involving 22 different subjects demonstrates the significant superiority of the proposed algorithm over its classical counterpart: the median classification error rate was decreased by 11%. Apart from the enhanced classification, the spatial and/or the spectral filter that are determined by the algorithm can also be used for further analysis of the data, e.g., for source localization of the respective brain rhythms.
KW - brain-computer interface
KW - common spatial patterns
KW - EEG
KW - event-related desynchronization
KW - single-trial-analysis
Y1 - 2006
U6 - https://doi.org/10.1109/TBME.2006.883649
SN - 0018-9294
VL - 53
IS - 11
SP - 2274
EP - 2281
PB - IEEE
CY - New York
ER -
TY - JOUR
A1 - Lagriffoul, Fabien
A1 - Andres, Benjamin
T1 - Combining task and motion planning
BT - A culprit detection problem
JF - The international journal of robotics research
N2 - Solving problems combining task and motion planning requires searching across a symbolic search space and a geometric search space.
Because of the semantic gap between symbolic and geometric representations, symbolic sequences of actions are not guaranteed to be geometrically feasible. This compels us to search in the combined search space, in which frequent backtracks between symbolic and geometric levels make the search inefficient. We address this problem by guiding symbolic search with rich information extracted from the geometric level through culprit detection mechanisms.
KW - combined task and motion planning
KW - manipulation planning
Y1 - 2016
U6 - https://doi.org/10.1177/0278364915619022
SN - 1741-3176
SN - 0278-3649
VL - 35
IS - 8
SP - 890
EP - 927
PB - Sage Science Press
CY - Thousand Oaks
ER -
TY - JOUR
A1 - Afantenos, Stergos
A1 - Peldszus, Andreas
A1 - Stede, Manfred
T1 - Comparing decoding mechanisms for parsing argumentative structures
JF - Argument & Computation
N2 - Parsing of argumentative structures has become a very active line of research in recent years. Like discourse parsing or any other natural language task that requires prediction of linguistic structures, most approaches choose to learn a local model and then perform global decoding over the local probability distributions, often imposing constraints that are specific to the task at hand. Specifically for argumentation parsing, two decoding approaches have been recently proposed: Minimum Spanning Trees (MST) and Integer Linear Programming (ILP), following similar trends in discourse parsing. In contrast to discourse parsing though, where trees are not always used as underlying annotation schemes, argumentation structures so far have always been represented with trees. Using the ‘argumentative microtext corpus’ [in: Argumentation and Reasoned Action: Proceedings of the 1st European Conference on Argumentation, Lisbon 2015 / Vol. 2, College Publications, London, 2016, pp. 801–815] as underlying data and replicating three different decoding mechanisms, in this paper we propose a novel ILP decoder and an extension to our earlier MST work, and then thoroughly compare the approaches. The result is that our new decoder outperforms related work in important respects, and that in general, ILP and MST yield very similar performance.
KW - Argumentation structure
KW - argument mining
KW - parsing
Y1 - 2018
U6 - https://doi.org/10.3233/AAC-180033
SN - 1946-2166
SN - 1946-2174
VL - 9
IS - 3
SP - 177
EP - 192
PB - IOS Press
CY - Amsterdam
ER -
TY - JOUR
A1 - Bröker, Kathrin
A1 - Kastens, Uwe
A1 - Magenheim, Johannes
T1 - Competences of Undergraduate Computer Science Students
JF - KEYCIT 2014 - Key Competencies in Informatics and ICT
N2 - The paper presents two approaches to the development of a Computer Science Competence Model for the needs of curriculum development and evaluation in Higher Education. A normative-theoretical approach is based on the AKT and ACM/IEEE curriculum and will be used within the recommendations of the German Informatics Society (GI) for the design of CS curricula. An empirically oriented approach refines the categories of the first one with regard to specific subject areas by conducting content analysis on CS curricula of important universities from several countries. The refined model will be used for the needs of students’ e-assessment and subsequent affirmative action of the CS departments.
KW - Competences
KW - Competence Measurement
KW - Curriculum Development
KW - Computer Science Education
KW - Recommendations for CS-Curricula in Higher Education
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-82613
SN - 1868-0844
SN - 2191-1940
IS - 7
SP - 77
EP - 96
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Wildner, Uwe
T1 - Compiler assisted self-checking of structural integrity using return address hashing
Y1 - 1996
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Schaub, Torsten
T1 - Compiling reasoning with and about preferences into default logic
Y1 - 1997
SN - 1-558-60480-4
SN - 1045-0823
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Schaub, Torsten
T1 - Compiling specificity into approaches to nonmonotonic reasoning
Y1 - 1997
SN - 0004-3702
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Kaminski, Roland
A1 - Schaub, Torsten
T1 - Complex optimization in answer set programming
JF - Theory and practice of logic programming
N2 - Preference handling and optimization are indispensable means for addressing nontrivial applications in Answer Set Programming (ASP). However, their implementation becomes difficult whenever they bring about a significant increase in computational complexity. As a consequence, existing ASP systems do not offer complex optimization capacities, supporting, for instance, inclusion-based minimization or Pareto efficiency. Rather, such complex criteria are typically addressed by resorting to dedicated modeling techniques, like saturation. Unlike the ease of common ASP modeling, however, these techniques are rather involved and hardly usable by ASP laymen. We address this problem by developing a general implementation technique by means of meta-programming, thus reusing existing ASP systems to capture various forms of qualitative preferences among answer sets. In this way, complex preferences and optimization capacities become readily available for ASP applications.
KW - Answer Set Programming
KW - Preference Handling
KW - Complex optimization
KW - Meta-Programming
Y1 - 2011
U6 - https://doi.org/10.1017/S1471068411000329
SN - 1471-0684
VL - 11
IS - 3
SP - 821
EP - 839
PB - Cambridge Univ. Press
CY - New York
ER -
TY - JOUR
A1 - Bibel, Wolfgang
A1 - Brüning, Stefan
A1 - Otten, Jens
A1 - Rath, Thomas
A1 - Schaub, Torsten
T1 - Compressions and extensions
Y1 - 1998
ER -
TY - JOUR
A1 - Uflacker, Matthias
T1 - Computational analysis of virtual team collaboration in the early stages of engineering design
Y1 - 2010
SN - 978-3-86956-036-6
ER -
TY - JOUR
A1 - Beerenwinkel, Niko
A1 - Sing, Tobias
A1 - Lengauer, Thomas
A1 - Rahnenfuhrer, Joerg
A1 - Roomp, Kirsten
A1 - Savenkov, Igor
A1 - Fischer, Roman
A1 - Hoffmann, Daniel
A1 - Selbig, Joachim
A1 - Korn, Klaus
A1 - Walter, Hauke
A1 - Berg, Thomas
A1 - Braun, Patrick
A1 - Faetkenheuer, Gerd
A1 - Oette, Mark
A1 - Rockstroh, Juergen
A1 - Kupfer, Bernd
A1 - Kaiser, Rolf
A1 - Daeumer, Martin
T1 - Computational methods for the design of effective therapies against drug resistant HIV strains
N2 - The development of drug resistance is a major obstacle to successful treatment of HIV infection. The extraordinary replication dynamics of HIV facilitates its escape from selective pressure exerted by the human immune system and by combination drug therapy.
We have developed several computational methods whose combined use can support the design of optimal antiretroviral therapies based on viral genomic data.
Y1 - 2005
ER -
TY - JOUR
A1 - Bottino, Rosa
A1 - Chioccariello, Augusto
T1 - Computational Thinking
BT - Videogames, Educational Robotics, and other Powerful Ideas to Think with
JF - KEYCIT 2014 - Key Competencies in Informatics and ICT
N2 - Digital technology has radically changed the way people work in industry, finance, services, media and commerce. Informatics has contributed to the scientific and technological development of our society in general and to the digital revolution in particular. Computational thinking is the term indicating the key ideas of this discipline that might be included in the key competencies underlying the curriculum of compulsory education. The educational potential of informatics has a history dating back to the sixties. In this article, we briefly revisit this history looking for lessons learned. In particular, we focus on experiences of teaching and learning programming. However, computational thinking is more than coding. It is a way of thinking and practicing interactive dynamic modeling with computers. We advocate that learners can practice computational thinking in playful contexts where they can develop personal projects, for example building videogames and/or robots, and share and discuss their constructions with others. In our view, this approach allows an integration of computational thinking in the K-12 curriculum across disciplines.
KW - Computational thinking
KW - programming in context
KW - informatics education
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-82820
SN - 1868-0844
SN - 2191-1940
IS - 7
SP - 301
EP - 309
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Schwill, Andreas
T1 - Computer science education based on fundamental ideas
Y1 - 1997
ER -
TY - JOUR
A1 - Tscherejkina, Anna
A1 - Morgiel, Anna
A1 - Moebert, Tobias
T1 - Computergestütztes Training von sozio-emotionalen Kompetenzen durch Minispiele
JF - E-Learning Symposium 2018
N2 - Training socio-emotional competencies is particularly useful for people with autism. Such training can be designed effectively by means of a game-based application. Two mini-games, Mimikry and Emo-Mahjong, were implemented and evaluated with respect to user experience. This paper presents the respective concepts and the evaluation results.
KW - Computergestützes Training
KW - User Experience
KW - Digital Game Based Learning
KW - Autismus
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-421937
SP - 41
EP - 52
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Petre, Marian
T1 - Computing is not a spectator sport
BT - rethinking how we introduce our discipline to students
JF - Commentarii informaticae didacticae : (CID)
N2 - This talk will describe My Digital Life (TU100), a distance learning module that introduces computer science through immediate engagement with ubiquitous computing (ubicomp). This talk will describe some of the principles and concepts we have adopted for this modern computing introduction: the idea of the ‘informed digital citizen’; engagement through narrative; playful pedagogy; making the power of ubicomp available to novices; setting technical skills in real contexts. It will also trace how the pedagogy is informed by experiences and research in Computer Science education.
Y1 - 2013 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-65045 SN - 1868-0844 SN - 2191-1940 IS - 5 SP - 155 EP - 159 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Dimitriev, Alexej A1 - Saposhnikov, V. V. A1 - Saposhnikov, Vl. V. A1 - Gössel, Michael T1 - Concurrent checking of sequential circuits by alternating inputs Y1 - 1999 ER - TY - JOUR A1 - Sogomonyan, Egor S. A1 - Gössel, Michael T1 - Concurrently self-testing embedded checkers for ultra-reliable fault-tolerant systems Y1 - 1996 ER - TY - JOUR A1 - Gebser, Martin A1 - Kaufmann, Benjamin A1 - Neumann, André A1 - Schaub, Torsten T1 - Conflict-driven answer set enumeration Y1 - 2007 SN - 978-3-540-72199-4 ER - TY - JOUR A1 - Gebser, Martin A1 - Kaufmann, Benjamin A1 - Neumann, André A1 - Schaub, Torsten T1 - Conflict-driven answer set solving Y1 - 2007 SN - 978-1-57735-323-2 ER - TY - JOUR A1 - Gebser, Martin A1 - Kaufmann, Benjamin A1 - Schaub, Torsten T1 - Conflict-driven answer set solving: From theory to practice JF - Artificial intelligence N2 - We introduce an approach to computing answer sets of logic programs, based on concepts successfully applied in Satisfiability (SAT) checking. The idea is to view inferences in Answer Set Programming (ASP) as unit propagation on nogoods. This provides us with a uniform constraint-based framework capturing diverse inferences encountered in ASP solving. Moreover, our approach allows us to apply advanced solving techniques from the area of SAT. As a result, we present the first full-fledged algorithmic framework for native conflict-driven ASP solving. Our approach is implemented in the ASP solver clasp that has demonstrated its competitiveness and versatility by winning first places at various solver contests. KW - Answer set programming KW - Logic programming KW - Nonmonotonic reasoning Y1 - 2012 U6 - https://doi.org/10.1016/j.artint.2012.04.001 SN - 0004-3702 VL - 187 IS - 8 SP - 52 EP - 89 PB - Elsevier CY - Amsterdam ER - TY - JOUR A1 - Rozinat, A. A1 - Van der Aalst, Wil M. P. T1 - Conformance testing: Measuring the fit and appropriateness of event logs and process models N2 - Most information systems log events (e.g., transaction logs, audit trails) to audit and monitor the processes they support. At the same time, many of these processes have been explicitly modeled. For example, SAP R/3 logs events in transaction logs and there are EPCs (Event-driven Process Chains) describing the so-called reference models. These reference models describe how the system should be used. The coexistence of event logs and process models raises an interesting question: "Does the event log conform to the process model and vice versa?". This paper demonstrates that there is not a simple answer to this question. To tackle the problem, we distinguish two dimensions of conformance: fitness (the event log may be the result of the process modeled) and appropriateness (the model is a likely candidate from a structural and behavioral point of view). Different metrics have been defined and a Conformance Checker has been implemented within the ProM Framework. Y1 - 2006 ER - TY - JOUR A1 - Polyvyanyy, Artem A1 - Weidlich, Matthias A1 - Weske, Mathias T1 - Connectivity of workflow nets: the foundations of stepwise verification JF - Acta informatica N2 - Behavioral models capture operational principles of real-world or designed systems. Formally, each behavioral model defines the state space of a system, i.e., its states and the principles of state transitions. 
Such a model is the basis for analysis of the system's properties. In practice, state spaces of systems are immense, which results in huge computational complexity for their analysis. Behavioral models are typically described as executable graphs, whose execution semantics encodes a state space. The structure theory of behavioral models studies the relations between the structure of a model and the properties of its state space. In this article, we use the connectivity property of graphs to achieve an efficient and extensive discovery of the compositional structure of behavioral models; behavioral models are stepwise decomposed into components with clear structural characteristics and inter-component relations. At each decomposition step, the discovered compositional structure of a model is used for reasoning on properties of the whole state space of the system. The approach is exemplified by means of a concrete behavioral model and verification criterion. That is, we analyze workflow nets, a well-established tool for modeling behavior of distributed systems, with respect to the soundness property, a basic correctness property of workflow nets. Stepwise verification allows the detection of violations of the soundness property by inspecting small portions of a model, thereby considerably reducing the amount of work to be done to perform soundness checks. Besides formal results, we also report on findings from applying our approach to an industry model collection. Y1 - 2011 U6 - https://doi.org/10.1007/s00236-011-0137-8 SN - 0001-5903 VL - 48 IS - 4 SP - 213 EP - 242 PB - Springer CY - New York ER - TY - JOUR A1 - Kupries, Mario T1 - Connector-aided coordination in agent systems Y1 - 1999 ER - TY - JOUR A1 - Webb, Mary T1 - Considerations for the Design of Computing Curricula JF - KEYCIT 2014 - Key Competencies in Informatics and ICT N2 - This paper originated from discussions about the need for important changes in the curriculum for Computing, including two focus group meetings at IFIP conferences over the last two years. The paper examines how recent developments in curriculum, together with insights from curriculum thinking in other subject areas, especially mathematics and science, can inform curriculum design for Computing. The analysis presented in the paper provides insights into the complexity of curriculum design as well as identifying important constraints and considerations for the ongoing development of a vision and framework for a Computing curriculum. KW - Curriculum KW - Computer Science KW - Informatics KW - curriculum theory Y1 - 2015 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-82723 SN - 1868-0844 SN - 2191-1940 IS - 7 SP - 267 EP - 283 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Delgrande, James Patrick A1 - Schaub, Torsten T1 - Consistency-based approaches to merging knowledge bases: preliminary report Y1 - 2004 UR - http://www.pims.math.ca/science/2004/NMR/papers/paper17.pdf SN - 92-990021-0-X ER - TY - JOUR A1 - Lis, Monika ED - Lamprecht, Anna-Lena ED - Margaria, Tiziana T1 - Constructing a Phylogenetic Tree JF - Process Design for Natural Scientists: an agile model-driven approach N2 - In this project I constructed a workflow that takes a DNA sequence as input and provides a phylogenetic tree, consisting of the input sequence and other sequences which were found during a database search. In this phylogenetic tree, the sequences are arranged according to their similarity. 
In bioinformatics, constructing phylogenetic trees is often used to explore the evolutionary relationships of genes or organisms and to understand the mechanisms of evolution itself. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 101 EP - 109 PB - Springer Verlag CY - Berlin ER - TY - JOUR A1 - Alnemr, Rehab T1 - Context-aware Reputation in SOA and future internet Y1 - 2010 SN - 978-3-86956-036-6 ER - TY - JOUR A1 - Bordihn, Henning T1 - Context-freeness of the power of context-free languages is undecidable N2 - The power of a language L is the set of all powers of the words in L. In this paper, the following decision problem is investigated. Given a context-free language L, is the power of L context-free? We show that this problem is decidable for languages over unary alphabets, but it is undecidable whenever languages over alphabets with at least two letters are considered. Y1 - 2004 SN - 0304-3975 ER - TY - JOUR A1 - Margaria, Tiziana A1 - Steffen, Bernhard T1 - Continuous model-driven engineering N2 - Agility at the customer, user, and application level will prove key to aligning and linking business and IT. Y1 - 2009 UR - http://www.computer.org/computer/ SN - 0018-9162 ER - TY - JOUR A1 - Lorenz, Claas A1 - Clemens, Vera Elisabeth A1 - Schrötter, Max A1 - Schnor, Bettina T1 - Continuous verification of network security compliance JF - IEEE transactions on network and service management N2 - Continuous verification of network security compliance is an accepted need. In particular, the analysis of stateful packet filters plays a central role for network security in practice. However, the few existing tools that support the analysis of stateful packet filters are based on generally applicable formal methods such as Satisfiability Modulo Theories (SMT) or theorem provers and show runtimes in the order of minutes to hours, making them unsuitable for continuous compliance verification. In this work, we address these challenges and present the concept of state shell interweaving to transform a stateful firewall rule set into a stateless rule set. This allows us to reuse any fast domain-specific engine from the field of data plane verification tools, leveraging smart, very fast, and domain-specialized data structures and algorithms, including Header Space Analysis (HSA). First, we introduce the formal language FPL that enables a high-level human-understandable specification of the desired state of network security. Second, we demonstrate the instantiation of a compliance process using a verification framework that analyzes the configuration of complex networks and devices - including stateful firewalls - for compliance with FPL policies. Our evaluation results show the scalability of the presented approach for the well-known Internet2 and Stanford benchmarks as well as for large firewall rule sets, where it outscales state-of-the-art tools by a factor of over 41. 
KW - Security KW - Tools KW - Network security KW - Engines KW - Benchmark testing KW - Analytical models KW - Scalability KW - Network security compliance KW - Formal verification Y1 - 2021 U6 - https://doi.org/10.1109/TNSM.2021.3130290 SN - 1932-4537 VL - 19 IS - 2 SP - 1729 EP - 1745 PB - Institute of Electrical and Electronics Engineers CY - New York ER - TY - JOUR A1 - Tarnick, Steffen T1 - Controllable self-checking checkers for conditional concurrent checking Y1 - 1995 ER - TY - JOUR A1 - Tarnick, Steffen T1 - Controllable self-checking checkers for conditional concurrent checking Y1 - 1994 ER - TY - JOUR A1 - Noack, Franziska T1 - CREADED: Colored-Relief application for digital elevation data JF - Process design for natural scientists: an agile model-driven approach N2 - In the geoinformatics field, remote sensing data is often used for analyzing the characteristics of the current investigation area. This includes Digital Elevation Models (DEMs), which are simple raster grids containing grey-scale values that represent the respective elevation values. The project CREADED that is presented in this paper aims at making these monochrome raster images more significant and more intuitively interpretable. For this purpose, an executable interactive model for creating a colored and relief-shaded DEM has been designed using the jABC framework. The process is based on standard jABC-SIBs and SIBs that provide specific GIS functions, which are available as Web services, command line tools and scripts. Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 186 EP - 199 PB - Springer CY - Berlin ER - TY - JOUR A1 - Dennert-Möller, Elisabeth A1 - Garmann, Robert T1 - Das „Startprojekt“ BT - Entwicklung überfachlicher Kompetenzen von Anfang an JF - Commentarii informaticae didacticae (CID) N2 - Graduates of our computer science bachelor's programs need both subject-specific and interdisciplinary competencies for competent professional practice. In introductory courses, we often demand of first-semester students almost exclusively the development of subject competence and frequently neglect self-competence, methodological competence, and social competence. Yet precisely these latter three are indispensable for a successful course of study and should be developed from the very beginning. We present our „Startprojekt“ as a contribution to fostering self-directed, interdisciplinary competence development within a subject-related context in the first semester. Y1 - 2016 U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-94780 SN - 978-3-86956-376-3 SN - 1868-0844 SN - 2191-1940 IS - 10 SP - 11 EP - 23 PB - Universitätsverlag Potsdam CY - Potsdam ER - TY - JOUR A1 - Blaese, Leif T1 - Data mining for unidentified protein sequences JF - Process design for natural scientists: an agile model-driven approach N2 - Through the use of next generation sequencing (NGS) technology, many newly sequenced organisms are now available. Annotating their genes is one of the most challenging tasks in sequence biology. Here, we present an automated workflow to find homologous proteins, annotate sequences according to function and create a three-dimensional model. 
Y1 - 2014 SN - 978-3-662-45005-5 SN - 1865-0929 IS - 500 SP - 73 EP - 87 PB - Springer CY - Berlin ER - TY - JOUR A1 - Brain, Martin A1 - Gebser, Martin A1 - Pührer, Jörg A1 - Schaub, Torsten A1 - Tompits, Hans A1 - Woltran, Stefan T1 - Debugging ASP programs by means of ASP Y1 - 2007 SN - 978-3-540-72199-4 ER - TY - JOUR A1 - Bordihn, Henning A1 - Holzer, Markus A1 - Kutrib, Martin T1 - Decidability of operation problems for TOL languages and subclasses JF - Information and computation N2 - We investigate the decidability of the operation problem for TOL languages and subclasses. Fix an operation on formal languages. Given languages from the family considered (OL languages, TOL languages, or their propagating variants), is the application of this operation to the given languages still a language that belongs to the same language family? Observe that all the Lindenmayer language families in question are anti-AFLs, that is, they are not closed under homomorphisms, inverse homomorphisms, intersection with regular languages, union, concatenation, and Kleene closure. Besides these classical operations, we also consider intersection and substitution, since the language families under consideration are not closed under these operations either. We show that for all of the above-mentioned language operations, except for the Kleene closure, the corresponding operation problems of OL and TOL languages and their propagating variants are not even semidecidable. The situation changes for unary OL languages. In this case we prove that the operation problems with respect to Kleene star, complementation, and intersection with regular sets are decidable. KW - L systems KW - Operation problem KW - Decidability KW - Unary languages Y1 - 2011 U6 - https://doi.org/10.1016/j.ic.2010.11.008 SN - 0890-5401 VL - 209 IS - 3 SP - 344 EP - 352 PB - Elsevier CY - San Diego ER -