Institut für Informatik und Computational Science
Document Type
- Article (576)
- Doctoral Thesis (203)
- Monograph/Edited Volume (135)
- Other (28)
- Conference Proceeding (17)
- Part of a Book (12)
- Master's Thesis (10)
- Postprint (10)
- Preprint (4)
- Bachelor Thesis (1)
Is part of the Bibliography
- yes (998)
Keywords
- answer set programming (13)
- Answer Set Programming (10)
- Answer set programming (10)
- Machine Learning (7)
- Maschinelles Lernen (7)
- Antwortmengenprogrammierung (6)
- E-Learning (6)
- Informatik (6)
- Modellierung (5)
- Informatikdidaktik (4)
- Internet of Things (4)
- MQTT (4)
- higher education (4)
- machine learning (4)
- security (4)
- Algorithms (3)
- Digitale Medien (3)
- EEG (3)
- Equilibrium logic (3)
- Komplexität (3)
- Machine learning (3)
- Modeling (3)
- Ontologie (3)
- Optimization (3)
- Semantic Web (3)
- Softwareentwicklung (3)
- didactics (3)
- education (3)
- formal languages (3)
- image processing (3)
- monitoring (3)
- teacher training (3)
- verification (3)
- 3D visualization (2)
- ASIC (2)
- Adaptivity (2)
- Algorithmen (2)
- Analytical models (2)
- Assessment (2)
- Autismus (2)
- Automata systems (2)
- Bildungstechnologien (2)
- Bildverarbeitung (2)
- Code (2)
- Codierungstheorie (2)
- Complexity (2)
- Computergrafik (2)
- Computersicherheit (2)
- Coq (2)
- Debugging (2)
- Deep Learning (2)
- Didaktik (2)
- Digitalisierung (2)
- E-learning (2)
- Educational Technologies (2)
- Event mapping (2)
- FMC (2)
- FPGA (2)
- Fault tolerance (2)
- Fehlererkennung (2)
- Hochschuldidaktik (2)
- Hochschullehre (2)
- ICA (2)
- IT-Infrastruktur (2)
- Informatikstudium (2)
- Internet (2)
- Knowledge Representation and Reasoning (2)
- Konstruktivismus (2)
- Künstliche Intelligenz (2)
- Lernumgebung (2)
- Lindenmayer systems (2)
- Mensch-Technik-Interaktion (2)
- Methodik (2)
- Middleware (2)
- Modell (2)
- Modelling (2)
- Non-monotonic reasoning (2)
- Onlinelehre (2)
- Ontology (2)
- Optimierung (2)
- Parameterized complexity (2)
- Preference Handling (2)
- Process (2)
- Process mining (2)
- Prototyping (2)
- Prozess (2)
- Prozessmodellierung (2)
- ResNet (2)
- Software Engineering (2)
- Synthese (2)
- Systemstruktur (2)
- TPACK (2)
- Texturen (2)
- Theory (2)
- Tracking (2)
- Treewidth (2)
- User Experience (2)
- Visualisierung (2)
- Vorhersage (2)
- anti-cancer drugs (2)
- argument mining (2)
- automatic feedback (2)
- bioinformatics (2)
- code (2)
- complexity (2)
- computer graphics (2)
- computer science education (2)
- concurrent checking (2)
- cooperating systems (2)
- deep neural networks (2)
- drug-sensitivity prediction (2)
- e-learning (2)
- edge computing (2)
- embedded systems (2)
- face tracking (2)
- facial expression (2)
- firmware update (2)
- geovisualization (2)
- human computer interaction (2)
- informatics (2)
- knowledge representation and nonmonotonic reasoning (2)
- lesson planning (2)
- lesson preparation (2)
- logic programming (2)
- maschinelles Lernen (2)
- model (2)
- non-photorealistic rendering (2)
- online learning (2)
- parsing (2)
- perception (2)
- perception differences (2)
- physical computing (2)
- policy evaluation (2)
- radhard design (2)
- reliability (2)
- scientific workflows (2)
- self-adaptive multiprocessing system (2)
- single event upset (2)
- software development (2)
- solar particle event (2)
- support system (2)
- systems biology (2)
- test (2)
- textures (2)
- virtual 3D city models (2)
- virtual mobility (2)
- virtuelle 3D-Stadtmodelle (2)
- visualization (2)
- 'Peer To Peer' (1)
- (FPGA) (1)
- (SET) count rate (1)
- 13C metabolic flux analysis (1)
- 2-tag system (1)
- 3D Computer Grafik (1)
- 3D Computer Graphics (1)
- 3D Drucken (1)
- 3D Linsen (1)
- 3D Semiotik (1)
- 3D Visualisierung (1)
- 3D computer graphics (1)
- 3D lenses (1)
- 3D modeling (1)
- 3D printing (1)
- 3D semiotics (1)
- 3D-Stadtmodelle (1)
- 3d city models (1)
- 6LoWPAN (1)
- ADFS (1)
- AODV (1)
- ASIC (Applikationsspezifische Integrierte Schaltkreise) (1)
- ASP (Answer Set Programming) (1)
- Abbrecherquote (1)
- Absorbed dose (1)
- Abstraction (1)
- Abstraktion (1)
- Accepting Grammars (1)
- Access control (1)
- Ackerschmalwand (1)
- Active Directory Federation Services (1)
- Active Evaluation (1)
- Active evaluation (1)
- Ad hoc routing (1)
- Adaptivität (1)
- Advanced Video Codec (AVC) (1)
- Adversarial Learning (1)
- Aggregates (1)
- Aktive Evaluierung (1)
- Akzeptierende Grammatiken (1)
- Algorithm configuration (1)
- Algorithm portfolios (1)
- Algorithmenablaufplanung (1)
- Algorithmenkonfiguration (1)
- Algorithmenselektion (1)
- Alignment (1)
- Android Security (1)
- Android hybrid apps (1)
- Angewandte Spieltheorie (1)
- Animal building (1)
- Anisotroper Kuwahara Filter (1)
- Anleitung (1)
- Answer Set Solving modulo Theories (1)
- Anti-cancer drugs (1)
- Antwortmengen Programmierung (1)
- App (1)
- Application Aggregation (1)
- Applications and Software Development (1)
- Applied Game Theory (1)
- Apps (1)
- Arabidopsis thaliana (1)
- Argumentation (1)
- Argumentation structure (1)
- Artificial Intelligence (1)
- Artificial Neuronal Network (1)
- Asynchrone Schaltung (1)
- Attention (1)
- Aufmerksamkeit (1)
- Aufzählung (1)
- Augenbewegungen (1)
- Augmentation (1)
- Augmented and virtual reality (1)
- Ausbildung (1)
- Ausreissererkennung (1)
- Authorization (1)
- Autism (1)
- Automated parallelization (1)
- Automatic UI Generation (1)
- Automatically controlled windows (1)
- Autorisierung (1)
- BCH (1)
- BCH code (1)
- BCH-Code (1)
- BCI (1)
- BSS (1)
- Bachelorstudierende der Informatik (1)
- Backdoors (1)
- Barcode (1)
- Batch processing (1)
- Baumweite (1)
- Bean (1)
- Behavior (1)
- Benchmark testing (1)
- Benutzeroberfläche (1)
- Benutzungsschnittstellen Ontologien (1)
- Berliner Modell (1)
- Berührungseingaben (1)
- Betrachtungsebenen (1)
- Beweis (1)
- Beweisassistent (1)
- Beweistheorie (1)
- Beweisumgebung (1)
- Bilddatenanalyse (1)
- Bildung (1)
- Binäres Entscheidungsdiagramm (1)
- Bio-jETI (1)
- Biocomputing (1)
- Bioelektrisches Signal (1)
- Bioinformatik (1)
- Blind users (1)
- Boolean constraint solver (1)
- Boolean logic models (1)
- Boosting (1)
- Bot Detection (1)
- Brain Computer Interface (1)
- Business Process (1)
- Business Process Models (1)
- Business process intelligence (1)
- CASP (Constraint Answer Set Programming) (1)
- CP-Logic (1)
- CSC (1)
- Cactus (1)
- Campus (1)
- CertiCoq (1)
- Chipkarten (1)
- Choreographien (1)
- Circuit faults (1)
- CityGML (1)
- Classification (1)
- Clock tree (1)
- Cloud (1)
- Cluster Computing (1)
- Cluster computing (1)
- Clusteranalyse (1)
- Code generation (1)
- Codierung (1)
- Coding theory (1)
- Cognitive Apprenticeship (1)
- Coherent phonons (1)
- Combinatorial multi-objective optimization (1)
- Common Spatial Pattern (1)
- Complementary Circuits (1)
- Complex optimization (1)
- Compliance (1)
- Composed UIs (1)
- Composition (1)
- Computational Complexity (1)
- Computational Science (1)
- Computational complexity (1)
- Computational grid (1)
- Computer Science (1)
- Computer Science Education (1)
- Computer security (1)
- Computergestützes Training (1)
- Computing with DNA (1)
- Conceptual (1)
- Conformant Planning (1)
- Conrad Hal Waddington (1)
- Constraint (1)
- Constraint Solving (1)
- Constraint satisfaction (1)
- Constraint-Programmierung (1)
- Constructive solid geometry (1)
- Context awareness (1)
- Contextualized learning (1)
- Continuous Testing (1)
- Continuous Versioning (1)
- Controlled Derivations (1)
- Convolution (1)
- Course timetabling (1)
- Covariate Shift (1)
- Curry (1)
- Customer ownership (1)
- D-galactosamine (1)
- DDoS (1)
- DMR (1)
- DNA (1)
- DNA computing (1)
- DNA hairpin formation (1)
- DNS (1)
- DPLL (1)
- DRMAA (1)
- DRMS (1)
- Data federation (1)
- Database (1)
- Deal of the Day (1)
- Decidability (1)
- Declarative Problem Solving (1)
- Declare (1)
- Deduction (1)
- Deep learning (1)
- Defining characteristics of physical computing (1)
- Dempster-Shafer-Theorie (1)
- Dempster–Shafer theory (1)
- Denotational semantics (1)
- Description Logics (1)
- Design (1)
- Design for testability (DFT) (1)
- Deskriptive Logik (1)
- Deutschland (1)
- Diagonalisierung (1)
- Dialog-based User Interfaces (1)
- Dialogbasierte Benutzerschnittstellen (1)
- Didaktik der Informatik (1)
- Didaktische Konzepte (1)
- Dienst-Ökosysteme (1)
- Dienstkomposition (1)
- Dienstplattform (1)
- Differenz von Gauss Filtern (1)
- Digital Design (1)
- Digital Game Based Learning (1)
- Digital Media (1)
- Digital image analysis (1)
- Digitalisation (1)
- Digitalization (1)
- Distributed Computing (1)
- Diversität (1)
- Domain-Specific Languages (1)
- Domänenspezifische Sprachen (1)
- Dose rate (1)
- Double cell upsets (DCUs) (1)
- Dreidimensionale Computergraphik (1)
- Dynamic Programming (1)
- Dynamical X-ray theory (1)
- Dynamische Programmierung (1)
- Dynamische Rekonfiguration (1)
- E-Assessment (1)
- E-Government (1)
- E-Klausuren (1)
- E-Portfolio (1)
- E-teaching (1)
- EDC (1)
- Echtzeitanwendung (1)
- Edge Computing (1)
- Educational game (1)
- Educational timetabling (1)
- Eingabegenauigkeit (1)
- Eisenbahnnetz (1)
- Elektroencephalographie (1)
- Elektronisches Prüfen (1)
- Emotionen (1)
- Emotionsforschung (1)
- Encoding (1)
- Engines (1)
- Enterprise Architecture (1)
- Enterprise Search (1)
- Entity Linking (1)
- Entscheidungsbäume (1)
- Entwurf (1)
- Entwurfsmuster (1)
- Entwurfsmuster für SOA-Sicherheit (1)
- Entwurfsprinzipien (1)
- Entwurfsraumexploration (1)
- Epigenetic landscape (1)
- Epistemic Logic Programs (1)
- Erfüllbarkeit einer Formel der Aussagenlogik (1)
- Erfüllbarkeitsproblem (1)
- Erklärbarkeit (1)
- Error Estimation (1)
- Error-Detection Circuits (1)
- European Bioinformatics Institute (1)
- Evaluation (1)
- Evaluierung semantischer Suchmaschinen (1)
- Evidenztheorie (1)
- Evolution (1)
- Experimentation (1)
- Explainability (1)
- Explicit negation (1)
- Exploration (1)
- Explore-first Programming (1)
- Exponential Time Hypothesis (1)
- Exponentialzeit Hypothese (1)
- Extensibility (1)
- Extreme Model-Driven Development (1)
- FEDC (1)
- FMC-QE (1)
- FOSS (1)
- Fault Localization (1)
- Fault tolerant systems (1)
- Feature Combination (1)
- Feature extraction (1)
- Feedback (1)
- Fehlende Daten (1)
- Fehlerkorrektur (1)
- Fehlerschätzung (1)
- Fehlvorstellung (1)
- Field programmable gate arrays (1)
- Finite automata (1)
- Flip-flops (1)
- Flussgesteuerter Bilateraler Filter (1)
- Focus+Context Visualization (1)
- Fokus-&-Kontext Visualisierung (1)
- Forgetting (1)
- Formalismus (1)
- Formalitätsgrad (1)
- Formeln der quantifizierten Aussagenlogik (1)
- Forschendes Lernen (1)
- Forschungsdatenmanagement (1)
- Framework (1)
- Freshmen (1)
- GERBIL (1)
- GIS-Dienstkomposition (1)
- GPU (1)
- Game-Design-Elemente (1)
- Game-based learning (1)
- Gamification (1)
- Gebrauchstauglichkeit (1)
- Gebäudemodelle (1)
- Gehirn-Computer-Schnittstelle (1)
- Geländemodelle (1)
- Generalisierung (1)
- Generative Programmierung (1)
- Generative Programming (1)
- Geodaten (1)
- Geometrieerzeugung (1)
- Geovisualisierung (1)
- Geräte-Treiber (1)
- Geschäftsprozess (1)
- Geschäftsprozessmodelle (1)
- Gesichtsausdruck (1)
- Gesteuerte Ableitungen (1)
- Gesture input (1)
- Gleichheit (1)
- Globus (1)
- Grammar Systems (1)
- Grammatikalische Inferenz (1)
- Grammatiksysteme (1)
- Graph Convolutional Neural Networks (1)
- Graph Embedding (1)
- Graph-basiertes Ranking (1)
- Graphfärbung (1)
- Green computing (1)
- Grid (1)
- Grid Computing (1)
- Grounded theory (1)
- Grounding Theory (1)
- H.264 (1)
- HCI (1)
- HDI (1)
- Hairpin completions (1)
- Hairpin reductions (1)
- Hardware Design (1)
- Hardware accelerator (1)
- Hardware-Software-Co-Design (1)
- Hauptkomponentenanalyse (1)
- Heat diffusion (1)
- Heterogenität (1)
- Hierarchically configurable mask register (1)
- High-Level Synthesis (1)
- Histograms (1)
- Hochschul-Apps (1)
- Hochschul-Cloud (1)
- Hochschulbildung (1)
- Hochschulsystem (1)
- Https traffic (1)
- Human Factors (1)
- Human-Technology Interaction (1)
- Hurricane Sandy (1)
- Hybrid App (1)
- I/O-effiziente Algorithmen (1)
- ICT (1)
- IP core (1)
- IT security (1)
- IT-Security (1)
- IT-Sicherheit (1)
- IaaS (1)
- Identifiers (1)
- Image and video stylization (1)
- Image resolution (1)
- Imperative calculi (1)
- Improving classroom (1)
- Incoherent phonons (1)
- Incremental answer set programming (1)
- Industrie 4.0 (1)
- Industry 4.0 (1)
- Inference (1)
- Informatics (1)
- Informatik-Studiengänge (1)
- Informatiksystem (1)
- Informatikunterricht (1)
- Informatikvoraussetzungen (1)
- Information Transfer Rate (1)
- Information federation (1)
- Information integration (1)
- Information retrieval (1)
- Information security (1)
- Informationsextraktion (1)
- Infrastruktur (1)
- Inkonsistenz (1)
- Innovation (1)
- Inquiry-based learning (1)
- Insurance industry (1)
- Integrated circuit modeling (1)
- Integration (1)
- Interactive Rendering (1)
- Interactive system (1)
- Interaktionsmodel (1)
- Interaktionsmodellierung (1)
- Interaktionstechniken (1)
- Interaktives Rendering (1)
- Interaktives System (1)
- Interface design (1)
- Internet Security (1)
- Internet-Sicherheit (1)
- Interoperability (1)
- Interoperabilität (1)
- Interpretability (1)
- Interpretierbarkeit (1)
- Intuition (1)
- IoT (1)
- Job monitoring (1)
- Job submission (1)
- Kartografisches Design (1)
- Kern-PCA (1)
- Kernel (1)
- Kernelization (1)
- Kernmethoden (1)
- Key Competencies (1)
- Key input (1)
- Klassifikation (1)
- Klassifikation mit großem Margin (1)
- Klassifikator-Kalibrierung (1)
- Klimafolgenanalyse (1)
- Klimawandel (1)
- Knowledge (1)
- Knowledge Management (1)
- Knowledge representation (1)
- Kommunikation (1)
- Kommunismus (1)
- Kompetenz (1)
- Kompilation (1)
- Komplexitätsbewältigung (1)
- Komplexitätstheorie (1)
- Komposition (1)
- Konnektionskalkül (1)
- Konzeptionell (1)
- Kybernetik (1)
- Künstliche Neuronale Netzwerke (1)
- L systems (1)
- LBA problem (1)
- LDPC code (1)
- LDPC-Code (1)
- Landmark visibility (1)
- Landmarken (1)
- Large Margin Classification (1)
- Laser Cutten (1)
- Learning (1)
- Learning Analytics (1)
- Learning environment (1)
- Lebenslanges Lernen (1)
- Leftmost Derivations (1)
- Lehre (1)
- Lehrer (1)
- Lehrkräfteausbildung (1)
- Leistungsvorhersage (1)
- Lernen (1)
- Lernsoftware (1)
- Life-Long Learning (1)
- Linked Data Anwendungen (1)
- Linked Data Application Modelling (1)
- Linksableitungen (1)
- Linux (1)
- Linux device drivers (1)
- Literature mining (1)
- Liver neoplasms (1)
- Load Balancing (1)
- Localization (1)
- Location awareness (1)
- Logic Programming (1)
- Logic programming (1)
- Logics (1)
- Logik (1)
- Logiksynthese (1)
- Loss (1)
- Low Latency (1)
- Lower Bounds (1)
- Loyalty (1)
- M2M (1)
- MEG (1)
- MFA (1)
- Magnetoencephalographie (1)
- Malware (1)
- Markov processes (1)
- Masking of X-values (1)
- Massenklausuren (1)
- Mathematical Optimization (1)
- Mathematikdidaktik (1)
- Mathematikphilosophie (1)
- Mathematische Optimierung (1)
- Matrizen-Eigenwertaufgabe (1)
- Media in education (1)
- Megamodel (1)
- Megamodell (1)
- Mehrklassen-Klassifikation (1)
- Mensch-Computer-Interaktion (1)
- Message Passing Interface (1)
- Meta-Programming (1)
- Metamodell (1)
- Methoden der semantischen Suche (1)
- Methodology (1)
- Metric learning (1)
- Migration (1)
- Minimal perturbation problems (1)
- Mischmodelle (1)
- Mischung <Signalverarbeitung> (1)
- Mobile App (1)
- Mobile Campus Application (1)
- Mobile Learning (1)
- Mobile application (1)
- Mobile devices (1)
- Mobile learning (1)
- Mobiles Lernen (1)
- Mobilgeräte (1)
- Model Based Engineering (1)
- Model Checking (1)
- Model Driven Architecture (1)
- Model Driven UI Development (1)
- Model Management (1)
- Model checking (1)
- Model-Driven Engineering (1)
- Model-Driven Software Development (1)
- Modell Management (1)
- Modell-driven Security (1)
- Modell-getriebene Sicherheit (1)
- Modellbasiert (1)
- Modellgetriebene Architektur (1)
- Modellgetriebene Entwicklung (1)
- Modellgetriebene Softwareentwicklung (1)
- Modellgetriebene UI Entwicklung (1)
- Molekulare Bioinformatik (1)
- Motivation (1)
- Multi Task Learning (1)
- Multi-Class (1)
- Multi-Task-Lernen (1)
- Multi-objective optimization (1)
- Multi-sided platforms (1)
- Multimedia (1)
- Multimodal User Interfaces (1)
- Multimodale Benutzerschnittstellen (1)
- Multiple interpretation scheme (1)
- Multiprocessor (1)
- Multiprozessor (1)
- N-temperature model (1)
- NETCONF (1)
- NFC (1)
- NUI (1)
- Nash equilibrium (1)
- Natural language processing (1)
- Natural ventilation (1)
- Navigation (1)
- Network (1)
- Network Management (1)
- Network security (1)
- Netzwerk Management (1)
- Netzwerke (1)
- Neural networks (1)
- Neuronales Netz (1)
- New On-Line Error-Detection Methode (1)
- Next Generation Network (1)
- Nicht-photorealistisches Rendering (1)
- Nichtfotorealistische Bildsynthese (1)
- Non-Monotonic (1)
- Nonmonotonic reasoning (1)
- Nutzungserlebnis (1)
- Nutzungsinteresse (1)
- OBI (1)
- OCCI (1)
- OSSE (1)
- Objektive Schwierigkeit (1)
- Omega (1)
- Ontologien (1)
- Ontologies (1)
- Open Badge Infrastructure (1)
- Open Badges (1)
- Open Source (1)
- Operation problem (1)
- Optimierungsproblem (1)
- Parallel Programming (1)
- Parallel SAT solving (1)
- Parallel job execution time estimation (1)
- Parallele Datenverarbeitung (1)
- Paralleles Rechnen (1)
- Parallelrechner (1)
- Parameterized Complexity (1)
- Parametrisierte Komplexität (1)
- Parsing (1)
- Particle detector (1)
- Partizipation (1)
- Patterns (1)
- Pedagogical issues (1)
- Pedestrian navigation (1)
- Peer-Review (1)
- Peer-to-Peer-Netz ; GRID computing ; Zuverlässigkeit ; Web Services ; Betriebsmittelverwaltung ; Migration (1)
- Performance Evaluation (1)
- Performance Prediction (1)
- Personal Learning Environment (1)
- Personalization (1)
- Persönliche Lernumgebung (1)
- Pervasive computing (1)
- Pervasive game (1)
- Pervasive learning (1)
- Phantoms (1)
- Planar tactile display (1)
- Planing (1)
- Plant identification (1)
- Platzierung (1)
- Polarization (1)
- Policy Enforcement (1)
- Power Monitoring (1)
- Pre-RS Traceability (1)
- Prediction Game (1)
- Predictive Models (1)
- Privacy Protection (1)
- Probleme in der Studie (1)
- Problemlösen (1)
- Process Management (1)
- Process model analysis (1)
- Process modeling (1)
- Product lifecycle management (1)
- Professionalisierung (1)
- Professoren (1)
- Programmierung (1)
- Programming (1)
- Programming by optimization (1)
- Proof Theory (1)
- Prototyp (1)
- Prozess Verbesserung (1)
- Prozesse (1)
- Prozessmanagement (1)
- Prozessmodell (1)
- Prozesssynchronisierung (1)
- Prädiktionsspiel (1)
- Präferenzen (1)
- Prüfungsoptimierung (1)
- Quantified Boolean Formula (QBF) (1)
- Quantitative Modeling (1)
- Quantitative Modellierung (1)
- Queuing Theory (1)
- RADFET (1)
- RADFETs (1)
- REST (1)
- RSA triangle (1)
- Radiation hardness (1)
- Random access memory (1)
- Ranking (1)
- Reasoning (1)
- Reconfigurable (1)
- Reflexion (1)
- Region of Interest (1)
- Regression (1)
- Regularisierung (1)
- Regularization (1)
- Rekonfiguration (1)
- Rendering (1)
- Reparatur (1)
- Reproducibility of results (1)
- Reuseable UIs (1)
- Reversibility (1)
- SAMR (1)
- SET pulsewidth distribution (1)
- SMT (SAT Modulo Theories) (1)
- SOA (1)
- SOA Security Pattern (1)
- SSO (1)
- STG decomposition (1)
- STG-Dekomposition (1)
- SWOT (1)
- SaaS (1)
- Sample Selection Bias (1)
- Satisfiability (1)
- Scalability (1)
- Scale-invariant feature transform (SIFT) (1)
- Scene graph systems (1)
- Schlüsselkompetenzen (1)
- Schulmaterial (1)
- Scientific images (1)
- Screen reader (1)
- Seamless learning (1)
- Search problems (1)
- Security (1)
- Security Modelling (1)
- Segmentierung (1)
- Selektionsbias (1)
- Self-Checking Circuits (1)
- Self-adaptive MPSoC (1)
- Semantic Interoperability (1)
- Semantic Search (1)
- Semantic data (1)
- Semantic web (1)
- Semantik Web (1)
- Semantische Suche (1)
- Semilinearity property (1)
- Sensornetzwerke (1)
- Sequence embeddings (1)
- Service Creation (1)
- Service Delivery Platform (1)
- Service Ecosystems (1)
- Service Oriented Architectures (1)
- Service convergence (1)
- Service orientation (1)
- Service-Orientierte Architekturen (1)
- Service-oriented Architecture (1)
- Service-oriented Architectures (1)
- Service-oriented architecture (1)
- Serviceorientierte Architektur (1)
- Shader (1)
- Sharing (1)
- Sicherheitsmodellierung (1)
- Signal Processing (1)
- Signal processing (1)
- Signaling transduction networks (1)
- Signalquellentrennung (1)
- Signaltrennung (1)
- Simulation (1)
- Simulations (1)
- Simultane Diagonalisierung (1)
- Single Event Transient (1)
- Single Sign On (1)
- Single Trial Analysis (1)
- Single event effect (1)
- Single event upsets (1)
- Single-event transient (SET) (1)
- Skelettberechnung (1)
- Social Media Analysis (1)
- Software (1)
- Software architecture (1)
- Software-basierte Cache-Kohärenz (1)
- Softwarearchitektur (1)
- Sonnenteilchen-Ereignis (1)
- Spam (1)
- Spam Filtering (1)
- Spam-Erkennung (1)
- Spam-Filter (1)
- Spam-Filtering (1)
- Spatio-Spectral Filter (1)
- Spawning (1)
- Speicher (1)
- Spielbasiertes Lernen (1)
- Splicing (1)
- Splicing processor (1)
- Sprachdesign (1)
- Static Analysis (1)
- Statistical Tests (1)
- Statistical relational learning (1)
- Statistikprogramm R (1)
- Statistische Tests (1)
- Stilisierung (1)
- Stochastic relational process (1)
- Strahlungshartes Design (1)
- Strahlungshärte Entwurf (1)
- Strategie (1)
- Stromverbrauchüberwachung (1)
- Strong equivalence (1)
- Structural equation modeling (1)
- Structuring (1)
- Strukturierung (1)
- Studentenerwartungen (1)
- Studentenhaltungen (1)
- Studentenjobs (1)
- Studienabbrecher (1)
- Studiendauer (1)
- Studieneinstieg (1)
- Studienorganisation (1)
- Suche (1)
- Support Vectors (1)
- Support-Vector Lernen (1)
- System Biologie (1)
- System structure (1)
- Systembiologie (1)
- Systementwurf (1)
- Systems biology (1)
- Systems of parallel communicating (1)
- Szenengraph (1)
- TMR (1)
- TPTP (1)
- Tailored UI Variants (1)
- Taktik (1)
- Teilnehmerzertifikate (1)
- Telekommunikation (1)
- Temporal Answer Set Solving (1)
- Temporal Logic (1)
- Temporallogik (1)
- Temporäre Anbindung (1)
- Terminologische Logik (1)
- Test (1)
- Theoretische Informatik (1)
- Theoretischen Vorlesungen (1)
- Theory formation (1)
- Thermoelasticity (1)
- Time Augmented Petri Nets (1)
- Time Series Analysis (1)
- Time series (1)
- Tomography (1)
- Tool (1)
- Tools (1)
- Traceability (1)
- Traffic data (1)
- Transformation (1)
- Tree decomposition (1)
- Treewidth-aware reductions (1)
- Triple modular redundancy (TMR) (1)
- Tumor types (1)
- Turing machine (1)
- Type and effect systems (1)
- UAV imagery (1)
- UI Components (1)
- UI Metamodels (1)
- UI-Komponenten (1)
- UX (1)
- Ubiquitous learning (1)
- Ultrafast dynamics (1)
- Unabhängige Komponentenanalyse (1)
- Unary languages (1)
- Uniform Access Principle (1)
- University Service Bus (1)
- Universität Bagdad (1)
- Universität Potsdam (1)
- Universitätseinstellungen (1)
- Untere Schranken (1)
- Unterrichtswerkzeuge (1)
- Unvollständigkeit (1)
- Usability (1)
- Usability testing (1)
- Usage Interest (1)
- User Interface Ontologies (1)
- User Interfaces (1)
- User submission pattern (1)
- User-centred design (1)
- VGG16 (1)
- VM (1)
- Value network (1)
- Verhalten (1)
- Verification (1)
- Verifikation (1)
- Verletzung Auflösung (1)
- Verletzung Erklärung (1)
- Verteiltes Rechnen (1)
- Verteilungsunterschied (1)
- Violation Explanation (1)
- Violation Resolution (1)
- Virtual Reality (1)
- Virtual reality (1)
- Virtuelles 3D Stadtmodell (1)
- Visual metaphor (1)
- Vorhersagemodelle (1)
- Wahrnehmung (1)
- Wahrnehmung von Arousal (1)
- Wahrnehmungsunterschiede (1)
- Warteschlangentheorie (1)
- Web Services (1)
- Web Sites (1)
- Web of Data (1)
- Webanwendung (1)
- Webseite (1)
- Well-structuredness (1)
- Wetterextreme (1)
- Wireless Sensor Networks (1)
- Wirtschaftsinformatik (1)
- Wissen (1)
- Wissenschaftlichesworkflows (1)
- Wissensmanagement (1)
- Wissensrepräsentation und -verarbeitung (1)
- Wissensrepräsentation und Schlussfolgerung (1)
- Wohlstrukturiertheit (1)
- Word embeddings (1)
- Workflow (1)
- X-masking (1)
- X-ray computed (1)
- X-values (1)
- ZQSA (1)
- ZQSAT (1)
- Zeitbehaftete Petri Netze (1)
- Zero-Suppressed Binary Decision Diagram (ZDD) (1)
- Zuverlässigkeitsanalyse (1)
- abstraction (1)
- accepting grammars (1)
- action and change (1)
- activities (1)
- activity (1)
- acute liver failure (1)
- acyclicity properties (1)
- adaptiv (1)
- adaptive (1)
- adversarial classification (1)
- algorithm configuration (1)
- algorithm schedules (1)
- algorithm scheduling (1)
- algorithm selection (1)
- algorithms (1)
- analysis (1)
- animated PCA (1)
- animierte PCA (1)
- anisotropic Kuwahara filter (1)
- annealing (1)
- anxiety (1)
- approximate joint diagonalization (1)
- approximate model counting (1)
- architecture (1)
- argumentation (1)
- argumentation structure (1)
- arithmethische Prozeduren (1)
- arithmetic procedures (1)
- arousal (1)
- arousal perception (1)
- artificial intelligence (1)
- artistic rendering (1)
- asynchronous circuit (1)
- asynchrounous design (1)
- authentication (1)
- autism (1)
- automata (1)
- automated driving (1)
- automated guided vehicle routing (1)
- automated planning (1)
- automatic theorem prover (1)
- automatisierter Theorembeweiser (1)
- behavioral (1)
- behavioral abstraction (1)
- belief merging (1)
- belief revision (1)
- benchmark (1)
- bibliometric analysis (1)
- bild (1)
- bio-computing (1)
- biometrics (1)
- biometrische Identifikation (1)
- blind source separation (1)
- block representation (1)
- bootstrapping (1)
- brain-computer interface (1)
- building models (1)
- bundled data (1)
- business informatics (1)
- camera sensor (1)
- car assembly operations (1)
- cartographic design (1)
- cellular automata (1)
- changing the study field (1)
- changing the university (1)
- choreographies (1)
- circuit Faults (1)
- citation analysis (1)
- classifier calibration (1)
- classroom material (1)
- click controller (1)
- climate change (1)
- climate impact analysis (1)
- clocks (1)
- clustering (1)
- co-citation analysis (1)
- co-occurrence analysis (1)
- code generation (1)
- cognitive apprenticeship (1)
- coherence relation (1)
- coherence-enhancing filtering (1)
- collaborative learning (1)
- combinatorial optimization problems (1)
- combined task and motion planning (1)
- common spatial patterns (1)
- communication (1)
- competition (1)
- compilation (1)
- complex networks (1)
- compliance (1)
- computational biology (1)
- computational methods (1)
- computational thinking (1)
- computer security (1)
- computer vision (1)
- computergestützte Methoden (1)
- concession (1)
- conductive argument (1)
- connection calculus (1)
- connective (1)
- connectivity (1)
- consistency (1)
- consistency checking (1)
- consistency measures (1)
- constraint (1)
- constraint programming (1)
- constraints (1)
- constructivism (1)
- construktivism (1)
- context-free grammar (1)
- context-sensitive (1)
- continuous (1)
- contrast (1)
- controlled vocabularies (1)
- corpus analysis (1)
- correlated errors (1)
- course timetabling (1)
- craters (1)
- crop (1)
- debugging (1)
- decidability questions (1)
- decision trees (1)
- declarative problem solving (1)
- deep learning (1)
- deep residual networks (1)
- degree of formality (1)
- degree of non-context-freeness (1)
- degree of non-regularity (1)
- degree of non-regulation (1)
- depression (1)
- design (1)
- design flow (1)
- design principles (1)
- design space exploration (1)
- determinism (1)
- detrending (1)
- developmental systems (1)
- diagnosis (1)
- didaktische Rekonstruktion (1)
- difference of Gaussians (1)
- digital circuit (1)
- digital design (1)
- digitale Hochschullehre (1)
- digitally-enabled pedagogies (1)
- domain-specific APIs (1)
- dropout (1)
- drug discovery (1)
- dynamic (1)
- dynamic classification (1)
- dynamic reconfiguration (1)
- dynamic service binding (1)
- dynamisch (1)
- dynamische Klassifikation (1)
- e-Learning (1)
- eGovernment (1)
- eLectures (1)
- economic ripples (1)
- educational reconstruction (1)
- educational systems (1)
- educational timetabling (1)
- eingebettete Systeme (1)
- einseitige Kommunikation (1)
- email spam detection (1)
- emission factor (1)
- emotion (1)
- emotion representation (1)
- emotion research (1)
- endothelin (1)
- endothelin-converting enzyme (1)
- ensemble kalman filter (1)
- ensemble methods (1)
- enterprise search (1)
- entity alignment (1)
- enumeration (1)
- epistemic logic programs (1)
- epistemic specifications (1)
- equality (1)
- error correction (1)
- error detection (1)
- error propagation (1)
- evaluation (1)
- event-related desynchronization (1)
- evidence theory (1)
- evolution (1)
- explicit negation (1)
- external ambiguity (1)
- external memory algorithms (1)
- extreme weather (1)
- eye movements (1)
- fading (1)
- fault tolerance (1)
- field-programmable gate array (1)
- finite model computation (1)
- finite state sequential transducers (1)
- flow-based bilateral filter (1)
- formal (1)
- formal argumentation systems (1)
- formalism (1)
- freie Daten (1)
- freie Software (1)
- functions (1)
- gait (1)
- game based learning (1)
- game design elements (1)
- game-based learning (1)
- gap-filling (1)
- generalization (1)
- geometry generation (1)
- geospatial data (1)
- geospatial services (1)
- global constraints (1)
- globale Constraints (1)
- gradient boosting (1)
- grammar (1)
- grammar inference (1)
- graph analysis (1)
- graph clustering (1)
- graph-based ranking (1)
- greenhouse gas (1)
- hardware accelerator (1)
- hardware architecture (1)
- hardware design (1)
- hardware-software-codesign (1)
- high-throughput analysis (1)
- human-technology interaction (1)
- hybrid (1)
- hybrid semantic search (1)
- hybrid solving (1)
- hybride semantische Suche (1)
- hybrides Problemlösen (1)
- ice harboring (1)
- image (1)
- image classification (1)
- image data analysis (1)
- image recognition (1)
- imaging (1)
- impacts (1)
- incompleteness (1)
- inconsistency (1)
- incremental SVM (1)
- independent component analysis (1)
- indirect economic impacts (1)
- indirekte ökonomische Effekte (1)
- informal and formal learning (1)
- informal logic (1)
- information extraction (1)
- information flow control (1)
- information retrieval (1)
- informatische Bildung im Sekundarbereich (1)
- infrastructure (1)
- input accuracy (1)
- interaction modeling (1)
- interaction techniques (1)
- internal ambiguity (1)
- intrusion detection (1)
- intuition (1)
- irradiation (1)
- joint lab (1)
- kernel PCA (1)
- kernel methods (1)
- key competences in physical computing (1)
- kidney cancer (1)
- knowledge representation and reasoning (1)
- konvergente Dienste (1)
- landmarks (1)
- language design (1)
- latches (1)
- lautes Denken (1)
- leanCoP (1)
- learning environment (1)
- lebenslanges Lernen (1)
- leftmost derivations (1)
- linear code (1)
- linear programming (1)
- linearer Code (1)
- locomotion (1)
- logic (1)
- logic programming methodology and applications (1)
- logic synthesis (1)
- logic-based modeling (1)
- logical errors (1)
- logical signaling networks (1)
- logische Ergänzung (1)
- logische Fehler (1)
- logische Programmierung (1)
- logische Signalnetzwerke (1)
- loop formulas (1)
- loose programming (1)
- loss propagation (1)
- lunar exploration (1)
- machine learning algorithms (1)
- macro-economic modelling (1)
- makroökonomische Modellierung (1)
- malware detection (1)
- manipulation planning (1)
- map/reduce (1)
- maschninelles Lernen (1)
- mathematics education (1)
- measure development (1)
- media (1)
- medical (1)
- medizinisch (1)
- meta model (1)
- metabolic network (1)
- metabolism (1)
- metabolomics (1)
- metadata (1)
- metastasis (1)
- methodology (1)
- middleware (1)
- misconception (1)
- mixture models (1)
- mobile Applikationen (1)
- mobile devices (1)
- mobile learning (1)
- mobile technologies and apps (1)
- mobiles lernen (1)
- model-based (1)
- model-driven architecture (1)
- modeling (1)
- molecular networks (1)
- molekulare Netzwerke (1)
- multi core data processing (1)
- multi-class classification (1)
- natural disasters (1)
- natural language generation (1)
- navigation (1)
- neighborhood (1)
- networks-on-chip (1)
- neue Online-Fehlererkennungsmethode (1)
- neural networks (1)
- neutral endopeptidase (1)
- nichtlineare ICA (1)
- nichtlineare PCA (NLPCA) (1)
- nichtlineare Projektionen (1)
- non-monotonic reasoning (1)
- nonlinear ICA (1)
- nonlinear PCA (NLPCA) (1)
- nonlinear projections (1)
- nonphotorealistic rendering (NPR) (1)
- o-ambiguity (1)
- objective difficulty (1)
- omega (1)
- on-chip (1)
- on-farm evaluation (1)
- one-sided communication (1)
- oneM2M (1)
- oneM2M Ontology (1)
- ontologies (1)
- open source (1)
- optimization (1)
- organisational evolution (1)
- outlier detection (1)
- output space compaction (1)
- overcomplete ICA (1)
- pMOS radiation dosimeter (1)
- paper prototyping (1)
- parallel processing (1)
- parallel programming (1)
- parallel rewriting (1)
- parallel solving (1)
- parallele Programmierung (1)
- paralleles Lösen (1)
- parity aggregate operator (1)
- pattern recognition (1)
- pdf forms (1)
- philosophy of mathematics (1)
- physical Computing (1)
- physical computing tools (1)
- placement (1)
- planning (1)
- plug-ins (1)
- portfolio-based solving (1)
- prediction (1)
- predictive models (1)
- preferences (1)
- premise acceptability (1)
- priorities (1)
- probabilistic deep learning (1)
- probabilistic deep metric learning (1)
- probabilistische tiefe neuronale Netze (1)
- probabilistisches tiefes metrisches Lernen (1)
- process (1)
- process improvement (1)
- process model (1)
- process model alignment (1)
- process modeling (1)
- process modelling (1)
- process synchronization (1)
- professors (1)
- program encodings (1)
- programmed grammars (1)
- projection (1)
- proof (1)
- proof assistant (1)
- proof complexity (1)
- proof environment (1)
- propagation probability (1)
- prototype (1)
- pruritus (1)
- pulse stretching inverters (1)
- quality of life (1)
- quantum (1)
- radiation hardness (1)
- radiation hardness design (1)
- railway network (1)
- random forest (1)
- real arguments (1)
- real-time (1)
- real-time application (1)
- real-time mapping (1)
- real-walking (1)
- reconfiguration (1)
- reference (1)
- referential effectiveness (1)
- regression (1)
- regular language (1)
- rekonfigurierbar (1)
- relevance (1)
- reliability analysis (1)
- reliability assessment (1)
- repair (1)
- resources (1)
- restricted parallelism (1)
- risk analysis (1)
- robust ICA (1)
- robuste ICA (1)
- safety (1)
- satisfiability (1)
- scheduling (1)
- search (1)
- secondary computer science education (1)
- segmentation (1)
- selbstanpassendes Multiprozessorsystem (1)
- selbstprüfende Schaltungen (1)
- selective fault tolerance (1)
- self-checking (1)
- semantic domain modeling (1)
- semantic ranking (1)
- semantic search (1)
- semantic search evaluation (1)
- semantic search methods (1)
- semantic web (1)
- semantische Domänenmodellierung (1)
- semantische Suche (1)
- semantisches Netz (1)
- semantisches Ranking (1)
- sensitivity (1)
- service composition (1)
- shader (1)
- simplicity (1)
- single event upsets (1)
- single-event transient (1)
- single-trial-analysis (1)
- site-specific weed management (1)
- skeletonization (1)
- sleep quality (1)
- smart farming (1)
- smartphone (1)
- socio-technical system (1)
- soft errors (1)
- software (1)
- software engineering (1)
- software-based cache coherence (1)
- sozio-technisches System (1)
- space missions (1)
- speed independence (1)
- stable model semantics (1)
- state complexity (1)
- static analysis (1)
- static prediction games (1)
- statistics program R (1)
- strahleninduzierte Einzelereignis-Effekte (1)
- strong equivalence (1)
- structured output prediction (1)
- strukturierte Vorhersage (1)
- study problems (1)
- stylization (1)
- sufficiency (1)
- suicidal ideations (1)
- supply chains (1)
- support vector machines (1)
- synthesis (1)
- systematic (1)
- systematisch (1)
- tableau calculi (1)
- tactic (1)
- teachers (1)
- teaching (1)
- technical notes and rapid communications (1)
- tele-teaching (1)
- temporary binding (1)
- terrain models (1)
- test response compaction (1)
- theory (1)
- theory of computation (1)
- think aloud (1)
- timing (1)
- tools (1)
- tools for teaching (1)
- topics (1)
- touch input (1)
- tptp (1)
- transformation (1)
- transient Faults (1)
- transient analysis (1)
- triangulated irregular networks (1)
- triple modular redundancy (1)
- tutorial section (1)
- unfounded sets (1)
- unidirektionale Fehler (1)
- university education (1)
- user experience (1)
- video annotation (1)
- virtual 3D city model (1)
- virtual machine (1)
- weather extremes (1)
- wheat crops (1)
- work productivity (1)
- workflow management (1)
- xAPI (1)
- yellow rust (1)
- zero-aliasing (1)
- überbestimmte ICA (1)
Institute
- Institut für Informatik und Computational Science (998)
- Hasso-Plattner-Institut für Digital Engineering gGmbH (18)
- Extern (5)
- Institut für Physik und Astronomie (2)
- Universitätsbibliothek (2)
- Zentrum für Qualitätsentwicklung in Lehre und Studium (ZfQ) (2)
- eLiS - E-Learning in Studienbereichen (2)
- Department Erziehungswissenschaft (1)
- Department Linguistik (1)
- Historisches Institut (1)
In a recent line of research, two familiar concepts from logic programming semantics (unfounded sets and splitting) were extrapolated to the case of epistemic logic programs. The property of epistemic splitting provides a natural and modular way to understand programs without epistemic cycles but, surprisingly, was only fulfilled by Gelfond's original semantics (G91), among the many proposals in the literature. On the other hand, G91 may suffer from a kind of self-supported, unfounded derivation when epistemic cycles come into play. Recently, the absence of these derivations was also formalised as a property of epistemic semantics called foundedness. Moreover, a first semantics proved to satisfy foundedness was also proposed, the so-called Founded Autoepistemic Equilibrium Logic (FAEEL). In this paper, we prove that FAEEL also satisfies the epistemic splitting property, something that, together with foundedness, had not been fulfilled by any other approach to date. To prove this result, we provide an alternative characterisation of FAEEL as a combination of G91 with a simpler logic we call Founded Epistemic Equilibrium Logic (FEEL), which is somehow an extrapolation of the stable model semantics to the modal logic S5.
Parsing of argumentative structures has become a very active line of research in recent years. Like discourse parsing or any other natural language task that requires prediction of linguistic structures, most approaches choose to learn a local model and then perform global decoding over the local probability distributions, often imposing constraints that are specific to the task at hand. Specifically for argumentation parsing, two decoding approaches have been recently proposed: Minimum Spanning Trees (MST) and Integer Linear Programming (ILP), following similar trends in discourse parsing. In contrast to discourse parsing though, where trees are not always used as underlying annotation schemes, argumentation structures so far have always been represented with trees. Using the 'argumentative microtext corpus' [in: Argumentation and Reasoned Action: Proceedings of the 1st European Conference on Argumentation, Lisbon 2015 / Vol. 2, College Publications, London, 2016, pp. 801-815] as underlying data and replicating three different decoding mechanisms, in this paper we propose a novel ILP decoder and an extension to our earlier MST work, and then thoroughly compare the approaches. The result is that our new decoder outperforms related work in important respects, and that in general, ILP and MST yield very similar performance.
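The MST decoding mentioned above searches for the highest-scoring spanning tree (arborescence) over local edge scores. As a hedged toy sketch of that idea — the scores, graph size, and brute-force search below are illustrative assumptions, not the paper's decoder — one can enumerate all parent assignments over a small component graph and keep the best tree:

```python
# Brute-force maximum spanning arborescence over local edge scores — a toy
# stand-in for MST decoding (scores and graph are made up, not from the paper).
from itertools import product

def _reaches_root(v, assign, root):
    seen = set()
    while v != root:
        if v in seen:
            return False          # cycle detected
        seen.add(v)
        v = assign[v]
    return True

def best_arborescence(n, score, root=0):
    """Try every parent assignment for the non-root nodes and keep the
    highest-scoring one that forms a tree rooted at `root`."""
    nodes = [v for v in range(n) if v != root]
    best, best_parents = float("-inf"), None
    for parents in product(range(n), repeat=len(nodes)):
        assign = dict(zip(nodes, parents))
        if any(v == p for v, p in assign.items()):
            continue              # no self-loops
        if not all(_reaches_root(v, assign, root) for v in nodes):
            continue              # some node never reaches the root
        total = sum(score[(p, v)] for v, p in assign.items())
        if total > best:
            best, best_parents = total, assign
    return best, best_parents

# hypothetical local-model scores for directed edges (head, dependent)
score = {
    (0, 1): 0.9, (2, 1): 0.4, (3, 1): 0.1,
    (0, 2): 0.3, (1, 2): 0.8, (3, 2): 0.2,
    (0, 3): 0.1, (1, 3): 0.2, (2, 3): 0.7,
}
```

Real decoders use the Chu-Liu/Edmonds algorithm instead of enumeration, which only stays feasible for the handful of components typical of a microtext.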
Flux-P
(2012)
Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be calculated directly but have to be estimated; for instance, via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in this complex analysis, but requires several steps that have to be carried out manually, hence restricting the use of this software for data interpretation to a rather small number of experiments. In this paper, we present Flux-P as an approach to automate and standardize 13C-based metabolic flux analysis, using the Bio-jETI workflow framework. Using the FiatFlux software as an example, it demonstrates how services can be created that carry out the different analysis steps autonomously and how these can subsequently be assembled into software workflows that perform automated, high-throughput intracellular flux analysis of high quality and reproducibility. Besides significant acceleration and standardization of the data analysis, the agile workflow-based realization supports flexible changes of the analysis workflows on the user level, making it easy to perform custom analyses.
Answer Set Programming (ASP) has become a popular and widespread paradigm for practical Knowledge Representation thanks to its expressiveness and the available enhancements of its input language. One such enhancement is the use of aggregates, for which different semantic proposals have been made. In this paper, we show that any ASP aggregate interpreted under Gelfond and Zhang's (GZ) semantics can be replaced (under strong equivalence) by a propositional formula. Restricted to the original GZ syntax, the resulting formula is reducible to a disjunction of conjunctions of literals but the formulation is still applicable even when the syntax is extended to allow for arbitrary formulas (including nested aggregates) in the condition. Once GZ-aggregates are represented as formulas, we establish a formal comparison (in terms of the logic of Here-and-There) to Ferraris' (F) aggregates, which are defined by a different formula translation involving nested implications. In particular, we prove that if we replace an F-aggregate by a GZ-aggregate in a rule head, we do not lose answer sets (although more can be gained). This extends the previously known result that the opposite happens in rule bodies, i.e., replacing a GZ-aggregate by an F-aggregate in the body may yield more answer sets. Finally, we characterize a class of aggregates for which GZ- and F-semantics coincide.
Detection of malware-infected computers and detection of malicious web domains based on their encrypted HTTPS traffic are challenging problems, because only addresses, timestamps, and data volumes are observable. The detection problems are coupled, because infected clients tend to interact with malicious domains. Traffic data can be collected at a large scale, and antivirus tools can be used to identify infected clients in retrospect. Domains, by contrast, have to be labeled individually after forensic analysis. We explore transfer learning based on sluice networks; this allows the detection models to bootstrap each other. In a large-scale experimental study, we find that the model outperforms known reference models and detects previously unknown malware, previously unknown malware families, and previously unknown malicious domains.
TrainTrap
(2020)
Error correction in coding theory is concerned with detecting and correcting errors in the transmission and storage of messages.
To this end, the message is encoded into a codeword by adding redundant information.
Coding schemes differ in their requirements, for example the maximum number of correctable errors and the speed of correction.
A common scheme is the BCH code, which is used industrially for codes correcting up to four errors. A drawback of these codes is that the hardware latency for computing the error positions grows with the code length.
This dissertation introduces a new coding scheme in which a long code is built from shorter BCH codes by a special arrangement. The arrangement is governed by a further special code, an LDPC code, which is designed for fast error detection.
To this end, a new construction method is presented that yields a code of arbitrary length with an arbitrary, prescribable number of correctable errors. In addition to fast error detection, the construction also provides a simple and fast way to derive an encoding procedure that maps a message to a codeword; for LDPC codes, this is so far unique in the literature.
Building on the construction of the LDPC code, a method is presented for combining it with a BCH code, which arranges the BCH code in blocks. Besides a general description of this code, a concrete code for 2-bit error correction is described. It consists of two parts, which are presented and compared in several variants. For certain lengths of the BCH code, a problem in the correction is identified that follows an algebraic rule.
The BCH code is described very generally, but under certain conditions there exists a BCH code in the narrow sense, which defines the standard. This dissertation modifies the BCH code in the narrow sense so that the algebraic problem in 2-bit error correction no longer arises in the combination with the LDPC code. It is shown that after the modification the new code is still a BCH code in the general sense, capable of correcting 2-bit errors and detecting 3-bit errors. Regarding the hardware implementation of the error correction, it is further shown that the latency of the modified code is lower than that of the BCH code, with further potential for improvement.
The final chapter shows that this modified code is suitable, at arbitrary length, for combination with the LDPC code, so that the scheme not only covers a wider range of code lengths but, owing to the faster decoding, also offers further advantages over a BCH code in the narrow sense.
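The basic principle behind such codes — redundancy added at encoding, error position recovered from a syndrome at decoding — can be illustrated with the classic (7,4) Hamming code, which corrects a single flipped bit. This is a textbook sketch far simpler than the dissertation's BCH/LDPC construction, included only to make the syndrome-decoding idea concrete:

```python
# (7,4) Hamming code: encode 4 data bits with 3 parity bits, then locate and
# correct any single flipped bit via the syndrome. A textbook illustration of
# syndrome decoding, not the dissertation's construction.

def encode(d):                       # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # codeword positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3       # syndrome = 1-based error position
    if pos:
        c[pos - 1] ^= 1              # flip the faulty bit back
    return c

word = encode([1, 0, 1, 1])
received = list(word)
received[4] ^= 1                     # one bit flipped in transit
assert correct(received) == word
```

BCH codes generalise this: the syndrome is computed over a finite field and locates several error positions at once, which is exactly the step whose latency the dissertation targets.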
In this work we tackle the problem of checking strong equivalence of logic programs that may contain local auxiliary atoms, to be removed from their stable models and to be forbidden in any external context. We call this property projective strong equivalence (PSE). It has been recently proved that not every logic program containing auxiliary atoms can be reformulated, under PSE, as another logic program or formula without them – this is known as strongly persistent forgetting. In this paper, we introduce a conservative extension of Equilibrium Logic and its monotonic basis, the logic of Here-and-There, in which we deal with a new connective ‘|’ we call fork. We provide a semantic characterisation of PSE for forks and use it to show that, in this extension, it is always possible to forget auxiliary atoms under strong persistence. We further define when the obtained fork is representable as a regular formula.
Multi-sided platforms (MSP) strongly affect markets and play a crucial part within the digital and networked economy. Although empirical evidence indicates their occurrence in many industries, research has not investigated the game-changing impact of MSP on traditional markets to a sufficient extent. More specifically, we have little knowledge of how MSP affect value creation and customer interaction in entire markets, exploiting the potential of digital technologies to offer new value propositions. Our paper addresses this research gap and provides an initial systematic approach to analyze the impact of MSP on the insurance industry. For this purpose, we analyze the state of the art in research and practice in order to develop a reference model of the value network for the insurance industry. On this basis, we conduct a case-study analysis to discover and analyze roles which are occupied or even newly created by MSP. As a final step, we categorize MSP with regard to their relation to traditional insurance companies, resulting in a classification scheme with four MSP standard types: Competition, Coordination, Cooperation, Collaboration.
Background: The biological interpretation of large-scale gene expression data is one of the paramount challenges in current bioinformatics. In particular, placing the results in the context of other available functional genomics data, such as existing bio-ontologies, has already provided substantial improvement for detecting and categorizing genes of interest. One common approach is to look for functional annotations that are significantly enriched within a group or cluster of genes, as compared to a reference group. Results: In this work, we suggest the information-theoretic concept of mutual information to investigate the relationship between groups of genes, as given by data-driven clustering, and their respective functional categories. Drawing upon related approaches (Gibbons and Roth, Genome Research 12: 1574-1581, 2002), we seek to quantify to what extent individual attributes are sufficient to characterize a given group or cluster of genes. Conclusion: We show that the mutual information provides a systematic framework to assess the relationship between groups or clusters of genes and their functional annotations in a quantitative way. Within this framework, the mutual information allows us to address and incorporate several important issues, such as the interdependence of functional annotations and combinatorial combinations of attributes. It thus supplements and extends the conventional search for overrepresented attributes within a group or cluster of genes. In particular, by taking combinations of attributes into account, the mutual information opens the way to uncover specific functional descriptions of a group of genes or clustering result. All datasets and functional annotations used in this study are publicly available. All scripts used in the analysis are provided as additional files.
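The central quantity of the abstract — mutual information between cluster membership and a functional annotation — can be estimated directly from co-occurrence counts. A minimal sketch with toy data (not the study's gene sets or scripts):

```python
# Mutual information (in bits) between a clustering and a binary functional
# annotation, estimated from counts — toy data, not the study's gene sets.
from collections import Counter
from math import log2

def mutual_information(pairs):
    """pairs: list of (cluster_label, has_annotation) tuples, one per gene."""
    n = len(pairs)
    joint = Counter(pairs)                 # joint counts over (cluster, annotation)
    px = Counter(c for c, _ in pairs)      # marginal over clusters
    py = Counter(a for _, a in pairs)      # marginal over annotation values
    mi = 0.0
    for (c, a), k in joint.items():
        pxy = k / n
        mi += pxy * log2(pxy / ((px[c] / n) * (py[a] / n)))
    return mi

# perfectly informative annotation: genes are in cluster "A" iff annotated
pairs = [("A", True)] * 5 + [("B", False)] * 5
print(mutual_information(pairs))           # 1.0 bit
```

An annotation that is independent of the clustering yields a mutual information of zero; the study's extension to combinations of attributes amounts to replacing the single annotation with a tuple of annotations in each pair.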
Incremental Support Vector Machines (SVM) are instrumental in practical applications of online learning. This work focuses on the design and analysis of efficient incremental SVM learning, with the aim of providing a fast, numerically stable and robust implementation. A detailed analysis of convergence and of algorithmic complexity of incremental SVM learning is carried out. Based on this analysis, a new design of storage and numerical operations is proposed, which speeds up the training of an incremental SVM by a factor of 5 to 20. The performance of the new algorithm is demonstrated in two scenarios: learning with limited resources and active learning. Various applications of the algorithm, such as in drug discovery, online monitoring of industrial devices and surveillance of network traffic, can be foreseen.
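The paper's exact incremental SVM maintains the optimal solution as examples arrive; as a much simpler hedged stand-in for the same online setting, one can train a linear max-margin classifier with stochastic hinge-loss (Pegasos-style) updates, consuming one example at a time. This sketch illustrates online learning only and is not the incremental algorithm analysed in the paper:

```python
# Online linear classifier trained with stochastic hinge-loss (Pegasos-style)
# updates — an illustration of incremental max-margin learning, not the exact
# incremental SVM algorithm analysed in the paper.

def train_online(stream, dim, lam=0.01, epochs=20):
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for x, y in stream:                           # labels y in {-1, +1}
            t += 1
            eta = 1.0 / (lam * t)                     # decaying step size
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1.0 - eta * lam) * wi for wi in w]  # regularisation shrink
            if margin < 1:                            # hinge loss is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

# linearly separable toy stream, consumed one example at a time
data = [([1.0, 2.0], 1), ([2.0, 1.5], 1), ([-1.0, -1.0], -1), ([-2.0, -0.5], -1)]
w = train_online(data, dim=2)
assert all((sum(wi * xi for wi, xi in zip(w, x)) > 0) == (y > 0) for x, y in data)
```

Unlike this approximate scheme, the exact incremental SVM keeps the Karush-Kuhn-Tucker conditions satisfied after every update, which is where the storage and numerical-operation design of the paper matters.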
A central insight from psychological studies on human eye movements is that eye movement patterns are highly individually characteristic. They can, therefore, be used as a biometric feature, that is, subjects can be identified based on their eye movements. This thesis introduces new machine learning methods to identify subjects based on their eye movements while viewing arbitrary content. The thesis focuses on probabilistic modeling of the problem, which has yielded the best results in the most recent literature. The thesis studies the problem in three phases by proposing a purely probabilistic, probabilistic deep learning, and probabilistic deep metric learning approach. In the first phase, the thesis studies models that rely on psychological concepts about eye movements. Recent literature illustrates that individual-specific distributions of gaze patterns can be used to accurately identify individuals. In these studies, models were based on a simple parametric family of distributions. Such simple parametric models can be robustly estimated from sparse data, but have limited flexibility to capture the differences between individuals. Therefore, this thesis proposes a semiparametric model of gaze patterns that is flexible yet robust for individual identification. These patterns can be understood as domain knowledge derived from psychological literature. Fixations and saccades are examples of simple gaze patterns. The proposed semiparametric densities are drawn under a Gaussian process prior centered at a simple parametric distribution. Thus, the model will stay close to the parametric class of densities if little data is available, but it can also deviate from this class if enough data is available, increasing the flexibility of the model. The proposed method is evaluated on a large-scale dataset, showing significant improvements over the state-of-the-art. 
Later, the thesis replaces the model based on gaze patterns derived from psychological concepts with a deep neural network that can learn more informative and complex patterns from raw eye movement data. As previous work has shown that the distribution of these patterns across a sequence is informative, a novel statistical aggregation layer called the quantile layer is introduced. It explicitly fits the distribution of deep patterns learned directly from the raw eye movement data. The proposed deep learning approach is end-to-end learnable, such that the deep model learns to extract informative, short local patterns while the quantile layer learns to approximate the distributions of these patterns. Quantile layers are a generic approach that can converge to standard pooling layers or have a more detailed description of the features being pooled, depending on the problem. The proposed model is evaluated in a large-scale study using the eye movements of subjects viewing arbitrary visual input. The model improves upon the standard pooling layers and other statistical aggregation layers proposed in the literature. It also improves upon the state-of-the-art eye movement biometrics by a wide margin. Finally, for the model to identify any subject — not just the set of subjects it is trained on — a metric learning approach is developed. Metric learning learns a distance function over instances. The metric learning model maps the instances into a metric space, where sequences of the same individual are close, and sequences of different individuals are further apart. This thesis introduces a deep metric learning approach with distributional embeddings. The approach represents sequences as a set of continuous distributions in a metric space; to achieve this, a new loss function based on Wasserstein distances is introduced. The proposed method is evaluated on multiple domains besides eye movement biometrics. 
This approach outperforms the state of the art in deep metric learning in several domains while also outperforming the state of the art in eye movement biometrics.
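The quantile layer described above summarises the distribution of local deep features over a sequence instead of reducing them to a mean or maximum. Its non-learnable core idea can be sketched with the standard library: aggregate a variable-length feature sequence into a fixed-length vector of quantile cut points (the feature values below are made up, and the real layer learns its summary end-to-end):

```python
# Fixed-length "quantile descriptor" of a variable-length feature sequence —
# the non-learnable core idea behind a statistical aggregation layer:
# summarise the distribution of local features, not just their mean or max.
from statistics import quantiles

def quantile_descriptor(features, n=4):
    """Return the (n - 1) quantile cut points of a 1-D feature sequence."""
    return quantiles(features, n=n, method="inclusive")

# e.g. per-saccade velocities extracted from one recording (toy values)
velocities = [30, 45, 80, 120, 200, 310, 55, 70]
desc = quantile_descriptor(velocities)   # quartiles: a 3-number summary
print(desc)
```

Two recordings of different lengths thus map to descriptors of the same dimension, which is what makes the aggregated features usable by a downstream classifier or metric-learning model.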
Combined optimization of spatial and temporal filters for improving brain-computer interfacing
(2006)
Brain-computer interface (BCI) systems create a novel communication channel from the brain to an output device by bypassing conventional motor output pathways of nerves and muscles. Therefore they could provide a new communication and control option for paralyzed patients. Modern BCI technology is essentially based on techniques for the classification of single-trial brain signals. Here we present a novel technique that allows the simultaneous optimization of a spatial and a spectral filter enhancing discriminability rates of multichannel EEG single-trials. The evaluation of 60 experiments involving 22 different subjects demonstrates the significant superiority of the proposed algorithm over its classical counterpart: the median classification error rate was decreased by 11%. Apart from the enhanced classification, the spatial and/or the spectral filter that are determined by the algorithm can also be used for further analysis of the data, e.g., for source localization of the respective brain rhythms.
We consider generating and accepting programmed grammars with bounded degree of non-regulation, that is, the maximum number of elements in success or in failure fields of the underlying grammar. In particular, it is shown that this measure can be restricted to two without loss of descriptional capacity, regardless of whether arbitrary derivations or leftmost derivations are considered. Moreover, in some cases, precise characterizations of the linear bounded automaton problem in terms of programmed grammars are obtained. Thus, the results presented in this paper shed new light on a longstanding open problem in the theory of computational complexity.
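A programmed grammar attaches to each labelled core rule a success field (labels to continue with after the rule applies) and a failure field (labels to continue with when it does not). The toy simulator below derives the non-context-free language {aⁿbⁿcⁿ : n ≥ 1} with field sizes of at most two — matching the bound of the abstract — but the grammar itself is a standard textbook example, not a construction from the paper:

```python
# Simulating a programmed grammar: each rule has a label, a context-free core
# rule, a success field and a failure field (all failure fields empty here).
# Toy grammar for { a^n b^n c^n : n >= 1 }; fields have at most two elements.
from collections import deque

# label: (lhs, rhs, success_field, failure_field)
RULES = {
    1: ("S", "ABC", {2}, set()),
    2: ("A", "aA", {3}, set()),
    3: ("B", "bB", {4}, set()),
    4: ("C", "cC", {2, 5}, set()),   # loop for another round, or start erasing
    5: ("A", "", {6}, set()),
    6: ("B", "", {7}, set()),
    7: ("C", "", set(), set()),      # empty success field: derivation ends
}

def generate(max_len):
    """BFS over (sentential form, current rule label) states."""
    results, seen = set(), set()
    queue = deque([("S", 1)])
    while queue:
        form, label = queue.popleft()
        lhs, rhs, succ, fail = RULES[label]
        if lhs in form:                                # applicable: rewrite leftmost lhs
            new, nxt = form.replace(lhs, rhs, 1), succ
        else:                                          # not applicable: failure field
            new, nxt = form, fail
        if not nxt and new.islower():                  # terminal word, no rule to go to
            if len(new) <= max_len:
                results.add(new)
            continue
        for nl in nxt:
            if len(new) <= max_len + 3 and (new, nl) not in seen:
                seen.add((new, nl))
                queue.append((new, nl))
    return results
```

Nonterminals are uppercase, so `new.islower()` detects a finished terminal word; the `max_len + 3` bound allows for the three nonterminals still present before the erasing rules fire.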
Emotions are a central element of human experience. They occur with high frequency in everyday life and play an important role in decision making. However, currently there is no consensus among researchers on what constitutes an emotion and on how emotions should be investigated. This dissertation identifies three problems of current emotion research: the problem of ground truth, the problem of incomplete constructs and the problem of optimal representation. I argue for a focus on the detailed measurement of emotion manifestations with computer-aided methods to solve these problems. This approach is demonstrated in three research projects, which describe the development of methods specific to these problems as well as their application to concrete research questions.
The problem of ground truth describes the practice to presuppose a certain structure of emotions as the a priori ground truth. This determines the range of emotion descriptions and sets a standard for the correct assignment of these descriptions. The first project illustrates how this problem can be circumvented with a multidimensional emotion perception paradigm which stands in contrast to the emotion recognition paradigm typically employed in emotion research. This paradigm allows to calculate an objective difficulty measure and to collect subjective difficulty ratings for the perception of emotional stimuli. Moreover, it enables the use of an arbitrary number of emotion stimuli categories as compared to the commonly used six basic emotion categories. Accordingly, we collected data from 441 participants using dynamic facial expression stimuli from 40 emotion categories. Our findings suggest an increase in emotion perception difficulty with increasing actor age and provide evidence to suggest that young adults, the elderly and men underestimate their emotion perception difficulty. While these effects were predicted from the literature, we also found unexpected and novel results. In particular, the increased difficulty on the objective difficulty measure for female actors and observers stood in contrast to reported findings. Exploratory analyses revealed low relevance of person-specific variables for the prediction of emotion perception difficulty, but highlighted the importance of a general pleasure dimension for the ease of emotion perception.
The second project targets the problem of incomplete constructs which relates to vaguely defined psychological constructs on emotion with insufficient ties to tangible manifestations. The project exemplifies how a modern data collection method such as face tracking data can be used to sharpen these constructs on the example of arousal, a long-standing but fuzzy construct in emotion research. It describes how measures of distance, speed and magnitude of acceleration can be computed from face tracking data and investigates their intercorrelations. We find moderate to strong correlations among all measures of static information on one hand and all measures of dynamic information on the other. The project then investigates how self-rated arousal is tied to these measures in 401 neurotypical individuals and 19 individuals with autism. Distance to the neutral face was predictive of arousal ratings in both groups. Lower mean arousal ratings were found for the autistic group, but no difference in correlation of the measures and arousal ratings could be found between groups. Results were replicated in a high autistic traits group consisting of 41 participants. The findings suggest a qualitatively similar perception of arousal for individuals with and without autism. No correlations between valence ratings and any of the measures could be found which emphasizes the specificity of our tested measures for the construct of arousal.
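The distance, speed and acceleration-magnitude measures described above can be computed from a tracked landmark trajectory with elementary geometry. A minimal sketch for a single 2-D point at a fixed sampling rate (the trajectory, frame rate, and the use of speed differences as an acceleration-magnitude proxy are illustrative assumptions, not the project's exact pipeline):

```python
# Distance, speed and acceleration magnitude from a tracked 2-D landmark
# trajectory sampled at a fixed frame rate — toy values, not the study's data.
from math import hypot

def motion_measures(points, fps=30.0):
    dt = 1.0 / fps
    # frame-to-frame displacement vectors
    disp = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(points, points[1:])]
    speed = [hypot(dx, dy) / dt for dx, dy in disp]
    # scalar proxy for acceleration magnitude: change of speed per unit time
    accel = [abs(s2 - s1) / dt for s1, s2 in zip(speed, speed[1:])]
    return speed, accel

def distance_to_neutral(points, neutral):
    """Static measure: distance of each frame's landmark to the neutral face."""
    nx, ny = neutral
    return [hypot(x - nx, y - ny) for x, y in points]

track = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0), (6.0, 0.0)]
speed, accel = motion_measures(track, fps=1.0)
print(speed)   # [1.0, 2.0, 3.0]
print(accel)   # [1.0, 1.0]
```

Averaging `distance_to_neutral` over a clip yields the static measure that predicted arousal ratings in both groups, while `speed` and `accel` correspond to the dynamic measures.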
The problem of optimal representation refers to the search for the best representation of emotions and the assumption that there is a one-fits-all solution. In the third project we introduce partial least squares analysis as a general method to find an optimal representation to relate two high-dimensional data sets to each other. The project demonstrates its applicability to emotion research on the question of emotion perception differences between men and women. The method was used with emotion rating data from 441 participants and face tracking data computed on 306 videos. We found quantitative as well as qualitative differences in the perception of emotional facial expressions between these groups. We showed that women’s emotional perception systematically captured more of the variance in facial expressions. Additionally, we could show that significant differences exist in the way that women and men perceive some facial expressions which could be visualized as concrete facial expression sequences. These expressions suggest differing perceptions of masked and ambiguous facial expressions between the sexes. In order to facilitate use of the developed method by the research community, a package for the statistical environment R was written. Furthermore, to call attention to the method and its usefulness for emotion research, a website was designed that allows users to explore a model of emotion ratings and facial expression data in an interactive fashion.
Iterated finite state sequential transducers are considered as language generating devices. The hierarchy induced by the size of the state alphabet is proved to collapse to the fourth level. The corresponding language families are related to the families of languages generated by Lindenmayer systems and Chomsky grammars. Finally, some results on deterministic and extended iterated finite state transducers are established.
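An iterated sequential transducer generates a language by repeatedly feeding its own output back as input. A hedged toy illustration — a single-state transducer that doubles every symbol, so iteration from the axiom "a" produces words of length 2ⁿ; the device and example are textbook-style, not a construction from the paper:

```python
# Iterating a deterministic finite state sequential transducer on its own
# output: a single-state transducer that doubles every symbol, so iteration
# from "a" yields a, aa, aaaa, ... (a toy device, not the paper's construction).

def run_transducer(trans, start, word):
    """trans maps (state, symbol) -> (next_state, output string)."""
    state, out = start, []
    for ch in word:
        state, piece = trans[(state, ch)]
        out.append(piece)
    return "".join(out)

def iterate(trans, start, axiom, steps):
    """Apply the transducer `steps` times, collecting every intermediate word."""
    words, w = [axiom], axiom
    for _ in range(steps):
        w = run_transducer(trans, start, w)
        words.append(w)
    return words

doubling = {("q0", "a"): ("q0", "aa")}
print(iterate(doubling, "q0", "a", 3))   # ['a', 'aa', 'aaaa', 'aaaaaaaa']
```

This single-state example already behaves like a D0L Lindenmayer system; the hierarchy result of the abstract concerns what additional states buy, which is why the collapse at four states is notable.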
As part of an interdisciplinary student project, a framework for mobile pervasive learning games was developed. On this basis, a learning game for school students was implemented for the historic site of Sanssouci Park. The planned evaluation is intended to measure the learning effectiveness of geo-based mobile learning games. To this end, the intensity of the flow experience will be compared with that of a location-bound alternative implementation.
German universities are expanding their e-learning offerings as a service for students and teachers. These offerings vary from faculty to faculty. This article shows how a consistent and personalizable teaching and learning environment (Personal Learning Environment, PLE) can be created by technically extending the infrastructure, adapting the organizational structure and developing content in a targeted way, thereby increasing the acceptance of e-learning among teachers and students. From the preceding systematic requirements analysis, key figures for the quality assurance of e-learning offerings can be derived.
PLATON
(2019)
Lesson planning is both an important and demanding task, especially as part of teacher training. This paper presents the requirements for a lesson planning system and evaluates existing systems against these requirements. One major drawback of existing software tools is that most are limited to a text- or form-based representation of lesson designs. In this article, a new approach with a graphical, time-based representation and (automatic) analysis methods is proposed, and the system architecture and domain model are described in detail. The approach is implemented in an interactive, web-based prototype called PLATON, which additionally supports the management of lessons in units as well as the modelling of teacher- and student-generated resources. The prototype was evaluated in a study with 61 prospective teachers (bachelor's and master's preservice teachers as well as teacher trainees in post-university teacher training) in Berlin, Germany, with a focus on usability. The results show that this approach proved usable for lesson planning and has positive effects on the perception of time and on self-reflection.
The usage of mobile devices is rapidly growing, with Android being the most prevalent mobile operating system. Thanks to the vast variety of mobile applications, users prefer smartphones over desktops for day-to-day tasks like Internet surfing. Consequently, smartphones store a plenitude of sensitive data. This data, together with the high value of smartphones, makes them an attractive target for device and data theft by thieves and malicious applications.
Unfortunately, state-of-the-art anti-theft solutions do not work if they do not have an active network connection, e.g., if the SIM card was removed from the device. In the majority of these cases, device owners permanently lose their smartphone and, worse still, their personal data along with it.
Apart from that, malevolent applications perform malicious activities to steal sensitive information from smartphones. Recent research has considered static program analysis to detect dangerous data leaks. These analyses work well for data leaks due to inter-component communication, but suffer from shortcomings for inter-app communication with respect to precision, soundness, and scalability.
This thesis focuses on enhancing users' privacy on Android against physical device loss/theft and (un)intentional data leaks. It presents three novel frameworks: (1) ThiefTrap, an anti-theft framework for Android, (2) IIFA, a modular inter-app intent information flow analysis of Android applications, and (3) PIAnalyzer, a precise approach for PendingIntent vulnerability analysis.
ThiefTrap is based on a novel concept of an anti-theft honeypot account that protects the owner's data while preventing a thief from resetting the device.
We implemented the proposed scheme and evaluated it through an empirical user study with 35 participants. In this study, the owner's data could be protected and recovered, and the anti-theft functionality could be performed unnoticed by the thief in all cases.
IIFA proposes a novel approach for Android's inter-component/inter-app communication (ICC/IAC) analysis. Our main contribution is the first fully automatic, sound, and precise ICC/IAC information flow analysis that is scalable for realistic apps due to modularity, avoiding combinatorial explosion: Our approach determines communicating apps using short summaries rather than inlining intent calls between components and apps, which requires simultaneously analyzing all apps installed on a device.
We evaluate IIFA in terms of precision and recall and demonstrate its scalability on a large corpus of real-world apps. IIFA reports 62 problematic ICC-/IAC-related information flows via two or more apps/components.
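The scalability argument above rests on composing short per-app summaries instead of inlining all apps' code. As a loose illustration of that idea (the data structures, app names, and matching rule here are invented for this sketch, not IIFA's actual representation), flows through two apps can be found by matching intent senders to receivers on their action strings:

```python
# Hypothetical per-app summaries: which sources an app sends out via an
# intent action, and which actions it receives and forwards to a sink.
app_summaries = {
    "AppA": {"sends": {("LOC_DATA", "action.SHARE")}, "receives": set()},
    "AppB": {"sends": set(), "receives": {("action.SHARE", "INTERNET_SINK")}},
}

def compose_flows(summaries):
    """Match intent senders to receivers via their action strings,
    yielding cross-app (source app, source, sink app, sink) flows."""
    flows = []
    for src_app, s in summaries.items():
        for source, action in s["sends"]:
            for dst_app, d in summaries.items():
                for recv_action, sink in d["receives"]:
                    if recv_action == action:
                        flows.append((src_app, source, dst_app, sink))
    return flows
```

Each app is analyzed once to obtain its summary, so adding another app does not require re-analyzing all installed apps together, which is the point of the modular design.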
PIAnalyzer proposes a novel approach to analyze PendingIntent related vulnerabilities. PendingIntents are a powerful and universal feature of Android for inter-component communication. We empirically evaluate PIAnalyzer on a set of 1000 randomly selected applications and find 1358 insecure usages of PendingIntents, including 70 severe vulnerabilities.
In this paper, two new methods for the design of fault-tolerant pipelined sequential and combinational circuits, called Error Detection and Partial Error Correction (EDPEC) and Full Error Detection and Correction (FEDC), are described. The proposed methods are based on an error detection circuit (EDC) in the combinational circuit part, combined with fault-tolerant memory elements implemented using fault-tolerant master–slave flip-flops. If a transient error due to a transient fault in the combinational circuit part is detected by the EDC, the error signal controls the latching stage of the flip-flops such that the previous correct state of the register stage is retained until the transient error disappears. The system can continue to work in its previous correct state, and no additional recovery procedure (with typically reduced clock frequency) is necessary. The target applications are dataflow processing blocks, for which software-based recovery methods cannot easily be applied. The presented architectures address both single events and timing faults of arbitrarily long duration. An example of this architecture, based on a carry look-ahead adder, is developed and described. The timing conditions are carefully investigated and simulated down to the layout level. The enhancement over the baseline architecture is demonstrated with respect to the achieved fault tolerance for single-event and timing faults. The number of uncorrected single events is reduced by the EDPEC architecture by a factor of 2.36 compared with a previous solution. The FEDC architecture further reduces the number of uncorrected events to zero and outperforms Triple Modular Redundancy (TMR) with respect to the correction of timing faults. The power overhead of both new architectures is about 26–28% lower than that of TMR.
The planning and execution as well as the static and dynamic analysis of business processes in the domain of administration and government at the municipal, state, and federal levels using information and communication technology have long occupied politicians and IT strategists as well as the public.
The resulting term e-government has subsequently been examined from a wide variety of technical, political, and semantic perspectives.
This thesis concentrates on two main topics:
• The first topic concerns the design of a hierarchical architecture model, for which seven hierarchical layers can be identified. These appear necessary, but also sufficient, to describe the general case.
The background for this is many years of process and administrative experience as head of the IT department of the city administration of Landshut, an independent city of around 69,000 inhabitants northeast of Munich. It is representative of many administrative processes in the Federal Republic of Germany and yet, as an object of analysis, remains manageable in its overall complexity and number of processes.
Thus, static and dynamic structures can be extracted from the analysis of all core workflows and modeled abstractly.
The focus lies on the representation of the existing service workflows in a municipality. The transformation of service requests in a hierarchical system, the representation of the control and operational states in all layers, and the strategy for error detection and recovery create a transparent basis for comprehensive restructuring and optimization.
FMC-eCS was used for the modeling, a methodology for modeling discrete-state systems under consideration of possible inconsistencies, developed in the communication systems group at the Hasso-Plattner-Institut für Softwaresystemtechnik GmbH (HPI) (advisor: Prof. Dr.-Ing. Werner Zorn [ZW07a, ZW07b]).
• The second topic is devoted to the quantitative modeling and optimization of e-government service systems, carried out using the example of the citizens' office of the city of Landshut between 2008 and 2015. This is based on continuous collection of operational data with extensive preprocessing to extract mathematically describable probability distributions.
The duty roster developed from this was verified with respect to the achievable optimizations in permanent real-world operation.
[ZW07a] Zorn, Werner: "FMC-QE, A New Approach in Quantitative Modeling", talk at MSV'07 – The 2007 International Conference on Modeling, Simulation and Visualization Methods, WorldComp2007, Las Vegas, June 28, 2007.
[ZW07b] Zorn, Werner: "FMC-QE, A New Approach in Quantitative Modeling", publication, Hasso-Plattner-Institut für Softwaresystemtechnik an der Universität Potsdam, June 28, 2007.
In this thesis we introduce the concept of the degree of formality. It is directed against a dualistic point of view, which only distinguishes between formal and informal proofs. This dualistic attitude does not respect the differences between the argumentations classified as informal and it is unproductive because the individual potential of the respective argumentation styles cannot be appreciated and remains untapped.
This thesis has two parts. In the first we analyse the concept of the degree of formality (including a discussion of the respective benefits of each degree), while in the second we demonstrate its usefulness in three case studies. In the first case study we repair Haskell B. Curry's view of mathematics, which incidentally is of great importance in the first part of this thesis, in light of the different degrees of formality. In the second case study we delineate how awareness of the different degrees of formality can be used to help students learn how to prove. Third, we show how the advantages of proofs of different degrees of formality can be combined through the development of so-called tactics having a medium degree of formality. Together the three case studies show that the degrees of formality provide a convincing solution to the problem of untapped potential.
Training socio-emotional competencies is particularly useful for people with autism. Such training can be designed effectively with a game-based application. Two mini-games, Mimikry and Emo-Mahjong, were implemented and evaluated with respect to user experience. Their concepts and the evaluation results are presented here.
E-Learning Symposium 2018
(2018)
Many e-learning innovations have emerged in recent years. Some of them were presented at past e-learning symposia of the University of Potsdam: the first E-Learning Symposium in 2012 focused on different ways of activating students and designing teaching. The 2014 symposium put the students themselves at the center of attention. In 2016, merging the symposium with the DeLFI conference brought a focus on technical innovations. But what has become of yesterday's lighthouse projects, and do we even still need new lighthouses? Under this year's motto "Innovation und Nachhaltigkeit – (k)ein Gegensatz?" ("Innovation and sustainability – (no) contradiction?"), the symposium addresses media-supported teaching and learning processes in the university context and reflects on current technical and didactic developments with a view to their medium- to long-term use in practice.
These proceedings of the E-Learning Symposium 2018 at the University of Potsdam contain a mixture of research and practice contributions from various disciplines and open up multifaceted perspectives on e-learning. They illuminate both the diversity of didactic usage scenarios and the potential of computer science tools and methods in their interplay.
In his keynote, Reinhard Keil takes up the symposium's motto and gets to the bottom of sustainability in e-learning projects. Drawing on more than 15 years of research practice, he analyzes the most important impact factors and formulates recommendations for the design of e-learning projects. In contrast to (higher-education) policy demands aimed purely at cost savings, he advocates hypothesis-driven technology design, in which sustainability can be understood as a guiding question or research strategy. The contribution by Iris Braun et al., reporting on success factors in the use of audience response systems in university teaching, points in a similar direction.
Another current topic, both for educational technology and in the educational sciences in general, is competency orientation and modeling, which aims to describe (problem-solving) abilities precisely and place them at the center of teaching. In an invited talk, Johannes Konert presents two projects covering the process from the definition of competencies, through their modeling in a semantic, machine-readable format, to the development of methods for competency measurement and electronic certification, addressing technical possibilities as well as limits. On a more specific level, Sarah Stumpf deals with digital and media-didactic competencies in teacher education and presents a framework for fostering such competencies in prospective teachers.
The use of e-learning still poses several challenges, often concerning the combination of didactics and technology, sustaining attention, or the effort of creating interactive teaching and learning content. Three contributions in these proceedings deal with this in different contexts and present best practices and possible solutions: the contribution by Martina Wahl and Michael Hölscher addresses the particular context of blended learning scenarios in part-time degree programs. The contribution by Ennio Marani and Isabel Jaisli concerns the publication of a globally freely available online course outside the large MOOC platforms and the associated didactic challenges, also with regard to motivation. Finally, Gregor Damnik et al. propose the automatic generation of exercises to increase interactivity and adaptivity in digital learning resources and thereby reduce the sometimes considerable creation effort.
E-learning also always includes mobile apps and games. Two contributions deal with the use of e-learning tools in the health context: Anna Tscherejkina and Anna Morgiel present mini-games for training socio-emotional competencies for people with autism, and Stephanie Herbstreit et al. report on the use of a mobile learning app to improve clinical practical teaching.
Physical computing covers the design and realization of interactive objects and installations and allows learners to develop concrete, tangible products of the real world, which arise from their imagination. This can be used in computer science education to provide learners with interesting and motivating access to the different topic areas of the subject in constructionist and creative learning environments. However, physical computing has so far mostly been taught, if at all, in afternoon clubs or other extracurricular settings. Thus, for the majority of students there have so far been no opportunities to design and create their own interactive objects in regular school lessons.
Despite its increasing popularity also for schools, the topic has not yet been clearly and sufficiently characterized in the context of computer science education. The aim of this doctoral thesis therefore is to clarify physical computing from the perspective of computer science education and to adequately prepare the topic both content-wise and methodologically for secondary school teaching. For this purpose, teaching examples, activities, materials and guidelines for classroom use are developed, implemented and evaluated in schools.
In the theoretical part of the thesis, first the topic is examined from a technical point of view. A structured literature analysis shows that basic concepts used in physical computing can be derived from embedded systems, which are the core of a large field of different application areas and disciplines. Typical methods of physical computing in professional settings are analyzed and, from an educational perspective, elements suitable for computer science teaching in secondary schools are extracted, e. g. tinkering and prototyping. The investigation and classification of suitable tools for school teaching show that microcontrollers and mini computers, often with extensions that greatly facilitate the handling of additional components, are particularly attractive tools for secondary education. Considering the perspectives of science, teachers, students and society, in addition to general design principles, exemplary teaching approaches for school education and suitable learning materials are developed and the design, production and evaluation of a physical computing construction kit suitable for teaching is described.
In the practical part of this thesis, with “My Interactive Garden”, an exemplary approach to integrate physical computing in computer science teaching is tested and evaluated in different courses and refined based on the findings in a design-based research approach. In a series of workshops on physical computing, which is based on a concept for constructionist professional development that is developed specifically for this purpose, teachers are empowered and encouraged to develop and conduct physical computing lessons suitable for their particular classroom settings. Based on their in-class experiences, a process model of physical computing teaching is derived. Interviews with those teachers illustrate that benefits of physical computing, including the tangibility of crafted objects and creativity in the classroom, outweigh possible drawbacks like longer preparation times, technical difficulties or difficult assessment. Hurdles in the classroom are identified and possible solutions discussed.
Empirical investigations in the different settings reveal that “My Interactive Garden” and physical computing in general have a positive impact, among others, on learner motivation, fun and interest in class and perceived competencies.
Finally, the results from all evaluations are combined to evaluate the design principles for physical computing teaching and to provide a perspective on the development of decision-making aids for physical computing activities in school education.
Mobile devices and their applications have become an indispensable part of daily life and provide access to important information independent of time and place. University-specific offerings, however, are still far from ubiquitous in the mobile sector and can usually be traced back to individual activities of students and teachers. Yet mobile applications can make an essential contribution to improving students' self-organization as well as to the design and enrichment of concrete teaching and learning scenarios. This article presents a modular university app framework that offers both central campus-related services and decentralized learning applications under a single interface. Based on an analysis of strengths and weaknesses, different approaches are summarized and evaluated with regard to requirements, development, maintenance, and operation of the university app. The underlying service-oriented architecture, which allows the application to be ported to other universities at reasonable cost, is discussed. The article concludes with a presentation of first results and of further considerations and ongoing work.
Institutions are facing the challenge of integrating legacy systems with steadily growing new ones that use different technologies and interaction patterns. To exploit the full potential of all systems, several incompatible systems and their functions have to be aggregated and offered in a usable way. This paper presents an adaptive, generalizable and self-organized Personal Learning Environment (PLE) framework with the potential to integrate several heterogeneous services using a service-oriented architecture. First, a general overview of the field is given, followed by a description of the core components of the PLE framework. A prototypical implementation is presented. Finally, it is shown how the PLE framework can be dynamically adapted to a changing system environment, reflecting experiences from first user studies.
Mobile devices and associated applications (apps) are an indispensable part of daily life and provide access to important information anytime and anywhere. However, the availability of university-wide services in the mobile sector is still poor; where such services exist, they usually result from individual activities of students and teachers. Mobile applications can nevertheless have an essential impact on the improvement of students' self-organization as well as on the design and enhancement of specific learning scenarios. This article introduces a mobile campus app framework which integrates central campus services and decentralized learning applications. An analysis of strengths and weaknesses of different approaches is presented to summarize and evaluate them in terms of requirements, development, maintenance and operation. The article discusses the underlying service-oriented architecture that allows transferring the campus app to other universities or institutions at reasonable cost. It concludes with a presentation of the results as well as ongoing discussions and future work.
Examinations with hundreds of students, so-called mass examinations, still present teachers in particular with organizational challenges that have to be mastered anew every semester. This contribution presents technical options for supporting and conducting mass examinations. To this end, various procedures of exam preparation and organization from different disciplines are first surveyed and analyzed.
Subsequently, a generalizable process is designed that covers all steps from planning, creation, execution, and checking through to archiving. To support the individual process steps, technical systems aligned with the needs of those responsible for examinations are introduced. The resulting systems are presented and the target-group tests that were conducted are reflected upon.
In universities, diverse tools and software systems exist that each facilitates a different teaching and learning scenario. A deviating approach is taken by Personal Learning Environments (PLE) that aim to provide a common platform. Considering e-portfolios as an integral part of PLEs, especially portfolio-based learning and assessment have to be supported. Therefore, the concept of a PLE is developed further by enabling the products of different software systems to be integrated in portfolio pages and finally submitted for feedback and assessment. It is further elaborated how the PLE approach is used to support the continuous formative assessment within portfolio-based learning scenarios.
While qualifications and competencies acquired informally are receiving more and more attention, both their presentation and their recognition represent a mostly insurmountable obstacle for issuers and recipients. Meanwhile, classic paper-based certificates of participation are increasingly being replaced by digital ones in order to simplify the proof of acquired competencies and qualifications. In this context, combining digital certificates of participation with Open Badges can offer added value for public presentation and verification.
Mobile applications are well suited as structural support services for students during the study entry phase. With the app Reflect.UP, students reflect on study organization, study content, and study goals. The conscious handling of students' competency acquisition as academic reflection competency is an immanent part of academic professionalization and is the focus of this contribution. It is shown how questions for student reflection are systematically derived from study regulations and module descriptions, resulting in a competency grid. The data gained from the practical use of Reflect.UP are analyzed and discussed with regard to what conclusions can be drawn about students' problems and learning processes as well as for the organization of degree programs. In addition, the strengths and weaknesses of a mobile application as a social and information science amalgam for structurally supporting the study entry phase are reflected upon.
The term Personal Learning Environment (PLE) is associated with the desire to put learners in control of their own learning process, so that they are able to set and accomplish the desired learning goals at the right time with the learning environment of their choice. Over time, such a learning environment comes to include various digital content, services and tools; together these are summarized as the Virtual Learning Environment (VLE). Even though the construction of an individual PLE is a complex task, several approaches to support this process already exist. They mostly appear under the umbrella term PLE or with slight accentuations like iPLE, which are situated especially in institutional contexts. This paper sums up the variety of attempts and technical approaches to establish a PLE and suggests a categorization for them.
The increasing digitalization of life is making inroads into many areas. In the spirit of the demand for lifelong learning, and with the aim of successfully meeting the diverse and changing requirements of daily life, each individual needs an individual and personal learning environment. This contribution first critically examines the concept of the personal learning environment. Building on this, the scope of consideration is narrowed to its use within the institution of the university. From this arise both challenges and divergences in the tension between institutional and personal learning environments, which must be resolved in an open design process so that generally applicable design principles for institutional personal learning environments can be derived.
The ongoing digitalization leads to a need for continuous change of ICT (Information and Communication Technology) in all university domains and therefore affects all stakeholders in this arena. More and more ICT components, systems and tools emerge and have to be integrated into the existing processes and infrastructure of the institutions. These tasks include the transfer of resources and information across multiple ICT systems. By using so-called virtual environments for the domains of research, education, learning and work, the performance of daily tasks can be aided. Based on a user requirement analysis, different short- and long-term objectives were identified and are now being tackled in the context of a federal research project. In order to be prepared for the ongoing digitalization, new systems have to be provided. A service-oriented infrastructure and a related web-based virtual learning environment together constitute the platform Campus.UP and create the necessary basis to be ready for future challenges. The current focus lies on e-portfolio work, hence we present a related focus group evaluation. The results indicate a tremendous need to extend the possibilities of sharing resources across system boundaries, in order to enable comfortable participation of external cooperating parties and to clarify the focus of each connected system. The introduction of such an infrastructure implies far-reaching changes for traditional data centers. Therefore, the challenges and risks that faculty innovation projects pose for the ICT organization are taken as a starting point to stimulate a discussion of how data centers can utilize such projects to be ready for future needs. We are confident that Campus.UP will provide the basis for ensuring the persistent transfer of innovation to the ICT organization and will thus contribute to tackling the future challenges of digitalization.
Whether online courses, video-based teaching offerings, mobile applications, or self-developed or commercial Web 2.0 applications, the abundance of digital support offerings is hard to survey. At the same time, mobile devices, web applications, and apps offer opportunities to substantially reshape teaching, studying, and research. This contribution presents a descriptive framework for the media-didactic design of teaching, learning, and research arrangements that emphasizes the technical aspects. Different usage scenarios involving digital media are then sketched. These are taken as a starting point to present the concept of a system architecture that makes it possible, on the one hand, to provision arbitrary applications automatically and, on the other, to aggregate the resulting usage data across platforms and use it for the design of virtual teaching and learning spaces.
Thanks to diverse communication, information, and assistance functions, mobile devices and applications (apps) have become an indispensable part of our daily lives. In the university environment in particular, a colorful variety of mobile support offerings has become established, ranging from centrally provided university apps to various apps for individual courses or specific teaching and learning scenarios. Given the great effort required to develop, distribute, and maintain mobile applications, use by the largest possible target group is desirable. However, this can collide with the character of mobile devices as personal, individualized assistants.
Along this spectrum between (subject-)specific individual solutions and broad all-rounders, this contribution presents various mobile support offerings from the university sector, classifies them contextually with regard to their use, and systematizes them. This includes mobile applications that focus on general organizational aspects of studying, on particular fields such as the study entry phase, or on the concrete accompaniment of hybrid learning scenarios. This is followed by a selection of apps that take subject-specific aspects into account and in which content is prepared in the form of serious games, simulations, and content modules. In addition to teaching and studying, research is also brought into focus, where apps act both as research objects and as data collection instruments. The abundance of these developments results in an app diversity that raises various challenges. The contribution compiles the specific challenges and makes recommendations, addressing organizational, content-related, and technical questions as well as touching on legal aspects regarding data protection and copyright.
The 8th Workshop on Computer Science Education in Higher Education (Fachtagung für Hochschuldidaktik der Informatik, HDI) took place in Frankfurt in September 2018, together with the German e-learning conference for computer science (Deutsche E-Learning Fachtagung Informatik, DeLFI), under the joint motto „Digitalisierungswahnsinn? - Wege der Bildungstransformationen" ("Digitalization madness? Paths of educational transformations").
HDI is devoted to all questions of computer science education in higher education. This year's focal points included, among others:
- Analysis of the content and target competencies of computer science courses
- Learning to program and getting started with software development
- Special topics: data science, theoretical computer science, and scholarly research skills
The workshop addresses selected questions from these topic areas, which are treated in depth through talks by distinguished experts and through submitted contributions.
Solving problems that combine task and motion planning requires searching across both a symbolic and a geometric search space. Because of the semantic gap between symbolic and geometric representations, symbolic action sequences are not guaranteed to be geometrically feasible. This forces a search in the combined space, in which frequent backtracks between the symbolic and geometric levels make the search inefficient. We address this problem by guiding the symbolic search with rich information extracted from the geometric level through culprit detection mechanisms.
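The feedback loop described above can be illustrated with a toy sketch. All names and the "forbidden adjacent action pair" notion of a culprit are illustrative assumptions, not the mechanism from the abstract: a symbolic level enumerates action orderings, a geometric check rejects infeasible ones and returns a culprit that prunes the remaining symbolic search instead of blind backtracking.

```python
from itertools import permutations

def plan_with_culprits(actions, geom_check):
    """Interleaved search sketch: the symbolic level enumerates action
    orderings; the geometric level rejects infeasible ones and returns
    a 'culprit' (here: a forbidden adjacent action pair) that is fed
    back to prune the remaining symbolic search."""
    culprits = set()                       # learned forbidden pairs
    for seq in permutations(actions):
        if set(zip(seq, seq[1:])) & culprits:
            continue                       # pruned at the symbolic level
        feasible, culprit = geom_check(seq)
        if feasible:
            return seq
        if culprit:
            culprits.add(culprit)          # geometric info fed back upward
    return None

# Toy geometry: placing 'big' immediately before 'small' blocks the gripper.
def geom(seq):
    for pair in zip(seq, seq[1:]):
        if pair == ("big", "small"):
            return False, pair
    return True, None

print(plan_with_culprits(["big", "small", "medium"], geom))  # a feasible order
```

Without the learned culprit set, every ordering containing the bad pair would trigger a full geometric check; with it, those orderings are discarded symbolically.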
Answer Set Programming (ASP) is a declarative problem-solving approach that combines a rich yet simple modeling language with high-performance solving capabilities. Although this has already resulted in various applications, certain aspects of such applications are more naturally modeled using variables over finite domains, for instance to account for resources, fine-grained timings, coordinates, or functions. Our goal is thus to extend ASP with constraints over integers while preserving its declarative nature. This allows for fast prototyping and elaboration-tolerant problem descriptions of resource-related applications. The resulting paradigm is called Constraint Answer Set Programming (CASP).
We present three different approaches for solving CASP problems. The first, a lazy, modular approach, combines an ASP solver with an external system for handling constraints. Its advantage is that two state-of-the-art technologies work hand in hand, each concentrating on its part of the problem. The drawback is that inter-constraint dependencies cannot be communicated back to the ASP solver, impeding its learning algorithm. The second approach translates all constraints to ASP. With appropriate encoding techniques, this results in a very fast, monolithic system. Unfortunately, due to the large, explicit representation of constraints and variables, translation techniques are restricted to small and mid-sized domains. The third approach merges the lazy and the translational approach, combining the strengths of both while removing their weaknesses. To this end, we enhance the dedicated learning techniques of an ASP solver with the inferences of the translational approach in a lazy way; that is, the important knowledge is only made explicit when needed.
By using state-of-the-art techniques from neighboring fields, we provide ways to tackle real-world, industrial-size problems. By extending CASP to reactive solving, we open up new application areas such as online planning with continuous domains and durations.
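The lazy, modular architecture from the first approach can be sketched in miniature. This toy mirrors only the division of labor, not an actual CASP system: a generator enumerates candidate assignments (standing in for the ASP solver's answer sets), an external checker tests the integer constraints, and failed candidates become nogoods that prune the rest of the enumeration. All names and the coarse whole-candidate nogoods are illustrative assumptions.

```python
from itertools import product

def lazy_casp(variables, domains, constraints):
    """Lazy CASP sketch: candidates come from a generator, integer
    constraints are checked externally, and each failure is recorded
    as a (coarse) nogood that prunes later candidates."""
    nogoods = []                                 # learned forbidden assignments
    for values in product(*(domains[v] for v in variables)):
        cand = dict(zip(variables, values))
        if any(cand == dict(ng) for ng in nogoods):
            continue                             # pruned by a learned nogood
        if all(c(cand) for c in constraints):
            return cand                          # constraint-consistent model
        nogoods.append(tuple(cand.items()))      # coarse nogood: whole candidate
    return None

# Constraints x + y == 4 and x < y over small integer domains.
sol = lazy_casp(["x", "y"], {"x": range(4), "y": range(4)},
                [lambda a: a["x"] + a["y"] == 4, lambda a: a["x"] < a["y"]])
print(sol)  # → {'x': 1, 'y': 3}
```

The drawback mentioned in the abstract is visible even here: the checker only says "this candidate failed", so the learned nogood cannot express *why*, which is exactly the information the third, merged approach makes available lazily.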
The topic of this thesis is semantic search in the context of today's information management systems. These systems include intranets, Web 3.0 applications, and many web portals that contain information in heterogeneous formats and structures. They hold data in structured form on the one hand and, on the other, documents whose content is related to this data. These documents, however, are usually only partially structured or completely unstructured. Travel portals, for example, describe the period, destination, and price of a trip using structured data, and provide further information, such as descriptions of the hotel, the destination, and excursions, in unstructured form.
Today's semantic search engines focus on finding knowledge either in structured form, also called fact search, or in semi- or unstructured form, usually referred to as semantic document search. A few search engines attempt to close the gap between these two approaches. Although they search structured and unstructured data simultaneously, they either evaluate them largely independently of each other or severely restrict the search capabilities, for instance by supporting only certain query patterns. As a result, the information available in the system is not fully exploited, and neither connections between individual contents of the respective information systems nor complementary information reach the user.
To close this gap, this thesis develops and investigates a new hybrid semantic search approach that combines structured and semi- or unstructured content throughout the entire search process. With this approach, not only are both facts and documents found; the relationships between the differently structured data are also exploited in every phase of the search and flow into the search results. If the answer to a query is not available entirely in structured form, as facts, or in unstructured form, as documents, the approach delivers a combination of the two. Considering differently structured content throughout the entire search process, however, places particular demands on the search engine: it must be able to search facts and documents in dependence on one another, to combine them, and to rank the differently structured results appropriately. Furthermore, the complexity of the data must not be passed on to end users; the presentation of the content must instead be comprehensible and easy to interpret, both when formulating queries and when presenting results.
The central question of this thesis is whether, on a given data basis, a hybrid approach can answer search queries better than semantic document search and fact search on their own, or than a search that does not combine these approaches during the search process. The evaluations conducted from both a system and a user perspective show that the hybrid semantic search solution developed in this thesis, by combining structured and unstructured content in the search process, delivers better answers than the aforementioned methods and thus offers advantages over previous approaches. A user survey makes clear that hybrid semantic search is perceived as comprehensible and is preferred for heterogeneously structured data sets.
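One of the demands noted above, ranking differently structured results on a single scale, can be sketched minimally. The term-overlap heuristic and the alpha weighting below are illustrative assumptions, not the ranking scheme developed in the thesis; the point is only that facts and documents end up interleaved in one result list.

```python
def hybrid_rank(query_terms, facts, docs, alpha=0.5):
    """Score structured facts and unstructured documents on one scale
    and return a single interleaved ranking (toy heuristic)."""
    def overlap(text):
        words = set(text.lower().split())
        return len(words & set(query_terms)) / len(query_terms)
    scored = [("fact", f, alpha * overlap(" ".join(map(str, f.values()))))
              for f in facts]                    # facts: attribute -> value dicts
    scored += [("doc", d, (1 - alpha) * overlap(d)) for d in docs]
    return sorted(scored, key=lambda r: r[2], reverse=True)

facts = [{"city": "berlin", "price": 100}]
docs = ["Cozy hotel in central Berlin"]
ranking = hybrid_rank(["berlin", "hotel"], facts, docs)
print(ranking[0][0])  # → doc
```

Here the document matches both query terms while the fact matches only one, so the document ranks first; with a query matching the fact's attributes more closely, the order would flip, which is the kind of cross-structure trade-off a hybrid ranker must resolve.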
Contemporary multi-core processors are parallel systems that also provide shared memory for the programs running on them. Both the increasing number of cores in so-called many-core systems and the still-growing computational power of the cores demand memory systems that can deliver high bandwidth. Caches are essential components to satisfy this requirement. Nevertheless, hardware-based cache coherence in many-core chips faces practical limits in providing both coherence and high memory bandwidth. In addition, a shift away from global coherence can be observed. As a result, alternative architectures and suitable programming models need to be investigated.
This thesis focuses on fast communication for non-cache-coherent many-core architectures. Experiments are conducted on the Single-Chip Cloud Computer (SCC), a non-cache-coherent many-core processor with 48 mesh-connected cores. Although originally designed for message passing, the results of this thesis show that shared memory can be used efficiently for one-sided communication on this kind of architecture. One-sided communication enables data exchanges between processes where the receiver is not required to know the details of the performed communication. In the notion of the Message Passing Interface (MPI) standard, this type of communication allows accessing the memory of remote processes. In order to support this communication scheme on non-cache-coherent architectures, both an efficient process synchronization and a communication scheme with software-managed cache coherence are designed and investigated.
The process synchronization realizes the concept of the general active target synchronization scheme from the MPI standard. An existing classification of implementation approaches is extended and used to identify an appropriate class for the non-cache-coherent shared memory platform. Based on this classification, existing implementations are surveyed in order to find beneficial concepts, which are then used to design a lightweight synchronization protocol for the SCC that uses shared memory and uncached memory accesses. The proposed scheme is not prone to process skew and also enables direct communication as soon as both communication partners are ready. Experimental results show very good scaling properties and up to five times lower synchronization latency compared to a tuned message-based MPI implementation for the SCC.
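The general active target synchronization pattern realized by the protocol above can be sketched with shared flags. The names follow MPI's post/start/complete/wait calls, but this threaded simulation is purely illustrative and stands in for neither the SCC protocol nor a real MPI window; the events play the role of the uncached shared-memory flags mentioned in the abstract.

```python
import threading

class Window:
    """Toy MPI-style window with general active target synchronization."""
    def __init__(self):
        self.posted = threading.Event()      # target has exposed its window
        self.completed = threading.Event()   # origin finished its access epoch
        self.data = {}

    # target side
    def post(self): self.posted.set()
    def wait(self): self.completed.wait()

    # origin side
    def start(self): self.posted.wait()      # direct start once target is ready
    def complete(self): self.completed.set()

win = Window()

def origin():
    win.start()
    win.data["msg"] = "hello"   # one-sided put into the exposed window
    win.complete()

t = threading.Thread(target=origin)
t.start()
win.post()       # target: expose the window
win.wait()       # target: wait for the origin's epoch to finish
t.join()
print(win.data["msg"])  # → hello
```

Because the origin proceeds as soon as `posted` is set, communication starts the moment both partners are ready, which is the "direct communication, no process skew" property the abstract claims for the proposed scheme.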
For the communication, SCOSCo, a shared memory approach with software-managed cache coherence, is presented. Corresponding requirements for a coherence scheme that fulfills MPI's separate memory model are formulated, and a lightweight implementation exploiting SCC hardware and software features is developed. Despite a discovered malfunction in the SCC's memory subsystem, the experimental evaluation of the design reveals up to five times better bandwidths and nearly four times lower latencies in micro-benchmarks compared to the SCC-tuned but message-based MPI library. For application benchmarks, like a parallel 3D fast Fourier transform, the runtime share of communication can be reduced by a factor of up to five. In addition, this thesis postulates beneficial hardware concepts that would support software-managed coherence for one-sided communication on future non-cache-coherent architectures where coherence might be available only in local subdomains but not at the global processor level.
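The idea of software-managed coherence can be illustrated in miniature. The class names and structure below are illustrative assumptions, not the SCOSCo implementation: each process keeps a private "cache" copy, and coherence actions are explicit program steps, with the writer flushing dirty lines to shared memory and the reader invalidating its cache before accessing remote data, matching MPI's separate memory model where updates become visible only at epoch boundaries.

```python
class SharedMemory:
    """Stands in for off-chip shared memory visible to all cores."""
    def __init__(self):
        self.shared = {}

class ProcView:
    """Per-process view with a software-managed private cache."""
    def __init__(self, mem):
        self.mem, self.cache, self.dirty = mem, {}, set()
    def put(self, key, value):       # one-sided write, cached locally
        self.cache[key] = value
        self.dirty.add(key)
    def flush(self):                 # write back dirty lines explicitly
        for k in self.dirty:
            self.mem.shared[k] = self.cache[k]
        self.dirty.clear()
    def invalidate(self):            # drop possibly stale cached copies
        self.cache.clear()
    def get(self, key):              # one-sided read through the cache
        if key not in self.cache:
            self.cache[key] = self.mem.shared[key]
        return self.cache[key]

mem = SharedMemory()
writer, reader = ProcView(mem), ProcView(mem)
writer.put("x", 42); writer.flush()  # epoch ends: make writes visible
reader.invalidate()                  # reader enters its access epoch
print(reader.get("x"))  # → 42
```

Skipping either the flush or the invalidate would let a stale cached value survive across epochs, which is exactly the class of error that hardware coherence normally hides and that software-managed schemes must prevent by protocol.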