004 Data processing; Computer science
Unique column combinations of a relational database table are sets of columns that contain only unique values. Discovering such combinations is a fundamental research problem and has many different data management and knowledge discovery applications. Existing discovery algorithms are either brute-force or have a high memory load and can thus be applied only to small datasets or samples. In this paper, the well-known GORDIAN algorithm and "Apriori-based" algorithms are compared and analyzed for further optimization. We greatly improve the Apriori algorithms through efficient candidate generation and statistics-based pruning methods. A hybrid solution, HCA-GORDIAN, combines the advantages of GORDIAN and our new algorithm HCA, and it significantly outperforms all previous work in many situations.
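The paper's algorithms are considerably more sophisticated, but the Apriori-style, bottom-up lattice traversal they build on can be sketched in a few lines; the following minimal sketch (all names are illustrative, not the paper's) finds minimal unique column combinations and prunes any candidate that contains an already-unique subset:

```python
def is_unique(rows, cols):
    """A column combination is unique if no projected value tuple repeats."""
    projected = [tuple(row[c] for c in cols) for row in rows]
    return len(set(projected)) == len(projected)

def discover_uccs(rows, num_cols):
    """Bottom-up, Apriori-style discovery of minimal unique column
    combinations: candidates are generated level by level, and any
    candidate containing an already-unique subset is pruned."""
    minimal = []
    candidates = [(c,) for c in range(num_cols)]
    while candidates:
        next_level = set()
        non_unique = []
        for combo in candidates:
            if is_unique(rows, combo):
                minimal.append(combo)   # minimal by construction
            else:
                non_unique.append(combo)
        # candidate generation: merge non-unique combos sharing a prefix
        for a in non_unique:
            for b in non_unique:
                if a[:-1] == b[:-1] and a[-1] < b[-1]:
                    cand = a + (b[-1],)
                    # prune supersets of combinations already known unique
                    if not any(set(m) <= set(cand) for m in minimal):
                        next_level.add(cand)
        candidates = sorted(next_level)
    return minimal
```

On a toy table `[(1, 'a', 'x'), (1, 'b', 'x'), (2, 'a', 'y')]` no single column is unique, and the sketch reports the minimal pairs `(0, 1)` and `(1, 2)`. The statistics-based pruning described in the paper goes beyond this subset check.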
Technical report
(2019)
The design and implementation of service-oriented architectures raises a wide range of research questions from the fields of software engineering, system analysis and modeling, adaptability, and application integration. Component orientation and web services are two approaches for the design and realization of complex web-based systems. Both approaches allow for dynamic application adaptation as well as integration of enterprise applications.
Commonly used technologies, such as J2EE and .NET, form de facto standards for the realization of complex distributed systems. The evolution of component systems has led to web services and service-based architectures. This is manifested in a multitude of industry standards and initiatives such as XML, WSDL, UDDI, and SOAP. All these achievements lead to a new and promising paradigm in IT systems engineering, which proposes to design complex software solutions as a collaboration of contractually defined software services.
Service-Oriented Systems Engineering represents a symbiosis of best practices in object-orientation, component-based development, distributed computing, and business process management. It provides integration of business and IT concerns.
The annual Ph.D. Retreat of the Research School provides each member the opportunity to present the current state of their research and to give an outline of a prospective Ph.D. thesis. Due to the interdisciplinary structure of the research school, this technical report covers a wide range of topics. These include but are not limited to: Human Computer Interaction and Computer Vision as Service; Service-oriented Geovisualization Systems; Algorithm Engineering for Service-oriented Systems; Modeling and Verification of Self-adaptive Service-oriented Systems; Tools and Methods for Software Engineering in Service-oriented Systems; Security Engineering of Service-based IT Systems; Service-oriented Information Systems; Evolutionary Transition of Enterprise Applications to Service Orientation; Operating System Abstractions for Service-oriented Computing; and Services Specification, Composition, and Enactment.
Extract-Transform-Load (ETL) tools are used for the creation, maintenance, and evolution of data warehouses, data marts, and operational data stores. ETL workflows populate those systems with data from various data sources by specifying and executing a DAG of transformations. Over time, hundreds of individual workflows evolve as new sources and new requirements are integrated into the system. The maintenance and evolution of large-scale ETL systems requires much time and manual effort. A key problem is to understand the meaning of unfamiliar attribute labels in source and target databases and ETL transformations. Hard-to-understand attribute labels lead to frustration and to time spent developing and understanding ETL workflows. We present a schema decryption technique to support ETL developers in understanding cryptic schemata of sources, targets, and ETL transformations. For a given ETL system, our recommender-like approach leverages the large number of mapped attribute labels in existing ETL workflows to produce good and meaningful decryptions. In this way, we are able to decrypt attribute labels composed of unfamiliar few-letter abbreviations, such as UNP_PEN_INT, which we can expand to UNPAID_PENALTY_INTEREST. We evaluate our schema decryption approach on three real-world repositories of ETL workflows and show that it suggests high-quality decryptions for cryptic attribute labels in a given schema.
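The report's recommender-like technique is more involved; the following minimal sketch, with hypothetical data and naive positional token alignment, illustrates only the core idea of learning abbreviation expansions from attribute-label pairs already mapped in existing workflows:

```python
def learn_expansions(mapped_pairs):
    """Build an abbreviation dictionary from pairs of cryptic and
    readable attribute labels found in existing ETL mappings."""
    expansions = {}
    for cryptic, readable in mapped_pairs:
        # naive positional alignment of tokens; real alignment is fuzzier
        for abbr, word in zip(cryptic.split('_'), readable.split('_')):
            expansions[abbr] = word
    return expansions

def decrypt(label, expansions):
    """Expand each token of a cryptic label, keeping unknown tokens."""
    return '_'.join(expansions.get(tok, tok) for tok in label.split('_'))

# hypothetical mapped pairs harvested from existing workflows
pairs = [('UNP_AMT', 'UNPAID_AMOUNT'), ('PEN_INT', 'PENALTY_INTEREST')]
exp = learn_expansions(pairs)
print(decrypt('UNP_PEN_INT', exp))  # UNPAID_PENALTY_INTEREST
```

The example reproduces the paper's UNP_PEN_INT illustration; handling ambiguous abbreviations and ranking alternative decryptions is where the actual approach does its work.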
Program behavior that relies on contextual information, such as physical location or network accessibility, is common in today's applications, yet its representation is not sufficiently supported by programming languages. With context-oriented programming (COP), such context-dependent behavioral variations can be explicitly modularized and dynamically activated. In general, COP could be used to manage any context-specific behavior. However, its contemporary realizations limit the control of dynamic adaptation. This, in turn, limits the interaction of COP's adaptation mechanisms with widely used architectures, such as event-based, mobile, and distributed programming. The JCop programming language extends Java with language constructs for context-oriented programming and additionally provides a domain-specific aspect language for declarative control over runtime adaptations. As a result, implementations redesigned with JCop are more concise and better modularized than their counterparts using plain COP. JCop's main features have been described in our previous publications. However, a complete language specification has not been presented so far. This report presents the entire JCop language, including the syntax and semantics of its new language constructs.
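JCop itself is a Java extension; as a language-neutral illustration of the underlying COP idea, the following sketch (not JCop syntax) shows a behavioral variation, grouped in a "layer", being dynamically activated for a scope:

```python
import contextlib

# Currently active layers; COP languages manage this per thread or scope.
_active_layers = []

@contextlib.contextmanager
def with_layer(name):
    """Dynamically activate a behavioral variation for a dynamic extent."""
    _active_layers.append(name)
    try:
        yield
    finally:
        _active_layers.pop()

def greeting():
    # Base behavior plus a context-dependent partial variation.
    if 'mobile' in _active_layers:
        return 'Hi'               # terse variation for a mobile context
    return 'Hello, welcome!'      # default behavior

print(greeting())                 # Hello, welcome!
with with_layer('mobile'):
    print(greeting())             # Hi
```

JCop's contribution, per the abstract, is precisely to replace such imperative activation logic with declarative, aspect-style control over when adaptations apply.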
The noble way to substantiate decisions that affect many people is to ask these people for their opinions. For governments that run whole countries, this means asking all citizens for their views to consider their situations and needs.
Organizations such as Africa's Voices Foundation, which aim to facilitate communication between decision-makers and citizens of a country, have difficulty mediating between these groups. To enable understanding, statements need to be summarized and visualized. Accomplishing these goals in a way that does justice to the citizens' voices and situations proves challenging. Standard charts do not help this cause, as they fail to create empathy for the people behind their graphical abstractions. Furthermore, these charts do not create trust in the data they represent, as there is no way to see or navigate back to the underlying code and the original data. To fulfill these functions, visualizations would benefit greatly from interactions for exploring the displayed data, which standard charts often provide only to a limited extent.
To help improve the understanding of people's voices, we developed and categorized 80 ideas for new visualizations, new interactions, and better connections between different charts, which we present in this report. From those ideas, we implemented 10 prototypes and two systems that integrate different visualizations. We show that this integration allows consistent appearance and behavior of visualizations. The visualizations all share the same main concept: representing each individual with a single dot. To realize this idea, we discuss technologies that efficiently allow the rendering of a large number of these dots. With these visualizations, direct interactions with representations of individuals are achievable by clicking on them or by dragging a selection around them. This direct interaction is only possible with a bidirectional connection from the visualization to the data it displays. We discuss different strategies for bidirectional mappings and the trade-offs involved. Having unified behavior across visualizations enhances exploration. For our prototypes, that includes grouping, filtering, highlighting, and coloring of dots. Our prototyping work was enabled by the development environment Lively4. We explain which parts of Lively4 facilitated our prototyping process. Finally, we evaluate our approach to domain problems and our developed visualization concepts.
Our work provides inspiration and a starting point for visualization development in this domain. Our visualizations can improve communication between citizens and their government and motivate empathetic decisions. Our approach, combining low-level entities to create visualizations, provides value to an explorative and empathetic workflow. We show that the design space for visualizing this kind of data has a lot of potential and that it is possible to combine qualitative and quantitative approaches to data analysis.
Modular and incremental global model management with extended generalized discrimination networks
(2023)
Complex projects developed under the model-driven engineering paradigm nowadays often involve several interrelated models, which are automatically processed via a multitude of model operations. Modular and incremental construction and execution of such networks of models and model operations are required to accommodate efficient development with potentially large-scale models. The underlying problem is also called Global Model Management.
In this report, we propose an approach to modular and incremental Global Model Management via an extension to the existing technique of Generalized Discrimination Networks (GDNs). In addition to further generalizing the notion of query operations employed in GDNs, we adapt the previously query-only mechanism to operations with side effects to integrate model transformation and model synchronization. We provide incremental algorithms for the execution of the resulting extended Generalized Discrimination Networks (eGDNs), as well as a prototypical implementation for a number of example eGDN operations.
Based on this prototypical implementation, we experiment with an application scenario from the software development domain to empirically evaluate our approach with respect to scalability and conceptually demonstrate its applicability in a typical scenario. Initial results confirm that the presented approach can indeed be employed to realize efficient Global Model Management in the considered scenario.
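The eGDN algorithms themselves are beyond a short example, but the basic discrimination-network idea they extend, a graph of operations whose intermediate results are cached and recomputed only when an input changes, can be sketched as follows (all names are illustrative assumptions, not the report's API):

```python
class Node:
    """One operation in a discrimination-network-like operation graph:
    the result is cached and recomputed only after an input changed."""
    def __init__(self, op, inputs=()):
        self.op, self.inputs, self.dependents = op, list(inputs), []
        for i in self.inputs:
            i.dependents.append(self)
        self.result, self.dirty = None, True

    def get(self):
        if self.dirty:  # lazy recomputation from (cached) inputs
            self.result = self.op(*(i.get() for i in self.inputs))
            self.dirty = False
        return self.result

    def invalidate(self):
        # change notification: mark this node and all dependents stale
        if not self.dirty:
            self.dirty = True
            for d in self.dependents:
                d.invalidate()

# a source "model" feeding two chained query-like operations
model = Node(lambda: [1, 2, 3])
evens = Node(lambda xs: [x for x in xs if x % 2 == 0], [model])
count = Node(len, [evens])
```

Here `count.get()` yields 1; after the model changes (`model.op = lambda: [1, 2, 3, 4]; model.invalidate()`) only the stale nodes recompute and `count.get()` yields 2. The eGDN extension additionally admits operations with side effects, such as model transformation and synchronization, which this query-only sketch does not capture.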
Like conventional software projects, projects in model-driven software engineering require adequate management of multiple versions of development artifacts, which importantly includes tolerating temporary inconsistencies. In model-driven software engineering, the employed versioning approaches also have to handle situations where different artifacts, that is, different models, are linked via automatic model transformations.
In this report, we propose a technique for jointly handling the transformation of multiple versions of a source model into corresponding versions of a target model, which enables the use of a more compact representation that may afford improved execution time of both the transformation and further analysis operations. Our approach is based on the well-known formalism of triple graph grammars and a previously introduced encoding of model version histories called multi-version models. In addition to showing the correctness of our approach with respect to the standard semantics of triple graph grammars, we conduct an empirical evaluation that demonstrates the potential benefit regarding execution time performance.
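The report's triple-graph-grammar machinery is far richer, but the payoff of the multi-version encoding can be sketched simply: store each model element once, annotated with the versions that contain it, so a transformation runs once per distinct element rather than once per version (names below are illustrative):

```python
def merge_versions(versions):
    """Compact multi-version encoding: each element is stored once,
    annotated with the set of versions that contain it."""
    annotated = {}
    for version, elements in versions.items():
        for element in elements:
            annotated.setdefault(element, set()).add(version)
    return annotated

def transform_all(annotated, rule):
    """Apply a transformation rule once per distinct element instead of
    once per version, then project the results back to each version."""
    out = {}
    for element, contained_in in annotated.items():
        transformed = rule(element)  # executed once for all sharing versions
        for version in contained_in:
            out.setdefault(version, set()).add(transformed)
    return out
```

For two versions `{'v1': {'a', 'b'}, 'v2': {'b', 'c'}}`, the shared element `'b'` is transformed once instead of twice; with long version histories and largely overlapping models, this is the source of the execution-time benefit the evaluation measures.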
In recent years, computer vision algorithms based on machine learning have seen rapid development. In the past, research mostly focused on solving computer vision problems such as image classification or object detection on images displaying natural scenes. Nowadays, other fields, such as cultural heritage, where an abundance of data is available, are also coming into the focus of research. In line with current research endeavors, we collaborated with the Getty Research Institute, which provided us with a challenging dataset containing images of paintings and drawings. In this technical report, we present the results of the seminar "Deep Learning for Computer Vision", in which students of the Hasso Plattner Institute evaluated state-of-the-art approaches for image classification, object detection, and image recognition on the dataset of the Getty Research Institute. The main challenge when applying modern computer vision methods to this data is the scarcity of annotated training data: the dataset provided by the Getty Research Institute does not contain a sufficient number of annotated samples for training deep neural networks. Throughout the report, however, we show that satisfying to very good results can be achieved by using additional publicly available datasets, such as the WikiArt dataset, to train the machine learning models.
Data dependencies, or integrity constraints, are used to improve the quality of a database schema, to optimize queries, and to ensure consistency in a database. In recent years, conditional dependencies have been introduced to analyze and improve data quality. In short, a conditional dependency is a dependency with a limited scope, defined by conditions over one or more attributes; only the matching part of the instance must adhere to the dependency. In this paper, we focus on conditional inclusion dependencies (CINDs). We generalize the definition of CINDs, distinguishing covering and completeness conditions. We present a new use case for such CINDs, showing their value for solving complex data quality tasks. Further, we define quality measures for conditions, inspired by precision and recall. We propose efficient algorithms that identify covering and completeness conditions conforming to given quality thresholds. Our algorithms choose not only the condition values but also the condition attributes automatically. Finally, we show that our approach efficiently provides meaningful and helpful results for our use case.
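The precision/recall-inspired quality measures can be made concrete with a minimal sketch; the data, names, and the exact measure definitions below are illustrative assumptions, not the paper's formalization:

```python
def condition_quality(tuples, condition, satisfies_ind):
    """Precision/recall-style quality of a candidate condition for a
    conditional inclusion dependency (CIND):
    covering     - of the tuples the condition selects, the fraction
                   that actually satisfies the inclusion dependency;
    completeness - of the tuples satisfying the inclusion dependency,
                   the fraction the condition selects."""
    selected = [t for t in tuples if condition(t)]
    valid = [t for t in tuples if satisfies_ind(t)]
    both = [t for t in selected if satisfies_ind(t)]
    covering = len(both) / len(selected) if selected else 0.0
    completeness = len(both) / len(valid) if valid else 0.0
    return covering, completeness

# hypothetical example: order rows whose customer id must appear in a
# customer table; the condition "type == 'web'" limits the CIND's scope
orders = [{'cid': 1, 'type': 'web'}, {'cid': 2, 'type': 'web'},
          {'cid': 9, 'type': 'phone'}]
known_customers = {1, 2}
cov, comp = condition_quality(orders,
                              lambda t: t['type'] == 'web',
                              lambda t: t['cid'] in known_customers)
```

Here both measures evaluate to 1.0: every web order's customer id is included, and every included order is a web order. The paper's algorithms search for condition attributes and values that meet such quality thresholds automatically.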