Refine
Year of publication
Document Type
- Doctoral Thesis (125)
Language
- English (125)
Is part of the Bibliography
- yes (125)
Keywords
- Maschinelles Lernen (7)
- Machine Learning (6)
- Antwortmengenprogrammierung (5)
- Answer Set Programming (3)
- answer set programming (3)
- machine learning (3)
- Algorithmen (2)
- Algorithms (2)
- Computersicherheit (2)
- Deep Learning (2)
- EEG (2)
- ICA (2)
- Knowledge Representation and Reasoning (2)
- Künstliche Intelligenz (2)
- Middleware (2)
- Modeling (2)
- Modell (2)
- Modellierung (2)
- Ontologie (2)
- Process (2)
- Prozess (2)
- Semantic Web (2)
- Vorhersage (2)
- maschinelles Lernen (2)
- model (2)
- non-photorealistic rendering (2)
- systems biology (2)
- 'Peer To Peer' (1)
- 3D Computer Grafik (1)
- 3D Computer Graphics (1)
- 3D Drucken (1)
- 3D Semiotik (1)
- 3D Visualisierung (1)
- 3D printing (1)
- 3D semiotics (1)
- 3D visualization (1)
- 3D-Stadtmodelle (1)
- 3d city models (1)
- 6LoWPAN (1)
- ASIC (1)
- ASIC (Applikationsspezifische Integrierte Schaltkreise) (1)
- ASP (Answer Set Programming) (1)
- Abbrecherquote (1)
- Abstraktion (1)
- Ackerschmalwand (1)
- Active Evaluation (1)
- Adversarial Learning (1)
- Aktive Evaluierung (1)
- Algorithmenablaufplanung (1)
- Algorithmenkonfiguration (1)
- Algorithmenselektion (1)
- Android Security (1)
- Angewandte Spieltheorie (1)
- Anisotroper Kuwahara Filter (1)
- Anleitung (1)
- Answer Set Solving modulo Theories (1)
- Antwortmengen Programmierung (1)
- Applied Game Theory (1)
- Argumentation (1)
- Artificial Intelligence (1)
- Artificial Neuronal Network (1)
- Asynchrone Schaltung (1)
- Augenbewegungen (1)
- Ausreissererkennung (1)
- BCI (1)
- BSS (1)
- Bachelorstudierende der Informatik (1)
- Baumweite (1)
- Behavior (1)
- Berührungseingaben (1)
- Beweis (1)
- Beweisassistent (1)
- Beweistheorie (1)
- Beweisumgebung (1)
- Bilddatenanalyse (1)
- Bildverarbeitung (1)
- Binäres Entscheidungsdiagramm (1)
- Bioelektrisches Signal (1)
- Bioinformatik (1)
- Boolean constraint solver (1)
- Boosting (1)
- Brain Computer Interface (1)
- Business Process (1)
- Business Process Models (1)
- CASP (Constraint Answer Set Programming) (1)
- CSC (1)
- Cactus (1)
- Choreographien (1)
- Classification (1)
- Clusteranalyse (1)
- Common Spatial Pattern (1)
- Compliance (1)
- Composition (1)
- Computational Complexity (1)
- Computer Science (1)
- Computergrafik (1)
- Coq (1)
- Covariate Shift (1)
- Curry (1)
- DDoS (1)
- DPLL (1)
- Declarative Problem Solving (1)
- Dempster-Shafer-Theorie (1)
- Dempster–Shafer theory (1)
- Description Logics (1)
- Deskriptive Logik (1)
- Diagonalisierung (1)
- Didaktik der Informatik (1)
- Dienstkomposition (1)
- Dienstplattform (1)
- Differenz von Gauss Filtern (1)
- Digital Design (1)
- Distributed Computing (1)
- Dynamic Programming (1)
- Dynamische Programmierung (1)
- Dynamische Rekonfiguration (1)
- E-Learning (1)
- Eingabegenauigkeit (1)
- Elektroencephalographie (1)
- Emotionen (1)
- Emotionsforschung (1)
- Enterprise Architecture (1)
- Entscheidungsbäume (1)
- Entwurfsmuster für SOA-Sicherheit (1)
- Entwurfsprinzipien (1)
- Erfüllbarkeit einer Formel der Aussagenlogik (1)
- Erfüllbarkeitsproblem (1)
- Erklärbarkeit (1)
- Error Estimation (1)
- Evidenztheorie (1)
- Explainability (1)
- Exploration (1)
- Exponential Time Hypothesis (1)
- Exponentialzeit Hypothese (1)
- FMC-QE (1)
- Feature Combination (1)
- Feedback (1)
- Fehlende Daten (1)
- Fehlerschätzung (1)
- Flussgesteuerter Bilateraler Filter (1)
- Focus+Context Visualization (1)
- Fokus-&-Kontext Visualisierung (1)
- Formalismus (1)
- Formalitätsgrad (1)
- Formeln der quantifizierten Aussagenlogik (1)
- GIS-Dienstkomposition (1)
- GPU (1)
- Gebäudemodelle (1)
- Gehirn-Computer-Schnittstelle (1)
- Geländemodelle (1)
- Generalisierung (1)
- Geodaten (1)
- Geometrieerzeugung (1)
- Geovisualisierung (1)
- Geschäftsprozess (1)
- Geschäftsprozessmodelle (1)
- Gesichtsausdruck (1)
- Globus (1)
- Grid (1)
- Grid Computing (1)
- Grounding Theory (1)
- HCI (1)
- Hardware Design (1)
- Hauptkomponentenanalyse (1)
- High-Level Synthesis (1)
- Hochschulsystem (1)
- I/O-effiziente Algorithmen (1)
- IP core (1)
- IT security (1)
- IT-Security (1)
- IT-Sicherheit (1)
- Industrie 4.0 (1)
- Industry 4.0 (1)
- Informatik (1)
- Informatik-Studiengänge (1)
- Informatikvoraussetzungen (1)
- Information Transfer Rate (1)
- Inkonsistenz (1)
- Interactive Rendering (1)
- Interaktionsmodel (1)
- Interaktionsmodellierung (1)
- Interaktives Rendering (1)
- Internet Security (1)
- Internet of Things (1)
- Internet-Sicherheit (1)
- Interoperability (1)
- Interoperabilität (1)
- Interpretability (1)
- Interpretierbarkeit (1)
- IoT (1)
- Kartografisches Design (1)
- Kern-PCA (1)
- Kernmethoden (1)
- Klassifikation (1)
- Klassifikation mit großem Margin (1)
- Klassifikator-Kalibrierung (1)
- Klimafolgenanalyse (1)
- Klimawandel (1)
- Knowledge (1)
- Knowledge Management (1)
- Kommunikation (1)
- Komplexität (1)
- Komplexitätsbewältigung (1)
- Komplexitätstheorie (1)
- Komposition (1)
- Kybernetik (1)
- Künstliche Neuronale Netzwerke (1)
- Landmarken (1)
- Large Margin Classification (1)
- Laser Cutten (1)
- Learning (1)
- Lehrer (1)
- Leistungsvorhersage (1)
- Lernen (1)
- Logik (1)
- Logiksynthese (1)
- Lower Bounds (1)
- MEG (1)
- MQTT (1)
- Magnetoencephalographie (1)
- Malware (1)
- Mathematical Optimization (1)
- Mathematikdidaktik (1)
- Mathematikphilosophie (1)
- Mathematische Optimierung (1)
- Matrizen-Eigenwertaufgabe (1)
- Megamodel (1)
- Megamodell (1)
- Mehrklassen-Klassifikation (1)
- Message Passing Interface (1)
- Migration (1)
- Mischmodelle (1)
- Mischung <Signalverarbeitung> (1)
- Mobilgeräte (1)
- Model Management (1)
- Model-Driven Engineering (1)
- Modell Management (1)
- Modell-driven Security (1)
- Modell-getriebene Sicherheit (1)
- Modellgetriebene Entwicklung (1)
- Molekulare Bioinformatik (1)
- Multi Task Learning (1)
- Multi-Class (1)
- Multi-Task-Lernen (1)
- Multiprocessor (1)
- Multiprozessor (1)
- NETCONF (1)
- Network Management (1)
- Netzwerk Management (1)
- Netzwerke (1)
- Neuronales Netz (1)
- Next Generation Network (1)
- Nicht-photorealistisches Rendering (1)
- Nichtfotorealistische Bildsynthese (1)
- Nutzungsinteresse (1)
- Objektive Schwierigkeit (1)
- Ontologien (1)
- Ontologies (1)
- Ontology (1)
- Optimierung (1)
- Optimierungsproblem (1)
- Optimization (1)
- Parallel Programming (1)
- Paralleles Rechnen (1)
- Parallelrechner (1)
- Parameterized Complexity (1)
- Parametrisierte Komplexität (1)
- Peer-to-Peer-Netz ; GRID computing ; Zuverlässigkeit ; Web Services ; Betriebsmittelverwaltung ; Migration (1)
- Performance Prediction (1)
- Platzierung (1)
- Policy Enforcement (1)
- Power Monitoring (1)
- Prediction Game (1)
- Predictive Models (1)
- Preference Handling (1)
- Privacy Protection (1)
- Probleme in der Studie (1)
- Process Management (1)
- Process modeling (1)
- Professoren (1)
- Programmierung (1)
- Proof Theory (1)
- Prozesse (1)
- Prozessmanagement (1)
- Prozessmodellierung (1)
- Prozesssynchronisierung (1)
- Prädiktionsspiel (1)
- Präferenzen (1)
- Quantified Boolean Formula (QBF) (1)
- Quantitative Modeling (1)
- Quantitative Modellierung (1)
- Queuing Theory (1)
- Reconfigurable (1)
- Regression (1)
- Regularisierung (1)
- Regularization (1)
- Rekonfiguration (1)
- Reparatur (1)
- SMT (SAT Modulo Theories) (1)
- SOA Security Pattern (1)
- STG decomposition (1)
- STG-Dekomposition (1)
- Sample Selection Bias (1)
- Satisfiability (1)
- Schulmaterial (1)
- Security Modelling (1)
- Segmentierung (1)
- Selektionsbias (1)
- Semantic Search (1)
- Semantik Web (1)
- Semantische Suche (1)
- Sensornetzwerke (1)
- Service Creation (1)
- Service Delivery Platform (1)
- Service convergence (1)
- Service-Orientierte Architekturen (1)
- Service-oriented Architectures (1)
- Sicherheitsmodellierung (1)
- Signal Processing (1)
- Signalquellentrennung (1)
- Signaltrennung (1)
- Simulation (1)
- Simultane Diagonalisierung (1)
- Single Event Transient (1)
- Single Trial Analysis (1)
- Skelettberechnung (1)
- Software-basierte Cache-Kohärenz (1)
- Sonnenteilchen-Ereignis (1)
- Spam (1)
- Spam Filtering (1)
- Spam-Erkennung (1)
- Spam-Filter (1)
- Spam-Filtering (1)
- Spatio-Spectral Filter (1)
- Spawning (1)
- Sprachdesign (1)
- Static Analysis (1)
- Statistical Tests (1)
- Statistische Tests (1)
- Stilisierung (1)
- Strahlungshartes Design (1)
- Strahlungshärte Entwurf (1)
- Stromverbrauchüberwachung (1)
- Structuring (1)
- Strukturierung (1)
- Studentenerwartungen (1)
- Studentenhaltungen (1)
- Support Vectors (1)
- Support-Vector Lernen (1)
- Synthese (1)
- System Biologie (1)
- Systembiologie (1)
- Taktik (1)
- Telekommunikation (1)
- Temporal Answer Set Solving (1)
- Temporal Logic (1)
- Temporallogik (1)
- Temporäre Anbindung (1)
- Terminologische Logik (1)
- Texturen (1)
- Theoretischen Vorlesungen (1)
- Time Augmented Petri Nets (1)
- Time Series Analysis (1)
- Traceability (1)
- Tracking (1)
- Transformation (1)
- Treewidth (1)
- Unabhängige Komponentenanalyse (1)
- Universität Bagdad (1)
- Universität Potsdam (1)
- Universitätseinstellungen (1)
- Untere Schranken (1)
- Unterrichtswerkzeuge (1)
- Unvollständigkeit (1)
- Usage Interest (1)
- VM (1)
- Verhalten (1)
- Verifikation (1)
- Verletzung Auflösung (1)
- Verletzung Erklärung (1)
- Verteiltes Rechnen (1)
- Verteilungsunterschied (1)
- Violation Explanation (1)
- Violation Resolution (1)
- Visualisierung (1)
- Vorhersagemodelle (1)
- Wahrnehmung (1)
- Wahrnehmung von Arousal (1)
- Wahrnehmungsunterschiede (1)
- Warteschlangentheorie (1)
- Web Services (1)
- Web Sites (1)
- Web of Data (1)
- Webseite (1)
- Well-structuredness (1)
- Wetterextreme (1)
- Wirtschaftsinformatik (1)
- Wissen (1)
- Wissenschaftlichesworkflows (1)
- Wissensmanagement (1)
- Wissensrepräsentation und -verarbeitung (1)
- Wissensrepräsentation und Schlussfolgerung (1)
- Wohlstrukturiertheit (1)
- ZQSA (1)
- ZQSAT (1)
- Zeitbehaftete Petri Netze (1)
- Zero-Suppressed Binary Decision Diagram (ZDD) (1)
- Zuverlässigkeitsanalyse (1)
- abstraction (1)
- adaptiv (1)
- adaptive (1)
- algorithm configuration (1)
- algorithm scheduling (1)
- algorithm selection (1)
- anisotropic Kuwahara filter (1)
- approximate joint diagonalization (1)
- argumentation (1)
- arousal perception (1)
- artificial intelligence (1)
- asynchronous circuit (1)
- bild (1)
- biometrics (1)
- biometrische Identifikation (1)
- blind source separation (1)
- building models (1)
- business informatics (1)
- cartographic design (1)
- changing the study field (1)
- changing the university (1)
- choreographies (1)
- classifier calibration (1)
- classroom material (1)
- climate change (1)
- climate impact analysis (1)
- clustering (1)
- coherence-enhancing filtering (1)
- communication (1)
- complexity (1)
- computational biology (1)
- computational methods (1)
- computer graphics (1)
- computer science education (1)
- computer security (1)
- computergestützte Methoden (1)
- concurrent checking (1)
- constraints (1)
- decision trees (1)
- degree of formality (1)
- design principles (1)
- didaktische Rekonstruktion (1)
- difference of Gaussians (1)
- digital circuit (1)
- digital design (1)
- dropout (1)
- dynamic (1)
- dynamic classification (1)
- dynamic reconfiguration (1)
- dynamisch (1)
- dynamische Klassifikation (1)
- e-Learning (1)
- educational reconstruction (1)
- eingebettete Systeme (1)
- einseitige Kommunikation (1)
- email spam detection (1)
- embedded systems (1)
- emotion (1)
- emotion representation (1)
- emotion research (1)
- entity alignment (1)
- evidence theory (1)
- external memory algorithms (1)
- eye movements (1)
- face tracking (1)
- facial expression (1)
- flow-based bilateral filter (1)
- formalism (1)
- generalization (1)
- geometry generation (1)
- geospatial data (1)
- geospatial services (1)
- geovisualization (1)
- graph clustering (1)
- hardware design (1)
- higher education (1)
- human computer interaction (1)
- hybrid (1)
- hybrides Problemlösen (1)
- image (1)
- image data analysis (1)
- image processing (1)
- incompleteness (1)
- inconsistency (1)
- independent component analysis (1)
- indirect economic impacts (1)
- indirekte ökonomische Effekte (1)
- informatische Bildung im Sekundarbereich (1)
- input accuracy (1)
- interaction modeling (1)
- kernel PCA (1)
- kernel methods (1)
- konvergente Dienste (1)
- landmarks (1)
- language design (1)
- logic (1)
- logic programming (1)
- logic synthesis (1)
- logical signaling networks (1)
- logische Programmierung (1)
- logische Signalnetzwerke (1)
- macro-economic modelling (1)
- makroökonomische Modellierung (1)
- malware detection (1)
- map/reduce (1)
- maschninelles Lernen (1)
- mathematics education (1)
- medical (1)
- medizinisch (1)
- middleware (1)
- mixture models (1)
- mobile devices (1)
- molecular networks (1)
- molekulare Netzwerke (1)
- multi-class classification (1)
- networks-on-chip (1)
- nichtlineare ICA (1)
- nichtlineare PCA (NLPCA) (1)
- nonlinear ICA (1)
- nonlinear PCA (NLPCA) (1)
- objective difficulty (1)
- one-sided communication (1)
- oneM2M (1)
- ontologies (1)
- outlier detection (1)
- output space compaction (1)
- overcomplete ICA (1)
- parallel programming (1)
- parallel solving (1)
- parallele Programmierung (1)
- paralleles Lösen (1)
- pattern recognition (1)
- perception (1)
- perception differences (1)
- philosophy of mathematics (1)
- physical Computing (1)
- physical computing (1)
- placement (1)
- prediction (1)
- preferences (1)
- priorities (1)
- probabilistic deep learning (1)
- probabilistic deep metric learning (1)
- probabilistische tiefe neuronale Netze (1)
- probabilistisches tiefes metrisches Lernen (1)
- process (1)
- process synchronization (1)
- professors (1)
- proof (1)
- proof assistant (1)
- proof environment (1)
- propagation probability (1)
- radiation hardness (1)
- radiation hardness design (1)
- reconfiguration (1)
- rekonfigurierbar (1)
- reliability assessment (1)
- repair (1)
- robust ICA (1)
- robuste ICA (1)
- scheduling (1)
- scientific workflows (1)
- secondary computer science education (1)
- segmentation (1)
- selbstanpassendes Multiprozessorsystem (1)
- self-adaptive multiprocessing system (1)
- semantic domain modeling (1)
- semantische Domänenmodellierung (1)
- service composition (1)
- single event upset (1)
- skeletonization (1)
- software-based cache coherence (1)
- solar particle event (1)
- speed independence (1)
- strahleninduzierte Einzelereignis-Effekte (1)
- structured output prediction (1)
- strukturierte Vorhersage (1)
- study problems (1)
- stylization (1)
- tactic (1)
- teachers (1)
- temporary binding (1)
- terrain models (1)
- test (1)
- textures (1)
- tools for teaching (1)
- topics (1)
- touch input (1)
- transformation (1)
- tutorial section (1)
- verification (1)
- virtual 3D city models (1)
- virtual machine (1)
- virtuelle 3D-Stadtmodelle (1)
- visualization (1)
- weather extremes (1)
- zero-aliasing (1)
- überbestimmte ICA (1)
Institute
- Institut für Informatik und Computational Science (125)
With the increasing number of applications in Internet and mobile environments, distributed software systems are required to be more powerful and flexible, especially in terms of dynamism and security. This dissertation describes my work on three aspects: dynamic reconfiguration of component software, security control in middleware applications, and dynamic composition of web services. First, I propose a technology named Routing Based Workflow (RBW) to model the execution and management of collaborative components and to realize temporary binding of component instances. Under temporary binding, component instances are loaded into a newly created execution environment only for the duration of their execution and are released back to their repository afterwards. This makes it possible to create an idle execution environment for all collaborative components, on which change operations can be carried out immediately. A change to the execution environment results in a new collaboration of all involved components, and it greatly simplifies the classical issues arising from dynamic change, such as consistency preservation. To demonstrate the feasibility of RBW, I created a dynamic secure middleware system, the Smart Data Server Version 3.0 (SDS3). In SDS3, an open-source implementation of CORBA is adopted and modified as the communication infrastructure, and three security components managed by RBW enhance the security of access to deployed applications. SDS3 offers multi-level security control over its applications, ranging from strategy control to application-specific detail control. Because they are managed by RBW, the strategy control of SDS3 applications can be changed dynamically by reorganizing the collaboration of the three security components. In addition, I created the Dynamic Services Composer (DSC) based on the Apache open-source projects Apache Axis and WSIF.
In DSC, RBW is employed to model the interaction and collaboration of web services and to enable dynamic changes to the flow structure of web services. Finally, overall performance tests were conducted to evaluate the efficiency of the developed RBW and SDS3. The results demonstrate that temporary binding of component instances has only a slight impact on the execution efficiency of components, and that the blackout time caused by dynamic changes can be reduced drastically.
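The temporary-binding idea described above can be sketched in miniature: component instances live in a repository and are bound to the execution environment only while a workflow step runs, so the routing can be reconfigured while the environment is idle. This is a hypothetical illustration with invented names, not the SDS3 implementation.

```python
# Sketch of RBW-style temporary binding (all names invented for illustration).

class Repository:
    """Holds idle component instances, keyed by component name."""
    def __init__(self, factories):
        self._idle = {name: make() for name, make in factories.items()}

    def acquire(self, name):
        return self._idle.pop(name)        # temporarily bind the instance

    def release(self, name, instance):
        self._idle[name] = instance        # return it after execution

class ExecutionEnvironment:
    """Runs a routing (an ordered list of component names) with temporary binding."""
    def __init__(self, repository, routing):
        self.repo = repository
        self.routing = list(routing)

    def reconfigure(self, new_routing):
        # Between runs all instances are back in the repository, so the
        # routing can be changed immediately without consistency issues.
        self.routing = list(new_routing)

    def run(self, data):
        for name in self.routing:
            component = self.repo.acquire(name)
            try:
                data = component(data)
            finally:
                self.repo.release(name, component)
        return data

repo = Repository({
    "authenticate": lambda: (lambda msg: msg + ["authenticated"]),
    "authorize":    lambda: (lambda msg: msg + ["authorized"]),
    "audit":        lambda: (lambda msg: msg + ["audited"]),
})
env = ExecutionEnvironment(repo, ["authenticate", "authorize", "audit"])
print(env.run([]))                          # all three security steps fire
env.reconfigure(["authenticate", "audit"])  # dynamic change while idle
print(env.run([]))                          # the new collaboration
```

The `try`/`finally` mirrors the release-after-execution discipline: even if a component fails, its instance returns to the repository, keeping the environment reconfigurable.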
Answer Set Programming (ASP) emerged in the late 1990s as a new logic programming paradigm, with roots in nonmonotonic reasoning, deductive databases, and logic programming with negation as failure. The basic idea of ASP is to represent a computational problem as a logic program whose answer sets correspond to solutions, and then to use an answer set solver to find the answer sets of the program. ASP is particularly suited for solving NP-complete search problems; among its applications we find product configuration, diagnosis, and graph-theoretical problems such as finding Hamiltonian cycles. Along different lines of ASP research, many extensions of the basic formalism have been proposed. The most intensively studied one is the modelling of preferences in ASP. Preferences constitute a natural and effective way of selecting preferred solutions from a plethora of candidate solutions; for example, they have been successfully used for timetabling, auctioning, and product configuration. In this thesis, we concentrate on preferences within answer set programming. Among the several formalisms and semantics for preference handling in ASP, we focus on ordered logic programs with the underlying D-, W-, and B-semantics. In this setting, preferences are defined among the rules of a logic program and select preferred answer sets among the (standard) answer sets of the underlying program. Up to now, those preferred answer sets have been computed either via a compilation method or by meta-interpretation, which raises the question of whether and how preferences can be integrated into an existing ASP solver. To answer this question, we develop an operational, graph-based framework for computing answer sets of logic programs and then integrate preferences into this operational approach. We observe empirically that our integrative approach performs better in most cases than the compilation method or meta-interpretation.
Another research issue in ASP is optimization methods that remove redundancies, as also found in database query optimizers. For this purpose, the rather recently suggested notion of strong equivalence for ASP can be used: if a program is strongly equivalent to a subprogram of itself, then one can always use the subprogram instead of the original program, a technique which serves as an effective optimization method. Up to now, strong equivalence has not been considered for logic programs with preferences. In this thesis, we tackle this issue and generalize the notion of strong equivalence to ordered logic programs. We give necessary and sufficient conditions for the strong equivalence of two ordered logic programs, provide program transformations for ordered logic programs, and show to what extent preferences can be simplified. Finally, we present two new applications of preferences within answer set programming. First, we define new procedures for group decision making, which we apply to the problem of scheduling a group meeting. Second, we reconstruct within ASP a linguistic problem appearing in German dialects. In linguistics, there is an ongoing debate about how unique the rule systems of language are in human cognition. The reconstruction of grammatical regularities with tools from computer science has consequences for this debate: if grammars can be modelled this way, then they share core properties with other, non-linguistic rule systems.
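The basic ASP idea above — a program whose answer sets are its solutions — can be illustrated with a brute-force computation of answer sets for ground normal programs via the Gelfond-Lifschitz reduct. This is a toy sketch of the semantics, not how real solvers work; the rule encoding is invented for illustration.

```python
from itertools import combinations

def answer_sets(rules):
    """Brute-force answer sets of a ground normal logic program.
    Each rule is (head, pos_body, neg_body), with atoms as strings.
    A set X is an answer set iff X equals the least model of the
    Gelfond-Lifschitz reduct of the program with respect to X."""
    atoms = set()
    for h, pos, neg in rules:
        atoms |= {h} | set(pos) | set(neg)

    def least_model(definite):
        # Iterate the immediate-consequence operator to its fixpoint.
        model, changed = set(), True
        while changed:
            changed = False
            for h, pos in definite:
                if h not in model and set(pos) <= model:
                    model.add(h)
                    changed = True
        return model

    result = []
    for r in range(len(atoms) + 1):
        for cand in combinations(sorted(atoms), r):
            X = set(cand)
            # Reduct: drop rules whose negative body intersects X,
            # then delete the negative literals from the remaining rules.
            reduct = [(h, pos) for h, pos, neg in rules if not (set(neg) & X)]
            if least_model(reduct) == X:
                result.append(X)
    return result

# p :- not q.   q :- not p.
program = [("p", [], ["q"]), ("q", [], ["p"])]
print(answer_sets(program))   # two answer sets: {p} and {q}
```

The exponential enumeration is exactly what practical solvers avoid with propagation and conflict-driven search, but it makes the answer set definition concrete.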
This work introduces novel internal and external memory algorithms for computing voxel skeletons of massive voxel objects with complex, network-like architecture, and for converting these voxel skeletons to piecewise linear geometry, that is, triangle meshes and piecewise straight lines. The presented techniques help to tackle the challenge of visualizing and analyzing 3D images of increasing size and complexity, which are becoming more and more important in, for example, biological and medical research. Section 2.3.1 contributes to the theoretical foundations of thinning algorithms with a discussion of homotopic thinning in the grid cell model. The grid cell model explicitly represents a cell complex built of faces, edges, and vertices shared between voxels. Characterizing pairs of cells to be deleted is much simpler than previous characterizations of simple voxels. The grid cell model resolves topologically unclear voxel configurations at junctions and locked voxel configurations that cause, for example, interior voxels in sets of non-simple voxels. A general conclusion is that the grid cell model is superior to indecomposable voxels for algorithms that need detailed control of topology. Section 2.3.2 introduces a noise-insensitive measure, based on the geodesic distance along the boundary, for computing two-dimensional skeletons. The measure is able to retain thin object structures if they are geometrically important while ignoring noise on the object's boundary, a combination of properties not known of other measures. The measure is also used to guide erosion in a thinning process from the boundary towards lines centered within plate-like structures. Geodesic-distance-based quantities seem well suited to robustly identifying one- and two-dimensional skeletons. Chapter 6 applies the method to the visualization of bone micro-architecture.
Chapter 3 describes a novel geometry generation scheme for representing voxel skeletons, which retracts voxel skeletons to piecewise linear geometry per dual cube. The generated triangle meshes and graphs provide a link to geometry processing and efficient rendering of voxel skeletons. The scheme creates non-closed surfaces with boundaries, which contain fewer triangles than a representation of voxel skeletons using closed surfaces such as small cubes or iso-surfaces. A conclusion is that thinking specifically about voxel skeleton configurations, instead of generic voxel configurations, helps to deal with the topological implications. The geometry generation is one foundation of the applications presented in Chapter 6. Chapter 5 presents a novel external memory algorithm for distance-ordered homotopic thinning. The presented method extends known algorithms for computing chamfer distance transformations and thinning to execute I/O-efficiently when the input is larger than the available main memory. The applied block-wise decomposition schemes are quite simple, yet it was necessary to carefully analyze the effects of block boundaries to devise globally correct external memory variants of known algorithms. In general, doing so is superior to naive block-wise processing that ignores boundary effects. Chapter 6 applies the algorithms in a novel method, based on confocal microscopy, for the quantitative study of micro-vascular networks in the field of microcirculation.
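The chamfer distance transform mentioned above is classically computed with two raster scans over the image. A minimal in-memory 2D sketch with the common 3-4 weights follows (the thesis's contribution is the I/O-efficient external-memory variant, which is not reproduced here; the data are invented):

```python
# Two-pass chamfer distance transform (3-4 weights) on a 2D image.
INF = 10**9

def chamfer_distance(image):
    """image: 2D list where 1 marks a seed (distance 0) and 0 the rest.
    Returns chamfer distances: cost 3 per axial step, 4 per diagonal."""
    h, w = len(image), len(image[0])
    d = [[0 if image[y][x] else INF for x in range(w)] for y in range(h)]
    # Forward pass (top-left to bottom-right): look at already-visited neighbors.
    for y in range(h):
        for x in range(w):
            for dy, dx, c in ((0, -1, 3), (-1, 0, 3), (-1, -1, 4), (-1, 1, 4)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    # Backward pass (bottom-right to top-left): the mirrored neighborhood.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            for dy, dx, c in ((0, 1, 3), (1, 0, 3), (1, 1, 4), (1, -1, 4)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    return d

seeds = [[1, 0, 0],
         [0, 0, 0],
         [0, 0, 1]]
print(chamfer_distance(seeds))   # [[0, 3, 6], [3, 4, 3], [6, 3, 0]]
```

Each scan only reads neighbors the pass has already finalized, which is what makes a block-wise, external-memory decomposition of these scans plausible once block-boundary effects are handled correctly.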
The innovation of information technology has changed many aspects of our lives. In the health care field, computer-integrated devices let us obtain, manage, and communicate large volumes of high-quality image data to support medical care. In this dissertation I propose several promising methods that can assist physicians in processing, observing, and communicating such image data. They fall under my three research aspects: telemedicine integration, medical image visualization, and image segmentation; the methods are also demonstrated by the demo software that I developed. One research focus is medical information storage standards in telemedicine, in particular DICOM, the predominant standard for the storage and communication of medical images. I propose a novel 3D image data storage method, which is lacking in the current DICOM standard, and I created a mechanism for making use of non-standard or private DICOM files. This thesis also presents several rendering techniques for medical image visualization that offer different display modes, both 2D and 3D: cutting through the data volume at an arbitrary angle, rendering the surface shell of the data, and rendering the semi-transparent volume of the data. Finally, a hybrid approach for the semi-automated segmentation of radiological images, such as CT or MRI, is proposed to extract an organ or region of interest from the image. This approach takes advantage of both region-based and boundary-based methods and consists of three steps: the first step obtains a coarse segmentation via fuzzy affinity and generates a homogeneity operator; the second step divides the image by a Voronoi diagram and reclassifies the regions with the operator, refining the segmentation of the previous step; the third step handles vague boundaries with a level set model.
Topics for future research are mentioned at the end, including a new supplement to the DICOM standard for storing segmentation information, visualization of multimodal image information, and extension of the segmentation approach to higher dimensions.
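The region-based ingredient of the hybrid pipeline above can be shown in miniature with a plain region-growing pass (the thesis uses fuzzy affinity, Voronoi reclassification, and a level set model; this stdlib sketch with invented intensity data only illustrates the region-based idea):

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed`, adding 4-connected pixels whose intensity
    differs from the seed intensity by at most `tol`. A minimal stand-in for
    the region-based first step of a hybrid segmentation pipeline."""
    h, w = len(image), len(image[0])
    sy, sx = seed
    base = image[sy][sx]
    region = {(sy, sx)}
    queue = deque([(sy, sx)])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and abs(image[ny][nx] - base) <= tol):
                region.add((ny, nx))
                queue.append((ny, nx))
    return region

# Toy "scan": a homogeneous block on the left, brighter tissue on the right.
scan = [[10, 11, 50, 52],
        [12, 10, 49, 51],
        [11, 12, 50, 50]]
organ = region_grow(scan, seed=(0, 0), tol=5)
print(sorted(organ))   # the six pixels of the left, homogeneous block
```

A coarse result like this is exactly what the later steps refine: region reclassification sharpens the partition and a level set handles boundaries the homogeneity criterion leaves vague.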
One of the main problems in machine learning is to train a predictive model from training data and to make predictions on test data. Most predictive models are constructed under the assumption that the training data is governed by the exact same distribution which the model will later be exposed to. In practice, control over the data collection process is often imperfect. A typical scenario is when labels are collected by questionnaires and one does not have access to the test population; for example, parts of the test population are underrepresented in the survey, out of reach, or do not return the questionnaire. In many applications, training data from the test distribution are scarce because they are difficult to obtain or very expensive, while data from auxiliary sources drawn from similar distributions are often cheaply available. This thesis centers around learning under differing training and test distributions and covers several problem settings with different assumptions on the relationship between training and test distributions, including multi-task learning and learning under covariate shift and sample selection bias. Several new models are derived that directly characterize the divergence between training and test distributions, without the intermediate step of estimating training and test distributions separately. Integral to these models are rescaling weights that match the rescaled or resampled training distribution to the test distribution. Integrated models are studied where only one optimization problem needs to be solved for learning under differing distributions. With a two-step approximation to the integrated models, almost any supervised learning algorithm can be adapted to biased training data. In case studies on spam filtering, HIV therapy screening, targeted advertising, and other applications, the performance of the new models is compared to state-of-the-art reference methods.
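The rescaling weights described above can be illustrated on a toy one-dimensional example where both densities are known Gaussians, so the weight p_test(x)/p_train(x) has a closed form. In the thesis the weights are learned from data without estimating either density; this sketch with invented numbers only shows what correctly matched weights achieve.

```python
import math
import random

def importance_weight(x, mu_train=0.0, mu_test=1.0, sigma=1.0):
    """Density ratio p_test(x)/p_train(x) for two equal-variance Gaussians;
    the ratio simplifies to a single exponential in x."""
    return math.exp((x * (mu_test - mu_train)
                     - 0.5 * (mu_test**2 - mu_train**2)) / sigma**2)

random.seed(0)
train = [random.gauss(0.0, 1.0) for _ in range(20000)]   # training distribution
weights = [importance_weight(x) for x in train]

plain_mean = sum(train) / len(train)                     # estimates E_train[x] = 0
weighted_mean = sum(w * x for w, x in zip(weights, train)) / sum(weights)
print(round(plain_mean, 2), round(weighted_mean, 2))     # ~0.0 versus ~1.0
```

The reweighted training sample behaves like a sample from the test distribution: the weighted mean estimates the test mean of 1, even though every data point was drawn from the training distribution centered at 0.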
Although educational content in electronic form is increasing dramatically, its usage in educational environments remains poor, mainly because there is too much unreliable, redundant, and irrelevant information. Finding appropriate answers is a rather difficult task that relies on the user to filter the pertinent information from the noise. Turning knowledge bases like the online tele-TASK archive into useful educational resources requires identifying correct, reliable, and "machine-understandable" information, as well as developing simple but efficient search tools with the ability to reason over this information. Our vision is to create an E-Librarian Service that is able to retrieve multimedia resources from a knowledge base more efficiently than by browsing through an index or by using a simple keyword search. In our E-Librarian Service, the user can enter a question in a very simple and human way: in natural language (NL). Our premise is that more pertinent results are retrieved if the search engine understands the sense of the user's query; the returned results are then logical consequences of an inference rather than of keyword matching. Our E-Librarian Service does not return the answer to the user's question; rather, it retrieves the most pertinent document(s), in which the user finds the answer. Among all the documents that have some information in common with the user's query, our E-Librarian Service identifies the most pertinent match(es), keeping in mind that the user expects an exhaustive answer while preferring a concise one with little or no information overhead. Also, our E-Librarian Service always proposes a solution to the user, even if the system concludes that there is no exhaustive answer. Our E-Librarian Service was implemented prototypically in three different educational tools.
A first prototype is CHESt (Computer History Expert System), with a knowledge base of 300 multimedia clips covering the main events in computer history. A second prototype is MatES (Mathematics Expert System), with a knowledge base of 115 clips covering the topic of fractions in secondary-school mathematics w.r.t. the official school programme; all clips were recorded mainly by pupils. The third and most advanced prototype is the "Lecture Butler's E-Librarian Service", which has a Web service interface following a service-oriented architecture (SOA) and was developed in the context of the Web-University project at the Hasso-Plattner-Institute (HPI). Two major experiments in an educational environment, at the Lycée Technique Esch/Alzette in Luxembourg, were conducted to test the pertinence and reliability of our E-Librarian Service as a complement to traditional courses. The first experiment (in 2005) was made with CHESt in different classes and covered a single lesson. The second experiment (in 2006) covered a period of six weeks of intensive use of MatES in one class. There was no classical mathematics lesson in which the teacher gave explanations; instead, the students had to learn in an autonomous and exploratory way, asking questions of the E-Librarian Service just as they would ask a human teacher.
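The difference between keyword matching and retrieval by inference, as described above, can be shown with a toy concept hierarchy: documents annotated with ontology concepts also answer queries about any superconcept, via subsumption. The hierarchy and clip names here are invented for illustration, not taken from the CHESt knowledge base.

```python
# Toy retrieval by inference over a subsumption hierarchy (invented data).

SUBCLASS_OF = {                      # concept -> direct superconcept
    "Microprocessor": "Hardware",
    "Transistor": "Hardware",
    "Hardware": "ComputerScience",
    "Algorithm": "ComputerScience",
}

def subsumed_by(concept, query):
    """True if `concept` is `query` itself or a transitive subconcept of it."""
    while concept is not None:
        if concept == query:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

documents = {                        # clip -> concept it is annotated with
    "clip1": "Microprocessor",
    "clip2": "Transistor",
    "clip3": "Algorithm",
}

def retrieve(query_concept):
    return sorted(doc for doc, c in documents.items()
                  if subsumed_by(c, query_concept))

print(retrieve("Hardware"))   # clip1 and clip2, although neither clip
                              # mentions the literal keyword "Hardware"
```

A pure keyword search over the annotations would return nothing for "Hardware"; the inference-based search returns the clips whose concepts logically entail it.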
With the rise of electronic integration between organizations, the need for a precise specification of interaction behavior increases. Information systems, replacing interaction previously carried out by humans via phone, fax, and email, require a precise specification for handling all possible situations. Such interaction behavior is described in process choreographies. Choreographies enumerate the roles involved, the allowed interactions, the message contents, and the behavioral dependencies between interactions. Choreographies serve as interaction contracts and are the starting point for adapting existing business processes and systems or for implementing new software components. As a thorough analysis and comparison of choreography modeling languages is missing in the literature, this thesis introduces a requirements framework for choreography languages and uses it to compare current choreography languages. Language proposals for overcoming the identified limitations are given for choreography modeling on both the conceptual and the technical level. In the interconnection modeling style, behavioral dependencies are defined on a per-role basis and the different roles are interconnected using message flow. This thesis reveals a number of modeling "anti-patterns" for interconnection modeling, motivating further investigation of choreography languages following the interaction modeling style. Here, interactions are seen as atomic building blocks and the behavioral dependencies between them are defined globally. Two novel language proposals are put forward for this modeling style, which have already influenced industrial standardization initiatives. While avoiding many of the pitfalls of interconnection modeling, new anomalies can arise in interaction models: a choreography might not be realizable, i.e., there does not exist a set of interacting roles that collectively realize the specified behavior. This thesis investigates different dimensions of realizability.
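The realizability anomaly described above can be made concrete under a deliberately simplified, invented encoding: a choreography is a set of allowed global orderings of interactions, each interaction names its participants, and roles can only observe the interactions they take part in. The composition of the role projections may then admit more orderings than the choreography allows.

```python
from itertools import permutations

def project(trace, role, participants):
    """A role's local view of a trace: the interactions it takes part in."""
    return tuple(i for i in trace if role in participants[i])

def realizable(allowed, participants):
    """True iff the interacting roles collectively produce exactly the
    allowed global orderings (a brute-force toy check, invented encoding)."""
    roles = {r for ps in participants.values() for r in ps}
    interactions = list(participants)
    allowed = {tuple(t) for t in allowed}
    # Every local ordering a role may observe in some allowed trace.
    local = {r: {project(t, r, participants) for t in allowed} for r in roles}
    producible = {
        t for t in permutations(interactions)
        if all(project(t, r, participants) in local[r] for r in roles)
    }
    return producible == allowed

participants = {"a": {"R1", "R2"}, "b": {"R3", "R4"}}

# The choreography demands interaction a strictly before b, but the roles
# of b never observe a, so the reversed order is also producible.
print(realizable([("a", "b")], participants))              # False: not realizable

# Allowing both orders makes the specification realizable.
print(realizable([("a", "b"), ("b", "a")], participants))  # True
```

The failing case is the classic non-local ordering problem: no additional messages exist to tell R3 and R4 that interaction a has already happened.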
The programmable network envisioned in the 1990s within standardization and research for the Intelligent Network is currently coming into reality using IP-based Next Generation Networks (NGN) and applying Service-Oriented Architecture (SOA) principles for service creation, execution, and hosting. SOA is the foundation for both next-generation telecommunications and middleware architectures, which are rapidly converging on top of commodity transport services. Services such as triple/quadruple play, multimedia messaging, and presence are enabled by the emerging service-oriented IP Multimedia Subsystem (IMS), and allow telecommunications service providers to maintain, if not improve, their position in the marketplace. SOA has become the de facto standard in next-generation middleware systems as the system model of choice for interconnecting service consumers and providers within and between enterprises. We leverage previous research activities in overlay networking technologies along with recent advances in network abstraction, service exposure, and service creation to develop a paradigm for a service environment providing converged Internet and telecommunications services that we call Service Broker. Such a Service Broker provides mechanisms to combine and mediate between the different service paradigms of the two domains Internet/WWW and telecommunications. Furthermore, it enables the composition of services across these domains and is capable of defining and applying temporal constraints during creation and execution time. By adding network-awareness to the service fabric, such a Service Broker may also act as a next-generation network-to-service element allowing the composition of cross-domain and cross-layer network and service resources.
The contribution of this research is threefold: first, we analyze and classify principles and technologies from Information Technology (IT) and telecommunications to identify and discuss the issues involved in allowing cross-domain composition in a converging service layer. Second, we discuss service composition methods allowing the creation of converged services on an abstract level; in particular, we present a formalized method for model checking such compositions. Finally, we propose a Service Broker architecture that converges Internet and telecommunications services. This environment enables cross-domain feature interaction in services through formalized obligation policies acting as constraints during service discovery, creation, and execution time.
This thesis presents methods for the automated synthesis of flexible chip multiprocessor systems from parallel programs targeted at FPGAs to exploit both task-level parallelism and architecture customization. Automated synthesis is necessitated by the complexity of the design space. A detailed description of the design space is provided in order to determine which parameters should be modeled to facilitate automated synthesis by optimizing a cost function, the emphasis being placed on inclusive modeling of parameters from the application, architectural and physical subspaces, as well as their joint coverage in order to avoid pre-constraining the design space. Given a parallel program and an IP component library, the automated synthesis problem is to simultaneously (i) select processors, (ii) map and schedule tasks onto them, and (iii) select one or more networks for inter-task communication such that design constraints and optimization objectives are met. The research objective in this thesis is to find a suitable model for automated synthesis, and to evaluate methods of using the model for architectural optimizations. Our contributions are a holistic approach for the design of such systems, corresponding models to facilitate automated synthesis, an evaluation of optimization methods using state-of-the-art integer linear programming and answer set programming, as well as the development of synthesis heuristics to overcome runtime challenges.
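The joint selection-and-mapping problem described in the abstract can be illustrated on a toy scale by exhaustive search: choose which processor types to instantiate and which tasks run on them, minimizing combined area and execution cost under an area budget. All task names, processor types and cost values below are hypothetical, invented for illustration; the thesis itself solves such problems with integer linear programming, answer set programming, and heuristics, not by brute force.

```python
from itertools import product

# Hypothetical toy instance: 3 tasks, 2 processor types (a soft-core CPU
# and a custom accelerator). exec_cost gives the cost of running a task
# on a processor type; area_cost is paid once per instantiated type.
tasks = ["t0", "t1", "t2"]
proc_types = ["cpu", "acc"]
exec_cost = {("t0", "cpu"): 5, ("t0", "acc"): 2,
             ("t1", "cpu"): 3, ("t1", "acc"): 4,
             ("t2", "cpu"): 6, ("t2", "acc"): 1}
area_cost = {"cpu": 10, "acc": 25}
area_budget = 35

def best_mapping():
    """Exhaustively enumerate all task-to-processor-type assignments and
    return (total_cost, mapping) for the cheapest one whose instantiated
    processors fit the area budget, or None if no assignment fits."""
    best = None
    for assign in product(proc_types, repeat=len(tasks)):
        used = set(assign)                       # processor types instantiated
        area = sum(area_cost[p] for p in used)
        if area > area_budget:                   # design constraint
            continue
        total = area + sum(exec_cost[t, p] for t, p in zip(tasks, assign))
        if best is None or total < best[0]:
            best = (total, dict(zip(tasks, assign)))
    return best

print(best_mapping())  # -> (24, {'t0': 'cpu', 't1': 'cpu', 't2': 'cpu'})
```

With these particular costs, instantiating only the CPU wins because the accelerator's area overhead outweighs its execution savings; real instances are far too large for enumeration, which is what motivates the ILP/ASP formulations and heuristics evaluated in the thesis.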