TY - JOUR
A1 - Laskov, Pavel
A1 - Gehl, Christian
A1 - Krüger, Stefan
A1 - Müller, Klaus-Robert
T1 - Incremental support vector learning: analysis, implementation and applications
JF - Journal of machine learning research
N2 - Incremental Support Vector Machines (SVM) are instrumental in practical applications of online learning. This work focuses on the design and analysis of efficient incremental SVM learning, with the aim of providing a fast, numerically stable and robust implementation. A detailed analysis of convergence and of algorithmic complexity of incremental SVM learning is carried out. Based on this analysis, a new design of storage and numerical operations is proposed, which speeds up the training of an incremental SVM by a factor of 5 to 20. The performance of the new algorithm is demonstrated in two scenarios: learning with limited resources and active learning. Various applications of the algorithm, such as in drug discovery, online monitoring of industrial devices and surveillance of network traffic, can be foreseen.
KW - incremental SVM
KW - online learning
KW - drug discovery
KW - intrusion detection
Y1 - 2006
SN - 1532-4435
VL - 7
SP - 1909
EP - 1936
PB - MIT Press
CY - Cambridge, Mass.
ER -
TY - JOUR
A1 - Steuer, Ralf
A1 - Humburg, Peter
A1 - Selbig, Joachim
T1 - Validation and functional annotation of expression-based clusters based on gene ontology
JF - BMC bioinformatics
N2 - Background: The biological interpretation of large-scale gene expression data is one of the paramount challenges in current bioinformatics. In particular, placing the results in the context of other available functional genomics data, such as existing bio-ontologies, has already provided substantial improvement for detecting and categorizing genes of interest. One common approach is to look for functional annotations that are significantly enriched within a group or cluster of genes, as compared to a reference group. Results: In this work, we suggest the information-theoretic concept of mutual information to investigate the relationship between groups of genes, as given by data-driven clustering, and their respective functional categories. Drawing upon related approaches (Gibbons and Roth, Genome Research 12: 1574-1581, 2002), we seek to quantify to what extent individual attributes are sufficient to characterize a given group or cluster of genes. Conclusion: We show that the mutual information provides a systematic framework to assess the relationship between groups or clusters of genes and their functional annotations in a quantitative way. Within this framework, the mutual information allows us to address and incorporate several important issues, such as the interdependence of functional annotations and combinatorial combinations of attributes. It thus supplements and extends the conventional search for overrepresented attributes within a group or cluster of genes. In particular, by taking combinations of attributes into account, the mutual information opens the way to uncover specific functional descriptions of a group of genes or clustering result. All datasets and functional annotations used in this study are publicly available. All scripts used in the analysis are provided as additional files.
Y1 - 2006
U6 - https://doi.org/10.1186/1471-2105-7-380
SN - 1471-2105
VL - 7
IS - 380
PB - BioMed Central
CY - London
ER -
TY - JOUR
A1 - Bordihn, Henning
A1 - Fernau, Henning
A1 - Holzer, Markus
A1 - Manca, Vincenzo
A1 - Martin-Vide, Carlos
T1 - Iterated sequential transducers as language generating devices
JF - Theoretical computer science
N2 - Iterated finite state sequential transducers are considered as language generating devices. The hierarchy induced by the size of the state alphabet is proved to collapse to the fourth level. The corresponding language families are related to the families of languages generated by Lindenmayer systems and Chomsky grammars. Finally, some results on deterministic and extended iterated finite state transducers are established.
KW - finite state sequential transducers
KW - state complexity
KW - Lindenmayer systems
Y1 - 2006
U6 - https://doi.org/10.1016/j.tcs.2006.07.059
SN - 0304-3975
VL - 369
IS - 1
SP - 67
EP - 81
PB - Elsevier
CY - Amsterdam
ER -
TY - JOUR
A1 - Grell, Susanne
A1 - Schaub, Torsten H.
A1 - Selbig, Joachim
T1 - Modelling biological networks by action languages via set programming
Y1 - 2006
UR - http://www.cs.uni-potsdam.de/wv/pdfformat/gebsch06c.pdf
U6 - https://doi.org/10.1007/11799573
SN - 0302-9743
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Schaub, Torsten H.
A1 - Tompits, Hans
T1 - A preference-based framework for updating logic programs : preliminary report
Y1 - 2006
UR - http://www.easychair.org/FLoC-06/PREFS-preproceedings.pdf
ER -
TY - JOUR
A1 - Gressmann, Jean
A1 - Janhunen, Tomi
A1 - Mercer, Robert E.
A1 - Schaub, Torsten H.
A1 - Thiele, Sven
A1 - Tichy, Richard
T1 - On probing and multi-threading in platypus
Y1 - 2006
UR - http://www2.in.tu-clausthal.de/~tmbehrens/NMR_Proc_TR4.pdf
ER -
TY - BOOK
ED - Jürgensen, Helmut
T1 - Accessible Media : Pre-Proceedings of a Workshop, Potsdam, 8-9 May 2006
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 7
PB - Univ.
CY - Potsdam
ER -
TY - BOOK
A1 - Krahmer, Sebastian
T1 - Generating runtime call graphs
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 8
PB - Univ.
CY - Potsdam
ER -
TY - THES
A1 - Leininger, Andreas
T1 - New diagnosis and test methods with high compaction rates
Y1 - 2006
SN - 3-86664-066-8
PB - Mensch & Buch Verl.
CY - Berlin
ER -
TY - JOUR
A1 - Anger, Christian
A1 - Gebser, Martin
A1 - Schaub, Torsten H.
T1 - Approaching the core of unfounded sets
Y1 - 2006
UR - http://www.cs.uni-potsdam.de/wv/pdfformat/angesc06a.pdf
ER -
TY - JOUR
A1 - Shenoy, Pradeep
A1 - Krauledat, Matthias
A1 - Blankertz, Benjamin
A1 - Rao, Rajesh P. N.
A1 - Müller, Klaus-Robert
T1 - Towards adaptive classification for BCI
N2 - Non-stationarities are ubiquitous in EEG signals. They are especially apparent in the use of EEG-based brain-computer interfaces (BCIs): (a) in the differences between the initial calibration measurement and the online operation of a BCI, or (b) caused by changes in the subject's brain processes during an experiment (e.g. due to fatigue, change of task involvement, etc.). In this paper, we quantify for the first time such systematic evidence of statistical differences in data recorded during offline and online sessions. Furthermore, we propose novel techniques of investigating and visualizing data distributions, which are particularly useful for the analysis of (non-)stationarities. Our study shows that the brain signals used for control can change substantially from the offline calibration sessions to online control, and also within a single session. In addition to this general characterization of the signals, we propose several adaptive classification schemes and study their performance on data recorded during online experiments. An encouraging result of our study is that surprisingly simple adaptive methods in combination with an offline feature selection scheme can significantly increase BCI performance.
Y1 - 2006
UR - http://iopscience.iop.org/1741-2552/3/1/R02/
U6 - https://doi.org/10.1088/1741-2560/3/1/R02
ER -
TY - JOUR
A1 - Blankertz, Benjamin
A1 - Dornhege, Guido
A1 - Krauledat, Matthias
A1 - Müller, Klaus-Robert
A1 - Kunzmann, Volker
A1 - Losch, Florian
A1 - Curio, Gabriel
T1 - The Berlin brain-computer interface : EEG-based communication without subject training
N2 - The Berlin Brain-Computer Interface (BBCI) project develops a noninvasive BCI system whose key features are 1) the use of well-established motor competences as control paradigms, 2) high-dimensional features from 128-channel electroencephalogram (EEG), and 3) advanced machine learning techniques. As reported earlier, our experiments demonstrate that very high information transfer rates can be achieved using the readiness potential (RP) when predicting the laterality of upcoming left- versus right-hand movements in healthy subjects. A more recent study showed that the RP similarly accompanies phantom movements in arm amputees, but the signal strength decreases with longer loss of the limb. In a complementary approach, oscillatory features are used to discriminate imagined movements (left hand versus right hand versus foot). In a recent feedback study with six healthy subjects with no or very little experience with BCI control, three subjects achieved an information transfer rate above 35 bits per minute (bpm), two further subjects achieved rates above 24 and 15 bpm, while one subject could not achieve any BCI control. These results are encouraging for an EEG-based BCI system in untrained subjects that is independent of peripheral nervous system activity and does not rely on evoked potentials, even when compared to results with very well-trained subjects operating other BCI systems.
Y1 - 2006
UR - http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7333
U6 - https://doi.org/10.1109/Tnsre.2006.875557
SN - 1534-4320
ER -
TY - JOUR
A1 - Willig, Andreas
A1 - Mitschke, Robert
T1 - Results of bit error measurements with sensor nodes and casuistic consequences for design of energy-efficient error control schemes
N2 - For the proper design of energy-efficient error control schemes some insight into channel error patterns is needed. This paper presents bit error and packet loss measurements taken with sensor nodes running the popular RFM
Y1 - 2006
SN - 978-3-540-32158-3
ER -
TY - JOUR
A1 - Rozinat, A.
A1 - Van der Aalst, Wil M. P.
T1 - Conformance testing: Measuring the fit and appropriateness of event logs and process models
N2 - Most information systems log events (e.g., transaction logs, audit trails) to audit and monitor the processes they support. At the same time, many of these processes have been explicitly modeled. For example, SAP R/3 logs events in transaction logs and there are EPCs (Event-driven Process Chains) describing the so-called reference models. These reference models describe how the system should be used. The coexistence of event logs and process models raises an interesting question: "Does the event log conform to the process model and vice versa?". This paper demonstrates that there is not a simple answer to this question. To tackle the problem, we distinguish two dimensions of conformance: fitness (the event log may be the result of the process modeled) and appropriateness (the model is a likely candidate from a structural and behavioral point of view). Different metrics have been defined and a Conformance Checker has been implemented within the ProM Framework.
Y1 - 2006
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Schaub, Torsten H.
T1 - Tableau calculi for answer set programming
Y1 - 2006
UR - http://www.cs.uni-potsdam.de/wv/pdfformat/gebsch06c.pdf
U6 - https://doi.org/10.1007/11799573
SN - 0302-9743
ER -
TY - JOUR
A1 - Konczak, Kathrin
T1 - Voting Theory in Answer Set Programming
Y1 - 2006
ER -
TY - JOUR
A1 - Anger, Christian
A1 - Gebser, Martin
A1 - Janhunen, Tomi
A1 - Schaub, Torsten H.
T1 - What's a head without a body?
Y1 - 2006
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Lee, Joohyung
A1 - Lierler, Yuliya
T1 - Elementary sets for logic programs
Y1 - 2006
SN - 978-1-57735-281-5
ER -
TY - JOUR
A1 - Gebser, Martin
A1 - Schaub, Torsten H.
T1 - Characterizing (ASP) inferences by unit propagation
Y1 - 2006
ER -
TY - JOUR
A1 - Konczak, Kathrin
T1 - Weak order equivalence for logic programs with preferences
Y1 - 2006
ER -
TY - JOUR
A1 - Konczak, Kathrin
A1 - Linke, Thomas
A1 - Schaub, Torsten H.
T1 - Graphs and colorings for answer set programming
N2 - We investigate the usage of rule dependency graphs and their colorings for characterizing and computing answer sets of logic programs. This approach provides us with insights into the interplay between rules when inducing answer sets. We start with different characterizations of answer sets in terms of totally colored dependency graphs that differ in graph-theoretical aspects. We then develop a series of operational characterizations of answer sets in terms of operators on partial colorings. In analogy to the notion of a derivation in proof theory, our operational characterizations are expressed as (non-deterministically formed) sequences of colorings, turning an uncolored graph into a totally colored one. In this way, we obtain an operational framework in which different combinations of operators result in different formal properties. Among others, we identify the basic strategy employed by the noMoRe system and justify its algorithmic approach. Furthermore, we distinguish operations corresponding to Fitting's operator as well as to well-founded semantics.
Y1 - 2006
UR - http://www.cs.kuleuven.ac.be/~dtai/projects/ALP//TPLP/
U6 - https://doi.org/10.1017/S1471068405002528
SN - 1471-0684
ER -
TY - BOOK
A1 - Hüttenrauch, Stefan
A1 - Kylau, Uwe
A1 - Grund, Martin
A1 - Queck, Tobias
A1 - Ploskonos, Anna
A1 - Schreiter, Torben
A1 - Breest, Martin
A1 - Haubrock, Sören
A1 - Bouche, Paul
T1 - Fundamentals of Service-Oriented Engineering
BT - Proceedings of the Fall 2006 Workshop of the HPI Research School on Service-Oriented Systems Engineering
T3 - Technische Berichte des Hasso-Plattner-Instituts für Softwaresystemtechnik an der Universität Potsdam
Y1 - 2006
SN - 3-939469-35-1
SN - 1613-5652
VL - 18
PB - Universitätsverlag Potsdam
CY - Potsdam
ER -
TY - JOUR
A1 - Cordes, Frank
A1 - Kaiser, Rolf
A1 - Selbig, Joachim
T1 - Bioinformatics approach to predicting HIV drug resistance
N2 - The emergence of drug resistance remains one of the most challenging issues in the treatment of HIV-1 infection. The extreme replication dynamics of HIV facilitates its escape from the selective pressure exerted by the human immune system and by the applied combination drug therapy. This article reviews computational methods whose combined use can support the design of optimal antiretroviral therapies based on viral genotypic and phenotypic data. Genotypic assays are based on the analysis of mutations associated with reduced drug susceptibility, but are difficult to interpret due to the numerous mutations and mutational patterns that confer drug resistance. Phenotypic resistance or susceptibility can be experimentally evaluated by measuring the inhibition of the viral replication in cell culture assays. However, this procedure is expensive and time-consuming.
Y1 - 2006
UR - http://www.expert-reviews.com/loi/erm
U6 - https://doi.org/10.1586/14737159.6.2.207
SN - 1473-7159
ER -
TY - JOUR
A1 - Lemm, Steven
A1 - Curio, Gabriel
A1 - Hlushchuk, Yevhen
A1 - Müller, Klaus-Robert
T1 - Enhancing the signal-to-noise ratio of ICA-based extracted ERPs
N2 - When decomposing single-trial electroencephalography, it is a challenge to incorporate prior physiological knowledge. Here, we develop a method that uses prior information about the phase-locking property of event-related potentials in a regularization framework to bias a blind source separation algorithm toward an improved separation of single-trial phase-locked responses in terms of an increased signal-to-noise ratio. In particular, we suggest a transformation of the data, using a weighted average of the single-trial and trial-averaged response, that redirects the focus of source separation methods onto the subspace of event-related potentials. The practical benefit with respect to an improved separation of such components from ongoing background activity and extraneous noise is first illustrated on artificial data and finally verified in a real-world application of extracting single-trial somatosensory evoked potentials from multichannel EEG recordings.
Y1 - 2006
UR - http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=10
U6 - https://doi.org/10.1109/Tbme.2006.870258
SN - 0018-9294
ER -
TY - THES
A1 - Hetzer, Dirk
T1 - Adaptive Quality of Service based Bandwidth Planning in Internet
Y1 - 2006
CY - Potsdam
ER -
TY - JOUR
A1 - Laub, Julian
A1 - Roth, Volker
A1 - Buhmann, Joachim
A1 - Müller, Klaus-Robert
T1 - On the information and representation of non-Euclidean pairwise data
N2 - Two common data representations are mostly used in intelligent data analysis, namely the vectorial and the pairwise representation. Pairwise data which satisfy the restrictive conditions of Euclidean spaces can be faithfully translated into a Euclidean vectorial representation by embedding. Non-metric pairwise data with violations of symmetry, reflexivity or triangle inequality pose a substantial conceptual problem for pattern recognition since the amount of predictive structural information beyond what can be measured by embeddings is unclear. We show by systematic modeling of non-Euclidean pairwise data that there exist metric violations which can carry valuable problem-specific information. Furthermore, Euclidean and non-metric data can be unified on the level of structural information contained in the data. Stable component analysis selects linear subspaces which are particularly insensitive to data fluctuations. Experimental results from different domains support our pattern recognition strategy.
Y1 - 2006
UR - http://www.sciencedirect.com/science/journal/00313203
U6 - https://doi.org/10.1016/j.patcog.2006.04.016
SN - 0031-3203
ER -
TY - JOUR
A1 - Mileo, Alessandra
A1 - Schaub, Torsten H.
T1 - Extending ordered disjunctions for policy enforcement : preliminary report
Y1 - 2006
UR - http://www.easychair.org/FLoC-06/PREFS-preproceedings.pdf
ER -
TY - JOUR
A1 - Kawanabe, Motoaki
A1 - Blanchard, Gilles
A1 - Sugiyama, Masashi
A1 - Spokoiny, Vladimir G.
A1 - Müller, Klaus-Robert
T1 - A novel dimension reduction procedure for searching non-Gaussian subspaces
N2 - In this article, we consider high-dimensional data which contains a low-dimensional non-Gaussian structure contaminated with Gaussian noise and propose a new linear method to identify the non-Gaussian subspace. Our method NGCA (Non-Gaussian Component Analysis) is based on a very general semi-parametric framework and has a theoretical guarantee that the estimation error of finding the non-Gaussian components tends to zero at a parametric rate. NGCA can be used not only as preprocessing for ICA, but also for extracting and visualizing more general structures like clusters. A numerical study demonstrates the usefulness of our method.
Y1 - 2006
UR - http://www.springerlink.com/content/105633/
U6 - https://doi.org/10.1007/11679363_19
SN - 0302-9743
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Liu, Daphne H.
A1 - Schaub, Torsten H.
A1 - Thiele, Sven
T1 - COBA 2.0 : a consistency-based belief change system
Y1 - 2006
UR - http://www2.in.tu-clausthal.de/~tmbehrens/NMR_Proc_TR4.pdf
ER -
TY - JOUR
A1 - Delgrande, James Patrick
A1 - Schaub, Torsten H.
A1 - Tompits, Hans
T1 - An extended query language for action languages (and its application to aggregates and preferences)
Y1 - 2006
UR - http://www2.in.tu-clausthal.de/~tmbehrens/NMR_Proc_TR4.pdf
ER -
TY - BOOK
A1 - Jürgensen, Helmut
T1 - Complexity, information, energy
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 6
PB - Univ.
CY - Potsdam
ER -
TY - JOUR
A1 - Pernici, Barbara
A1 - Weske, Mathias
T1 - Business process management
Y1 - 2006
SN - 0169-023X
ER -
TY - BOOK
A1 - Balan, Sakthin M.
A1 - Jürgensen, Helmut
T1 - Peptide computing : universality and theoretical model
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 1
PB - Univ.
CY - Potsdam
ER -
TY - BOOK
A1 - Krahmer, Sebastian
T1 - Control flow integrity with ptrace()
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 2
PB - Univ.
CY - Potsdam
ER -
TY - BOOK
A1 - Balan, Sakthin M.
A1 - Jürgensen, Helmut
T1 - On the universality of peptide computing
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 9
PB - Univ.
CY - Potsdam
ER -
TY - BOOK
A1 - Krahmer, Sebastian
T1 - Hardened *OS exploitation techniques
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 4
PB - Univ.
CY - Potsdam
ER -
TY - BOOK
A1 - Jürgensen, Helmut
T1 - Synchronization
T3 - Preprint / Universität Potsdam, Institut für Informatik
Y1 - 2006
SN - 0946-7580
VL - 2006, 5
PB - Univ.
CY - Potsdam
ER -
TY - THES
A1 - Dornhege, Guido
T1 - Increasing information transfer rates for brain-computer interfacing
T1 - Erhöhung der Informationstransferrate einer Gehirn-Computer-Schnittstelle
N2 - The goal of a Brain-Computer Interface (BCI) consists of the development of a unidirectional interface between a human and a computer to allow control of a device only via brain signals. While the BCI systems of almost all other groups require the user to be trained over several weeks or even months, the group of Prof. Dr. Klaus-Robert Müller in Berlin and Potsdam, which I belong to, was one of the first research groups in this field which used machine learning techniques on a large scale. The adaptivity of the processing system to the individual brain patterns of the subject confers huge advantages on the user. Thus BCI research is considered a hot topic in machine learning and computer science. It requires interdisciplinary cooperation between disparate fields such as neuroscience, since only by combining machine learning and signal processing techniques based on neurophysiological knowledge will the largest progress be made. In this work I particularly deal with my part of this project, which lies mainly in the area of computer science. I have considered the following three main points: Establishing a performance measure based on information theory: I have critically illuminated the assumptions of Shannon's information transfer rate for application in a BCI context. By establishing suitable coding strategies I was able to show that this theoretical measure approximates quite well what is practically achievable. Transfer and development of suitable signal processing and machine learning techniques: One substantial component of my work was to develop several machine learning and signal processing algorithms to improve the efficiency of a BCI. Based on the neurophysiological knowledge that several independent EEG features can be observed for some mental states, I have developed a method for combining different and possibly independent features which improved performance. 
In some cases the combination algorithm outperforms the best single-feature performance by more than 50 %. Furthermore, I have theoretically and practically addressed, via the development of suitable algorithms, the question of the optimal number of classes which should be used for a BCI. It transpired that with BCI performances reported so far, three or four different mental states are optimal. For another extension I have combined ideas from signal processing with those of machine learning, since a high gain can be achieved if the temporal filtering, i.e., the choice of frequency bands, is automatically adapted to each subject individually. Implementation of the Berlin Brain-Computer Interface and realization of suitable experiments: Finally, a further substantial component of my work was to realize an online BCI system which includes the developed methods, but is also flexible enough to allow the simple realization of new algorithms and ideas. So far, bitrates of up to 40 bits per minute have been achieved with this system by absolutely untrained users, which, compared to the results of other groups, is highly successful.
N2 - Ein Brain-Computer Interface (BCI) ist eine unidirektionale Schnittstelle zwischen Mensch und Computer, bei der ein Mensch in der Lage ist, ein Gerät einzig und allein Kraft seiner Gehirnsignale zu steuern. In den BCI Systemen fast aller Forschergruppen wird der Mensch in Experimenten über Wochen oder sogar Monaten trainiert, geeignete Signale zu produzieren, die vordefinierten allgemeinen Gehirnmustern entsprechen. Die BCI Gruppe in Berlin und Potsdam, der ich angehöre, war in diesem Feld eine der ersten, die erkannt hat, dass eine Anpassung des Verarbeitungssystems an den Menschen mit Hilfe der Techniken des Maschinellen Lernens große Vorteile mit sich bringt. In unserer Gruppe und mittlerweile auch in vielen anderen Gruppen wird BCI somit als aktuelles Forschungsthema im Maschinellen Lernen und folglich in der Informatik mit interdisziplinärer Natur in Neurowissenschaften und anderen Feldern verstanden, da durch die geeignete Kombination von Techniken des Maschinellen Lernens und der Signalverarbeitung basierend auf neurophysiologischem Wissen der größte Erfolg erzielt werden konnte. In dieser Arbeit gehe ich auf meinem Anteil an diesem Projekt ein, der vor allem im Informatikbereich der BCI Forschung liegt. Im Detail beschäftige ich mich mit den folgenden drei Punkten: Diskussion eines informationstheoretischen Maßes für die Güte eines BCI's: Ich habe kritisch die Annahmen von Shannon's Informationsübertragungsrate für die Anwendung im BCI Kontext beleuchtet. Durch Ermittlung von geeigneten Kodierungsstrategien konnte ich zeigen, dass dieses theoretische Maß den praktisch erreichbaren Wert ziemlich gut annähert. Transfer und Entwicklung von geeigneten Techniken aus dem Bereich der Signalverarbeitung und des Maschinellen Lernens: Eine substantielle Komponente meiner Arbeit war die Entwicklung von Techniken des Machinellen Lernens und der Signalverarbeitung, um die Effizienz eines BCI's zu erhöhen. 
Basierend auf dem neurophysiologischem Wissen, dass verschiedene unabhängige Merkmale in Gehirnsignalen für verschiedene mentale Zustände beobachtbar sind, habe ich eine Methode zur Kombination von verschiedenen und unter Umständen unabhängigen Merkmalen entwickelt, die sehr erfolgreich die Fähigkeiten eines BCI's verbessert. Besonders in einigen Fällen übertraf die Leistung des entwickelten Kombinationsalgorithmus die beste Leistung auf den einzelnen Merkmalen mit mehr als 50 %. Weiterhin habe ich theoretisch und praktisch durch Einführung geeigneter Algorithmen die Frage untersucht, wie viele Klassen man für ein BCI nutzen kann und sollte. Auch hier wurde ein relevantes Resultat erzielt, nämlich dass für BCI Güten, die bis heute berichtet sind, die Benutzung von 3 oder 4 verschiedenen mentalen Zuständen in der Regel optimal im Sinne von erreichbarer Leistung sind. Für eine andere Erweiterung wurden Ideen aus der Signalverarbeitung mit denen des Maschinellen Lernens kombiniert, da ein hoher Erfolg erzielt werden kann, wenn der temporale Filter, d.h. die Wahl des benutzten Frequenzbandes, automatisch und individuell für jeden Menschen angepasst wird. Implementation des Berlin Brain-Computer Interfaces und Realisierung von geeigneten Experimenten: Eine weitere wichtige Komponente meiner Arbeit war eine Realisierung eines online BCI Systems, welches die entwickelten Methoden umfasst, aber auch so flexibel ist, dass neue Algorithmen und Ideen einfach zu verwirklichen sind. Bis jetzt wurden mit diesem System Bitraten von bis zu 40 Bits pro Minute von absolut untrainierten Personen in ihren ersten BCI Experimenten erzielt. Dieses Resultat übertrifft die bisher berichteten Ergebnisse aller anderer BCI Gruppen deutlich.
Note: The author was awarded the Michelson Prize 2005/2006 for the best doctoral dissertation of the year in the Faculty of Mathematics and Natural Sciences of the University of Potsdam.
KW - Kybernetik
KW - Maschinelles Lernen
KW - Gehirn-Computer-Schnittstelle
KW - BCI
KW - EEG
KW - Spatio-Spectral Filter
KW - Feedback
KW - Multi-Class
KW - Classification
KW - Signal Processing
KW - Brain Computer Interface
KW - Information Transfer Rate
KW - Machine Learning
KW - Single Trial Analysis
KW - Feature Combination
KW - Common Spatial Pattern
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-7690
ER -
TY - THES
A1 - Scholz, Matthias
T1 - Approaches to analyse and interpret biological profile data
T1 - Methoden zur Analyse und Interpretation biologischer Profildaten
N2 - Advances in biotechnologies rapidly increase the number of molecules of a cell which can be observed simultaneously. This includes expression levels of thousands or tens of thousands of genes as well as concentration levels of metabolites or proteins. Such profile data, observed at different times or at different experimental conditions (e.g., heat or dry stress), show how the biological experiment is reflected on the molecular level. This information is helpful to understand the molecular behaviour and to identify molecules or combinations of molecules that characterise a specific biological condition (e.g., disease). This work shows the potential of component extraction algorithms to identify the major factors which influenced the observed data. These can be the expected experimental factors such as time or temperature, as well as unexpected factors such as technical artefacts or even unknown biological behaviour. Extracting components means reducing the very high-dimensional data to a small set of new variables termed components. Each component is a combination of all original variables. The classical approach for that purpose is principal component analysis (PCA). It is shown that, in contrast to PCA, which maximises the variance only, modern approaches such as independent component analysis (ICA) are more suitable for analysing molecular data. The condition of independence between components of ICA fits more naturally with our assumption of individual (independent) factors which influence the data. This higher potential of ICA is demonstrated by a crossing experiment of the model plant Arabidopsis thaliana (Thale Cress). The experimental factors could be well identified and, in addition, ICA could even detect a technical artefact. However, in continuous observations such as time experiments, the data show, in general, a nonlinear distribution. To analyse such nonlinear data, a nonlinear extension of PCA is used. 
This nonlinear PCA (NLPCA) is based on a neural network algorithm. The algorithm is adapted to be applicable to incomplete molecular data sets. Thus, it also provides the ability to estimate the missing data. The potential of nonlinear PCA to identify nonlinear factors is demonstrated by a cold stress experiment of Arabidopsis thaliana. The results of component analysis can be used to build a molecular network model. Since it includes functional dependencies, it is termed a functional network. Applied to the cold stress data, it is shown that functional networks are appropriate to visualise biological processes and thereby reveal molecular dynamics.
N2 - Fortschritte in der Biotechnologie ermöglichen es, eine immer größere Anzahl von Molekülen in einer Zelle gleichzeitig zu erfassen. Das betrifft sowohl die Expressionswerte tausender oder zehntausender Gene als auch die Konzentrationswerte von Metaboliten oder Proteinen. Diese Profildaten verschiedener Zeitpunkte oder unterschiedlicher experimenteller Bedingungen (z.B. unter Stressbedingungen wie Hitze oder Trockenheit) zeigen, wie sich das biologische Experiment auf molekularer Ebene widerspiegelt. Diese Information kann genutzt werden, um molekulare Abläufe besser zu verstehen und um Moleküle oder Molekül-Kombinationen zu bestimmen, die für bestimmte biologische Zustände (z.B.: Krankheit) charakteristisch sind. Die Arbeit zeigt die Möglichkeiten von Komponenten-Extraktions-Algorithmen zur Bestimmung der wesentlichen Faktoren, die einen Einfluss auf die beobachteten Daten ausübten. Das können sowohl die erwarteten experimentellen Faktoren wie Zeit oder Temperatur sein als auch unerwartete Faktoren wie technische Einflüsse oder sogar unerwartete biologische Vorgänge. Unter der Extraktion von Komponenten versteht man die Reduzierung dieser stark hoch-dimensionalen Daten auf wenige neue Variablen, die eine Kombination aus allen ursprünglichen Variablen darstellen und als Komponenten bezeichnet werden. Die Standard-Methode für diesen Zweck ist die Hauptkomponentenanalyse (PCA). Es wird gezeigt, dass - im Vergleich zur nur die Varianz maximierenden PCA - moderne Methoden wie die Unabhängige Komponentenanalyse (ICA) für die Analyse molekularer Datensätze besser geeignet sind. Die Unabhängigkeit von Komponenten in der ICA entspricht viel besser unserer Annahme individueller (unabhängiger) Faktoren, die einen Einfluss auf die Daten ausüben. Dieser Vorteil der ICA wird anhand eines Kreuzungsexperiments mit der Modell-Pflanze Arabidopsis thaliana (Ackerschmalwand) demonstriert. 
Die experimentellen Faktoren konnten dabei gut identifiziert werden und ICA erkannte sogar zusätzlich einen technischen Störfaktor. Bei kontinuierlichen Beobachtungen wie in Zeitexperimenten zeigen die Daten jedoch häufig eine nichtlineare Verteilung. Für die Analyse dieser nichtlinearen Daten wird eine nichtlinear erweiterte Methode der PCA angewandt. Diese nichtlineare PCA (NLPCA) basiert auf einem neuronalen Netzwerk-Algorithmus. Der Algorithmus wurde für die Anwendung auf unvollständigen molekularen Daten erweitert. Dies ermöglicht es, die fehlenden Werte zu schätzen. Die Fähigkeit der nichtlinearen PCA zur Bestimmung nichtlinearer Faktoren wird anhand eines Kältestress-Experiments mit Arabidopsis thaliana demonstriert. Die Ergebnisse aus der Komponentenanalyse können zur Erstellung molekularer Netzwerk-Modelle genutzt werden. Da sie funktionelle Abhängigkeiten berücksichtigen, werden sie als Funktionale Netzwerke bezeichnet. Anhand der Kältestress-Daten wird demonstriert, dass solche funktionalen Netzwerke geeignet sind, biologische Prozesse zu visualisieren und dadurch die molekularen Dynamiken aufzuzeigen.
KW - Bioinformatik
KW - Hauptkomponentenanalyse
KW - Unabhängige Komponentenanalyse
KW - Neuronales Netz
KW - Maschinelles Lernen
KW - Fehlende Daten
KW - Ackerschmalwand
KW - nichtlineare PCA (NLPCA)
KW - molekulare Netzwerke
KW - nonlinear PCA (NLPCA)
KW - molecular networks
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-7839
ER -
TY - THES
A1 - Lunemann, Carolin
T1 - Quantum cryptography : security analysis of multiuser quantum communication with embedded authentication
N2 - Three quantum cryptographic protocols of multiuser quantum networks with embedded authentication, allowing quantum key distribution or quantum direct communication, are discussed in this work. The security of the protocols against different types of attacks is analysed, with a focus on various impersonation attacks and the man-in-the-middle attack. On the basis of the security analyses, several improvements are suggested and implemented in order to address the identified vulnerabilities. Furthermore, the impact of the eavesdropping test procedure on impersonation attacks is outlined. The framework of a general eavesdropping test is proposed to provide additional protection against security risks in impersonation attacks.
N2 - In der Diplomarbeit werden drei verschiedene quantenkryptographische Protokolle mit dem Schwerpunkt auf authentifizierten Quantennetzwerken analysiert. Die Sicherheit der Protokolle gegenüber verschiedenen Angriffen wird untersucht, wobei der Fokus auf kompletten Personifikationsattacken („impersonation attacks“) liegt. Auf Basis der Sicherheitsanalyse und den Netzwerkanforderungen werden entsprechende Verbesserungen vorgeschlagen. Um die Gefahr von Personifikationen realistisch abschätzen zu können, wird außerdem der Einfluss des Testablaufs analysiert. Um zusätzlichen Schutz gegen Personifikationsattacken zu gewährleisten, werden die Rahmenbedingungen für eine allgemeine Testspezifikation festgelegt.
KW - Kryptographie
KW - Quantenkryptographie
KW - Authentifizierung
KW - Netzwerk
KW - cryptography
KW - quantum cryptography
KW - authentication
KW - multiuser
KW - network
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-12756
ER -
TY - THES
A1 - Buchholz, Henrik
T1 - Real-time visualization of 3D city models
T1 - Echtzeit-Visualisierung von 3D-Stadtmodellen
N2 - An increasing number of applications requires user interfaces that facilitate the handling of large geodata sets. Using virtual 3D city models, complex geospatial information can be communicated visually in an intuitive way. Therefore, real-time visualization of virtual 3D city models represents a key functionality for interactive exploration, presentation, analysis, and manipulation of geospatial data. This thesis concentrates on the development and implementation of concepts and techniques for real-time city model visualization. It discusses rendering algorithms as well as complementary modeling concepts and interaction techniques. In particular, the work introduces a new real-time rendering technique to handle city models of high complexity with respect to texture size and number of textures. Such models are difficult to handle with current technology, primarily due to two problems: - Limited texture memory: The amount of simultaneously usable texture data is limited by the memory of the graphics hardware. - Limited number of textures: Using several thousand different textures simultaneously causes significant performance problems due to texture switch operations during rendering. The multiresolution texture atlases approach, introduced in this thesis, overcomes both problems. During rendering, it permanently maintains a small set of textures that is sufficient for the current view and the available screen resolution. The efficiency of multiresolution texture atlases is evaluated in performance tests. To summarize, the results demonstrate that the following goals have been achieved: - Real-time rendering becomes possible for 3D scenes whose amount of texture data exceeds the main memory capacity. - Overhead due to texture switches is kept permanently low, so that the number of different textures has no significant effect on the rendering frame rate.
Furthermore, this thesis introduces two new approaches for real-time city model visualization that use textures as core visualization elements: - An approach for visualization of thematic information. - An approach for illustrative visualization of 3D city models. Both techniques demonstrate that multiresolution texture atlases provide a basic functionality for the development of new applications and systems in the domain of city model visualization.
N2 - Eine zunehmende Anzahl von Anwendungen benötigt Benutzungsschnittstellen, um den Umgang mit großen Geodatenmengen zu ermöglichen. Virtuelle 3D-Stadtmodelle bieten eine Möglichkeit, komplexe raumbezogene Informationen auf intuitive Art und Weise visuell erfassbar zu machen. Echtzeit-Visualisierung virtueller Stadtmodelle bildet daher eine Grundlage für die interaktive Exploration, Präsentation, Analyse und Bearbeitung raumbezogener Daten. Diese Arbeit befasst sich mit der Entwicklung und Implementierung von Konzepten und Techniken für die Echtzeit-Visualisierung virtueller 3D-Stadtmodelle. Diese umfassen sowohl Rendering-Algorithmen als auch dazu komplementäre Modellierungskonzepte und Interaktionstechniken. Insbesondere wird in dieser Arbeit eine neue Echtzeit-Rendering-Technik für Stadtmodelle hoher Komplexität hinsichtlich Texturgröße und Texturanzahl vorgestellt. Solche Modelle sind durch die derzeit zur Verfügung stehende Technologie schwierig zu bewältigen, vor allem aus zwei Gründen: - Begrenzter Textur-Speicher: Die Menge an gleichzeitig nutzbaren Texturdaten ist beschränkt durch den Speicher der Grafik-Hardware. - Begrenzte Textur-Anzahl: Die gleichzeitige Verwendung mehrerer tausend Texturen verursacht erhebliche Performance-Probleme aufgrund von Textur-Umschaltungs-Operationen während des Renderings. Das in dieser Arbeit vorgestellte Verfahren, das Rendering mit Multiresolutions-Texturatlanten löst beide Probleme. Während der Darstellung wird dazu permanent eine kleine Textur-Menge verwaltet, die für die aktuelle Sichtperspektive und die zur Verfügung stehende Bildschirmauflösung hinreichend ist. Die Effizienz des Verfahrens wird in Performance-Tests untersucht. Die Ergebnisse zeigen, dass die folgenden Ziele erreicht werden: - Echtzeit-Darstellung wird für Modelle möglich, deren Texturdaten-Menge die Kapazität des Hauptspeichers übersteigt. 
- Der Overhead durch Textur-Umschaltungs-Operationen wird permanent niedrig gehalten, so dass die Anzahl der unterschiedlichen Texturen keinen wesentlichen Einfluss auf die Bildrate der Darstellung hat. Die Arbeit stellt außerdem zwei neue Ansätze zur 3D-Stadtmodell-Visualisierung vor, in denen Texturen als zentrale Visualisierungselemente eingesetzt werden: - Ein Verfahren zur Visualisierung thematischer Informationen. - Ein Verfahren zur illustrativen Visualisierung von 3D-Stadtmodellen. Beide Ansätze zeigen, dass Rendering mit Multiresolutions-Texturatlanten eine Grundlage für die Entwicklung neuer Anwendungen und Systeme im Bereich der 3D-Stadtmodell-Visualisierung bietet.
KW - Computergrafik
KW - Geovisualisierung
KW - 3D-Stadtmodelle
KW - Texturen
KW - computer graphics
KW - geovisualization
KW - 3d city models
KW - textures
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-13337
ER -
TY - JOUR
A1 - Friedrich, Sven
A1 - Krahmer, Sebastian
A1 - Schneidenbach, Lars
A1 - Schnor, Bettina
T1 - Loaded: Server Load Balancing for IPv6
N2 - With the next generation Internet protocol IPv6 on the horizon, it is time to think about how applications can migrate to IPv6. Web traffic is currently one of the most important applications in the Internet. The increasing popularity of dynamically generated content on the World Wide Web has created the need for fast web servers. Server clustering together with server load balancing has emerged as a promising technique to build scalable web servers. The paper gives a short overview of the new features of IPv6 and different server load balancing technologies. Further, we present and evaluate Loaded, a user-space server load balancer for IPv4 and IPv6 based on Linux.
Y1 - 2006
SN - 0-7695-2622-5
ER -
TY - JOUR
A1 - Hallama, Nicole
A1 - Luckow, André
A1 - Schnor, Bettina
T1 - Grid Security for Fault Tolerant Grid Applications
Y1 - 2006
SN - 978-1-880843-60-4
ER -
TY - JOUR
A1 - Luckow, André
A1 - Schnor, Bettina
T1 - Migol : a Fault Tolerant Service Framework for Grid Computing : Evolution to WSRF (2006)
Y1 - 2006
ER -
TY - JOUR
A1 - Harmeling, Stefan
A1 - Dornhege, Guido
A1 - Tax, David
A1 - Meinecke, Frank C.
A1 - Müller, Klaus-Robert
T1 - From outliers to prototypes : Ordering data
N2 - We propose simple and fast methods based on nearest neighbors that order objects from high-dimensional data sets from typical points to untypical points. On the one hand, we show that these easy-to-compute orderings allow us to detect outliers (i.e. very untypical points) with a performance comparable to or better than other often much more sophisticated methods. On the other hand, we show how to use these orderings to detect prototypes (very typical points) which facilitate exploratory data analysis algorithms such as noisy nonlinear dimensionality reduction and clustering. Comprehensive experiments demonstrate the validity of our approach.
Y1 - 2006
UR - http://www.sciencedirect.com/science/journal/09252312
U6 - https://doi.org/10.1016/j.neucom.2005.05.015
SN - 0925-2312
ER -
TY - JOUR
A1 - Blankertz, Benjamin
A1 - Müller, Klaus-Robert
A1 - Krusienski, Dean
A1 - Schalk, Gerwin
A1 - Wolpaw, Jonathan R.
A1 - Schlögl, Alois
A1 - Pfurtscheller, Gert
A1 - Millan, José del R.
A1 - Schröder, Michael
A1 - Birbaumer, Niels
T1 - The BCI competition III : validating alternative approaches to actual BCI problems
N2 - A brain-computer interface (BCI) is a system that allows its users to control external devices with brain activity. Although the proof-of-concept was given decades ago, the reliable translation of user intent into device control commands is still a major challenge. Success requires the effective interaction of two adaptive controllers: the user's brain, which produces brain activity that encodes intent, and the BCI system, which translates that activity into device control commands. In order to facilitate this interaction, many laboratories are exploring a variety of signal analysis techniques to improve the adaptation of the BCI system to the user. In the literature, many machine learning and pattern classification algorithms have been reported to give impressive results when applied to BCI data in offline analyses. However, it is more difficult to evaluate their relative value for actual online use. BCI data competitions have been organized to provide objective formal evaluations of alternative methods. Prompted by the great interest in the first two BCI Competitions, we organized the third BCI Competition to address several of the most difficult and important analysis problems in BCI research. The paper describes the data sets that were provided to the competitors and gives an overview of the results.
Y1 - 2006
UR - http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7333
U6 - https://doi.org/10.1109/Tnsre.2006.875642
SN - 1534-4320
ER -
TY - JOUR
A1 - Jürgensen, Helmut
A1 - Konstantinidis, Stavros
T1 - (Near-)inverses of sequences
N2 - We introduce the notion of a near-inverse of a non-decreasing sequence of positive integers; near-inverses are intended to assume the role of inverses in cases when the latter cannot exist. We prove that the near-inverse of such a sequence is unique; moreover, the relation of being near-inverses of each other is symmetric, i.e. if sequence g is the near-inverse of sequence f, then f is the near-inverse of g. There is a connection, by approximations, between near-inverses of sequences and inverses of continuous strictly increasing real-valued functions, which can be exploited to derive simple expressions for near-inverses.
Y1 - 2006
UR - http://www.informaworld.com/openurl?genre=journal&issn=0020-7160
U6 - https://doi.org/10.1080/00207160500537801
SN - 0020-7160
ER -
TY - JOUR
A1 - Dornhege, Guido
A1 - Blankertz, Benjamin
A1 - Krauledat, Matthias
A1 - Losch, Florian
A1 - Curio, Gabriel
A1 - Müller, Klaus-Robert
T1 - Combined optimization of spatial and temporal filters for improving brain-computer interfacing
JF - IEEE transactions on biomedical engineering
N2 - Brain-computer interface (BCI) systems create a novel communication channel from the brain to an output device by bypassing conventional motor output pathways of nerves and muscles. Therefore, they could provide a new communication and control option for paralyzed patients. Modern BCI technology is essentially based on techniques for the classification of single-trial brain signals. Here we present a novel technique that allows the simultaneous optimization of a spatial and a spectral filter, enhancing the discriminability of multichannel EEG single-trials. The evaluation of 60 experiments involving 22 different subjects demonstrates the significant superiority of the proposed algorithm over its classical counterpart: the median classification error rate was decreased by 11%. Apart from the enhanced classification, the spatial and/or the spectral filter that are determined by the algorithm can also be used for further analysis of the data, e.g., for source localization of the respective brain rhythms.
KW - brain-computer interface
KW - common spatial patterns
KW - EEG
KW - event-related desynchronization
KW - single-trial-analysis
Y1 - 2006
U6 - https://doi.org/10.1109/TBME.2006.883649
SN - 0018-9294
VL - 53
IS - 11
SP - 2274
EP - 2281
PB - IEEE
CY - New York
ER -
TY - JOUR
A1 - Bordihn, Henning
A1 - Holzer, Markus
T1 - Programmed grammars and their relation to the LBA problem
JF - Acta informatica
N2 - We consider generating and accepting programmed grammars with bounded degree of non-regulation, that is, the maximum number of elements in success or in failure fields of the underlying grammar. In particular, it is shown that this measure can be restricted to two without loss of descriptional capacity, regardless of whether arbitrary derivations or leftmost derivations are considered. Moreover, in some cases, precise characterizations of the linear bounded automaton problem in terms of programmed grammars are obtained. Thus, the results presented in this paper shed new light on a longstanding open problem in the theory of computational complexity.
KW - programmed grammars
KW - accepting grammars
KW - LBA problem
KW - degree of non-regulation
KW - leftmost derivations
Y1 - 2006
U6 - https://doi.org/10.1007/s00236-006-0017-9
SN - 0001-5903
VL - 43
SP - 223
EP - 242
PB - Springer
CY - Berlin
ER -
TY - THES
A1 - Hu, Ji
T1 - A virtual machine architecture for IT-security laboratories
T1 - Eine virtuelle maschinenbasierte Architektur für IT-Sicherheitslabore
N2 - This thesis discusses challenges in IT security education, points out a gap between e-learning and practical education, and presents a work to fill the gap. E-learning is a flexible and personalized alternative to traditional education. Nonetheless, existing e-learning systems for IT security education have difficulties in delivering hands-on experience because of the lack of physical proximity. Laboratory environments and practical exercises are indispensable instruction tools for IT security education, but security education in conventional computer laboratories poses particular problems, such as immobility and high creation and maintenance costs. Hence, there is a need to effectively transform security laboratories and practical exercises into e-learning forms. In this thesis, we introduce the Tele-Lab IT-Security architecture that allows students not only to learn IT security principles, but also to gain hands-on security experience through exercises in an online laboratory environment. In this architecture, virtual machines are used instead of real computers to provide safe user work environments. Thus, traditional laboratory environments can be cloned onto the Internet by software, which increases accessibility to laboratory resources and greatly reduces investment and maintenance costs. Under the Tele-Lab IT-Security framework, a set of technical solutions is also proposed to provide effective functionality, reliability, security, and performance. Virtual machines with appropriate resource allocation, software installation, and system configurations are used to build lightweight security laboratories on a hosting computer. Reliability and availability of laboratory platforms are covered by a virtual machine management framework. This management framework provides the necessary monitoring and administration services to detect and recover from critical failures of virtual machines at run time.
Considering the risk that virtual machines can be misused to compromise production networks, we present a security management solution that prevents the misuse of laboratory resources through security isolation at the system and network levels. This work is an attempt to bridge the gap between e-learning/tele-teaching and practical IT security education. It is not meant to substitute for conventional teaching in laboratories, but to add practical features to e-learning. This thesis demonstrates the possibility of implementing hands-on security laboratories on the Internet reliably, securely, and economically.
N2 - Diese Dissertation beschreibt die Herausforderungen in der IT Sicherheitsausbildung und weist auf die noch vorhandene Lücke zwischen E-Learning und praktischer Ausbildung hin. Sie erklärt einen Ansatz sowie ein System, um diese Lücke zwischen Theorie und Praxis in der elektronischen Ausbildung zu schließen. E-Learning ist eine flexible und personalisierte Alternative zu traditionellen Lernmethoden. Heutigen E-Learning Systemen mangelt es jedoch an der Fähigkeit, praktische Erfahrungen über große Distanzen zu ermöglichen. Labor- bzw. Testumgebungen sowie praktische Übungen sind jedoch unverzichtbar, wenn es um die Ausbildung von Sicherheitsfachkräften geht. Konventionelle Laborumgebungen besitzen allerdings einige Nachteile wie bspw. hoher Erstellungsaufwand, keine Mobilität, hohe Wartungskosten, etc. Die Herausforderung heutiger IT Sicherheitsausbildung ist es daher, praktische Sicherheitslaborumgebungen und Übungen effektiv mittels E-Learning zu unterstützen. In dieser Dissertation wird die Architektur von Tele-Lab IT-Security vorgestellt, die Studenten nicht nur erlaubt theoretische Sicherheitskonzepte zu erlernen, sondern darüber hinaus Sicherheitsübungen in einer Online-Laborumgebung praktisch zu absolvieren. Die Teilnehmer können auf diese Weise wichtige praktische Erfahrungen im Umgang mit Sicherheitsprogrammen sammeln. Zur Realisierung einer sicheren Übungsumgebung, werden virtuelle Maschinen anstatt reale Rechner im Tele-Lab System verwendet. Mittels virtueller Maschinen können leicht Laborumgebungen geklont, verwaltet und über das Internet zugänglich gemacht werden. Im Vergleich zu herkömmlichen Offline-Laboren können somit erhebliche Investitions- und Wartungskosten gespart werden. Das Tele-Lab System bietet eine Reihe von technischen Funktionen, die den effektiven, zuverlässigen und sicheren Betrieb dieses Trainingssystems gewährleistet. 
Unter Beachtung angemessener Ressourcennutzung, Softwareinstallationen und Systemkonfigurationen wurden virtuelle Maschinen als Übungsstationen erstellt, die auf einem einzelnen Rechner betrieben werden. Für ihre Zuverlässigkeit und Verfügbarkeit ist das Managementsystem der virtuellen Maschinen verantwortlich. Diese Komponente besitzt die notwendigen Überwachungs- und Verwaltungsfunktionen, um kritische Fehler der virtuellen Maschinen während der Laufzeit zu erkennen und zu beheben. Damit die Übungsstationen nicht bspw. zur Kompromittierung von Produktivnetzwerken genutzt werden, beschreibt die Dissertation Sicherheits-Managementlösungen, die mittels Isolation auf System und Netzwerk Ebene genau dieses Risiko verhindern sollen. Diese Arbeit ist der Versuch, die Lücke zwischen E-Learning/Tele-Teaching und praktischer Sicherheitsausbildung zu schließen. Sie verfolgt nicht das Ziel, konventionelle Ausbildung in Offline Laboren zu ersetzen, sondern auch praktische Erfahrungen via E-Learning zu unterstützen. Die Dissertation zeigt die Möglichkeit, praktische Erfahrungen mittels Sicherheitsübungsumgebungen über das Internet auf zuverlässige, sichere und wirtschaftliche Weise zu vermitteln.
KW - Computersicherheit
KW - VM
KW - E-Learning
KW - IT security
KW - virtual machine
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-7818
ER -
TY - THES
A1 - Huang, Wanjun
T1 - Temporary binding for dynamic middleware construction and web services composition
T1 - Temporäre Anbindung für dynamischen Middlewareaufbau und Web Services Integration
N2 - With an increasing number of applications in Internet and mobile environments, distributed software systems are demanded to be more powerful and flexible, especially in terms of dynamism and security. This dissertation describes my work concerning three aspects: dynamic reconfiguration of component software, security control on middleware applications, and web services dynamic composition. Firstly, I proposed a technology named Routing Based Workflow (RBW) to model the execution and management of collaborative components and realize temporary binding for component instances. Temporary binding means that component instances are temporarily loaded into a created execution environment to execute their functions and are then released to their repository after execution. Temporary binding allows the creation of an idle execution environment for all collaborative components, on which change operations can be carried out immediately. Changes to the execution environment result in a new collaboration of all involved components and also greatly simplify the classical issues arising from dynamic changes, such as consistency preservation. To demonstrate the feasibility of RBW, I created a dynamic secure middleware system - the Smart Data Server Version 3.0 (SDS3). In SDS3, an open source implementation of CORBA is adopted and modified as the communication infrastructure, and three secure components, managed by RBW, are created to enhance the security of access to deployed applications. SDS3 offers multi-level security control on its applications, from strategy control to application-specific detail control. Through the management by RBW, the strategy control of SDS3 applications can be dynamically changed by reorganizing the collaboration of the three secure components. In addition, I created the Dynamic Services Composer (DSC) based on the Apache open source projects Apache Axis and WSIF.
In DSC, RBW is employed to model the interaction and collaboration of web services and to enable dynamic changes to the flow structure of web services. Finally, overall performance tests were made to evaluate the efficiency of the developed RBW and SDS3. The results demonstrate that temporary binding of component instances has only a slight impact on the execution efficiency of components, and that the blackout time arising from dynamic changes can be greatly reduced in applications.
N2 - Heutige Softwareanwendungen für das Internet und den mobilen Einsatz erfordern bezüglich Funktionalität und Sicherheit immer leistungsstärkere verteilte Softwaresysteme. Diese Dissertation befasst sich mit der dynamischen Rekonfiguration von Komponentensoftware, Sicherheitskontrolle von Middlewareanwendungen und der dynamischen Komposition von Web Services. Zuerst wird eine Routing Based Workflow (RBW) Technologie vorgestellt, welche die Ausführung und das Management von kollaborierenden Komponenten modelliert sowie für die Realisierung einer temporären Anbindung von Komponenteninstanzen zuständig ist. D.h., Komponenteninstanzen werden zur Ausführung ihrer Funktionalität temporär in eine geschaffene Ausführungsumgebung geladen und nach Beendigung wieder freigegeben. Die temporäre Anbindung erlaubt das Erstellen einer Ausführungsumgebung, in der Rekonfigurationen unmittelbar vollzogen werden können. Änderungen der Ausführungsumgebung haben neue Kollaborations-Beziehungen der Komponenten zur Folge und vereinfachen stark die Schwierigkeiten, wie z.B. Konsistenzerhaltung, die mit dynamischen Änderungen verbunden sind. Um die Durchführbarkeit von RBW zu demonstrieren, wurde ein dynamisches, sicheres Middleware-System erstellt - der Smart Data Server, Version 3 (SDS3). Bei SDS3 kommt eine Open-Source-Softwareimplementierung von CORBA zum Einsatz, die modifiziert als Kommunikationsinfrastruktur genutzt wird. Zudem wurden drei Sicherheitskomponenten erstellt, die von RBW verwaltet werden und die Sicherheit beim Zugriff auf die eingesetzten Anwendungen erhöhen. SDS3 bietet den Anwendungen Sicherheitskontrollfunktionen auf verschiedenen Ebenen, angefangen von einer Strategiekontrolle bis zu anwendungsspezifischen Kontrollfunktionen. Mittels RBW kann die Strategiekontrolle des SDS3 dynamisch durch Reorganisation von Kollaborations-Beziehungen zwischen den Sicherheitskomponenten angepasst werden.
Neben diesem System wurde der Dynamic Service Composer (DSC) implementiert, welcher auf den Apache-Open-Source-Projekten Apache Axis und WSIF basiert. Im DSC wird RBW eingesetzt, um die Interaktion und Zusammenarbeit von Web Services zu modellieren sowie dynamische Änderungen der Flussstruktur von Web Services zu ermöglichen. Nach der Implementierung wurden Performance-Tests bezüglich RBW und SDS3 durchgeführt. Die Ergebnisse der Tests zeigen, dass eine temporäre Anbindung von Komponenteninstanzen nur einen geringen Einfluss auf die Ausführungseffizienz von Komponenten hat. Außerdem bestätigen die Testergebnisse, dass die mit der dynamischen Rekonfiguration verbundene Ausfallzeit extrem niedrig ist.
KW - Middleware
KW - Web Services
KW - Temporäre Anbindung
KW - Dynamische Rekonfiguration
KW - temporary binding
KW - dynamic reconfiguration
Y1 - 2006
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-7672
ER -
TY - JOUR
A1 - Meinel, Christoph
A1 - Klotz, Volker
T1 - The first 10 years of the ECCC digital library
Y1 - 2006
UR - http://portal.acm.org/cacm
U6 - https://doi.org/10.1145/1107458.1107484
ER -
TY - JOUR
A1 - Meinel, Christoph
A1 - Wang, Long
T1 - Building content clusters based on modelling page pairs
N2 - We give a new view on building content clusters from page pair models. We measure the heuristic importance between every two pages by computing the distance of their accessed positions in usage sessions. We also compare our page pair models with the classical pair models used in information theory and natural language processing, and give different evaluation methods to build reasonable content communities. Finally, we interpret the advantages and disadvantages of our models from detailed experimental results.
Y1 - 2006
UR - http://www.springerlink.com/content/105633/
U6 - https://doi.org/10.1007/11610113_85
ER -
TY - JOUR
A1 - Ocheretnij, Vitalij
A1 - Gössel, Michael
A1 - Sogomonyan, Egor S.
A1 - Marienfeld, Daniel
T1 - Modulo p=3 checking for a carry select adder
N2 - In this paper a self-checking carry select adder is proposed. The duplicated adder blocks, which are inherent to a carry select adder without error detection, are checked modulo 3. Compared to a carry select adder without error detection, the delay of the MSB of the sum of the proposed adder does not increase. Compared to a self-checking duplicated carry select adder, the area is reduced by 20%. No restrictions are imposed on the design of the adder blocks.
Y1 - 2006
UR - http://www.springerlink.com/content/100286
U6 - https://doi.org/10.1007/s10836-006-6260-8
ER -