After almost two decades of development, modern Security Information and Event Management (SIEM) systems still face issues with the normalisation of heterogeneous data sources, high numbers of false-positive alerts, and long analysis times, especially in large-scale networks with high volumes of security events. In this paper, we present our own SIEM system prototype, which is capable of dealing with these issues. For efficient data processing, our system employs in-memory data storage (SAP HANA) and technologies from our previous work, such as the Object Log Format (OLF) and high-speed event normalisation. We analyse normalised data using a combination of three approaches to security analysis: misuse detection, query-based analytics, and anomaly detection. Compared to our previous work, we have significantly improved our unsupervised anomaly detection algorithms. Most importantly, we have developed a novel hybrid outlier detection algorithm that returns ranked clusters of anomalies. It lets an operator of a SIEM system concentrate on the several top-ranked anomalies, instead of digging through an unsorted bundle of suspicious events. We propose using anomaly detection in combination with signatures and queries applied to the same data, rather than as a full replacement for misuse detection. In this case, the majority of attacks are captured by misuse detection, whereas anomaly detection highlights previously unknown behaviour or attacks. We also propose that only the most suspicious event clusters need to be checked by an operator, whereas other anomalies, including false-positive alerts, do not need to be explicitly checked if they have a lower ranking. We have validated our concepts and algorithms on a dataset of 160 million events from a network segment of a large multinational company, and suggest that our approach and methods are highly relevant for modern SIEM systems.
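The ranked-cluster idea described in this abstract can be illustrated with a minimal sketch: score events by their deviation from the bulk, group the outliers, and sort the groups by severity. This is hypothetical illustrative code (z-scores on one feature, single-linkage grouping), not the paper's hybrid algorithm.

```python
from statistics import mean, stdev

def ranked_anomaly_clusters(events, z_threshold=2.0, link_distance=1.0):
    """events: 1-D feature values; returns outlier clusters, most suspicious first."""
    mu, sigma = mean(events), stdev(events)
    outliers = sorted(e for e in events if abs(e - mu) / sigma > z_threshold)
    if not outliers:
        return []

    # Single-linkage grouping: neighbouring outliers within link_distance
    # are assigned to the same cluster.
    clusters, current = [], [outliers[0]]
    for e in outliers[1:]:
        if e - current[-1] <= link_distance:
            current.append(e)
        else:
            clusters.append(current)
            current = [e]
    clusters.append(current)

    # Rank clusters by mean |z-score| so the top-ranked ones are checked first.
    return sorted(clusters, key=lambda c: -mean(abs(e - mu) / sigma for e in c))
```

An operator would then inspect only the first few returned clusters, which is the workflow the abstract argues for.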
Management of agricultural soil quality requires fast and cost-efficient methods to identify multiple stressors that can affect soil organisms and associated ecological processes. Here, we propose to use soil protists, which have a great yet poorly explored potential for bioindication. They are ubiquitous, highly diverse, and respond to the various stresses on agricultural soils caused by frequent management interventions or environmental changes. We test an approach that combines metabarcoding data and machine learning algorithms to identify potential stressors of soil protist community composition and diversity. We measured 17 key variables that reflect various potential stresses on soil protists across 132 plots in 28 Swiss vineyards over 2 years. We identified the taxa showing strong responses to the selected soil variables (potential bioindicator taxa) and tested their predictive power. Changes in protist taxa occurrence and, to a lesser extent, diversity metrics exhibited great predictive power for the considered soil variables. Soil copper concentration, moisture, pH, and basal respiration were the best-predicted soil variables, suggesting that protists are particularly responsive to stresses caused by these variables. The most responsive taxa were found within the clades Rhizaria and Alveolata. Our results also reveal that a majority of the potential bioindicators identified in this study can be used across years, in different regions, and across different grape varieties. Altogether, soil protist metabarcoding data combined with machine learning can help identify specific abiotic stresses on microbial communities caused by agricultural management. Such an approach provides complementary information to existing soil monitoring tools and can help manage the impact of agricultural practices on soil biodiversity and quality.
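The bioindicator-screening idea can be sketched as follows: rank taxa by how strongly their occurrence across plots tracks a measured soil variable, and keep the strongest correlates as candidates. This is a hypothetical simplification (Pearson correlation on presence/absence), not the study's metabarcoding-plus-machine-learning pipeline; all taxon names below are invented.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def candidate_bioindicators(occurrence_by_taxon, soil_variable, top_k=2):
    """occurrence_by_taxon: {taxon: [0/1 per plot]}; returns strongest |r| taxa."""
    scores = {t: abs(pearson(occ, soil_variable))
              for t, occ in occurrence_by_taxon.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```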
We present a new model of the geomagnetic field, called Kalmag, spanning the last 20 years. Derived from the assimilation of CHAMP and Swarm vector field measurements, it separates the different contributions to the observable field through parameterized prior covariance matrices. To make the inverse problem numerically feasible, it has been sequentialized in time through the combination of a Kalman filter and a smoothing algorithm. The model provides reliable estimates of past, present and future mean fields and associated uncertainties. The version presented here is an update of our IGRF candidates; the amount of assimilated data has been doubled and the considered time window has been extended from [2000.5, 2019.74] to [2000.5, 2020.33].
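The filter-plus-smoother sequentialization mentioned in the abstract can be sketched in one dimension: a forward Kalman filter over the measurements followed by a backward Rauch-Tung-Striebel pass, so every estimate benefits from the whole time window. This is a generic textbook sketch for a random-walk state with made-up noise levels, not the Kalmag implementation.

```python
def kalman_smooth(measurements, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """1-D Kalman filter + RTS smoother for x_t = x_{t-1} + noise."""
    xs_f, ps_f, xs_p, ps_p = [], [], [], []   # filtered and predicted moments
    x, p = x0, p0
    for z in measurements:
        xp, pp = x, p + q                      # predict (random-walk model)
        k = pp / (pp + r)                      # Kalman gain
        x, p = xp + k * (z - xp), (1 - k) * pp # update with measurement z
        xs_p.append(xp); ps_p.append(pp); xs_f.append(x); ps_f.append(p)

    # Backward Rauch-Tung-Striebel smoother.
    xs_s = xs_f[:]
    for t in range(len(measurements) - 2, -1, -1):
        g = ps_f[t] / ps_p[t + 1]
        xs_s[t] = xs_f[t] + g * (xs_s[t + 1] - xs_p[t + 1])
    return xs_s
```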
The Kp index is a measure of midlatitude global geomagnetic activity and represents short-term magnetic variations driven by the solar wind plasma and interplanetary magnetic field. The Kp index is one of the most widely used indicators for space weather alerts and serves as input to various models, such as those of the thermosphere and the radiation belts. It is therefore crucial to predict the Kp index accurately. Previous work in this area has mostly employed artificial neural networks to nowcast Kp, basing their inferences on the recent history of Kp and on solar wind measurements at L1. In this study, we systematically test how different machine learning techniques perform on the task of nowcasting and forecasting Kp for prediction horizons of up to 12 hr. Additionally, we investigate different methods from machine learning and information theory for selecting the optimal inputs to a predictive model. We illustrate how these methods can be applied to select the most important inputs to a predictive model of Kp and to significantly reduce input dimensionality. We compare our best performing models, based on a reduced set of optimal inputs, with existing models of Kp using different test intervals, and show how this selection can affect model performance.
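The information-theoretic input selection mentioned here can be illustrated with a minimal mutual-information ranking: bin each candidate driver and the target into histograms, estimate their mutual information, and keep the highest-scoring inputs. The estimator and the synthetic variable names are illustrative stand-ins, not the study's method.

```python
import math, random

def mutual_information(x, y, bins=8):
    """Histogram-based MI estimate (in bits) between two equal-length series."""
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((s - lo) / w), bins - 1) for s in v]
    bx, by = digitize(x), digitize(y)
    n = len(x)
    pxy, px, py = {}, {}, {}
    for a, b in zip(bx, by):
        pxy[(a, b)] = pxy.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    return sum(c / n * math.log2((c / n) / (px[a] / n * py[b] / n))
               for (a, b), c in pxy.items())

def select_inputs(candidates, target, k=2):
    """Return the names of the k candidate inputs with the highest MI."""
    scores = {name: mutual_information(series, target)
              for name, series in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

In practice such a ranking is one way to cut input dimensionality before training the predictive model.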
The intensity of cosmic radiation may vary over five orders of magnitude within a few hours or days during Solar Particle Events (SPEs), increasing the probability of Single Event Upsets (SEUs) in space-borne electronic systems by several orders of magnitude. It is therefore vital to enable early detection of SEU rate changes in order to ensure timely activation of dynamic radiation hardening measures. In this paper, an embedded approach for the prediction of SPEs and of the SRAM SEU rate is presented. The proposed solution combines a real-time SRAM-based SEU monitor, an offline-trained machine learning model, and an online learning algorithm for the prediction. With respect to the state of the art, our solution brings the following benefits: (1) use of existing on-chip data storage SRAM as a particle detector, thus minimizing the hardware and power overhead; (2) prediction of the SRAM SEU rate one hour in advance, with fine-grained hourly tracking of SEU variations during SPEs as well as under normal conditions; (3) online optimization of the prediction model to enhance prediction accuracy at run time; (4) negligible hardware-accelerator design cost for the implementation of the selected machine learning model and online learning algorithm. The proposed design is intended for a highly dependable and self-adaptive multiprocessing system employed in space applications, allowing radiation mitigation mechanisms to be triggered before the onset of high radiation levels.
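The "offline-trained model plus online learning" combination can be sketched with a linear predictor of the next hour's SEU rate whose weights start from an offline fit and are refined by one stochastic-gradient step per observed hour. This is a hypothetical least-mean-squares sketch, not the paper's accelerator design; the weights and data below are synthetic.

```python
class OnlineSEUPredictor:
    def __init__(self, weights, lr=0.05):
        self.w = list(weights)   # e.g. initialized from an offline-trained model
        self.lr = lr

    def predict(self, recent_rates):
        """Linear prediction of the next hour's SEU rate from recent rates."""
        return sum(w * x for w, x in zip(self.w, recent_rates))

    def update(self, recent_rates, observed_rate):
        """One stochastic-gradient step on squared error, run online each hour."""
        err = self.predict(recent_rates) - observed_rate
        self.w = [w - self.lr * err * x for w, x in zip(self.w, recent_rates)]
        return abs(err)
```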
Email tracking allows email senders to collect fine-grained behavior and location data on email recipients, who are uniquely identifiable via their email address. Such tracking invades user privacy in that email tracking techniques gather data without user consent or awareness. Striving to increase privacy in email communication, this paper develops a detection engine intended as the core of a selective tracking-blocking mechanism, making three contributions. First, a large collection of email newsletters is analyzed to show the wide usage of tracking across countries, industries, and time. Second, we propose a set of features geared towards identifying tracking images under real-world conditions. Novel features are devised to be computationally feasible and efficient, generalizable, and resilient to changes in tracking infrastructure. Third, we test the predictive power of these features in a benchmarking experiment using a selection of state-of-the-art classifiers to clarify the effectiveness of model-based tracking identification. We evaluate the expected accuracy of the approach on out-of-sample data, over increasing periods of time, and when faced with unknown senders. (C) 2018 Elsevier B.V. All rights reserved.
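The flavour of such features can be sketched with a toy scorer: flag an embedded image as a likely tracker when cheap, infrastructure-independent signals agree, such as a 1x1 declared size, a high-entropy identifier in the URL, and a third-party host. The feature set and threshold here are hypothetical illustrations, not the paper's actual features or classifier.

```python
import math
from urllib.parse import urlparse, parse_qs

def url_token_entropy(url):
    """Shannon entropy (bits/char) of the longest query-string value."""
    values = [v for vs in parse_qs(urlparse(url).query).values() for v in vs]
    token = max(values, key=len, default="")
    if not token:
        return 0.0
    freqs = {c: token.count(c) / len(token) for c in set(token)}
    return -sum(p * math.log2(p) for p in freqs.values())

def looks_like_tracking_pixel(url, width, height, sender_domain):
    """Toy rule-based scorer over three cheap features."""
    score = 0
    score += (width <= 1 and height <= 1)            # invisible 1x1 image
    score += url_token_entropy(url) > 3.0            # opaque identifier in URL
    score += urlparse(url).netloc != sender_domain   # third-party host
    return score >= 2
```

A real detector would feed such features into the benchmarked classifiers rather than a fixed threshold.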
The plasmasphere is a dynamic region of cold, dense plasma surrounding the Earth. Its shape and size are highly susceptible to variations in solar and geomagnetic conditions. Having an accurate model of plasma density in the plasmasphere is important for GNSS navigation and for predicting hazardous effects of radiation in space on spacecraft. The distribution of cold plasma and its dynamic dependence on solar wind and geomagnetic conditions remain, however, poorly quantified. Existing empirical models of plasma density tend to be oversimplified as they are based on statistical averages over static parameters. Understanding the global dynamics of the plasmasphere using observations from space remains a challenge, as existing density measurements are sparse and limited to locations where satellites can provide in-situ observations. In this dissertation, we demonstrate how such sparse electron density measurements can be used to reconstruct the global electron density distribution in the plasmasphere and capture its dynamic dependence on solar wind and geomagnetic conditions.
First, we develop an automated algorithm to determine the electron density from in-situ measurements of the electric field on the Van Allen Probes spacecraft. In particular, we design a neural network to infer the upper hybrid resonance frequency from the dynamic spectrograms obtained with the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrumentation suite, which is then used to calculate the electron number density. The developed Neural-network-based Upper hybrid Resonance Determination (NURD) algorithm is applied to more than four years of EMFISIS measurements to produce the publicly available electron density data set.
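The final step of this chain, going from the inferred upper hybrid resonance frequency to density, relies on standard plasma relations rather than on NURD itself: f_pe^2 = f_uh^2 - f_ce^2, and n_e [cm^-3] is approximately (f_pe / 8980 Hz)^2. A minimal sketch:

```python
import math

def electron_density(f_uh_hz, f_ce_hz):
    """Electron density (cm^-3) from upper hybrid and electron cyclotron
    frequencies (Hz), via f_pe^2 = f_uh^2 - f_ce^2 and f_pe ~ 8980*sqrt(n_e)."""
    f_pe = math.sqrt(f_uh_hz ** 2 - f_ce_hz ** 2)   # plasma frequency in Hz
    return (f_pe / 8980.0) ** 2
```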
We utilize the obtained electron density data set to develop a new global model of plasma density by employing a neural network-based modeling approach. The model takes the location and the time history of geomagnetic indices as inputs, and produces the electron density in the equatorial plane as output. It is extensively validated using in-situ density measurements from the Van Allen Probes mission, and also by comparing the predicted global evolution of the plasmasphere with the global IMAGE EUV images of He+ distribution. The model successfully reproduces erosion of the plasmasphere on the night side as well as plume formation and evolution, and agrees well with data.
The performance of neural networks strongly depends on the availability of training data, which is limited during intervals of high geomagnetic activity. In order to provide reliable density predictions during such intervals, we can employ physics-based modeling. We develop a new approach for optimally combining the neural network- and physics-based models of the plasmasphere by means of data assimilation. The developed approach utilizes advantages of both neural network- and physics-based modeling and produces reliable global plasma density reconstructions for quiet, disturbed, and extreme geomagnetic conditions.
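At a single grid point, the core idea of such a combination can be sketched as an inverse-variance-weighted blend of the two estimates, as in a scalar analysis update. This is a generic data assimilation sketch, not the dissertation's full scheme.

```python
def blend_estimates(nn_value, nn_var, phys_value, phys_var):
    """Inverse-variance weighting of a neural-network estimate and a
    physics-based estimate; trusts the model whose error variance is lower."""
    w = phys_var / (nn_var + phys_var)
    return w * nn_value + (1 - w) * phys_value
```

During quiet times the neural network's variance is low and dominates; during extreme conditions, where training data are scarce, the physics-based term takes over.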
Finally, we extend the developed machine learning-based tools and apply them to another important problem in the field of space weather, the prediction of the geomagnetic index Kp. The Kp index is one of the most widely used indicators for space weather alerts and serves as input to various models, such as for the thermosphere, the radiation belts and the plasmasphere. It is therefore crucial to predict the Kp index accurately. Previous work in this area has mostly employed artificial neural networks to nowcast and make short-term predictions of Kp, basing their inferences on the recent history of Kp and solar wind measurements at L1. We analyze how the performance of neural networks compares to other machine learning algorithms for nowcasting and forecasting Kp for up to 12 hours ahead. Additionally, we investigate several machine learning and information theory methods for selecting the optimal inputs to a predictive model of Kp. The developed tools for feature selection can also be applied to other problems in space physics in order to reduce the input dimensionality and identify the most important drivers.
Research outlined in this dissertation clearly demonstrates that machine learning tools can be used to develop empirical models from sparse data and also can be used to understand the underlying physical processes. Combining machine learning, physics-based modeling and data assimilation allows us to develop novel methods benefiting from these different approaches.
Mixing layer manipulation experiment: from open-loop forcing to closed-loop machine learning control
(2015)
Provisioning a sufficient and stable source of food requires sound knowledge about current and upcoming threats to agricultural production. To that end, machine learning approaches were used to identify the prevailing climatic and soil-hydrological drivers of spatial and temporal yield variability of four crops, comprising 40 years of yield data each from 351 counties in Germany. Effects of progress in agricultural management and breeding were subtracted from the data prior to the machine learning modelling by fitting smooth non-linear trends to the 95th percentiles of observed yield data. An extensive feature selection approach was then followed to identify the most relevant predictors out of a large set of candidates, comprising various soil and meteorological data. Particular emphasis was placed on studying the uniqueness of the identified key predictors. Random Forest and Support Vector Machine models yielded similar, although not identical, results, capturing between 50% and 70% of the spatial and temporal variance of silage maize, winter barley, winter rapeseed and winter wheat yield. Equally good performance could be achieved with different sets of predictors. Thus, identification of the most reliable models could not be based on the outcome of the model study alone but required expert judgement. Relationships between drivers and response often exhibited optimum curves, especially for summer air temperature and precipitation. In contrast, soil moisture proved clearly less relevant than the meteorological drivers. In view of the expected climate change, both excess precipitation and the excess heat effect deserve more attention in breeding as well as in crop modelling.
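The detrending step can be sketched as follows: compute each year's 95th-percentile yield across counties, fit a trend to those percentiles, and express all yields relative to the trend so that progress in breeding and management is removed before modelling. For simplicity this sketch fits a straight line, where the study used smooth non-linear trends; the data in the usage example are synthetic.

```python
def percentile(values, q):
    """Linear-interpolation percentile of a list of values (q in 0..100)."""
    s = sorted(values)
    idx = q / 100 * (len(s) - 1)
    lo = int(idx)
    return s[lo] + (idx - lo) * (s[min(lo + 1, len(s) - 1)] - s[lo])

def detrend_yields(yields_by_year):
    """yields_by_year: {year: [county yields]}; returns {year: [relative yields]}."""
    years = sorted(yields_by_year)
    p95 = [percentile(yields_by_year[y], 95) for y in years]
    # Ordinary least squares fit of p95 ~ a + b * year (linear stand-in for
    # the smooth non-linear trends used in the study).
    n = len(years)
    mx, my = sum(years) / n, sum(p95) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(years, p95))
         / sum((x - mx) ** 2 for x in years))
    a = my - b * mx
    return {y: [v / (a + b * y) for v in yields_by_year[y]] for y in years}
```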
Machine learning for improvement of thermal conditions inside a hybrid ventilated animal building
(2021)
In buildings with hybrid ventilation, natural ventilation opening positions (windows), mechanical ventilation rates, heating, and cooling are manipulated to maintain desired thermal conditions. When the external conditions are favorable, the indoor temperature is regulated solely by ventilation (natural and mechanical) to save external heating and cooling energy. The ventilation parameters are determined by a rule-based control scheme, which is not optimal. This study proposes a methodology to enable real-time optimal control of the ventilation parameters. We developed offline prediction models to estimate future thermal conditions from data collected from the building in operation. The developed offline model is then used to find the optimal controllable ventilation parameters in real time to minimize the setpoint deviation in the building. With the proposed methodology, the experimental building's setpoint deviation improved for 87% of the time, by an average of 0.53 degrees C, compared to the current deviations.
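The control step described here can be sketched as a real-time search over admissible ventilation settings, picking the one whose predicted indoor temperature deviates least from the setpoint. The predictor below is a stand-in toy surrogate (ventilation mixes indoor air toward the outdoor temperature), not the study's trained offline model, and the candidate grid is illustrative.

```python
def predict_indoor_temp(current_temp, outdoor_temp, window_frac, fan_rate):
    """Toy surrogate for the offline model: ventilation pulls the indoor
    temperature toward the outdoor temperature with a mixing coefficient."""
    mixing = 0.1 + 0.5 * window_frac + 0.3 * fan_rate
    return current_temp + mixing * (outdoor_temp - current_temp)

def best_ventilation(current_temp, outdoor_temp, setpoint):
    """Grid-search the (window opening, fan rate) pair minimizing the
    predicted deviation from the setpoint."""
    candidates = [(w / 4, f / 4) for w in range(5) for f in range(5)]
    return min(candidates,
               key=lambda c: abs(predict_indoor_temp(current_temp, outdoor_temp,
                                                     *c) - setpoint))
```

In deployment this search would run each control interval, replacing the fixed rule-based scheme.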