1. Introduction
2. Analysis of the implementation of Basel III in China
2.1 Implementation of capital adequacy rules
2.2 Implementation of leverage ratio rules
2.3 Implementation of liquidity management rules
3. Suggestions for the further development of China’s banking industry
3.1 Promoting capital structure adjustment and broadening capital supplement channels
3.2 Transforming business models and developing intermediary and off-balance-sheet business
3.3 Increasing the intensity of risk management and refining its standards
The dynamics of external contributions to the geomagnetic field are investigated by applying time-frequency methods to magnetic observatory data. Fractal models and multiscale analysis make it possible to extract maximum quantitative information about the short-term dynamics of geomagnetic field activity. The stochastic properties of the horizontal component of the transient external field are determined by searching for scaling laws in the power spectra. The spectrum fits a power law with a scaling exponent β, a typical characteristic of self-affine time series. Local variations in the power-law exponent are investigated by applying wavelet analysis to the same time series. These analyses highlight the self-affine properties of geomagnetic perturbations and their persistence. Moreover, they show that the main phases of sudden storm disturbances are uniquely characterized by a scaling exponent varying between 1 and 3, possibly related to the energy contained in the external field. These findings suggest the existence of a long-range dependence, the scaling exponent being an efficient indicator of geomagnetic activity and singularity detection. They also show that, by using magnetogram regularity to reflect magnetosphere activity, a theoretical analysis of the external geomagnetic field based on local power-law exponents is possible.
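The scaling-law search described above can be sketched as follows: fit the slope of the power spectrum in log-log space to estimate β. This is a minimal illustration on a synthetic self-affine series, not the observatory data used in the study; the series construction and fitting details are assumptions.

```python
import numpy as np

def spectral_exponent(signal, dt=1.0):
    """Estimate the power-law exponent beta of a time series,
    assuming its power spectrum follows P(f) ~ f^(-beta)."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=dt)[1:]          # drop the zero frequency
    power = np.abs(np.fft.rfft(signal))[1:] ** 2  # periodogram
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    return -slope  # P ~ f^slope, so beta = -slope

# Synthetic self-affine series: shape white noise to an f^(-beta/2) amplitude
rng = np.random.default_rng(0)
beta_true = 2.0
n = 4096
white = np.fft.rfft(rng.standard_normal(n))
freqs = np.fft.rfftfreq(n)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-beta_true / 2)
series = np.fft.irfft(white * amp, n)

beta_est = spectral_exponent(series)  # should recover beta close to 2
```

A wavelet analysis would replace the single global fit with local slope estimates, which is how the abstract's local variations in β are obtained.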
As Albania accelerates its preparations for European Union candidate status, numerous areas of public policy and practice are undergoing intensive development. Regional development policy is a very new area of public policy in Albania and requires research and development. This study focuses on the process of sustainable development in Albania by analyzing and comparing the regional development of the regions of Tirana, Shkodra and Kukes. The methodology consists of a literature/desk review, an analytical and comparative approach, qualitative interviews, quantitative data collection, and analysis. The research is organized in five chapters. The first chapter provides an overview of the study framework. The second outlines the theoretical and scientific framework for sustainable and regional development in relation to geography. The third chapter presents the state of regional development in Albania, analyzing the disparities and regional development in the light of EU requirements and the NUTS division. Chapter 4 continues by analyzing and comparing the regional development of the regions: Tirana – driver for change, Shkodra – the North in development, and Kukes – the “shrinking” region. Chapter 5 presents the conclusions and recommendations. The research concludes that if growth in Albania is to be increased and sustained, a regional development policy needs to be established.
Background: The use of psychoactive substances to neuroenhance cognitive performance is prevalent. Neuroenhancement (NE) in everyday life and doping in sport might rest on similar attitudinal representations, and both behaviors can be theoretically modeled by comparable means-to-end relations (substance-performance). A behavioral (not substance-based) definition of NE is proposed, with assumed functionality as its core component. It is empirically tested whether different NE variants (lifestyle drug, prescription drug, and illicit substance) can be regressed on school stressors.
Findings: Participants were 519 students (25.8 ± 8.4 years old; 73.1% female). Logistic regressions indicate that a modified doping attitude scale can predict all three NE variants. Use of multiple NE substances was frequent. Overwhelming demands in school were associated with lifestyle and prescription drug NE.
Conclusions: Researchers should be sensitive to probable structural similarities between enhancement in everyday life and in sport, and should systematically explore where findings from one domain can be adapted for the other. Policy makers should be aware that students might misperceive NE as an acceptable means of coping with stress in school, and should help to raise societal awareness of NE among young people in general.
Background: Neuroenhancement (NE), the use of psychoactive substances in order to enhance a healthy individual's cognitive functioning from a proficient to an even higher level, is prevalent in student populations. According to the strength model of self-control, people fail to self-regulate and fall back on their dominant behavioral response when finite self-control resources are depleted. An experiment was conducted to test the hypothesis that ego-depletion will prevent students who are unfamiliar with NE from trying it.
Findings: 130 undergraduates, who denied having tried NE before (43% female, mean age = 22.76 ± 4.15 years), were randomly assigned to either an ego-depletion or a control condition. The dependent variable was taking an "energy-stick" (a legal nutritional supplement, containing low doses of caffeine, taurine and vitamin B), offered as a potential means of enhancing performance on the bogus concentration task that followed. Logistic regression analysis showed that ego-depleted participants were three times less likely to take the substance, OR = 0.37, p = .01.
Conclusion: This experiment found that trying NE for the first time was more likely if an individual's cognitive capacities were not depleted. This means that mental exhaustion is not predictive of NE in students for whom NE is not the dominant response. Trying NE for the first time is therefore more likely to occur as a thoughtful attempt at self-regulation than as an automatic behavioral response in stressful situations. We therefore recommend targeting interventions at this inter-individual difference. Students without previous reinforcing NE experience should be provided with information about the possible negative health outcomes of NE. Reconfiguring structural aspects of the academic environment (e.g. lessening workloads) might help to deter current users.
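An odds ratio such as the OR = 0.37 reported above compares the odds of taking the substance between the two conditions. The group counts below are hypothetical, chosen only to illustrate the computation (the abstract does not report the per-group cell counts, only the total of 130 and the OR):

```python
def odds_ratio(took_a, n_a, took_b, n_b):
    """Odds ratio of the event in group A relative to group B."""
    odds_a = took_a / (n_a - took_a)  # odds = events / non-events
    odds_b = took_b / (n_b - took_b)
    return odds_a / odds_b

# Hypothetical 65/65 split: 13 of 65 ego-depleted vs. 26 of 65 control
# participants taking the supplement gives an OR close to the reported 0.37.
or_depleted = odds_ratio(13, 65, 26, 65)  # 0.375
```

In the study itself the OR comes from a logistic regression, which can additionally adjust for covariates; the unadjusted 2x2-table version shown here conveys the same directional interpretation.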
We present and discuss the results of crystallographic and electron paramagnetic resonance (EPR) spectroscopic analyses of five tetrachloridocuprate(II) complexes to supply a useful tool for the structural characterisation of the [CuCl4]2− moiety in the liquid state, for example in ionic liquids, or in solution. Bis(benzyltriethylammonium)-, bis(trimethylphenylammonium)-, bis(ethyltriphenylphosphonium)-, bis(benzyltriphenylphosphonium)-, and bis(tetraphenylarsonium)tetrachloridocuprate(II) were synthesised and characterised by elemental, IR, EPR and X-ray analyses. The results of the crystallographic analyses show a distorted tetrahedral coordination geometry for all [CuCl4]2− anions in the five complexes and prove that all investigated complexes are stabilised by hydrogen bonds of different intensities. Despite the use of sterically demanding ammonium, phosphonium and arsonium cations to separate the paramagnetic Cu(II) centres for EPR spectroscopy, no hyperfine structure was observed in the EPR spectra, but the principal values of the electron Zeeman tensor, g∥ and g⊥, could be determined. With these EPR data and the crystallographic parameters, we carried out a correlation study to anticipate the structural situation of tetrachloridocuprates in different physical states. This correlation is in good agreement with DFT calculations.
Deep into the second half of the twentieth century the traditionalist definition of India as a country of villages remained dominant in official political rhetoric as well as cultural production. In the past two decades or so, this ruralist paradigm has been effectively superseded by a metropolitan imaginary in which the modern, globalised megacity increasingly functions as representative of India as a whole. Has the village, then, entirely vanished from the cultural imaginary in contemporary India? Addressing economic practices from upper-class consumerism to working-class family support strategies, this paper attempts to trace how ‘the village’ resurfaces or survives as a cultural reference point in the midst of the urban.
Various 1,6- and 1,8-naphthalenophanes were synthesized by using the Photo-Dehydro-Diels-Alder (PDDA) reaction of bis-ynones. These compounds are easily accessible from ω-(3-iodophenyl)carboxylic acids in three steps. The obtained naphthalenophanes are axially chiral, and the activation barrier for the atropisomerization could be determined in some cases by means of dynamic NMR (DNMR) and/or dynamic HPLC (DHPLC) experiments.
Informatics as a school subject has been virtually absent from bilingual education programs in German secondary schools. Most bilingual programs in German secondary education started out by focusing on subjects from the field of social sciences. Teachers and bilingual curriculum experts alike have been regarding those as the most suitable subjects for bilingual instruction – largely due to the intercultural perspective that a bilingual approach provides. And though one cannot deny the gain that ensues from an intercultural perspective on subjects such as history or geography, this benefit is certainly not limited to social science subjects. In consequence, bilingual curriculum designers have already begun to include other subjects such as physics or chemistry in bilingual school programs. It only seems a small step to extend this to informatics. This paper will start out by addressing potential benefits of adding informatics to the range of subjects taught as part of English-language bilingual programs in German secondary education. In a second step it will sketch out a methodological (= didactical) model for teaching informatics to German learners through English. It will then provide two items of hands-on and tested teaching material in accordance with this model. The discussion will conclude with a brief outlook on the chances and prerequisites of firmly establishing informatics as part of bilingual school curricula in Germany.
Multi-messenger constraints and pressure from dark matter annihilation into electron-positron pairs
(2013)
Despite striking astrophysical evidence for its existence, dark matter has so far escaped any direct or indirect detection. Proving its existence and revealing its nature is therefore one of the most intriguing challenges of present-day cosmology and particle physics. The present work investigates the nature of dark matter through indirect signatures from dark matter annihilation into electron-positron pairs in two different ways: pressure from dark matter annihilation, and multi-messenger constraints on the dark matter annihilation cross-section. We focus on dark matter annihilation into electron-positron pairs and adopt a model-independent approach, in which all electrons and positrons are injected with the same initial energy E_0 ~ m_dm*c^2. The propagation of these particles is determined by solving the diffusion-loss equation, considering inverse Compton scattering, synchrotron radiation, Coulomb collisions, bremsstrahlung, and ionization. The first part of this work, focusing on pressure from dark matter annihilation, demonstrates that dark matter annihilation into electron-positron pairs may affect the observed rotation curve by a significant amount. The injection rate in this calculation is constrained by INTEGRAL, Fermi, and H.E.S.S. data. The pressure of the relativistic electron-positron gas is computed from the energy spectrum predicted by the diffusion-loss equation. For values of the gas density and magnetic field that are representative of the Milky Way, it is estimated that the pressure gradients are strong enough to balance gravity in the central parts if E_0 < 1 GeV. The exact value depends somewhat on the astrophysical parameters, and it changes dramatically with the slope of the dark matter density profile. For very steep slopes, such as those expected from adiabatic contraction, the rotation curves of spiral galaxies would be affected on kiloparsec scales for most values of E_0.
By comparing the predicted rotation curves with observations of dwarf and low surface brightness galaxies, we show that the pressure from dark matter annihilation may improve the agreement between theory and observations in some cases, but it also imposes severe constraints on the model parameters (most notably, the inner slope of the halo density profile, as well as the mass and the annihilation cross-section of dark matter particles into electron-positron pairs). In the second part, upper limits on the dark matter annihilation cross-section into electron-positron pairs are obtained by combining observed data at different wavelengths (from Haslam, WMAP, and Fermi all-sky intensity maps) with recent measurements of the electron and positron spectra in the solar neighbourhood by PAMELA, Fermi, and H.E.S.S. We consider synchrotron emission in the radio and microwave bands, as well as inverse Compton scattering and final-state radiation at gamma-ray energies. For most values of the model parameters, the tightest constraints are imposed by the local positron spectrum and synchrotron emission from the central regions of the Galaxy. According to our results, the annihilation cross-section should not be higher than the canonical value for a thermal relic if the mass of the dark matter candidate is smaller than a few GeV. In addition, we derive a stringent upper limit on the inner logarithmic slope α of the density profile of the Milky Way dark matter halo (α < 1 if m_dm < 5 GeV, α < 1.3 if m_dm < 100 GeV and α < 1.5 if m_dm < 2 TeV) assuming a dark matter annihilation cross-section into electron-positron pairs (σv) = 3*10^−26 cm^3 s^−1, as predicted for thermal relics from the big bang.
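The diffusion-loss equation referred to above is not written out in the abstract; its standard form, reconstructed here from the description (an assumption: D is the spatial diffusion coefficient, b the total energy-loss rate from the listed processes, n the electron-positron number density per unit energy, and Q the injection term for monochromatic injection at E_0), reads:

```latex
\frac{\partial n}{\partial t}
  - \nabla \cdot \bigl[ D(E,\mathbf{x}) \, \nabla n \bigr]
  - \frac{\partial}{\partial E} \bigl[ b(E,\mathbf{x}) \, n \bigr]
  = Q(E,\mathbf{x}),
\qquad
Q(E,\mathbf{x}) =
  \frac{\langle \sigma v \rangle \, \rho_{\mathrm{dm}}^{2}(\mathbf{x})}
       {2 \, m_{\mathrm{dm}}^{2}} \, \delta(E - E_0)
```

In the steady state the time derivative vanishes, and the pressure of the relativistic gas follows from integrating the resulting energy spectrum.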
In a recent paper with N. Tarkhanov, the Lefschetz number for endomorphisms (modulo trace class operators) of sequences of trace class curvature was introduced. We show that this is a well-defined, canonical extension of the classical Lefschetz number and establish the homotopy invariance of this number. Moreover, we apply the results to show that the Lefschetz fixed point formula holds for geometric quasiendomorphisms of elliptic quasicomplexes.
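For context, the classical Lefschetz number being extended here is, for a continuous self-map f of a compact polyhedron X, the alternating sum of traces on rational homology:

```latex
L(f) \;=\; \sum_{k \ge 0} (-1)^{k} \,
  \operatorname{tr}\!\bigl( f_{*} : H_k(X;\mathbb{Q}) \to H_k(X;\mathbb{Q}) \bigr)
```

The fixed point formula equates this alternating trace with a sum of local indices over the fixed points of f; the paper establishes the analogous formula for quasiendomorphisms of elliptic quasicomplexes.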
The development of self-adaptive software requires the engineering of an adaptation engine that controls and adapts the underlying adaptable software by means of feedback loops. The adaptation engine often describes the adaptation by using runtime models representing relevant aspects of the adaptable software and particular activities such as analysis and planning that operate on these runtime models. To systematically address the interplay between runtime models and adaptation activities in adaptation engines, runtime megamodels have been proposed for self-adaptive software. A runtime megamodel is a specific runtime model whose elements are runtime models and adaptation activities. Thus, a megamodel captures the interplay between multiple models and between models and activities as well as the activation of the activities. In this article, we go one step further and present a modeling language for ExecUtable RuntimE MegAmodels (EUREMA) that considerably eases the development of adaptation engines by following a model-driven engineering approach. We provide a domain-specific modeling language and a runtime interpreter for adaptation engines, in particular for feedback loops. Megamodels are kept explicit and alive at runtime and by interpreting them, they are directly executed to run feedback loops. Additionally, they can be dynamically adjusted to adapt feedback loops. Thus, EUREMA supports development by making feedback loops, their runtime models, and adaptation activities explicit at a higher level of abstraction. Moreover, it enables complex solutions where multiple feedback loops interact or even operate on top of each other. Finally, it leverages the co-existence of self-adaptation and off-line adaptation for evolution.
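The idea of a megamodel whose elements are runtime models and adaptation activities can be illustrated with a toy feedback-loop interpreter. This is only an illustration of the concept, not the actual EUREMA language or API; all names and the monitor/plan/execute activities are invented:

```python
# A feedback loop specified as data: an ordered list of named activities,
# each reading and writing shared runtime models (plain dicts here).
def run_feedback_loop(megamodel, runtime_models):
    """Interpret the megamodel: execute its activities in declared order."""
    for name, activity in megamodel:
        runtime_models = activity(runtime_models)
    return runtime_models

def monitor(models):
    # Reflect the adaptable software's state into an analysis model.
    models["analysis"] = {"overloaded": models["system"]["load"] > 0.8}
    return models

def plan(models):
    action = "add_server" if models["analysis"]["overloaded"] else "none"
    models["plan"] = {"action": action}
    return models

def execute(models):
    # Adapt the underlying system according to the plan.
    if models["plan"]["action"] == "add_server":
        models["system"]["servers"] += 1
        models["system"]["load"] /= 2
    return models

loop = [("monitor", monitor), ("plan", plan), ("execute", execute)]
state = run_feedback_loop(loop, {"system": {"load": 0.9, "servers": 2}})
```

Because the loop is data, it can itself be inspected or rewritten at runtime, which is the key point of keeping megamodels explicit and alive rather than hard-coding the feedback loop.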
Even though quite different in occurrence and consequences, from a modeling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding: uncertainty about the modeling framework, inaccurate or incomplete event observations, and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Nevertheless, deterministic approaches are still widely used in natural hazard assessments, at the risk of underestimating the hazard, with potentially disastrous effects. The all-round probabilistic framework of Bayesian networks constitutes an attractive alternative. In contrast to deterministic approaches, it treats response and explanatory variables alike as random variables, making no distinction between input and output variables. Using a graphical representation, Bayesian networks encode the dependency relations between the variables in a directed acyclic graph: variables are represented as nodes and (in-)dependencies between variables as (missing) edges between the nodes. The joint distribution of all variables can thus be described by decomposing it, according to the depicted independences, into a product of local conditional probability distributions, which are defined by the parameters of the Bayesian network. In the framework of this thesis the Bayesian network approach is applied to different natural hazard domains (i.e. seismic hazard, flood damage and landslide assessments). Learning the network structure and parameters from data, Bayesian networks reveal relevant dependency relations between the included variables and help to gain knowledge about the underlying processes.
The problem of Bayesian network learning is cast in a Bayesian framework, considering the network structure and parameters as random variables themselves and searching for the most likely combination of both, which corresponds to the maximum a posteriori (MAP score) of their joint distribution given the observed data. Although well studied in theory, the learning of Bayesian networks from real-world data is usually not straightforward and requires an adaptation of existing algorithms. Problems that typically arise are the handling of continuous variables, incomplete observations, and the interaction of both. Working with continuous distributions requires assumptions about the allowed families of distributions. To "let the data speak" and avoid wrong assumptions, continuous variables are instead discretized here, thus allowing for a completely data-driven and distribution-free learning. An extension of the MAP score, considering the discretization as a random variable as well, is developed for an automatic multivariate discretization that takes interactions between the variables into account. The discretization process is nested into the network learning and requires several iterations. When incomplete observations have to be handled on top of this, the computational burden grows: iterative procedures for missing-value estimation quickly become infeasible. A more efficient albeit approximate method is used instead, estimating the missing values based only on the observations of variables directly interacting with the missing variable. Moreover, natural hazard assessments often have a primary interest in a certain target variable. The discretization learned for this variable does not always have the resolution required for a good prediction performance. Finer resolutions for (conditional) continuous distributions are achieved with continuous approximations subsequent to the Bayesian network learning, using kernel density estimations or mixtures of truncated exponential functions.
All our procedures are completely data-driven. We thus avoid assumptions that require expert knowledge and instead provide domain-independent solutions that are applicable not only in other natural hazard assessments but in a variety of domains struggling with uncertainties.
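The factorization of the joint distribution described above (a product of local conditional distributions, one per node given its parents) can be shown concretely on a toy discrete network. The network and all probabilities below are invented for illustration, not from the thesis:

```python
# A Bayesian network as a dict: node -> (parents, CPT), where the CPT maps
# a tuple of parent values to a distribution over the node's own values.
def joint_probability(network, assignment, order):
    """P(assignment) = product over nodes of P(node | parents(node))."""
    p = 1.0
    for node in order:  # order must list parents before children
        parents, cpt = network[node]
        parent_vals = tuple(assignment[q] for q in parents)
        p *= cpt[parent_vals][assignment[node]]
    return p

# Toy rain -> wet-ground network.
net = {
    "rain": ((), {(): {True: 0.2, False: 0.8}}),
    "wet":  (("rain",), {(True,): {True: 0.9, False: 0.1},
                         (False,): {True: 0.1, False: 0.9}}),
}
p = joint_probability(net, {"rain": True, "wet": True}, ["rain", "wet"])
# P(rain, wet) = P(rain) * P(wet | rain) = 0.2 * 0.9
```

Structure learning searches over such dicts (i.e. over parent sets) for the MAP-scoring graph; discretization, as described above, determines the value sets the CPTs are defined over.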
The process of introducing compulsory ICT education at primary school level in the Czech Republic should be completed next year. Programming and Information, two topics from the basics of computer science, have been included in a new textbook. The question is whether the new chapters of the textbook are comprehensible to primary school teachers who have undergone no training in computer science. The paper reports on a pilot verification project in which pre-service primary school teachers were trained to teach these informatics topics.
TRAPID
(2013)
Transcriptome analysis through next-generation sequencing technologies allows the generation of detailed gene catalogs for non-model species, at the cost of new challenges with regard to computational requirements and bioinformatics expertise. Here, we present TRAPID, an online tool for the fast and efficient processing of assembled RNA-Seq transcriptome data, developed to mitigate these challenges. TRAPID offers high-throughput open reading frame detection, frameshift correction and includes a functional, comparative and phylogenetic toolbox, making use of 175 reference proteomes. Benchmarking and comparison against state-of-the-art transcript analysis tools reveals the efficiency and unique features of the TRAPID system. TRAPID is freely available at http://bioinformatics.psb.ugent.be/webtools/trapid/.
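Open reading frame (ORF) detection, the first step TRAPID automates at high throughput, can be sketched minimally: scan each reading frame for an ATG start codon followed in-frame by a stop codon. This is a single-strand illustration of the concept only, not TRAPID's actual algorithm (which also handles frameshifts and uses homology support):

```python
STOPS = {"TAA", "TAG", "TGA"}

def longest_orf(seq):
    """Return the longest ATG..stop ORF (stop codon included) on one strand."""
    best = ""
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # open an ORF at the first ATG
            elif codon in STOPS and start is not None:
                if i + 3 - start > len(best):  # close it at the in-frame stop
                    best = seq[start:i + 3]
                start = None
    return best

orf = longest_orf("CCATGAAATTTGGGTAACC")  # "ATGAAATTTGGGTAA"
```

A full implementation would also scan the reverse complement and report all ORFs above a length threshold rather than only the longest one.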
Large Central European flood events of the past have demonstrated that flooding can affect several river basins at the same time, leading to catastrophic economic and humanitarian losses that can stretch emergency resources beyond planned levels of service. For Germany, the spatial coherence of flooding, the contributing processes and the role of trans-basin floods for a national risk assessment are largely unknown, and analysis is limited by a lack of systematic data, information and knowledge on past events. This study investigates the frequency and intensity of trans-basin flood events in Germany. It evaluates the data and information basis on which knowledge about trans-basin floods can be generated in order to improve any future flood risk assessment. In particular, the study assesses whether flood documentations and related reports can provide a valuable data source for understanding trans-basin floods. An adaptive algorithm was developed that systematically captures trans-basin floods using series of mean daily discharge of equal length (1952-2002) at a large number of sites. It identifies the simultaneous occurrence of flood peaks based on the exceedance of an initial threshold of a 10-year flood at one location and consecutively pools all causally related, spatially and temporally lagged peak recordings at the other locations. A weighted cumulative index was developed that accounts for the spatial extent and the individual flood magnitudes within an event and allows quantifying the overall event severity. The parameters of the method were tested in a sensitivity analysis. An intensive study of the sources and dissemination channels of flood-relevant publications in Germany was conducted. Based on the method of systematic reviews, a strategic search approach was developed to identify relevant documentations for each of the 40 strongest trans-basin flood events.
A novel framework for assessing the quality of event-specific flood reports from a user’s perspective was developed and validated by independent peers. The framework was designed to be generally applicable to any natural hazard type and assesses the quality of a document addressing accessibility as well as representational, contextual, and intrinsic dimensions of quality. The analysis of time series of mean daily discharge resulted in the identification of 80 trans-basin flood events within the period 1952-2002 in Germany. The set is dominated by events that were recorded in the hydrological winter (64%); 36% occurred during the summer months. The occurrence of floods is characterised by a distinct clustering in time. Dividing the study period into two sub-periods, we find an increase in the percentage of winter events from 58% in the first to 70.5% in the second sub-period. Accordingly, we find a significant increase in the number of extreme trans-basin floods in the second sub-period. A large body of 186 flood-relevant documentations was identified. For 87.5% of the 40 strongest trans-basin floods in Germany, at least one report was found, and for the most severe floods a substantial amount of documentation could be obtained. 80% of the material can be considered grey literature (i.e. literature not controlled by commercial publishers). The results of the quality assessment show that the majority of flood event specific reports are of good quality, i.e. they are well drafted, largely accurate and objective, and contain a substantial amount of information on the sources, pathways and receptors/consequences of the floods. The inclusion of this information in the process of knowledge building for flood risk assessment is recommended. Both the results and the data produced in this study are openly accessible and can be used for further research. The results of this study contribute to an improved spatial risk assessment in Germany.
The identified set of trans-basin floods provides the basis for an assessment of the chance that flooding occurs simultaneously at a number of sites. The information obtained from flood event documentation can usefully supplement the analysis of the processes that govern flood risk.
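The pooling step of the adaptive algorithm described above can be sketched as follows: starting from a peak that exceeds the 10-year-flood threshold at one gauge, collect temporally lagged peaks at the other gauges into one trans-basin event, then score the event with a cumulative index. The time window, the magnitude units (return periods), and the unweighted sum are illustrative assumptions; the study's actual index weights spatial extent and magnitudes:

```python
def pool_event(peaks, trigger_gauge, trigger_day, window=10):
    """Pool peaks into one event. peaks: dict gauge -> list of
    (day, magnitude) tuples; magnitudes in return periods (years)."""
    event = {trigger_gauge: max(m for d, m in peaks[trigger_gauge]
                                if abs(d - trigger_day) <= window)}
    for gauge, series in peaks.items():
        if gauge == trigger_gauge:
            continue
        close = [m for d, m in series if abs(d - trigger_day) <= window]
        if close:
            event[gauge] = max(close)  # lagged peak at this gauge joins event
    return event

def severity(event):
    """Cumulative index: summing magnitudes also rewards spatial extent,
    since every participating gauge contributes a term."""
    return sum(event.values())

peaks = {"A": [(100, 12.0)], "B": [(103, 4.0)], "C": [(150, 8.0)]}
event = pool_event(peaks, "A", 100)   # gauge C is outside the window
```

A causal-relatedness check (e.g. shared catchment or weather system) would further filter the pooled peaks before scoring, as the study's algorithm requires.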
Interactive rendering techniques for focus+context visualization of 3D geovirtual environments
(2013)
This thesis introduces a collection of new real-time rendering techniques and applications for focus+context visualization of interactive 3D geovirtual environments such as virtual 3D city and landscape models. These environments are generally characterized by a large number of objects and are of high complexity with respect to geometry and textures. For these reasons, their interactive 3D rendering represents a major challenge. Their 3D depiction implies a number of weaknesses such as occlusions, cluttered image contents, and partial screen-space usage. To overcome these limitations and, thus, to facilitate the effective communication of geo-information, principles of focus+context visualization can be used for the design of real-time 3D rendering techniques for 3D geovirtual environments (see Figure). In general, detailed views of a 3D geovirtual environment are combined seamlessly with abstracted views of the context within a single image. To perform the real-time image synthesis required for interactive visualization, dedicated parallel processors (GPUs) for rasterization of computer graphics primitives are used. For this purpose, the design and implementation of appropriate data structures and rendering pipelines are necessary. The contribution of this work comprises the following five real-time rendering methods:
• The rendering technique for 3D generalization lenses enables the combination of different 3D city geometries (e.g., generalized versions of a 3D city model) in a single image in real time. The method is based on a generalized and fragment-precise clipping approach, which uses a compressible, raster-based data structure. It enables the combination of detailed views in the focus area with the representation of abstracted variants in the context area.
• The rendering technique for the interactive visualization of dynamic raster data in 3D geovirtual environments facilitates the rendering of 2D surface lenses. It enables a flexible combination of different raster layers (e.g., aerial images or videos) using projective texturing for decoupling image and geometry data. Thus, various overlapping and nested 2D surface lenses of different contents can be visualized interactively.
• The interactive rendering technique for image-based deformation of 3D geovirtual environments enables the real-time image synthesis of non-planar projections, such as cylindrical and spherical projections, as well as multi-focal 3D fisheye lenses and the combination of planar and non-planar projections.
• The rendering technique for view-dependent multi-perspective views of 3D geovirtual environments, based on the application of global deformations to the 3D scene geometry, can be used to synthesize interactive panorama maps that combine detailed views close to the camera (focus) with abstract views in the background (context). This approach reduces occlusions, increases the usage of the available screen space, and reduces the overload of image contents.
• The object-based and image-based rendering techniques for highlighting objects and focus areas inside and outside the view frustum facilitate preattentive perception.
The concepts and implementations of interactive image synthesis for focus+context visualization and their selected applications enable a more effective communication of spatial information, and provide building blocks for the design and development of new applications and systems in the field of 3D geovirtual environments.
Scrambling and interfaces
(2013)
This paper proposes a novel analysis of the Russian OVS construction and argues that the parametric variation in the availability of OVS cross-linguistically depends on the type of relative interpretative argument prominence that a language encodes via syntactic structure. When thematic and information-structural prominence relations do not coincide, only one of them can be structurally/linearly represented. The relation that is not structurally/linearly encoded must be made visible at the PF interface either via prosody or morphology.
Hydrothermal carbonisation
(2013)
The world’s appetite for energy is producing growing quantities of CO2, a pollutant that contributes to the warming of the planet and which currently cannot be removed or stored in any significant way. Other natural reserves are also being devoured at alarming rates, and current assessments suggest that we will need to identify alternative sources in the near future. With the aid of materials chemistry it should be possible to create a world in which energy use need not be limited and where usable energy can be produced and stored wherever it is needed, where we can minimize and remediate emissions as new consumer products are created, whilst healing the planet and preventing further disruptive and harmful depletion of valuable mineral assets. In achieving these aims, the creation of new and, very importantly, greener industries and new sustainable pathways is crucial. In all of the aforementioned applications, new materials based on carbon, ideally produced via inexpensive, low-energy-consumption methods, using renewable resources as precursors, with flexible morphologies, pore structures and functionalities, are increasingly viewed as ideal candidates to fulfill these goals. The resulting materials should be a feasible solution for the efficient storage of energy and gases. At the end of their life, such materials should ideally improve soil quality and act as potential CO2 storage sinks. This is exactly the subject of this habilitation thesis: an alternative technology to produce carbon materials from biomass in water using low carbonisation temperatures and self-generated pressures. This technology is called hydrothermal carbonisation. It has been developed over the past five years by a group of young and talented researchers working under the supervision of Dr. Titirici at the Max-Planck Institute of Colloids and Interfaces, and it is now a well-recognised methodology to produce carbon materials with important applications in our daily lives.
These applications include electrodes for portable electronic devices, filters for water purification, catalysts for the production of important chemicals as well as drug delivery systems and sensors.