Effect of benzylglucosinolate on signaling pathways associated with type 2 diabetes prevention
(2014)
Type 2 diabetes (T2D) is a health problem throughout the world. In 2010 there were nearly 230 million individuals with diabetes worldwide, and it is estimated that in the economically advanced countries the number of cases will increase by about 50% in the next twenty years. Insulin resistance is one of the major features of T2D and is also a risk factor for metabolic and cardiovascular complications. Epidemiological and animal studies have shown that the consumption of vegetables and fruits can delay or prevent the development of the disease, although the underlying mechanisms of these effects are still unclear. Brassica species such as broccoli (Brassica oleracea var. italica) and nasturtium (Tropaeolum majus) possess a high content of bioactive phytochemicals, e.g. nitrogen-sulfur compounds (glucosinolates and isothiocyanates) and polyphenols, largely associated with the prevention of cancer. Isothiocyanates (ITCs) display their anti-carcinogenic potential by inducing detoxifying phase II enzymes and increasing glutathione (GSH) levels in tissues. In T2D, an increase in gluconeogenesis and triglyceride synthesis and a reduction in fatty acid oxidation, accompanied by the presence of reactive oxygen species (ROS), are observed; together these are the result of an inappropriate response to insulin. Forkhead box O (FOXO) transcription factors play a crucial role in the regulation of insulin effects on gene expression and metabolism, and alterations in FOXO function could contribute to metabolic disorders in diabetes. In this study, using stably transfected human osteosarcoma cells (U-2 OS) with constitutive expression of GFP (green fluorescent protein)-labeled FOXO1 protein and human hepatoma (HepG2) cell cultures, the ability of benzyl isothiocyanate (BITC), derived from benzylglucosinolate extracted from nasturtium, to modulate (i) the insulin-signaling pathway, (ii) the intracellular localization of FOXO1 and (iii) the expression of proteins involved in glucose metabolism, ROS detoxification, cell cycle arrest and DNA repair was evaluated. BITC promoted oxidative stress and, in response, induced FOXO1 translocation from the cytoplasm into the nucleus, antagonizing the insulin effect. The BITC stimulus down-regulated gluconeogenic enzymes, which can be considered an anti-diabetic effect; promoted antioxidant resistance, expressed by the up-regulation of manganese superoxide dismutase (MnSOD) and detoxification enzymes; modulated autophagy by induction of BECLIN1 and down-regulation of the mammalian target of rapamycin complex 1 (mTORC1) pathway; and promoted cell cycle arrest and DNA damage repair by up-regulation of the cyclin-dependent kinase inhibitor p21CIP and Growth Arrest/DNA Damage Repair protein GADD45. Except for nuclear factor (erythroid-derived 2)-like 2 (NRF2) and its influence on the gene expression of detoxification enzymes, all observed effects were independent of FOXO1, protein kinase B (AKT/PKB) and the NAD-dependent deacetylase sirtuin-1 (SIRT1). The current study provides evidence that, beyond their anti-carcinogenic potential, isothiocyanates might have a role in T2D prevention. The BITC stimulus mimics the fasting state, in which insulin signaling is not triggered and FOXO proteins remain in the nucleus modulating the expression of their target genes, with the advantage of a down-regulation of gluconeogenesis instead of an increase.
These effects suggest that BITC might be considered a promising substance for the prevention or treatment of T2D; the factors behind its modulatory effects therefore need further investigation.
In this work, rigid oligospiroketal (OSK) rods were successfully used as basic building blocks for complex 2D and 3D systems. To this end, a difunctionalized rigid rod was synthesized and employed in azide-alkyne click reactions with rods of its own kind and with other branched functionalization units. For two OSK rods linked by a click reaction, theoretical calculations allowed statements about the novel bimodality of the conformation. The term "Gelenkstab" (hinge rod) was introduced for this, since the molecules, rotating about a hinge, can exist both in a stretched and in a bent form. Building on these findings, it could be shown not only that large polymers of up to four OSK rods can be synthesized in a targeted manner, but also that rings of rigid OSK rods can be produced by deliberately changing the reaction conditions of the click reaction. The newly developed substance class of hinge rods was investigated with regard to controlling the equilibrium between the bent and the stretched hinge rod. For this purpose, the hinge rod was equipped with pyrenyl residues in the terminal position. Fluorescence measurements showed that the equilibrium can be influenced, e.g., by the temperature or the choice of solvent. For broad applicability, a simplified synthesis strategy was found with which arbitrary functionalization could be achieved in only one synthesis step. Photoactive hinge rods were synthesized that could be driven selectively to intramolecular dimerization. In addition, amino acids provided a linking element at the end of the hinge rods that permits the stereoselective synthesis of multiple functionalizations. The synthesis of complex hinge rods was demonstrated as a novel field and offers broad research potential for further applications, e.g., in biology (as molecular switches for ion transport) and in materials chemistry (as charge or energy transporters).
The Arbeitskreis Militär und Gesellschaft in der Frühen Neuzeit e. V. (working group on military and society in the early modern period) was founded in the spring of 1995. Its mission is to promote research on the military within early modern history and, at the same time, to raise awareness among early modern historians of the significance of the military in all its functions. The military as a social group is thus itself at the center of the working group's activities, but it is also addressed in its effects and representations. The aim is to work out and appreciate comprehensively the role of the military as part of early modern society. In this respect, the AMG understands its work not only as a contribution to military history, but above all as a contribution to the history of the early modern period as a whole. The working group offers a forum for discussion and information through the organization of conferences, the publication of the series 'Herrschaft und soziale Systeme in der Frühen Neuzeit', the journal 'Militär und Gesellschaft in der Frühen Neuzeit', and the mailing list mil-fnz.
The "International Conference for the 10th Anniversary of the Institute of Comparative Law" took place in Szeged on 24 May 2013. At the quadrilingual conference, more than thirty participants presented their research results. Zoltán Péteri's essay looks at the discipline from the perspective of the history of science. Katalin Kelemen and Balázs Fekete examine the paths taken by attempts to classify the legal systems of Eastern Europe in the late phase of the upheavals of the 1980s and 1990s. The historical approach, relating legal history and comparative law, is also reflected in other essays, above all in those by Szilvia Bató, Magdolna Gedeon and Béla Szabó P., as well as in the essays by Péter Mezei and Tünde Szűcs. Attila Badó analyzes comparative law from the perspectives of law, sociology and political science on the basis of studies of the sanction system for judges in the USA. This political science angle is also emphasized in the essays on current questions of European integration by Carine Guemar and Laureline Congnard. A series of essays deals with conventional normative comparative law in the fields of constitutional law (Jordane Arlettaz and Péter Kruzslicz), company law (Kitti Bakos-Kovács), copyright law (Dóra Hajdú) and tax law (Judit Jacsó). A further group is formed by the essays of János Bóka and Erzsébet Csatlós, which examine the use of the comparative method in judicial practice. Comparative law is a dynamically developing discipline. The conference and this volume serve not only to honor the work of the Institute of Comparative Law to date, but also to point to new goals. The most important principles, however, remain firmly anchored even in a constantly changing legal and intellectual environment. The institute's motto is "instruere et docere omnes qui edoceri desiderant" ("to teach all who wish to be taught"). In the coming decades, too, we will be guided by the will to learn and to teach, the freedom of research, and the transmission and further development of Hungarian as well as global legal culture.
Economists and economic policymakers alike invoke the neutrality theory of money when they call for a depoliticization of monetary policy. Both the theory of monetary neutrality and the paradigm of depoliticized monetary policy are problematic, however. The political-economic developments after the global financial and economic crisis of 2007/2008 and the recent controversies over the role and significance of money have made this plain. This study first discusses the conceptual foundations and theoretical models of monetary neutrality. It then questions the central theoretical assumptions and claims of the neutrality theory from a critical heterodox perspective. It is argued that money is a non-neutral productive force that is neither economically nor socially neutral. The conditions under which money is available and circulates set the direction of economic development. Hence there can be no neutral money, let alone an apolitical monetary policy.
The economic impact analysis contained in this book shows how susceptible irrigation farming is to certain water management policies in the Australian Murray-Darling Basin, one of the world's largest river basins and Australia's most fertile region. By comparing different pricing and non-pricing water management policies with the help of the Water Integrated Market Model, it is found that the impact of water-demand-reducing policies is most severe on crops that need to be intensively irrigated and are at the same time less water-productive. A combination of increasingly frequent and severe droughts and the application of policies that decrease agricultural water demand in the same region will create a situation in which the highly water-dependent crops rice and cotton cannot be cultivated at all.
Geometric electroelasticity
(2014)
In this work a differential geometric formulation of the theory of electroelasticity is developed which also includes thermal and magnetic influences. We study the motion of bodies consisting of an elastic material that are deformed by the influence of mechanical forces, heat and an external electromagnetic field. To this end, physical balance laws (conservation of mass, balance of momentum, angular momentum and energy) are established. These provide an equation that describes the motion of the body during the deformation. Here the body and the surrounding space are modeled as Riemannian manifolds, and we allow the body to have a lower dimension than the surrounding space. In this way one is not restricted, as usual, to the description of the deformation of three-dimensional bodies in three-dimensional space, but one can also describe the deformation of membranes and deformations in a curved space. Moreover, we formulate so-called constitutive relations that encode the properties of the material used. Balance of energy, as a scalar law, can easily be formulated on a Riemannian manifold. The remaining balance laws are then obtained by demanding that balance of energy be invariant under the action of arbitrary diffeomorphisms on the surrounding space. This generalizes a result by Marsden and Hughes that pertains to bodies that have the same dimension as the surrounding space and does not allow for the presence of electromagnetic fields. Usually, in works on electroelasticity, the entropy inequality is used to decide which otherwise allowed deformations are physically admissible and which are not. It is also employed to derive restrictions on the possible forms of the constitutive relations describing the material. Unfortunately, opinions on the physically correct statement of the entropy inequality diverge when electromagnetic fields are present. Moreover, it is unclear how to formulate the entropy inequality in the case of a membrane that is subjected to an electromagnetic field. Thus, we show that one can replace the use of the entropy inequality by the demand that, for a given process, balance of energy be invariant under the action of arbitrary diffeomorphisms on the surrounding space and under linear rescalings of the temperature. On the one hand, this demand also yields the desired restrictions on the form of the constitutive relations. On the other hand, it needs much weaker assumptions than the arguments in the physics literature that employ the entropy inequality. Again, our result generalizes a theorem of Marsden and Hughes. This time, our result is, like theirs, only valid for bodies that have the same dimension as the surrounding space.
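As a hedged illustration of the starting point of the invariance argument, the classical balance of energy (without the electromagnetic contributions treated in the thesis) can be stated, in the spirit of Marsden and Hughes, as follows; the notation is assumed here for illustration and may differ from the thesis:

    \frac{d}{dt} \int_{\varphi_t(U)} \rho \left( e + \tfrac{1}{2} \langle v, v \rangle \right) dv
      = \int_{\varphi_t(U)} \rho \left( \langle b, v \rangle + r \right) dv
      + \int_{\partial \varphi_t(U)} \left( \langle t, v \rangle + h \right) da

Here \varphi_t is the motion, \rho the mass density, e the internal energy, v the velocity, b the body force, r the heat supply, t the surface traction and h the heat flux. The covariance demand is that this balance continue to hold when \varphi_t is replaced by \xi \circ \varphi_t for an arbitrary diffeomorphism \xi of the surrounding space; working out this requirement recovers conservation of mass and the balances of momentum and angular momentum.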
The H.E.S.S. array is a third generation Imaging Atmospheric Cherenkov Telescope (IACT) array. It is located in the Khomas Highland in Namibia and measures very high energy (VHE) gamma-rays. In Phase I, the array started data taking in 2004 with its four identical 13 m telescopes. Since then, H.E.S.S. has emerged as the most successful IACT experiment to date. Among the almost 150 sources of VHE gamma-ray radiation found so far, even the oldest detection, the Crab Nebula, keeps surprising the scientific community with unexplained phenomena such as the recently discovered very energetic flares of high energy gamma-ray radiation. During its most recent flare, which was detected by the Fermi satellite in March 2013, the Crab Nebula was simultaneously observed with the H.E.S.S. array for six nights. The results of these observations are discussed in detail in the course of this work. During the nights of the flare, the new 24 m × 32 m H.E.S.S. II telescope was still being commissioned, but participated in the data taking for one night. To be able to reconstruct and analyze the data of the H.E.S.S. Phase II array, the algorithms and software used by the H.E.S.S. Phase I array had to be adapted. The most prominent advanced shower reconstruction technique, the template-based model analysis developed by de Naurois and Rolland, compares real shower images taken by the Cherenkov telescope cameras with shower templates obtained using a semi-analytical model. To find the best-fitting image, and therefore the parameters that best describe the air shower, a pixel-wise log-likelihood fit is performed. The adaptation of this advanced shower reconstruction technique to the heterogeneous H.E.S.S. Phase II array for stereo events (i.e. air showers seen by at least two telescopes of any kind), its performance on Monte Carlo simulations as well as its application to real data are described.
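As a rough illustration of such a pixel-wise fit, the following Python sketch scores a camera image against a set of model templates. The Gaussian pixel likelihood, the function names and the discrete template set are simplifying assumptions: the actual model analysis convolves Poisson photoelectron statistics with the pedestal and single-photoelectron resolutions and fits continuous shower parameters numerically.

    import numpy as np

    def log_likelihood(image, template, pedestal_sigma=1.0):
        # Gaussian approximation: pixel variance grows with the expected signal.
        var = pedestal_sigma ** 2 + np.maximum(template, 0.0)
        return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (image - template) ** 2 / var)

    def best_fit(image, templates):
        # templates: dict mapping shower parameters (energy, depth, ...) to
        # predicted camera images from the semi-analytical shower model.
        return max(templates, key=lambda params: log_likelihood(image, templates[params]))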
Planetary research is often user-based and requires considerable skill, time, and effort. Unfortunately, self-defined boundary conditions, definitions, and rules are often not documented or not easy to comprehend due to the complexity of the research. This makes a comparison to other studies, or an extension of the already existing research, complicated. Comparisons are often distorted, because results rely on different, not well defined, or even unknown boundary conditions. The purpose of this research is to develop a standardized analysis method for planetary surfaces which is adaptable to several research topics. The method ensures a consistent quality of results, including reliable and comparable outcomes, and reduces the time and effort of conducting such studies. The standardized analysis method is provided by automated analysis tools that focus on statistical parameters. Specific key parameters and boundary conditions are defined for the tool application. The analysis relies on a database in which all key parameters are stored. These databases can be easily updated and adapted to various research questions. This increases the flexibility, reproducibility, and comparability of the research. However, the quality of the database and the reliability of the definitions directly influence the results. To ensure a high quality of results, the rules and definitions need to be well defined and based on previously conducted case studies. The tools then produce parameters, which are obtained by defined geostatistical techniques (measurements, calculations, classifications). The idea of an automated statistical analysis is tested to demonstrate the benefits, but also the potential problems, of this method. In this study, I adapt automated tools for floor-fractured craters (FFCs) on Mars. These impact craters show a variety of surface features, occur in different Martian environments, and have different fracturing origins. They provide a complex morphological and geological field of application. 433 FFCs are classified by the analysis tools according to their fracturing process. Spatial data, environmental context, and crater interior data are analyzed to distinguish between the processes involved in floor fracturing. Related geologic processes, such as glacial and fluvial activity, are too similar to be separately classified by the automated tools; glacial and fluvial fracturing processes are therefore merged for the classification. The automated tools provide probability values for each origin model. To guarantee the quality and reliability of the results, the classification tools need to achieve an origin probability above 50 %. This analysis method shows that 15 % of the FFCs are fractured by intrusive volcanism, 20 % by tectonic activity, and 43 % by water- and ice-related processes. In total, 78 % of the FFCs are classified to an origin type. The remaining craters can be explained by a combination of origin models, by superposition or erosion of key parameters, or by an unknown fracturing model; these features have to be analyzed manually in detail. Another possibility would be to improve the key parameters and rules of the classification. This research shows that it is possible to conduct an automated statistical analysis of morphologic and geologic features based on analysis tools. The analysis tools provide additional information to the user and are therefore considered assistance systems.
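The classification step can be pictured with the following hedged Python sketch: per-origin scoring rules (derived from prior case studies) are applied to the key parameters stored in the database, and a crater is assigned an origin only if its probability exceeds the 50 % threshold. All names and fields are illustrative, not the thesis's implementation.

    ORIGINS = ("intrusive volcanism", "tectonic activity", "water & ice")

    def classify(key_parameters, rules):
        # rules: {origin: scoring function over a key-parameter record}
        scores = {origin: max(rules[origin](key_parameters), 0.0) for origin in ORIGINS}
        total = sum(scores.values())
        if total == 0.0:
            return "unclassified - manual analysis", 0.0
        origin, prob = max(((o, s / total) for o, s in scores.items()), key=lambda t: t[1])
        # require an origin probability above 50 %, otherwise defer to manual analysis
        return (origin, prob) if prob > 0.5 else ("unclassified - manual analysis", prob)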
Virtualized cloud data centers provide on-demand resources, enable agile resource provisioning, and host heterogeneous applications with different resource requirements. These data centers consume enormous amounts of energy, increasing operational expenses, inducing high thermal load inside the data centers, and raising carbon dioxide emissions. The increase in energy consumption can result from ineffective resource management that causes inefficient resource utilization. This dissertation presents detailed models and novel techniques and algorithms for virtual resource management in cloud data centers. The proposed techniques take into account Service Level Agreements (SLAs) and workload heterogeneity in terms of memory access demand and communication patterns of web applications and High Performance Computing (HPC) applications. To evaluate the proposed techniques, we use simulation and real workload traces of web and HPC applications and compare our techniques against other recently proposed techniques using several performance metrics. The major contributions of this dissertation are:
- A proactive resource provisioning technique based on robust optimization that increases the hosts' availability for hosting new VMs while minimizing idle energy consumption. This technique also mitigates undesirable changes in the power state of the hosts, which enhances the hosts' reliability by avoiding failures during power state changes. It exploits the range-based prediction algorithm to implement robust optimization, taking the uncertainty of demand into consideration.
- An adaptive range-based prediction for workloads with high short-term fluctuations. The range prediction is implemented in two ways, via the standard deviation and via the median absolute deviation, and the range is adjusted by an adaptive confidence window to cope with the workload fluctuations (a sketch follows this list).
- A robust VM consolidation for efficient energy and performance management that achieves an equilibrium in the energy-performance trade-off. Our technique reduces the number of VM migrations compared to recently proposed techniques, which also reduces the energy consumed by the network infrastructure, and it additionally reduces SLA violations and the number of power state changes.
- A generic model of the data center network to simulate communication delay and its impact on VM performance, as well as network energy consumption, together with a generic model of a server's memory bus, including latency and energy consumption models for different memory frequencies; this allows simulating memory delay and its influence on VM performance, as well as memory energy consumption.
- A communication-aware and energy-efficient consolidation for parallel applications that dynamically discovers communication patterns and reschedules VMs via migration based on the discovered patterns. A novel dynamic pattern discovery technique is implemented, based on signal processing of the VMs' network utilization, instead of relying on information from the hosts' virtual switches or on initiation by the VMs. The results show that our proposed approach reduces the network's average utilization, achieves energy savings by reducing the number of active switches, and provides better VM performance compared to CPU-based placement.
- A memory-aware VM consolidation for independent VMs that exploits the diversity of the VMs' memory access to balance the memory-bus utilization of hosts. The proposed technique, Memory-bus Load Balancing (MLB), reactively redistributes VMs according to their memory-bus utilization using VM migration to improve the performance of the overall system (see the sketch after this list). Furthermore, Dynamic Voltage and Frequency Scaling (DVFS) of the memory is combined with the MLB technique to achieve better energy savings.
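As indicated in the list above, the range-based prediction produces an interval rather than a point forecast. A minimal Python sketch, with illustrative names and a fixed window standing in for the adaptive confidence window:

    import numpy as np

    def range_forecast(history, window=12, use_mad=False, k=1.0):
        # Forecast a (low, high) demand range from recent utilization samples;
        # the spread comes either from the standard deviation or from the
        # median absolute deviation (MAD), as in the two variants above.
        recent = np.asarray(history[-window:], dtype=float)
        if use_mad:
            center = np.median(recent)
            spread = np.median(np.abs(recent - center))
        else:
            center = recent.mean()
            spread = recent.std()
        return center - k * spread, center + k * spread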
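The MLB rebalancing referenced above can likewise be sketched as a single reactive step; the data shapes and the threshold are assumptions for illustration, not the dissertation's implementation.

    def mlb_step(hosts, threshold=0.8):
        # hosts: {host_name: {vm_name: memory_bus_utilization}}
        load = {h: sum(vms.values()) for h, vms in hosts.items()}
        src = max(load, key=load.get)
        dst = min(load, key=load.get)
        if load[src] <= threshold or src == dst:
            return None  # balanced enough; no migration triggered
        vm = max(hosts[src], key=hosts[src].get)
        hosts[dst][vm] = hosts[src].pop(vm)  # migrate the most bus-intensive VM
        return vm, src, dst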