The term Linked Data refers to connected information sources comprising structured data about a wide range of topics and for a multitude of applications. In recent years, the conceptual and technical foundations of Linked Data have been formalized and refined. To this end, well-known technologies have been established, such as the Resource Description Framework (RDF) as a Linked Data model and the SPARQL Protocol and RDF Query Language (SPARQL) for retrieving this information. Whereas most research has been conducted on generating and publishing Linked Data, this thesis presents novel approaches for improved data management. In particular, we illustrate new methods for analyzing and processing SPARQL queries. We present two algorithms suitable for identifying structural relationships between such queries, and apply both to a large number of real-world requests to evaluate their performance and the quality of their results. Based on this, we introduce different strategies enabling optimized access to Linked Data sources. We demonstrate how the presented approach facilitates effective utilization of SPARQL endpoints by prefetching results relevant for multiple subsequent requests. Furthermore, we contribute a set of metrics for determining technical characteristics of such knowledge bases. To this end, we devise practical heuristics and validate them through a thorough analysis of real-world data sources. We discuss the findings and evaluate their impact on utilizing the endpoints. Moreover, we detail the adoption of a scalable infrastructure for improving Linked Data discovery and consumption. As we outline in an exemplary use case, this platform is suitable both for processing and for provisioning the corresponding information.
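The structural comparison of SPARQL queries described above can be illustrated with a minimal sketch (not the thesis's actual algorithms): normalize the triple patterns of each query and score similarity as the overlap of the resulting pattern sets. The regex-based extraction and the Jaccard measure are simplifying assumptions; a real implementation would use a proper SPARQL parser.

```python
import re

def triple_patterns(query):
    """Crude extraction of triple patterns from a SPARQL WHERE clause.
    Variables are normalized to '?v' so that structurally identical
    queries match regardless of variable naming (illustrative only)."""
    body = re.search(r"\{(.*)\}", query, re.S)
    if not body:
        return set()
    patterns = set()
    for part in body.group(1).split("."):
        tokens = part.split()
        if len(tokens) == 3:
            tokens = ["?v" if t.startswith("?") else t for t in tokens]
            patterns.add(tuple(tokens))
    return patterns

def jaccard(a, b):
    """Structural similarity as the overlap of normalized triple patterns."""
    pa, pb = triple_patterns(a), triple_patterns(b)
    if not pa and not pb:
        return 1.0
    return len(pa & pb) / len(pa | pb)

q1 = "SELECT ?x WHERE { ?x rdf:type foaf:Person . ?x foaf:name ?n }"
q2 = "SELECT ?p WHERE { ?p rdf:type foaf:Person }"
```

Queries that share a pattern subset score above zero here, which is the kind of structural relationship that could, for instance, guide prefetching decisions.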
When I began developing a topic for my doctoral research, I found large-scale assessments quite impressive. TIMSS: more than 500,000 students tested. PISA: 180,000 students tested. I wanted to use this data base to gain insights for the design of instruction. Unfortunately, I did not get far with that. The deeper I dug into the tests and the theories behind them, the clearer it became that these tests cannot generate new insight. Almost all conclusions drawn from the tests could not actually be derived from the tests themselves. I increasingly concentrated on the test items, because the validity of a test's claims is produced at the level of the item: the item is where everything that the testers construct as "mathematical proficiency" crystallizes. The student, in turn, has only the item in front of him. There is only "solved" (one point) and "unsolved" (no point). To receive the point, the student must tick the right box, or he must write down something for which the rater awards a point. The dissertation examines what the items actually test, that is, everything that flows into the construct of "mathematical proficiency", and whether this is what the test is supposed to test. Some of the findings were quite astonishing:
- Often there are so many ways to arrive at the desired solution (which is not in every case the correct solution) that one cannot say which ability an item actually measures. The construct of "mathematical proficiency" thus becomes an arbitrary one.
- Components of test-taking ability are measured along with it: many items contain irritations that test-experienced students overcome more easily than test-inexperienced ones. There are items that can be solved without possessing the ability that is supposed to be tested. Conversely, there are items that one may fail to solve even though one does possess this ability. The core competence of test-taking ability turns out to be taking seriously neither the mathematical problem posed nor the supposedly real-world problems, and instead concentrating on what the testers want to see ticked or written down. In principle, it proves advantageous to work at a mediocre level, that is, to forgo intellectual depth in engaging with the items.
- One can guess on multiple-choice tests. The PISA group claims to be able to overcome this problem technically, but this turns out to be a misjudgment.
- For both TIMSS and PISA, it turns out that the didactic and psychological theories ostensibly employed are merely theoretical cloaks for a theory-poor item construction. This is worked out in detail using the example of the theory of mental situation models (for processing realistic items); the problem recurs in other theoretical fields. The tests are not created by operationalizing measurement constructs but by systematically piecing items together.
- PISA was supposed to test "mathematical literacy". In short, this was to be the ability "to identify and understand the role that mathematics plays in the world, to make well-founded mathematical judgements, and to engage with mathematics in ways that meet the needs of an individual's current and future life as a constructive, engaged and reflective citizen" (PISA self-description). Given the items, none of this applies.
- In examining the PISA test, a habitus of mathematics education emerged that demanded a separate investigation. I have summarized it under the heading of a "turning away from the subject matter". It is characterized by a destruction of the mathematical combined with an overemphasis on technical vocabulary, and by distortions of both the mathematical and the real in realistic items. The latter is rooted in a disregard for the authenticity of the real as well as of the mathematical.
In addition to the investigations of TIMSS and PISA, the thesis contains an extensive chapter on the problem of testing and an account of the methodology and practice of Objective Hermeneutics.
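The point about guessing on multiple-choice items is easy to make concrete. Under random guessing on items with k answer options, the expected raw score on n items is n/k; the sketch below (with invented numbers, 30 items with four options each) checks this with a small simulation.

```python
import random

def expected_guess_score(n_items, n_options):
    """Expected raw score when answering every multiple-choice item
    purely at random (uniform over the options)."""
    return n_items / n_options

def simulate_guessing(n_items, n_options, n_students, seed=0):
    """Monte Carlo check: mean score of students who only guess.
    Option 0 is taken to be the keyed answer of every item."""
    rng = random.Random(seed)
    scores = [sum(rng.randrange(n_options) == 0 for _ in range(n_items))
              for _ in range(n_students)]
    return sum(scores) / n_students
```

With 30 four-option items a purely guessing student averages 7.5 points, a floor that any scoring model has to account for.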
The aim of this work was to investigate in detail the nitrogen and phosphorus processes in the northeastern German lowlands and to identify land-use options for the sustainable management of nitrogen and phosphorus inputs into rivers. As a prerequisite for modeling the nutrient balance, the hydrological processes and discharges first had to be validated for the catchments. The ecohydrological model SWIM was used for this purpose. Discharge modeling covered the period 1991-2000. The results show that SWIM was able to reproduce the hydrological processes in the study areas adequately. Based on the water balance modeling, nutrient turnover processes were simulated with SWIM for the period 1996-2000. To account in particular for lowland process dynamics, SWIM had to be extended by an ammonium pool and its turnover processes. In addition, the nutrient leaching routine was extended so that, besides nitrate, ammonium and phosphate can also be transported through the entire soil profile and routed to the catchment outlet via the discharge components. With these model extensions, the nitrogen and phosphorus processes in the study areas could be reproduced well. The validated model then enabled further applications. Nutrient simulations for the period 1981 to 2000 served to investigate the decreasing trend in the nutrient concentrations of the Nuthe river. The results clearly show that concentrations decreased after 1990 mainly because of the reduction of inputs from point sources and sewage farms. Further model runs on the origin of the nutrients showed that nitrate stems predominantly from diffuse sources, whereas ammonium and phosphate stem from point sources.
The parameters describing land use and management and the rooting depth of plants proved particularly sensitive for the model results. Finally, various land-use scenarios were applied. The scenario results show that almost all of the prescribed land-use scenarios led to a reduction of nitrogen and phosphorus emissions. Scenarios that take into account all relevant targets and recommendations for resource protection show the largest changes.
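The model extension described above, which lets ammonium and phosphate as well as nitrate percolate through the entire soil profile, can be caricatured by a toy layer-by-layer mass balance. This is purely illustrative: the mobilization constant `kd` and the layer structure are invented and do not reflect SWIM's actual formulation.

```python
def leach_profile(layers, percolation, kd=0.2):
    """One time step of a toy vertical nutrient-leaching routine.

    layers      -- dissolved nutrient store per soil layer (kg/ha)
    percolation -- water percolating out of each layer (mm)
    kd          -- assumed fraction of a layer's store mobilized per
                   100 mm of percolating water (hypothetical constant)

    Returns the updated layer stores and the load leaving the bottom
    of the profile, i.e. the amount routed towards the outlet.
    """
    new_layers = []
    incoming = 0.0  # load entering the current layer from above
    for store, perc in zip(layers, percolation):
        store += incoming
        mobilized = min(store, store * kd * perc / 100.0)
        new_layers.append(store - mobilized)
        incoming = mobilized
    return new_layers, incoming
```

For example, with stores of 10, 5, and 2 kg/ha and 50 mm percolating through each layer, a small fraction of the topsoil store cascades downward and a residual load leaves the profile, which is the qualitative behavior the extended routine has to capture for all three nutrient species.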
User-centered design processes are the first choice when new interactive systems or services are developed to address real customer needs and provide a good user experience. Common tools for collecting user research data, conducting brainstorming sessions, or sketching ideas are whiteboards and sticky notes. They are ubiquitously available, and no technical or domain knowledge is necessary to use them. However, traditional pen-and-paper tools fall short when it comes to saving content and sharing it with others who cannot be in the same location. They also lack digital advantages such as searching or sorting content. Although research on digital whiteboard and sticky-note applications has been conducted for over 20 years, these tools are not widely adopted in company contexts. While many research prototypes exist, they have not been used for extended periods of time in real-world contexts. The goal of this thesis is to investigate the enablers of, and obstacles to, the adoption of digital whiteboard systems. As an instrument for different studies, we developed the Tele-Board software system for collaborative creative work. Based on interviews, observations, and findings from earlier research, we sought to transfer the analog way of working to the digital world. Being a software system, Tele-Board can be used with a variety of hardware and does not depend on special devices. This became one of the main factors for adoption on a larger scale. In this thesis, I will present three studies on the use of Tele-Board with different user groups and foci. I will use a combination of research methods (laboratory case studies and data from field research) with the overall goal of finding out when a digital whiteboard system is used and in which cases it is not. Not surprisingly, the system is used and accepted if a user sees a key benefit that neither analog tools nor other applications can offer.
However, I found that these perceived benefits differ greatly between users and usage contexts. If a tool can be used in different ways and with different equipment, the chances of its adoption by a larger group increase. Tele-Board has now been in use for over 1.5 years in a global IT company in at least five countries, with a constantly growing user base. Its use, advantages, and disadvantages will be described based on 42 interviews and usage statistics from server logs. Through these insights and the findings from laboratory case studies, I will present a detailed analysis of digital whiteboard use in different contexts, with design implications for future systems.
Within the course of this thesis, I have investigated the complex interplay between electron and lattice dynamics in nanostructures of perovskite oxides. Femtosecond hard X-ray pulses were utilized to directly probe the evolution of atomic rearrangements driven by ultrafast optical excitation of electrons. The physics of complex materials with a large number of degrees of freedom can be interpreted once the exact fingerprint of ultrafast lattice dynamics in time-resolved X-ray diffraction experiments is well known for a simple model system. The motion of atoms in a crystal can be probed directly and in real time by femtosecond pulses of hard X-ray radiation in a pump-probe scheme. In order to provide such ultrashort X-ray pulses, I set up a laser-driven plasma X-ray source. The setup was extended by a stable goniometer, a two-dimensional X-ray detector, and a cryogen-free cryostat. The data acquisition routines of the diffractometer for these ultrafast X-ray diffraction experiments were further improved in terms of signal-to-noise ratio and angular resolution. The implementation of a high-speed reciprocal-space mapping technique allowed for two-dimensional structural analysis with femtosecond temporal resolution. I have studied the ultrafast lattice dynamics, namely the excitation and propagation of coherent phonons, in photoexcited thin films and superlattice structures of the metallic perovskite SrRuO3. Due to the quasi-instantaneous coupling of the lattice to the optically excited electrons in this material, a spatially and temporally well-defined thermal stress profile is generated in SrRuO3. This makes it possible to understand the resulting coherent lattice dynamics in time-resolved X-ray diffraction data in great detail, e.g. the appearance of a transient Bragg peak splitting in both thin films and superlattice structures of SrRuO3.
In addition, a comprehensive simulation toolbox to calculate the ultrafast lattice dynamics and the resulting X-ray diffraction response in photoexcited one-dimensional crystalline structures was developed in this thesis work. With the powerful experimental and theoretical framework at hand, I have studied the excitation and propagation of coherent phonons in more complex material systems. In particular, I have revealed strongly localized charge carriers after above-bandgap femtosecond photoexcitation of the prototypical multiferroic BiFeO3, which are the origin of a quasi-instantaneous and spatially inhomogeneous stress that drives coherent phonons in a thin film of the multiferroic. In a structurally imperfect thin film of the ferroelectric Pb(Zr0.2Ti0.8)O3, the ultrafast reciprocal-space mapping technique was applied to follow a purely strain-induced change of mosaicity on a picosecond time scale. These results point to a strong coupling of in- and out-of-plane atomic motion exclusively mediated by structural defects.
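The transient Bragg peak shifts underlying such measurements follow directly from Bragg's law: a photoinduced strain changes the lattice spacing and therefore the diffraction angle. A minimal numerical sketch with generic values (Cu K-alpha wavelength, an approximate SrRuO3 pseudocubic spacing; the 0.1% strain is an assumed illustrative number, not a result from this thesis):

```python
import math

lam = 1.5406e-10   # Cu K-alpha wavelength in m (typical for plasma sources)
d0 = 3.93e-10      # approximate SrRuO3 pseudocubic lattice spacing in m
eta = 1e-3         # assumed photoinduced strain of 0.1 %

# Bragg's law: lam = 2 d sin(theta)
theta0 = math.asin(lam / (2 * d0))          # unexcited Bragg angle
d1 = d0 * (1 + eta)                         # expanded lattice spacing
theta1 = math.asin(lam / (2 * d1))          # shifted Bragg angle
dtheta = theta1 - theta0                    # peak shift in radians

# small-strain approximation: delta_theta ~= -eta * tan(theta0)
approx = -eta * math.tan(theta0)
```

An expansive strain shifts the Bragg peak to smaller angles; when excited and unexcited regions of a film coexist, two such peak positions appear simultaneously, which is one way a transient peak splitting can arise.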
Pronoun resolution normally takes place without conscious effort or awareness, yet the processes behind it are far from straightforward. A large number of cues and constraints have previously been recognised as playing a role in the identification and integration of potential antecedents, yet there is considerable debate over how these operate within the resolution process. The aim of this thesis is to investigate how the parser handles multiple antecedents, in order to understand more about how certain information sources are used during pronoun resolution. I consider how both structural information and information provided by the prior discourse are used during online processing. This is investigated through several eye-tracking experiments on reading, complemented by a number of offline questionnaire experiments. I begin by considering how condition B of the Binding Theory (Chomsky 1981; 1986) has been captured in pronoun processing models; some researchers have claimed that processing is faithful to syntactic constraints from the beginning of the search (e.g. Nicol and Swinney 1989), while others have claimed that potential antecedents which are ruled out on structural grounds nonetheless affect processing, because the parser must also pay attention to a potential antecedent's features (e.g. Badecker and Straub 2002). My experimental findings demonstrate that the parser is sensitive to subtle changes in syntactic configuration which either allow or disallow pronoun reference to a local antecedent, and indicate that the parser is normally faithful to condition B at all stages of processing. Secondly, I test the Primitives of Binding hypothesis proposed by Koornneef (2008), based on work by Reuland (2001), a modular approach to pronoun resolution in which variable binding (a semantic relationship between pronoun and antecedent) takes place before coreference.
I demonstrate that a variable-binding (VB) antecedent is not systematically considered earlier than a coreference (CR) antecedent online. I then explore whether these findings could be attributed to the linear order of the antecedents, and uncover a robust recency preference both online and offline. I consider what role the factor of recency plays in pronoun resolution and how it can be reconciled with the first-mention advantage (Gernsbacher and Hargreaves 1988; Arnold 2001; Arnold et al. 2007). Finally, I investigate how aspects of the prior discourse affect pronoun resolution. Prior discourse status clearly had an effect on pronoun resolution, but an antecedent's appearance in the previous context was not always facilitative; I propose that this is due to the number of topic switches that a reader must make, leading to a lack of discourse coherence which has a detrimental effect on pronoun resolution. The sensitivity of the parser to structural cues does not entail that cue types can be neatly separated into distinct sequential stages, and I therefore propose that the parser is structurally sensitive but not modular. Aspects of pronoun resolution can be captured within a parallel constraints model; however, such a model should be sensitive to the activation of potential antecedents based on discourse factors, and structural cues should be strongly weighted.
Adenylates are metabolites with essential functions in metabolism and signaling in all living organisms. As cofactors, they enable thermodynamically unfavorable reactions to be catalyzed enzymatically within cells. Outside the cell, adenylates are involved in signaling processes in animals, and emerging evidence suggests similar signaling mechanisms in the plant apoplast. Presumably, apoplastic apyrases are involved in this signaling by hydrolyzing the signal-mediating molecules ATP and ADP to AMP. This PhD thesis focused on the role of adenylates in the metabolism and development of potato (Solanum tuberosum), using reverse genetics and biochemical approaches. To study the short- and long-term effects of cellular ATP and the adenylate energy charge on potato tuber metabolism, an apyrase from Escherichia coli targeted to the amyloplast was expressed inducibly and constitutively. Both approaches led to the identification of adaptations to reduced ATP/energy charge levels at the molecular and developmental levels. These comprised a reduction of metabolites and pathway fluxes that require significant amounts of ATP, such as amino acid or starch synthesis, and an activation of processes that produce ATP, such as respiration, as well as an immense increase in the surface-to-volume ratio. To identify extracellular enzymes involved in adenylate conversion, green fluorescent protein and activity localization studies in potato tissue were carried out. It was found that extracellular ATP is imported into the cell by an apoplastic enzyme complement consisting of apyrase, unspecific phosphatase, adenosine nucleosidase and an adenine transport system. By changing the expression of a potato-specific apyrase via transgenic approaches, it was found that this enzyme has a strong impact on plant development, and in particular on tuber development, in potato.
Whereas metabolite levels were hardly altered, transcript profiling of tubers with reduced apyrase activity revealed a significant upregulation of genes coding for extensins, which are associated with polar growth. The results are discussed in the context of adaptive responses of plants to changes in adenylate levels and the proposed role of apyrase in apoplastic purinergic signaling and ATP salvaging. In summary, this thesis provides insight into adenylate-regulated processes within and outside non-photosynthetic plant cells.
The Italian Army’s participation in Hitler’s war against the Soviet Union has remained unrecognized and understudied. Bastian Matteo Scianna offers a wide-ranging, in-depth corrective. Mining Italian, German and Russian sources, he examines the history of the Italian campaign in the East between 1941 and 1943, as well as how the campaign was remembered and memorialized in the domestic and international arena during the Cold War. Linking operational military history with memory studies, this book revises our understanding of the Italian Army in the Second World War.
Forests are a key resource serving a multitude of functions such as providing income to forest owners, supplying industries with timber, protecting water resources, and maintaining biodiversity. Recently, much attention has been given to the role of forests in the global carbon cycle and their management for increased carbon sequestration as a possible mitigation option against climate change. Furthermore, the use of harvested wood can contribute to the reduction of atmospheric carbon through (i) carbon sequestration in wood products, (ii) the substitution of non-wood products with wood products, and (iii) the use of wood as a biofuel to replace fossil fuels. Forest resource managers are challenged by the task of balancing these multiple functions while simultaneously meeting economic requirements and taking into consideration the demands of stakeholder groups. Additionally, risks and uncertainties with regard to uncontrollable external variables such as climate have to be considered in the decision-making process. In this study a scientific stakeholder dialogue with forest-related stakeholder groups in the Federal State of Brandenburg was conducted. The main results of this dialogue were the definition of major forest functions (carbon sequestration, groundwater recharge, biodiversity, and timber production) and priority setting among them by the stakeholders using the pair-wise comparison technique. The impact of different forest management strategies and climate change scenarios on the main functions of forest ecosystems was evaluated at the Kleinsee management unit in south-east Brandenburg. Forest management strategies were simulated over 100 years using the forest growth model 4C and a wood product model (WPM). A current climate scenario and two climate change scenarios based on the global circulation models (GCMs) HadCM2 and ECHAM4 were applied. The climate change scenarios positively influenced stand productivity, carbon sequestration, and income.
The impact on the other forest functions was small. Furthermore, the overall utility of the forest management strategies was compared under the priority settings of the stakeholders by a multi-criteria analysis (MCA) method. Significant differences in priority setting and the choice of an adequate management strategy were found between the environmentalists on one side and the more economy-oriented forest managers of publicly and privately owned forests on the other. From an ecological perspective, a conservation strategy would be preferable under all climate scenarios, but the business-as-usual management would also fit the expectations under the current climate. In contrast, a forest manager in publicly owned forests or a private forest owner would prefer a management strategy with an intermediate thinning intensity and a high share of pine stands to enhance income from timber production while maintaining the other forest functions. The analysis served as an example for the combined application of simulation tools and an MCA method for the evaluation of management strategies under multi-purpose and multi-user settings with changing climatic conditions. A further focus was on quantifying the overall effect of forest management on carbon sequestration in the forest sector and the wood industry sector plus substitution effects. To achieve this objective, the carbon emission reduction potential of material and energy substitution (Smat and Sen) was estimated based on a literature review. On average, for each tonne of dry wood used in a wood product substituting a non-wood product, 0.71 fewer tonnes of fossil carbon are emitted into the atmosphere. Based on Smat and Sen, the calculation of the carbon emission reduction through substitution was implemented in the WPM.
Carbon sequestration and substitution effects of management strategies were simulated at three spatial scales using the WPM and the forest growth models 4C (management unit level) or EFISCEN (federal state of Brandenburg and Germany). The influence of uncertainties in the initialisation of the WPM, in Smat, and in the basic conditions of the wood product sector on carbon sequestration was investigated. Results showed that carbon sequestration in the wood industry sector plus substitution effects exceeded sequestration in the forest sector. In contrast to the carbon pools in the forest sector, which acted as sinks or sources, the substitution effects continually reduced carbon emissions as long as forests are managed and timber is harvested. The main climate protection effect was found for energy substitution, which accounted for about half of the total carbon sequestration, followed by carbon storage in landfills. In Germany, the absolute annual carbon sequestration in the forest and wood industry sector plus substitution effects was 19.9 Mt C. Over 50 years the wood industry sector contributed 70% of the total carbon sequestration plus substitution effects.
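As a back-of-envelope illustration, the material-substitution factor quoted above (0.71 t of fossil carbon avoided per tonne of dry wood replacing a non-wood product) reduces to a one-line calculation. The wood quantity below is a made-up example value, and the energy-substitution factor Sen, which is not quantified in the abstract, is omitted.

```python
# Illustrative use of the reported material-substitution factor Smat.
# Smat = 0.71 t fossil C avoided per t dry wood (from the literature review
# summarised above); the 1000 t input is an arbitrary example.

S_MAT = 0.71

def avoided_emissions(dry_wood_tonnes, s=S_MAT):
    """Fossil carbon (t C) avoided by material substitution."""
    return dry_wood_tonnes * s

# e.g. 1000 t of dry wood replacing non-wood products avoids ~710 t C:
print(avoided_emissions(1000))
```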
In semi-arid savannah ecosystems, the vegetation structure and composition, i.e. the architecture of trees, shrubs, grass tussocks and herbaceous plants, offer a great variety of habitats and niches to sustain animal diversity. In recent decades, intensive human land use practices such as livestock farming have altered the vegetation in savannah ecosystems worldwide. Extensive grazing leads to a reduction of the perennial and herbaceous vegetation cover, which results in an increased availability of bare soil. Both the reduced competition with perennial grasses and the increase in bare soil favour shrub establishment on open ground and lead to area-wide shrub encroachment. As a consequence of the altered vegetation structure and composition, the structural diversity declines. It has been shown that with decreasing structural diversity, animal diversity declines across a variety of taxa. However, knowledge of the effects of overgrazing on reptiles, which are an important part of the ecosystem, is missing. Furthermore, the impact of habitat degradation on a species' population dynamics and life history, e.g. birth rate, survival rate, predation risk, space requirements or behavioural adaptations, is poorly known. Therefore, I first investigated the impact of overgrazing on the reptile community in the southern Kalahari. Second, I analysed the population dynamics and the behaviour of the Spotted Sand Lizard, Pedioplanis l. lineoocellata. All four chapters clearly demonstrate that habitat degradation caused by overgrazing had a severe negative impact upon (i) the reptile community as a whole and (ii) population parameters of Pedioplanis l. lineoocellata. Chapter one showed a significant decline of regional reptile diversity and abundance in degraded habitats. In chapter two I demonstrated that P. lineoocellata moves more frequently, spends more time moving and covers larger distances in degraded than in non-degraded habitats.
In addition, home range size of the lizard species increases in degraded habitats, as shown in chapter three. Finally, chapter four showed the negative impacts of overgrazing on several population parameters of P. lineoocellata. Absolute population size of adult and juvenile lizards, survival rate and birth rate are significantly lower in degraded habitats. Furthermore, the predation risk was greatly increased in degraded habitats. A combination of several factors can explain the negative impact of habitat degradation on reptiles. First, reduced prey availability negatively affects survival rate, birth rate and overall abundance. Second, the loss of perennial plant cover leads to a loss of niches and to a reduction of opportunities to thermoregulate. Furthermore, a loss of cover is associated with increased predation risk. A major finding of my thesis is that the lizard P. lineoocellata can alter its foraging strategy. Species that are able to adapt and change their behaviour, such as P. lineoocellata, can effectively buffer against changes in their environment. Furthermore, perennial grass cover can be seen as a crucial ecological component of the vegetation in the semi-arid savannah system of the southern Kalahari. If perennial grass cover is reduced beyond a certain degree, reptile diversity will decline and most other aspects of reptile life history will be negatively influenced. Savannah systems are characterised by a mixture of trees, shrubs and perennial grasses. These three vegetation components determine the composition and structure of the vegetation and accordingly influence the faunal diversity. Trees are viewed as keystone structures and focal points of animal activity for a variety of species. Trees supply animals with shelter, shade and food and act as safe sites, nesting sites, observation posts and foraging sites. Recent research demonstrates a positive influence of shrub patches on animal diversity.
Moreover, it would seem that intermediate shrub cover can also sustain viable populations in savannah landscapes, as has been demonstrated for small carnivores and rodent species. The influence of perennial grasses on faunal diversity has not received the same attention as that of trees and shrubs. In my thesis I did not explicitly measure the direct effects of perennial grasses, but my results strongly imply that they play an important role. My results suggest that a significant depletion of perennial grass cover will negatively influence reptile diversity, reptile abundance and several population parameters of P. lineoocellata. Perennial grass cover is associated with the highest prey abundance, reptile diversity and reptile abundance. It provides reptiles with both a refuge from predators and opportunities to optimise thermoregulation. The relevance of each of the three vegetation structural elements differs between taxa and species. In conclusion, all three major vegetation structures in the savannah system are important for faunal diversity.
This thesis aims to quantify the human impact on the natural resource water at the landscape scale. The drivers in the federal state of Brandenburg (Germany), the area under investigation, are land-use changes induced by policy decisions at European and federal state level. The water resources of the federal state are particularly sensitive to changes in land-use due to low precipitation rates in the summer combined with sandy soils and high evapotranspiration rates. Key elements in landscape hydrology are forests because of their unique capacity to transport water from the soil to the atmosphere. Given these circumstances, decisions made at any level of administration that may have effects on the forest sector in the state are critical in relation to the water cycle. It is therefore essential to evaluate any decision that may change forest area and structure in such a sensitive region. Thus, as a first step, it was necessary to develop and implement a model able to simulate possible interactions and feedbacks between forested surfaces and the hydrological cycle at the landscape scale. The result is a model for simulating the hydrological properties of forest stands based on a robust computation of the temporal and spatial LAI (leaf area index) dynamics. The approach allows the simulation of all relevant hydrological processes with a low parameter demand. It includes the interception of precipitation and transpiration of forest stands with and without groundwater in the rooting zone. The model also considers phenology, biomass allocation, as well as mortality and simple management practices. It has been implemented as a module in the eco-hydrological model SWIM (Soil and Water Integrated Model). This model has been tested in two pre-studies to verify the applicability of its hydrological process description for the hydrological conditions typical for the state. 
The newly implemented forest module has been tested for Scots Pine (Pinus sylvestris) and in parts for Common Oak (Quercus robur and Q. petraea) in Brandenburg. For Scots Pine the results demonstrate a good simulation of annual biomass increase and LAI in addition to a satisfactory simulation of litter production. A comparison of the simulated and measured data of the May sprout for Scots Pine and leaf unfolding for Oak, as well as the evaluation against daily transpiration measurements for Scots Pine, supports the applicability of the approach. The interception of precipitation has also been simulated and compared with weekly observed data for a Scots Pine stand, displaying satisfactory results for both the vegetation periods and the annual sums. After the development and testing phase, the model is used to analyse the effects of two scenarios. The first scenario is an increase in forest area on abandoned agricultural land, triggered by a decrease in European agricultural production support. The second is a shift in species composition from predominant Scots Pine to Common Oak, based on decisions of the regional forestry authority to support a more natural species composition. The scenario effects are modelled for the federal state of Brandenburg on a 50 m grid utilising spatially explicit land-use patterns. The results for the first scenario suggest a negative impact of an increase in forest area (9.4% of the total state area) on the regional water balance, causing an increase in mean long-term annual evapotranspiration of 3.7% at 100% afforestation when compared to no afforestation. The relatively small annual change conceals a much more pronounced seasonal effect: a mean long-term evapotranspiration increase of 25.1% in spring, causing a pronounced reduction in groundwater recharge and runoff. The reduction causes a lag effect that aggravates the scarcity of water resources in the summer.
In contrast, in the second scenario, a change in species composition in existing forests (29.2% of the total state area) from predominantly Scots Pine to Common Oak decreases the long-term annual mean evapotranspiration by 3.4%, accompanied by a much weaker, but apparent, seasonal pattern. Both scenarios exhibit a high spatial heterogeneity because of the distinct natural conditions in the different regions of the state. Areas with groundwater levels near the surface are particularly sensitive to changes in forest area, and regions with a relatively high proportion of forest respond strongly to the change in species composition. In both cases this regional response is masked by a smaller linear mean effect for the total state area. Two critical sources of uncertainty in the model results have been investigated. The first originates from the model calibration parameters estimated in the pre-study for lowland regions such as the federal state. The combined effect of the parameters, when changed within their physically meaningful limits, reveals an overestimation of the mean water balance by 1.6%. However, the distribution has a wide spread, with 14.7% for the 90th percentile and -9.9% for the 10th percentile. The second source of uncertainty emerges from the parameterisation of the forest module. The analysis exhibits a standard deviation of 0.6% over a ten-year period in the mean of the simulated evapotranspiration as a result of variance in the key forest parameters. The analysis suggests that the combined uncertainty in the model results is dominated by the uncertainties of the calibration parameters. Therefore, the effect of the first scenario might be underestimated because the calculated increase in evapotranspiration is too small. This may lead to an overestimation of the water balance towards runoff and groundwater recharge. The opposite can be assumed for the second scenario, in which the decrease in evapotranspiration might be overestimated.
The correction of software failures tends to be very cost-intensive because their debugging is an often time-consuming development activity. During this activity, developers largely attempt to understand what causes failures: Starting with a test case that reproduces the observable failure they have to follow failure causes on the infection chain back to the root cause (defect). This idealized procedure requires deep knowledge of the system and its behavior because failures and defects can be far apart from each other. Unfortunately, common debugging tools are inadequate for systematically investigating such infection chains in detail. Thus, developers have to rely primarily on their intuition and the localization of failure causes is not time-efficient. To prevent debugging by disorganized trial and error, experienced developers apply the scientific method and its systematic hypothesis-testing. However, even when using the scientific method, the search for failure causes can still be a laborious task. First, lacking expertise about the system makes it hard to understand incorrect behavior and to create reasonable hypotheses. Second, contemporary debugging approaches provide no or only partial support for the scientific method. In this dissertation, we present test-driven fault navigation as a debugging guide for localizing reproducible failures with the scientific method. Based on the analysis of passing and failing test cases, we reveal anomalies and integrate them into a breadth-first search that leads developers to defects. This systematic search consists of four specific navigation techniques that together support the creation, evaluation, and refinement of failure cause hypotheses for the scientific method. First, structure navigation localizes suspicious system parts and restricts the initial search space. Second, team navigation recommends experienced developers for helping with failures. 
Third, behavior navigation allows developers to follow emphasized infection chains back to root causes. Fourth, state navigation identifies corrupted state and reveals parts of the infection chain automatically. We implement test-driven fault navigation in our Path Tools framework for the Squeak/Smalltalk development environment and limit its computation cost with the help of our incremental dynamic analysis. This lightweight dynamic analysis ensures an immediate debugging experience with our tools by splitting the run-time overhead over multiple test runs depending on developers’ needs. Hence, our test-driven fault navigation in combination with our incremental dynamic analysis answers important questions in a short time: where to start debugging, who understands failure causes best, what happened before failures, and which state properties are infected.
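The anomaly analysis described above builds on comparing which parts of the system are exercised by passing versus failing tests. The following generic spectrum-based suspiciousness ranking is a common textbook stand-in for such analyses; it is an assumption for illustration, not the actual metric used by the Path Tools framework.

```python
# Generic spectrum-based fault localization sketch (illustrative only;
# the thesis' Path Tools may compute anomalies differently).

def suspiciousness(coverage, results):
    """Rank methods by how strongly they correlate with failing tests.

    coverage: {test_name: set of methods the test covers}
    results:  {test_name: True if the test passed, False if it failed}
    """
    total_fail = sum(1 for ok in results.values() if not ok) or 1
    total_pass = sum(1 for ok in results.values() if ok) or 1
    methods = set().union(*coverage.values())
    scores = {}
    for m in methods:
        fail = sum(1 for t, cov in coverage.items() if m in cov and not results[t])
        passed = sum(1 for t, cov in coverage.items() if m in cov and results[t])
        fail_ratio, pass_ratio = fail / total_fail, passed / total_pass
        # Higher score: covered mostly by failing tests.
        scores[m] = fail_ratio / ((fail_ratio + pass_ratio) or 1)
    return sorted(scores, key=scores.get, reverse=True)

cov = {'t1': {'parse', 'render'}, 't2': {'parse', 'save'}, 't3': {'render'}}
res = {'t1': False, 't2': True, 't3': True}
print(suspiciousness(cov, res))  # methods covered by the failing test rank first
```

Such a ranking restricts the initial search space, as structure navigation does, by pointing the developer at suspicious system parts before hypothesis testing begins.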
In recent years, steadily growing production capacities for bioplastics from renewable resources have been recorded. Despite large production capacities and a suitable property profile, starch is used only as hydrophilic thermoplastic starch (TPS) processed with plasticizers, in the form of blends with, e.g., polyesters. The same applies to protein-based plastics. The present work aims at the development of starch-based bioplastics that are thermoplastically processable without external plasticizers, are hydrophobic, and exhibit a mechanical property profile with potential for the production of packaging materials. To broaden the raw material base for bioplastics, the concept developed here was also to be transferred to two industrially available protein types, zein and whey protein isolate (WPI). Fatty acid esters of starch were identified as a suitable material class. First, esterification with acid chlorides was compared with transesterification of fatty acid vinyl esters, from which the latter emerged as the more suitable method. By varying the reaction parameters, the transesterification was optimized and applied to a series of fatty acid vinyl esters from butanoate to stearate, reaching DS values of up to 2.2-2.6. This enabled a systematic study varying both the esterified fatty acid and the degree of substitution (DS). All products with a DS of 1.5 or higher showed pronounced solubility in organic solvents, which made possible both the recording of NMR spectra and the determination of molar masses by size exclusion chromatography coupled with multi-angle laser light scattering (GPC-MALLS). Dynamic light scattering (DLS) was used to illustrate the solubility behavior.
All products could be processed into films, with materials of DS 1.5-1.7 exhibiting high tensile strengths (up to 42 MPa) and elastic moduli (up to 1390 MPa). In particular, starch hexanoate with DS <2 and starch butanoate with DS >2 showed a mechanical property profile that, especially with regard to strength and stiffness, was comparable to packaging materials such as polyethylene (tensile strength: 15-32 MPa, elastic modulus: 300-1300 MPa). Tensile strength and elastic modulus decreased with increasing chain length of the esterified fatty acid; esters of longer-chain fatty acids (C16-C18) were brittle. Using wide-angle X-ray scattering (WAXS) and infrared spectroscopy (ATR-FTIR), the trend in strength could be attributed to an increasing distance between the starch chains in the material. Glass transitions dependent on DS and chain length were detected, and the crystalline structures of the long-chain fatty acids showed a melting peak. The hydrophobicity of the films was demonstrated by contact angles against water of >95°. Blends with bio-based polyterpenes, as well as with the zein acyl derivatives prepared in this work, enabled a further improvement of the tensile strength and elastic modulus of highly substituted products. Thermoplastic processing by injection molding was possible for products with both high and medium DS values without any addition of plasticizers, yielding homogeneous, transparent test bars. Hardness measurements again gave values comparable to polyethylene for starch hexanoate and butanoate. Selected products were processed into fibers by melt spinning. Homogeneous fibers were obtained, particularly for highly substituted derivatives, which showed significantly higher tensile strengths than the corresponding cast films. Starch esters with medium DS could also be processed.
To transfer the concept to the proteins zein and WPI, different synthesis methods were first compared. Esterification with acid chlorides gave the highest values. With regard to good solubility in organic solvents, esterification with carbonyldiimidazole (CDI)-activated fatty acids in DMSO was preferred for WPI, and esterification with acid chlorides in pyridine for zein. It turned out that acylated WPI, although hydrophobic, could not be processed thermoplastically without plasticizers, and cast films showed brittle fracture behavior. With the addition of bio-based oleic acid, the use of acylated WPI as a thermoplastic filler, e.g. in blends with starch esters, was demonstrated. In contrast, acylated zein showed glass transitions <100 °C combined with sufficient stability (150-200 °C). Zein oleate could be processed into a transparent cast film without plasticizers. All derivatives proved to be markedly hydrophobic. Zein oleate could also be processed into thermoplastic fibers by melt spinning.
For some years, obesity has been regarded as one of the most common chronic diseases of childhood and adolescence. Which factors lead to a successful treatment of obesity in childhood and adolescence, however, is still not sufficiently understood. An important, yet so far largely neglected, factor that may guide the course of therapy is the subjective illness concept of the affected children. The most significant theoretical model describing the influence of individual illness representations on a person's regulation process in dealing with disease is Howard Leventhal's Common Sense Model of Illness Representation (CSM). The aim of the present work was to assess the subjective illness concepts of obese children and to analyse their influence on the regulation process. In a first study, a questionnaire for assessing subjective illness concepts was developed using data from 168 obese children aged 8 to 12 years. The results indicate that the questionnaire can be considered reliable and valid. Using this questionnaire, it could be demonstrated that obese children hold constructs about their illness that are stored in distinct dimensions. The initial illness concepts of obese children identified here form a homogeneous picture consistent with expectations. In a second study, the subjective illness concepts of obese children, their coping strategies, and health- and illness-related criterion variables were then examined. Surveys took place before the start of inpatient rehabilitation (T1), at the end of rehabilitation (T2), and six months after the end of rehabilitation (T3). Data at all three measurement points are available for 107 children.
A relationship between illness concepts, coping strategies, and specific criterion variables in obese children could be demonstrated. The analysis of the underlying pathways showed that children's illness concepts, in addition to indirect influences via coping strategies, can above all also directly influence the criterion variables. The influence of the initial illness concepts of obese children was confirmed in both cross-sectional and longitudinal designs. In addition, manifold influences of changes in subjective illness concepts during therapy were found. Changes in illness concepts affect, in the medium term, the individual coping strategies at the end of rehabilitation and, in the longer term, the obesity-specific criterion variables weight, nutrition, physical activity, and quality of life. The findings strengthen the relevance and potential of targeted modification of adaptive and maladaptive illness concepts within inpatient therapy of childhood obesity. It was also confirmed that subjective illness concepts, and their change during therapy, can make a relevant contribution to predicting children's treatment success over a longer period.
I perform and analyse the first ever calculations of rotating stellar iron core collapse in {3+1} general relativity that start out with presupernova models from stellar evolutionary calculations and include a microphysical finite-temperature nuclear equation of state, an approximate scheme for electron capture during collapse, and neutrino pressure effects. Based on the results of these calculations, I obtain the most realistic estimates to date for the gravitational wave signal from collapse, bounce and the early postbounce phase of core collapse supernovae. I supplement my {3+1} GR hydrodynamic simulations with 2D Newtonian neutrino radiation-hydrodynamic supernova calculations focussing on (1) the late postbounce gravitational wave emission owing to convective overturn, anisotropic neutrino emission and protoneutron star pulsations, and (2) the gravitational wave signature of accretion-induced collapse of white dwarfs to neutron stars.
Semiclassical asymptotics for the scattering amplitude in the presence of focal points at infinity
(2006)
We consider scattering in $\R^n$, $n\ge 2$, described by the Schr\"odinger operator $P(h)=-h^2\Delta+V$, where $V$ is a short-range potential. With the aid of Maslov theory, we give a geometrical formula for the semiclassical asymptotics as $h\to 0$ of the scattering amplitude $f(\omega_-,\omega_+;\lambda,h)$ ($\omega_+\neq\omega_-$) which remains valid in the presence of focal points at infinity (caustics). Crucial for this analysis are precise estimates on the asymptotics of the classical phase trajectories and the relationship between caustics in Euclidean phase space and caustics at infinity.
Cloud computing is a model for enabling on-demand access to a shared pool of computing resources. With virtually limitless on-demand resources, a cloud environment enables a hosted Internet application to cope quickly with increases in its workload. However, the overhead of provisioning resources exposes the Internet application to periods of under-provisioning and performance degradation. Moreover, performance interference due to consolidation in the cloud environment complicates the performance management of Internet applications. In this dissertation, we propose two approaches to mitigate the impact of the resource provisioning overhead. The first approach employs control theory to scale resources vertically and cope quickly with the workload. This approach assumes that the provider has knowledge of and control over the platform running in the virtual machines (VMs), which limits it to Platform as a Service (PaaS) and Software as a Service (SaaS) providers. The second approach is a customer-side one that deals with horizontal scalability in an Infrastructure as a Service (IaaS) model. It addresses the trade-off between cost and performance with a multi-goal optimization solution. This approach finds the scale thresholds that achieve the highest performance with the lowest increase in cost. Moreover, the second approach employs a proposed time series forecasting algorithm to scale the application proactively and avoid under-utilization periods. Furthermore, to mitigate the interference impact on the Internet application performance, we developed a system which finds and eliminates the VMs suffering from performance interference. The developed system is a lightweight solution which does not require provider involvement. To evaluate our approaches and the designed algorithms at a large scale, we developed a simulator called ScaleSim.
In the simulator, we implemented scalability components that mimic the scalability components of Amazon EC2. The current scalability implementation in Amazon EC2 serves as a reference point for evaluating the improvement in scalable application performance. ScaleSim is fed with realistic models of the RUBiS benchmark extracted from a real environment. The workload is generated from the access logs of the 1998 World Cup website. The results show that optimizing the scalability thresholds and adopting proactive scalability can mitigate 88% of the impact of the resource-provisioning overhead, with only a 9% increase in cost.
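The combination of scaling thresholds and time-series forecasting described in this abstract can be sketched as follows. This is a minimal illustrative example, not the dissertation's algorithm: the function names, the moving-average forecaster, and all threshold and capacity values are assumptions chosen for clarity.

```python
# Hypothetical sketch of threshold-based proactive scaling:
# forecast the next interval's load with a simple moving average,
# then compare predicted utilization against scale-out/scale-in
# thresholds. All names and numbers are illustrative.

def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def decide_scaling(history, vms, scale_out=0.8, scale_in=0.3,
                   per_vm_capacity=100.0):
    """Return the new VM count based on forecast utilization."""
    forecast = moving_average_forecast(history)
    utilization = forecast / (vms * per_vm_capacity)
    if utilization > scale_out:
        return vms + 1   # provision ahead of the predicted spike
    if utilization < scale_in and vms > 1:
        return vms - 1   # release an under-utilized VM
    return vms

# Example: a rising request rate triggers a proactive scale-out,
# so the extra VM is booting before the spike actually arrives.
load_history = [150.0, 180.0, 210.0]     # requests/s in recent intervals
print(decide_scaling(load_history, vms=2))  # forecast 180 -> utilization 0.9 -> 3
```

Acting on the forecast rather than the current measurement is what hides the VM boot-up overhead; the threshold pair plays the role of the optimized scaling thresholds evaluated in ScaleSim.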
LCST-type synthetic thermoresponsive polymers can reversibly respond to certain stimuli in aqueous media with a massive change of their physical state. When fluorophores that are sensitive to such changes are incorporated into the polymeric structure, the response can be translated into a fluorescence signal. Based on this idea, this thesis presents sensing schemes that transduce the stimuli-induced variations in the solubility of polymer chains with covalently bound fluorophores into a well-detectable fluorescence output. Benefiting from the principles of different photophysical phenomena, i.e., fluorescence resonance energy transfer and solvatochromism, such fluorescent copolymers enabled the monitoring of stimuli such as solution temperature and ionic strength, but also of association/dissociation mechanisms with other macromolecules and of biochemical binding events, through remarkable changes in their fluorescence properties. For instance, an aqueous ratiometric dual sensor for temperature and salts was developed, relying on the delicate supramolecular assembly of a thermoresponsive copolymer with a thiophene-based conjugated polyelectrolyte. Alternatively, by taking advantage of the sensitivity of solvatochromic fluorophores, an increase in solution temperature or the presence of analytes was signaled as an enhancement of the fluorescence intensity. The simultaneous use of the sensitivity of the chains towards temperature and a specific antibody allowed the monitoring of more complex phenomena such as competitive binding of analytes. The use of different thermoresponsive polymers, namely poly(N-isopropylacrylamide) and poly(meth)acrylates bearing oligo(ethylene glycol) side chains, revealed that the responsive polymers differed widely in their ability to perform a particular sensing function.
To address questions regarding the impact of the chemical structure of the host polymer on the sensing performance, the macromolecular assembly behavior below and above the phase transition temperature was evaluated by a combination of fluorescence and light scattering methods. It was found that although the temperature-triggered changes in the macroscopic absorption characteristics were similar for these polymers, properties such as the degree of hydration or the extent of interchain aggregation differed substantially. Therefore, in addition to demonstrating strategies for fluorescence-based sensing with thermoresponsive polymers, this work highlights the influence of the chemical structure of the two popular thermoresponsive polymers on the fluorescence response. The results are fundamentally important for the rational choice of polymeric materials for a specific sensing strategy.
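The FRET-based readout exploited in this thesis rests on the steep distance dependence of the transfer efficiency between donor and acceptor. A minimal numerical sketch of the standard Förster relation follows; the function name and the distances used are illustrative assumptions, not values from the thesis.

```python
# Hedged sketch: Förster resonance energy transfer efficiency,
# E = 1 / (1 + (r/R0)^6), for donor-acceptor distance r and
# Förster radius R0 (same units). Distances are illustrative only.

def fret_efficiency(r, r0):
    """FRET efficiency for donor-acceptor distance r and Förster radius r0."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# A coil-to-globule collapse above the phase transition temperature
# shortens the average donor-acceptor distance, so the acceptor channel
# gains intensity at the donor's expense - the basis of a ratiometric signal.
print(fret_efficiency(6.0, 5.0))  # extended coil: lower efficiency
print(fret_efficiency(3.0, 5.0))  # collapsed globule: higher efficiency
```

Because the efficiency scales with the sixth power of distance, even the modest chain collapse at the LCST produces the "well-detectable fluorescence output" described above.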