This dissertation examines the integration of incongruent visual-scene and morphological-case information (“cues”) in building thematic-role representations of spoken relative clauses in German.
Addressing the mutual influence of visual and linguistic processing, the Coordinated Interplay Account (CIA) describes a two-step mechanism supporting visuo-linguistic integration (Knoeferle & Crocker, 2006, Cog Sci). However, the outcomes and dynamics of integrating incongruent thematic-role representations from distinct sources have scarcely been investigated. Further, there is evidence that both second-language (L2) and older speakers may rely on non-syntactic cues relatively more than first-language (L1)/young speakers. Yet, the role of visual information for thematic-role comprehension has not been measured in L2 speakers, and only to a limited extent across the adult lifespan.
Thematically unambiguous canonically ordered (subject-extracted) and noncanonically ordered (object-extracted) spoken relative clauses in German (see 1a-b) were presented in isolation and alongside visual scenes conveying either the same (congruent) or the opposite (incongruent) thematic relations as the sentence did.
(1) a. Das ist der Koch, der die Braut verfolgt.
       this is the.NOM cook who.NOM the.ACC bride follows
       'This is the cook who is following the bride.'
    b. Das ist der Koch, den die Braut verfolgt.
       this is the.NOM cook whom.ACC the.NOM bride follows
       'This is the cook whom the bride is following.'
The relative contribution of each cue to thematic-role representations was assessed with agent identification. Accuracy and latency data were collected post-sentence from a sample of L1 and L2 speakers (Zona & Felser, 2023), and from a sample of L1 speakers from across the adult lifespan (Zona & Reifegerste, under review). In addition, the moment-by-moment dynamics of thematic-role assignment were investigated with mouse tracking in a young L1 sample (Zona, under review).
The following questions were addressed: (1) How do visual scenes influence thematic-role representations of canonical and noncanonical sentences? (2) How does reliance on visual-scene, case, and word-order cues vary in L1 and L2 speakers? (3) How does reliance on visual-scene, case, and word-order cues change across the lifespan?
The results showed reliable effects of incongruence between visually and linguistically conveyed thematic relations on thematic-role representations. Incongruent (vs. congruent) scenes yielded slower and less accurate responses to agent-identification probes presented post-sentence. The most recently inspected agent was considered the most likely agent from ~300 ms after trial onset, and the convergence of visual scenes and word order enabled comprehenders to assign thematic roles predictively.
L2 (vs. L1) participants relied more on word order overall. In response to noncanonical clauses presented with incongruent visual scenes, sensitivity to case predicted the size of incongruence effects better than L1-L2 grouping. These results suggest that the individual’s ability to exploit specific cues might predict their weighting.
Sensitivity to case was stable throughout the lifespan, while visual effects increased with increasing age and were modulated by individual interference-inhibition levels. Thus, age-related changes in comprehension may stem from stronger reliance on visually (vs. linguistically) conveyed meaning.
These patterns represent evidence for a recent-role preference – i.e., a tendency to re-assign visually conveyed thematic roles to the same referents in temporally coordinated utterances. The findings (i) extend the generalizability of CIA predictions across stimuli, tasks, populations, and measures of interest, (ii) contribute to specifying the outcomes and mechanisms of detecting and indexing incongruent representations within the CIA, and (iii) speak to current efforts to understand the sources of variability in sentence comprehension.
Bildung:digital
(2024)
Have you already swiped, liked, or posted in bed this morning? Taken part in a video conference at work, used or programmed a database? Quickly paid with your smartphone in a shop on the way home, listened to podcasts, or renewed the loan of your library books? And in the evening, on the couch, filled out your tax return on ELSTER.de on the tablet, shopped online, or paid invoices before a streaming platform tempted you with a series?
Our lives are digitalized through and through. These changes make many things faster, easier, and more efficient. But keeping pace with them demands a lot from us, and by no means everyone succeeds. There are people who prefer to go to the bank to make a transfer, leave programming to the experts, send their tax return by mail, and use their smartphone only to make phone calls. They don't want to keep pace, or perhaps they can't: they never learned these things. Others, younger people, grow up as "digital natives" surrounded by digital devices, tools, and processes. But does that mean they really know how to use them? Or do they, too, need digital education?
But what does successful digital education actually look like? Does it teach us how to use a tablet, how to google properly, and how to write Excel spreadsheets? Perhaps it is about more than that: about understanding the comprehensive change that has been taking hold of our world ever since it began to be broken down into digital ones and zeros and rebuilt virtually. But how do we learn to live in a world of digitality – with all that it entails, and to our benefit?
For the new issue of "Portal Wissen", we looked around the University of Potsdam and asked what role the connection between digitalization and learning plays in the research of various disciplines. We spoke to Katharina Scheiter, Professor of Digital Education, about the future of German schools and had several experts show us examples of how digital tools can improve learning in schools as well as continuing education at work. Researchers from computer science and agricultural research demonstrated how even experienced farmers can still learn a great deal about their land and their work thanks to digital tools. We talked to educational researchers who use big data to analyze how boys and girls learn and where possible causes for differences may lie. Education and political scientist Nina Kolleck, in turn, looks at education against the backdrop of globalization and draws on the analysis of large amounts of social media data.
Of course, we do not lose sight of the diversity of research at the University of Potsdam: we put 33 questions to criminal law scholar Anna Albrecht, accompany a group of geoscientists to the Himalayas, and learn what alternatives to antibiotics may soon be available. This magazine also looks at stress and how it makes us ill, research into sustainable ore extraction, and new approaches to school development.
Also new is a whole series of shorter articles that invite you to browse and read: from research news and staff announcements to photographic insights into laboratories, simple explanations of complex phenomena, outlooks into the wider world of research, a small scientific utopia, a personal thanks to research, and a science comic. All this in the name of education, of course. Enjoy your read!
Where there is programming, there are bugs. To address debugging – the search for and correction of errors in source code – more explicitly, this thesis pursues the goal of using a prototype learning environment both to teach a systematic debugging procedure and to identify design implications for such learning environments. The following research question is posed: How do learners behave during short-term use of a learning environment based on the cognitive apprenticeship approach that aims to explicitly teach a systematic debugging procedure, and what impressions arise while they work with it?
To answer this research question, a prototype learning environment was developed, guided by literature-based implications for teaching debugging and by (media-)didactic design considerations, and tested in a qualitative user study with bachelor's students of computer science programs. On the one hand, application-related potential for improvement was identified. On the other hand, the systematization of the debugging process within the exercises met with a notably positive response. Investigating how the use of the learning environment affects people's behavior and their debugging procedures over the longer term could be the subject of future work.
The European Water Framework Directive (WFD) has identified river morphological alteration and diffuse pollution as the two main pressures affecting water bodies in Europe at the catchment scale. Consequently, river restoration has become a priority to achieve the WFD's objective of good ecological status. However, little is known about the effects of stream morphological changes, such as re-meandering, on in-stream nitrate retention at the river network scale. Therefore, catchment nitrate modeling is necessary to guide the implementation of spatially targeted and cost-effective mitigation measures. Meanwhile, Germany, like many other regions in central Europe, experienced consecutive summer droughts from 2015 to 2018, resulting in significant changes in river nitrate concentrations in various catchments. However, a mechanistic exploration of catchment nitrate responses to changing weather conditions is still lacking.
Firstly, a fully distributed, process-based catchment nitrate model (mHM-Nitrate) was used, which was properly calibrated and comprehensively evaluated at numerous spatially distributed nitrate sampling locations. Three calibration schemes were designed, taking into account land use, stream order, and mean nitrate concentrations; they varied in spatial coverage but used data from the same period (2011–2019). Model performance for discharge was similar among the three schemes, with Nash-Sutcliffe Efficiency (NSE) scores ranging from 0.88 to 0.92. For nitrate concentrations, however, scheme 2 outperformed schemes 1 and 3 when compared to observed data from eight gauging stations. This was likely because scheme 2 incorporated a diverse range of data, including low discharge values and nitrate concentrations, and thus provided a better representation of within-catchment heterogeneity. Therefore, the study suggests that strategically selecting gauging stations that reflect the full range of within-catchment heterogeneity is more important for calibration than simply increasing the number of stations.
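The Nash-Sutcliffe Efficiency used to score discharge here is a standard goodness-of-fit measure; a minimal sketch of its computation follows. The discharge values are invented for illustration and are not the study's data:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 minus the ratio of residual variance
    to the variance of the observations. 1.0 is a perfect fit; values
    below 0 mean the model predicts worse than the observed mean."""
    o = np.asarray(observed, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)

# Toy daily discharge series (m^3/s) -- illustrative values only
obs = [4.2, 3.8, 5.1, 6.0, 4.9]
sim = [4.0, 3.9, 5.3, 5.7, 5.0]
print(round(nse(obs, sim), 3))  # → 0.934
```

A model that merely outputs the long-term mean discharge scores NSE = 0, which is why values near 0.9, as reported above, indicate a close fit.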
Secondly, the mHM-Nitrate model was used to reveal the causal relations between sequential droughts and nitrate concentration in the Bode catchment (3,200 km²) in central Germany, where stream nitrate concentrations exhibited contrasting trends from upstream to downstream reaches. The model was evaluated using data from six gauging stations, reflecting different levels of runoff components and their associated nitrate mixing from upstream to downstream. Results indicated that the mHM-Nitrate model reproduced the dynamics of daily discharge and nitrate concentration well, with Nash-Sutcliffe Efficiency ≥ 0.73 for discharge and Kling-Gupta Efficiency ≥ 0.50 for nitrate concentration at most stations. In particular, the spatially contrasting trends of nitrate concentration were successfully captured by the model. The decrease of nitrate concentration in the lowland area in drought years (2015–2018) was presumably due to (1) limited terrestrial export loading (ca. 40% lower than that of the normal years 2004–2014), and (2) increased in-stream retention efficiency (20% higher in summer within the whole river network). From a mechanistic modelling perspective, this study provided insights into spatially heterogeneous flow and nitrate dynamics and the effects of sequential droughts, which shed light on water-quality responses to future climate change, as droughts are projected to become more frequent.
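The Kling-Gupta Efficiency used for the nitrate series decomposes model fit into correlation, variability, and bias components. A minimal sketch of the standard formulation (Gupta et al., 2009) follows, with invented numbers rather than the study's data:

```python
import numpy as np

def kge(observed, simulated):
    """Kling-Gupta Efficiency: 1 - Euclidean distance from the ideal point
    (r, alpha, beta) = (1, 1, 1), where r is linear correlation, alpha the
    ratio of standard deviations, and beta the ratio of means. KGE = 1 is
    a perfect fit."""
    o = np.asarray(observed, dtype=float)
    s = np.asarray(simulated, dtype=float)
    r = np.corrcoef(o, s)[0, 1]
    alpha = s.std() / o.std()
    beta = s.mean() / o.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# A simulation with perfect correlation and variability but a constant
# positive bias (illustrative values): only the beta term is penalized.
obs = [1.0, 2.0, 3.0, 4.0]
sim = [2.0, 3.0, 4.0, 5.0]
print(round(kge(obs, sim), 2))  # → 0.6
```

Because the three components are reported separately in the distance term, KGE makes it possible to see whether a low score stems from timing errors (r), damped variability (alpha), or systematic bias (beta).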
Thirdly, this study investigated the effects of stream restoration via re-meandering on in-stream nitrate retention at the network scale in the well-monitored Bode catchment. The mHM-Nitrate model showed good performance in reproducing daily discharge and nitrate concentrations, with median Kling-Gupta Efficiency values of 0.78 and 0.74, respectively. The mean and standard deviation of gross nitrate retention efficiency, which accounted for both denitrification and assimilatory uptake, were 5.1 ± 0.61% in winter and 74.7 ± 23.2% in summer within the stream network. The study found that in summer, denitrification rates were about two times higher in lowland sub-catchments dominated by agricultural lands than in mountainous sub-catchments dominated by forested areas, with median ± SD of 204 ± 22.6 and 102 ± 22.1 mg N m⁻² d⁻¹, respectively. Similarly, assimilatory uptake rates were approximately five times higher in streams surrounded by lowland agricultural areas than in those in higher-elevation, forested areas, with median ± SD of 200 ± 27.1 and 39.1 ± 8.7 mg N m⁻² d⁻¹, respectively. Therefore, restoration strategies targeting lowland agricultural areas may have greater potential for increasing nitrate retention. The study also found that restoring stream sinuosity could increase net nitrate retention efficiency by up to 25.4 ± 5.3%, with greater effects seen in small streams. These results suggest that restoration efforts should consider augmenting stream sinuosity to increase nitrate retention and decrease nitrate concentrations at the catchment scale.
The dark side of metaverse: a multi-perspective of deviant behaviors from PLS-SEM and fsQCA findings
(2024)
The metaverse has generated enormous interest as an emerging phenomenon. Its behavioral aspects include user engagement as well as deviant behaviors, and the technology has brought various dangers to individuals and society: there are growing reports of sexual abuse, racism, harassment, hate speech, and bullying, partly because online disinhibition makes users feel less restrained. This study responds to calls in the literature by investigating the effect of technical and social features, through the mediating roles of security and privacy, on deviant behaviors in the metaverse. Data were collected from 1,121 virtual-network users. Partial least squares structural equation modeling (PLS-SEM) and fuzzy-set qualitative comparative analysis (fsQCA) were used. The PLS-SEM results revealed that social features such as user-to-user interaction, homophily, social ties, and social identity, and technical design features such as immersive experience and invisibility, significantly affect users' deviant behavior in the metaverse. The fsQCA results provided insights into multiple causal solutions and configurations. The study is distinctive in combining symmetrical and asymmetrical analytical approaches to understand users' deviant behavior in virtual networks.
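The fsQCA side of such an analysis rests on two standard quantities, consistency and coverage, computed from calibrated fuzzy-set membership scores. The sketch below shows the textbook formulas; the configuration and the membership scores are hypothetical and are not taken from the study:

```python
def consistency(x, y):
    """fsQCA consistency of 'condition X is sufficient for outcome Y':
    sum(min(x_i, y_i)) / sum(x_i), for fuzzy memberships in [0, 1]."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

def coverage(x, y):
    """fsQCA coverage: how much of the outcome Y the condition X accounts
    for: sum(min(x_i, y_i)) / sum(y_i)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(y)

# Hypothetical calibrated memberships for one configuration (e.g. 'high
# immersion AND low perceived privacy') and for the outcome 'deviant
# behavior' across five respondents -- illustrative numbers only.
cond = [0.9, 0.7, 0.2, 0.8, 0.4]
outcome = [0.8, 0.9, 0.3, 0.7, 0.5]
print(round(consistency(cond, outcome), 3), round(coverage(cond, outcome), 3))
```

Configurations whose consistency clears a chosen threshold (often around 0.8) are retained as candidate causal recipes, which is what makes fsQCA an asymmetrical complement to the symmetrical PLS-SEM path estimates.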
Körper – Karte – Text
(2024)
Rabelais' pentalogy about the giants Gargantua and Pantagruel reflects aspects of the changing world view of its time of composition. Against the backdrop of the theory of the simulacrum, this study examines how the author deploys modellings of the body and a cartographic imaginaire as strategies for veiling hidden messages. Using selected examples from the Quart Livre, it shows the softening of the boundaries between body, map, and text and their mutual interpenetration. The metaphoricity of the text sheds light on its self-reflexivity and produces an almost holistic reading experience. Finally, fiction in its illusory character, as a grotesque-sensual body and a polysemantic map, advances to a model for explaining the world – one, however, that must first be deciphered.
Human activities modify nature worldwide via changes in the environment, biodiversity and the functioning of ecosystems, which in turn disrupt ecosystem services and feed back negatively on humans. A pressing challenge is thus to limit our impact on nature, and this requires detailed understanding of the interconnections between the environment, biodiversity and ecosystem functioning. These three components of ecosystems each include multiple dimensions, which interact with each other in different ways, but we lack a comprehensive picture of their interconnections and underlying mechanisms. Notably, diversity is often viewed as a single facet, namely species diversity, while many more facets exist at different levels of biological organisation (e.g. genetic, phenotypic, functional, multitrophic diversity), and multiple diversity facets together constitute the raw material for adaptation to environmental changes and shape ecosystem functioning. Consequently, investigating the multidimensionality of ecosystems, and in particular the links between multifaceted diversity, environmental changes and ecosystem functions, is crucial for ecological research, management and conservation. This thesis aims to explore several aspects of this question theoretically.
I investigate three broad topics in this thesis. First, I focus on how food webs with varying levels of functional diversity across three trophic levels buffer environmental changes, such as a sudden addition of nutrients or long-term changes (e.g. warming or eutrophication). I observed that functional diversity generally enhanced ecological stability (i.e. the buffering capacity of the food web) by increasing trophic coupling. More precisely, two aspects of ecological stability (resistance and resilience) increased even though a third aspect (the inverse of the time required for the system to reach its post-perturbation state) decreased with increasing functional diversity. Second, I explore how several diversity facets served as raw material for different sources of adaptation and how these sources affected multiple ecosystem functions across two trophic levels. Considering several sources of adaptation enabled the interplay between ecological and evolutionary processes, which affected trophic coupling and thereby ecosystem functioning. Third, I reflect further on the multifaceted nature of diversity by developing an index, K, able to quantify the facet of functional diversity, which is itself multifaceted. K can provide a comprehensive picture of functional diversity and is a rather good predictor of ecosystem functioning. Finally, I synthesise the interdependent mechanisms (complementarity and selection effects, trophic coupling and adaptation) underlying the relationships between multifaceted diversity, ecosystem functioning and the environment, and discuss the generalisation of my findings across ecosystems and further perspectives towards elaborating an operational biodiversity-ecosystem functioning framework for research and conservation.
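Resilience in this sense is commonly operationalised as the asymptotic return rate to equilibrium, read off from the dominant eigenvalue of the community (Jacobian) matrix. The sketch below illustrates that standard calculation on a toy three-level chain; the matrix entries are invented for illustration and are not taken from the thesis's food-web model:

```python
import numpy as np

# Community (Jacobian) matrix of a toy resource -> consumer -> predator
# chain, linearised at equilibrium. Entries are interaction strengths;
# the values are illustrative only.
J = np.array([
    [-1.0, -0.5,  0.0],   # resource: self-limitation, grazing loss
    [ 0.4, -0.2, -0.6],   # consumer: gain from resource, loss to predator
    [ 0.0,  0.3, -0.1],   # predator: gain from consumer, mortality
])

eigvals = np.linalg.eigvals(J)
dominant = max(eigvals.real)      # slowest-decaying perturbation mode
resilience = -dominant            # positive => perturbations decay
print(resilience > 0)             # → True: this toy web is stable
```

A more strongly coupled web shifts the dominant eigenvalue further into the negative half-plane, i.e. small perturbations die out faster, which is one way the stabilising effect of trophic coupling described above can be made quantitative.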
Enterprise solutions, specifically enterprise systems, have allowed companies to integrate their operations across the entire enterprise. The integration scope of enterprise solutions has widened steadily and now often covers customer activities, activities along supply chains, and platform ecosystems. IS research has contributed a wide range of explanatory and design knowledge dealing with this class of IS. Over the last two decades, many technological as well as managerial/organizational innovations have extended the affordances of enterprise solutions – but this broader scope also challenges traditional approaches to their analysis and design. This position paper presents an enterprise-level (i.e., cross-solution) perspective on IS, discusses the challenges of complexity and coordination for IS design and management, presents selected enterprise-level insights for IS coordination and governance, and explores avenues towards a more comprehensive body of knowledge on this important level of analysis.
With Arctic ground as a huge and temperature-sensitive carbon reservoir, maintaining low ground temperatures and frozen conditions to prevent further carbon emissions that contribute to global climate warming is a key element in humankind's fight to maintain habitable conditions on earth. Former studies showed that during the late Pleistocene, Arctic ground conditions were generally colder and more stable as the result of an ecosystem dominated by large herbivorous mammals and vast extents of graminoid vegetation – the mammoth steppe. Characterised by high plant productivity (grassland) and low ground insulation due to animal-caused compression and removal of snow, this ecosystem enabled deep permafrost aggradation. Now, with tundra and shrub vegetation common in the terrestrial Arctic, these effects are no longer in place. However, it appears to be possible to recreate this ecosystem locally by artificially increasing animal numbers, and hence keep Arctic ground cold to reduce organic matter decomposition and carbon release into the atmosphere.
By measuring thaw depth, total organic carbon (TOC) and total nitrogen content, stable carbon isotope ratios, radiocarbon age, and n-alkane and alcohol characteristics, and by assessing dominant vegetation types along grazing-intensity transects in two contrasting Arctic areas, it was found that locally recreating conditions similar to the mammoth steppe seems to be possible. For permafrost-affected soil, it was shown that intensive grazing, in direct comparison to non-grazed areas, reduces active-layer depth and leads to higher TOC contents in the active-layer soil. For soil only frozen on top in winter, an increase of TOC with grazing intensity could not be found, most likely because of confounding factors such as vertical water and carbon movement, which is not possible above an impermeable permafrost layer. In both areas, high animal activity led to a vegetation transformation towards species-poor, graminoid-dominated landscapes with fewer shrubs. Lipid biomarker analysis revealed that, even though the available organic material differs between the study areas, in both permafrost-affected and seasonally frozen soils the organic material at sites with high animal activity was less decomposed than under less intensive grazing pressure. In conclusion, high animal activity affects decomposition processes in Arctic soils and the ground thermal regime, visible from reduced active-layer depth in permafrost areas. Therefore, grazing management might be utilised to locally stabilise permafrost and reduce Arctic carbon emissions in the future, but it is likely not scalable to the entire permafrost region.
The present thesis looks at cultural conceptualisations of DEATH in Irish English from a Cultural Linguistic perspective, with a special focus on the diachronic development of these conceptualisations. For the study, a corpus of 1,400 death notices from the Dublin-based national newspaper The Irish Times, covering 14 historical periods between 1859 and 2023, was compiled, resulting in a highly specialised 70,000-word corpus. First, the manual qualitative analysis of the death notices produced evidence for eight superordinate cultural conceptualisations surrounding DEATH, namely (in order of frequency): THE DEAD ARE TO BE REMEMBERED OR REGRETTED, DEATH IS SOMETHING POSITIVE, DEATH IS REST, DEATH IS A JOURNEY, DYING IS THE BEGINNING OF ANOTHER LIFE, DEATH IS (NOT) A TABOO, DEATH IS GOD'S WILL, and DEATH IS THE END. These conceptualisations were derived from linguistic expressions in the death notices that have them as their cognitive basis. Second, the quantitative comparison of the individual conceptualisations detected diachronic variation, which is interconnected with historical and social developments in Ireland. The thesis therefore illustrates the applicability of Cultural Linguistics as an adequate method for diachronic studies interested in culturally determined developments of conceptualisations.
Margrethe, the 80, and who?
(2024)
Personalmanagement und KWI
(2024)
The project seminar concept presented in this article responds to a perceived distance and insecurity towards religion-related topics among students of the subject Lebensgestaltung-Ethik-Religionskunde. Drawing on conceptual change research, various strategies were used to encourage students to perceive and reflect on their own cultural standpoint and their own concepts of religion(s). The students documented their learning process in work-journal entries, which were in turn examined by means of qualitative content analysis. After presenting the students' religion- and teaching-related conceptions elicited in this way, the article offers suggestions as to how the analysed findings can serve as a basis for improving university teaching in the subject area.
This volume presents a systematic review of empirical findings on lobbying in Germany and shows how lobbyists, decision-makers, and institutional frameworks interact. It examines the political activities of social movements, associations, companies, and consulting firms in the Bundestag, the federal government, and the public sphere.
To manage tabular data files and leverage their content in a given downstream task, practitioners often design and execute complex transformation pipelines to prepare them. The complexity of such pipelines stems from different factors, including the nature of the preparation tasks, often exploratory or ad-hoc to specific datasets; the large repertory of tools, algorithms, and frameworks that practitioners need to master; and the volume, variety, and velocity of the files to be prepared. Metadata plays a fundamental role in reducing this complexity: characterizing a file assists end users in the design of data preprocessing pipelines, and furthermore paves the way for suggestion, automation, and optimization of data preparation tasks.
Previous research in the areas of data profiling, data integration, and data cleaning has focused on extracting and characterizing metadata regarding the content of tabular data files, i.e., about the records and attributes of tables. Content metadata are useful for the later stages of a preprocessing pipeline, e.g., error correction, duplicate detection, or value normalization, but they require a properly formed tabular input. Therefore, these metadata are not relevant for the early stages of a preparation pipeline, i.e., for correctly parsing tables out of files. In this dissertation, we turn our focus to what we call the structure of a tabular data file, i.e., the set of characters within a file that do not represent data values but are required to parse and understand the content of the file. We provide three different approaches to represent file structure: an explicit representation based on context-free grammars; an implicit representation based on file-wise similarity; and a learned representation based on machine learning.
In our first contribution, we use the grammar-based representation to characterize a set of over 3000 real-world csv files and identify multiple structural issues that let files deviate from the csv standard, e.g., by having inconsistent delimiters or containing multiple tables. Building on these findings about real-world files, we propose Pollock, a benchmark to test how well systems parse csv files that have a non-standard structure, without any previous preparation. We report on our experiments using Pollock to evaluate the performance of 16 real-world data management systems.
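The parsing problem that such non-standard files pose can be illustrated with Python's standard-library csv.Sniffer, shown here purely as an illustration (it is not part of Pollock): before any content can be read, a parser must first guess structural properties such as the delimiter, and extra structure like a preamble line gets in the way.

```python
import csv
import io

# A file with a non-standard structure: semicolon-delimited values
# preceded by a preamble line that is not part of the table.
raw = "exported 2024-01-01\nname;city;age\nAlice;Berlin;30\nBob;Potsdam;25\n"

# The preamble would confuse frequency-based delimiter detection, so we
# sniff only the table body after stripping the first line.
body = raw.split("\n", 1)[1]
dialect = csv.Sniffer().sniff(body)
rows = list(csv.reader(io.StringIO(body), dialect))
print(dialect.delimiter, rows[0])  # → ; ['name', 'city', 'age']
```

Deciding where the table body begins is exactly the kind of structural metadata discussed above: the character-level detection is easy once the file is clean, but real-world files rarely are.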
Next, we characterize the structure of files implicitly by defining a measure of structural similarity for file pairs. We design a novel algorithm to compute this measure, based on a graph representation of the files' content. We leverage this algorithm and propose Mondrian, a graphical system that assists users in identifying layout templates in a dataset, i.e., classes of files that share the same structure and can therefore be prepared with the same preparation pipeline.
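As a crude stand-in for a file-pair structural-similarity measure (Mondrian's actual algorithm operates on a graph representation of file content, which this sketch does not attempt), one can compare the sets of per-row character-class patterns of two files; all names and samples below are invented:

```python
import re

def row_signature(line):
    """Map a line to its structural pattern: digit runs -> D, letter runs -> A."""
    sig = re.sub(r"[0-9]+", "D", line)
    return re.sub(r"[A-Za-z]+", "A", sig)

def structural_similarity(file_a, file_b):
    """Jaccard overlap of the row-pattern sets of two files (0..1)."""
    sigs_a = {row_signature(l) for l in file_a.splitlines()}
    sigs_b = {row_signature(l) for l in file_b.splitlines()}
    return len(sigs_a & sigs_b) / len(sigs_a | sigs_b)

f1 = "id,name,score\n1,Ada,90\n2,Max,85\n"
f2 = "id,name,score\n7,Eva,78\n"
f3 = "id;name;score\n7;Eva;78\n"
print(structural_similarity(f1, f2))  # same layout, different values
print(structural_similarity(f1, f3))  # different delimiter, no shared patterns
```

Files with identical layout but different values score 1.0 here, while a change of delimiter drops the score to 0.0 — illustrating why such a measure can cluster files into layout templates.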
Finally, we introduce MaGRiTTE, a novel architecture that uses self-supervised learning to automatically learn structural representations of files in the form of vector embeddings at three different levels: cell level, row level, and file level. We experiment with the application of structural embeddings to several tasks, namely dialect detection, row classification, and estimation of data preparation effort.
Our experimental results show that structural metadata, whether captured explicitly in parsing grammars, derived implicitly as file-wise similarity, or learned with the help of machine learning architectures, is fundamental to automating several tasks, to scaling preparation up to large quantities of files, and to providing repeatable preparation pipelines.
Social institutions
(2024)
Social institutions are systems of densely interwoven, enduring behavioral and relational patterns that function across an entire society. They order and structure the behavior of individuals in core areas of society and thus have a strong impact on individuals' quality of life. Institutions regulate the following core functions: (a) family and relationship networks carry out social reproduction and socialization; (b) institutions in the realm of education and training ensure the transmission and cultivation of knowledge, abilities, and specialized skills; (c) institutions in the labor market and economy provide for the production and distribution of goods and services; (d) institutions in the realm of law, governance, and politics provide for the maintenance of the social order; and (e) cultural, media, and religious institutions further the development of contexts of meaning, value orientations, and symbolic codes.
We compare non-assimilatory labialization of final nasals in Spanish across three corpora of American Spanish (Mexican, Colombian, and Paraguayan). While non-assimilatory labialization is well known in Yucatecan Spanish, it is largely unknown in other Spanish-speaking regions, which is why it is often attributed to Maya influence. However, similar pronunciation habits have been noted in passing in both Paraguay and Colombia. Comparing labialization empirically across three corpora built on the same methodological basis, we conclude that the evidence in favor of language contact is at most highly indirect. Independently of this, we find that the most marked difference is that the rate of labialization appears to be determined by the duration of the following pause in the data from the Yucatán peninsula, but not in those from Colombia and Paraguay. We argue that contact may indeed have triggered the development of this feature in Yucatecan Spanish, since present-day Spanish has almost no final labial nasals, whereas Maya does. However, speakers' linguistic profile (monolingual vs. bilingual) has no effect in our Yucatecan and Paraguayan data, and across our data as a whole we likewise find no evidence for the hypothesis that language contact played an (important) role in the development of final labial nasals in the three varieties.
Sexualität in der Geschichte
(2024)
In this volume, Jelena Tomović traces the development of our sexual language and practices. She shows that the way sexuality is talked about is not merely a mirror of social change but also a driving force behind it. The study challenges conventional notions of sexuality and guides readers into a world of subtle nuances and cultural shifts. Drawing on communication theory, a praxeological approach, a social-constructivist premise, and a clear focus on actors, the author offers a fresh perspective on the history of sexuality. The book opens new avenues for researching and understanding intimacy and social communication.
Enhancing higher entrepreneurship education: insights from practitioners for curriculum improvement
(2024)
Curricula for higher entrepreneurship education should meet the requirements of both a solid theoretical foundation and a practical orientation. When these curricula are designed by education specialists, entrepreneurs are usually not consulted. To explore practitioners' curricular recommendations, we conducted 73 semi-structured interviews with entrepreneurs with at least five years of professional experience. We collected 49 items for teaching and learning objectives, 37 for contents, 28 for teaching methods, and 17 for assessment methods. The respondents are convinced that students should acquire solid knowledge in business and management, legal issues, and entrepreneurship; for the latter, only some core aspects are provided. The entrepreneurs put greater emphasis on entrepreneurial skills and attitudes and consider experiential learning designs the most suitable, both in the secure setting of the classroom and in real life. The findings can inform reflection on current entrepreneurship curriculum designs.
Germany’s relatively stable party system faces a new left-authoritarian challenger: Sahra Wagenknecht’s Bündnis Sahra Wagenknecht (BSW) party. Early polls indicate that election results above 10% are within reach for the BSW. While Wagenknecht’s economic and cultural positions have already been discussed, this article elaborates on another highly relevant feature of Wagenknecht, namely her populist communication. Exploring Wagenknecht’s and the BSW’s populist appeal helps us understand why the party is said to have potential among seemingly different voter groups coming from the far-right Alternative for Germany (AfD) and the far-left Die Linke, which share high levels of populist attitudes. To analyse the role that populist communication plays for Wagenknecht and the BSW, this article combines quantitative and qualitative methods. The quantitative analysis covers all speeches (10,000) and press releases (19,000) published by Die Linke members of Parliament (MPs; 2005–2023). The results show that Wagenknecht is the (former) Die Linke MP with the highest share of populist communication. Furthermore, she was also able to convince a group of populist MPs to join the BSW. The article closes with a qualitative analysis of the BSW’s manifesto, which reveals that populist framing plays a major role in this document: the political and economic elites are accused of working against the interest of “the majority”. Based on this analysis, the classification of the BSW as a populist party appears appropriate.
Deep learning has seen widespread application in many domains, mainly for its ability to learn data representations from raw input data. Nevertheless, its success has so far been coupled with the availability of large annotated (labelled) datasets. This requirement is difficult to fulfil in several domains, such as medical imaging. Annotation costs form a barrier to extending deep learning to clinically relevant use cases. The labels associated with medical images are scarce, since generating expert annotations of multimodal patient data at scale is non-trivial, expensive, and time-consuming. This substantiates the need for algorithms that learn from the increasing amounts of unlabeled data. Self-supervised representation learning algorithms offer a pertinent solution, as they allow real-world (downstream) deep learning tasks to be solved with fewer annotations. Self-supervised approaches leverage unlabeled samples to acquire generic features about different concepts, subsequently enabling annotation-efficient downstream task solving.
Nevertheless, medical images present multiple unique and inherent challenges for existing self-supervised learning approaches, which we seek to address in this thesis: (i) medical images are multimodal, and their modalities are heterogeneous in nature and imbalanced in quantity, e.g. MRI and CT; (ii) medical scans are multi-dimensional, often 3D instead of 2D; (iii) disease patterns in medical scans are numerous and their incidence exhibits a long-tail distribution, so it is often essential to fuse knowledge from different data modalities, e.g. genomics or clinical data, to capture disease traits more comprehensively; (iv) medical scans usually exhibit more uniform color density distributions, e.g. in dental X-rays, than natural images. Our proposed self-supervised methods meet these challenges while significantly reducing the amount of required annotation.
We evaluate our self-supervised methods on a wide array of medical imaging applications and tasks. Our experimental results demonstrate gains in both annotation efficiency and performance; our proposed methods outperform many approaches from the related literature. Additionally, in the case of fusion with genetic modalities, our methods also allow for cross-modal interpretability. In this thesis, we not only show that self-supervised learning is capable of mitigating manual annotation costs, but our proposed solutions also demonstrate how to better utilize it in the medical imaging domain. Progress in self-supervised learning has the potential to extend the application of deep learning algorithms to clinical scenarios.
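As a minimal sketch of the contrastive pretext objective that underlies many self-supervised methods (the methods proposed in the thesis are more elaborate and are not reproduced here), the following computes an InfoNCE-style loss for paired embeddings of two augmented views of the same samples; all names and shapes are illustrative assumptions:

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.1):
    """InfoNCE loss for N paired embeddings (two views of the same samples).

    z1, z2: arrays of shape (N, d); row i of z1 and z2 form a positive pair,
    every other row serves as a negative. tau is the softmax temperature.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)  # L2-normalize
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))           # positives sit on the diagonal

views = np.eye(4)
print(info_nce_loss(views, views))                       # near zero
print(info_nce_loss(views, np.roll(views, 1, axis=0)))   # clearly larger
```

Minimizing this loss pulls embeddings of the two views of each sample together while pushing other samples apart — the mechanism by which generic features are acquired without labels.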
«Musik erfinden und gestalten» (inventing and shaping music) holds great potential for music education: experimenting with sounds, developing a feel for dramaturgical arcs, communicating nonverbally — inventing and shaping music opens up a broad field of musical activities and opportunities for experience. Yet in regular music lessons in Swiss compulsory schools, production-oriented didactic approaches are still rather the exception, and music teachers lack strategies for guiding them.
For this book, the author conducted a design-based research study examining how primary school teachers gradually develop their strategies for guiding musical creation processes in their classes. The researcher accompanied the teachers in their everyday school practice and intervened with targeted reflection prompts to support the professionalization process.
Three reflection tools emerged from this work: the reflection tool try-outs contains concrete suggestions for action and reflection questions for guiding musical creation processes; the online tool improspider is a self-reflection instrument for assessing personal orientations; and the competence model Kompetenzflyer provides a reflection template for targeting independent steps of competence acquisition.
The reflection tools are also available online in the form of a learning object.
Organic solar cells (OSCs) represent a new generation of solar cells with a range of captivating attributes, including low cost, light weight, aesthetically pleasing appearance, and flexibility. Unlike traditional silicon solar cells, the photon-to-electron conversion in OSCs is usually accomplished in an active layer formed by blending two kinds of organic molecules (a donor and an acceptor) with different energy levels.
The first part of this thesis focuses on a better understanding of the role of the energetic offset between charge-transfer (CT) states and excitons, and of each recombination channel, in the performance of low-offset OSCs. By combining advanced experimental techniques with optical and electrical simulations, several important insights were achieved: 1. The short-circuit current density and fill factor of low-offset systems are largely determined by field-dependent charge generation. Interestingly, there is strong evidence that this field-dependent charge generation originates from a field-dependent exciton dissociation yield. 2. The reduced energetic offset was found to be accompanied by a strongly enhanced bimolecular recombination coefficient, which cannot be explained solely by exciton repopulation from CT states. This implies the existence of another dark decay channel apart from the CT states.
The second focus of the thesis was on the technical perspective. The influence of sample configuration and active-layer thickness on optical artifacts in differential absorption spectroscopy was studied. It is exemplified and discussed thoroughly and systematically, in terms of optical simulations and experiments, how optical artifacts originating from non-uniform carrier profiles and interference can distort not only the measured spectra but also the decay dynamics under various measurement conditions. At the end of this study, a generalized methodology based on an inverse optical transfer-matrix formalism is provided to correct spectra and decay dynamics distorted by optical artifacts.
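For intuition on the (forward) optical transfer-matrix formalism underlying the methodology mentioned above, here is a minimal normal-incidence implementation for a non-absorbing thin-film stack; the layer indices and thicknesses are invented for illustration, and the thesis's inverse formalism goes well beyond this sketch:

```python
import numpy as np

def reflectance(layers, lam, n0=1.0, ns=1.5):
    """Normal-incidence reflectance of a thin-film stack on a substrate.

    layers: list of (refractive_index, thickness) tuples; light enters from
    a medium with index n0 and exits into a substrate with index ns.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / lam                  # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, ns])                       # stack admittance
    r = (n0 * B - C) / (n0 * B + C)                      # amplitude reflectance
    return abs(r) ** 2

# Bare air/glass interface: R = ((1 - 1.5) / (1 + 1.5))^2 = 4%
print(reflectance([], 550e-9))
# Quarter-wave antireflection layer with n = sqrt(ns): R drops to ~0
n_ar = np.sqrt(1.5)
print(reflectance([(n_ar, 550e-9 / (4 * n_ar))], 550e-9))
```

Chaining one 2x2 characteristic matrix per layer is the core of the transfer-matrix method; the inverse problem — recovering spectra distorted by interference — inverts this mapping.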
Overall, this thesis paves the way for a deeper understanding of the keys to higher power conversion efficiencies (PCEs) in low-offset OSC devices, from the perspectives of both device physics and characterization techniques.
Von Koscher bis Frutarismus
(2024)
Preisalgorithmenkartelle
(2024)
Pricing algorithms enable companies to adjust prices automatically and in response to one another. As a result, classic cartel constellations may recede into the background in the absence of conspiratorial meetings. This work shows under which conditions the use of pricing algorithms can constitute an infringement of the European cartel prohibition. To this end, it examines scenarios involving algorithmic coordination both directly between competitors and indirectly via a third party. It also addresses algorithm-specific compliance measures. Finally, it highlights the practical challenges in detecting and proving such cartels.
What does the future hold for corporate communications? The Communications Trend Radar is an applied research project that identifies, on an annual basis, relevant trends for corporate communications from the fields of society, management, and technology. The research teams at the University of Potsdam (Professor Stefan Stieglitz, Sünje Clausen, MS) and Leipzig University (Professor Ansgar Zerfass, Dr Michelle Wloka) identified the following trends for 2024: Information Inflation, AI Literacy, Workforce Shift, Content Integrity, and Decoding Humans. More information on the trends can be found in the Communications Trend Radar Report 2024.
We would like to inform the readers and editors of the journal that we have discovered some errors in the references of our paper. These errors were brought to our attention by a reader who noticed some inconsistencies between the citations in the text and the bibliography. Upon further investigation, we realized that our literature management software had mistakenly linked some of the references to wrong or non-existent sources. We apologize for this oversight and assure you that it did not affect the validity or quality of our arguments and results, which were based on the correct sources. Below you find a list of the incorrect references along with their corresponding correct ones. We hope that this correction statement will clarify any confusion or misunderstanding that may have arisen from this mistake. The authors would like to apologize for any inconvenience caused.
The Revolution of 1848/49 is remembered as a key event in German democratic history; yet to this day, women's participation occupies a subordinate place in collective memory. This master's thesis therefore focuses specifically on the role of women in the Revolution of 1848/49 and offers suggestions for integrating the topic into civics teaching.
As the results of the thesis show, numerous women used the spirit of change of the 1840s to engage politically in various ways. Many remained within the dichotomous division of gender roles rooted in the bourgeois gender model that emerged in the nineteenth century; some, however, deliberately crossed these boundaries despite harsh sanctions.
It becomes apparent that women's participation did not yet lead to a fundamental questioning of gender polarity at this point, but women increasingly claimed the public sphere for themselves, thereby laying the groundwork for the German women's movement of the following decades.
Addressing the role of women in the Revolution of 1848/49 at school lends itself to history and civics lessons as well as to cross-curricular teaching, given its many points of connection. Integrating the topic into the classroom can, in particular, help preserve the historical legacy of the beginnings of the women's movement and make it useful for conveying democratic values.
Actin is one of the most highly conserved proteins in eukaryotes, and distinct actin-related proteins with filament-forming properties are even found in prokaryotes. Due to these commonalities, actin-modulating proteins of many species share similar structural properties and proposed functions. The polymerization and depolymerization of actin are critical processes for a cell, as they contribute to shape changes that let the cell adapt to its environment and to the movement and distribution of nutrients and cellular components within the cell. However, to what extent the functions of actin-binding proteins are conserved between distantly related species has only been addressed in a few cases. In this work, the functions of Coronin-A (CorA) and Actin-interacting protein 1 (Aip1), two proteins involved in actin dynamics, were characterized. In addition, the interchangeability and function of Aip1 were investigated in two phylogenetically distant model organisms: the flowering plant Arabidopsis thaliana (encoding two homologs, AIP1-1 and AIP1-2) and the amoeba Dictyostelium discoideum (encoding one homolog, DdAip1) were chosen because the functions of their actin cytoskeletons may differ in many aspects. Cross-species functional analyses were conducted for the AIP1 homologs, as flowering plants do not harbor a CorA gene.
In the first part of the study, the effect of four different mutation methods on the function of Coronin-A protein and the resulting phenotype in D. discoideum was revealed in two genetic knockouts, one RNAi knockdown and a sudden loss-of-function mutant created by chemical-induced dislocation (CID). The advantages and disadvantages of the different mutation methods on the motility, appearance and development of the amoebae were investigated, and the results showed that not all observed properties were affected with the same intensity. Remarkably, a new combination of Selection-Linked Integration and CID could be established.
In the second and third parts of the thesis, the exchange of Aip1 between plant and amoeba was carried out. The two A. thaliana homologs (AIP1-1 and AIP1-2) were analyzed for functionality both in the plant and in D. discoideum. In the Aip1-deficient amoeba, rescue with AIP1-1 was more effective than with AIP1-2. The main results in the plant showed that, in the aip1-2 mutant background, reintroduced AIP1-2 displayed the most efficient rescue, and A. thaliana AIP1-1 rescued better than DdAip1. The choice of the tagging site was important for the function of Aip1, as steric hindrance is a problem: DdAip1 was less effective when tagged at the C-terminus, while the plant AIP1s showed mixed results depending on the tag position. In conclusion, the foreign proteins partially rescued the phenotypes of mutant plants and mutant amoebae, despite the organisms being only very distantly related in evolutionary terms.
To retain employees in the long term and keep team engagement high, many measures aimed at improving workplace climate or satisfaction focus on employee motivation. Usually, however, it is more worthwhile to tackle the impositions of the organizational structure first. That is often where the biggest motivation killers lie, and thus also the levers with the greatest potential for more satisfaction.
Optimizing power analysis for randomized experiments: Design parameters for student achievement
(2024)
Randomized trials (RTs) are promising methodological tools to inform evidence-based reform to enhance schooling. Establishing a robust knowledge base on how to promote student achievement requires sensitive RT designs demonstrating sufficient statistical power and precision to draw conclusive and correct inferences on the effectiveness of educational programs and innovations. Proper power analysis is therefore an integral component of any informative RT on student achievement. This venture critically hinges on the availability of reasonable input variance design parameters (and their inherent uncertainties) that optimally reflect the realities around the prospective RT—precisely, its target population and outcome, possibly applied covariates, the concrete design as well as the planned analysis. However, existing compilations in this vein show far-reaching shortcomings.
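To illustrate the kind of power-analysis input such design parameters feed, here is a standard two-level MDES approximation for a cluster-randomized trial (in the spirit of the usual Bloom-style formula; the z-quantiles are hardcoded for α = .05 two-tailed and power = .80, and all numbers in the example are invented, not the thesis's estimates):

```python
import math

def mdes_two_level(rho, J, n, r2_l2=0.0, r2_l1=0.0, p=0.5):
    """Minimum detectable (standardized) effect size of a two-level
    cluster-randomized trial: J clusters of n students, proportion p
    assigned to treatment, intraclass correlation rho, and covariate
    R-squared at the cluster (r2_l2) and student (r2_l1) level.
    """
    m = 1.960 + 0.842  # z-quantiles for alpha = .05 (two-tailed), power = .80
    var = (rho * (1 - r2_l2)) / (p * (1 - p) * J) \
        + ((1 - rho) * (1 - r2_l1)) / (p * (1 - p) * J * n)
    return m * math.sqrt(var)

# 40 schools with 25 students each, ICC = .20, no covariates:
print(round(mdes_two_level(0.20, 40, 25), 3))
# A cluster-level pretest explaining 50% of between-school variance
# shrinks the MDES noticeably:
print(round(mdes_two_level(0.20, 40, 25, r2_l2=0.5), 3))
```

The example makes the dependence on the design parameters concrete: the MDES hinges directly on the intraclass correlation and the covariate R-squared values, which is exactly why compendia of realistic estimates matter.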
The overarching endeavor of the present doctoral thesis was to substantively expand the resources available for refining the planning of RTs evaluating educational interventions. At the core of this thesis is a systematic analysis of design parameters for student achievement, generating reliable and versatile compendia and developing thorough guidance to support apt power analysis for designing strong RTs. To this end, the thesis bundles two complementary studies that capitalize on rich data from several national probability samples of major German longitudinal large-scale assessments.
Study I applied two- and three-level latent (covariate) modeling to analyze design parameters for a wide spectrum of mathematical-scientific, verbal, and domain-general achievement outcomes. Three vital covariate sets were covered comprising (a) pretests, (b) sociodemographic characteristics, and (c) their combination. The accumulated estimates were additionally summarized in terms of normative distributions.
Study II specified (manifest) single-, two-, and three-level models and referred to influential psychometric heuristics to analyze design parameters and develop concise selection guidelines for covariate (a) types of varying bandwidth-fidelity (domain-identical, cross-domain, fluid intelligence pretests; sociodemographic characteristics), (b) combinations quantifying incremental validities, and (c) time lags of 1- to 7-year-lagged pretests scrutinizing validity degradation. The estimates for various mathematical-scientific and verbal achievement outcomes were meta-analytically integrated and employed in precision simulations.
In doing so, Studies I and II addressed essential gaps identified in previous repertoires in six major dimensions: Taken together, this thesis accumulated novel design parameters and deliberate guidance for RT power analysis (1) tailored to four German student (sub)populations across the entire school career from Grade 1 to 12, (2) matched to 21 achievement (sub)domains, (3) adjusted for 11 covariate sets enriched by empirically supported guidelines, (4) adapted to six RT designs, (5) suitable for latent and manifest analysis models, (6) which were cataloged along with quantifications of their associated uncertainties. These resources are complemented by a plethora of illustrative application examples to gently direct psychological and educational researchers through pivotal steps in the process of RT design.
The striking heterogeneity of the design parameter estimates across all these dimensions constitutes the overall, joint key result of Studies I and II. Hence, this work convincingly reinforces calls for a close match between design parameters and the specific peculiarities of the target RT’s research context.
All in all, the present doctoral thesis offers a thus far unique, nuanced, and extensive toolkit to optimize power analysis for sound RTs on student achievement in the German (and similar) school contexts. It is of utmost importance that research does not tire of generating robust evidence on what actually works to improve schooling. With this in mind, I hope that the resulting compendia and guidance contribute to the quality and rigor of our randomized experiments in psychology and education.
MARLA
(2024)
Heat stress (HS) is a major abiotic stress that negatively affects plant growth and productivity. However, plants have developed various adaptive mechanisms to cope with HS, including the acquisition and maintenance of thermotolerance, which allows them to respond more effectively to subsequent stress episodes. HS memory includes type II transcriptional memory, which is characterized by the enhanced re-induction of a subset of HS memory genes upon recurrent HS. In this study, new regulators of HS memory in A. thaliana were identified through the characterization of rein mutants.
The rein1 mutant carries a premature stop codon in CYCLIN-DEPENDENT KINASE 8 (CDK8), which is part of the cyclin kinase module of the Mediator complex. rein1 seedlings show impaired type II transcriptional memory at multiple heat-responsive genes upon re-exposure to HS. Additionally, the mutants exhibit a significant deficiency in HS memory at the physiological level. Interaction studies conducted in this work indicate that CDK8 associates with the memory HEAT SHOCK FACTORs HSFA2 and HSFA3. The results suggest that CDK8 plays a crucial role in HS memory in plants together with other memory HSFs, which may be potential targets of the CDK8 kinase function. Understanding the role and interaction network of the Mediator complex during HS-induced transcriptional memory will be an exciting aspect of future HS memory research.
The second characterized mutant, rein2, was selected based on its strongly impaired pAPX2::LUC re-induction phenotype. Gene expression analysis revealed additional defects in the initial induction of HS memory genes in this mutant. Consistent with this observation, basal thermotolerance was impaired similarly to HS memory at the physiological level in rein2. Sequencing of backcrossed bulk segregants with subsequent fine mapping narrowed the location of REIN2 to a 1 Mb region on chromosome 1. This interval contains the At1g65440 gene, which encodes the histone chaperone SPT6L. SPT6L interacts with chromatin remodelers and bridges them to the transcription machinery to regulate nucleosome and Pol II occupancy around the transcriptional start site. The EMS-induced missense mutation in SPT6L may cause the altered HS-induced gene expression in rein2, possibly triggered by changes in the chromatin environment resulting from altered histone chaperone function.
Expanding research on screen-derived factors that modify type II transcriptional memory has the potential to enhance our understanding of HS memory in plants. Discovering connections between previously identified memory factors will help to elucidate the underlying network of HS memory. This knowledge can initiate new approaches to improve heat resilience in crops.
Èto-clefts are Russian focus constructions with the demonstrative pronoun èto ‘this’ at the beginning: “Èto Mark vyigral gonku” (“It was Mark who won the race”). They are often compared with English it-clefts and German es-clefts, as well as with the corresponding focus-background structures in other languages.
In terms of semantics, èto-clefts have two important properties that are cross-linguistically typical for clefts: an existence presupposition (“Someone won the race”) and exhaustivity (“Nobody except Mark won the race”). However, the exhaustivity effects are not as strong as those in structures with the exclusive particle only, and require more research.
At the same time, the question of whether the syntactic structure of èto-clefts matches the biclausal structure of English and German clefts remains open. There are arguments in favor of biclausality as well as of monoclausality. Moreover, there is no consensus regarding the status of èto itself.
Finally, the information structure of èto-clefts has remained underexplored in the existing literature.
This research investigates the information-structural, syntactic, and semantic properties of Russian clefts, both theoretically (supported by examples from Russian text corpora and judgments from native speakers) and experimentally. It is determined which desired changes in the information structure motivate native speakers to choose an èto-cleft and not the canonical structure or other focus realization tools. Novel syntactic tests are conducted to find evidence for bi-/monoclausality of èto-clefts, as well as for base-generation or movement of the cleft pivot. It is hypothesized that èto has a certain important function in clefts, and its status is investigated. Finally, new experiments on the nature of exhaustivity in èto-clefts are conducted. They allow for direct cross-linguistic comparison, using an incremental-information paradigm with truth-value judgments.
In terms of information structure, this research makes a new proposal that presents èto-clefts as structures with an inherent focus-background bipartitioning. Even though èto-clefts are used in typical focus contexts, evidence was found that èto-clefts (as well as Russian thetic clefts) allow for both new-information focus and contrastive focus. Èto-clefts are pragmatically acceptable when a singleton answer to the implied question is expected (e.g. “It was Mark who won the race” but not “It was Mark who came to the party”). Importantly, èto in Russian clefts is neither a dummy element nor redundant: it is a topic expression; it conveys familiarity, which triggers the existence presupposition; it refers to an instantiated event or a known/perceivable situation; and it plays an important role in spoken language as a tool for speech coherence and as a focus marker.
In terms of syntax, this research makes a new monoclausal proposal and shows evidence that the cleft pivot undergoes movement to the left peripheral position. Èto is proposed to be TopP.
Finally, in terms of semantics, a novel cross-linguistic evaluation of Russian clefts is made. Experiments show that the exhaustivity inference in èto-clefts is not robust. Participants used different strategies in resolving exhaustivity, falling into two groups: one group considered èto-clefts exhaustive, while the other considered them non-exhaustive. Hence, there is evidence for the pragmatic nature of exhaustivity in èto-clefts. The experimental results for èto-clefts are similar to those for clefts in German, French, and Akan. It is concluded that speakers use the different tools available in their languages to produce structures with similar interpretive properties.
The origin and structure of magnetic fields in the Galaxy are largely unknown. What is known is that they are essential for several astrophysical processes, in particular the propagation of cosmic rays. Our ability to describe the propagation of cosmic rays through the Galaxy is severely limited by the lack of observational data needed to probe the structure of the Galactic magnetic field on many different length scales. This is particularly true for modelling the propagation of cosmic rays into the Galactic halo, where our knowledge of the magnetic field is particularly poor.
In the last decade, observations of the Galactic halo in different frequency regimes have revealed out-of-plane bubble emission. In gamma rays these bubbles have been termed the Fermi bubbles, with a radial extent of ≈ 3 kpc and an azimuthal height of ≈ 6 kpc. The radio counterparts of the Fermi bubbles were seen by both the S-PASS survey and the Planck satellite and show a clear spatial overlap. The X-ray counterparts of the Fermi bubbles were named the eROSITA bubbles after the eROSITA satellite, with a radial width of ≈ 7 kpc and an azimuthal height of ≈ 14 kpc. Taken together, these observations suggest the presence of large extended Galactic Halo Bubbles (GHB) and have stimulated interest in the hitherto little-explored Galactic halo.
In this thesis, a new toy model (the GHB model) for the magnetic field and non-thermal electron distribution in the Galactic halo is proposed. The toy model was used to produce polarised synchrotron emission sky maps, and a chi-square analysis was used to compare these synthetic sky maps with the Planck 30 GHz polarised sky maps. The resulting constraints on the field strength and azimuthal height were found to be in agreement with the S-PASS radio observations.
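The map comparison described above can be sketched as a simple pixel-wise chi-square statistic between a synthetic and an observed sky map; the function and array names below are illustrative assumptions, not code from the thesis.

```python
import numpy as np

def chi_square(model_map, data_map, sigma_map):
    """Pixel-wise chi-square between a synthetic and an observed sky map.

    All arrays share the same pixelisation (e.g. a HEALPix grid flattened
    to 1-D); lower values mean a better model fit.
    """
    residual = (model_map - data_map) / sigma_map
    return np.sum(residual ** 2)

# Toy example: a model map closer to the "data" scores lower
data = np.array([1.0, 2.0, 3.0, 4.0])
sigma = np.full(4, 0.5)
good = chi_square(data + 0.1, data, sigma)
bad = chi_square(data + 1.0, data, sigma)
assert good < bad
```

In a parameter scan, this statistic would be evaluated over a grid of field strengths and bubble heights, with the best-fit model minimizing the chi-square.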
The upper, lower, and best-fit values obtained from this chi-square analysis were used to generate three separate toy models, which were then used to propagate ultra-high-energy cosmic rays (UHECRs). This study was carried out for two potential sources, Centaurus A and NGC 253, to produce magnification maps and arrival-direction sky maps. The simulated arrival-direction sky maps were found to be consistent with the hotspots of Centaurus A and NGC 253 seen in the observed arrival-direction sky maps provided by the Pierre Auger Observatory (PAO).
The turbulent magnetic field component of the GHB model was also used to investigate the extragalactic dipole suppression seen by the PAO. UHECRs with an extragalactic dipole were forward-tracked through the turbulent GHB model at different field strengths, and the suppression of the dipole due to the varying diffusion coefficient was noted in the simulations. The results were also compared with an analytical analogy from electrostatics. The simulations of the extragalactic dipole suppression are in agreement with similar studies carried out for Galactic cosmic rays.
A comprehensive study of seismic hazard and earthquake triggering is crucial for effective mitigation of earthquake risk. The destructive nature of earthquakes motivates researchers to work on forecasting despite the apparent randomness of earthquake occurrence. Understanding the underlying mechanisms and patterns is vital, given their potential for widespread devastation and loss of life. This thesis combines methodologies, including Coulomb stress calculations and aftershock analysis, to shed light on earthquake complexities, ultimately enhancing seismic hazard assessment.
The Coulomb failure stress (CFS) criterion is widely used to predict the spatial distributions of aftershocks following large earthquakes. However, uncertainties associated with CFS calculations arise from non-unique slip inversions and unknown fault networks, particularly due to the choice of the assumed aftershock (receiver) mechanisms. Recent studies have proposed alternative stress quantities and deep neural network approaches as superior to CFS with predefined receiver mechanisms. To challenge these propositions, I utilized 289 slip inversions from the SRCMOD database to calculate more realistic CFS values for a layered half-space and variable receiver mechanisms. The analysis also investigates the impact of magnitude cutoff, grid-size variation, and aftershock duration on the ranking of stress metrics using receiver operating characteristic (ROC) analysis. Results reveal that the performance of the stress metrics improves significantly after accounting for receiver variability and when considering larger aftershocks and shorter time periods, without altering the relative ranking of the different metrics.
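The ROC ranking described above can be illustrated as follows: each grid cell receives a score (e.g. its predicted stress change) and a binary label (whether it hosts aftershocks), and the area under the ROC curve measures how well the score separates the two classes. The helper below is a minimal sketch using the rank-sum identity for the AUC (ties are not handled); all names are illustrative assumptions, not thesis code.

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.

    scores: stress-metric values on grid cells;
    labels: 1 if the cell contains aftershocks, else 0.
    AUC = 0.5 means no skill; 1.0 means perfect separation.
    """
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Cells with higher stress host the aftershocks -> perfect separation
auc = roc_auc([0.1, 0.4, 0.8, 1.2], [0, 0, 1, 1])
assert auc == 1.0
```

Ranking competing stress metrics then amounts to comparing their AUC values on the same aftershock catalog.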
To corroborate the Coulomb stress calculations with the findings of earthquake source studies in more detail, I studied the source properties of the 2005 Kashmir earthquake and its aftershocks, aiming to unravel the seismotectonics of the NW Himalayan syntaxis. I simultaneously relocated the mainshock and its largest aftershocks using phase data and then analyzed the Coulomb failure stress changes on the aftershock planes. All large aftershocks were found to lie in regions of positive stress change, indicating triggering by either co-seismic or post-seismic slip on the mainshock fault.
Finally, I investigated the relationship between mainshock-induced stress changes and the associated seismicity parameters, in particular those of the frequency-magnitude (Gutenberg-Richter) distribution and of the temporal aftershock decay (Omori-Utsu law). For that purpose, I used my global data set of 127 mainshock-aftershock sequences with the calculated Coulomb stress changes (ΔCFS) and the alternative receiver-independent stress metrics in the vicinity of the mainshocks and analyzed how the aftershock properties depend on the stress values. Surprisingly, the results show a clear positive correlation between the Gutenberg-Richter b-value and the induced stress, contrary to expectations from laboratory experiments. This observation highlights the significance of structural heterogeneity and strength variations for seismicity patterns. Furthermore, the study demonstrates that aftershock productivity increases nonlinearly with stress, while the Omori-Utsu parameters c and p systematically decrease with increasing stress changes. These partly unexpected findings have significant implications for future estimations of aftershock hazard.
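As a minimal illustration of the two seismicity laws named above, the sketch below estimates the Gutenberg-Richter b-value with Aki's maximum-likelihood formula and evaluates the modified Omori-Utsu aftershock rate n(t) = K/(t + c)^p. The function names and the synthetic catalog are assumptions for illustration only, not the thesis implementation.

```python
import numpy as np

def b_value_aki(magnitudes, m_c):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965)
    for magnitudes at or above the completeness magnitude m_c."""
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

def omori_rate(t, K, c, p):
    """Modified Omori-Utsu aftershock rate n(t) = K / (t + c)**p."""
    return K / (t + c) ** p

# Synthetic GR catalog with true b = 1 (magnitudes exponential above m_c)
rng = np.random.default_rng(0)
mags = 2.0 + rng.exponential(scale=1 / np.log(10), size=100_000)
assert abs(b_value_aki(mags, 2.0) - 1.0) < 0.05

# Omori decay: the aftershock rate decreases with elapsed time
assert omori_rate(1.0, 100.0, 0.1, 1.1) > omori_rate(10.0, 100.0, 0.1, 1.1)
```

Fitting these parameters per sequence and regressing them against the local stress values is the kind of analysis summarized in the paragraph above.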
The findings of this thesis provide valuable insights into earthquake triggering mechanisms by examining the relationship between stress changes and aftershock occurrence. The results contribute to an improved understanding of earthquake behavior and can aid the development of more accurate probabilistic seismic hazard forecasts and risk reduction strategies.
Arachidonic acid lipoxygenases (ALOX isoforms) are lipid-peroxidizing enzymes that play important roles in cell differentiation and in the pathogenesis of various diseases. The human genome contains six functional ALOX genes, each present as a single-copy gene. For every human ALOX gene there is an orthologous mouse gene. Although the six human ALOX isoforms are structurally very similar, their functional properties differ markedly. In the present work, four different questions concerning the occurrence, the biological role, and the evolution-dependent enzymatic properties of mammalian ALOX isoforms were investigated:
1) Tree shrews (Tupaiidae) are evolutionarily more closely related to humans than rodents and have therefore been proposed as alternative models for studying human diseases. In this work, the arachidonic acid metabolism of tree shrews was investigated for the first time. It was found that the genome of Tupaia belangeri contains four distinct ALOX15 genes and that the encoded enzymes resemble one another in their catalytic properties. This genomic diversity, which exists neither in humans nor in mice, complicates functional studies of the biological role of the ALOX15 pathway. Tupaia belangeri therefore does not appear to be a more suitable animal model for studying the human ALOX15 pathway.
2) According to the evolutionary hypothesis, mammalian ALOX15 orthologs can be divided into arachidonic acid 12-lipoxygenating and arachidonic acid 15-lipoxygenating enzymes. Mammalian species more highly evolved than gibbons express arachidonic acid 15-lipoxygenating ALOX15 orthologs, whereas evolutionarily less developed mammals possess arachidonic acid 12-lipoxygenating enzymes. In this work, eleven new ALOX15 orthologs were expressed as recombinant proteins and functionally characterized. The results fit into the evolutionary hypothesis without contradiction and broaden its experimental basis. The experimental data also confirm the triad concept.
3) Since human and murine ALOX15B orthologs display different functional properties, results from murine disease models on the biological role of ALOX15B cannot be transferred directly to humans. To functionally align the mouse and human ALOX15B orthologs, knock-in mice were generated in this work by in vivo mutagenesis using CRISPR/Cas9 technology. These mice express a humanized mutant (double mutation Tyr603Asp+His604Val) of murine Alox15b. The mice were viable and fertile but showed sex-specific differences from outcrossed wild-type control animals during their individual development.
4) Previous studies on the role of ALOX15B in the inflammatory response postulated an anti-inflammatory effect of the enzyme. In the present work, it was investigated whether humanization of murine Alox15b affects the inflammatory response in two different murine inflammation models. Humanization of murine Alox15b led to more pronounced inflammatory symptoms in the dextran sodium sulfate-induced colitis model. In contrast, humanization of Alox15b attenuated the inflammatory symptoms in the Freund's adjuvant paw edema model. These data indicate that the role of ALOX15B differs between inflammation models.
Hässlich aber gut [Ugly but good]
(2024)
In the context of the growing relevance of digitality in school teaching and the resulting popularity of gaming and gamification as teaching methods, this thesis examines game design as a constructivist approach to computer games. Specifically, it analyzes this method with regard to its suitability for art education. To this end, it discusses to what extent game design as an instructional method promotes learning in general and is suited to developing digital literacy. The focus lies on examining game design in light of the central competence and learning dimensions of art education: artistic production and aesthetic reception as the two key artistic-aesthetic competences, and aesthetic experience as a special learning event which, alongside these competences, is regarded in the art pedagogy discourse as the highest goal of teaching. These three dimensions serve as the levels of analysis for the method under investigation. Game design proves largely beneficial for all three areas, although it plays only a supplementary role with respect to sensory perception in the process of aesthetic reception. Not all areas of the design fields of artistic production are addressed, nor is experimental, open-ended artistic work necessarily enabled. However, all other components of these competence dimensions are addressed, and aesthetic experience in particular is fully fostered. From the perspective of art pedagogy, digital game development can thus be legitimized for use in art lessons. With a view to STEAM education and project-oriented teaching, it can even be recommended.
Advancing digitalization pervades ever more areas of life and gives rise to increasingly complex socio-technical systems. Although these systems are developed to make life easier, they can also produce undesirable side effects. One such side effect could be, for example, the use of data from fitness apps for disadvantageous insurance decisions. These side effects manifest themselves at all levels between the individual and society. Systems with previously unexpected side effects can lead to declining acceptance or a loss of trust. Since such side effects often only become apparent during use, they require special consideration already in the design process. This work aims to contribute a suitable tool for systematic reflection to the design process.
In the present work, an analysis tool was developed for identifying and analyzing complex interaction situations in software development projects. Complex interaction situations are characterized by high dynamics, from which an unpredictability of cause-effect relationships follows. As a result, the actors can no longer foresee the consequences of their own actions but can only reconstruct them in retrospect. This can give rise to faulty interaction sequences on many levels and to the side effects mentioned above. The analysis tool supports designers in every phase of development through guided reflection, helping them anticipate potentially complex interaction situations and counter them by analyzing the possible causes of the perceived complexity.
Starting from the definition of interaction complexity, item indicators for capturing complex interaction situations were developed, which are then analyzed against suitable criteria for complexity. The analysis tool is structured as a do-it-yourself questionnaire with self-administered evaluation. The genesis of the questionnaire and the results of an evaluation with five software developers are presented. The analysis tool was perceived by the respondents as applicable, effective, and helpful, and thus enjoys high acceptance among the target group. This finding supports the smooth integration of the analysis tool into the software development process.
Additive manufacturing (AM) processes enable the production of metal structures with exceptional design freedom, of which laser powder bed fusion (PBF-LB) is one of the most common. In this process, a laser melts a bed of loose feedstock powder particles layer-by-layer to build a structure with the desired geometry. During fabrication, the repeated melting and rapid, directional solidification create large temperature gradients that generate large thermal stress. This thermal stress can itself lead to cracking or delamination during fabrication. More often, large residual stresses remain in the final part as a footprint of the thermal stress. This residual stress can cause premature distortion or even failure of the part in service. Hence, knowledge of the residual stress field is critical for both process optimization and structural integrity.
Diffraction-based techniques allow the non-destructive characterization of the residual stress fields. However, such methods require a good knowledge of the material of interest, as certain assumptions must be made to accurately determine residual stress. First, the measured lattice plane spacings must be converted to lattice strains with the knowledge of a strain-free material state. Second, the measured lattice strains must be related to the macroscopic stress using Hooke's law, which requires knowledge of the stiffness of the material. Since most crystal structures exhibit anisotropic material behavior, the elastic behavior is specific to each lattice plane of the single crystal. Thus, the use of individual lattice planes in monochromatic diffraction residual stress analysis requires knowledge of the lattice plane-specific elastic properties. In addition, knowledge of the microstructure of the material is required for a reliable assessment of residual stress.
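The two-step conversion described above (measured spacings to lattice strains, then strains to stresses via Hooke's law with the plane-specific diffraction elastic constants S1 and ½S2) can be sketched for the special case of three principal stress directions. The constants and names below are illustrative assumptions, not values from this work.

```python
import numpy as np

def lattice_strain(d_hkl, d0_hkl):
    """Lattice strain from a measured plane spacing and its strain-free reference d0."""
    return (d_hkl - d0_hkl) / d0_hkl

def principal_stresses(strains, s1, half_s2):
    """Invert Hooke's law for the three principal stresses given three
    principal lattice strains, using plane-specific diffraction elastic
    constants S1 and 1/2 S2:
        eps_i = S1*(sig_1 + sig_2 + sig_3) + (1/2 S2)*sig_i
    """
    A = np.full((3, 3), s1) + half_s2 * np.eye(3)
    return np.linalg.solve(A, np.asarray(strains, float))

# Round-trip check with illustrative constants (1/MPa) and stresses (MPa)
s1, half_s2 = -1.5e-6, 7.0e-6
sig = np.array([100.0, 50.0, -30.0])
eps = s1 * sig.sum() + half_s2 * sig
assert np.allclose(principal_stresses(eps, s1, half_s2), sig)
```

The sketch makes explicit why both a reliable strain-free reference d0 and plane-specific elastic constants are prerequisites for accurate residual stress values.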
This work presents a toolbox for reliable diffraction-based residual stress analysis. This is presented for a nickel-based superalloy produced by PBF-LB. First, this work reviews the existing literature in the field of residual stress analysis of laser-based AM using diffraction-based techniques. Second, the elastic and plastic anisotropy of the nickel-based superalloy Inconel 718 produced by PBF-LB is studied using in situ energy dispersive synchrotron X-ray and neutron diffraction techniques. These experiments are complemented by ex situ material characterization techniques. These methods establish the relationship between the microstructure and texture of the material and its elastic and plastic anisotropy. Finally, surface, sub-surface, and bulk residual stress are determined using a texture-based approach. Uncertainties of different methods for obtaining stress-free reference values are discussed.
The tensile behavior in the as-built condition is shown to be controlled by texture and cellular sub-grain structure, while in the heat-treated condition the precipitation of strengthening phases and grain morphology dictate the behavior. In fact, the results of this thesis show that the diffraction elastic constants depend on the underlying microstructure, including texture and grain morphology. For columnar microstructures in both as-built and heat-treated conditions, the diffraction elastic constants are best described by the Reuss iso-stress model. Furthermore, the low accumulation of intergranular strains during deformation demonstrates the robustness of using the 311 reflection for diffraction-based residual stress analysis with columnar textured microstructures. The differences between texture-based and quasi-isotropic approaches for the residual stress analysis are shown to be insignificant in the observed case. However, the analysis of the sub-surface residual stress distributions shows that different scanning strategies result in a change in the orientation of the residual stress tensor. Furthermore, the location of the critical sub-surface tensile residual stress is related to the surface roughness and the microstructure. Finally, recommendations are given for the diffraction-based determination and evaluation of residual stress in textured additively manufactured alloys.
Portal = Welt retten [Portal = Saving the world]
(2024)
Answering questions, explaining the unknown, solving puzzles, and putting the insights gained to use for the benefit of humanity: this is what drives researchers all over the world. Research is not a secret science conducted behind closed doors. At its best, it serves everyone. It operates free of preconditions and with open outcomes, and precisely for this reason research findings can foster necessary innovation, transformation, or rethinking, and in this way change the world. For the better, one hopes. For this issue of "Portal", we asked University President Prof. Oliver Günther, Ph.D. and the ecologist Prof. Dr. Damaris Zurell whether science can save the world. They agree: research helps many people lead worthwhile and fulfilling lives. But they also emphasize that science cannot achieve this alone; real change requires politics, business, and society.
The many other stories in this issue also tell of how important it is that scientific findings move us to act. In Potsdam, it is not only researchers but also students and staff in technology and administration who help make the university, its surroundings, or "the world out there" a little better, step by step. Jonathan Schorsch, for example, Professor of Jewish Religious and Intellectual History, founded the "Green Sabbath": one day a week on which we give the earth, and ourselves, a small break. The legal scholar Andreas Zimmermann reports on proceedings before the International Court of Justice on climate change in which he is involved as a researcher, and his colleague Dr. Anna von Rebay fights as a lawyer for the rights of the ocean against exploitation and pollution. Voltaire Prize winner Gera Gizaw tells the stories of the people in a refugee camp in Kenya, and the medical ethicist Robert Ranisch shows how care can provide even more well-being in the future. University members are committed to helping people from non-academic families advance through education, and the student Tobias Föhl fights global poverty with ONE. Staff from the musicology department extend the life of old furniture and musical instruments, and students work with youth fire brigades from the region. The Better World Award shines a light on innovative ideas that should find their way from the university to the public as quickly as possible. Julia Wandt and Kristin Küter, who advise members of the research community facing hostility, show how important the communication of scientific findings is. For progress to be made, for solutions to the world's problems to be found, one thing must not happen: research must not fall silent.
The paper argues that economists’ position-taking in discourses of crises should be understood in the light of economists’ positions in the academic field of economics. This hypothesis is investigated by performing a multiple correspondence analysis (MCA) on a prosopographical data set of 144 French economists who positioned themselves between 2008 and 2021 in controversies over the euro crisis, the French political economic model, and French economics. In these disciplinary controversies, different forms of (post-)national academic capital are used by economists to either initiate change or defend the status quo. These strategies are then interpreted as part of more general power struggles over the basic national or post-national constitution and legitimate governance of economy and society.
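For readers unfamiliar with the method, MCA can be sketched as a correspondence analysis of the 0/1 indicator matrix (individuals by category dummies): the standardized residuals of the matrix are decomposed by SVD, and the leading axes locate each individual in the space of oppositions. The following is a textbook sketch under these assumptions, not the paper's actual analysis pipeline, and all names are illustrative.

```python
import numpy as np

def mca_coordinates(indicator, n_dims=2):
    """Principal coordinates of individuals from a 0/1 indicator matrix
    (rows: individuals; columns: category dummies) via the SVD of the
    standardized residuals, as in correspondence analysis."""
    Z = np.asarray(indicator, float)
    P = Z / Z.sum()                      # correspondence matrix
    r = P.sum(axis=1)                    # row masses
    c = P.sum(axis=0)                    # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # principal row coordinates: D_r^{-1/2} U Sigma
    return (U * sv)[:, :n_dims] / np.sqrt(r)[:, None]

# Four individuals, two binary variables (one dummy column per category)
indicator = np.array([[1, 0, 1, 0],
                      [1, 0, 0, 1],
                      [0, 1, 1, 0],
                      [0, 1, 0, 1]])
coords = mca_coordinates(indicator)
assert coords.shape == (4, 2)
```

In a study like the one summarized above, the rows would be the 144 economists and the columns their coded career and capital attributes; the resulting axes are then interpreted as oppositions in the academic field.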