The simple generator
(2006)
I argue that the shift of explanatory burden from the generator to the evaluator in OT syntax – together with the difficulties that arise when we try to formulate a working theory of the interfaces of syntax – leads to a number of assumptions about syntactic structures in OT which are quite different from those typical of minimalist syntax: formal features, as driving forces behind syntactic movement, are useless, and derivational and representational economy are problematic for both empirical and conceptual reasons. The notion of markedness, central in Optimality Theory, is not fully compatible with the idea of syntactic economy. What is more, seemingly obvious cases of blocking by structural economy do not seem to result from grammar proper, but reflect (economical) aspects of language use.
Content:
- 1 The Typology
- 1.1 Object Placement
- 2 Treatment of StG in terms of LF Movement – with and without Head Movement
- 3 An OT-solution in terms of linearisation (‘LF-to-PF-Mapping’)
- 3.1 The trigger for additional orders: Focus
- 3.2 Competitions
- 3.3 Summary
- 4 RP
- 4.1 LF Movement – with and without Head Movement
- 4.2 The OT-account for RP
- 4.3 Competitions
- 5 Summary
Counting Markedness
(2003)
This paper reports the results of a corpus investigation on case conflicts in German argument free relative constructions. We investigate how corpus frequencies reflect the relative markedness of free relative and correlative constructions, the relative markedness of different case conflict configurations, and the relative markedness of different conflict resolution strategies. Section 1 introduces the conception of markedness as used in Optimality Theory. Section 2 introduces the facts about German free relative clauses, and section 3 presents the results of the corpus study. By and large, markedness and frequency go hand in hand. However, configurations at the highest end of the markedness scale rarely show up in corpus data, and for the configuration at the lowest end we found an unexpected outcome: the more marked structure is preferred.
The nonlinear interaction of waves excited by the modified two-stream instability (Farley-Buneman instability) is considered. It is found that, during the linear stage of wave growth, the enhanced pressure of the high-frequency part of the waves locally generates a ponderomotive force. This force acts on the plasma particles and redistributes them. Thus an additional electrostatic polarization field occurs, which influences the low-frequency part of the waves. Then, the low-frequency waves also cause a redistribution of the high-frequency waves. In the paper, a self-consistent system of equations is obtained, which describes the nonlinear interaction of the waves. It is shown that the considered mechanism of wave interaction causes a nonlinear stabilization of the high-frequency waves’ growth and a formation of local density structures of the charged particles. The density modifications of the charged particles during the nonlinear stage of wave growth and the possible interval of aspect angles of the high-frequency waves are estimated.
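The ponderomotive force invoked above has a standard textbook form for a charged particle of charge $e$ and mass $m$ in a quasi-monochromatic high-frequency field of frequency $\omega$; this is the generic expression, not necessarily the paper's exact notation:

```latex
\mathbf{F}_p \;=\; -\,\frac{e^{2}}{4\,m\,\omega^{2}}\,\nabla\!\left\langle E^{2}\right\rangle
```

The force pushes particles out of regions of enhanced mean-square field $\langle E^{2}\rangle$, which is precisely the redistribution mechanism the abstract describes as the source of the low-frequency polarization field.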
Connecting the new world
(2012)
This article explores the link between the profound technological transformations of the nineteenth century and the life and work of the Prussian scholar Alexander von Humboldt (1769-1859). It analyses how Humboldt sought to appropriate the revolutionary new communication and transportation technologies of the time in order to integrate the American continent into global networks of commercial, intellectual and material exchange. Recent scholarship on Humboldt’s expedition to the New World (1799-1804) has claimed that his descriptions of tropical landscapes opened up South America to a range of ‘transformative interventions’ (Pratt) by European capitalists and investors. These studies, however, have not analysed the motivations underlying Humboldt’s support for such intrusions into nature. Furthermore, they have not explored the role that such projects played in shaping Humboldt’s understanding of the forces behind the progress of societies. To comprehend Humboldt’s approval of human interventions in America’s natural world, this study first explores the role that eighteenth-century theories of progress and the notion of geographical determinism played in shaping his conception of civilisational development. It then looks at concrete examples of transformative interventions in the American hemisphere that were actively proposed by Humboldt and intended to overcome natural obstacles to human interaction. These were the use of steamships, electric telegraphy, railroads and large-scale canals that together enabled global trade and communication to occur at an unprecedented pace. All these contemporary innovations will be linked to the four motifs of nets, mobility, progress and acceleration, which were driving forces behind the ‘transformation of the world’ that took place in the course of the nineteenth century.
River reaches protected by dikes exhibit high damage potential due to strong value accumulation in the hinterland areas. While providing an efficient protection against low magnitude flood events, dikes may fail under the load of extreme water levels and long flood durations. Hazard and risk assessments for river reaches protected by dikes have not adequately considered the fluvial inundation processes up to now. Particularly, the processes of dike failures and their influence on the hinterland inundation and flood wave propagation lack comprehensive consideration. This study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. 
IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The model was developed and tested on a ca. 91 km heavily diked river reach on the German part of the Elbe River between gauges Torgau and Vockerode. The reach is characterised by low slope and fairly flat extended hinterland areas. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100, 200, 500, 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. In the disaggregated display mode, the dike hazard maps indicate the failure probabilities for each considered breach mechanism. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. Finally, scenarios of polder deployment for the extreme floods with T = 200, 500, 1000 a were simulated with IHAM. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions incorporating effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
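The Monte Carlo embedding described above can be illustrated with a toy sketch: sample random flood peaks, apply a fragility curve to each dike section, and tally failure frequencies into a per-section hazard estimate. Everything below (the linear fragility curve, section names, crest levels, peak distribution) is invented for illustration and stands in for IHAM's coupled 1D/2D hydrodynamics and its three-mechanism probabilistic breach model:

```python
import random

def breach_probability(water_level, crest_level):
    """Invented toy fragility: failure probability grows as the water
    level approaches the dike crest (an overtopping-like mechanism)."""
    freeboard = crest_level - water_level
    if freeboard <= 0.0:
        return 1.0
    return max(0.0, 0.5 - freeboard)  # linear toy fragility curve

def monte_carlo_dike_hazard(sections, n_runs=10000, seed=42):
    """Estimate per-section failure probabilities over random flood peaks.

    sections: list of (name, crest_level_m) pairs.
    Returns a dict mapping section name to estimated failure probability.
    """
    rng = random.Random(seed)
    failures = {name: 0 for name, _ in sections}
    for _ in range(n_runs):
        peak = rng.uniform(2.0, 4.0)  # random input hydrograph peak [m]
        for name, crest in sections:
            if rng.random() < breach_probability(peak, crest):
                failures[name] += 1
    return {name: count / n_runs for name, count in failures.items()}

sections = [("km 10", 3.0), ("km 20", 3.5), ("km 30", 4.2)]
hazard = monte_carlo_dike_hazard(sections)
# Sections with lower crests come out with higher estimated failure probability.
```

The real system replaces the fragility curve with physically based breach mechanisms (overtopping, piping, micro-instability) and propagates each sampled breach through the 2D storage cell model, but the outer sampling loop has this shape.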
Using a special technique of data analysis, we have identified 34 grand minima of solar activity in a 7,700-year-long Δ14C record. The method used rests on a proper filtering of the Δ14C record and the extrapolation of verifiable results for the later history back in time. Additionally, we use a method of nonlinear dynamics, the recurrence rate, to back up the results. Our findings do not contradict the record of solar maxima and minima by Eddy [5], but constitute a considerable extension of it. Hence, it has become possible to look more closely at the validity of models. In this way, we have tested several models of solar activity, especially the model of Barnes et al. [1]. There are hints that the grand minima might solely be driven by the 209-year period found in the Δ14C record.
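The recurrence rate mentioned above is a standard measure from recurrence quantification analysis: the fraction of pairs of points in a time series that lie within a threshold ε of each other. The following is a minimal one-dimensional sketch of that measure, not the authors' implementation (real applications usually embed the series in a higher-dimensional phase space first):

```python
def recurrence_rate(series, eps):
    """Fraction of index pairs (i, j) with |x_i - x_j| < eps.

    A 1-D special case of the recurrence rate used in nonlinear
    time-series analysis; values near 1 indicate a highly recurrent
    (e.g. periodic or constant) signal, values near 0 a drifting one.
    """
    n = len(series)
    hits = sum(
        1
        for i in range(n)
        for j in range(n)
        if abs(series[i] - series[j]) < eps
    )
    return hits / (n * n)

constant = [1.0] * 50                 # perfectly recurrent signal
ramp = [0.1 * k for k in range(50)]   # monotonically drifting signal
# The constant signal yields recurrence rate 1.0; the ramp only recurs
# on the diagonal i == j.
```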
The factors that determine the efficiency of energy transfer in aquatic food webs have been investigated for many decades. The plant-animal interface is the most variable and least predictable of all levels in the food web. In order to study determinants of food quality in a large lake and to test the recently proposed central importance of the long-chain eicosapentaenoic acid (EPA) at the pelagic producer-grazer interface, we tested the importance of polyunsaturated fatty acids (PUFAs) at the pelagic producer-consumer interface by correlating sestonic food parameters with somatic growth rates of a clone of Daphnia galeata. Daphnia growth rates were obtained from standardized laboratory experiments spanning one season with Daphnia feeding on natural seston from Lake Constance, a large pre-alpine lake. Somatic growth rates were fitted to sestonic parameters by using a saturation function. A moderate amount of variation was explained when the model included the elemental parameters carbon (r2 = 0.6) and nitrogen (r2 = 0.71). A tighter fit was obtained when sestonic phosphorus was incorporated (r2 = 0.86). The nonlinear regression with EPA was relatively weak (r2 = 0.77), whereas the highest degree of variance was explained by three C18-PUFAs. The best (r2 = 0.95), and only significant, correlation of Daphnia's growth was found with the C18-PUFA α-linolenic acid (α-LA; C18:3n-3). This correlation was weakest in late August when C:P values increased to 300, suggesting that mineral and PUFA-limitation of Daphnia's growth changed seasonally. Sestonic phosphorus and some PUFAs showed not only tight correlations with growth, but also with sestonic α-LA content. We computed Monte Carlo simulations to test whether the observed effects of α-LA on growth could be accounted for by EPA, phosphorus, or one of the two C18-PUFAs, stearidonic acid (C18:4n-3) and linoleic acid (C18:2n-6).
With >99 % probability, the correlation of growth with α-LA could not be explained by any of these parameters. In order to test for EPA limitation of Daphnia's growth, in parallel with experiments on pure seston, growth was determined on seston supplemented with chemostat-grown, P-limited Stephanodiscus hantzschii, which is rich in EPA. Although supplementation increased the EPA content 80-800x, no significant changes in the nonlinear regression of the growth rates with α-LA were found, indicating that growth of Daphnia on pure seston was not EPA limited. This indicates that the two fatty acids, EPA and α-LA, were not mutually substitutable biochemical resources and points to different physiological functions of these two PUFAs. These results support the PUFA-limitation hypothesis for sestonic C:P < 300 but are contrary to the hypothesis of a general importance of EPA, since no evidence for EPA limitation was found. It is suggested that the resource ratios of EPA and α-LA rather than the absolute concentrations determine which of the two resources is limiting growth.
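The abstract fits somatic growth rates to sestonic parameters with a saturation function but does not state its functional form. A Michaelis-Menten-type curve, g(S) = g_max · S / (k + S), is a common choice for such fits; the sketch below assumes that form and recovers its parameters from synthetic data by a brute-force grid search (all numbers invented), standing in for the study's nonlinear regression:

```python
def saturation(s, g_max, k):
    """Michaelis-Menten-type saturation curve (assumed functional form)."""
    return g_max * s / (k + s)

def fit_saturation(xs, ys):
    """Least-squares fit of (g_max, k) by brute-force grid search;
    an illustrative stand-in for proper nonlinear regression."""
    best = None
    for g_max in [0.1 * i for i in range(1, 51)]:   # grid 0.1 .. 5.0
        for k in [0.1 * j for j in range(1, 51)]:   # grid 0.1 .. 5.0
            sse = sum((y - saturation(x, g_max, k)) ** 2
                      for x, y in zip(xs, ys))
            if best is None or sse < best[0]:
                best = (sse, g_max, k)
    return best[1], best[2]

# Synthetic "food parameter vs. growth rate" data generated with
# g_max = 0.5 and k = 1.0; the grid search should recover both.
xs = [0.2, 0.5, 1.0, 2.0, 4.0, 8.0]
ys = [saturation(x, 0.5, 1.0) for x in xs]
g_max_hat, k_hat = fit_saturation(xs, ys)
```

In practice one would use a dedicated nonlinear least-squares routine rather than a grid search, but the fitted curve and the r² values reported above arise from exactly this kind of model.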
Significant seasonal variation in size at settlement has been observed in newly settled larvae of Dreissena polymorpha in Lake Constance. Diet quality, which varies temporally and spatially in freshwater habitats, has been suggested as a significant factor influencing life history and development of freshwater invertebrates. Accordingly, experiments were conducted with field-collected larvae to test the hypothesis that diet quality can determine planktonic larval growth rates, size at settlement and subsequent post-metamorphic growth rates. Larvae were fed one of two diets or starved. One diet was composed of cyanobacterial cells which are deficient in polyunsaturated fatty acids (PUFAs), and the other was a mixed diet rich in PUFAs. Freshly metamorphosed animals from the starvation treatment had a carbon content per individual 70% lower than that of larvae fed the mixed diet. This apparent exhaustion of larval internal reserves resulted in a 50% reduction of the postmetamorphic growth rates. Growth was also reduced in animals previously fed the cyanobacterial diet. Hence, low food quantity or low food quality during the larval stage of D. polymorpha leads to irreversible effects in postmetamorphic animals and is related to inferior competitive abilities.
Unity in diversity
(2005)
This paper describes the creation and preparation of TUSNELDA, a collection of corpus data built for linguistic research. The collection contains a number of linguistically annotated corpora which differ in various respects, such as language, text types / data types, encoded annotation levels, and the linguistic theories underlying the annotation. The paper focuses on this variation on the one hand and on how these heterogeneous data are integrated into a single resource on the other.
Agriculture is one of the most important human activities, providing food and other agricultural goods for seven billion people around the world, and it is of special importance in sub-Saharan Africa. The majority of people there depend on the agricultural sector for their livelihoods and will suffer from negative climate change impacts on agriculture towards the middle and end of the 21st century, even more so if weak governments, economic crises or violent conflicts endanger the countries’ food security. The impact of temperature increases and changing precipitation patterns on agricultural vegetation motivated this thesis in the first place. Analyzing the potential for reducing negative climate change impacts by adapting crop management to changing climate is a second objective of the thesis. As a precondition for simulating climate change impacts on agricultural crops with a global crop model, the timing of sowing in the tropics was first improved and validated, as this is an important factor determining the length and timing of the crops’ development phases, the occurrence of water stress and final crop yield. Crop yields are projected to decline in most regions, as is evident from the results of this thesis, but the uncertainties that exist in climate projections and in the efficiency of adaptation options because of political, economic or institutional obstacles have to be considered. The effects of temperature increases and of changing precipitation patterns on crop yields can be analyzed separately and vary in space across the continent. Southern Africa is clearly the region most susceptible to climate change, especially to precipitation changes.
The Sahel north of 13° N and parts of Eastern Africa with short growing seasons below 120 days and limited wet season precipitation of less than 500 mm are also vulnerable to precipitation changes, while in most other parts of East and Central Africa, in contrast, the effect of temperature increase on crops outweighs the precipitation effect and is most pronounced in a band stretching from Angola to Ethiopia in the 2060s. The results of this thesis confirm the findings of previous studies on the magnitude of climate change impacts on crops in sub-Saharan Africa but, beyond that, help to understand the drivers of these changes and the potential of certain management strategies for adaptation in more detail. Crop yield changes depend on the initial growing conditions, on the magnitude of climate change, and on the crop, cropping system and adaptive capacity of African farmers, which only now becomes evident from this comprehensive study for sub-Saharan Africa. Furthermore, this study improves the representation of tropical cropping systems in a global crop model and considers the major food crops cultivated in sub-Saharan Africa and climate change impacts throughout the continent.
Reviewed work: Gartner, Isabella: Menorah : Jüdisches Familienblatt für Wissenschaft, Kunst und Literatur (1923–1932) ; Materialien zur Geschichte einer Wiener zionistischen Zeitschrift. - Würzburg : Königshausen & Neumann, 2009. - 356 pp. ISBN 978-3-8260-3864-8
The interaction between massive star formation and gas is a key ingredient in galaxy evolution. Given the level of observational detail currently achievable in nearby starbursts, they constitute ideal laboratories to study interaction processes that contribute to global evolution in all types of galaxies. Wolf-Rayet (WR) stars, as an observational marker of high mass star formation, play a pivotal role and their winds can strongly influence the surrounding gas. Imaging spectroscopy of two nearby (<4 Mpc) starbursts, both of which show multiple regions with WR stars, is discussed. The relation between the WR content and the physical and chemical properties of the surrounding ionized gas is explored.
INTEGRAL tripled the number of supergiant high-mass X-ray binaries (sgHMXB) known in the Galaxy by revealing absorbed and fast transient (SFXT) systems. Quantitative constraints on the wind clumping of massive stars can be obtained from the study of the hard X-ray variability of SFXT. A large fraction of the hard X-ray emission is emitted in the form of flares with a typical duration of 3 ksec, frequency of 7 days and luminosity of $10^{36}$ erg/s. Such flares are most probably emitted by the interaction of a compact object orbiting at $\sim10~R_*$ with wind clumps ($10^{22 ... 23}$ g) representing a large fraction of the stellar mass-loss rate. The density ratio between the clumps and the inter-clump medium is $10^{2 ... 4}$. The parameters of the clumps and of the inter-clump medium, derived from the SFXT flaring behavior, are in good agreement with the macro-clumping scenario and line-driven instability simulations. SFXT are likely to have larger orbital radii than classical sgHMXB.
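As a rough consistency check of the flare energetics (my numbers and assumptions, not the paper's): powering a typical flare by accretion onto a canonical neutron star requires only a small fraction of a single clump's mass, consistent with flares tracing individual clump encounters.

```python
# Back-of-the-envelope check of the clump-accretion picture.
# The compact-object parameters are ASSUMED (the abstract does not state
# them): a canonical neutron star of 1.4 solar masses and 10 km radius.

G = 6.674e-8             # gravitational constant [cm^3 g^-1 s^-2]
M_NS = 1.4 * 1.989e33    # assumed neutron-star mass [g]
R_NS = 1.0e6             # assumed neutron-star radius [cm]

flare_luminosity = 1e36  # erg/s, typical flare luminosity from the abstract
flare_duration = 3e3     # s, typical flare duration from the abstract

energy = flare_luminosity * flare_duration  # total flare energy [erg]
efficiency = G * M_NS / R_NS                # energy released per gram accreted [erg/g]
accreted_mass = energy / efficiency         # mass needed to power one flare [g]

clump_mass = 1e22                           # g, lower end of the quoted clump masses
fraction = accreted_mass / clump_mass       # well under 1 % of a single clump
```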
In this contribution, we gather major academic and design approaches for explaining how space in games is constructed and how it constructs games, thereby defining the conceptual dimensions of gamespace. Each concept’s major inquiry is briefly discussed, iterated if applicable, as well as named. Thus, we conclude with an overview of the locative, the representational, the programmatic, the dramaturgical, the typological, the perspectivistic, the form-functional, and the form-emotive dimensions.
Content:
- Synopsis
- The Attitudes toward Rape Victims Scale: Psychometric Data from 14 Countries
- Scale Construction and Validation
- Study One: Preliminary Analyses
- Study Two: Test-Retest Reliability
- Study Three: Construct Validity
- Cross-cultural Extensions
- United States
- United Kingdom
- Germany
- New Zealand
- Canada
- West Indies
- Israel
- Turkey
- India
- Hong Kong
- Malaysia
- Zimbabwe
- Mexico
- Metric Equivalence
- Discussion
Logic as a medium
(2010)
Computer games are rigid in a peculiar way: the logic of computation was the first to shape the early games. The logic of interactivity marked the action genre of games in the second place, while in massive multiplayer online gaming all the emergences of the net occur to confront us with yet another type of logic. These logics are the media in which the specific forms of computer games evolve. Gaming is therefore examined under the supposition that there are three eras of computation: the early synthetical era, ruled by the Turing machine and by mainframe computers, by the IPO principle of computing; the second, mimetical era, when interactivity and graphical user interfaces dominate, the domain of the feedback loop; and the third, emergent era, in which the complexity of networked personal computers and their users is dominant.
“Financial Analysis” is an online course designed for professionals, consisting of three MOOCs and offering a professionally and institutionally recognized certificate in finance. The course is open but not free of charge and attracts mostly professionals from the banking industry. The primary objective of this study is to identify indicators that can predict learners at high risk of failure. To achieve this, we analyzed data from a previous run of the course, which had 875 enrolled learners who took the course during Fall 2021. We utilized correspondence analysis to examine demographic and behavioral variables.
The initial results indicate that demographic factors have a minor impact on the risk of failure in comparison to learners’ behaviors on the course platform. Two primary profiles were identified: (1) successful learners who utilized all the documents offered and spent between one to two hours per week, and (2) unsuccessful learners who used less than half of the proposed documents and spent less than one hour per week. Between these groups, at-risk students were identified as those who used more than half of the proposed documents and spent more than two hours per week. The goal is to identify those in group 1 who may be at risk of failing and those in group 2 who may succeed in the current MOOC, and to implement strategies to assist all learners in achieving success.
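The two behavioural profiles above suggest a simple rule-of-thumb classifier over the two variables reported (fraction of offered documents used, hours per week). The thresholds below are taken directly from the profile description; the function itself is purely illustrative and is not the authors' correspondence-analysis model:

```python
def learner_profile(docs_fraction, hours_per_week):
    """Classify a learner by the behavioural profiles described above.

    docs_fraction:   share of offered documents the learner used (0..1).
    hours_per_week:  weekly time spent on the course platform.
    """
    if docs_fraction >= 1.0 and 1.0 <= hours_per_week <= 2.0:
        return "likely successful"        # profile (1)
    if docs_fraction < 0.5 and hours_per_week < 1.0:
        return "at risk of failure"       # profile (2)
    if docs_fraction > 0.5 and hours_per_week > 2.0:
        return "intermediate / at-risk"   # the in-between group
    return "unclassified"
```

A threshold rule like this could serve as a cheap early-warning heuristic on the platform, with the correspondence analysis providing the evidence that these two variables are the discriminating ones.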
In semi-arid savannah ecosystems, the vegetation structure and composition, i.e. the architecture of trees, shrubs, grass tussocks and herbaceous plants, offer a great variety of habitats and niches to sustain animal diversity. In the last decades, intensive human land use practices like livestock farming have altered the vegetation in savannah ecosystems worldwide. Extensive grazing leads to a reduction of the perennial and herbaceous vegetation cover, which results in an increased availability of bare soil. Both the missing competition with perennial grasses and the increase in bare soil favour shrubs on open ground and lead to area-wide shrub encroachment. As a consequence of the altered vegetation structure and composition, the structural diversity declines. It has been shown that with decreasing structural diversity, animal diversity declines across a variety of taxa. Knowledge of the effects of overgrazing on reptiles, which are an important part of the ecosystem, is missing. Furthermore, the impact of habitat degradation on factors of a species' population dynamics and life history, e.g. birth rate, survival rate, predation risk, space requirements or behavioural adaptations, is poorly known. Therefore, I first investigated the impact of overgrazing on the reptile community in the southern Kalahari. Second, I analysed the population dynamics and behaviour of the Spotted Sand Lizard, Pedioplanis l. lineoocellata. All four chapters clearly demonstrate that habitat degradation caused by overgrazing had a severe negative impact upon (i) the reptile community as a whole and (ii) population parameters of Pedioplanis l. lineoocellata. Chapter one showed a significant decline of regional reptile diversity and abundance in degraded habitats. In chapter two I demonstrated that P. lineoocellata moves more frequently, spends more time moving and covers larger distances in degraded than in non-degraded habitats.
In addition, home range size of the lizard species increases in degraded habitats, as shown in chapter three. Finally, chapter four showed the negative impacts of overgrazing on several population parameters of P. lineoocellata. Absolute population size of adult and juvenile lizards, survival rate and birth rate are significantly lower in degraded habitats. Furthermore, predation risk was greatly increased in degraded habitats. A combination of several aspects can explain the negative impact of habitat degradation on reptiles. First, reduced prey availability negatively affects the survival rate, the birth rate and overall abundance. Second, the loss of perennial plant cover leads to a loss of niches and to a reduction of opportunities to thermoregulate. Furthermore, a loss of cover is associated with increased predation risk. A major finding of my thesis is that the lizard P. lineoocellata can alter its foraging strategy. Species that are able to adapt and change their behaviour, such as P. lineoocellata, can effectively buffer against changes in their environment. Furthermore, perennial grass cover can be seen as a crucial ecological component of the vegetation in the semi-arid savannah system of the southern Kalahari. If perennial grass cover is reduced beyond a certain degree, reptile diversity will decline and most other aspects of reptile life history will be negatively influenced. Savannah systems are characterised by a mixture of trees, shrubs and perennial grasses. These three vegetation components determine the composition and structure of the vegetation and accordingly influence faunal diversity. Trees are viewed as keystone structures and focal points of animal activity for a variety of species. Trees supply animals with shelter, shade and food and act as safe sites, nesting sites, observation posts and foraging sites. Recent research demonstrates a positive influence of shrub patches on animal diversity.
Moreover, it would seem that intermediate shrub cover can also sustain viable populations in savannah landscapes, as has been demonstrated for small carnivores and rodent species. The influence of perennial grasses on faunal diversity has not received the same attention as the influence of trees and shrubs. In my thesis I did not explicitly measure the direct effects of perennial grasses, but my results strongly imply that they play an important role. If the perennial grass cover is significantly depleted, my results suggest it will negatively influence reptile diversity and abundance as well as several population parameters of P. lineoocellata. Perennial grass cover is associated with the highest prey abundance, reptile diversity and reptile abundance. It provides reptiles with both a refuge from predators and opportunities to optimise thermoregulation. The relevance of each of the three vegetation structural elements differs for each taxon and species. In conclusion, all three major vegetation structures in the savannah system are important for faunal diversity.
Jacob Brandon Maduro’s Memoirs and Related Observations (Havana, 1953) speak to the lasting yet malleable legacy of Jewish Caribbean/Atlantic mercantile communities that defined early modern settlement in the Americas. A close reading of the Memoirs, alongside relevant archival records and community narratives, lends new perspectives to scholarship on Port Jewries and the Atlantic Diaspora. Specifically concerned with Jacob’s adoption of such leading intellectual and political tropes as the Monroe doctrine, José Martí’s Nuestra America, and a Zionism that evolved from an ideology to a reality, the Memoirs reveal a narrative at once defined by the tremendous upheavals of the first half of the 20th century, and an enduring sense of Jewish diasporic peoplehood defined through a Port Jew paradigm whereby the preservation of Jewish ethnicity is understood as synonymous with the championing of modernity.
During the National Multiplication Training in Kenya in 2018, participants raised concerns about attrition, completion rates and quality of PhD programmes in Kenya’s public universities. This led the authors of this article to further examine the question of PhD completion rates. Available data underlined that PhD students across various disciplines in Kenya’s public universities take unnecessarily long to complete their studies due to a myriad of factors that are related to their supervisors, university guidelines for post-graduate studies, or the students themselves. This article examines inertia areas along the PhD training pathway at three public universities in Kenya and provides suggestions on structural and operational changes universities must make to shorten completion periods.
The site of confluence of the artery and the portal vein in the liver still appears to be controversial. Anatomical studies suggested a presinusoidal or an intrasinusoidal confluence in the first, second or even final third of the sinusoids. The objective of this investigation was to study the problem with functional biochemical techniques. Rat livers were perfused through the hepatic artery and simultaneously either in the orthograde direction from the portal vein to the hepatic vein or in the retrograde direction from the hepatic vein to the portal vein. Arterial flow was linearly dependent on arterial pressure between 70 cm H2O and 120 cm H2O at a constant portal or hepatovenous pressure of 18 cm H2O. An arterial pressure of 100 cm H2O was required for the maintenance of a homogeneous orthograde perfusion of the whole parenchyma and of a physiologic ratio of arterial to portal flow of about 1:3. Glucagon was infused either through the artery or the portal vein and hepatic vein, respectively, to a submaximally effective 'calculated' sinusoidal concentration after mixing of 0.1 nmol/L. During orthograde perfusions, arterial and portal glucagon caused the same increases in glucose output. Yet during retrograde perfusions, hepatovenous glucagon elicited metabolic alterations equal to those in orthograde perfusions, whereas arterial glucagon effected changes strongly reduced to between 10% and 50%. Arterially infused trypan blue was distributed homogeneously in the parenchyma during orthograde perfusions, whereas it reached clearly smaller areas of parenchyma during retrograde perfusions. Finally, arterially applied acridine orange was taken up by all periportal hepatocytes in the proximal half of the acinus during orthograde perfusions but only by a much smaller portion of periportal cells in the proximal third of the acinus during retrograde perfusions.
These findings suggest that in rat liver, the hepatic artery and the portal vein mix before and within the first third of the sinusoids, rather than in the middle or even last third.
This thesis aims to quantify the human impact on the natural resource water at the landscape scale. The drivers in the federal state of Brandenburg (Germany), the area under investigation, are land-use changes induced by policy decisions at European and federal state level. The water resources of the federal state are particularly sensitive to changes in land-use due to low precipitation rates in the summer combined with sandy soils and high evapotranspiration rates. Key elements in landscape hydrology are forests because of their unique capacity to transport water from the soil to the atmosphere. Given these circumstances, decisions made at any level of administration that may have effects on the forest sector in the state are critical in relation to the water cycle. It is therefore essential to evaluate any decision that may change forest area and structure in such a sensitive region. Thus, as a first step, it was necessary to develop and implement a model able to simulate possible interactions and feedbacks between forested surfaces and the hydrological cycle at the landscape scale. The result is a model for simulating the hydrological properties of forest stands based on a robust computation of the temporal and spatial LAI (leaf area index) dynamics. The approach allows the simulation of all relevant hydrological processes with a low parameter demand. It includes the interception of precipitation and transpiration of forest stands with and without groundwater in the rooting zone. The model also considers phenology, biomass allocation, as well as mortality and simple management practices. It has been implemented as a module in the eco-hydrological model SWIM (Soil and Water Integrated Model). This model has been tested in two pre-studies to verify the applicability of its hydrological process description for the hydrological conditions typical for the state. 
The newly implemented forest module has been tested for Scots Pine (Pinus sylvestris) and in parts for Common Oak (Quercus robur and Q. petraea) in Brandenburg. For Scots Pine the results demonstrate a good simulation of annual biomass increase and LAI in addition to a satisfactory simulation of litter production. A comparison of the simulated and measured data of the May sprout for Scots Pine and of leaf unfolding for Oak, as well as the evaluation against daily transpiration measurements for Scots Pine, supports the applicability of the approach. The interception of precipitation has also been simulated and compared with weekly observed data for a Scots Pine stand, with satisfactory results for both the vegetation periods and the annual sums. After the development and testing phase, the model was used to analyse the effects of two scenarios. The first scenario is an increase in forest area on abandoned agricultural land, triggered by a decrease in European agricultural production support. The second is a shift in species composition from predominant Scots Pine to Common Oak, based on decisions of the regional forestry authority to support a more natural species composition. The scenario effects are modelled for the federal state of Brandenburg on a 50 m grid utilising spatially explicit land-use patterns. The results for the first scenario suggest a negative impact of an increase in forest area (9.4% of the total state area) on the regional water balance, causing an increase in mean long-term annual evapotranspiration of 3.7% at 100% afforestation when compared to no afforestation. The relatively small annual change conceals a much more pronounced seasonal effect: a mean long-term evapotranspiration increase of 25.1% in spring, causing a pronounced reduction in groundwater recharge and runoff. The reduction causes a lag effect that aggravates the scarcity of water resources in the summer.
In contrast, in the second scenario, a change in species composition in existing forests (29.2% of the total state area) from predominantly Scots Pine to Common Oak decreases the long-term annual mean evapotranspiration by 3.4%, accompanied by a much weaker, but apparent, seasonal pattern. Both scenarios exhibit a high spatial heterogeneity because of the distinct natural conditions in the different regions of the state. Areas with groundwater levels near the surface are particularly sensitive to changes in forest area, and regions with a relatively high proportion of forest respond strongly to the change in species composition. In both cases this regional response is masked by a smaller linear mean effect for the total state area. Two critical sources of uncertainty in the model results have been investigated. The first originates from the model calibration parameters estimated in the pre-study for lowland regions such as the federal state. The combined effect of the parameters, when changed within their physically meaningful limits, reveals an overestimation of the mean water balance by 1.6%. However, the distribution has a wide spread, with 14.7% for the 90th percentile and -9.9% for the 10th percentile. The second source of uncertainty emerges from the parameterisation of the forest module. The analysis exhibits a standard deviation of 0.6% over a ten-year period in the mean of the simulated evapotranspiration as a result of variance in the key forest parameters. The analysis suggests that the combined uncertainty in the model results is dominated by the uncertainties of the calibration parameters. Therefore, the effect of the first scenario might be underestimated because the calculated increase in evapotranspiration is too small. This may lead to an overestimation of the water balance towards runoff and groundwater recharge. The opposite can be assumed for the second scenario, in which the decrease in evapotranspiration might be overestimated.
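The percentile-based uncertainty figures quoted in the abstract can be reproduced in outline by sampling calibration parameters within plausible bounds and summarising the resulting water-balance deviations. The sketch below is purely illustrative: the parameter names, ranges, and the toy linear response function are assumptions for demonstration, not the SWIM model or the thesis's actual parameterisation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration parameters, varied uniformly within
# assumed physically meaningful bounds (not the actual SWIM bounds).
n_samples = 10_000
evap_corr = rng.uniform(0.9, 1.1, n_samples)   # evapotranspiration correction
root_depth = rng.uniform(0.8, 1.2, n_samples)  # rooting-depth scaling

# Toy response: relative deviation of the simulated water balance (%)
# as a linear combination of the sampled parameter perturbations.
deviation = 50.0 * (evap_corr - 1.0) + 30.0 * (root_depth - 1.0)

# Summarise the spread as in the thesis: mean, 10th and 90th percentiles.
p10, p90 = np.percentile(deviation, [10, 90])
print(f"mean: {deviation.mean():+.1f}%  p10: {p10:+.1f}%  p90: {p90:+.1f}%")
```

With symmetric sampling the mean deviation is near zero while the percentile spread remains wide, mirroring the pattern reported above (a small mean bias with a broad 10th-to-90th-percentile range).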
This paper originated from discussions about the need for
important changes in the curriculum for Computing, including two focus
group meetings at IFIP conferences over the last two years. The
paper examines how recent developments in curriculum, together with
insights from curriculum thinking in other subject areas, especially mathematics
and science, can inform curriculum design for Computing.
The analysis presented in the paper provides insights into the complexity
of curriculum design as well as identifying important constraints and
considerations for the ongoing development of a vision and framework
for a Computing curriculum.
How Things Work
(2015)
Recognizing and defining functionality is a key competence
applied in all kinds of programming projects. This study investigates
how far students without specific informatics training are able to identify
and verbalize functions and parameters. It presents observations
from classroom activities on functional modelling in high school chemistry
lessons with altogether 154 students. Finally, it discusses the potential
of functional modelling to improve the comprehension of scientific
content.
Acclimatization
(2003)
Together with their wives, Otto and Richard Schomburgk arrived in Port Adelaide (South Australia) on 16 August 1849. The essay looks at how these two brothers, who had received their scientific training and promotion in the circle surrounding Alexander von Humboldt, reacted to the unfamiliar conditions in the young British colony. Some indication will be given of the differences between the Schomburgk brothers' treatment of the natural resources of the new colony and that of the English colonists of the time.
In the middle of the 19th century the question whether expanding civilization and industrialization had an effect on climate was discussed intensely worldwide. It was feared that increasing deforestation would lead to a continuous decrease in rainfall. This first scientific discussion about climate change as the result of human intervention was strongly influenced by the research Alexander von Humboldt and Jean-Baptiste Boussingault had undertaken when they investigated the falling water levels of Lake Valencia in Venezuela. This essay aims to clarify the question whether Alexander von Humboldt can be counted among the leading figures of modern environmentalism on account of this research, as claimed by Richard H. Grove in his influential book Green Imperialism: Colonial Expansion, Tropical Island Edens and the Origins of Environmentalism, 1600–1860 (1995).
The paper is an enquiry into dynamic social contract theory. The social contract defines the rules of resource use. An intergenerational social contract in an economy with a single exhaustible resource is examined within the framework of an overlapping generations model. It is assumed that new generations do not accept the old social contract and that access to resources will be renegotiated between any incumbent generation and its successors. It turns out that later generations will be in an unfortunate position regardless of their bargaining power.
Mothers of Seafaring
(2023)
The article aims to trace the contribution of Jewish women in the Yishuv’s maritime history. Taking the example of Henrietta Diamond, a founding member and chairperson of the Zebulun Seafaring Society, the article seeks to explore the representation and role of women in a growing Jewish maritime domain from the 1930s to the 1950s. It examines Zionist narratives on the ‘New Jew’ and the Jewish body and studies their relevance for the emerging field of maritime activities in the Yishuv. By contextualizing the work and depiction of Henrietta Diamond, the article sheds new light on the gendered notions that underlay the emergence of the Jewish maritime domain and illustrates the patterns of inclusion and exclusion in it.
The most massive stars are those with the shortest but most active lives. One group of massive stars, the Luminous Blue Variables (LBVs), of which only a few objects are known, is of particular interest concerning the stability of stars. They have a high mass-loss rate and are close to being unstable. This is all the more likely as rotation becomes an important factor in the stellar evolution of these stars. Through massive stellar winds and sometimes giant eruptions, LBV nebulae are formed. Various aspects of evolution in the LBV phase lead, beside the large-scale morphological and kinematical differences, to a diversity of small structures such as clumps, rims, and outflows in these nebulae.
Luminous Blue Variables (LBVs) represent a transitional phase that massive stars may enter while evolving from main-sequence to Wolf-Rayet stars. The photometric variability intrinsic to LBVs is based on the modulation of the stellar spectrum: within a few years the spectrum shifts from OB to AF type and back. During their cool phase LBVs are close to the Humphreys-Davidson (equivalent to the Eddington/Omega-Gamma) limit. LBVs have a rather high mass-loss rate, with stellar winds that are fast in the hot phase and slower in the cool phase. These alternating wind velocities lead to the formation of LBV nebulae through wind-wind interactions. A nebula can also be formed in a spontaneous giant eruption in which larger amounts of mass are ejected. LBV nebulae are generally small (< 5 pc), mainly gaseous circumstellar nebulae, with a rather large fraction of them being bipolar. After the LBV phase the star will turn into a Wolf-Rayet star, although not all WR stars need to have passed through the LBV phase: some evolve from the RSG phase, and the most massive directly from the MS phase. In general, WR stars have a large mass loss and very fast stellar winds. The WR wind may interact with winds of earlier phases (MS, RSG) to form WR nebulae. For WR stars with LBV progenitors the scenario is different: here no older wind is present, but an LBV nebula. The nature of WR nebulae is therefore manifold, and in particular the connection (or family ties) of WR to LBV nebulae is important for understanding the transition between these two phases, the evolution of massive stars, their winds, and wind-wind and wind-nebula interactions. Examining the similarities and differences between LBV and WR nebulae, and determining what constitutes a genuine LBV or WR nebula, are the basic questions addressed in the analysis presented here.
Informatics as a school subject has been virtually absent from bilingual education programs in German secondary schools. Most bilingual programs in German secondary education started out by focusing on subjects from the field of social sciences. Teachers and bilingual curriculum experts alike have been regarding those as the most suitable subjects for bilingual instruction – largely due to the intercultural perspective that a bilingual approach provides. And though one cannot deny the gain that ensues from an intercultural perspective on subjects such as history or geography, this benefit is certainly not limited to social science subjects. In consequence, bilingual curriculum designers have already begun to include other subjects such as physics or chemistry in bilingual school programs. It only seems a small step to extend this to informatics. This paper will start out by addressing potential benefits of adding informatics to the range of subjects taught as part of English-language bilingual programs in German secondary education. In a second step it will sketch out a methodological (= didactical) model for teaching informatics to German learners through English. It will then provide two items of hands-on and tested teaching material in accordance with this model. The discussion will conclude with a brief outlook on the chances and prerequisites of firmly establishing informatics as part of bilingual school curricula in Germany.
This paper describes an almost forgotten chapter in the relatively short history of Jewish-Buddhist interactions. The popularization of Buddhism in Germany in the second half of the 19th century, effected mainly by its positive appraisal in the philosophy of Arthur Schopenhauer, made it a common referent for both critics of Judaism and Christianity and their defenders. At the same time, Judaism was viewed by many as a historically antiquated religion, and Jewish elements in Christianity were regarded as impediments to the progress of European religiosity and culture. The Schopenhauerian conception of “pessimistic” Buddhism and “optimistic” Judaism as the two most distant religious ideas was proudly appropriated by many Jewish thinkers. These Jews portrayed Buddhism as an anti-worldly and anti-social religion of egoistic individuals who seek their own salvation (i. e. annihilation into Nothingness), the most extreme form of pessimism and asceticism, which negates every being, will, work, social structure and transcendence. Judaism, in contrast, represented the direct opposite of all the aforementioned characteristics. In comparison to Buddhism, Judaism stood out as a religion which carried the social and psychological values most needed for a healthy modern society: decisive affirmation of the world, optimism, social activity, co-operation with others, social egalitarianism, true charitability, and religious purity free from all remnants of polytheism, asceticism, and the inefficiently excessive moral demands ascribed to both Buddhism and Christianity.
Through the analysis of texts by Ludwig Philippson, Ludwig Stein, Leo Baeck, Max Eschelbacher, Juda Bergmann, Fritz-Leopold Steinthal, Elieser David and others, this paper tries to show how the image of Buddhism as an antithesis to Judaism helped German Jewish reform thinkers to define the “essence of Judaism” and to prove to both Jewish and Christian audiences its enduring meaningfulness and superiority for modern society.
The debate whether to locate the narrative of digital games (a) as part of the code or (b) as part of the performance is the starting point for an analysis of the respective narrative logics of two role-playing games: the single-player game ZELDA: MAJORA’S MASK and the Korean MMORPG AION. When we understand games as abstract code systems, the narrative logic can be understood as embedded on the code level. With a focus on the player’s performance, the actualization of the possibilities given in the code system is central. Both logics, that of code and that of performance, are reflected in players’ narratives based on the playing experience. These narratives reflect on the underlying code and rules of the game system as well as on the game world and the players’ own performance within it. They rely heavily on their source text – the digital game – which means that they give insights into its underlying logics. I will discuss the game structure, the players’ performance while playing the game, and the performance of the players after playing, when they produce fan narratives. I conceive the narrative structure and the performance of the player playing as necessarily interconnected when we discuss the narrative logics of a game. Producing fan narratives is understood as a performance as well, one that is based on the experience the players gained while playing and refers to both logics of the game they use as their source text.
The spectral efficiency of blackness induction was measured in three normal trichromatic observers and in one deuteranomalous observer. The psychophysical task was to adjust the radiance of a monochromatic 60–120′ annulus until a 45′ central broadband field just turned black and its contour became indiscriminable from a dark surrounding gap that separated it from the annulus. The reciprocal of the radiance required to induce blackness with annulus wavelengths between 420 and 680 nm was used to define a spectral-efficiency function for the blackness component of the achromatic process. For each observer, the shape of this blackness-sensitivity function agreed with the spectral-efficiency function based on heterochromatic flicker photometry when measured with the same 60–120′ annulus. Both of these functions matched the Commission Internationale de l'Eclairage Vλ function except at short wavelengths. Ancillary measurements showed that the latter difference in sensitivity can be ascribed to nonuniformities of preretinal absorption, since the annular field excluded the central 60′ of the fovea. Thus our evidence indicates that, at least to a good first approximation, induced blackness is inversely related to the spectral-luminosity function. These findings are consistent with a model that separates the achromatic and the chromatic pathways.
The optical density of human macular pigment was measured for 50 observers ranging in age from 10 to 90 years. The psychophysical method required adjusting the radiance of a 1° monochromatic light (400–550 nm) to minimize flicker (15 Hz) when presented in counterphase with a 460 nm standard. This test stimulus was presented superimposed on a broad-band, short-wave background. Macular pigment density was determined by comparing sensitivity under these conditions for the fovea, where macular pigment is maximal, and 5° temporally. This difference spectrum, measured for 12 observers, matched Wyszecki and Stiles's standard density spectrum for macular pigment. To study variation in macular pigment density for a larger group of observers, measurements were made at only selected spectral points (460, 500 and 550 nm). The mean optical density at 460 nm for the complete sample of 50 subjects was 0.39. Substantial individual differences in density were found (ca. 0.10–0.80), but this variation was not systematically related to age.
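The density estimate described in this abstract rests on a simple relation: the optical density of the macular pigment at a given wavelength is the difference in log sensitivity between the foveal measurement (where the pigment absorbs) and the parafoveal one (where it is essentially absent). A minimal sketch of that arithmetic, using hypothetical sensitivity values rather than the study's data:

```python
import math

def macular_pigment_density(s_fovea, s_parafovea):
    """Optical density as the log10 ratio of parafoveal to foveal
    flicker sensitivity at the same wavelength (illustrative only)."""
    return math.log10(s_parafovea / s_fovea)

# Hypothetical sensitivities at 460 nm: the pigment absorbs short-wave
# light in the fovea, so foveal sensitivity is lower there.
s_fovea_460 = 0.41
s_parafovea_460 = 1.0
print(round(macular_pigment_density(s_fovea_460, s_parafovea_460), 2))
```

With these assumed values the computed density happens to land near the reported sample mean of 0.39 at 460 nm; the actual study derived sensitivities from flicker-null radiance settings rather than from numbers like these.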
Stop bashing givenness!
(2005)
Elke Kasimir’s paper (in this volume) argues against employing the notion of Givenness in the explanation of accent assignment. I will claim that the arguments against Givenness put forward by Kasimir are inconclusive because they beg the question of the role of Givenness. It is concluded that, more generally, arguments against Givenness as a diagnostic for information structural partitions should not be accepted offhand, since the notion of Givenness of discourse referents is (a) theoretically simple, (b) readily observable and quantifiable, and (c) bears cognitive significance.
Content: 1. Introduction 2. Getting to the Seen from the Unseen 2.1. The Theory of the Zones 2.2. Brief Comments on Mechanism 3. The Areal Evidence: Shared Features and Their Dialectal Provenance 4. Explaining the Evidence Seen 4.1. Why It Is Not Due to Mere Misleading Coincidence 4.2. Why It Is Not Due to French Influence 4.3. Why It Is Not Due to Norse Influence 4.4. Why It Is Not Due to English Influence over Brittonic 4.5. Why It Is Due to Brittonic Influence 5. Conclusion 5.1. The Areal Pattern and Its Explanation 5.2. Substrate versus Superstrate 5.3. Some Final Arguments, and Good Questions 6. Addenda
In recent years computer games have been discussed by a variety of disciplines from various perspectives. A fundamental difference from other media, and a point of continuous consideration, is the specific relationship between the viewer and the image, the player and the game apparatus, which is characteristic of video games as a dispositive. Terms such as immersion, participation, interactivity, or ergodicity are an indication of the deep interest in this constellation. This paper explores the resonance between body and image in video games such as REZ, SOUL CALIBUR and DANCE DANCE REVOLUTION from the perspective of a temporal ontology of the image, taking particular account of the structuring power of the interface and its subject-positioning aspects.
Current contestations of the liberal international order stand in notable contrast with the earlier rise of international law during the post-Cold War period. As Krieger and Liese argue, this situation calls for an assessment of the type of change that is currently observed: norm change (Wandel) or a more fundamental transformation of international law – a metamorphosis (Verwandlung)? To address this question, this paper details the bi-focal approach to norms in order to reflect and take account of the complex interrelation between fact-based and value-based conceptions of norms. The paper is organised in three sections. The first section presents three axioms underlying the conceptual framework to study norm(ative) change, visualised as a triangular operation for analysing this change in relation to practices and norms. The second section recalls three key interests that have guided IR norms research since the return to norms in the late 1980s: first, allocating change in and through practice; second, identifying behavioural change with reference to norm-following; and third, identifying norm(ative) change with reference to discursive practice. The third section presents the two analytical tools of the conceptual frame, namely the norm-typology and the cycle-grid model, and indicates how to apply these tools with reference to illustrative case scenarios. The conclusion recalls the key elements of the conceptual framework for research on norm(ative) change in international relations in light of the challenge of establishing sustainable normativity in the global order.