The factors that determine the efficiency of energy transfer in aquatic food webs have been investigated for many decades. The plant-animal interface is the most variable and least predictable of all levels in the food web. In order to study determinants of food quality in a large lake and to test the recently proposed central importance of the long-chain eicosapentaenoic acid (EPA) at the pelagic producer-grazer interface, we tested the importance of polyunsaturated fatty acids (PUFAs) at the pelagic producer-consumer interface by correlating sestonic food parameters with somatic growth rates of a clone of Daphnia galeata. Daphnia growth rates were obtained from standardized laboratory experiments spanning one season, with Daphnia feeding on natural seston from Lake Constance, a large pre-alpine lake. Somatic growth rates were fitted to sestonic parameters using a saturation function. A moderate amount of variation was explained when the model included the elemental parameters carbon (r2 = 0.6) and nitrogen (r2 = 0.71). A tighter fit was obtained when sestonic phosphorus was incorporated (r2 = 0.86). The nonlinear regression with EPA was relatively weak (r2 = 0.77), whereas the most variance was explained by three C18-PUFAs. The best (r2 = 0.95), and only significant, correlation of Daphnia's growth was found with the C18-PUFA α-linolenic acid (α-LA; C18:3n-3). This correlation was weakest in late August, when C:P values increased to 300, suggesting that mineral and PUFA limitation of Daphnia's growth changed seasonally. Sestonic phosphorus and some PUFAs showed tight correlations not only with growth but also with sestonic α-LA content. We ran Monte Carlo simulations to test whether the observed effects of α-LA on growth could be accounted for by EPA, phosphorus, or one of the other two C18-PUFAs, stearidonic acid (C18:4n-3) and linoleic acid (C18:2n-6).
With >99% probability, the correlation of growth with α-LA could not be explained by any of these parameters. To test for EPA limitation of Daphnia's growth, in parallel with the experiments on pure seston, growth was determined on seston supplemented with chemostat-grown, P-limited Stephanodiscus hantzschii, which is rich in EPA. Although supplementation increased the EPA content 80- to 800-fold, no significant changes in the nonlinear regression of growth rates with α-LA were found, indicating that growth of Daphnia on pure seston was not EPA-limited. This indicates that the two fatty acids, EPA and α-LA, were not mutually substitutable biochemical resources and points to different physiological functions of these two PUFAs. These results support the PUFA-limitation hypothesis for sestonic C:P < 300 but contradict the hypothesis of a general importance of EPA, since no evidence for EPA limitation was found. It is suggested that the resource ratios of EPA and α-LA, rather than their absolute concentrations, determine which of the two resources limits growth.
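The fitting procedure described above, a nonlinear saturation function relating a sestonic food parameter to somatic growth, can be sketched as follows. Neither the exact functional form nor the data are given in the abstract, so this is a minimal illustration assuming a Monod-type curve and synthetic values:

```python
import numpy as np
from scipy.optimize import curve_fit

def saturation(conc, g_max, k_half):
    """Monod-type saturation: growth approaches g_max at high food levels."""
    return g_max * conc / (k_half + conc)

# Synthetic food-parameter concentrations and growth rates (illustrative only)
rng = np.random.default_rng(seed=1)
conc = np.linspace(0.5, 20.0, 15)                 # e.g. sestonic alpha-LA, arbitrary units
growth = saturation(conc, 0.45, 3.0) + rng.normal(0.0, 0.01, conc.size)

# Fit the saturation function and compute r^2, as reported in the study
params, _ = curve_fit(saturation, conc, growth, p0=[0.5, 1.0])
pred = saturation(conc, *params)
r2 = 1.0 - np.sum((growth - pred) ** 2) / np.sum((growth - growth.mean()) ** 2)
```

Each candidate predictor (carbon, nitrogen, phosphorus, EPA, the C18-PUFAs) would be fitted this way separately and the fits compared by their r2 values.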
Significant seasonal variation in size at settlement has been observed in newly settled larvae of Dreissena polymorpha in Lake Constance. Diet quality, which varies temporally and spatially in freshwater habitats, has been suggested as a significant factor influencing the life history and development of freshwater invertebrates. Accordingly, experiments were conducted with field-collected larvae to test the hypothesis that diet quality can determine planktonic larval growth rates, size at settlement and subsequent post-metamorphic growth rates. Larvae were fed one of two diets or starved. One diet was composed of cyanobacterial cells, which are deficient in polyunsaturated fatty acids (PUFAs); the other was a mixed diet rich in PUFAs. Freshly metamorphosed animals from the starvation treatment had a carbon content per individual 70% lower than that of larvae fed the mixed diet. This apparent exhaustion of larval internal reserves resulted in a 50% reduction of post-metamorphic growth rates. Growth was also reduced in animals previously fed the cyanobacterial diet. Hence, low food quantity or quality during the larval stage of D. polymorpha leads to irreversible effects in post-metamorphic animals and is related to inferior competitive abilities.
Unity in diversity
(2005)
This paper describes the creation and preparation of TUSNELDA, a collection of corpus data built for linguistic research. The collection contains a number of linguistically annotated corpora which differ in various aspects such as language, text type, encoded annotation levels, and the linguistic theories underlying the annotation. The paper focuses on this variation on the one hand and, on the other, on the way these heterogeneous data are integrated into a single resource.
Agriculture is one of the most important human activities, providing food and other agricultural goods for seven billion people around the world, and is of special importance in sub-Saharan Africa. The majority of people there depend on the agricultural sector for their livelihoods and will suffer from negative climate change impacts on agriculture towards the middle and end of the 21st century, all the more so where weak governments, economic crises or violent conflicts endanger a country's food security. The impact of temperature increases and changing precipitation patterns on agricultural vegetation motivated this thesis in the first place. Analyzing the potential for reducing negative climate change impacts by adapting crop management to the changing climate is a second objective of the thesis. As a precondition for simulating climate change impacts on agricultural crops with a global crop model, the timing of sowing in the tropics was first improved and validated, as this is an important factor determining the length and timing of the crops' development phases, the occurrence of water stress and final crop yield. Crop yields are projected to decline in most regions, as is evident from the results of this thesis, but the uncertainties in climate projections and in the efficiency of adaptation options due to political, economic or institutional obstacles have to be considered. The effects of temperature increases and of changing precipitation patterns on crop yields can be analyzed separately and vary in space across the continent. Southern Africa is clearly the region most susceptible to climate change, especially to precipitation changes.
The Sahel north of 13° N and parts of Eastern Africa with short growing seasons below 120 days and limited wet-season precipitation of less than 500 mm are also vulnerable to precipitation changes, while in most other parts of East and Central Africa, in contrast, the effect of temperature increase on crops outweighs the precipitation effect and is most pronounced in a band stretching from Angola to Ethiopia in the 2060s. The results of this thesis confirm the findings of previous studies on the magnitude of climate change impacts on crops in sub-Saharan Africa but, beyond that, help to understand the drivers of these changes and the potential of certain management strategies for adaptation in more detail. Crop yield changes depend on the initial growing conditions, on the magnitude of climate change, and on the crop, the cropping system and the adaptive capacity of African farmers, which only now becomes evident from this comprehensive study for sub-Saharan Africa. Furthermore, this study improves the representation of tropical cropping systems in a global crop model and considers the major food crops cultivated in sub-Saharan Africa and climate change impacts throughout the continent.
Reviewed work: Gartner, Isabella: Menorah: Jüdisches Familienblatt für Wissenschaft, Kunst und Literatur (1923-1932); Materialien zur Geschichte einer Wiener zionistischen Zeitschrift. Würzburg: Königshausen & Neumann, 2009. 356 pp. ISBN 978-3-8260-3864-8
The interaction between massive star formation and gas is a key ingredient in galaxy evolution. Given the level of observational detail currently achievable in nearby starbursts, they constitute ideal laboratories for studying interaction processes that contribute to global evolution in all types of galaxies. Wolf-Rayet (WR) stars, as an observational marker of high-mass star formation, play a pivotal role, and their winds can strongly influence the surrounding gas. Imaging spectroscopy of two nearby (<4 Mpc) starbursts, both of which show multiple regions with WR stars, is discussed. The relation between the WR content and the physical and chemical properties of the surrounding ionized gas is explored.
INTEGRAL tripled the number of supergiant high-mass X-ray binaries (sgHMXB) known in the Galaxy by revealing absorbed and supergiant fast X-ray transient (SFXT) systems. Quantitative constraints on the wind clumping of massive stars can be obtained from the study of the hard X-ray variability of SFXTs. A large fraction of the hard X-ray emission is emitted in the form of flares with a typical duration of 3 ks, a recurrence time of 7 days and a luminosity of $10^{36}$ erg/s. Such flares are most probably produced by the interaction of a compact object orbiting at $\sim10~R_*$ with wind clumps ($10^{22 ... 23}$ g) representing a large fraction of the stellar mass-loss rate. The density ratio between the clumps and the inter-clump medium is $10^{2 ... 4}$. The parameters of the clumps and of the inter-clump medium, derived from the SFXT flaring behavior, are in good agreement with the macro-clumping scenario and with line-driven instability simulations. SFXTs are likely to have larger orbital radii than classical sgHMXB.
In this contribution, we gather major academic and design approaches to explaining how space in games is constructed and how it constructs games, thereby defining the conceptual dimensions of gamespace. Each concept's major inquiry is briefly discussed, iterated where applicable, and named. We conclude with an overview of the locative, the representational, the programmatic, the dramaturgical, the typological, the perspectivistic, the form-functional, and the form-emotive dimensions.
Contents:
- Synopsis
- The Attitudes toward Rape Victims Scale: Psychometric Data from 14 Countries
- Scale Construction and Validation: Study One (Preliminary Analyses); Study Two (Test-Retest Reliability); Study Three (Construct Validity)
- Cross-cultural Extensions: United States, United Kingdom, Germany, New Zealand, Canada, West Indies, Israel, Turkey, India, Hong Kong, Malaysia, Zimbabwe, Mexico; Metric Equivalence
- Discussion
Logic as a medium
(2010)
Computer games are rigid in a peculiar way: the logic of computation was the first to shape the early games. The logic of interactivity then marked the action genre of games, while in massive multiplayer online gaming the emergent phenomena of the net confront us with yet another type of logic. These logics are the media in which the specific forms of computer games evolve. The paper therefore looks at gaming under the assumption that there are three eras of computation: the early synthetical era, ruled by the Turing machine and by mainframe computers, by the IPO principle of computing; the second, mimetical era, when interactivity and graphical user interfaces dominate, the domain of the feedback loop; and the third, emergent era, in which the complexity of networked personal computers and their users is dominant.
“Financial Analysis” is an online course designed for professionals, consisting of three MOOCs and offering a professionally and institutionally recognized certificate in finance. The course is open but not free of charge and attracts mostly professionals from the banking industry. The primary objective of this study is to identify indicators that can predict which learners are at high risk of failure. To this end, we analyzed data from a previous run of the course, in which 875 learners enrolled and participated during Fall 2021. We used correspondence analysis to examine demographic and behavioral variables.
The initial results indicate that demographic factors have a minor impact on the risk of failure compared with learners’ behaviors on the course platform. Two primary profiles were identified: (1) successful learners, who used all the documents offered and spent between one and two hours per week, and (2) unsuccessful learners, who used less than half of the proposed documents and spent less than one hour per week. Between these groups, at-risk students were identified as those who used more than half of the proposed documents and spent more than two hours per week. The goal is to identify those in group 1 who may be at risk of failing and those in group 2 who may succeed in the current MOOC, and to implement strategies to assist all learners in achieving success.
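Correspondence analysis, the method used in this study, can be computed directly from a contingency table via a singular value decomposition of its standardized residuals. The sketch below uses a hypothetical profiles-by-behaviours table; the real course data are not reproduced here:

```python
import numpy as np

def correspondence_analysis(table):
    """Simple CA: SVD of the standardized residuals of a contingency table."""
    P = table / table.sum()                      # correspondence matrix
    r = P.sum(axis=1)                            # row masses
    c = P.sum(axis=0)                            # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]  # principal row coordinates
    inertia = sv ** 2                            # inertia explained per axis
    return row_coords, inertia

# Hypothetical counts: learner profiles (rows) x platform behaviours (columns)
counts = np.array([
    [120, 30, 10],   # successful: all documents, 1-2 h/week
    [15, 90, 40],    # unsuccessful: few documents, <1 h/week
    [40, 35, 60],    # at-risk: mixed behaviour
], dtype=float)

coords, inertia = correspondence_analysis(counts)
```

Profiles and behaviours that load on the same axis co-occur more often than expected under independence, which is how the learner groups above would be read off a CA map.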
In semi-arid savannah ecosystems, the vegetation structure and composition, i.e. the architecture of trees, shrubs, grass tussocks and herbaceous plants, offer a great variety of habitats and niches to sustain animal diversity. In recent decades, intensive human land-use practices such as livestock farming have altered the vegetation in savannah ecosystems worldwide. Extensive grazing leads to a reduction of the perennial and herbaceous vegetation cover, which results in an increased availability of bare soil. Both the missing competition with perennial grasses and the increase in bare soil favour shrubs on open ground and lead to area-wide shrub encroachment. As a consequence of the altered vegetation structure and composition, structural diversity declines. It has been shown that with decreasing structural diversity, animal diversity declines across a variety of taxa. However, knowledge of the effects of overgrazing on reptiles, which are an important part of the ecosystem, is missing. Furthermore, the impacts of habitat degradation on a species' population dynamics and life history, e.g. birth rate, survival rate, predation risk, space requirements or behavioural adaptations, are poorly known. I therefore investigated the impact of overgrazing on the reptile community in the southern Kalahari. Second, I analysed the population dynamics and behaviour of the Spotted Sand Lizard, Pedioplanis l. lineoocellata. All four chapters clearly demonstrate that habitat degradation caused by overgrazing had a severe negative impact upon (i) the reptile community as a whole and (ii) population parameters of Pedioplanis l. lineoocellata. Chapter one showed a significant decline of regional reptile diversity and abundance in degraded habitats. In chapter two I demonstrated that P. lineoocellata moves more frequently, spends more time moving and covers larger distances in degraded than in non-degraded habitats.
In addition, the home range size of the lizard species increases in degraded habitats, as shown in chapter three. Finally, chapter four showed the negative impacts of overgrazing on several population parameters of P. lineoocellata: the absolute population sizes of adult and juvenile lizards, the survival rate and the birth rate are significantly lower in degraded habitats. Furthermore, predation risk was greatly increased in degraded habitats. A combination of aspects can explain the negative impact of habitat degradation on reptiles. First, reduced prey availability negatively affects survival rate, birth rate and overall abundance. Second, the loss of perennial plant cover leads to a loss of niches and to a reduction of opportunities to thermoregulate. Furthermore, a loss of cover is associated with increased predation risk. A major finding of my thesis is that the lizard P. lineoocellata can alter its foraging strategy. Species that are able to adapt and change their behaviour, such as P. lineoocellata, can effectively buffer against changes in their environment. Furthermore, perennial grass cover can be seen as a crucial ecological component of the vegetation in the semi-arid savannah system of the southern Kalahari. If perennial grass cover is reduced beyond a certain degree, reptile diversity will decline and most other aspects of reptile life history will be negatively influenced. Savannah systems are characterised by a mixture of trees, shrubs and perennial grasses. These three vegetation components determine the composition and structure of the vegetation and accordingly influence faunal diversity. Trees are viewed as keystone structures and focal points of animal activity for a variety of species; they supply animals with shelter, shade and food and act as safe sites, nesting sites, observation posts and foraging sites. Recent research demonstrates a positive influence of shrub patches on animal diversity.
Moreover, intermediate shrub cover can also sustain viable populations in savannah landscapes, as has been demonstrated for small carnivores and rodent species. The influence of perennial grasses on faunal diversity has not received the same attention as that of trees and shrubs. In my thesis I did not explicitly measure the direct effects of perennial grasses, but my results strongly imply that they play an important role. If the perennial grass cover is significantly depleted, my results suggest it will negatively influence reptile diversity and abundance as well as several population parameters of P. lineoocellata. Perennial grass cover is associated with the highest prey abundance, reptile diversity and reptile abundance. It provides reptiles with both a refuge from predators and opportunities to optimise thermoregulation. The relevance of each of the three structural vegetation elements differs between taxa and species. In conclusion, all three major vegetation structures in the savannah system are important for faunal diversity.
Jacob Brandon Maduro’s Memoirs and Related Observations (Havana, 1953) speak to the lasting yet malleable legacy of Jewish Caribbean/Atlantic mercantile communities that defined early modern settlement in the Americas. A close reading of the Memoirs, alongside relevant archival records and community narratives, lends new perspectives to scholarship on Port Jewries and the Atlantic Diaspora. Specifically concerned with Jacob’s adoption of such leading intellectual and political tropes as the Monroe Doctrine, José Martí’s Nuestra America, and a Zionism that evolved from an ideology to a reality, the Memoirs reveal a narrative at once defined by the tremendous upheavals of the first half of the 20th century, and an enduring sense of Jewish diasporic peoplehood defined through a Port Jew paradigm whereby the preservation of Jewish ethnicity is understood as synonymous with the championing of modernity.
During the National Multiplication Training in Kenya in 2018, participants raised concerns about attrition, completion rates and quality of PhD programmes in Kenya’s public universities. This led the authors of this article to further examine the question of PhD completion rates. Available data underlined that PhD students across various disciplines in Kenya’s public universities take unnecessarily long to complete their studies due to a myriad of factors that are related to their supervisors, university guidelines for post-graduate studies, or the students themselves. This article examines inertia areas along the PhD training pathway at three public universities in Kenya and provides suggestions on structural and operational changes universities must make to shorten completion periods.
The site of confluence of the artery and the portal vein in the liver still appears to be controversial. Anatomical studies suggested a presinusoidal or an intrasinusoidal confluence in the first, second or even final third of the sinusoids. The objective of this investigation was to study the problem with functional biochemical techniques. Rat livers were perfused through the hepatic artery and simultaneously either in the orthograde direction from the portal vein to the hepatic vein or in the retrograde direction from the hepatic vein to the portal vein. Arterial flow was linearly dependent on arterial pressure between 70 cm H2O and 120 cm H2O at a constant portal or hepatovenous pressure of 18 cm H2O. An arterial pressure of 100 cm H2O was required for the maintenance of a homogeneous orthograde perfusion of the whole parenchyma and of a physiologic ratio of arterial to portal flow of about 1:3. Glucagon was infused either through the artery or through the portal vein and hepatic vein, respectively, to a submaximally effective "calculated" sinusoidal concentration after mixing of 0.1 nmol/L. During orthograde perfusions, arterial and portal glucagon caused the same increases in glucose output. Yet during retrograde perfusions, hepatovenous glucagon elicited metabolic alterations equal to those in orthograde perfusions, whereas the changes effected by arterial glucagon were strongly reduced, to between 10% and 50%. Arterially infused trypan blue was distributed homogeneously in the parenchyma during orthograde perfusions, whereas it reached clearly smaller areas of parenchyma during retrograde perfusions. Finally, arterially applied acridine orange was taken up by all periportal hepatocytes in the proximal half of the acinus during orthograde perfusions but only by a much smaller portion of periportal cells in the proximal third of the acinus during retrograde perfusions.
These findings suggest that in rat liver, the hepatic artery and the portal vein mix before and within the first third of the sinusoids, rather than in the middle or even last third.
This thesis aims to quantify the human impact on the natural resource water at the landscape scale. The drivers in the federal state of Brandenburg (Germany), the area under investigation, are land-use changes induced by policy decisions at the European and federal state level. The water resources of the federal state are particularly sensitive to changes in land use due to low precipitation rates in the summer combined with sandy soils and high evapotranspiration rates. Key elements in landscape hydrology are forests, because of their unique capacity to transport water from the soil to the atmosphere. Given these circumstances, decisions made at any level of administration that may affect the forest sector in the state are critical in relation to the water cycle. It is therefore essential to evaluate any decision that may change forest area and structure in such a sensitive region. Thus, as a first step, it was necessary to develop and implement a model able to simulate possible interactions and feedbacks between forested surfaces and the hydrological cycle at the landscape scale. The result is a model for simulating the hydrological properties of forest stands based on a robust computation of the temporal and spatial LAI (leaf area index) dynamics. The approach allows the simulation of all relevant hydrological processes with a low parameter demand. It includes the interception of precipitation and the transpiration of forest stands with and without groundwater in the rooting zone. The model also considers phenology, biomass allocation, as well as mortality and simple management practices. It has been implemented as a module in the eco-hydrological model SWIM (Soil and Water Integrated Model). This model has been tested in two pre-studies to verify the applicability of its hydrological process description for the hydrological conditions typical of the state.
The newly implemented forest module has been tested for Scots Pine (Pinus sylvestris) and partly for Common Oak (Quercus robur and Q. petraea) in Brandenburg. For Scots Pine, the results demonstrate a good simulation of annual biomass increase and LAI, in addition to a satisfactory simulation of litter production. A comparison of simulated and measured data for the May sprout of Scots Pine and the leaf unfolding of Oak, as well as the evaluation against daily transpiration measurements for Scots Pine, supports the applicability of the approach. The interception of precipitation has also been simulated and compared with weekly observed data for a Scots Pine stand, displaying satisfactory results in both the vegetation periods and the annual sums. After the development and testing phase, the model was used to analyse the effects of two scenarios. The first scenario is an increase in forest area on abandoned agricultural land, triggered by a decrease in European agricultural production support. The second is a shift in species composition from predominantly Scots Pine to Common Oak, based on decisions of the regional forestry authority to support a more natural species composition. The scenario effects are modelled for the federal state of Brandenburg on a 50 m grid utilising spatially explicit land-use patterns. For the first scenario, the results suggest a negative impact of an increase in forest area (9.4% of the total state area) on the regional water balance, causing an increase in mean long-term annual evapotranspiration of 3.7% at 100% afforestation when compared to no afforestation. The relatively small annual change conceals a much more pronounced seasonal effect: a mean long-term evapotranspiration increase of 25.1% in spring, causing a pronounced reduction in groundwater recharge and runoff. This reduction causes a lag effect that aggravates the scarcity of water resources in summer.
In contrast, in the second scenario, a change in species composition in existing forests (29.2% of the total state area) from predominantly Scots Pine to Common Oak decreases the long-term annual mean evapotranspiration by 3.4%, accompanied by a much weaker, but apparent, seasonal pattern. Both scenarios exhibit a high spatial heterogeneity because of the distinct natural conditions in the different regions of the state. Areas with groundwater levels near the surface are particularly sensitive to changes in forest area, and regions with a relatively high proportion of forest respond strongly to the change in species composition. In both cases this regional response is masked by a smaller linear mean effect for the total state area. Two critical sources of uncertainty in the model results have been investigated. The first originates from the model calibration parameters estimated in the pre-study for lowland regions such as the federal state. The combined effect of the parameters, when changed within their physically meaningful limits, reveals an overestimation of the mean water balance by 1.6%. However, the distribution has a wide spread, with 14.7% for the 90th percentile and -9.9% for the 10th percentile. The second source of uncertainty emerges from the parameterisation of the forest module. The analysis exhibits a standard deviation of 0.6% over a ten-year period in the mean of the simulated evapotranspiration as a result of variance in the key forest parameters. The analysis suggests that the combined uncertainty in the model results is dominated by the uncertainties of the calibration parameters. Therefore, the effect of the first scenario might be underestimated because the calculated increase in evapotranspiration is too small. This may lead to an overestimation of the water balance towards runoff and groundwater recharge. The opposite can be assumed for the second scenario, in which the decrease in evapotranspiration might be overestimated.
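The interception component described for the forest module ties canopy water storage to LAI. SWIM's actual equations are not reproduced in this summary, so the sketch below uses a generic Merriam-type formulation with illustrative parameter values, not the thesis's calibrated ones:

```python
import math

def canopy_interception(precip_mm, lai, storage_per_lai=0.2, k=0.5):
    """Generic LAI-based rainfall interception per event (mm).

    Canopy storage capacity scales linearly with LAI, and interception
    saturates exponentially with precipitation (Merriam-type formulation).
    All parameter values here are illustrative assumptions.
    """
    capacity = storage_per_lai * lai            # maximum canopy storage (mm)
    if capacity == 0.0:
        return 0.0                              # bare ground: nothing intercepted
    return capacity * (1.0 - math.exp(-k * precip_mm / capacity))
```

Because capacity scales with LAI, a denser Scots Pine canopy intercepts more of each rainfall event than a sparse one, which is the mechanism behind the afforestation scenario's higher evapotranspiration.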
This paper originated from discussions about the need for important changes in the curriculum for Computing, including two focus-group meetings at IFIP conferences over the last two years. The paper examines how recent developments in curriculum, together with insights from curriculum thinking in other subject areas, especially mathematics and science, can inform curriculum design for Computing. The analysis presented in the paper provides insights into the complexity of curriculum design and identifies important constraints and considerations for the ongoing development of a vision and framework for a Computing curriculum.
How Things Work
(2015)
Recognizing and defining functionality is a key competence applied in all kinds of programming projects. This study investigates how far students without specific informatics training are able to identify and verbalize functions and parameters. It presents observations from classroom activities on functional modelling in high school chemistry lessons with altogether 154 students. Finally, it discusses the potential of functional modelling to improve the comprehension of scientific content.
Acclimatization
(2003)
Together with their wives, Otto and Richard Schomburgk arrived in Port Adelaide (South Australia) on 16 August 1849. The essay looks at how these two brothers, who had received their scientific training and promotion in the circle surrounding Alexander von Humboldt, reacted to the unfamiliar conditions in the young British colony. Some indication is given of the differences between the Schomburgk brothers' treatment of the natural resources of the new colony and that of the English colonists of the time.
In the middle of the 19th century, the question of whether expanding civilization and industrialization had an effect on climate was discussed intensely worldwide. It was feared that increasing deforestation would lead to a continuous decrease in rainfall. This first scientific discussion about climate change as the result of human intervention was strongly influenced by the research Alexander von Humboldt and Jean-Baptiste Boussingault had undertaken when they investigated the falling water levels of Lake Valencia in Venezuela. This essay aims to clarify the question of whether Alexander von Humboldt can be counted among the leading figures of modern environmentalism on account of this research, as is claimed by Richard H. Grove in his influential book Green Imperialism: Colonial Expansion, Tropical Island Edens and the Origins of Environmentalism, 1600-1860 (1995).
The paper is an enquiry into dynamic social contract theory. The social contract defines the rules of resource use. An intergenerational social contract in an economy with a single exhaustible resource is examined within a framework of an overlapping generations model. It is assumed that new generations do not accept the old social contract, and access to resources will be renegotiated between any incumbent generation and their successors. It turns out that later generations will be in an unfortunate position regardless of their bargaining power.
Mothers of Seafaring
(2023)
The article aims to trace the contribution of Jewish women in the Yishuv’s maritime history. Taking the example of Henrietta Diamond, a founding member and chairperson of the Zebulun Seafaring Society, the article seeks to explore the representation and role of women in a growing Jewish maritime domain from the 1930s to the 1950s. It examines Zionist narratives on the ‘New Jew’ and the Jewish body and studies their relevance for the emerging field of maritime activities in the Yishuv. By contextualizing the work and depiction of Henrietta Diamond, the article sheds new light on the gendered notions that underlay the emergence of the Jewish maritime domain and illustrates the patterns of inclusion and exclusion in it.
The most massive stars are those with the shortest but most active lives. One group of massive stars, the Luminous Blue Variables (LBVs), of which only a few objects are known, is of particular interest concerning the stability of stars. LBVs have a high mass loss rate and are close to being unstable, all the more so as rotation becomes an important factor in the stellar evolution of these stars. Through massive stellar winds and occasional giant eruptions, LBV nebulae are formed. Various aspects of evolution in the LBV phase lead, besides the large-scale morphological and kinematical differences, to a diversity of small structures such as clumps, rims, and outflows in these nebulae.
Luminous Blue Variables (LBVs) represent a transitional phase that massive stars may enter while evolving from the main sequence to the Wolf-Rayet stage. The photometric variability intrinsic to LBVs is based on the modulation of the stellar spectrum: within a few years the spectrum shifts from OB to AF type and back. During their cool phase, LBVs are close to the Humphreys-Davidson (equivalent to the Eddington/Omega-Gamma) limit. LBVs have a rather high mass loss rate, with stellar winds that are fast in the hot phase and slower in the cool phase. These alternating wind velocities lead to the formation of LBV nebulae by wind-wind interactions. A nebula can also be formed in a spontaneous giant eruption in which larger amounts of mass are ejected. LBV nebulae are generally small (< 5 pc), mainly gaseous circumstellar nebulae, with a rather large fraction of them being bipolar. After the LBV phase the star will turn into a Wolf-Rayet (WR) star, although not all WR stars need to have passed through the LBV phase: some evolve from the red supergiant (RSG) phase, and the most massive come directly from the main sequence (MS). In general, WR stars have a large mass loss rate and very fast stellar winds. The WR wind may interact with winds of earlier phases (MS, RSG) to form WR nebulae. For WR stars with LBV progenitors the scenario is different: here no older wind is present, but an LBV nebula is. The nature of WR nebulae is therefore manifold, and in particular the connection (or family ties) of WR to LBV nebulae is important for understanding the transition between these two phases, the evolution of massive stars, their winds, and wind-wind and wind-nebula interactions. Examining the similarities and differences of LBV and WR nebulae, and determining what constitutes a genuine LBV or WR nebula, are the basic questions addressed in the analysis presented here.
Informatics as a school subject has been virtually absent from bilingual education programs in German secondary schools. Most bilingual programs in German secondary education started out by focusing on subjects from the field of social sciences, which teachers and bilingual curriculum experts alike have regarded as the most suitable subjects for bilingual instruction – largely due to the intercultural perspective that a bilingual approach provides. And though one cannot deny the gain that ensues from an intercultural perspective on subjects such as history or geography, this benefit is certainly not limited to the social sciences. In consequence, bilingual curriculum designers have already begun to include other subjects such as physics or chemistry in bilingual school programs. It seems only a small step to extend this to informatics. This paper starts out by addressing the potential benefits of adding informatics to the range of subjects taught as part of English-language bilingual programs in German secondary education. In a second step it sketches out a methodological (= didactical) model for teaching informatics to German learners through English. It then provides two items of hands-on, tested teaching material in accordance with this model. The discussion concludes with a brief outlook on the chances and prerequisites of firmly establishing informatics as part of bilingual school curricula in Germany.
This paper describes an almost forgotten chapter in the relatively short history of Jewish-Buddhist interactions. The popularization of Buddhism in Germany in the second half of the 19th century, effected mainly by its positive appraisal in the philosophy of Arthur Schopenhauer, made it a common referent for critics of Judaism and Christianity as well as for their defenders. At the same time, Judaism was viewed by many as a historically antiquated religion, and Jewish elements in Christianity were regarded as impediments to the progress of European religiosity and culture. The Schopenhauerian conception of “pessimistic” Buddhism and “optimistic” Judaism as the two most distant religious ideas was proudly appropriated by many Jewish thinkers. These Jews portrayed Buddhism as an anti-worldly and anti-social religion of egoistic individuals who seek their own salvation (i.e. annihilation into Nothingness) – the most extreme form of pessimism and asceticism, one which negates all being, will, work, social structures and transcendence. Judaism, in contrast, represented the direct opposite of all the aforementioned characteristics. In comparison to Buddhism, Judaism stood out as a religion which carried the social and psychological values most needed for a healthy modern society: decisive affirmation of the world, optimism, social activity, co-operation with others, social egalitarianism, true charitability, and religious purity free from all remnants of polytheism, asceticism, and the inefficiently excessive moral demands ascribed to both Buddhism and Christianity.
Through the analysis of texts by Ludwig Philippson, Ludwig Stein, Leo Baeck, Max Eschelbacher, Juda Bergmann, Fritz-Leopold Steinthal, Elieser David and others, this paper tries to show how the image of Buddhism as an antithesis to Judaism helped the German Jewish reform thinkers in defining the “essence of Judaism” and in proving to both Jewish and Christian audiences its enduring meaningfulness and superiority for the modern society.
The debate over whether to locate the narrative of digital games a) in the code or b) in the performance is the starting point for an analysis of two role-playing games, the single-player game ZELDA: MAJORA’S MASK and the Korean MMORPG AION, and their respective narrative logics. When we understand games as abstract code systems, the narrative logic can be understood as embedded at the code level. With a focus on the player’s performance, the actualization of the possibilities given in the code system is central. Both logics, that of code and that of performance, are reflected in players’ narratives based on the playing experience: players reflect on the underlying code and rules of the game system as much as on the game world and their own performance within it. These narratives rely heavily on their source text – the digital game – which means that they give insights into the underlying logics of that source text. I will discuss the game structure, the players’ performance while playing the game, and the performance of players after playing, when they produce fan narratives. I conceive the narrative structure and the performance of the player playing as necessarily interconnected when we discuss the narrative logics of a game. Producing fan narratives is understood as a performance as well; it is based on the experience the players gained while playing and refers to both logics of the game they use as their source text.
The spectral efficiency of blackness induction was measured in three normal trichromatic observers and in one deuteranomalous observer. The psychophysical task was to adjust the radiance of a monochromatic 60–120′ annulus until a 45′ central broadband field just turned black and its contour became indiscriminable from a dark surrounding gap that separated it from the annulus. The reciprocal of the radiance required to induce blackness with annulus wavelengths between 420 and 680 nm was used to define a spectral-efficiency function for the blackness component of the achromatic process. For each observer, the shape of this blackness-sensitivity function agreed with the spectral-efficiency function based on heterochromatic flicker photometry when measured with the same 60–120′ annulus. Both of these functions matched the Commission Internationale de l'Eclairage Vλ function except at short wavelengths. Ancillary measurements showed that the latter difference in sensitivity can be ascribed to nonuniformities of preretinal absorption, since the annular field excluded the central 60′ of the fovea. Thus our evidence indicates that, at least to a good first approximation, induced blackness is inversely related to the spectral-luminosity function. These findings are consistent with a model that separates the achromatic and the chromatic pathways.
The optical density of human macular pigment was measured for 50 observers ranging in age from 10 to 90 years. The psychophysical method required adjusting the radiance of a 1°, monochromatic light (400–550 nm) to minimize flicker (15 Hz) when presented in counterphase with a 460 nm standard. This test stimulus was presented superimposed on a broad-band, short-wave background. Macular pigment density was determined by comparing sensitivity under these conditions for the fovea, where macular pigment is maximal, and 5° temporally. This difference spectrum, measured for 12 observers, matched Wyszecki and Stiles's standard density spectrum for macular pigment. To study variation in macular pigment density for a larger group of observers, measurements were made at only selected spectral points (460, 500 and 550 nm). The mean optical density at 460 nm for the complete sample of 50 subjects was 0.39. Substantial individual differences in density were found (ca. 0.10–0.80), but this variation was not systematically related to age.
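The density measure described above reduces to a log-ratio of the radiances needed for equal flicker sensitivity at the two retinal loci. A minimal sketch of that arithmetic (the function name and sample radiance values are hypothetical, chosen only to land near the reported sample mean):

```python
import math

def pigment_density(radiance_fovea, radiance_parafovea):
    """Optical density of macular pigment at one wavelength: the log10
    ratio of the test radiance required at the fovea (where the pigment
    absorbs maximally) to that required 5 degrees temporally (where
    absorption is minimal)."""
    return math.log10(radiance_fovea / radiance_parafovea)

# Needing about 2.45x the radiance foveally at 460 nm corresponds to a
# density near the sample mean of 0.39 reported above.
density_460 = pigment_density(2.45, 1.0)
```

Equal radiances at both loci would give a density of zero, i.e. no measurable pigment.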
Stop bashing givenness!
(2005)
Elke Kasimir’s paper (in this volume) argues against employing the notion of Givenness in the explanation of accent assignment. I will claim that the arguments against Givenness put forward by Kasimir are inconclusive because they beg the question of the role of Givenness. It is concluded that, more generally, arguments against Givenness as a diagnostic for information structural partitions should not be accepted offhand, since the notion of Givenness of discourse referents is (a) theoretically simple, (b) readily observable and quantifiable, and (c) bears cognitive significance.
Contents:
1. Introduction
2. Getting to the Seen from the Unseen
2.1. The Theory of the Zones
2.2. Brief Comments on Mechanism
3. The Areal Evidence: Shared Features and Their Dialectal Provenance
4. Explaining the Evidence Seen
4.1. Why It Is Not Due to Mere Misleading Coincidence
4.2. Why It Is Not Due to French Influence
4.3. Why It Is Not Due to Norse Influence
4.4. Why It Is Not Due to English Influence over Brittonic
4.5. Why It Is Due to Brittonic Influence
5. Conclusion
5.1. The Areal Pattern and Its Explanation
5.2. Substrate versus Superstrate
5.3. Some Final Arguments, and Good Questions
6. Addenda
In recent years computer games have been discussed by a variety of disciplines from various perspectives. A fundamental difference from other media, and a point of continuous consideration, is the specific relationship between viewer and image, player and game apparatus, that characterizes video games as a dispositive. Terms such as immersion, participation, interactivity, or ergodicity indicate the deep interest in this constellation. This paper explores the resonance between body and image in video games like REZ, SOUL CALIBUR and DANCE DANCE REVOLUTION from the perspective of a temporal ontology of the image, taking particular account of the structuring power of the interface and its subject-positioning aspects.
Current contestations of the liberal international order stand in notable contrast with the earlier rise of international law during the post-cold-war period. As Krieger and Liese argue, this situation calls for an assessment of the type of change that is currently observed: norm change (Wandel) or a more fundamental transformation of international law, a metamorphosis (Verwandlung)? To address this question, this paper details the bi-focal approach to norms in order to reflect and take account of the complex interrelation between fact-based and value-based conceptions of norms. The paper is organised in three sections. The first section presents three axioms underlying the conceptual framework for studying norm(ative) change, visualised by a triangular operation for analysing this change in relation to practices and norms. The second section recalls three key interests that have guided IR norms research since the return to norms in the late 1980s: first, allocating change in and through practice; second, identifying behavioural change with reference to norm-following; and third, identifying norm(ative) change with reference to discursive practice. The third section presents the two analytical tools of the conceptual frame, namely the norm-typology and the cycle-grid model, and indicates how to apply these tools with reference to illustrative case scenarios. The conclusion recalls the key elements of the conceptual framework for research on norm(ative) change in international relations in light of the challenge of establishing sustainable normativity in the global order.
Observations of the WC9+OB system WR65 in the infrared show variations of its dust emission consistent with a period near 4.8 yr, suggesting formation in a colliding-wind binary (CWB) having an elliptical orbit. If we adopt the IR maximum as zero phase, the times of X-ray maximum count and minimum extinction to the hard component measured by Oskinova & Hamann fall at phases 0.4–0.5, when the separation of the WC9 and OB stars is greatest. We consider WR65 in the context of other WC8–9+OB stars showing dust emission.
The paper presents a simulation and parameter-estimation approach for evaluating stochastic patterns of population growth and spread of an annual forest herb, Melampyrum pratense (Orobanchaceae). The survival of a species during large-scale changes in land use and climate will depend, to a considerable extent, on its dispersal and colonisation abilities. Predictions of species migration require a combination of field studies and modelling efforts. Our study of the ability of M. pratense to disperse into so far unoccupied areas was based on experiments in secondary woodland in NE Germany. Experiments started in 1997 at three sites where the species was not yet present, with 300 seeds sown within one square metre. Population development was then recorded until 2001 by mapping individuals with a resolution of 5 cm. Additional observations considered density dependence of seed production. We designed a spatially explicit, individual-based computer simulation model to explain the spatial patterns of population development and to predict future population spread. Besides the primary drop of seeds (barochory), it assumed secondary seed transport by ants (myrmecochory) with an exponentially decreasing dispersal tail. An important feature of the population-pattern explanation was the simultaneous estimation of both population-growth and dispersal parameters from consistent spatio-temporal data sets. As the simulation model produced stochastic time series and random spatially discrete distributions of individuals, we estimated parameters by minimising the expectation of weighted sums of squares. These sums-of-squares criteria considered population sizes, radial population distributions around the area of origin, and distributions of individuals within squares of 25 × 25 cm, the range of density action. Optimal parameter values, together with the precision of the estimates, were obtained by calculating sums of squares on regular grids of parameter values.
Our modelling results showed that transport of fractions of seeds by ants over distances of 1…2 m was indispensable for explaining the observed population spread, which led to distances of at most 8 m from the population origin within 3 years. Projections of population development over 4 additional years gave a diffusion-like increase of the population area without any “outposts”. This prediction of the simulation model constitutes a hypothesis to be tested by additional field observations. Some structural deviations between observations and model output already indicated that, for a full understanding of population spread, the set of dispersal mechanisms assumed in the model may have to be extended by additional features of plant-animal mutualism.
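The estimation scheme described above — simulate the stochastic model repeatedly, compare summary statistics with field data, and scan a regular grid of parameter values for the minimal expected sum of squares — can be illustrated with a toy version (all rates, distances, and the single summary statistic below are hypothetical, not the study's actual values):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_spread(n_seeds, p_ant, mean_dist, n_reps=30):
    """Toy individual-based dispersal model: each seed either drops close
    to the parent (barochory) or is carried by ants (myrmecochory) over an
    exponentially distributed distance; returns the mean maximal spread
    distance over n_reps stochastic replicates."""
    maxima = []
    for _ in range(n_reps):
        carried = rng.random(n_seeds) < p_ant
        dist = np.where(carried,
                        rng.exponential(mean_dist, n_seeds),  # ant transport
                        rng.exponential(0.05, n_seeds))       # primary drop
        maxima.append(dist.max())
    return float(np.mean(maxima))

def expected_ss(observed_max, p_ant, mean_dist):
    """Sum-of-squares criterion, here reduced to one summary statistic."""
    return (simulate_spread(300, p_ant, mean_dist) - observed_max) ** 2

# Scan a regular grid of parameter values, as in the estimation scheme.
grid = [(p, d) for p in (0.1, 0.3, 0.5) for d in (0.5, 1.0, 2.0)]
best_p, best_d = min(grid, key=lambda pd: expected_ss(2.7, *pd))
```

Because the model output is stochastic, the criterion approximates an expectation by averaging replicates; the grid scan also exposes how flat the criterion is around the optimum, i.e. the precision of the estimates.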
Multiple hierarchies
(2005)
In this paper, we present the Multiple Annotation approach, which solves two problems: the problem of annotating overlapping structures, and the problem that occurs when documents should be annotated according to different, possibly heterogeneous tag sets. This approach has many advantages: it is based on XML, the modeling of alternative annotations is possible, each level can be viewed separately, and new levels can be added at any time. The files can be regarded as an interrelated unit, with the text serving as the implicit link. Two representations of the information contained in the multiple files (one in Prolog and one in XML) are described. These representations serve as a base for several applications.
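The core idea above — separate annotation layers that can overlap freely, linked only implicitly through the shared base text — can be sketched as standoff spans over character offsets (layer names, tags, and the example sentence below are invented for illustration, not the paper's actual tag sets):

```python
# One shared base text; every layer annotates character spans of it, so
# overlapping structures across layers cause no nesting conflicts.
text = "Peter sleeps."

layers = {
    "syntax": [{"start": 0, "end": 5, "tag": "NP"},
               {"start": 6, "end": 12, "tag": "VP"}],
    "information_structure": [{"start": 0, "end": 5, "tag": "topic"}],
}

def spans_at(layers, pos):
    """All annotations, from every layer, covering character position pos —
    the layers behave as one interrelated unit via the shared offsets."""
    return [(name, a["tag"])
            for name, anns in layers.items()
            for a in anns
            if a["start"] <= pos < a["end"]]
```

Each layer can still be serialized and viewed separately, and a new layer is added by appending another offset list without touching the existing ones.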
The dynamics of noisy bistable systems is analyzed by means of Lyapunov exponents and measures of complexity. We consider both the classical Kramers problem with additive white noise and the case in which the barrier fluctuates due to additional external colored noise. In the case of additive noise we calculate the Lyapunov exponents and all measures of complexity analytically, as functions of the noise intensity or, equivalently, of the mean escape time. For the problem of the fluctuating barrier, the usual description of the dynamics via the mean escape time is not sufficient; applying the concept of measures of complexity allows the structures of the motion to be described in more detail. Most complexity measures mark the value of the correlation time at which the phenomenon of resonant activation occurs with an extremum.
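For the additive-noise (Kramers) case, the mean escape time that parameterizes the analytical results can be estimated by direct simulation of an overdamped double-well system. A minimal Euler-Maruyama sketch (the quartic potential, noise values, and cutoffs are illustrative choices, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_escape_time(D, n_paths=100, dt=0.01, t_max=50.0):
    """Mean first-passage time out of the left well of the bistable system
    dx = (x - x^3) dt + sqrt(2 D) dW, started at the minimum x = -1;
    a path counts as escaped once it reaches the opposite well (x >= 1).
    Paths that never escape are truncated at t_max."""
    sigma = np.sqrt(2.0 * D * dt)
    times = []
    for _ in range(n_paths):
        x, t = -1.0, 0.0
        while t < t_max:
            x += (x - x**3) * dt + sigma * rng.standard_normal()
            t += dt
            if x >= 1.0:
                break
        times.append(t)
    return float(np.mean(times))
```

Stronger noise shortens the escape time, in line with the exponential dependence of the Kramers rate on barrier height over noise intensity; the analytical complexity measures of the paper are expressed as functions of exactly this quantity.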
Green formulae for elliptic cone differential operators are established. This is achieved by an accurate description of the maximal domain of an elliptic cone differential operator and its formal adjoint; thereby utilizing the concept of a discrete asymptotic type. From this description, the singular coefficients replacing the boundary traces in classical Green formulas are deduced.
Asymptotic algebras
(2001)
It is proved that meromorphic, parameter-dependent elliptic Mellin symbols can be factorized in a particular way. The proof depends on the availability of logarithms of pseudodifferential operators. As a byproduct, we obtain a characterization of the group generated by pseudodifferential operators admitting a logarithm. The factorization has applications to the theory of pseudodifferential operators on spaces with conical singularities, e.g., to index theory and the construction of various sub-calculi of the cone calculus.
Local asymptotic types
(2002)
We compute spectral libraries for populations of coeval stars using state-of-the-art massive-star evolutionary tracks that account for different astrophysics, including rotation and close binarity. Our synthetic spectra account for stellar and nebular contributions. We use our models to obtain E(B–V), age, and mass for six clusters in the spiral galaxy NGC 1566, which have ages of < 50 Myr and masses of > 5 × 10⁴ M⊙ according to standard models. NGC 1566 was observed from the NUV to the I band as part of the imaging Treasury HST program LEGUS: Legacy Extragalactic UV Survey. We aim to establish i) whether the models provide reasonable fits to the data, ii) how well the models and photometry are able to constrain the cluster properties, and iii) how different the properties obtained with different models are.
We analyze anaphoric phenomena in the context of building an input understanding component for a conversational system for tutoring mathematics. In this paper, we report the results of data analysis of two sets of corpora of dialogs on mathematical theorem proving. We exemplify anaphoric phenomena, identify factors relevant to anaphora resolution in our domain and extensions to the input interpretation component to support it.
Aspect splits can affect agreement, Case, and even preposition insertion. This paper discusses the functional ‘why’ and the theoretical ‘how’ of aspect splits. Aspect splits are an economical way to mark aspect by preserving or suppressing some independent element in one aspect. In formal terms, they are produced in the same way as coda conditions in phonology, with positional/contextual faithfulness. This approach captures the additive effects of cross-cutting splits. Aspect splits are analyzed here with data from Hindi, Nepali, Yucatec Maya, Chontal, and Palauan.
This article considers one of the major weaknesses in the existing historiography of Irish Jewry, the failure to consider the true extent and impact of antisemitism on Ireland’s Jewish community. This is illustrated through a brief survey of one small area of the Irish-Jewish narrative, the Jewish relationship with Irish nationalist politics. Throughout, the focus remains on the need for a fresh approach to the sources and the issues at hand, in order to create a more holistic, objective and inclusive history of the Jewish experience in Ireland.
Edge representations of operators on closed manifolds are known to induce large classes of operators that are elliptic on specific manifolds with edges, cf. [9]. We apply this idea to the case of boundary value problems. We establish a correspondence between standard ellipticity and ellipticity with respect to the principal symbolic hierarchy of the edge algebra of boundary value problems, where an embedded submanifold on the boundary plays the role of an edge. We first consider the case that the weight is equal to the smoothness and calculate the dimensions of kernels and cokernels of the associated principal edge symbols. Then we pass to elliptic edge operators for arbitrary weights and construct the additional edge conditions by applying relative index results for conormal symbols.
It is shown that bounded solutions to semilinear elliptic Fuchsian equations obey complete asymptotic expansions in terms of powers and logarithms of the distance to the boundary. For that purpose, Schulze's notion of asymptotic type for conormal asymptotics close to a conical point is refined. This in turn allows explicit calculations on asymptotic types to be performed, modulo the resolution of the spectral problem that determines the singular exponents in the asymptotic expansions.
The massive growth of MOOCs in 2011 laid the groundwork for the achievement of SDG 4. With the various benefits of MOOCs, there is also anticipation that online education should focus on more interactivity and global collaboration. In this context, the Global MOOC and Online Education Alliance (GMA) established a diverse group of 17 world-leading universities and three online education platforms from across 14 countries on all six continents in 2020. Through nearly three years of exploration, GMA has gained experience and achieved progress in fostering global cooperation in higher education. First, in joint teaching, GMA has promoted in-depth cooperation between members inside and outside the alliance. Examples include promoting the exchange of high-quality MOOCs, encouraging the creation of Global Hybrid Classroom, and launching Global Hybrid Classroom Certificate Programs. Second, in capacity building and knowledge sharing, GMA has launched Online Education Dialogues and the Global MOOC and Online Education Conference, inviting global experts to share best practices and attracting more than 10 million viewers around the world. Moreover, GMA is collaborating with international organizations to support teachers’ professional growth, create an online learning community, and serve as a resource for further development. Third, in public advocacy, GMA has launched the SDG Hackathon and Global Massive Open Online Challenge (GMOOC) and attracted global learners to acquire knowledge and incubate their innovative ideas within a cross-cultural community to solve real-world problems that all humans face and jointly create a better future. Based on past experiences and challenges, GMA will explore more diverse cooperation models with more partners utilizing advanced technology, provide more support for digital transformation in higher education, and further promote global cooperation towards building a human community with a shared future.
This paper investigates private university students’ language learning activities on MOOC platforms and their attitudes toward them. The study explores the development of MOOC use in Chinese private universities, with a focus on two modes: online and blended. We conducted empirical studies with students learning French and Japanese as a second foreign language, using questionnaires (N = 387) and interviews (N = 20) at a private university in Wuhan. Our results revealed that the majority of students used the MOOC platform more than twice a week and focused on the MOOC videos, materials, and assignments. However, we also found that students showed less interest in online communication (forums). Those who worked in the blended learning mode, especially the students learning Japanese, had a more positive attitude toward MOOCs than other students.
When azobenzene-modified photosensitive polymer films are irradiated with light interference patterns, topographic variations develop in the film that follow the electric field vector distribution, resulting in the formation of a surface relief grating (SRG). The exact correspondence between the electric field vector orientation in the interference pattern and the presence of local topographic minima or maxima of the SRG is in general difficult to determine. In my thesis, we established a systematic procedure to accomplish the correlation between different interference patterns and the topography of the SRG. For this, we devised a new setup combining an atomic force microscope and a two-beam interferometer (IIAFM). With this set-up, it is possible to track the topography change in situ, while at the same time changing the polarization and phase of the impinging interference pattern. To validate our results, we compared two photosensitive materials, referred to in short as PAZO and trimer. This is the first time that an absolute correspondence between the local distribution of electric field vectors of the interference pattern and the local topography of the relief grating could be established exhaustively. In addition, using our IIAFM we found that for a certain polarization combination of two orthogonally polarized interfering beams, namely the SP (↕, ↔) interference pattern, the topography forms an SRG with only half the period of the interference pattern. Exploiting this phenomenon, we are able to fabricate surface relief structures below the diffraction limit, with characteristic features measuring only 140 nm, using far-field optics with a wavelength of 491 nm. We also probed the stresses induced during the polymer mass transport by placing an ultra-thin gold film (5–30 nm) on top. During irradiation, the metal film not only deforms along with the SRG formation, but also ruptures in regular and complex manners.
The morphology of the cracks differs strongly depending on the electric field distribution in the interference pattern, even when the magnitude and the kinetics of the strain are kept constant. This implies a complex local distribution of the opto-mechanical stress along the topography grating. Neutron reflectivity measurements of the metal/polymer interface indicate penetration of the metal layer into the polymer, resulting in the formation of a bonding layer that confirms the transduction of light-induced stresses from the polymer layer to the metal film.
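The period relations behind the sub-diffraction result above follow from the standard two-beam interference formula, with the SP polarization case writing a surface grating at half the optical period (the half-angle value below is a hypothetical illustration, not a measured parameter of the setup):

```python
import math

def interference_period(wavelength_nm, half_angle_deg):
    """Period of a two-beam interference pattern, Lambda = lambda / (2 sin theta),
    where theta is the half-angle between the interfering beams."""
    return wavelength_nm / (2.0 * math.sin(math.radians(half_angle_deg)))

# At 491 nm the optical period can never drop below lambda/2 = 245.5 nm
# (counter-propagating beams), yet the SP pattern forms a surface grating
# at half the optical period, which is how features around 140 nm become
# reachable with far-field optics.
srg_period_sp = interference_period(491, 61.2) / 2.0
```

The factor of two thus comes from the material response to the polarization pattern, not from the optics, which is what places the achievable feature size below the diffraction limit.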
Contents:
1 Introduction
2 Main result
3 Construction of the asymptotic solutions
3.1 Derivation of the equations for the profiles
3.2 Existence of the principal profile
3.3 Determination of U_2 and the remaining profiles
4 Stability of the small global solutions. Justification of One-Phase Nonlinear Geometric Optics for the Kirchhoff-type equations
4.1 Stability of the global solutions to the Kirchhoff-type symmetric hyperbolic systems
4.2 The nonlinear system of ordinary differential equations with the parameter
4.3 Some energy estimates
4.4 The dependence of the solution W(t, ξ) on the function s(t)
4.5 The oscillatory integrals of the bilinear forms of the solutions
4.6 Estimates for the basic bilinear form Γ_s(t)
4.7 Contraction mapping
4.8 Stability of the global solution
4.9 Justification of One-Phase Nonlinear Geometric Optics for the Kirchhoff-type equations
In this article we construct the fundamental solutions for the wave equation arising in the de Sitter model of the universe. We use the fundamental solutions to represent solutions of the Cauchy problem and to prove Lp–Lq decay estimates for the solutions of the equation with and without a source term.
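For orientation, the equation in question can be written out: in the spatially flat de Sitter model the covariant wave operator leads, after normalizing the Hubble constant, to a wave equation with exponentially decaying propagation speed. Sketched here in generic form (f denotes a possible source term; the normalization is an assumption for readability):

```latex
% Cauchy problem for the wave equation in the de Sitter model
% (Hubble constant normalized to 1):
u_{tt} - e^{-2t}\,\Delta u = f(t,x), \qquad
u(0,x) = u_0(x), \quad u_t(0,x) = u_1(x).
```

The shrinking coefficient $e^{-2t}$ in front of the Laplacian is what distinguishes the decay behaviour of these solutions from the classical Minkowski-space case.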
Enlisted History
(2018)
Zeev Jawitz (1847–1924) was active in all spheres of culture: history, language, literature and pedagogy, all the while striving for harmonization with the Orthodox outlook. He understood that a people returning to its homeland needed a national culture, one that was both broad and deep, and that the narrow world of the Halakhah would no longer suffice. His main work was the multi-volume Toldot Israel (History of Israel, published 1895–1924) which encompasses Jewish history from its beginning – Patriarchs – until the end of the 19th century. His historical writing, with its emphasis on internal religious Jewish sources, the unity and continuity of Jewish history, and respect of Orthodox principles, comes as an alternative to the historiography of the celebrated historian Heinrich Graetz. The alternative that Jawitz tried to substitute for Wissenschaft des Judentums, was influenced not only by Orthodox ideology, which he supported, but also by his nationalist ideology. He saw himself and his disciples as the “priests of memory,” presenting the true and immanent history and character of the Jewish nation as a platform to the Jewish future in the land of Israel.
In this paper, the problem of the formation and construction of a shock wave for the three-dimensional compressible Euler equations with small perturbed spherical initial data is studied. If the given smooth initial data satisfy a certain nondegeneracy condition, then from the results in [20] we know that there exists a unique blowup point at the blowup time such that the first-order derivatives of the smooth solution blow up while the solution itself remains continuous at the blowup point. From the blowup point, we construct a weak entropy solution which is not uniformly Lipschitz continuous on the two sides of the shock curve; moreover, the strength of the constructed shock is zero at the blowup point and then gradually increases. Additionally, some detailed and precise estimates on the solution are obtained in the neighbourhood of the blowup point.
This note is devoted to the study of the global existence of a shock wave for supersonic flow past a curved wedge. When the curved wedge is a small perturbation of a straight wedge and the angle of the wedge is less than some critical value, we show that a shock attached to the wedge will exist globally.
In this paper, we discuss the global existence of solutions for chemotaxis models with saturation growth. If the coefficients of the equations are all positive smooth T-periodic functions, then the problem has a positive T-periodic solution; we also discuss the stability of these T-periodic solutions.
Nested complementation plays an important role in expressing counter-free, i.e. star-free and first-order definable, languages and their hierarchies. In addition, methods that compile phonological rules into finite-state networks use double-nested complementation or “double negation”. This paper reviews how double-nested complementation extends to a relatively new operation, generalized restriction (GR), coined by the author (Yli-Jyrä and Koskenniemi 2004). This operation encapsulates a double-nested complementation and the elimination of a concatenation marker, the diamond, whose finitely many occurrences align concatenations in the arguments of the operation. The paper demonstrates that the GR operation has interesting potential for expressing regular languages, various kinds of grammars, bimorphisms and relations. This motivates further study of optimized implementations of the operator.
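The double-nested complement at the heart of such restriction operators can be made concrete. The following sketch is our illustration, not taken from the paper: it brute-forces languages over a two-letter alphabet up to a fixed length (a finite stand-in for Sigma*), and expresses the context restriction "every a is immediately followed by b" as the double complement ~( Sigma* a ~(b Sigma*) ); the alphabet, length bound and example rule are all assumptions.

```python
from itertools import product

SIGMA = "ab"
MAX_LEN = 5

def all_strings(max_len):
    """All strings over SIGMA up to max_len — a finite stand-in for Sigma*."""
    out = set()
    for n in range(max_len + 1):
        out.update("".join(p) for p in product(SIGMA, repeat=n))
    return out

UNIVERSE = all_strings(MAX_LEN)

def concat(l1, l2):
    """Pairwise concatenation, truncated to the length bound."""
    return {u + v for u in l1 for v in l2 if len(u + v) <= MAX_LEN}

def complement(l):
    return UNIVERSE - l

# Restriction "every 'a' is immediately followed by 'b'", written as a
# double-nested complement:  ~( Sigma* a ~(b Sigma*) )
bad = concat(concat(UNIVERSE, {"a"}), complement(concat({"b"}, UNIVERSE)))
restricted = complement(bad)

# Direct check of the same property, for comparison.
def ok(s):
    return all(s[i + 1:i + 2] == "b" for i in range(len(s)) if s[i] == "a")

assert restricted == {s for s in UNIVERSE if ok(s)}
```

The inner complement captures all "offending" continuations after an a; the outer complement removes every string that contains such an offence, which is exactly the double-negation pattern the compilation methods rely on.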
Generalized Two-Level Grammar (GTWOL) provides a new method for the compilation of parallel replacement rules into transducers. The current paper identifies the role of generalized lenient composition (GLC) in this method. Thanks to the GLC operation, the compilation method becomes bipartite and easily extendible to capture various application modes. In the light of three notions of obligatoriness, a modification to the compilation method is proposed. We argue that the bipartite design makes the implementation of parallel obligatoriness, directionality, and length- and rank-based application modes extremely easy, which is the main result of the paper.
The article starts with an overview of modernization theories, their history of ups and downs, and their present status. This first part is followed by an analysis of basic social-structure distributions and trends in human development in selected countries. One major focal point of the paper is the non-Western world, and the Arab countries in particular. The author looks at modernization and modernity in that region and comes to the conclusion that the Western world can no longer expect to be able to simply export its own values and its way of life to the rest of the world.
In this paper, we present a finite-state approach to constituency and therewith an analysis of coordination phenomena involving so-called non-constituents. We show that non-constituents can be seen as parts of fully-fledged constituents and therefore be coordinated in the same way. We have implemented an algorithm based on finite state automata that generates an LFG grammar assigning valid analyses to non-constituent coordination structures in the German language.
In this paper I argue that both parametric variation and the alleged differences between languages in terms of their internal complexity follow straightforwardly from the Strongest Minimalist Thesis (SMT), which takes the Faculty of Language (FL) to be an optimal solution to conditions that neighboring mental modules impose on it. I argue that hard conditions, such as legibility at the linguistic interfaces, invoke simplicity metrics that, given that they stem from different mental modules, are not harmonious. Widely attested expression strategies, such as agreement or movement, are a direct result of conflicting simplicity metrics, and UG, perceived as a toolbox that shapes natural language, can be taken to consist of a limited number of marking strategies, all resulting from conflicting simplicity metrics. As such, the contents of UG follow from simplicity requirements, and therefore no longer necessitate linguistic principles, valued or unvalued, to be innately present. Finally, I show that the SMT does not require that languages themselves be optimal in connecting sound to meaning.
Integration of digital elevation models and satellite images to investigate geological processes.
(2006)
In order to better understand the geological boundary conditions for ongoing or past surface processes, geologists face two important questions: 1) How can we gain additional knowledge about geological processes by analyzing digital elevation models (DEMs) and satellite images? 2) Do these efforts present a viable approach for more efficient research? Here, we present case studies at a variety of scales and levels of resolution to illustrate how remote sensing techniques can substantially complement and enhance classical geological approaches. Commonly, satellite- and DEM-based studies are used as a first step in assessing areas of geologic interest. While in the past the analysis of satellite imagery (e.g. Landsat TM) and aerial photographs was carried out to characterize regional geologic characteristics, particularly structure and lithology, geologists have increasingly ventured into a process-oriented approach. This entails assessing structures and geomorphic features with a concept that includes active tectonics, or tectonic activity on time scales relevant to humans. In addition, these efforts involve analyzing and quantifying the processes acting at the surface by integrating different remote sensing and topographic data (e.g. SRTM DEM, SSM/I, GPS, Landsat 7 ETM, ASTER, Ikonos…). A combined structural and geomorphic study in the hyperarid Atacama Desert demonstrates the use of satellite and digital elevation data for assessing geological structures formed by long-term (millions of years) feedback mechanisms between erosion and crustal bending (Zeilinger et al., 2005). The medium-term change of landscapes over hundreds of thousands to millions of years in a more humid setting is shown in an example from southern Chile.
Based on an analysis of rivers and watersheds combined with landscape parameterization using digital elevation models, the geomorphic evolution and change in drainage pattern in the coastal Cordillera can be quantified and put into the context of the seismotectonic segmentation of a tectonically active region. This has far-reaching implications for earthquake rupture scenarios and hazard mitigation (K. Rehak, see poster at the IMAF Workshop). Two examples illustrate short-term processes on decadal, centennial and millennial time scales. One study uses orogen-scale precipitation gradients derived from remotely sensed passive microwave data (Bookhagen et al., 2005a). It demonstrates how debris flows were triggered as a response of slopes to abnormally strong rainfall in the interior parts of the Himalaya during intensified monsoons. The area of the orogen that receives high amounts of precipitation during intensified monsoons also hosts numerous landslide deposits of up to 1 km³ in volume that were generated during intensified monsoon phases at about 27 and 9 ka (Bookhagen et al., 2005b). Another project, in the Swiss Alps, compared sets of aerial photographs recorded in different years; by calculating high-resolution surfaces, the mass transport in a landslide could be reconstructed (M. Schwab, Universität Bern). All these examples, although representing only a short and limited selection of projects using remote sensing data in geology, share the common goal of quantifying geological processes. With increasing data resolution and new sensors, future projects will enable us to recognize even more patterns and/or structures indicative of geological processes in tectonically active areas. This is crucial for the analysis of natural hazards like earthquakes, tsunamis and landslides, as well as those hazards that are related to climatic variability.
The integration of remotely sensed data at different spatial and temporal scales with field observations is becoming increasingly important. Many presently highly populated and increasingly utilized regions are subject to significant environmental pressure and often constitute areas of concentrated economic value. Combined remote sensing and ground-truthing in these regions is particularly important, as geologic, seismicity and hydrologic data may be limited there due to the recency of infrastructural development. Monitoring ongoing processes and evaluating the remotely sensed data in terms of the recurrence of events will greatly enhance our ability to assess and mitigate natural hazards. (Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop, 9–10 February 2006)
The article provides historical background for Alexander von Humboldt’s expedition to Russia in 1829. It includes information on Humboldt’s works and publications in Russia over the course of his lifetime, as well as an explanation of the Russian scientific community’s response to those works. Humboldt’s ideas on the existence of an active volcano in Central Asia attracted the attention of two prominent Russian geographers, P. Semenov and P. Kropotkin, whose views on the nature of volcanism were quite different. P. Semenov personally met Humboldt in Berlin. P. Kropotkin made one of the most important geological discoveries of the 19th century: he found fresh volcanic cones near Lake Baikal.
Soon after Humboldt’s Russian expedition, and partly as a result of it, an important mineral was found in the Ilmen Mountains: samarskite, which later gave its name to the chemical element samarium, discovered in 1879. At the beginning of the 20th century, the Russian scientist V. Vernadskiy pointed out that samarskite was the first uranium-rich mineral found in Russia.
The aim of these lectures is a reformulation and generalization of the fundamental investigations of Alexander Bach [2, 3] on the concept of probability in the work of Boltzmann [6], in the language of modern point process theory. The dominating point of view here is its subordination under the disintegration theory of Krickeberg [14]. This enables us to make Bach's considerations much more transparent. Moreover, the point process formulation turns out to be the natural framework for applications to quantum mechanical models.
In this talk, I would like to share my experiences gained from participating in four CSP solver competitions and the second ASP solver competition. In particular, I’ll talk about how various programming techniques can make huge differences in solving some of the benchmark problems used in the competitions. These techniques include global constraints, table constraints, and problem-specific propagators and labeling strategies for selecting variables and values. I’ll present these techniques with experimental results from B-Prolog and other CLP(FD) systems.
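To make the idea of table constraints and domain-filtering propagators concrete, here is a minimal sketch, written in Python rather than B-Prolog for brevity; the variables, domains and allowed tuples are hypothetical. It performs one generalized-arc-consistency pass over an extensionally given (table) constraint, pruning every value that has no supporting tuple:

```python
def filter_table(domains, table):
    """One pass of generalized arc consistency for a table constraint:
    keep only values that appear in some tuple all of whose components
    are still in the current domains."""
    support = [t for t in table
               if all(t[i] in domains[i] for i in range(len(domains)))]
    return [sorted({t[i] for t in support}) for i in range(len(domains))]

# Hypothetical toy instance: variables x, y, z with domains {1,2,3}
# and a table constraint listing the allowed (x, y, z) combinations.
domains = [{1, 2, 3}, {1, 2, 3}, {1, 2, 3}]
allowed = [(1, 2, 3), (2, 3, 1), (3, 3, 3)]

print(filter_table(domains, allowed))  # → [[1, 2, 3], [2, 3], [1, 3]]
```

In a real CLP(FD) solver this filtering would be woken repeatedly as labeling narrows the domains; e.g. once x is fixed to 1, only the tuple (1, 2, 3) survives and y and z collapse to single values.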
First published in:
Alexander von Humboldt-Stiftung. Mitteilungen, 5. Jg., Heft 38, Oktober 1980, S. 27–36.
Two-dimensional buoyancy-driven convection in a horizontal fluid layer with stress-free boundary conditions at top and bottom and periodic boundary conditions in the horizontal direction is investigated by means of numerical simulation and bifurcation-analysis techniques. As the buoyancy forces increase, the primary stationary and symmetric convection rolls undergo successive Hopf bifurcations, bifurcations to traveling waves, and phase lockings. We pay particular attention to symmetry breaking and its connection with the generation of large-scale horizontal flows. Calculations of Lyapunov exponents indicate that at a Rayleigh number of 2.3×10⁵ no temporal chaos is reached yet; rather, the system moves nonchaotically on a 4-torus in phase space.
Current curricular trends require teachers in Baden-Wuerttemberg (Germany) to integrate Computer Science (CS) into traditional subjects, such as Physical Science. However, concrete guidelines are missing. To fill this gap, we outline an approach where a microcontroller is used to perform and evaluate measurements in the Physical Science classroom. Using the open-source Arduino platform, we expect students to acquire and develop both CS and Physical Science competencies by using a self-programmed microcontroller. In addition to this combined development of competencies in Physical Science and CS, the subject matter will be embedded in suitable contexts and learning environments, such as weather and climate.
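As an illustration of the kind of measurement evaluation students would program, consider converting a raw analog-to-digital reading into a physical quantity. The sketch below is in Python for brevity (an Arduino sketch would do the same arithmetic in C++), and the choice of a TMP36-style temperature sensor with a 10-bit ADC and a 5 V reference is our assumption, not part of the approach described above:

```python
def adc_to_celsius(adc_reading, vref=5.0, resolution=1023):
    """Convert a 10-bit ADC reading from a TMP36-style sensor to deg C.
    The TMP36 outputs 0.5 V at 0 deg C and rises by 10 mV per deg C."""
    voltage = adc_reading / resolution * vref
    return (voltage - 0.5) * 100.0

# A reading of 153 corresponds to roughly 0.748 V, i.e. about 24.8 deg C.
print(round(adc_to_celsius(153), 1))  # → 24.8
```

Writing and testing exactly this conversion is where the CS competencies (functions, data types, calibration constants) meet the Physical Science content (sensor characteristics, measurement uncertainty).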
Contrastive focus
(2007)
The article puts forward a discourse-pragmatic approach to the notoriously evasive phenomena of contrastivity and emphasis. It is argued that occurrences of focus that are treated in terms of ‘contrastive focus’, ‘kontrast’ (Vallduví & Vilkuna 1998) or ‘identificational focus’ (É. Kiss 1998) in the literature should not be analyzed in familiar semantic terms like introduction of alternatives or exhaustivity. Rather, an adequate analysis must take into account discourse-pragmatic notions like hearer expectation or discourse expectability of the focused content in a given discourse situation. The less expected a given content is judged to be for the hearer, relative to the Common Ground, the more likely a speaker is to mark this content by means of special grammatical devices, giving rise to emphasis.
In challenging times for international law, there may be a heightened need for both analysis and prescription. The international rule of law, as a connecting thread that runs through the global legal order, is a particularly salient topic. By providing a working understanding of the content and contexts of the international rule of law, and by taking the regime of international investment law as a case study, this paper argues that assessing 'rise' or 'decline' motions in this sphere warrants a nuanced approach that recognises parallel positive and negative developments. Whilst prominent procedural and substantive aspects of international investment law align strongly with international rule of law requirements, numerous challenges threaten the future existence of the regime and the appeal of the international rule of law more broadly. At the same time, opportunities exist to adapt the substantive decision-making processes in investor-State disputes so as to pursue the parallel goals of enhancing the rule of law at both the international and national levels. Through recognising the specificities of the interaction between the international and national spheres, arbitrators can further reinvigorate the legitimacy of the international rule of law through international investment law, benefitting the future of both.
The concept of three journeys as a way to denote spiritual development was introduced by Dhu al-Nun, one of the founding fathers of Islamic mysticism. The use of this concept was later refined by combining it with the Sufi technique of adding different prepositions to a certain term in order to differentiate between spiritual stages. By using the words journey (Safar) and God (Allah) and inserting a preposition before the word God, Sufi writers could map the different roads to God, or the stations (Maqamat) on this road. Ibn al-'Arabi, at the beginning of the thirteenth century, speaks of three different ways: from God, toward God and in God. Tanchum ha-Yerushalmi, the Judeo-Arabic biblical commentator from the end of that century, speaks of the three journeys as three stations of one continuous way. A nearly identical description can be found in the writings of the Muslim scholar Ibn Qayyim al-Jawziyya, a generation later. Later, in the fourteenth century, in the writings of the Sufi writer al-Qashani, the three journeys become four, although the scheme of three prepositions is preserved. Near the end of the fourteenth century, in the writings of R. David ha-Nagid, we find only two journeys: to God and in God. All this tells us that Judeo-Arabic literature can help us map with greater precision the historical development of Sufi ideas.
1. Introduction
2. Analysis of the implementation of Basel III in China
   2.1 Implementation of capital adequacy rules
   2.2 Implementation of leverage ratio rules
   2.3 Implementation of liquidity management rules
3. Suggestions for further development of China’s banking industry
   3.1 Promoting capital structure adjustment and broadening capital supplement channels
   3.2 Transforming business models and developing intermediary and off-balance business
   3.3 Increasing the intensity of risk management and refining its standards
The use of unilateral force under George W. Bush is not a new phenomenon in US foreign policy. As the author argues, it is merely a continuation of Bill Clinton’s foreign policy and is deeply rooted in both the foreign policy traditions of Jacksonianism and Wilsonianism. The analysis concludes that Clinton used unilateralist foreign policy with a 'smile' whereas the Bush administration uses it with an attitude.
During his trip to New Spain in 1803, Alexander von Humboldt visited large tracts of New Spanish territory, which included modern Mexico and part of the United States. This trip provided the data for his geographical Atlas of the region, as well as information about the ancient Mexican cultures that he would later include in the general Atlas and in other major works, such as Vues des Cordillères. Humboldt’s Political Essay on the Kingdom of New Spain, which is also analysed here, offered a comprehensive physical, natural, economic, and social description of Mexico in the colonial period. With these works, Humboldt presented a new geographical and cultural image of New Spain to European audiences. In addition, his work made important contributions to cartographic knowledge.
Content:
1. Introduction
2. Early Examples of the AFP in Hiberno-English
3. Assessments of the Evidence
4. Attempts to Explain the Early HE Construction
5. Distribution and Function of the AFP in EMI and HE
   5.1. The AFP with the Future Tense in Irish
   5.2. The AFP with the Secondary Future or Conditional
   5.3. The AFP with the Subjunctive
   5.5. Functions of the AFP in Early Modern Irish and HE
6. The Restriction of the AFP to the Recent Perfect
7. Conclusions
When taking a flipped learning approach, MOOC content can be used for online pre-class instruction, after which students can put the knowledge they gained from the MOOC into practice either synchronously or asynchronously. This study examined one such asynchronous course in teacher education. The course ran with 40 students over 13 weeks, from February to May 2020. A case study approach was followed, using mixed methods to assess the efficacy of the course. Quantitative data were gathered on achievement of learning outcomes, online engagement, and satisfaction. Qualitative data were gathered via student interviews, on which a thematic analysis was undertaken. From a combined analysis of the data, three themes emerged as pertinent to course efficacy: quality and quantity of communication and collaboration; suitability of the MOOC; and significance for career development.
On doubling unconditionals
(2019)
During the last few years there has been a tremendous growth of scientific activity in the fields related to both physics and control theory: nonlinear dynamics, micro- and nanotechnologies, self-organization and complexity, etc. New horizons have opened and new exciting applications have emerged. Experts with different backgrounds who are starting to work together need more opportunities for information exchange to improve mutual understanding and cooperation. The conference "Physics and Control 2007" is the third international conference focusing on the borderland between physics and control, with emphasis on both theory and applications. With its 2007 venue in Potsdam, Germany, the conference is located for the first time outside of Russia. The major goal of the conference is to bring together researchers from different scientific communities and to gain some general and unified perspectives on the study of controlled systems in physics, engineering, chemistry, biology and other natural sciences. We hope that the conference helps experts in control theory become acquainted with new interesting problems, and helps experts in physics and related fields learn more about ideas and tools from modern control theory.
Interdisciplinary studies on information structure : ISIS ; Working papers of the SFB 632. - Vol. 8
(2007)
The 8th volume of the working paper series Interdisciplinary Studies on Information Structure (ISIS) of the SFB 632 contains a collection of eight papers contributed by guest authors and SFB-members. The first paper on “Biased Questions” is an invited contribution by Nicholas Asher (CNRS, Laboratoire IRIT) & Brian Reese (University of Texas at Austin). Surveying English tag questions, negative polar questions, and what they term “focus” questions, they investigate the effects of prosody on discourse function and discourse structure and analyze the interaction between prosody and discourse in SDRT (Segmented Discourse Representation Theory). Stefan Hinterwimmer (A2) explores the interpretation of singular definites and universally quantified DPs in adverbially quantified English sentences. He suggests that the availability of a co-varying interpretation is more constrained in the case of universally quantified DPs than in the case of singular definites, because different from universally quantified DPs, co-varying definites are inherently focus-marked. The existence of striking similarities between topic/comment structure and bimanual coordination is pointed out and investigated by Manfred Krifka (A2). Showing how principles of bimanual coordination influence the expression of topic/comment structure beyond spoken language, he suggests that bimanual coordination might have been a preadaptation of the development of Information Structure in human communication. Among the different ways of expressing focus in Foodo, an underdescribed African Guang language of the Kwa family, the marked focus constructions are the central topic of the paper by Ines Fiedler (B1 & D2). Exploring the morphosyntactic facilities that Foodo has for focalization, she suggests that the two focus markers N and n have developed out of a homophone conjunction. Focus marking in another scarcely documented African tone language, the Gur language Konkomba, is treated by Anne Schwarz (B1 & D2). 
Comparing the two alleged focus markers lé and lá of the language, she argues that lé is better interpreted as a syntactic device rather than as a focus marker and shows that this analysis is corroborated by parallels in related languages. The reflexes of Information Structure in four different European languages (French, German, Greek and Hungarian) are compared and evaluated by Sam Hellmuth & Stavros Skopeteas (D2). The production data were collected with selected materials from the Questionnaire on Information Structure (QUIS) developed at the SFB. The results not only allow for an evaluation of the current elicitation paradigms, but also help to identify potentially fruitful avenues of future research. Frank Kügler, Stavros Skopeteas (D2) & Elisabeth Verhoeven (University of Bremen) give an account of the encoding of Information Structure in Yucatec Maya, a Mayan tone language spoken on the Yucatán peninsula in Mexico. The results of a production experiment lead them to the conclusion that focus is mainly expressed by syntax in this language. Stefanie Jannedy (D3) undertakes an instrumental investigation of the expression and interpretation of focus in Vietnamese, a language of the Mon-Khmer family contrasting six lexical tones. The data strongly suggest that focus in Vietnamese is exclusively marked by prosody (intonational emphasis expressed via duration, f0 and amplitude) and that different focus conditions can reliably be recovered. This volume offers insights into current work conducted at the SFB 632, comprising empirical and theoretical aspects of Information Structure in a multitude of languages. Several of the papers mine fieldwork data collected during the first phase of the SFB and explore the expression of Information Structure in tone and non-tone languages from various regions of the world.
Grammatica Grandonica
(2013)
In May 2010, Johann Ernst Hanxleden’s Grammatica Grandonica was rediscovered in Montecompatri (Lazio, Rome). Although historiographers attached much weight to what is nearly the oldest Western grammar of Sanskrit, the precious manuscript had been lost for several decades. The first aim of the present digital publication is to offer a photographic reproduction of the manuscript. This facsimile is accompanied by a double edition: a facing diplomatic edition with the Sanskrit in Malayāḷam script, followed by a transliterated established text.
Preface
(2010)
Aspect-oriented programming, component models, and design patterns are modern and actively evolving techniques for improving the modularization of complex software. In particular, these techniques hold great promise for the development of "systems infrastructure" software, e.g., application servers, middleware, virtual machines, compilers, operating systems, and other software that provides general services for higher-level applications. The developers of infrastructure software are faced with increasing demands from application programmers needing higher-level support for application development. Meeting these demands requires careful use of software modularization techniques, since infrastructural concerns are notoriously hard to modularize. Aspects, components, and patterns provide very different means to deal with infrastructure software, but despite their differences, they have much in common. For instance, component models try to free the developer from the need to deal directly with services like security or transactions. These are primary examples of crosscutting concerns, and modularizing such concerns is the main target of aspect-oriented languages. Similarly, design patterns like Visitor and Interceptor facilitate the clean modularization of otherwise tangled concerns. Building on the ACP4IS meetings at AOSD 2002-2009, this workshop aims to provide a highly interactive forum for researchers and developers to discuss the application of and relationships between aspects, components, and patterns within modern infrastructure software. The goal is to put aspects, components, and patterns into a common reference frame and to build connections between the software engineering and systems communities.
Contents:
1. Introduction
   1.1. Research objectives
   1.2. Methodology
   1.3. Structure of the pilot study
2. Municipal administrative reform in Brandenburg
3. The district administration of Potsdam-Mittelmark
   3.1. The district of Potsdam-Mittelmark
   3.2. The staff of the district administration
   3.3. Civil-service appointment concept
   3.4. Consequences of the district territorial reform
   3.5. Gender equality issues
4. Administrative reform in the district of Potsdam-Mittelmark
   4.1. The reform approach
   4.2. Further reform steps
   4.3. Goals of the reform
   4.4. Mission statement discussion
   4.5. Staff and reform
   4.6. Staff council and reform
   4.7. ÖTV and reform
5. Personnel issues in the administrative reform in the district of Potsdam-Mittelmark
   5.1. Deficits in the personnel area
   5.2. On the work motivation of district administration staff
   5.3. Elements of modern personnel management
   5.4. Instruments of personnel work
      5.4.1. Staff survey
      5.4.2. Further training
6. Results of the pilot study
   6.1. Particularities of administrative reform in the new federal states, using the example of Potsdam-Mittelmark
   6.2. Interim assessment of the implementation of the modernization concept
   6.3. Proposals for the continuation of the project
Interdisciplinary studies on information structure : ISIS ; Working papers of the SFB 632 - Vol. 5
(2006)
In this paper we compare the behaviour of adverbs of frequency (de Swart 1993), like usually, with the behaviour of adverbs of quantity, like for the most part, in sentences that contain plural definites. We show that sentences containing the former type of Q-adverb provide evidence that Quantificational Variability Effects (Berman 1991) come about as an indirect effect of quantification over situations: in order for quantificational variability readings to arise, these sentences have to obey two newly observed constraints that clearly set them apart from sentences containing corresponding quantificational DPs, and that can plausibly be explained under the assumption that quantification over (the atomic parts of) complex situations is involved. Concerning sentences with the latter type of Q-adverb, on the other hand, such evidence is lacking: with respect to the constraints just mentioned, they behave like sentences that contain corresponding quantificational DPs. We take this as evidence that Q-adverbs like for the most part do not quantify over the atomic parts of sum eventualities in the cases under discussion (as claimed by Nakanishi and Romero (2004)), but rather over the atomic parts of the respective sum individuals.
What is "Celtic" and what is universal in the "Celtic Englishes"? This was the central concern of the fourth and final colloquium on language contact between English and the Celtic languages, held at the University of Potsdam in September 2004. The contributions to this volume discuss the "Celtic" peculiarities of Standard English in England and in Ireland (North and South). They also examine the perceived "Celticity" of personal names in the "Celtic" countries (Ireland, Wales, Cornwall, Brittany). Moreover, they put emphasis on specific grammatical features such as the expression of perfectivity, relativity and intensification, and the typological shift of verbal word formation from syntheticity to analyticity, as well as the emergence of universal contact trends shared by Celtic, African and Indian Englishes. Thus, the choice of contributors and the scope of their articles make Celtic Englishes IV an invaluable handbook for scholarly work in the field of English-Celtic relations.