As mid-19th-century American Jews introduced radical changes to their religious observance and began to define Judaism in new ways, to what extent did they engage with European Jewish ideas? Historians often approach religious change among Jews from German lands during this period as if Jewish immigrants had come to America with one set of ideas that then evolved solely in conversation with their American contexts. Historians have similarly cast the kinds of Judaism Americans created as both unique to America and uniquely American. These characterizations are accurate to an extent. But to what extent did Jewish innovations in the United States take place in conversation with European Jewish developments? Looking to the 19th-century American Jewish press, this paper seeks to understand how American Jews engaged European Judaism in formulating their own ideas, understanding themselves, and understanding their place in world Judaism.
This article explores the multi-directional geographic trajectories and ties of Jews who came to the United States in the 19th century, working to complicate simplistic understandings of “German” Jewish immigration. It focuses on the case study of Henry Cohn, an ordinary Russian-born Jew whose journeys took him to Prussia, New York, Savannah, and California. Once in the United States he returned to Europe twice, the second time permanently, although a grandson ended up in California, where he worked to ensure the preservation of Cohn’s records. This story highlights how Jews navigated and transgressed national boundaries in the 19th century and the limitations of the historical narratives that have been constructed from their experiences.
Knowledge of transformation pathways and identification of transformation products (TPs) of veterinary drugs are important for animal health, food safety, and environmental protection. The active agent monensin (MON) belongs to the ionophore antibiotics and is widely used as a veterinary drug against coccidiosis in broiler farming. However, no electrochemically (EC) generated TPs of MON have been described so far. In this study, the online coupling of EC and mass spectrometry (MS) was used to generate oxidative TPs. EC conditions were optimized with respect to working-electrode material, solvent, modifier, and potential polarity. Subsequent LC/HRMS (liquid chromatography/high-resolution mass spectrometry) and MS/MS experiments were performed to identify the structures of the derived TPs by a suspected-target analysis. The EC results were compared to TPs observed in metabolism tests with microsomes and in hydrolysis experiments of MON. Five previously undescribed TPs of MON were identified in our EC/MS-based study, and one TP already known from the literature and found by a microsomal assay could be confirmed. Two and three further TPs were found as products in the microsomal tests and following hydrolysis, respectively. We found decarboxylation, O-demethylation, and acid-catalyzed ring-opening reactions to be the major mechanisms of MON transformation.
The paper investigates the sustainability of capacity-building initiatives by reporting on the multiplication training on quality assurance in Uganda, conducted within the DIES NMT Programme, and on how the training drew on the social capital of the existing quality assurance network to sustain itself and address challenges during implementation. The purpose of the article is to explore the nature of the networking (social and institutional) established by the Ugandan Universities Quality Assurance Forum (UUQAF) and to share the strategies used in this training experience for future sustainable capacity-building initiatives in emerging economies. The paper employed a qualitative research method to describe and analyse the training framework based on primary and secondary documents.
Higher education institutions in Guinea face many challenges, including reporting responsibilities, globalisation, and massification. Institutional evaluations of higher education and research institutions in 2013 failed to initiate change processes within the institutions. Recently, however, various initiatives have been started to change this situation, with the aim of raising awareness of and building capacity for quality assurance structures in Guinean HEIs. So far, the emphasis has been put on quality enhancement in higher education, especially on teaching evaluation and curriculum development, as well as on establishing quality assurance structures. This article gives an overview of the state of play, takes stock of the activities that have been initiated to set up quality assurance mechanisms in higher education and research institutions, and presents perspectives for the further development of the quality approach in Guinea. The project ‘Quality Assurance Multiplication 2017-2018’ serves as an example to describe approaches and activities in setting up stable quality assurance structures and in strengthening and raising awareness for a ‘quality culture’.
Whilst providing a framework for learning and scientific emancipation, proposal-writing training is confronted with various organisational and didactic challenges that influence the achievement of the set training objectives. Based on observations made during the workshops for proposal writing organised in Kinshasa, Democratic Republic of Congo, as part of the NMT Programme, the article raises two main questions: (a) How can these challenges be overcome and successfully addressed in the training? (b) What is the level of learning outcomes of the participants at the end of the training? The article shows that the success of the training lies in the relevance of the employed training approaches. The use of a participatory approach encouraged constructive exchanges between participants, trainers, and experts, and enabled all participants to finalise coherent projects to apply for national and international funding.
This article presents the results of a qualitative study focused on representatives of Colombian Higher Education Institutions partaking in the training ‘Internationalisation for Peacebuilding 2018’. The selected Higher Education Institutions and representatives were all located in regions acutely affected by the Colombian armed conflict, now experiencing multifaceted challenges and opportunities in a post-conflict scenario. Interviews with participants of the training were conducted to analyse the skills acquired and to identify possible improvements brought about by the training at the institutions. The article further identifies specific needs of the institutions to be taken into account for future courses on internationalisation for higher education institutions.
During the National Multiplication Training in Kenya in 2018, participants raised concerns about attrition, completion rates and quality of PhD programmes in Kenya’s public universities. This led the authors of this article to further examine the question of PhD completion rates. Available data underlined that PhD students across various disciplines in Kenya’s public universities take unnecessarily long to complete their studies due to a myriad of factors that are related to their supervisors, university guidelines for post-graduate studies, or the students themselves. This article examines inertia areas along the PhD training pathway at three public universities in Kenya and provides suggestions on structural and operational changes universities must make to shorten completion periods.
Deans at Institutions of Higher Education are seldom recipients of effective or specific professional management training, institutional mentorship, and coaching despite an increasing demand on them to play a more dynamic leadership role in the face of ever-changing local and global challenges. To address this deficiency, the inaugural Malaysian Chapter of the International Deans’ Course (MyIDC) was held in three parts over 2019 and 2020. In this paper, findings related to feedback on the programme are presented and discussed. Responses from the participants from two sets of surveys, and written feedback provided by two IDC international trainers involved in MyIDC were analysed. These reveal potential areas of improvement for the forthcoming MyIDC programme, such as in terms of planning and organisation, duration, content, and delivery. The article explores the lessons learnt from the MyIDC 2019/2020 training programme and discusses the improvements that can be made arising from the feedback received.
The higher education structure in Malaysia has experienced significant changes since the implementation of the Private Higher Educational Institutions Act of 1996. The unprecedented expansion of the higher education sector and the increasing autonomy conferred to universities have created a huge demand for competent university leadership that supports the development of higher education in Malaysia. This article discusses the very first national multiplication training in Malaysia in 2014 and analyses such outcomes as the identification of good practices for future initiatives and applications in university leadership training.
Dermal Delivery of the High-Molecular-Weight Drug Tacrolimus by Means of Polyglycerol-Based Nanogels
(2019)
Polyglycerol-based thermoresponsive nanogels (tNGs) have been shown to have excellent skin hydration properties and to be valuable delivery systems for sustained release of drugs into skin. In this study, we compared the skin penetration of tacrolimus formulated in tNGs with a commercial 0.1% tacrolimus ointment. The penetration of the drug was investigated in ex vivo abdominal and breast skin, while different methods for skin barrier disruption were investigated to improve skin permeability or simulate inflammatory conditions with compromised skin barrier. The amount of penetrated tacrolimus was measured in skin extracts by liquid chromatography tandem-mass spectrometry (LC-MS/MS), whereas the inflammatory markers IL-6 and IL-8 were detected by enzyme-linked immunosorbent assay (ELISA). Higher amounts of tacrolimus penetrated in breast as compared to abdominal skin or in barrier-disrupted as compared to intact skin, confirming that the stratum corneum is the main barrier for tacrolimus skin penetration. The anti-proliferative effect of the penetrated drug was measured in skin tissue/Jurkat cells co-cultures. Interestingly, tNGs exhibited similar anti-proliferative effects as the 0.1% tacrolimus ointment. We conclude that polyglycerol-based nanogels represent an interesting alternative to paraffin-based formulations for the treatment of inflammatory skin conditions.
We extend the scope of European palaeogenomics by sequencing the genomes of Late Upper Palaeolithic (13,300 years old, 1.4-fold coverage) and Mesolithic (9,700 years old, 15.4-fold) males from western Georgia in the Caucasus and a Late Upper Palaeolithic (13,700 years old, 9.5-fold) male from Switzerland. While we detect Late Palaeolithic-Mesolithic genomic continuity in both regions, we find that Caucasus hunter-gatherers (CHG) belong to a distinct ancient clade that split from western hunter-gatherers ~45 kya, shortly after the expansion of anatomically modern humans into Europe, and from the ancestors of Neolithic farmers ~25 kya, around the Last Glacial Maximum. CHG genomes significantly contributed to the Yamnaya steppe herders who migrated into Europe ~3,000 BC, supporting a formative Caucasus influence on this important Early Bronze Age culture. CHG left their imprint on modern populations from the Caucasus and also central and south Asia, possibly marking the arrival of Indo-Aryan languages.
Understanding heat transport in sedimentary basins requires an assessment of the regional 3D heat distribution and of the main physical mechanisms responsible for the transport of heat. We review results from different 3D numerical simulations of heat transport based on 3D basin models of the Central European Basin System (CEBS). To this end, we compare differently detailed 3D structural models of the area, previously published individually, to assess the influence on the resulting thermal field and groundwater flow of (1) different configurations of the deeper lithosphere, (2) the mechanism of heat transport considered, and (3) large faults dissecting the sedimentary succession. Based on this comparison we propose a modelling strategy linking the regional and lithosphere scale to the sub-basin and basin-fill scale while appropriately considering the effective heat transport processes. We find that conduction, as the dominant mechanism of heat transport in sedimentary basins, is controlled by the distribution of thermal conductivities, by compositional and thickness variations of both the conductive and radiogenic crystalline crust and the insulating sediments, and by variations in the depth to the thermal lithosphere-asthenosphere boundary. Variations of these factors cause thermal anomalies of specific wavelengths and must be accounted for in regional thermal studies. In addition, advective heat transport exerts control on the thermal field on the regional scale. In contrast, convective heat transport and heat transport along faults are only locally important and need to be considered for exploration on the reservoir scale. The general applicability of the proposed workflow makes it of interest for a broad range of applications in geosciences, including oil and gas exploration, geothermal utilization, and carbon capture and sequestration.
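As a reminder of the physics such simulations solve (our notation, not reproduced from the paper), steady-state conductive heat transport with radiogenic heat production and an advective groundwater term can be written as:

```latex
% Steady-state basin heat transport:
% \lambda  bulk thermal conductivity, T temperature,
% A        radiogenic heat production,
% \rho_f, c_f  pore-fluid density and heat capacity,
% \vec{v}  Darcy (fluid) velocity.
\nabla \cdot \left( \lambda \, \nabla T \right) + A
  \;=\; \rho_f \, c_f \, \vec{v} \cdot \nabla T
```

With the fluid velocity set to zero, the right-hand side vanishes and the purely conductive case discussed in the abstract is recovered.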
The weather in Eurasia, Australia, and North and South America is largely controlled by the strength and position of extratropical storm tracks. Future climate change will likely affect these storm tracks and the associated transport of energy, momentum, and water vapour. Many recent studies have analyzed how storm tracks will change under climate change, and how these changes are related to atmospheric dynamics. However, there are still discrepancies between different studies on how storm tracks will change under future climate scenarios. Here, we show that under global warming the CMIP5 ensemble of coupled climate models projects only small relative changes in vertically averaged mid-latitude mean storm track activity during the northern winter, but agrees in projecting a substantial decrease during summer. Seasonal changes in the Southern Hemisphere show the opposite behaviour, with an intensification in winter and no change during summer. These distinct seasonal changes in northern summer and southern winter storm tracks lead to an amplified seasonal cycle in a future climate. Similar changes are seen in the mid-latitude mean Eady growth rate maximum, a measure that combines changes in vertical shear and static stability based on baroclinic instability theory. Regression analysis between changes in the storm tracks and changes in the maximum Eady growth rate reveals that most models agree in a positive association between the two quantities over mid-latitude regions.
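The Eady growth rate mentioned above combines vertical wind shear and static stability; for orientation, its standard textbook form (not quoted from the paper) is:

```latex
% Maximum Eady growth rate:
% f  Coriolis parameter, N  Brunt-Vaisala frequency,
% |\partial U / \partial z|  vertical shear of the zonal wind.
\sigma_{E} \;=\; 0.31 \, \frac{f}{N}
  \left| \frac{\partial U}{\partial z} \right|
```

Stronger shear or weaker stratification raises the growth rate, which is why the quantity tracks storm-track activity in the analysis described.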
The Great Hungarian Plain was a crossroads of cultural transformations that have shaped European prehistory. Here we analyse a 5,000-year transect of human genomes, sampled from petrous bones giving consistently excellent endogenous DNA yields, from 13 Hungarian Neolithic, Copper, Bronze and Iron Age burials including two to high (~22x) and seven to ~1x coverage, to investigate the impact of these on Europe's genetic landscape. These data suggest genomic shifts with the advent of the Neolithic, Bronze and Iron Ages, with interleaved periods of genome stability. The earliest Neolithic context genome shows a European hunter-gatherer genetic signature and a restricted ancestral population size, suggesting direct contact between cultures after the arrival of the first farmers into Europe. The latest, Iron Age, sample reveals an eastern genomic influence concordant with introduced Steppe burial rites. We observe transition towards lighter pigmentation and surprisingly, no Neolithic presence of lactase persistence.
1,4-Di(homo)allyl-2,5-diketopiperazines are synthesized and polymerized via ADMET using the Hoveyda-Grubbs 2nd generation catalyst. The but-3-enylated diketopiperazine can be converted into unsaturated tertiary polyamide with molar mass of <3000 g mol(-1), whereas the allylated diketopiperazine cannot. Double-bond isomerization occurs regardless of whether or not benzoquinone is present. A polyesteramide with a higher molar mass of ca. 4800 g mol(-1) is obtained by the alternating copolymerization (ALTMET) of 1,4-di(but-3-enyl)-2,5-diketopiperazine and ethylene glycol diacrylate. A post-polymerization modification of the poly(ester)amides via radical thiol-ene chemistry, however, fails.
The mid- to late Holocene interval is characterised by a highly variable climate in response to a gradual change in orbital insolation. The seasonal impact of these changes on the Eifel Maar region is not yet well documented, largely due to uncertainties about the completeness of this archive ("missing varves" in the well-known Lake Holzmaar) and a limited understanding of the factors (e.g. temperature, precipitation) influencing the seasonality archived within the laminations/varves. In this study we approach these challenges from a different perspective. Using detailed microfacies investigations we: (1) demonstrate that the ambiguity about the "missing varves" is related to the climate-induced complex biotic and abiotic laminations that led to mis-identification of varves; (2) use a combination of detailed microfacies investigations (varve structure, seasonality of biotic and abiotic signals), lamination quality, varve counts on multiple cores, and published and new radiocarbon dates to develop a continuous master chronology based on a Bayesian modelling approach. The dates of major climate, volcanic, and archaeological events determined using our model are in good agreement with the independently determined ages of the same events from other archives, confirming the accuracy of our age model; (3) test the sensitivity of the seasonal proxies to the available data on mid-Holocene changes in temperature and precipitation; (4) demonstrate that the changes in lake eutrophicity are correlative with temperature changes in NW Europe and probably triggered by solar variability; and (5) show that the early Iron Age onset of eutrophication in Lake Holzmaar was climate induced and began several decades before the impact of anthropogenic activity was seen in the form of intensified detrital erosion in the catchment area. Our work has implications for understanding the impact of climate change and anthropogenic activities on limnological systems.
Two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GCxGC-TOF-MS) is a promising technique to overcome the limits of complex metabolome analysis using one-dimensional GC-TOF-MS. Especially at the stage of data export and data mining, however, convenient procedures to cope with the complexity of GCxGC-TOF-MS data are still in development. Here, we present a high-sample-throughput protocol exploiting the first and second retention index for spectral library search and subsequent construction of a high-dimensional data matrix useful for statistical analysis. The method was applied to the analysis of 13C-labelling experiments in the unicellular green alga Chlamydomonas reinhardtii. We developed a rapid sampling and extraction procedure for the Chlamydomonas reinhardtii laboratory strain CC503, a cell-wall-deficient mutant. By testing all published quenching protocols, we observed dramatic leakage rates for certain metabolites. To circumvent metabolite leakage, samples were directly quenched and analyzed without separation of the medium. The growth medium was adapted to this rapid sampling protocol to avoid interference with the GCxGC-TOF-MS analysis. To analyse batches of samples, a new software tool, MetMax, was implemented, which extracts the isotopomer matrix from stable isotope labelling experiments together with the first and second retention index (RI1 and RI2). To exploit RI1 and RI2 for metabolite identification, we used the Golm metabolome database (GMD) [1] with RI1/RI2 reference spectra and new search algorithms. Using these techniques, we analysed the dynamics of 13CO2 and 13C-acetate uptake in Chlamydomonas reinhardtii cells in two different steady states, namely photoautotrophic and mixotrophic growth conditions.
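The retention-index matching described above can be illustrated with the standard van den Dool-Kratz linear retention index; the abstract does not give its formula, so this is a generic sketch, and the alkane retention times below are invented for illustration:

```python
def linear_retention_index(t_x, alkane_times):
    """Linear (van den Dool-Kratz) retention index of an analyte.

    t_x          -- retention time of the analyte
    alkane_times -- dict mapping n-alkane carbon number -> retention time
                    (the reference 'alkane ladder' run on the same column)
    """
    carbons = sorted(alkane_times)
    # Find the pair of bracketing alkanes and interpolate linearly.
    for n, n1 in zip(carbons, carbons[1:]):
        t_n, t_n1 = alkane_times[n], alkane_times[n1]
        if t_n <= t_x <= t_n1:
            return 100 * (n + (t_x - t_n) / (t_n1 - t_n))
    raise ValueError("retention time lies outside the alkane ladder")
```

An analyte eluting exactly halfway between C10 and C11 standards, for example, receives an index of 1050; the same interpolation applies independently to each chromatographic dimension to obtain RI1 and RI2.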
The challenge is providing teachers with the resources they need to strengthen their instruction and better prepare students for the jobs of the 21st century. Technology can help meet the challenge. Teachers TryScience is a noncommercial offering, developed by the New York Hall of Science, TeachEngineering, the National Board for Professional Teaching Standards, and IBM Citizenship, to provide teachers with such resources. The workshop provides deeper insight into this tool and a discussion of how to support the teaching of informatics in schools.
.NET Gadgeteer Workshop
(2013)
Problem solving is one of the central activities performed by computer scientists as well as by computer science learners. Whereas the teaching of algorithms and programming languages is usually well structured within a curriculum, the development of learners’ problem-solving skills is largely implicit and less structured. Students at all levels often face difficulties in problem analysis and solution construction. The basic assumption of the workshop is that without some formal instruction on effective strategies, even the most inventive learner may resort to unproductive trial-and-error problem-solving processes. Hence, it is important to teach problem-solving strategies and to guide teachers on how to teach their pupils this cognitive tool. Computer science educators should be aware of the difficulties and acquire appropriate pedagogical tools to help their learners gain and experience problem-solving skills.
A method is presented of acquiring the principles of three sorting algorithms through developing interactive applications in Excel.
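The abstract does not name the three sorting algorithms; as one plausible example, insertion sort mirrors the step-by-step cell updates that a spreadsheet animation would show, so a compact reference implementation may help fix the idea:

```python
def insertion_sort(values):
    """Return a sorted copy of `values` using insertion sort.

    Each pass takes the next unsorted element ('key') and shifts
    larger elements one slot to the right until the key's position
    is found -- exactly the cell-by-cell movement one could animate
    in a spreadsheet grid.
    """
    a = list(values)  # work on a copy; leave the input untouched
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right to open a slot for the key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

Selection sort and bubble sort lend themselves to the same grid-based visualisation, since each also reduces to a sequence of local swaps or shifts.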
We present a concept for better integration of practical teaching in student teacher education in Computer Science. As an introduction to the workshop, different possible scenarios are discussed on the basis of examples. Afterwards, workshop participants will have the opportunity to discuss the application of the concepts in other settings.
In this paper we report on our experiments in teaching computer science concepts with a mix of tangible and abstract object manipulations. The goal we set ourselves was to let pupils discover the challenges one has to meet to automatically manipulate formatted text. We worked with a group of 25 secondary school pupils (9-10th grade), and they were actually able to “invent” the concept of mark-up language. From this experiment we distilled a set of activities which will be replicated in other classes (6th grade) under the guidance of maths teachers.
Informatics as a school subject has been virtually absent from bilingual education programs in German secondary schools. Most bilingual programs in German secondary education started out by focusing on subjects from the field of social sciences. Teachers and bilingual curriculum experts alike have been regarding those as the most suitable subjects for bilingual instruction – largely due to the intercultural perspective that a bilingual approach provides. And though one cannot deny the gain that ensues from an intercultural perspective on subjects such as history or geography, this benefit is certainly not limited to social science subjects. In consequence, bilingual curriculum designers have already begun to include other subjects such as physics or chemistry in bilingual school programs. It only seems a small step to extend this to informatics. This paper will start out by addressing potential benefits of adding informatics to the range of subjects taught as part of English-language bilingual programs in German secondary education. In a second step it will sketch out a methodological (= didactical) model for teaching informatics to German learners through English. It will then provide two items of hands-on and tested teaching material in accordance with this model. The discussion will conclude with a brief outlook on the chances and prerequisites of firmly establishing informatics as part of bilingual school curricula in Germany.
We shall examine the Pedagogical Content Knowledge (PCK) of Computer Science (CS) teachers concerning students’ Computational Thinking (CT) problem solving skills within the context of a CS course in Dutch secondary education and thus obtain an operational definition of CT and ascertain appropriate teaching methodology. Next we shall develop an instrument to assess students’ CT and design a curriculum intervention geared toward teaching and improving students’ CT problem solving skills and competences. As a result, this research will yield an operational definition of CT, knowledge about CT PCK, a CT assessment instrument and teaching materials and accompanying teacher instructions. It shall contribute to CS teacher education, development of CT education and to education in other (STEM) subjects where CT plays a supporting role, both nationally and internationally.
We launched an original large-scale experiment concerning informatics learning in French high schools. We are using the France-IOI platform to federate resources and share observations for research. The first step is the implementation of an adaptive hypermedia based on very fine-grained epistemic modules for learning Python programming. We define the traces that need to be collected in order to study the navigation trajectories pupils draw across this hypermedia. It may be browsed by pupils either as a course support or as extra help to solve the list of exercises (mainly for discovering algorithmics). By leaving the locus of control to the learner, we want to observe the different trajectories they draw through our system. These trajectories may be abstracted and interpreted as strategies and then compared for their relative efficiency. Our hypothesis is that learners have different profiles and may use the appropriate strategy accordingly. This paper presents the research questions, the method, and the expected results.
The traditional purpose of algorithms in education is to prepare students for programming. In our effort to introduce the practically missing computing science into Czech general secondary education, we have revisited this purpose. We propose an approach that is in better accordance with the goals of general secondary education in Czechia. The importance of programming is diminishing, while recognition of algorithmic procedures and precise (yet concise) communication of algorithms is gaining importance. This includes expressing algorithms in natural language, which is more useful for most students than programming. We propose criteria to evaluate such descriptions. Finally, an idea about the limitations is required (inefficient algorithms, unsolvable problems, Turing’s test). We describe these adjusted educational goals and an outline of the resulting course. Our experience with carrying out the proposed intentions is satisfactory, although we did not accomplish all the defined goals.
Japan launched the new Course of Study in April 2012, which has been carried out in elementary schools and junior high schools. It will also be implemented in senior high schools from April 2013. This article presents an overview of information studies education in the new Course of Study for K-12. In addition, the authors point out what role experts in informatics and information studies education should play in general education centered around information studies, which is meant to help the nation's people lead active, empowered, and flexible lives.
This article is a summary of the work carried out by the Ministry of Education in Turkey, in terms of the development of a new ICT Curriculum, together with the e-Training of teachers who will play an important role in the forthcoming pilot study. Based on recent literature on the topic, the article starts by introducing the “F@tih Project”, a national project that aims to effectively integrate technology into schools. After assessing teachers’ and students’ ICT competencies, as defined internationally, the review continues with the proposed model for the e-training of teachers. Summarizing the process of development of the new ICT curriculum, researchers underline key points of the curriculum such as dimensions, levels and competencies. Then teachers’ e-training approaches, together with selected tools, are explained in line with the importance and stages of action research that will be used throughout the pilot implementation of the curriculum and e-training process.
A comparison of current trends within computer science teaching in school in Germany and the UK
(2013)
In the last two years, CS as a school subject has gained a lot of attention worldwide, although different countries have differing approaches to and experiences of introducing CS in schools. This paper reports on a study comparing current trends in CS at school, with a major focus on two countries, Germany and the UK. A survey was carried out among a number of teaching professionals and experts from the UK and Germany with regard to the content and delivery of CS in school. An analysis of the quantitative data reveals a difference in foci between the two countries; putting this into the context of curricular developments, we are able to offer interpretations of these trends and suggest ways in which curricula in CS at school should be moving forward.
The aim of our article is to collect and present information about contemporary programming environments that are suitable for primary education. We studied the ways they implement (or do not implement) some programming concepts, the ways programs are represented and built in order to support young and novice programmers, as well as their suitability to allow different forms of sharing the results of pupils’ work. We present not only a short description of each considered environment and the taxonomy in the form of a table, but also our understanding and opinions on how and why the environments implement the same concepts and ideas in different ways and which concepts and ideas seem to be important to the creators of such environments.
The process of introducing compulsory ICT education at primary school level in the Czech Republic should be completed next year. Programming and Information, two topics from the basics of computer science have been included in a new textbook. The question is whether the new chapters of the textbook are comprehensible for primary school teachers, who have undergone no training in computer science. The paper reports on a pilot verification project in which pre-service primary school teachers were trained to teach these informatics topics.
In this paper, we show how the theory of NP completeness can be introduced to students in secondary schools. The motivation of this research is that although there are difficult issues that require technical backgrounds, students are already familiar with demanding computational problems through games such as Sudoku or Tetris. Our intention is to bring together important concepts in the theory of NP completeness in such a way that students in secondary schools can easily understand them. This is part of our ongoing research about how to teach fundamental issues in Computer Science in secondary schools. We discuss what needs to be taught in which sequence in order to introduce ideas behind NP completeness to students without technical backgrounds.
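A concrete way to convey the NP intuition behind games like Sudoku is that *checking* a proposed solution is fast even when *finding* one is hard. The short verifier below is our own sketch, not material from the paper:

```python
def is_valid_sudoku(grid):
    """Verify a completed 9x9 Sudoku grid (lists of ints 1-9).

    Every row, column, and 3x3 box must contain each digit 1-9
    exactly once. The check runs in time linear in the number of
    cells -- polynomial verification of a certificate, the hallmark
    of problems in NP.
    """
    def ok(cells):
        return sorted(cells) == list(range(1, 10))

    rows = all(ok(row) for row in grid)
    cols = all(ok([grid[r][c] for r in range(9)]) for c in range(9))
    boxes = all(
        ok([grid[r][c] for r in range(br, br + 3) for c in range(bc, bc + 3)])
        for br in range(0, 9, 3) for bc in range(0, 9, 3)
    )
    return rows and cols and boxes
```

The contrast students can appreciate without technical background: writing this verifier takes a dozen lines, while no known algorithm fills an n^2 x n^2 generalised Sudoku grid efficiently in all cases.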
Development of competence-oriented curricula is still an important theme in informatics education. Unfortunately, informatics curricula that include the domain of logic programming are still input-oriented or lack detailed competence descriptions. The development of a competence model and of descriptions of learning outcomes is therefore essential for the learning process in this domain. Prior research developed both. The next research step is to formulate test items that measure the described learning outcomes. This article describes this procedure and gives example test items. It also relates a school test to the items and shows which misconceptions and typical errors are important to discuss in class. The test results can also confirm or disprove the competence model. This school test is therefore important both for theoretical research and for the concrete planning of lessons. Quantitative analysis in schools is important for the evaluation and improvement of informatics education.
The most recent trend in the study of LF intervention effects makes crucial reference to focusing effects on the interveners, and this paper critically examines the representative analyses of the focus-based approach. While each analysis has its own merits and shortcomings, I argue that a pragmatic analysis that does not appeal to syntactic configurations is better equipped to deal with many of the complex and delicate facts surrounding intervention effects.
When we pay close attention to the prosody of Wh-questions in Japanese, we discover many novel and interesting empirical puzzles that require us to devise a much finer syntactic component of grammar. This paper addresses issues that pose problems for such an elaborated grammar, and offers solutions by appealing to the information structure and sentence processing involved in the interpretation of interrogative and focus constructions.
This paper discusses how focus changes prosodic structure in Tokyo Japanese. It is generally believed that focus blocks the intonational process of downstep and causes a pitch reset. This paper presents experimental evidence against this traditional view by examining the prosodic behavior of Wh words, which receive focus lexically in Japanese as in other languages. It is demonstrated, specifically, that the focused Wh element does not block downstep, although it receives a much higher pitch than the preceding element. This suggests that the presence of lexical focus does not trigger pitch reset in Japanese.
The end of culture?
(2000)
Local Orders, Global Chaos
(1999)
Face-to-face communication is multimodal. In unscripted spoken discourse we can observe the interaction of several "semiotic layers", modalities of information such as syntax, discourse structure, gesture, and intonation. We explore the role of gesture and intonation in structuring and aligning information in spoken discourse through a study of the co-occurrence of pitch accents and gestural apices. Metaphorical spatialization through gesture also plays a role in conveying the contextual relationships between the speaker, the government and other external forces in a naturally-occurring political speech setting.
This paper investigates the structural properties of morphosyntactically marked focus constructions, focussing on the often neglected non-focal sentence part in African tone languages. Based on new empirical evidence from five Gur and Kwa languages, we claim that these focus expressions have to be analysed as biclausal constructions even though they do not represent clefts containing restrictive relative clauses. First, we relativize the partly overgeneralized assumptions about structural correspondences between the out-of-focus part and relative clauses, and second, we show that our data do in fact support the hypothesis of a clause coordinating pattern as present in clause sequences in narration. It is argued that we deal with a non-accidental, systematic feature and that grammaticalization may conceal such basic narrative structures.
The Semantics of Ellipsis
(2005)
There are four phenomena that are particularly troublesome for theories of ellipsis: the existence of sloppy readings when the relevant pronouns cannot possibly be bound; an ellipsis being resolved in such a way that an ellipsis site in the antecedent is not understood in the way it was there; an ellipsis site drawing material from two or more separate antecedents; and ellipsis with no linguistic antecedent. These cases are accounted for by means of a new theory that involves copying syntactically incomplete antecedent material and an analysis of silent VPs and NPs that makes them into higher order definite descriptions that can be bound into.
We present a system for the linguistic exploration and analysis of lexical cohesion in English texts. Using an electronic thesaurus-like resource, Princeton WordNet, and the Brown Corpus of English, we have implemented a process of annotating text with lexical chains and a graphical user interface for inspection of the annotated text. We describe the system and report on some sample linguistic analyses carried out using the combined thesaurus-corpus resource.
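The core idea behind lexical chains can be sketched briefly. The following is a minimal, hypothetical illustration only: it uses a tiny hand-made relatedness table as a stand-in for the Princeton WordNet relations the described system actually queries, and a simple greedy chaining strategy that may differ from the system's own algorithm.

```python
# Toy "thesaurus" standing in for WordNet: each word maps to a set of
# related words (e.g. synonyms or hypernyms). Entirely hypothetical data.
RELATED = {
    "car": {"vehicle", "automobile"},
    "automobile": {"car", "vehicle"},
    "truck": {"vehicle"},
    "banana": {"fruit"},
    "apple": {"fruit"},
}

def related(w1, w2):
    """Words count as related if identical, directly linked, or sharing
    a related term (a crude stand-in for sharing a WordNet hypernym)."""
    r1, r2 = RELATED.get(w1, set()), RELATED.get(w2, set())
    return w1 == w2 or w2 in r1 or w1 in r2 or bool(r1 & r2)

def lexical_chains(words):
    """Greedily assign each word to the first chain that already
    contains a related word; otherwise start a new chain."""
    chains = []
    for w in words:
        for chain in chains:
            if any(related(w, c) for c in chain):
                chain.append(w)
                break
        else:
            chains.append([w])
    return chains
```

On the input `["car", "banana", "truck", "apple"]` this groups the vehicle words into one chain and the fruit words into another; annotating a text with such chains is what makes its lexical cohesion inspectable.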
Fronting of a non-finite VP across a finite main verb, akin to German "VP-topicalization", can also be found in Czech and Polish. The paper discusses evidence from large corpora for this process and some of its syntactic and information-structural properties. Based on this case, criteria for more user-friendly searching and retrieval of corpus data in syntactic research are developed.
Multiple hierarchies
(2005)
In this paper, we present the Multiple Annotation approach, which solves two problems: the problem of annotating overlapping structures, and the problem that occurs when documents should be annotated according to different, possibly heterogeneous tag sets. This approach has many advantages: it is based on XML, the modeling of alternative annotations is possible, each level can be viewed separately, and new levels can be added at any time. The files can be regarded as an interrelated unit, with the text serving as the implicit link. Two representations of the information contained in the multiple files (one in Prolog and one in XML) are described. These representations serve as a base for several applications.
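The key mechanism, keeping the annotation layers in separate files with the shared text as the implicit link, can be illustrated with a rough sketch. This is a hypothetical minimal standoff format, not the paper's actual XML or Prolog representations; layer names and spans are invented for illustration.

```python
# Shared base text: the implicit link between all annotation layers.
text = "the cat sat"

# Each layer annotates character spans of the same text independently,
# so structures that overlap across layers pose no problem, and a new
# layer (e.g. from a different tag set) can be added at any time.
layers = {
    "syntax": [(0, 7, "NP"), (8, 11, "VP")],
    "focus":  [(4, 11, "FOC")],  # overlaps the NP/VP boundary
}

def spans(layer):
    """Resolve one layer's standoff spans against the base text."""
    return [(tag, text[s:e]) for s, e, tag in layers[layer]]
```

Each layer can be viewed separately via `spans`, while all layers together form an interrelated unit anchored in `text`, which is the essence of the multiple-annotation idea.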
This paper describes the standardization problems that come up in a diachronic corpus: it has to cope with differing standards with regard to diplomaticity, annotation, and header information. Such highly heterogeneous texts must be standardized to allow for comparative research without (too much) loss of information.
Unity in diversity
(2005)
This paper describes the creation and preparation of TUSNELDA, a collection of corpus data built for linguistic research. This collection contains a number of linguistically annotated corpora which differ in various aspects such as language, text sorts / data types, encoded annotation levels, and the linguistic theories underlying the annotation. The paper focuses on this variation on the one hand, and on the way these heterogeneous data are integrated into one resource on the other.
ANNIS
(2004)
In this paper, we discuss the design and implementation of our first version of the database "ANNIS" ("ANNotation of Information Structure"). For research based on empirical data, ANNIS provides a uniform environment for storing this data together with its linguistic annotations. A central database promotes standardized annotation, which facilitates interpretation and comparison of the data. ANNIS is used through a standard web browser and offers tier-based visualization of data and annotations, as well as search facilities that allow for cross-level and cross-sentential queries. The paper motivates the design of the system, characterizes its user interface, and provides an initial technical evaluation of ANNIS with respect to data size and query processing.
Focus strategies in Chadic
(2004)
We argue that the standard focus theories reach their limits when confronted with the focus systems of the Chadic languages. The backbone of the standard focus theories consists of two assumptions, both called into question by the languages under consideration. Firstly, it is standardly assumed that focus is generally marked by stress. The Chadic languages, however, exhibit a variety of different devices for focus marking. Secondly, it is assumed that focus is always marked. In Tangale, at least, focus is not marked consistently on all types of constituents. The paper offers two possible solutions to this dilemma.
We argue that there is a crucial difference between determiner and adverbial quantification. Following Herburger [2000] and von Fintel [1994], we assume that determiner quantifiers quantify over individuals and adverbial quantifiers over eventualities. While it is usually assumed that the semantics of sentences with determiner quantifiers and those with adverbial quantifiers basically come out the same, we will show by way of new data that quantification over events is more restricted than quantification over individuals. This is because eventualities in contrast to individuals have to be located in time which is done using contextual information according to a pragmatic resolution strategy. If the contextual information and the tense information given in the respective sentence contradict each other, the sentence is uninterpretable. We conclude that this is the reason why in these cases adverbial quantification, i.e. quantification over eventualities, is impossible whereas quantification over individuals is fine.