The Semantics of Ellipsis
(2005)
There are four phenomena that are particularly troublesome for theories of ellipsis: the existence of sloppy readings when the relevant pronouns cannot possibly be bound; an ellipsis being resolved in such a way that an ellipsis site in the antecedent is not understood in the way it was there; an ellipsis site drawing material from two or more separate antecedents; and ellipsis with no linguistic antecedent. These cases are accounted for by means of a new theory that involves copying syntactically incomplete antecedent material and an analysis of silent VPs and NPs that makes them into higher order definite descriptions that can be bound into.
This paper investigates the structural properties of morphosyntactically marked focus constructions, focussing on the often neglected non-focal sentence part in African tone languages. Based on new empirical evidence from five Gur and Kwa languages, we claim that these focus expressions have to be analysed as biclausal constructions even though they do not represent clefts containing restrictive relative clauses. First, we relativize the partly overgeneralized assumptions about structural correspondences between the out-of-focus part and relative clauses, and second, we show that our data do in fact support the hypothesis of a clause coordinating pattern as present in clause sequences in narration. It is argued that we deal with a non-accidental, systematic feature and that grammaticalization may conceal such basic narrative structures.
Face-to-face communication is multimodal. In unscripted spoken discourse we can observe the interaction of several "semiotic layers", modalities of information such as syntax, discourse structure, gesture, and intonation. We explore the role of gesture and intonation in structuring and aligning information in spoken discourse through a study of the co-occurrence of pitch accents and gestural apices. Metaphorical spatialization through gesture also plays a role in conveying the contextual relationships between the speaker, the government and other external forces in a naturally-occurring political speech setting.
We present a system for the linguistic exploration and analysis of lexical cohesion in English texts. Using an electronic thesaurus-like resource, Princeton WordNet, and the Brown Corpus of English, we have implemented a process of annotating text with lexical chains and a graphical user interface for inspection of the annotated text. We describe the system and report on some sample linguistic analyses carried out using the combined thesaurus-corpus resource.
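The chaining idea described above can be sketched in miniature. The following toy example is a hypothetical illustration, not the authors' system: it links words into the same chain when they share a category in a small hand-made thesaurus, standing in for the WordNet relations the real system uses.

```python
# Toy lexical-chain builder: words sharing a thesaurus category are
# linked into one chain (a stand-in for WordNet-based relatedness).
TOY_THESAURUS = {
    "car": {"vehicle"}, "truck": {"vehicle"}, "engine": {"vehicle"},
    "tree": {"plant"}, "leaf": {"plant"}, "forest": {"plant"},
}

def lexical_chains(tokens):
    chains = []  # each chain: {"categories": set, "words": list}
    for word in tokens:
        cats = TOY_THESAURUS.get(word)
        if cats is None:
            continue  # word not in the thesaurus: joins no chain
        for chain in chains:
            if chain["categories"] & cats:  # shares a category: extend chain
                chain["words"].append(word)
                chain["categories"] |= cats
                break
        else:  # no related chain found: start a new one
            chains.append({"categories": set(cats), "words": [word]})
    return [chain["words"] for chain in chains]

text = "the car stalled because the engine died near a tree in the forest".split()
print(lexical_chains(text))  # → [['car', 'engine'], ['tree', 'forest']]
```

A real implementation would replace the dictionary lookup with WordNet synset and hypernym queries, but the chain-growing loop has the same shape.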
This paper describes the standardization problems that come up in a diachronic corpus: it has to cope with differing standards with regard to diplomaticity, annotation, and header information. Such highly heterogeneous texts must be standardized to allow for comparative research without (too much) loss of information.
When we pay close attention to the prosody of Wh-questions in Japanese, we discover many novel and interesting empirical puzzles that would require us to devise a much finer syntactic component of grammar. This paper addresses the issues that pose some problems to such an elaborated grammar, and offers solutions, making an appeal to the information structure and sentence processing involved in the interpretation of interrogative and focus constructions.
The most recent trend in the studies of LF intervention effects makes crucial reference to focusing effects on the interveners, and this paper critically examines the representative analyses of the focus-based approach. While each analysis has its own merits and shortcomings, I argue that a pragmatic analysis that does not make appeal to syntactic configurations is better equipped to deal with many of the complex and delicate facts surrounding intervention effects.
This paper discusses how focus changes prosodic structure in Tokyo Japanese. It is generally believed that focus blocks the intonational process of downstep and causes a pitch reset. This paper presents experimental evidence against this traditional view by looking at the prosodic behavior of Wh words, which receive focus lexically in Japanese as in other languages. It is demonstrated, specifically, that the focused Wh element does not block downstep although it receives a much higher pitch than its preceding element. This suggests that presence of lexical focus does not trigger pitch reset in Japanese.
Prediction of hybrid biomass in Arabidopsis thaliana by selected parental SNP and metabolic markers
(2009)
A recombinant inbred line (RIL) population, derived from two Arabidopsis thaliana accessions, and the corresponding testcrosses with these two original accessions were used for the development and validation of machine learning models to predict the biomass of hybrids. Genetic and metabolic information of the RILs served as predictors. Feature selection reduced the number of variables (genetic and metabolic markers) in the models by more than 80% without impairing the predictive power. Thus, potential biomarkers have been revealed. Metabolites were shown to bear information on inherited macroscopic phenotypes. This proof of concept could be interesting for breeders. The example population exhibits substantial mid-parent biomass heterosis. The results of feature selection could therefore be used to shed light on the origin of heterosis. In this respect, mainly dominance effects were detected.
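Filter-style feature selection of the kind described, ranking markers and keeping only a small informative subset, can be sketched with a simple correlation filter. This is a generic illustration on made-up data, not the authors' machine-learning pipeline or their actual markers.

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_features(markers, trait, keep_fraction=0.2):
    """Rank marker columns by |correlation| with the trait, keep the top fraction."""
    ranked = sorted(markers, key=lambda name: -abs(pearson(markers[name], trait)))
    n_keep = max(1, round(len(ranked) * keep_fraction))
    return ranked[:n_keep]

# Made-up toy data: 5 markers scored on 6 lines, one biomass value per line.
biomass = [2.0, 3.1, 4.2, 5.0, 6.1, 7.2]
markers = {
    "m1": [1, 2, 3, 4, 5, 6],   # strongly correlated with biomass
    "m2": [6, 5, 4, 3, 2, 1],   # strongly (negatively) correlated
    "m3": [1, 3, 2, 4, 3, 5],
    "m4": [2, 2, 3, 2, 3, 2],
    "m5": [5, 1, 4, 2, 5, 1],
}
print(select_features(markers, biomass, keep_fraction=0.2))  # keeps 1 of 5 markers
```

Keeping 20% of the markers here mirrors the abstract's point that more than 80% of the variables could be dropped without hurting prediction; the paper's actual selection is wrapped around trained models rather than a plain correlation filter.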
Two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GCxGC-TOF-MS) is a promising technique for overcoming the limits of complex metabolome analysis with one-dimensional GC-TOF-MS. Especially at the stage of data export and data mining, however, convenient procedures for coping with the complexity of GCxGC-TOF-MS data are still in development. Here, we present a high-sample-throughput protocol that exploits the first and second retention index for spectral library search and the subsequent construction of a high-dimensional data matrix useful for statistical analysis. The method was applied to the analysis of ¹³C-labelling experiments in the unicellular green alga Chlamydomonas reinhardtii. We developed a rapid sampling and extraction procedure for the Chlamydomonas reinhardtii laboratory strain CC503, a cell-wall-deficient mutant. By testing all published quenching protocols, we observed dramatic leakage rates for certain metabolites. To circumvent metabolite leakage, samples were quenched directly and analyzed without separation of the medium. The growth medium was adapted to this rapid sampling protocol to avoid interference with GCxGC-TOF-MS analysis. To analyse batches of samples, a new software tool, MetMax, was implemented, which extracts the isotopomer matrix from stable-isotope labelling experiments together with the first and second retention index (RI1 and RI2). To exploit RI1 and RI2 for metabolite identification, we used the Golm Metabolome Database (GMD) [1] with RI1/RI2 reference spectra and new search algorithms. Using these techniques, we analysed the dynamics of ¹³CO₂ and ¹³C-acetate uptake in Chlamydomonas reinhardtii cells in two different steady states, namely photoautotrophic and mixotrophic growth conditions.
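The core of RI1/RI2-based identification can be sketched as a nearest-neighbour lookup within a tolerance window. The reference entries and tolerance values below are invented for illustration; the actual GMD search algorithms also match mass spectra and are considerably more involved.

```python
# Toy RI1/RI2 matching: assign a peak to the reference entry whose
# retention-index pair lies closest, within a tolerance window.
# Reference values here are made up, not real GMD entries.
REFERENCE = {
    "alanine":   (1100.0, 2.10),
    "glycine":   (1310.0, 2.45),
    "succinate": (1315.0, 3.80),
}

def match_peak(ri1, ri2, tol_ri1=5.0, tol_ri2=0.15):
    candidates = [
        (abs(ri1 - r1) + abs(ri2 - r2), name)      # combined RI distance
        for name, (r1, r2) in REFERENCE.items()
        if abs(ri1 - r1) <= tol_ri1 and abs(ri2 - r2) <= tol_ri2
    ]
    return min(candidates)[1] if candidates else None

print(match_peak(1312.0, 2.40))  # → glycine (succinate's RI2 is out of tolerance)
print(match_peak(1500.0, 2.40))  # → None (no reference within tolerance)
```

The point of the second retention index shows up in the first call: on RI1 alone, glycine and succinate would be nearly indistinguishable, while RI2 separates them cleanly.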
A method is presented for teaching the principles of three sorting algorithms through the development of interactive applications in Excel.
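As an illustration of the kind of step-by-step view such interactive applications give, one common sorting algorithm can be traced one insertion at a time. This is a plain-Python sketch; the paper itself uses Excel, and which three algorithms it covers is not stated here, so insertion sort is an assumption.

```python
def insertion_sort_steps(values):
    """Insertion sort that records the list after each insertion --
    the kind of intermediate state an interactive worksheet can display."""
    data = list(values)
    states = [list(data)]          # state 0: the unsorted input
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        while j >= 0 and data[j] > key:
            data[j + 1] = data[j]  # shift larger elements right
            j -= 1
        data[j + 1] = key          # drop the key into its slot
        states.append(list(data))  # snapshot after this insertion
    return states

for state in insertion_sort_steps([5, 2, 4, 1]):
    print(state)
# [5, 2, 4, 1]
# [2, 5, 4, 1]
# [2, 4, 5, 1]
# [1, 2, 4, 5]
```

Exposing every intermediate state, rather than only the final order, is what lets a learner see the invariant (the prefix is always sorted) that a spreadsheet visualization makes tangible.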
Problem solving is one of the central activities performed by computer scientists as well as by computer science learners. Whereas the teaching of algorithms and programming languages is usually well structured within a curriculum, the development of learners’ problem-solving skills is largely implicit and less structured. Students at all levels often face difficulties in problem analysis and solution construction. The basic assumption of the workshop is that without some formal instruction on effective strategies, even the most inventive learner may resort to unproductive trial-and-error problem-solving processes. Hence, it is important to teach problem-solving strategies and to guide teachers on how to teach their pupils this cognitive tool. Computer science educators should be aware of the difficulties and acquire appropriate pedagogical tools to help their learners gain and exercise problem-solving skills.
.NET Gadgeteer Workshop
(2013)
The challenge is providing teachers with the resources they need to strengthen their instruction and better prepare students for the jobs of the 21st century. Technology can help meet that challenge. Teachers’ Tryscience is a noncommercial offering, developed by the New York Hall of Science, TeachEngineering, the National Board for Professional Teaching Standards, and IBM Citizenship, to provide teachers with such resources. The workshop provides deeper insight into this tool and a discussion of how to support the teaching of informatics in schools.
The aim of our article is to collect and present information about contemporary programming environments that are suitable for primary education. We studied the ways they implement (or do not implement) some programming concepts, the ways programs are represented and built in order to support young and novice programmers, as well as their suitability to allow different forms of sharing the results of pupils’ work. We present not only a short description of each considered environment and the taxonomy in the form of a table, but also our understanding and opinions on how and why the environments implement the same concepts and ideas in different ways and which concepts and ideas seem to be important to the creators of such environments.
A comparison of current trends within computer science teaching in school in Germany and the UK
(2013)
In the last two years, CS as a school subject has gained a lot of attention worldwide, although different countries have differing approaches to and experiences of introducing CS in schools. This paper reports on a study comparing current trends in CS at school, with a major focus on two countries, Germany and the UK. A survey of teaching professionals and experts from the UK and Germany was carried out with regard to the content and delivery of CS in school. An analysis of the quantitative data reveals a difference in foci between the two countries; putting this into the context of curricular developments, we are able to offer interpretations of these trends and suggest ways in which school CS curricula should be moving forward.
This article is a summary of the work carried out by the Ministry of Education in Turkey, in terms of the development of a new ICT Curriculum, together with the e-Training of teachers who will play an important role in the forthcoming pilot study. Based on recent literature on the topic, the article starts by introducing the “F@tih Project”, a national project that aims to effectively integrate technology into schools. After assessing teachers’ and students’ ICT competencies, as defined internationally, the review continues with the proposed model for the e-training of teachers. Summarizing the process of development of the new ICT curriculum, researchers underline key points of the curriculum such as dimensions, levels and competencies. Then teachers’ e-training approaches, together with selected tools, are explained in line with the importance and stages of action research that will be used throughout the pilot implementation of the curriculum and e-training process.
Japan launched the new Course of Study in April 2012, which has been carried out in elementary schools and junior high schools. It will also be implemented in senior high schools from April 2013. This article presents an overview of information studies education in the new Course of Study for K-12. In addition, the authors point out what role experts in informatics and information studies education should play in a general education centered around information studies, one that is meant to help the people of the nation lead active, capable, and flexible lives throughout their lifetimes.
The traditional purpose of teaching algorithms has been to prepare students for programming. In our effort to introduce the practically missing computing science into Czech general secondary education, we have revisited this purpose. We propose an approach that is in better accordance with the goals of general secondary education in Czechia. The importance of programming is diminishing, while the recognition of algorithmic procedures and the precise (yet concise) communication of algorithms are gaining importance. This includes expressing algorithms in natural language, which is more useful for most students than programming. We propose criteria for evaluating such descriptions. Finally, students need some idea of the limitations involved (inefficient algorithms, unsolvable problems, the Turing test). We describe these adjusted educational goals and outline the resulting course. Our experience with carrying out the proposed intentions is satisfactory, although we did not accomplish all the defined goals.