Teaching Data Management
(2015)
Data management is a central topic in computer science as well as in computer science education. In recent years, this topic has been changing tremendously, as its impact on daily life becomes increasingly visible. Nowadays, everyone not only needs to manage data of various kinds, but also continuously generates large amounts of data. In addition, Big Data and data analysis are intensively discussed in public dialogue because of their influence on society. To understand such discussions and to be able to participate in them, fundamental knowledge of data management is necessary. In particular, awareness of the threats accompanying the ability to analyze large amounts of data in nearly real time is becoming increasingly important. This raises the question of which key competencies are necessary for dealing with data and data management in daily life.
In this paper, we first point out the importance of data management and of Big Data in daily life. On this basis, we analyze the key competencies everyone needs concerning data management in order to handle data properly in daily life. Afterwards, we discuss the impact of these changes in data management on computer science education, and in particular on database education.
Developing critical thinking
(2013)
The morphological appearance of massive stars across their post-Main Sequence evolution and before the SN event is very uncertain, both from a theoretical and observational perspective. We recently developed coupled stellar evolution and atmospheric modeling of stars done with the Geneva and CMFGEN codes, for initial masses between 9 and 120 M⊙. We are able to predict the observables such as the high-resolution spectrum and broadband photometry. Here I discuss how the spectrum of a massive star changes across its evolution and before death, with focus on the WR stage. Our models indicate that single stars with initial masses larger than 30 M⊙ end their lives as WR stars. Depending on rotation, the spectrum of the star can either be that of a WN or WO subtype at the pre-SN stage. Our models allow, for the first time, direct comparison between predictions from stellar evolution models and observations of SN progenitors.
This paper outlines a newly-developed method to include the effects of time variability in the radiative transfer code CMFGEN. It is shown that the flow timescale is often large compared to the variability timescale of LBVs. Thus, time-dependent effects significantly change the velocity law and density structure of the wind, affecting the derivation of the mass-loss rate, volume filling factor, wind terminal velocity, and luminosity. The results of this work are directly applicable to all active LBVs in the Galaxy and in the LMC, such as AG Car, HR Car, S Dor and R 127, and could result in a revision of stellar and wind parameters. The mass-loss rate evolution of AG Car during the last 20 years is presented, highlighting the need for time-dependent models to correctly interpret the evolution of LBVs.
Minimalist accounts lack a natural theory of markedness, whereas Optimality-Theoretical accounts fundamentally encode markedness. We think the duality of interfaces assumed in Minimalism is a step towards explaining pairedness behavior, where a given language exhibits a marked/unmarked pair of items occupying the same niche. We argue that while Minimalism articulates the derivational aspect of language, and underlies grammaticality, an Optimality Theoretic articulation of PF and LF is conceptually natural and explains pairedness behavior. We adopt this ‘hybrid’ account, first, to explain the existence of marked (often termed ‘reflexive’) and unmarked anticausatives in German, recently studied in depth by Schäfer [2007].
Ethical issues surrounding modern computing technologies play an increasingly important role in the public debate. Yet ethics still appears either not at all or only to a very small extent in computer science degree programs. This paper provides an argument for the value of ethics beyond a pure responsibility perspective and describes the positive value of ethical debate for future computer scientists. It also provides a systematic analysis of the module handbooks of 67 German universities and shows that there is indeed a lack of ethics in computer science education. Finally, we present a principled design of a compulsory course for undergraduate students.
Ngizim fieldnotes
(2011)
This chapter presents field notes of the West Chadic language Ngizim, spoken in North-East Nigeria. In Ngizim, subject focus is indicated by subject inversion, whereas the word order of sentences with focused non-subjects can remain unchanged. The goal of the field work was to find out more about focus marking in Ngizim.
We investigate the effect of wind clumping on the dynamics of Wolf-Rayet winds, by means of the Potsdam Wolf-Rayet (PoWR) hydrodynamic atmosphere models. In the limit of microclumping the radiative acceleration is generally enhanced. We examine the reasons for this effect and show that the resulting wind structure depends critically on the assumed radial dependence of the clumping factor D(r). The observed terminal wind velocities for WR stars imply that D(r) increases to very large values in the outer part of the wind, in agreement with the assumption of detached expanding shells.
Reviewed work: Shraibman, Yechiel: Sieben Jahre und sieben Monate: meine Bukarester Jahre; Roman. Berlin: be.bra, 2009. 272 pp. ISBN 978-3-937233-56-7
The names of God and the celestial powers : their function and meaning in the Hekhalot literature
(1987)
Excerpt: "Names and adjurations are the two main theurgic means found in the Hekhalot literature applied in connection with the descent to the Merkhavah and the invocation of angels to come down to earth and to reveal secrets," says Ithamar Gruenwald in his book on the Merkavah literature. He continues and maintains, with Gershom Scholem, that "that particular element in the Hekhalot Literature actually belonged to its very heart and this almost from its beginning." It is very seductive for the student of this literature to go straight to the heart of these texts; but the danger of this approach is as great as the danger of yeridat merkavah itself. Indeed, I feel as if I am passing the gates of the Hekhalot, the watchers of the gates standing on both sides prepared to throw their iron axes at my head. I can only hope that I may present the proper names! [...]
Excerpt: The shrill sounds of the now seemingly outdated controversy between Gershom Scholem and Martin Buber at the beginning of the sixties are still in the minds of every student of Hasidic literature and thought. - The "Scholem-community" feels content and the "Buber-community" upset. We can summarize the case in a few words. Martin Buber, the pioneer of Hasidism in the Western World, held the position that whoever would want to understand Hasidism had to turn to Hasidic tale as here, in the tales of the Hasidim, real Hasidic life was to be found. Whereas in the Hasidic homilies we meet mere non-creative tradition especially in the form of Kabbalah. Buber did not totally deny the importance of the Hasidic Midrash but he regarded it just as a commentary, i.e. as secondhand material, whereas, in his view, the tale was a true mirror of real Hasidic life [...]
Excerpt: Hasidic Ashkenazi literature is known to scholars of Jewish religion as one of the most prolific sources of medieval Jewish magic or magical beliefs. This is all the more astonishing as the non esoteric writings of the Hasidey Ashkenaz represent a rather traditional Jewish piety as known to us from talmudic sources. Considering this duality of an almost traditional Jewish piety on the one hand and very distinct magic tenets on the other, we may ask whether the Hasidey Ashkenaz themselves perceived any difference between magic and religion. There are indeed a number of modern historians of religion who completely deny the validity of such a distinction, for in most historical religions magic and religion are in fact intertwined to a certain degree, thus permitting almost no differentiation between the two.
Excerpt: The writings from the thirteenth century called by Gershom Scholem the "Writings of the 'Iyyun circle" are one of the most intriguing chapters of early kabbalah - this I need not elaborate on as it is a well-known fact to anyone who has ever read these texts or the literature about them. When reading these texts, one gets the impression that the authors had at hand a box full of terms and phrases into which everybody could just stick his hand and take terms and phrases out of it in order to arrange them according to his own taste, disregarding the meaning they have in the writings of his fellow kabbalists. The result is that we now have before us a large number of varying mosaic pictures in which we detect again and again the same mosaic pebbles, however composed differently.
Two types of X-ray sources are mostly found in planetary nebulae (PNe): point sources at their central stars and diffuse emission inside hot bubbles. Here we describe these two types of sources based on the most recent observations obtained in the framework of the Chandra Planetary Nebula Survey, ChanPlaNS, an X-ray survey targeting a volume-limited sample of PNe. Diffuse X-ray emission is found preferentially in young PNe with sharp, closed inner shells. Point sources of X-ray emission at the central stars reveal magnetically active binary companions and shocks in stellar winds.
Passive plant actuators have fascinated many researchers in the field of botany and structural biology for at least a century. To date, the most investigated tissue types in plant and artificial passive actuators are fibre-reinforced composites (and multilayered assemblies thereof) where stiff, almost inextensible cellulose microfibrils direct the otherwise isotropic swelling of a matrix. In addition, Nature provides examples of actuating systems based on lignified, low-swelling, cellular solids enclosing a high-swelling cellulosic phase. This is the case of the Delosperma nakurense seed capsule, in which a specialized tissue promotes the reversible opening of the capsule upon wetting. This tissue has a diamond-shaped honeycomb microstructure characterized by high geometrical anisotropy: when the cellulosic phase swells inside this constraining structure, the tissue deforms up to four times in one principal direction while maintaining its original dimension in the other. Inspired by the example of Delosperma nakurense, in this thesis we analyze the role of architecture of 2D cellular solids as models for natural hygromorphs. To start off, we consider a simple fluid pressure acting in the cells and try to assess the influence of several architectural parameters on their mechanical actuation. Since internal pressurization is a configurational type of load (that is, the load direction is not fixed but "follows" the structure as it deforms), it will result in the cellular structure acquiring a "spontaneous" shape. This shape is independent of the load and depends only on the architectural characteristics of the cells making up the structure itself. Whereas regular convex tiled cellular solids (such as hexagonal, triangular or square lattices) deform isotropically upon pressurization, we show through finite element simulations that by introducing anisotropic, non-convex, re-entrant tilings, large expansions can be achieved in each individual cell.
The influence of geometrical anisotropy on the expansion behaviour of a diamond-shaped honeycomb is assessed by FEM calculations and a Born lattice approximation. We found that anisotropic expansions (eigenstrains) comparable to those observed in the keel tissue of Delosperma nakurense are possible. In particular, these depend on the relative contributions of bending and stretching of the beams building up the honeycomb. Moreover, by varying the walls' Young's modulus E and internal pressure p, we found that both the eigenstrains and 2D elastic moduli scale with the ratio p/E. Therefore the potential of these pressurized structures as soft actuators is outlined. This approach was extended by considering several 2D cellular solids based on two types of non-convex cells. Each honeycomb is built as a lattice made of only one non-convex cell. Compared to usual honeycombs, these lattices have kinked walls between neighbouring cells, offering a hidden length scale that allows large directed deformations. By comparing the area expansion in all lattices, we were able to show that less convex cells tend to achieve larger area expansions, but the direction in which the material expands is variable and depends on the local cell connectivity. This has repercussions both at the macroscopic (lattice) and microscopic (cell) scales. At the macroscopic scale, these non-convex lattices can experience large anisotropic (similarly to the diamond-shaped honeycomb) or perfectly isotropic principal expansions, large shearing deformations or a mixed behaviour. Moreover, lattices that expand similarly at the macroscopic scale can show quite different microscopic deformation patterns that include zig-zag motions and radical changes of the initial cell shape.
Depending on the lattice architecture, the microscopic deformations of the individual cells can be equal or not, so that they can build up or mutually compensate and hence give rise to the aforementioned variety of macroscopic behaviours. Interestingly, simple geometrical arguments involving the undeformed cell shape and its local connectivity make it possible to predict the results of the FE simulations. Motivated by the results of the simulations, we also created experimental 3D-printed models of such actuating structures. When swollen, the models undergo substantial deformation, with deformation patterns qualitatively following those predicted by the simulations. This work highlights how the internal architecture of a swellable cellular solid can lead to complex shape changes which may be useful in the fields of soft robotics or morphing structures.
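The strongly anisotropic expansion of a diamond-shaped cell can be illustrated with a toy kinematic sketch, assuming rigid walls that only hinge at the vertices; this neglects the wall bending and stretching contributions that the thesis actually treats via FEM and a Born lattice approximation, and the angles chosen below are purely illustrative:

```python
import math

# Toy model: a rhombic honeycomb cell with rigid walls of length L that
# hinge at the vertices. The half-angle theta sets the cell dimensions:
#   x = 2 L sin(theta)  (direction of large expansion)
#   y = 2 L cos(theta)  (transverse direction)
def cell_dimensions(L, theta):
    return 2 * L * math.sin(theta), 2 * L * math.cos(theta)

L = 1.0
x0, y0 = cell_dimensions(L, math.radians(15))  # nearly closed cell
x1, y1 = cell_dimensions(L, math.radians(60))  # opened cell

print(f"expansion along x: {x1 / x0:.2f}x")    # ~3.35x
print(f"change along y:    {y1 / y0:.2f}x")    # ~0.52x
```

Even this crude hinging picture reproduces the order of magnitude of the observed anisotropy: one principal direction expands several-fold while the other changes far less.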
The aim of our article is to collect and present information about contemporary programming environments that are suitable for primary education. We studied the ways they implement (or do not implement) some programming concepts, the ways programs are represented and built in order to support young and novice programmers, as well as their suitability to allow different forms of sharing the results of pupils’ work. We present not only a short description of each considered environment and the taxonomy in the form of a table, but also our understanding and opinions on how and why the environments implement the same concepts and ideas in different ways and which concepts and ideas seem to be important to the creators of such environments.
The guarantee of judicial independence is undoubtedly one of the most important institutional design features of international courts and tribunals. An independence deficit can adversely impact a court’s authority, create a crisis of legitimacy, and undermine the very effectiveness of an international court or tribunal. It can hardly be denied that for an international court to be considered legitimate, a basic degree of independence is a must. An independent judiciary is a precondition to the fair and just resolution of legal disputes. In the context of interstate dispute settlement where the jurisdiction of courts is based on the principle of consent, in the absence of a basic degree of judicial independence, states may not be willing to submit to the jurisdiction of international courts. Comparing and contrasting the International Court of Justice and the Appellate Body of the World Trade Organisation, I assess whether those international judicial mechanisms possess the basic degree of independence required for a court to be able to maintain its credibility so that it can continue to perform its core function of adjudicating interstate disputes. With both those interstate adjudicative bodies constituting the two leading international courts in terms of participation and the sheer number of cases decided, much may be learned from comparing them. I argue there is a case for bolstering the independence of the ICJ; and without immediate reforms to the Appellate Body’s institutional design, its recent demise may become permanent. I conclude that if a basic degree of judicial independence cannot be guaranteed, it is preferable to let a court vanish for a while than to maintain a significantly deficient one.
The spatially-resolved winds of the massive binary, Eta Carinae, extend an arcsecond on the sky, well beyond the 10 to 20 milliarcsecond binary orbital dimension. Stellar wind line profiles, observed at very different angular resolutions of VLTI/AMBER, HST/STIS and VLT/UVES, provide spatial information on the extended wind interaction structure as it changes with orbital phase. These same wind lines, observable in the starlight scattered off the foreground lobe of the dusty Homunculus, provide time-variant line profiles viewed from significantly different angles. Comparisons of direct and scattered wind profiles observed in the same epoch and at different orbital phases provide insight on the extended wind structure and promise the potential for three-dimensional imaging of the outer wind structures. Massive, long-lasting clumps, including the nebular Weigelt blobs, originated during the two historical ejection events. Wind interactions with these clumps are quite noticeable in spatially-resolved spectroscopy. As the 2009.0 minimum approaches, analysis of existing spectra and 3-D modeling are providing bases for key observations to gain further understanding of this complex massive binary.
Eta Carinae
(2015)
Since Augusto Damineli's demonstration in 1996 that Eta Carinae is a binary with a 5.52 year period, many innovative observations and increasingly advanced three-dimensional models have led to considerable insight on this massive system that ejected at least ten, possibly forty, solar masses in the nineteenth century. Here we present a review of our current understanding of this complex system and point out continuing puzzles.
User-centered design processes are the first choice when new interactive systems or services are developed to address real customer needs and provide a good user experience. Common tools for collecting user research data, conducting brainstorming sessions, or sketching ideas are whiteboards and sticky notes. They are ubiquitously available, and no technical or domain knowledge is necessary to use them. However, traditional pen and paper tools fall short when it comes to saving content and sharing it with others who cannot be in the same location. They also lack further digital advantages such as searching or sorting content. Although research on digital whiteboard and sticky note applications has been conducted for over 20 years, these tools are not widely adopted in company contexts. While many research prototypes exist, they have not been used for an extended period of time in a real-world context. The goal of this thesis is to investigate the enablers of and obstacles to the adoption of digital whiteboard systems. As an instrument for different studies, we developed the Tele-Board software system for collaborative creative work. Based on interviews, observations, and findings from former research, we tried to transfer the analog way of working to the digital world. Being a software system, Tele-Board can be used with a variety of hardware and does not depend on special devices. This feature became one of the main factors for adoption on a larger scale. In this thesis, I will present three studies on the use of Tele-Board with different user groups and foci. I will use a combination of research methods (laboratory case studies and data from field research) with the overall goal of finding out when a digital whiteboard system is used and in which cases not. Not surprisingly, the system is used and accepted if a user sees a main benefit that neither analog tools nor other applications can offer.
However, I found that these perceived benefits are very different for each user and usage context. If a tool provides possibilities to use in different ways and with different equipment, the chances of its adoption by a larger group increase. Tele-Board has now been in use for over 1.5 years in a global IT company in at least five countries with a constantly growing user base. Its use, advantages, and disadvantages will be described based on 42 interviews and usage statistics from server logs. Through these insights and findings from laboratory case studies, I will present a detailed analysis of digital whiteboard use in different contexts with design implications for future systems.
Three dimensions can be distinguished in a cross-linguistic account of information structure. First, there is the definition of the focus constituent, the part of the linguistic expression which is subject to some focus meaning. Second and third, there are the focus meanings and the array of structural devices that encode them. In a given language, the expression of focus is facilitated as well as constrained by the grammar within which the focus devices operate. The prevalence of focus ambiguity, the structural inability to make focus distinctions, will thus vary across languages, and within a language, across focus meanings.
On-demand Musicology
(2017)
Detection and Characterization of Wolf-Rayet stars in M81 with GTC/OSIRIS spectra and HST images
(2015)
Here we investigate a sample of young star clusters (YSCs) and other regions of recent star formation with Wolf-Rayet (W-R) features detected in the relatively nearby spiral galaxy M81 by analysing long-slit (LS) and Multi-Object Spectroscopy (MOS) spectra obtained with the OSIRIS instrument at the 10.4-m Gran Telescopio Canarias (GTC). We take advantage of the synergy between GTC spectra and Hubble Space Telescope (HST) images to also reveal their spatial localization and the environments hosting these stars. We finally discuss and comment on the next steps of our study.
Information structure
(2007)
This article is a summary of the work carried out by the Ministry of Education in Turkey, in terms of the development of a new ICT Curriculum, together with the e-Training of teachers who will play an important role in the forthcoming pilot study. Based on recent literature on the topic, the article starts by introducing the “F@tih Project”, a national project that aims to effectively integrate technology into schools. After assessing teachers’ and students’ ICT competencies, as defined internationally, the review continues with the proposed model for the e-training of teachers. Summarizing the process of development of the new ICT curriculum, researchers underline key points of the curriculum such as dimensions, levels and competencies. Then teachers’ e-training approaches, together with selected tools, are explained in line with the importance and stages of action research that will be used throughout the pilot implementation of the curriculum and e-training process.
This paper comprises four parts. Firstly, an overview of the mathematics of decision logic in relation to games and of the construction of narration and characters is given. This includes specific limits of the use of decision logic pertaining to games in general and to storytelling in particular. Secondly, the rule system as the medial unconsciousness is focused on. Thirdly, remarks are made on the debate between ludology and narratology, which had to fail as it missed the crucial point: the computer game as a medium. Finally, gaming in general, as well as its relationship to chance, coincidence, emergence, and event is discussed.
The claim is made, that in order to analyze them sufficiently, computer games first of all have to be described according to their mediality, understood as the very form in which possible contents are presented to be interacted with. This calls for a categorical approach that defines the condition of possible actions that are determined by the program, but that can only be perceived as aesthetic features.
If we want to compare the explanatory and descriptive adequacy of the MP and OT, the original definitions by Chomsky (1964) are of little direct use. However, a relativized version of both notions can be defined, which can be used to express a number of parallels between the study of individual I-languages and the language faculty. In any version of explanatory and descriptive adequacy, the two notions derive from the research programme and can only be achieved together. They can therefore not be used to characterize the difference in orientation between OT and the MP. Even if ‘OT’ is restricted to a particular theory in Chomskyan linguistics (to the exclusion of, for instance, its use in LFG), it cannot be said to be stronger in descriptive adequacy than in explanatory adequacy in the technical sense of these terms.
Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung: Workshop, 9-10 February 2006
Religion
(2012)
The super massive binary system, η Car, experienced periastron passage in the summer of 2014. We observed the star twice around the maximum (ϕorb = 0.97, 2014 June 6) and just before the minimum (ϕorb = 0.99, 2014 July 28) of its wind-wind colliding (WWC) X-ray emission using the XMM-Newton and NuSTAR observatories, the latter of which is equipped with extremely hard X-ray (>10 keV) focusing mirrors. In both observations, NuSTAR detected the thermal X-ray tail up to 40-50 keV. The hard slope is consistent with an electron temperature of ∼6 keV, which is significantly higher than the ionization temperature (kT ∼4 keV) measured from the Fe K emission lines, assuming collisional equilibrium plasma. The spectrum did not show a hard power-law component above this energy range, unlike earlier detections with INTEGRAL and Suzaku. In the second NuSTAR observation, the X-ray flux above 5 keV declined gradually in ∼1 day. This result suggests that the WWC apex was gradually hidden behind the optically thick primary wind around conjunction.
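As a rough back-of-the-envelope illustration (not the spectral fitting actually performed on the NuSTAR data), the hard-band continuum of an optically thin thermal plasma falls off approximately as e^(-E/kT), so the flux ratio between two hard-band energies constrains the electron temperature; the function and numbers below are illustrative only:

```python
import math

def kT_from_ratio(E1, E2, f1_over_f2):
    """Electron temperature (keV) inferred from the flux ratio of a pure
    exponential continuum f(E) ~ exp(-E / kT) at energies E1 < E2 (keV).
    Gaunt-factor and other prefactor dependencies are ignored in this
    toy estimate."""
    return (E2 - E1) / math.log(f1_over_f2)

# A continuum dropping by a factor of ~28 between 20 and 40 keV
# corresponds to kT of about 6 keV:
print(f"kT = {kT_from_ratio(20.0, 40.0, 28.0):.1f} keV")  # kT = 6.0 keV
```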
Modeling expanding atmospheres is a difficult task because of the extreme non-LTE situation, the need to account for complex model atoms, especially for the iron-group elements with their millions of lines, and because of the supersonic expansion. Adequate codes have been developed e.g. by Hillier (CMFGEN), the Munich group (Puls, Pauldrach), and in Potsdam (PoWR code, Hamann et al.). While early work was based on the assumption of a smooth and homogeneous spherical stellar wind, the need to account for clumping became obvious about ten years ago. A relatively simple first-order clumping correction was readily implemented into the model codes. However, its simplifying assumptions are severe. Most importantly, the clumps are taken to be optically thin at all frequencies ("microclumping"). We discuss the consequences of this approximation and describe an approach to account for optically thick clumps ("macroclumping"). First results demonstrate that macroclumping can generally reduce the strength of spectral features, depending on their optical thickness. The recently reported discrepancy between the Hα diagnostic and the P V resonance lines in O star spectra can be resolved without decreasing the mass-loss rates, when macroclumping is taken into account.
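The qualitative effect can be sketched with a standard porosity-style correction (an illustrative assumption here, not necessarily the exact treatment in the PoWR code): for clumps of individual optical depth τ_c, the effective opacity is reduced by the factor (1 - e^(-τ_c))/τ_c, which approaches 1 for optically thin clumps (recovering microclumping) and 1/τ_c for optically thick ones:

```python
import math

def opacity_reduction(tau_c):
    """Effective-to-mean opacity ratio for a clumped medium in which each
    clump has optical depth tau_c. Optically thin clumps (tau_c << 1)
    give no reduction; optically thick clumps shadow their own material,
    weakening the spectral feature."""
    if tau_c == 0:
        return 1.0
    return (1.0 - math.exp(-tau_c)) / tau_c

for tau in (0.01, 0.1, 1.0, 10.0):
    print(f"tau_c = {tau:5.2f}: kappa_eff / kappa = {opacity_reduction(tau):.3f}")
```

Lines of very different intrinsic strength (such as Hα and the P V resonance doublet) thus respond differently to the same clump structure, which is the qualitative basis for resolving the diagnostic discrepancy without lowering the mass-loss rate.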
Component-based software development (CBSD) and aspect-oriented software development (AOSD) are two complementary approaches. However, existing proposals for integrating aspects into component models are direct transpositions of object-oriented AOSD techniques to components. In this article, we propose a new approach based on views. Our proposal introduces crosscutting components quite naturally and can be integrated into different component models.
We describe a framework to support the implementation of web-based systems to manipulate data stored in relational databases. Since the conceptual model of a relational database is often specified as an entity-relationship (ER) model, we propose to use the ER model to generate a complete implementation in the declarative programming language Curry. This implementation contains operations to create and manipulate entities of the data model, supports authentication, authorization, session handling, and the composition of individual operations to user processes. Furthermore, and most importantly, the implementation ensures the consistency of the database w.r.t. the data dependencies specified in the ER model, i.e., updates initiated by the user cannot lead to an inconsistent state of the database. In order to generate a high-level declarative implementation that can be easily adapted to individual customer requirements, the framework exploits previous work on declarative database programming and web user interface construction in Curry.
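The generation idea can be sketched as a hypothetical in-memory analogue in Python (the actual framework emits Curry code with web interfaces, sessions, and authentication; the class, entity, and attribute names below are invented for illustration): each declared entity yields create/read/update/delete operations that reject values violating the declared attribute set, so the store cannot reach a state inconsistent with the model.

```python
# Hypothetical sketch: derive CRUD operations from a minimal entity
# declaration, enforcing the declared attributes on every update.
class Entity:
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)
        self._store = {}       # key -> attribute dict
        self._next_key = 0

    def create(self, **values):
        # A new entity must supply exactly the declared attributes.
        if set(values) != self.attributes:
            raise ValueError(f"{self.name}: expected {sorted(self.attributes)}")
        key = self._next_key
        self._next_key += 1
        self._store[key] = dict(values)
        return key

    def read(self, key):
        return self._store[key]

    def update(self, key, **values):
        # Updates may only touch declared attributes.
        if not set(values) <= self.attributes:
            raise ValueError(f"{self.name}: unknown attribute")
        self._store[key].update(values)

    def delete(self, key):
        del self._store[key]

# An entity as it might be declared in an ER model:
Student = Entity("Student", ["name", "email"])
k = Student.create(name="Ada", email="ada@example.org")
Student.update(k, email="ada@uni.example")
print(Student.read(k))
```

A real generator would additionally derive relationship-cardinality checks and compose these operations into transactional user processes.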
An important characteristic of Service-Oriented Architectures is that clients do not depend on the service implementation's internal assignment of methods to objects. It is perhaps the most important technical characteristic that differentiates them from more common object-oriented solutions. This characteristic makes clients and services malleable, allowing them to be rearranged at run-time as circumstances change. That improvement in malleability is impaired by requiring clients to direct service requests to particular services. Ideally, the clients are totally oblivious to the service structure, as they are to aspect structure in aspect-oriented software. Removing knowledge of a method implementation's location, whether in object or service, requires re-defining the boundary line between programming language and middleware, making clearer specification of dependence on protocols, and bringing the transaction-like concept of failure scopes into language semantics as well. This paper explores consequences and advantages of a transition from object-request brokering to service-request brokering, including the potential to improve our ability to write more parallel software.
Focus and Tone
(2007)
Tone is a distinctive feature of the lexemes in tone languages. The information-structural category focus is usually marked by syntactic and morphological means in these languages, but sometimes also by intonation strategies. In intonation languages, focus is marked by pitch movements, which are also perceived as tone. The present article discusses prosodic focus marking in these two language types.
Focus asymmetries in Bura
(2008)
The paper investigates focus marking in Bura (Chadic), which exhibits a number of asymmetries: Grammatical focus marking is obligatory only with focused subjects, where focus is marked by the particle án following the subject. Focused subjects remain in situ and the complement of án is a regular VP. With non-subject foci, án appears in a cleft-structure between the fronted focus constituent and a relative clause. We present a semantically unified analysis of focus marking in Bura that treats the particle as a focus-marking copula in T that takes a property-denoting expression (the background) and an individual-denoting expression (the focus) as arguments. The article also investigates the realization of predicate and polarity focus, which are almost never marked. The upshot of the discussion is that Bura shares many characteristic traits of focus marking with other Chadic languages, but it crucially differs in exhibiting a structural difference in the marking of focus on subjects and non-subject constituents.
The paper presents an in-depth study of focus marking in Gùrùntùm, a West Chadic language spoken in Bauchi Province of Northern Nigeria. Focus in Gùrùntùm is marked morphologically by means of a focus marker a, which typically precedes the focus constituent. Even though the morphological focus-marking system of Gùrùntùm allows for many fine-grained distinctions in information structure (IS) in principle, the language is not entirely free of focus ambiguities, which arise as the result of conflicting IS and syntactic requirements that govern the placement of focus markers. We show that morphological focus marking with a applies across different types of focus, such as new-information, contrastive, selective and corrective focus, and that a does not have a second function as a perfectivity marker, as is assumed in the literature. In contrast, we show at the end of the paper that a can also function as a foregrounding device at the level of discourse structure.
Focus strategies in Chadic
(2004)
We argue that the standard focus theories reach their limits when confronted with the focus systems of the Chadic languages. The backbone of the standard focus theories consists of two assumptions, both called into question by the languages under consideration. Firstly, it is standardly assumed that focus is generally marked by stress. The Chadic languages, however, exhibit a variety of different devices for focus marking. Secondly, it is assumed that focus is always marked. In Tangale, at least, focus is not marked consistently on all types of constituents. The paper offers two possible solutions to this dilemma.
Mixed elliptic problems are characterised by conditions that have a discontinuity on an interface of the boundary of codimension 1. The case of a smooth interface is treated in [3]; the investigation there refers to additional interface conditions and parametrices in standard Sobolev spaces. The present paper studies a necessary structure for the case of interfaces with conical singularities, namely, corner conormal symbols of such operators. These may be interpreted as families of mixed elliptic problems on a manifold with smooth interface. We mainly focus on second order operators and additional interface conditions that are holomorphic in an extra parameter. In particular, for the case of the Zaremba problem we explicitly obtain the number of potential conditions in this context. The inverses of conormal symbols are meromorphic families of pseudo-differential mixed problems referring to a smooth interface. Pointwise they can be computed along the lines of [3].
Given an algebra of pseudo-differential operators on a manifold, an elliptic element is said to be a reduction of orders if it induces isomorphisms of Sobolev spaces with a corresponding shift of smoothness. Reductions of orders on a manifold with boundary refer to boundary value problems. We consider smooth symbols and ellipticity without additional boundary conditions, which is the relevant case on a manifold with boundary. Starting from a class of symbols that has been investigated before for integer orders in boundary value problems with the transmission property, we study operators of arbitrary real orders that play a similar role for operators without the transmission property. Moreover, we show that order reducing symbols have the Volterra property and are parabolic of anisotropy 1; analogous relations are formulated for arbitrary anisotropies. We finally investigate parameter-dependent operators, apply a kernel cut-off construction with respect to the parameter and show that corresponding holomorphic operator-valued Mellin symbols reduce orders in weighted Sobolev spaces on a cone with boundary.
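For orientation, the defining property of an order reduction can be written out in standard Sobolev-space notation (the scale $H^s$ is the usual one and is an assumption of this sketch, not fixed in the abstract itself): an elliptic element $R^\mu$ of order $\mu$ reduces orders if

```latex
R^{\mu} \colon H^{s}(X) \xrightarrow{\ \cong\ } H^{s-\mu}(X)
\qquad \text{for all } s \in \mathbb{R},
```

i.e. it shifts smoothness by exactly $\mu$ while remaining invertible on the whole scale.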
We study mixed boundary value problems for an elliptic operator A on a manifold X with boundary Y , i.e., Au = f in int X, T±u = g± on int Y±, where Y is subdivided into subsets Y± with an interface Z and boundary conditions T± on Y± that are Shapiro-Lopatinskij elliptic up to Z from the respective sides. We assume that Z ⊂ Y is a manifold with conical singularity v. As an example we consider the Zaremba problem, where A is the Laplacian and T− Dirichlet, T+ Neumann conditions. The problem is treated as a corner boundary value problem near v which is the new point and the main difficulty in this paper. Outside v the problem belongs to the edge calculus as is shown in [3]. With a mixed problem we associate Fredholm operators in weighted corner Sobolev spaces with double weights, under suitable edge conditions along Z \ {v} of trace and potential type. We construct parametrices within the calculus and establish the regularity of solutions.
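In the Zaremba example named in the abstract, with $A$ the Laplacian and $\nu$ denoting the normal to the boundary (notation otherwise as above), the mixed problem takes the concrete form

```latex
\begin{aligned}
\Delta u &= f && \text{in } \operatorname{int} X,\\
u &= g_- && \text{on } \operatorname{int} Y_- \quad (T_-\ \text{Dirichlet}),\\
\partial_\nu u &= g_+ && \text{on } \operatorname{int} Y_+ \quad (T_+\ \text{Neumann}),
\end{aligned}
```

with the boundary subdivided as $Y = Y_- \cup Y_+$ meeting along the interface $Z$.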
We study pseudodifferential operators on a cylinder ℝ × B whose cross section B has conical singularities. Configurations of that kind are the local model of corner singularities with base space B. Operators A in our calculus are assumed to have symbols α which are meromorphic in the complex covariable with values in the space of all cone operators on B. In the case that α is independent of the axial variable t ∈ ℝ, we show an explicit formula for solutions of the homogeneous equation. Each non-bijectivity point of the symbol in the complex plane corresponds to a finite-dimensional space of solutions. Moreover, we give a relative index formula.
This work is an introduction to anisotropic spaces of analytic functions with ω-weights, which generalize the Lipschitz classes in the polydisc. We prove that these classes form an algebra and are invariant with respect to multiplication by monomials, and that the corresponding multiplication operators are bounded in these (Lipschitz and Djrbashian) spaces. As an application, we prove a theorem on division by good inner functions in the mentioned classes.