Different properties of programs implemented in Constraint Handling Rules (CHR) have already been investigated. Proving these properties in CHR is considerably simpler than proving them in an imperative programming language, which motivated the proposal of a methodology for mapping imperative programs into equivalent CHR programs. The equivalence of the two programs implies that a property satisfied by one is satisfied by the other. The mapping methodology can be put to other beneficial uses as well. One such use is the automatic generation of global constraints, in an attempt to demonstrate the benefits of a rule-based implementation of constraint solvers.
This paper offers a new theoretical framework for studying the problem of generations and social change in contemporary Iran. It proposes a model called "articulation of cultural modes". The paper agrees with Ronald Inglehart that 'culture' now plays a more dominant role in the social formation of current societies, as 'technology' once did in the modern era. But it goes one step further by arguing that culture cannot be approached as a holistic concept built on a comprehensive theoretical framework.
We establish elements of a new approach to ellipticity and parametrices within operator algebras on a manifold with higher singularities, based only on some general axiomatic requirements on parameter-dependent operators in suitable scales of spaces. The idea is to model an iterative process with new generations of parameter-dependent operator theories, together with new scales of spaces that satisfy requirements analogous to those of the original ones, now on a corresponding higher level. The "full" calculus is voluminous, so we content ourselves here with some typical aspects such as symbols in terms of order-reducing families, classes of relevant examples, and operators near the conical exit to infinity.
Focus presuppositions
(2007)
This paper reviews notions related to focus and presupposition and addresses the hypothesis that focus triggers an existential presupposition. Presupposition projection behavior in certain examples appears to favor a presuppositional analysis of focus. It is argued that these examples are open to a different analysis using givenness theory. Overall, the analysis favors a weak semantics for focus not including an existential presupposition.
This paper evaluates the construction of the rights of human rights defenders within international law and its shortcomings in protecting women. Human rights defenders have historically been defined on the basis of their actions as defenders. However, as Marxist-feminist scholar Silvia Federici contends, women are inherently politicised and, moreover, face obstacles to political action which are invisible to and untouchable by the law. Labour rights set an example of handling such a disadvantaged political position by placing vital importance on workers’ right to association and collective action. The paper closes with the suggestion that transposing this construction of rights to women would better protect women as human rights defenders while emphasising their capacity for self-determination in their political actions.
The objective and motivation behind this research is to provide applications with easy-to-use interfaces to communities of deaf and functionally illiterate users, enabling them to work without human assistance. Although recent years have witnessed technological advancements, the availability of technology does not ensure accessibility to information and communication technologies (ICT). The extensive use of text, from menus to document contents, means that deaf or functionally illiterate users cannot access the services implemented in most computer software. Consequently, most existing computer applications pose an accessibility barrier to those who are unable to read fluently. Online technologies intended for such groups should be developed in continuous partnership with primary users and include a thorough investigation into their limitations, requirements and usability barriers. In this research, I investigated existing tools in voice, web and other multimedia technologies to identify learning gaps and explored ways to enhance the information literacy of deaf and functionally illiterate users. I worked on the development of user-centered interfaces to increase the capabilities of deaf and low-literacy users by enhancing lexical resources and by evaluating several multimedia interfaces for them. The interface of the platform-independent Italian Sign Language (LIS) Dictionary has been developed to enhance the lexical resources for deaf users. The Sign Language Dictionary accepts Italian lemmas as input and provides their representation in the Italian Sign Language as output. The dictionary contains 3082 signs as a set of avatar animations, each linked to a corresponding Italian lemma. I integrated the LIS lexical resources with the MultiWordNet (MWN) database to form the first LIS MultiWordNet (LMWN).
LMWN contains information about lexical relations between words, semantic relations between lexical concepts (synsets), correspondences between Italian and sign language lexical concepts, and semantic fields (domains). The approach enhances deaf users' understanding of written Italian and shows that a relatively small lexicon can cover a significant portion of MWN. The integration of LIS signs with MWN made it a useful tool for computational linguistics and natural language processing. The rule-based translation process from written Italian text to LIS has been transformed into a service-oriented system. The translation process is composed of various modules, including a parser, a semantic interpreter, a generator, and a spatial allocation planner. This translation procedure has been implemented in the Java Application Building Center (jABC), a framework for eXtreme Model-Driven Design (XMDD). The XMDD approach focuses on bringing software development closer to conceptual design, so that the functionality of a software solution can be understood by someone who is unfamiliar with programming concepts. The transformation addresses the heterogeneity challenge and enhances the reusability of the system. To enhance the e-participation of functionally illiterate users, two detailed studies were conducted in the Republic of Rwanda. In the first study, a traditional (textual) interface was compared with a virtual-character-based interactive interface. The study helped to identify usability barriers, and users evaluated these interfaces according to three fundamental areas of usability, i.e. effectiveness, efficiency and satisfaction. In another study, we developed four different interfaces to analyze the usability and effects of online assistance (consistent help) for functionally illiterate users and compared different help modes, including textual, vocal and virtual-character help, with respect to the performance of semi-literate users.
In our newly designed interfaces, the instructions were automatically translated into Swahili. All the interfaces were evaluated on the basis of task accomplishment, time consumption, System Usability Scale (SUS) rating and the number of times help was requested. The results show that the performance of semi-literate users improved significantly when using the online assistance. The dissertation thus introduces a new development approach in which virtual characters are used as additional support for barely literate or otherwise challenged users. Such components enhance the application's utility by offering a variety of services, such as translating content into the local language, providing additional vocal information, and performing automatic translation from text to sign language. Obviously, no single design solution fits all users in this domain. Context sensitivity, literacy and mental abilities are key factors on which I concentrated, and the results emphasize that computer interfaces must be based on a thoughtful definition of target groups, purposes and objectives.
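The SUS rating mentioned above follows a fixed scoring rule (Brooke's standard questionnaire): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the raw sum is scaled by 2.5 onto a 0–100 range. A minimal sketch of the scoring (the function name and example responses are illustrative, not taken from the study):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) are positively worded: contribute r - 1.
    Even-numbered items are negatively worded: contribute 5 - r.
    The raw sum (0-40) is scaled by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    raw = sum((r - 1) if i % 2 == 0 else (5 - r)
              for i, r in enumerate(responses))
    return raw * 2.5

# Example: a fairly positive questionnaire
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```

Note that a SUS score is not a percentage; scores around 68 are commonly taken as average usability.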
It is shown that the Hankel transformation H_ν acts in a class of weighted Sobolev spaces. In particular, the isometric mapping property of H_ν which holds on L²(ℝ₊, r dr) is extended to spaces of arbitrary Sobolev order. The novelty of the approach consists in using techniques developed by B.-W. Schulze and others to treat the half-line ℝ₊ as a manifold with a conical singularity at r = 0. This is achieved by pointing out a connection between the Hankel transformation and the Mellin transformation. The procedure proposed leads at the same time to a short proof of the Hankel inversion formula. An application to the existence and higher regularity of solutions, including their asymptotics, to the 1+1-dimensional edge-degenerated wave equation is given.
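For reference, in the standard convention under which H_ν is an isometry on L²(ℝ₊, r dr), the transformation and its inversion formula read:

```latex
(H_\nu f)(\varrho) = \int_0^\infty J_\nu(r\varrho)\, f(r)\, r\, dr,
\qquad
f(r) = \int_0^\infty J_\nu(r\varrho)\, (H_\nu f)(\varrho)\, \varrho\, d\varrho ,
```

so that H_ν is its own inverse and ∫₀^∞ |f(r)|² r dr = ∫₀^∞ |(H_ν f)(ϱ)|² ϱ dϱ (the Parseval identity underlying the isometric mapping property above).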
In 1914 Bohr proved that there is an r ∈ (0, 1) such that if a power series converges in the unit disk and its sum has modulus less than 1 then, for |z| < r, the sum of absolute values of its terms is again less than 1. Recently analogous results were obtained for functions of several variables. The aim of this paper is to comprehend the theorem of Bohr in the context of solutions to second order elliptic equations meeting the maximum principle.
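In the sharp form of Bohr's inequality (Bohr's own proof gave r ≥ 1/6; the optimal radius 1/3 is due independently to M. Riesz, Schur, and Wiener), the classical statement reads:

```latex
\left| \sum_{n\ge 0} a_n z^n \right| < 1 \ \text{for } |z| < 1
\quad\Longrightarrow\quad
\sum_{n\ge 0} |a_n|\, |z|^n < 1 \quad \text{for } |z| \le \tfrac{1}{3},
```

and the constant 1/3 cannot be improved.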
An expansion for a class of functions is called stable if the partial sums are bounded uniformly over the class. Stable expansions are of key importance in numerical analysis, where functions are given only up to a certain error. We show that expansions in homogeneous functions are always stable on a small ball around the origin, and we evaluate the radius of the largest ball with this property.
Deepening Understanding
(2012)
Deepening understanding
(2013)
A survey was carried out in the Computer Science (CS) department at the University of Baghdad to investigate the attitudes of CS students in a female-dominant environment, showing the differences between male and female students in different academic years. We also compare the attitudes of freshman students from two different cultures (the University of Baghdad, Iraq, and the University of Potsdam).
Innovat MOOC
(2023)
The COVID-19 pandemic has revealed the importance for university teachers to have adequate pedagogical and technological competences to cope with the various possible educational scenarios (face-to-face, online, hybrid, etc.), making use of appropriate active learning methodologies and supporting technologies to foster a more effective learning environment. In this context, the InnovaT project has been an important initiative to support the development of pedagogical and technological competences of university teachers in Latin America through several trainings aiming to promote teacher innovation. These trainings combined synchronous online training through webinars and workshops with asynchronous online training through the MOOC “Innovative Teaching in Higher Education.” This MOOC was released twice. The first run took place right during the lockdown of 2020, when Latin American teachers needed urgent training to move to emergency remote teaching overnight. The second run took place in 2022 with the return to face-to-face teaching and the implementation of hybrid educational models. This article shares the results of the design of the MOOC considering the constraints derived from the lockdowns applied in each country, the lessons learned from the delivery of such a MOOC to Latin American university teachers, and the results of the two runs of the MOOC.
This article deals with contact between East Asian thought and modern Hebrew Literature from the late nineteenth century through the twentieth century, until today. In the first part, the article suggests that from a historiographical perspective, one may outline three waves of contact between these two cultural phenomena, at opposite ends of Asia. In the first wave, which began in the early twentieth century, Asian influence on Hebrew literature written in Europe was mediated mainly through the philosophers Schopenhauer and Nietzsche. The second wave, which emerged in the 1950s, relates to the influence of the leaders of the Beat Generation, who, in turn, were influenced by modernist poetry in English, which was colored by contact with Asian poetry. The third wave is part of the glocal New Age phenomenon and its appropriation of certain Buddhist traits.
The second part of the article presents several theoretical possibilities of symbioses between cultures, as they appear within language.
The third part presents the symptomatic example of the work of contemporary Hebrew writer Yoel Hoffmann, who appears to be a representative of the second wave; however, his work maintains dialogue with the first wave, and its current popularity is part of the third wave. Hoffmann’s work serves as an example of how to apply the theoretical possibilities presented in the second part of the article, as an instance of literary contact between two cultures and their respective languages.
Yoel Hoffmann is an Israeli writer born in 1937 in Brasov (Kronstadt), Romania. Brought up in a German-speaking family, already in his first book, Sefer Yosef (1989), he conveys the voice of German-speaking immigrants in Israel (the "Katschen" story, 1986) and that of the East European Jewish community in Berlin in the late 1930s, on the verge of the Second World War. His works are crammed with characters of Jews from Germany gripped by the memory of the language they abandoned following their emigration to Palestine in the 1930s. The classic one is the character of Bernhard, in the eponymous work. The current article focuses on the representation and elaboration of Hoffmann's unique creation, in a language influenced by his deep identification with Zen Buddhism on the one hand, and his attraction to the modernist, Western style of stream of consciousness on the other. In central sections of his works, Hoffmann presents his entire literary corpus as a type of explicit, allusive, or secret Holocaust literature, and invites his readers and his critics to decode the allusions and expose the secret in this theme, a surprising statement in relation to Hoffmann's work and its analysis so far. Hoffmann represents the Holocaust as a collective Israeli trauma for which his literary fiction creates a special catalogue of representative characters. In the creation of a catalogue, and particularly one that simultaneously classifies and individualizes, Hoffmann's project resembles the monumental 1920s cataloguing project by the celebrated German photographer August Sander (Menschen des 20. Jahrhunderts). Hoffmann included photographs from this project in his works, and even chose some of them for the covers of his books. The article examines the implicit relationships between these two creative artists as conferring a meaning so far not considered in the research of the Holocaust theme in Yoel Hoffmann's writings.
Monolayers of rod-shaped and disc-shaped liquid crystalline compounds at the air-water interface
(1986)
Calamitic (rod-shaped) and discotic (disc-shaped) thermotropic liquid crystalline (LC) compounds were spread at the air-water interface, and their ability to form monolayers was studied. The calamitic LCs investigated were found to form monolayers which behave analogously to conventional amphiphiles such as fatty acids. The spreading of the discotic LCs produced monolayers as well, but with a behaviour different from classical amphiphiles. The areas occupied per molecule are too small to allow the contact of all hydrophilic groups with the water surface and the packing of all hydrophobic chains. Various molecular arrangements of the discotics at the water surface to fit the spreading data are discussed.
Professional and amateur astronomers around the world contributed to a 4-month-long campaign in 2013, mainly in spectroscopy but also in photometry, interferometry and polarimetry, to observe the first three Wolf-Rayet stars discovered: WR 134 (WN6b), WR 135 (WC8) and WR 137 (WC7pd+O9). Each of these stars is interesting in its own way, showing a variety of stellar wind structures. The spectroscopic data from this campaign were reduced and analyzed for WR 134 in order to better understand its behavior and long-term periodicity in the context of corotating interaction regions (CIRs) in the wind. We present the results of these spectroscopic data, which include the confirmation of the CIR variability and a time coherency of ∼40 days (half-life of ∼20 days).
In Allefeld & Kurths [2004], we introduced an approach to multivariate phase synchronization analysis in the form of a Synchronization Cluster Analysis (SCA). A statistical model of a synchronization cluster was described, and an abbreviated instruction on how to apply this model to empirical data was given, while an implementation of the corresponding algorithm was (and is) available from the authors. In this letter, the complete details on how the data analysis algorithm is to be derived from the model are filled in.
Phase synchronization analysis, including our recently introduced multivariate approach, is applied to event-related EEG data from an experiment on language processing, following a classic psycholinguistic paradigm. For the two types of experimental manipulation distinct effects in overall synchronization are found; for one of them they can also be localized. The synchronization effects occur earlier than those found by the conventional analysis method, indicating that the new approach provides additional information on the underlying neuronal process.
In order to investigate the temporal characteristics of cognitive processing, we apply multivariate phase synchronization analysis to event-related potentials. The experimental design combines a semantic incongruity in a sentence context with a physical mismatch (color change). In the ERP average, these result in an N400 component and a P300-like positivity, respectively. The synchronization analysis shows an effect of global desynchronization in the theta band around 288ms after stimulus presentation for the semantic incongruity, while the physical mismatch elicits an increase of global synchronization in the alpha band around 204ms. Both of these effects clearly precede those in the ERP average. Moreover, the delay between synchronization effect and ERP component correlates with the complexity of the cognitive processes.
We present different tests for phase synchronization which improve the procedures currently used in the literature. This is accomplished by using a two-samples test setup and by utilizing insights and methods from directional statistics and bootstrap theory. The tests differ in the generality of the situation in which they can be applied as well as in their complexity, including computational cost. A modification of the resampling technique of the bootstrap is introduced, making it possible to fully utilize data from time series.
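One concrete instance of such a two-sample setup (a sketch under common conventions, not the authors' exact procedure): measure the synchronization strength of each condition by the mean resultant length R of its phases, a standard quantity from directional statistics, and bootstrap the difference R_a − R_b to test for equal synchronization. All names below are illustrative.

```python
import numpy as np

def resultant_length(phases):
    """Mean resultant length R of a sample of phases (radians); R lies in [0, 1].

    R near 1 indicates tightly clustered (synchronized) phases,
    R near 0 indicates phases spread over the circle.
    """
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

def two_sample_sync_test(ph_a, ph_b, n_boot=2000, seed=0):
    """Bootstrap two-sample test for a difference in phase synchronization.

    Resamples each condition with replacement, and returns the observed
    difference R_a - R_b together with a two-sided bootstrap p-value for
    the null hypothesis of equal synchronization strength.
    """
    rng = np.random.default_rng(seed)
    obs = resultant_length(ph_a) - resultant_length(ph_b)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        ra = resultant_length(rng.choice(ph_a, size=len(ph_a)))
        rb = resultant_length(rng.choice(ph_b, size=len(ph_b)))
        diffs[i] = ra - rb
    # Two-sided p-value: how often the centred bootstrap distribution
    # exceeds the observed difference in magnitude.
    p = np.mean(np.abs(diffs - diffs.mean()) >= abs(obs))
    return obs, p

# Strongly phase-locked (von Mises) vs. unsynchronized (uniform) phases
rng = np.random.default_rng(1)
locked = rng.vonmises(mu=0.0, kappa=8.0, size=200)
unlocked = rng.uniform(-np.pi, np.pi, size=200)
diff, p = two_sample_sync_test(locked, unlocked)
print(diff > 0.5, p < 0.05)
```

The naive percentile bootstrap shown here is the simplest variant; the modified resampling mentioned in the abstract would replace the plain `rng.choice` step.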
A method for the multivariate analysis of statistical phase synchronization phenomena in empirical data is presented. A first statistical approach is complemented by a stochastic dynamic model, to result in a data analysis algorithm which can in a specific sense be shown to be a generic multivariate statistical phase synchronization analysis. The method is applied to EEG data from a psychological experiment, obtaining results which indicate the relevance of this method in the context of cognitive science as well as in other fields.
We study a boundary value problem for an overdetermined elliptic system of nonlinear first order differential equations with linear boundary operators. Such a problem is solvable for a small set of data, and so we pass to its variational formulation which consists in minimising the discrepancy. The Euler-Lagrange equations for the variational problem are far-reaching analogues of the classical Laplace equation. Within the framework of Euler-Lagrange equations we specify an operator on the boundary whose zero set consists precisely of those boundary data for which the initial problem is solvable. The construction of such operator has much in common with that of the familiar Dirichlet to Neumann operator. In the case of linear problems we establish complete results.
Orthopyroxenes of a high-temperature protomylonite of the Ivrea Zone, Northern Italy, show twin-like polysynthetic lamellae parallel to {210} of the hypersthene host. The transformation is caused by plastic deformation under high-grade metamorphic conditions, which has resulted in dynamic recrystallization of pyroxene and plagioclase. The lamellae consist of clinohypersthene. The twin plane and the lamellar clino-ortho inversion of hypersthene due to natural deformation have not been described hitherto.
Assignments, curriculum framework and background information as the base of developing lessons
(2012)
1. What are the general strengths of the assignments?
2. Structure of the assignment
3. Resources of the assignment
4. Fostering self-expression
5. How could you improve the assignment?
6. Lack of specific examples
7. Not relating the issue to the students
8. Language problems
9. Infeasibility of adaptation
10. In what ways was the additional information useful? How could this be improved?
11. Was the framework useful for you, and in what way?
12. In what ways did the assignments reflect the steps identified in the framework?
Several types of insect cuticle contain enzymes catalyzing the formation of adducts between N-acetyldopamine (NADA) and N-acetylhistidine (NAH). Two such adducts, NAH-NADA-I and NAH-NADA-II, have been isolated and their structures determined. In one of the adducts the link connecting the two residues occurs between the 1-position (β-position) in the NADA side chain and the 1-N atom (τ-N) in the imidazole ring of histidine. Diphenoloxidase activity alone is not sufficient for the formation of this adduct, whereas extracts containing both diphenoloxidase and o-quinone-p-quinone methide isomerase activities catalyze the coupling reaction. The adduct consists of a mixture of two diastereomers, presumably formed by spontaneous reaction between enzymatically produced NADA-p-quinone methide and N-acetylhistidine. The other adduct has been identified as a ring-addition product of N-acetylhistidine and NADA. In contrast to the former adduct, it can be formed by incubation of the two substrates with mushroom tyrosinase alone. An adduct between N-acetylhistidine and the benzodioxan-type NADA dimer is produced in vitro when the N-acetylhistidine-NADA adduct is incubated with NADA and locust cuticle containing a 1,2-dehydro-NADA-generating enzyme system. Trimeric NADA polymerization products of the substituted benzodioxan type have been obtained from in vivo sclerotized locust cuticle, confirming the ability of cuticle to produce NADA oligomers. The results indicate that some insect cuticles contain enzymes promoting the linkage of oxidized NADA to histidine residues. It is suggested that histidine residues in the cuticular proteins can serve as acceptors for oxidized NADA and that further addition of NADA residues to the phenolic groups of bound NADA can occur, resulting in the formation of protein-linked NADA oligomers. The coupling reactions identified may be an important step in natural cuticular sclerotization.
In a previously published article in HIN under the title of “Eduard Dorsch and his unpublished poem on the occasion of Humboldt’s 100th birthday,” I elaborated on Dorsch’s poem that was read in Detroit in front of a German-American audience on Sept. 14, 1869, a day widely celebrated in the US in honor of Humboldt. Although it was not surprising that Dorsch wrote the occasional poem in the first place given his affinities with Humboldt’s world of thought, a discovery of a second occasional poem upon further research in Dorsch’s voluminous papers was indeed unexpected, in this case read on the same date in Monroe, Michigan. Although there are a number of similarities between the Detroit and Monroe versions, there are enough differences that warrant this addendum to my original article.
The boundary paradigm (Rayner, 1975) with a novel preview manipulation was used to examine the extent of parafoveal processing of words to the right of fixation. Words n+1 and n+2 had either correct or incorrect previews prior to fixation (prior to crossing the boundary location). In addition, the manipulation utilized either a high- or low-frequency word in the word n+1 location, on the assumption that n+2 preview effects would be more likely when word n+1 was high frequency. The primary findings were that there was no evidence for a preview benefit for word n+2 and no evidence for parafoveal-on-foveal effects when word n+1 was at least four letters long. We discuss implications for models of eye-movement control in reading.
This article describes recent achievements in the field of micellar polymers, or polysoaps. Taking advantage of zwitterionic model polymers, systematic variations of the molecular architecture have provided an improved understanding of the relationship between the molecular structure of the polymers and their key properties such as surface activity and solubilization capacity. Useful rules are established, which take into account much of the previous data in the literature.
Several zwitterionic polymers were prepared by radical homopolymerization of surfactant monomers bearing diallyl, diene or vinylcyclopropane moieties. These polymer systems were complemented by alternating copolymers of appropriate zwitterionic vinyl compounds. Thus, polymers with a reduced (compared with simple vinylic homopolymers or statistical copolymers) and well-defined density of surfactant side groups are obtained. The solubilities found for these polymers are dominated by polymer geometry rather than by the balance of hydrophilic and hydrophobic fragments, thus corroborating a main-chain spacer model proposed recently. All water-soluble polymers exhibit the characteristic features of classical polysoaps, as shown by surface tension measurements and by the solubilization of hydrophobic dyes. In contrast, the water-insoluble copolymers are capable of forming stable monolayers at the air-water interface.
Solubilization by polysoaps
(1994)
The aqueous solubilization power of several series of micellar homopolymers and copolymers (polysoaps) is investigated. Using five insoluble or poorly water-soluble dyes, the capacities are compared with respect to the influence of structural variables such as the polymer backbone, the polymer geometry, the comonomer content, and the charge of the hydrophilic group. Some guidelines for polysoap structures suited for efficient solubilization are established. Notably, the solubilization capacities of the polysoaps are linked neither to the ability to reduce the surface tension of water nor to the polarity of the solubilization sites deduced from spectroscopic probes.
Reversible changes in the self-organization of polysoaps may be induced by controlling their charge numbers via covalently bound redox moieties. This is illustrated with two viologen polysoaps which, in response to an electrochemical stimulus, change their solubility and aggregation in water, leading from homogeneously dissolved and aggregated molecules to collapsed ones, and vice versa. Using the electrochemical quartz crystal microbalance (EQCM), it could be shown that the reversibility of this process is better than 95% over 16 cycles.
After the mass immigration to Israel from 1948 to 1950, about 2000 Jews remained in Yemen. These Jews lived in small communities and continued to maintain their religious environment as it was. In the years that followed, however, many of them moved from Yemen to Israel with the assistance of the Jewish Agency and the Joint Distribution Committee (JDC). The community's small size and its dispersal throughout predominantly Muslim areas created a certain closeness between the two groups. About ten percent of the Jews chose to convert to Islam, many of them in groups. In about twenty cases, husbands chose to convert to Islam while their wives emigrated to preserve their Judaism. Some of the converts refused to grant their wives a divorce because, according to Muslim law, conversion is enough to sever the marital relationship. Such women are called ʿAgunot: women still bound in marriage to a husband with whom they no longer live, but who has not formally released them from the marital union. The article follows the efforts undertaken to release the ʿAgunot, and shows that Jewish and Muslim scholars were able to find solutions to the ʿAgunot problem and, at times, managed to bridge the gap between the two religions.
Over the last few decades, the methodology for the identification of customary international law (CIL) has been changing. Both elements of CIL – practice and opinio juris – have assumed novel and broader forms, as noted in the Reports of the Special Rapporteur of the International Law Commission (2013, 2014, 2015, 2016). This paper discusses these Reports and the draft conclusions, as well as the reaction of States in the Sixth Committee of the United Nations General Assembly (UNGA), highlighting the areas of consensus and contestation. This ties into the analysis of the main doctrinal positions, with special attention given to the two elements of CIL and the role of UNGA resolutions. The underlying motivation is to assess the real or perceived crisis of CIL; the author develops the broader argument that, in order to retain unity within international law, the internal limits of CIL must be carefully asserted.
The rule of law is the cornerstone of the international legal system. This paper shows, through analysis of intergovernmental instruments, statements made by representatives of States, and negotiation records, that the rule of law at the United Nations has become increasingly contested in the past years. More precisely, the argument builds on the process of integrating the notion of the rule of law into the Sustainable Development Goals, adopted in September 2015 in the document Transforming our world: the 2030 Agenda for Sustainable Development. The main sections set out the background of the rule-of-law debate at the UN and the elements of the rule of law at the goal and target levels in the 2030 Agenda – especially in SDG 16 – and evaluate whether the rule of law in this context may be viewed as a normative and universal foundation of international law. The paper concludes, with reflections drawn from the process leading up to the 2030 Agenda and from the final outcome document, that the rule of law – or at least strong and precise formulations of the concept – may be in decline in institutional and normative settings. This can be perceived as symptomatic of a broader crisis of the international legal order.
The end of the Cold War division of the Baltic Sea in 1989, and the three Baltic states’ return to independence in 1991, created new opportunities for the decision-makers of the area, as well as new possibilities for fashioning security in the region. This article examines the security debate affecting the Baltic Sea region in the post-Cold War period and, in particular, the relevance of the European Union to that debate. The following section examines various concepts of security relevant to the Baltic region; the third section looks at the EU and the Baltic area; and the last part deals with the implications that EU membership of the Baltic Sea states may have for the security of the Baltic Sea zone.
A New Kind of Jew
(2018)
The article examines Allen Ginsberg’s spiritual path and places his interest in Asian religions within larger cultural agendas and life choices. While identifying as a Jew, Ginsberg wished to transcend his parents’ orbit and actively sought to create an inclusive, tolerant, and permissive society in which persons such as himself could live and create at ease. He chose elements from the Christian, Jewish, Native-American, Hindu, and Buddhist traditions, weaving them together into an ever-growing cultural and spiritual quilt. The poet never underwent a conversion experience or restricted his choices and freedoms. In Ginsberg’s understanding, Buddhism was a universal, non-theistic religion that meshed well with an individualist outlook and worked toward personal solace and mindfulness. He and other Jews saw no contradiction between enchantment with Buddhism and their Jewish identity.
Mathematical modeling of biological systems is a powerful tool for systematically investigating the functions of biological processes and their relationship with the environment. To obtain accurate and biologically interpretable predictions, a modeling framework has to be devised whose assumptions best approximate the examined scenario and which copes with the trade-off in the complexity of the underlying mathematical description: attention to detail versus broad coverage. Correspondingly, the system can be examined in detail on a smaller scale or in a simplified manner on a larger scale. In this thesis, the role of photosynthesis and its related biochemical processes in the context of plant metabolism was dissected by employing modeling approaches ranging from kinetic to stoichiometric models. The Calvin-Benson cycle, as the primary pathway of carbon fixation in C3 plants, is the initial step in producing starch and sucrose, which are necessary for plant growth. Based on an integrative analysis for model ranking, applied to the largest compendium of (kinetic) models of the Calvin-Benson cycle, those models suitable for the development of metabolic engineering strategies were identified. Driven by the question of why starch rather than sucrose is the predominant transitory carbon storage in higher plants, the metabolic costs of their synthesis were examined. The incorporation of the maintenance costs of the involved enzymes provided model-based support for the preference of starch as transitory carbon storage, exploiting only the stoichiometry of the synthesis pathways. Many photosynthetic organisms have to cope with processes that compete with carbon fixation, such as photorespiration, whose impact on plant metabolism is still controversial. A systematic model-oriented review provided a detailed assessment of the role of this pathway in inhibiting the rate of carbon fixation, bridging carbon and nitrogen metabolism, shaping C1 metabolism, and influencing redox signal transduction.
The demand for understanding photosynthesis in its metabolic context calls for an examination of the related processes of primary carbon metabolism. To this end, the Arabidopsis core model was assembled via a bottom-up approach. This large-scale model can be used to simulate photoautotrophic biomass production, as an indicator of plant growth, under so-called optimal, carbon-limiting, and nitrogen-limiting growth conditions. Finally, the introduced model was employed to investigate the effects of the environment, in particular of nitrogen, carbon, and energy sources, on metabolic behavior. This resulted in a purely stoichiometry-based explanation for the experimental evidence that the simultaneous acquisition of nitrogen in both forms, as nitrate and ammonium, is preferred for optimal growth in various plant species. The findings presented in this thesis provide new insights into the behavior of plant systems, further support existing views for which experimental evidence is mounting, and posit novel hypotheses for further directed large-scale experiments.
This paper describes the proof calculus LD for clausal propositional logic, which is a linearized form of the well-known DPLL calculus extended by clause learning. It is motivated by the demand to model how current SAT solvers built on clause learning work, while abstracting from decision heuristics and implementation details. The calculus is proved sound and terminating. Further, it is shown that both the original DPLL calculus and the conflict-directed backtracking calculus with clause learning, as implemented in many current SAT solvers, are complete and proof-confluent instances of the LD calculus.
Many formal descriptions of DPLL-based SAT algorithms either do not include all essential proof techniques applied by modern SAT solvers or are bound to particular heuristics or data structures. This makes it difficult to analyze proof-theoretic properties or the search complexity of these algorithms. In this paper we try to improve this situation by developing a nondeterministic proof calculus that models the functioning of SAT algorithms based on the DPLL calculus with clause learning. This calculus is independent of implementation details yet precise enough to enable a formal analysis of realistic DPLL-based SAT algorithms.
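As a point of reference for the calculi discussed in the two abstracts above, the core DPLL procedure can be sketched in a few lines. The following is a hypothetical minimal illustration – plain recursive DPLL with unit propagation, without the clause learning or the LD linearization that the papers formalize; the naive decision heuristic is exactly the kind of implementation detail the calculi abstract from:

```python
# Minimal DPLL sketch. Clauses are lists of nonzero ints; a negative
# integer denotes a negated variable (DIMACS-style encoding).

def unit_propagate(clauses, assignment):
    """Repeatedly assign literals forced by unit clauses.

    Returns the extended assignment, or None on a conflict
    (a clause with every literal falsified)."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue  # clause already satisfied
            unassigned = [l for l in clause
                          if l not in assignment and -l not in assignment]
            if not unassigned:
                return None  # conflict: clause falsified
            if len(unassigned) == 1:
                assignment = assignment | {unassigned[0]}
                changed = True
    return assignment

def dpll(clauses, assignment=frozenset()):
    """Return a satisfying set of literals, or None if unsatisfiable."""
    assignment = unit_propagate(clauses, assignment)
    if assignment is None:
        return None
    free = ({abs(l) for c in clauses for l in c}
            - {abs(l) for l in assignment})
    if not free:
        return assignment  # every variable decided, all clauses satisfied
    var = min(free)  # naive decision heuristic (no VSIDS etc.)
    for lit in (var, -var):
        result = dpll(clauses, assignment | {lit})
        if result is not None:
            return result
    return None
```

A clause-learning solver would, on conflict, additionally derive a new clause from the implication graph and backjump, which is the extension the LD calculus makes precise.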
Reviewed work: Leshonot yehude Sefarad ve-ha-mizrach vesifruyotehem / Languages and literatures of Sephardic and Oriental Jews. - Jerusalem : Misgav Yerushalayim, 2009. - 484 pp. [Hebr.] + 434 pp. [Lat.]; ill.
I review our current understanding of the interaction between a Wolf-Rayet star's fast wind and the surrounding medium, and discuss to what extent the predictions of numerical simulations coincide with multiwavelength observations of Wolf-Rayet nebulae. Through a series of examples, I illustrate how changing the input physics affects the results of the numerical simulations. Finally, I discuss how numerical simulations together with multiwavelength observations of these objects allow us to unpick the previous mass-loss history of massive stars.
In two experiments, many annotators marked antecedents for discourse deixis as unconstrained regions of text. The experiments show that annotators do converge on the identity of these text regions, though much of what they do can be captured by a simple model. Demonstrative pronouns are more likely than definite descriptions to be marked with discourse antecedents. We suggest that our methodology is suitable for the systematic study of discourse deixis.
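Convergence on unconstrained text regions can be quantified by comparing the spans different annotators mark. One simple, hypothetical measure – not necessarily the one used in the study – is token-level Jaccard overlap between two marked regions:

```python
def region_overlap(span_a, span_b):
    """Jaccard overlap between two token-index spans (start, end),
    end-exclusive: 1.0 for identical spans, 0.0 for disjoint ones."""
    a, b = set(range(*span_a)), set(range(*span_b))
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Two annotators marking nearly the same region score close to 1.0,
# while disjoint regions score 0.0.
```

Averaging such pairwise overlaps across annotators gives a rough agreement score for each discourse antecedent.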
Many methods have been proposed for the simulation of constrained mechanical systems. The most obvious of these have mild instabilities and drift problems. Consequently, stabilization techniques have been proposed. A popular stabilization method is Baumgarte's technique, but the choice of parameters to make it robust has been unclear in practice. Some of the simulation methods that have been proposed and used in computations are reviewed here from a stability point of view. This involves concepts of differential-algebraic equation (DAE) and ordinary differential equation (ODE) invariants. An explanation of the difficulties that may be encountered using Baumgarte's method is given, together with a discussion of why a further quest for better parameter values for this method will always remain frustrating. It is then shown how Baumgarte's method can be improved. An efficient stabilization technique is proposed, which may employ explicit ODE solvers in the case of nonstiff or highly oscillatory problems and which is related to coordinate projection methods. Examples of a two-link planar robotic arm and a squeezing mechanism illustrate the effectiveness of this new stabilization method.
Many methods have been proposed for the stabilization of higher index differential-algebraic equations (DAEs). Such methods often involve constraint differentiation and problem stabilization, thus obtaining a stabilized index reduction. A popular method is Baumgarte stabilization, but the choice of parameters to make it robust is unclear in practice. Here we explain why the Baumgarte method may run into trouble. We then show how to improve it. We further develop a unifying theory for stabilization methods which includes many of the various techniques proposed in the literature. Our approach is to (i) consider stabilization of ODEs with invariants, (ii) discretize the stabilizing term in a simple way, generally different from the ODE discretization, and (iii) use orthogonal projections whenever possible. The best methods thus obtained are related to methods of coordinate projection. We discuss them and make concrete algorithmic suggestions.