The ill-posed inversion of multiwavelength lidar data by a hybrid method of variable projection
(1999)
Public debate about energy relations between the EU and Russia is distorted. These distortions present considerable obstacles to the development of a true partnership. At the core of the conflict is a struggle for resource rents between energy-producing, energy-consuming and transit countries. Supposedly secondary aspects, however, are also of great importance: geopolitics, market access, economic development and state sovereignty. The European Union, having engaged in energy market liberalisation, faces a widening gap between declining domestic resources and continuously growing energy demand. Diverse interests inside the EU prevent the definition of a coherent and respected energy policy. Russia, for its part, is no longer willing to subsidise its neighbouring economies through cheap energy exports. The Russian government engages in assertive policies pursuing Russian interests. In this respect, it opts for a different approach to globalisation, refusing the role of mere energy exporter. In view of the intensifying struggle for global resources, Russia, with its large energy potential, appears to be a very favourable option for European energy supplies, if not the best one. However, several outcomes of the strategic game between the two partners can be imagined. Engaging in non-cooperative strategies will in the end leave all stakeholders worse off. The European Union should therefore concentrate on securing its partnership with Russia instead of damaging it. Stable cooperation would require accepting that the partner may pursue its own goals, which might differ from one's own interests. The question is how a sustainable compromise can be found. This thesis finds that a mix of continued dialogue, a tit-for-tat approach bolstered by an international institutional framework, and increased integration efforts appears to be the preferable solution.
Developing rich web applications can be a complex job, especially when it comes to mobile device support. Web-based environments such as Lively Webwerkstatt can help developers implement such applications by making the development process more direct and interactive. Furthermore, software development is a collaborative process, which creates the need for the development environment to offer collaboration facilities. This report describes extensions of the web-based development environment Lively Webwerkstatt that allow it to be used in a mobile environment. The extensions comprise collaboration mechanisms and user interface adaptations, as well as event processing and performance measurement on mobile devices.
This book offers a comprehensive, multidisciplinary introduction to theme parks and the field of theme park studies. It identifies and discusses relevant economic, social, and cultural as well as medial, historical, and geographical aspects of theme parks worldwide, from the big international theme park chains to smaller, regional, family-operated parks. The book also describes the theories and methods that have been used to study theme parks in various academic disciplines and reviews the major contexts in which theme parks have been studied. By providing the necessary backgrounds, theories, and methods to analyze and understand theme parks both as a business field and as a socio-cultural phenomenon, this book will be a great resource to students, academics from all disciplines interested in theme parks, and professionals and policy-makers in the leisure and entertainment as well as the urban planning sector.
The Tetrarchy as Ideology
(2023)
The 'Tetrarchy', the modern name assigned to the period of Roman history that started with the emperor Diocletian and ended with Constantine I, has been a much-studied and much-debated field of the Roman Empire. Debate, however, has focused primarily on whether it was a true 'system' of government, or rather a collection of ad-hoc measures undertaken to stabilise the empire after the troubled period of the 3rd century CE. The papers collected here aim to go beyond this question and to present an innovative approach to a fascinating period of Roman history by understanding the Tetrarchy not as a system of government, but primarily as a political language. Their focus thus lies on the language and ideology of the imperial college and court, on the performance of power in imperial ceremonies, the representation of the emperors and their enemies in the provinces of the Roman world, as well as on the afterlife of Tetrarchic power in the Constantinian period.
Norm behavior of a parabolic-elliptic system modelling chemotaxis in three-dimensional domains
(2002)
Oases are special ecosystems that form in arid climates, in which residents, water and soil are the principal factors, and exchanges of material, energy and information are the main functional characteristics. The oasis regions of Central Asia are not only an important cradle of human civilization but also strategically significant places as the world grows aware of their potential benefits. We take the Keriya River Basin oases in southern Xinjiang as a case study of the critical control of oasis evolution, based on the theories and methods of environmental geology, physical geography, land resource research, and oasis ecology. This study aims to identify the essential factors driving the oasis ecosystem and their interacting dynamic mechanisms at different scales and levels, to determine the optimal equilibrium for harmonious development between population, resources, environment and development (PRED), and to establish a critical-control pattern for sustainable development. We propose an indicator system for studying the evolution of the PRED system of the Keriya River valley oases, on the basis of information derived from field investigation and local materials. With the technical support of Geographic Information System (GIS) and Remote Sensing (RS) methods, we compare and analyse land use in the upper reaches, vegetation change in the middle reaches, and desertification in the lower reaches, describing the dynamics of land-cover change in the Keriya River valley oases. The main land-cover types show distinct local characteristics. On the basis of the field survey and statistical data, we use ARC/INFO software to preprocess these data and the two TM satellite images.
By analysing the images resulting from a post-classification comparison, we summarise the quantitative dynamics of 13 land-cover types over a span of 15 years and the behaviour of the local ecological-environment system. The analysis indicates that the expansion of desertification in the Keriya River valley oases is mainly driven by two factors: the natural environment and human activities.
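The post-classification comparison mentioned above can be sketched in a few lines: two independently classified land-cover rasters are overlaid cell by cell, and a transition matrix counts how many cells moved from each class to each other class. The function name, class codes and toy grids below are illustrative assumptions, not the study's actual ARC/INFO processing chain.

```python
def change_matrix(before, after, n_classes):
    """Post-classification comparison: given two classified rasters of the
    same shape (lists of rows of integer class codes), return a matrix M
    where M[i][j] counts cells classified as i in the earlier image and
    as j in the later image."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for row_b, row_a in zip(before, after):
        for b, a in zip(row_b, row_a):
            m[b][a] += 1
    return m

# Toy example with 3 hypothetical classes (0 = oasis, 1 = desert, 2 = water):
before = [[0, 0], [1, 2]]
after = [[0, 1], [1, 2]]
m = change_matrix(before, after, 3)  # m[0][1] counts oasis cells turned desert
```

Off-diagonal entries quantify change (e.g. oasis cells that became desert), so row and column sums give per-class losses and gains between the two dates.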
New survey data for a panel of Polish firms is used to estimate employment and wage adjustments under various forms of ownership (insider vs. outsider) and asymmetric response to exogenous shocks. In contrast to earlier studies, dynamic panel data estimators (GMM) allow for endogeneity of observed variables and partial adjustment to shocks. Results differ from other findings in the transition literature: wages have little effect on dynamic labor demand and the firm-size wage effect is confirmed. Firms that expand employment have to pay significantly larger wage increases and rising sales add little to employment, suggesting labor hoarding. Declining sales, however, significantly reduce employment and privatization (or anticipation thereof) has the expected benefits.
Privatisation and ownership : the impact on firms in transition survey evidence from Bulgaria
(1999)
Previous papers in this Special Series have described in detail the theoretical background and development patterns, along with some empirical results, for the privatisation processes in Bulgaria and Poland. A range of issues has been raised which demands closer empirical investigation. For this purpose, the research group developed questionnaire studies for Bulgaria and Poland. In Bulgaria, the National Statistical Institute (NSI) carried out the case studies between February and April 1998. The problems of the questionnaire set-up were identified in a pre-test study, but unlike in the Polish case, they led to only minor differentiation. Since financial limitations prevented a larger sample, a sample of 61 mid-sized and large Bulgarian enterprises was selected. Failure to respond was not a serious problem, unlike with the Polish questionnaire; this is because the NSI has maintained good links to the enterprise sector, and management were prepared to give detailed answers, even on questions of their firms' financial status. However, as the Polish experience suggests, it has become obvious that the privatisation process is also associated with management's increasing reluctance to answer comparatively 'intimate' questions. Thus, future questionnaire studies must take a much higher rate of refusals into consideration. The pre-selection procedure in Bulgaria was determined by the project target, which sought to analyse the effects of the privatisation process on firms' behaviour during the transition process, and hence only firms which had already existed before the changes were included. For small and medium-sized enterprises (SMEs), most of which were founded after the changes, partly due to the legal processes of spontaneous privatisation, some empirical, as well as analytical, studies were carried out. Thus, the research group limited the scope of investigation to enterprises with more than 250 employees.
The underlying hypothesis is that employment problems are concentrated in larger firms, in particular amongst those still (partly) state owned. Because of the former ownership structures and the relatively slower capacity for management change, the assumption is that state-owned enterprises (SOEs) which have only recently been privatised might still have traditional links to government even after privatisation. On the one hand, the SMEs are obviously more prone to, and linked with, market processes. As a result, they don't have the financial potential and incentives to follow job-hoarding strategies. On the other hand, there are almost no SMEs which are still state-owned. Hence, the prevailing opinion in the literature is that 'larger industrial firms were apt to be least efficient, most often producing inadequate and non-competitive products, with a high degree of under-utilisation of labour and most inflexible to change' (Jones & Nikolov 1997, p. 252). Thus, as mentioned above, though there may be some limitations with regard to firm representation, our sample characterises a number of enterprises that offer fertile ground for the analysis of firms' adjustment to the newly established market realities in a transition economy. Our study is unique in the sense that existing empirical studies on privatisation and enterprise restructuring generally cover the time period just before and after the initial stages of transition, e.g. 1988/89 to 1992. In those studies, samples of firms in the Czech Republic, Poland, Hungary and Bulgaria recognise that behavioural adaptations at the enterprise level had taken place just before the actual privatisation process materialised. Therefore, almost all of the firms under examination were still state-owned. The firms were usually divided according to their performance into 'good', 'average' and 'bad' enterprises.
The main findings of those early studies have shown that the macroeconomic adaptations (i.e., macro-level changes which induced micro-level adjustment by the firms), as well as emerging market structures, created enormous pressures which in turn influenced firms' economic behaviour, reallocation of resources and consequent restructuring. This evidence supports the hypothesis that the SOEs started restructuring and adjusting their behaviour and performance, in response to the harsh realities of more open markets, before privatisation actually started. In this paper, we seek to present some results on these developments in Bulgaria at the later stages of transition and privatisation (1992-1996). The aim of our questionnaire study is therefore to show the effects of the privatisation process and ownership on the behavioural adaptations of firms which had once been state-owned or continue to be owned by the state. The period under investigation is 1992 to 1996. For 1990 and 1991, the number of missing values is relatively high and, where relevant, we partly exclude these observations from our analysis. The paper contains seven sections. Section II outlines the macroeconomic environment in which our sample firms operate, provides some specifics of the Bulgarian privatisation process, and discusses data quality. Section III concentrates on the analysis of privatisation, the specific forms of ownership that resulted from it, and firm size. In Section IV, we describe the trends of the main economic variables within firms (such as employment, wages, labour productivity, etc.), and a number of proxies of firm viability, while Section V presents some regression results to corroborate the discussion of the previous section. Section VI gives an overview of survey results on the impact of enterprise-determined wage policy, trade union activity and membership, government control, and social benefits on enterprise restructuring. Section VII summarises our findings.
In socialist economies, firms provided various social benefits, such as child care, health care, food subsidies and housing. Using panel data from Bulgarian and Polish firms, this paper attempts to explain firm-specific provision of social benefits in the process of transition. We investigate empirically, with the help of qualitative response models, how ownership type and structure, firm size, profitability, change in management, foreign direct investment, wage and employment policies, union involvement and employee power have affected the provision of non-wage benefits.
Faced with an accelerating climate crisis caused by burning fossil fuels, we have to change the way the economy works. We can no longer go on with a system that simply maximises private profit without consideration for its effects. Instead, we have to consciously plan the change to a fossil-fuel-free society.
The need is urgent.
The transformation will be vast.
Nothing similar has been done in the West since the days of wartime mobilisation.
This book explains the basic science of climate change before looking at the transformations needed in our energy system and basic industries. It looks at the earlier successful history of deliberate planning practised in the UK from 1939 to the 1960s, and at how, using modern computing techniques, it will be possible to organise resources so as to effect the change.
Interactional linguistics
(2018)
The first textbook dedicated to interactional linguistics, focusing on linguistic analyses of conversational phenomena, this introduction provides an overview of the theory and methodology of interactional linguistics. Reviewing recent findings on linguistic practices used in turn construction and turn taking, repair, action formation, ascription, and sequence and topic organization, the book examines the way that linguistic units of varying size - sentences, clauses, phrases, clause combinations, and particles - are mobilized for the implementation of specific actions in talk-in-interaction. A final chapter discusses the implications of an interactional perspective for our understanding of language as well as its variation, diversity, and universality. Supplementary online chapters explore additional topics such as the linguistic organization of preference, stance, footing, and storytelling, as well as the use of prosody and phonetics, and further practices with language. Featuring summary boxes and transcripts from recordings of everyday conversation, this is an essential resource for advanced undergraduate and postgraduate courses on language in social interaction.
We determine the ground state properties of inhomogeneous mixtures of bosons and fermions in cubic lattices and parabolic confining potentials. For finite hopping we determine the domain boundaries between Mott-insulator plateaux and hopping-dominated regions for lattices of arbitrary dimension within mean-field and perturbation theory. The results are compared with a new numerical method that is based on a Gutzwiller variational approach for the bosons and an exact treatment for the fermions. The findings can be applied as a guideline for future experiments with trapped atomic Bose-Fermi mixtures in optical lattices.
This article examines the multiple governments of independent Estonia since 1992 with regard to their stability. Confronted with the immense problems of democratic transition, the multi-party governments of Estonia have changed comparatively often. Following the elections of March 2003, the ninth government since 1992 was formed. A detailed examination of government stability, using the example of Estonia, is accordingly warranted, given that the country is seen as the most successful Central Eastern European transition country in spite of its frequent changes of government. Furthermore, this article asks whether internal government stability can exist in a situation where the government changes frequently. What does stability of government mean, and what are the various facets of the term? Before the term can be analysed, it has to be clarified and defined. It is presumed that government stability is composed of multiple variables influencing one another. Data about the average tenure of a government are not very conclusive; rather, the deeper political causes of governmental change need to be examined. Therefore, this article first discusses the conceptual and theoretical basics of governmental stability. Secondly, it discusses the Estonian situation in detail up to the elections of 2003, including a short review of the ninth government since independence. In the conclusion, the author explains whether or not the governments of Estonia are stable. In the appendix, the reader finds all election results as well as a list of all previous ministers of Estonian governments (all data as of July 2002).
In reading, word frequency is commonly regarded as the major bottom-up determinant for the speed of lexical access. Moreover, language processing depends on top-down information, such as the predictability of a word from a previous context. However, the exact role of top-down predictions in visual word recognition is poorly understood: They may rapidly affect lexical processes, or alternatively, influence only late post-lexical stages. To add evidence about the nature of top-down processes and their relation to bottom-up information in the timeline of word recognition, we examined influences of frequency and predictability on event-related potentials (ERPs) in several sentence reading studies. The results were related to eye movements from natural reading as well as to models of word recognition. As a first and major finding, interactions of frequency and predictability on ERP amplitudes consistently revealed top-down influences on lexical levels of word processing (Chapters 2 and 4). Second, frequency and predictability mediated relations between N400 amplitudes and fixation durations, pointing to their sensitivity to a common stage of word recognition; further, larger N400 amplitudes entailed longer fixation durations on the next word, a result providing evidence for ongoing processing beyond a fixation (Chapter 3). Third, influences of presentation rate on ERP frequency and predictability effects demonstrated that the time available for word processing critically co-determines the course of bottom-up and top-down influences (Chapter 4). Fourth, at a near-normal reading speed, an early predictability effect suggested the rapid comparison of top-down hypotheses with the actual visual input (Chapter 5). The present results are compatible with interactive models of word recognition assuming that early lexical processes depend on the concerted impact of bottom-up and top-down information.
We offered a framework that reconciles the findings on a timeline of word recognition taking into account influences of frequency, predictability, and presentation rate (Chapter 4).
"In spite of ever-increasing research into natural hazards, the reported damage from natural disasters continues to rise, increasingly disrupting human activities. We, as scientists who study the way in which the part of Earth most relevant to society, the surface, behaves, are disturbed and frustrated by this trend. It appears that the large amounts of funding devoted each year to research into reducing the impacts of natural disasters could be much more effective in producing useful results. At the same time we are aware that society, as represented by its decision makers, while increasingly concerned at the impacts of natural disasters on lives and economies, is reluctant to acknowledge the intrinsic activity of Earth's surface and to take steps to adapt societal behaviour to minimise the impacts of natural disasters. Understanding and managing natural hazards and disasters are beyond matters of applied earth science, and also involve considering human societal, economic and political decisions."
On the existence of a non-zero lower bound for the number of Goldbach partitions of an even integer
(2002)
This paper analyses the macroeconomic developments which have taken place in the Bulgarian economy in the period 1993-1997. The paper also looks at the institutional arrangements and the process of economic policy-making in the country. In this context the problems the Bulgarian economy has experienced in the transition process towards a market-oriented economy are also studied. The paper proceeds as follows: Section 2 looks at the institutional arrangements and the process of economic policy-making through 1995. Section 3 studies the deep economic crisis in 1996 and points out what went wrong in that period. Section 4 continues studying the economic crisis of the Bulgarian economy as well as the problems in the transition process during the first half of 1997. Section 5 looks at the economic developments during the second half of 1997 and points to the prospects for growth in 1998. Section 6 deals with the Bulgarian financial institutions and the existing institutional arrangements. Finally, Section 7 concludes the paper.
Clones and hyperidentities
(1996)
Hyperequational theory
(1997)
Hyperidentities and clones
(2000)
The theory of hyperidentities generalises the equational theory of universal algebras and is applicable in several fields of science, especially in computer science. This book presents the theory of hyperidentities and its relation to clone identities. The basic concept of hypersubstitution is used to introduce the monoid of hypersubstitutions, hyperidentities, M-hyperidentities, and solid and M-solid varieties. This work integrates into a coherent framework many results scattered throughout the literature over the last eighteen years. In addition, the book contains some applications of hyperidentities to the functional completeness problem in multiple-valued logic. The general theory is also extended to partial algebras. The last chapter contains a list of exercises and open problems with suggestions for future work in this area of research.
We analyse different Gibbsian properties of interacting Brownian diffusions $X$ indexed by the lattice $\mathbb{Z}^{d}$: $X = (X_{i}(t),\, i \in \mathbb{Z}^{d},\, t \in [0, T],\, 0 < T < +\infty)$. In the first part, these processes are characterized as Gibbs states on path spaces of the form $C([0, T], \mathbb{R})^{\mathbb{Z}^{d}}$. In the second part, we study the Gibbsian character on $\mathbb{R}^{\mathbb{Z}^{d}}$ of $\nu^{t}$, the law at time $t$ of the infinite-dimensional diffusion $X(t)$, when the initial law $\nu = \nu^{0}$ is Gibbsian.
The dismembered bible
(2021)
It is often presumed that biblical redaction was invariably done using conventional scribal methods, meaning that when editors sought to modify or compile existing texts, they would do so in the process of rewriting them upon new scrolls. There is, however, substantial evidence pointing to an alternative scenario: Various sections of the Hebrew Bible appear to have been created through a process of material redaction. In some cases, ancient editors simply appended new sheets to existing scrolls. Other times, they literally cut and pasted their sources, carving out patches of text from multiple manuscripts and then gluing them together like a collage. Idan Dershowitz shows how this surprising technique left behind telltale traces in the biblical text - especially when the editors made mistakes - allowing us to reconstruct their modus operandi. Material evidence from the ancient Near East and elsewhere further supports his hypothesis.
Many Christians react with alarm when confronted with reincarnation. They tend to regard it as an alien or exotic idea, or sometimes even as an occult or dangerous teaching that leads away from the Christian path. Thus, belief in rebirth is often regarded as clearly not compatible with orthodox Christianity. However, no less than 30% of people in the Western world believe in a form of reincarnation, which indicates the urgency of an academic examination of this subject. Patrick Diemling examines under what conditions or restrictions a person who is attracted by the notion of reincarnation could at the same time remain fundamentally loyal to Christ. In a survey through the pivotal sections of Christian theology (such as soteriology, cosmology and eschatology), he investigates the critical points regarding the question of a possible compatibility of reincarnation with the Christian faith. What does the Bible say about reincarnation? What are the points of disagreement between orthodox Christians and defenders of the idea of rebirth? How would Christian theology need to be modified so as to integrate belief in reincarnation? The present volume tries to answer these questions.
As a non-contact process, laser beam melt ablation offers several advantages over conventional processing mechanisms. During ablation, the surface of the workpiece is melted by the energy of a CO2 laser beam; the melt is then driven out by the impulse of an additional process gas. Although the idea behind laser beam melt ablation is rather simple, the process has a major limitation in practical applications: with increasing ablation rate, the surface quality of the processed workpiece declines rapidly. Different ablation rates produce distinguishable surface structures, which can be characterised by suitable surface parameters. The corresponding regimes of pattern formation are found in the linear and non-linear statistical properties of the recorded process emissions as well. While the ablation rate can be represented in terms of the line energy, this parameter does not provide sufficient information about the full behaviour of the system. The dynamics of the system is dominated by oscillations due to the laser cycle but includes some periodically driven non-linear processes as well. On the basis of the measured time series, a corresponding model is developed. The deeper understanding of the process can be used to develop strategies for process control.
Duplicate detection is the task of identifying all groups of records within a data set that represent the same real-world entity. This task is difficult, because (i) representations might differ slightly, so some similarity measure must be defined to compare pairs of records, and (ii) data sets might have a high volume, making a pair-wise comparison of all records infeasible. To tackle the second problem, many algorithms have been suggested that partition the data set and compare record pairs only within each partition. One well-known such approach is the Sorted Neighborhood Method (SNM), which sorts the data according to some key and then advances a window over the data, comparing only records that appear within the same window. We propose several variations of SNM that have in common a varying window size and advancement. The general intuition behind such adaptive windows is that there might be regions of high similarity suggesting a larger window size and regions of lower similarity suggesting a smaller window size. We propose and thoroughly evaluate several adaptation strategies, some of which are provably better than the original SNM in terms of efficiency (same results with fewer comparisons).
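The windowing idea can be sketched as follows: the classic SNM with a fixed window, plus a toy adaptive variant that grows the window while the records at its boundary are still similar. This is a minimal illustration of the general idea only; the function names, the similarity predicate, and the particular adaptation rule are assumptions, not the adaptation strategies proposed in the report.

```python
def sorted_neighborhood(records, key, similar, window=3):
    """Classic SNM: sort by key, then compare each record only with the
    next (window - 1) records; return the candidate duplicate pairs."""
    srt = sorted(records, key=key)
    pairs = []
    for i in range(len(srt)):
        for j in range(i + 1, min(i + window, len(srt))):
            if similar(srt[i], srt[j]):
                pairs.append((srt[i], srt[j]))
    return pairs


def adaptive_snm(records, key, similar, min_window=2):
    """Toy adaptive variant: grow the window while the two records at its
    boundary are similar (a high-similarity region), then compare all
    pairs inside the window and advance to its end."""
    srt = sorted(records, key=key)
    pairs = []
    i = 0
    while i < len(srt) - 1:
        w = min_window
        # expand as long as the pair at the window boundary still matches
        while i + w - 1 < len(srt) and similar(srt[i + w - 2], srt[i + w - 1]):
            w += 1
        for a in range(i, min(i + w, len(srt)) - 1):
            for b in range(a + 1, min(i + w, len(srt))):
                if similar(srt[a], srt[b]):
                    pairs.append((srt[a], srt[b]))
        i += w - 1
    return pairs
```

In a low-similarity region the adaptive window stays at its minimum size, saving comparisons; in a high-similarity region it expands, so duplicates further apart in sort order are still found.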
This book deals with the inner life of the capitalist firm. There we find numerous conflicts, the most important of which concerns the individual employment relationship which is understood as a principal-agent problem between the manager, the principal, who issues orders that are to be followed by the employee, the agent. Whereas economic theory traditionally analyses this relationship from a (normative) perspective of the firm in order to support the manager in finding ways to influence the behavior of the employees, such that the latter – ideally – act on behalf of their superior, this book takes a neutral stance. It focusses on explaining individual behavioral patterns and the resulting interactions between the actors in the firm by taking sociological, institutional, and above all, psychological research into consideration. In doing so, insights are gained which challenge many assertions economists take for granted.
While offering significant expressive power, graph transformation systems often come with rather limited capabilities for automated analysis, particularly if systems with many possible initial graphs and large or infinite state spaces are concerned. One approach that tries to overcome these limitations is inductive invariant checking. However, the verification of inductive invariants often requires extensive knowledge about the system in question and faces the approach-inherent challenges of locality and lack of context.
To address that, this report discusses k-inductive invariant checking for graph transformation systems as a generalization of inductive invariants. The additional context acquired by taking multiple (k) steps into account is the key difference to inductive invariant checking and is often enough to establish the desired invariants without requiring the iterative development of additional properties.
To analyze possibly infinite systems in a finite fashion, we introduce a symbolic encoding for transformation traces using a restricted form of nested application conditions. As its central contribution, this report then presents a formal approach and algorithm to verify graph constraints as k-inductive invariants. We prove the approach's correctness and demonstrate its applicability by means of several examples evaluated with a prototypical implementation of our algorithm.
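The report works symbolically on graph transformation traces; purely to illustrate the underlying k-induction principle, here is an explicit-state sketch over a finite transition system (the base case over initial paths is omitted, and all names are invented):

```python
# Illustrative k-induction step check: every path of k consecutive
# invariant-satisfying states must lead only to invariant states.

def k_inductive(states, step, inv, k):
    """step(s) enumerates successors of s; inv is the candidate invariant."""
    def paths(length):
        # enumerate all paths of the given length through `step`
        result = [[s] for s in states]
        for _ in range(length - 1):
            result = [p + [t] for p in result for t in step(p[-1])]
        return result
    for path in paths(k + 1):
        if all(inv(s) for s in path[:-1]) and not inv(path[-1]):
            return False  # counterexample to k-inductiveness
    return True
```

With transitions 0→2, 1→1, 2→2 and the invariant s ≠ 2, the invariant is not 1-inductive (state 0 satisfies it but steps to 2), yet it is 2-inductive: state 0 has no predecessor, so requiring one step of prior context rules the spurious counterexample out. That extra context is exactly what the report exploits.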
Graph transformation systems are a powerful formal model for capturing model transformations and systems with infinite state spaces, among others. However, this expressive power comes at the cost of rather limited automated analysis capabilities. The general case of unboundedly many initial graphs or infinite state spaces is supported only by approaches with rather limited scalability or expressiveness. In this report, we improve an existing approach for the automated verification of inductive invariants for graph transformation systems. By employing partial negative application conditions to represent and check many alternative conditions in a more compact manner, we can check examples with rules and constraints of substantially higher complexity. We also substantially extend the expressive power by supporting more complex negative application conditions, and we provide higher accuracy by employing advanced implication checks. The improvements are evaluated and compared with another applicable tool on three case studies.
The correctness of model transformations is a crucial element for model-driven engineering of high quality software. In particular, behavior preservation is the most important correctness property avoiding the introduction of semantic errors during the model-driven engineering process. Behavior preservation verification techniques either show that specific properties are preserved, or more generally and complex, they show some kind of behavioral equivalence or refinement between source and target model of the transformation. Both kinds of behavior preservation verification goals have been presented with automatic tool support for the instance level, i.e. for a given source and target model specified by the model transformation. However, up until now there is no automatic verification approach available at the transformation level, i.e. for all source and target models specified by the model transformation.
In this report, we extend our results presented in [27] and outline a new, sophisticated approach for the automatic verification of behavior preservation, captured by bisimulation or simulation, for model transformations specified by triple graph grammars with semantic definitions given by graph transformation rules. In particular, we show that the behavior preservation problem can be reduced to invariant checking for graph transformation, and that the resulting checking problem can be addressed by our own invariant checker, even for a complex example in which a sequence chart is transformed into communicating automata. We further discuss today's limitations of invariant checking for graph transformation and motivate further lines of future work in this direction.
For interactive construction of CSG models, understanding the layout of a model is essential for its efficient manipulation. To understand the position and orientation of a CSG model's aggregated components, we need to realize its visible and occluded parts as a whole. Hence, transparency and enhanced outlines are key techniques to assist comprehension. We present a novel real-time rendering technique for visualizing the design and spatial assembly of CSG models. As enabling technology, we combine an image-space CSG rendering algorithm with blueprint rendering. Blueprint rendering applies depth peeling to extract layers of ordered depth from polygonal models and then composes them in sorted order, facilitating clear insight into the models. We develop a solution for implementing depth peeling for CSG models that takes their depth complexity into account. Capturing the surface colors of each layer and later combining the results allows us to generate order-independent transparency as one major rendering technique for CSG models. We further define visually important edges for CSG models and integrate an image-space edge-enhancement technique for detecting them in each layer. In this way, we extract visually important edges that are directly or indirectly visible in order to outline a model's layout. Combining edges with transparency rendering finally generates edge-enhanced depictions of image-based CSG models and allows us to realize their complex spatial assembly.
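The compositing step can be sketched on the CPU for a single pixel. This is not the paper's GPU implementation, only an illustration of how depth-peeled layers, once sorted front to back, yield order-independent transparency via the standard "over" operator:

```python
# Composite per-pixel color layers front to back with the "over" operator.

def composite_front_to_back(layers):
    """layers: list of (r, g, b, alpha) tuples, sorted front to back,
    as depth peeling would produce them for one pixel."""
    color = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light still passing through
    for r, g, b, a in layers:
        for i, c in enumerate((r, g, b)):
            color[i] += transmittance * a * c
        transmittance *= (1.0 - a)
    return tuple(color), transmittance
```

Because depth peeling delivers the layers already sorted, this accumulation is correct regardless of the order in which the geometry was drawn, which is what makes the transparency order-independent.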
Little is known about how far-reaching decisions in UN Security Council sanctions committees are made. Developing a novel committee governance concept and using examples drawn from sanctions imposed on Iraq, Al-Qaida, Congo, Sudan and Iran, this book shows that Council members tend to follow the will of the powerful, whereas sanctions committee members often decide according to the rules. This is surprising since both Council and committees are staffed by the same member states.
Offering a fascinating account of Security Council micro-politics and decision-making processes on sanctions, this rigorous, comparative and theory-driven analysis treats the Council and its sanctions committees as distinguishable entities that may differ in decision practice despite having the same members. Drawing extensively on primary documents, diplomatic cables, well-informed press coverage, reports by close observers and extensive interviews with committee members, Council diplomats and sanctions experts, its findings contrast with the conventional wisdom on decision-making within these bodies, which holds that the powerful permanent members would not accept rule-based decisions against their interests.
This book will be of interest to policy practitioners and scholars working in the broad field of international organizations and international relations theory as well as those specializing in sanctions, international law, the Security Council and counter-terrorism.
Learning from failure
(2022)
Regression testing is a widespread practice in today's software industry to ensure software product quality. Developers derive a set of test cases, and execute them frequently to ensure that their change did not adversely affect existing functionality. As the software product and its test suite grow, the time to feedback during regression test sessions increases, and impedes programmer productivity: developers wait longer for tests to complete, and delays in fault detection render fault removal increasingly difficult.
Test case prioritization addresses the problem of long feedback loops by reordering test cases, such that test cases with a high failure probability run first and test case failures become actionable early in the testing process. We ask: given test execution schedules reconstructed from publicly available data, to what extent can their fault detection efficiency be improved, and which technique yields the most efficient test schedules with respect to APFD (Average Percentage of Faults Detected)?
To this end, we recover 6,200 regression test sessions from the build log files of Travis CI, a popular continuous integration service, and gather 62,000 accompanying changelists. We evaluate the efficiency of current test schedules, and examine the prioritization results of state-of-the-art lightweight, history-based heuristics. We propose and evaluate a novel set of prioritization algorithms, which connect software changes and test failures in a matrix-like data structure.
Our studies indicate that the optimization potential is substantial, because the existing test plans score only 30% APFD. The predictive power of past test failures proves to be outstanding: simple heuristics, such as repeating tests with failures in recent sessions, result in efficiency scores of 95% APFD. The best-performing matrix-based heuristic achieves a similar score of 92.5% APFD. In contrast to prior approaches, we argue that matrix-based techniques are useful beyond the scope of effective prioritization, and enable a number of use cases involving software maintenance.
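The metric and the simplest history-based heuristic can be sketched as follows. This is a minimal illustration of the standard APFD formula and of "run recently failing tests first", not the report's implementation; all names and data are invented.

```python
# APFD for a single schedule, plus a recent-failure prioritization heuristic.

def apfd(schedule, failing):
    """Average Percentage of Faults Detected.
    `failing` maps each fault to the test that detects it."""
    n, m = len(schedule), len(failing)
    position = {t: i + 1 for i, t in enumerate(schedule)}
    return 1 - sum(position[t] for t in failing.values()) / (n * m) + 1 / (2 * n)

def prioritize_by_recent_failures(tests, history):
    """Order tests by how recently they failed, most recent first;
    `history` maps a test to the index of its last failing session."""
    return sorted(tests, key=lambda t: history.get(t, -1), reverse=True)
```

On a four-test schedule where only the last two tests fail, the original order scores 0.25 APFD, while moving the recently failing tests to the front raises it to 0.75, which mirrors the large gap between the observed 30% and the 95% achievable with simple heuristics.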
We validate our findings from continuous integration processes by extending a continuous testing tool within development environments with means of test prioritization, and we pose further research questions. We think that our findings are suited to propel the adoption of (continuous) testing practices, and that programmers' toolboxes should contain test prioritization as an essential productivity tool.
Postcolonial literature denotes the national anglophone literatures of the Americas, Asia, Africa and Oceania (at times also called the New English Literatures). Because of the authors' migratory movements, however, a presentation by region is not feasible. The volume therefore treats the central themes of the postcolonial debate, each of which concerns authors from several regions.
Despite its many challenges and limitations, the concept of in situ upgrading of informal settlements has become one of the most favoured approaches to the housing crisis in the 'Global South'. Due to its inherent principles of incremental in situ development, prevention of relocations, protection of local livelihoods, and democratic participation and cooperation, this approach is often perceived to be more sustainable than other housing approaches that rely on quantitative housing delivery and top-down planning methodologies. While this study does not question the benefits of the in situ upgrading approach, it seeks to identify problems in its practical implementation within a specific national and local context. The study discusses the origin and importance of this approach on the basis of a review of international housing policy development and analyses the broader political and social context of the incorporation of this approach into South African housing policy. It further uses insights from a recent case study in Cape Town to determine complications and conflicts that can arise when applying in situ upgrading of informal settlements in a complex local context. On that basis, the benefits and limitations of the in situ upgrading approach are specified and prerequisites for its successful implementation are formulated.
Language developers who design domain-specific languages or new language features need a way to make fast changes to language definitions. Those fast changes require immediate feedback. Also, it should be possible to parse the developed languages quickly to handle extensive sets of code.
Parsing expression grammars (PEGs) provide an easy-to-understand formalism for language definitions. Packrat parsing is a method to parse grammars of this kind, but it cannot handle left recursion properly. Existing solutions either rewrite some left-recursive rules and forbid the others, or they use complex, costly extensions to packrat parsing that are hard to understand. We investigated methods to make parsing as fast as possible, using easy-to-follow algorithms, while not losing the ability to make fast changes to grammars.
We focused our efforts on two approaches.
One is to start from an existing technique for limited left-recursion rewriting and enhance it to work for general left-recursive grammars. The second is to design a grammar compilation process that finds left recursion before parsing, thereby reducing computational costs wherever possible and generating ready-to-use parser classes.
Rewriting parsing expression grammars is a task that, if done in a general way, unveils so many cases that any rewriting algorithm surpasses the complexity of other left-recursive parsing algorithms. Lookahead operators introduce this complexity. However, most languages have only small portions that are left-recursive, and in virtually all cases they have no indirect or hidden left recursion. This means that separating the left-recursive parts of a grammar from the non-left-recursive ones holds great improvement potential for existing parsers.
In this report, we list all the steps required to handle left recursion by grammar rewriting, including grammar analysis, the grammar rewriting itself, and syntax tree restructuring. We also describe the implementation of a parsing expression grammar framework in Squeak/Smalltalk and its possible interactions with the already existing parser Ohm/S. We benchmarked this framework quantitatively, focusing on parsing time and on its usability in a live programming context. Compared with Ohm, we achieved massive parsing time improvements while preserving the ability to use our parser as a live programming tool.
This work is essential because, for one, we outline the difficulties and complexity that come with grammar rewriting. We also remove the existing limitations that came with left recursion by eliminating it before parsing.
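The classic rewriting of direct left recursion (A ← A a / b becomes A ← b A′ with A′ ← a A′ / ε) can be sketched as follows. This is an illustration of the textbook transformation only; it handles neither indirect/hidden left recursion nor lookahead operators, which is exactly where the complexity discussed above arises. The grammar encoding is invented for the example.

```python
# Rewrite directly left-recursive rules; a grammar is a dict mapping a
# rule name to a list of alternatives, each a list of symbols.

def rewrite_direct_left_recursion(grammar):
    out = {}
    for name, alternatives in grammar.items():
        recursive = [alt[1:] for alt in alternatives if alt and alt[0] == name]
        rest = [alt for alt in alternatives if not alt or alt[0] != name]
        if not recursive:
            out[name] = alternatives
            continue
        tail = name + "'"
        out[name] = [alt + [tail] for alt in rest]       # A  <- b A'
        out[tail] = [alt + [tail] for alt in recursive]  # A' <- a A' / ε
        out[tail].append([])                             # [] encodes ε
    return out
```

Note that after parsing with the rewritten rules, the syntax tree must be restructured to restore the left-associative shape the original rule implied, which is the tree-restructuring step listed above.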
Business processes are instrumental for managing work in organisations. To study the interdependencies between business processes, Business Process Architectures (BPAs) have been introduced. These express trigger and message flow relations between business processes. When we investigate real-world Business Process Architectures, we find complex interdependencies involving multiple process instances. These aspects have not been studied in detail so far, especially concerning correctness properties. In this paper, we propose a modular transformation of BPAs to open nets for the analysis of behavior involving multiple business processes with multiplicities. For this purpose, we introduce intermediary nets to portray the semantics of multiplicity specifications. We evaluate our approach on a use case from the public sector.
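The target formalism can be sketched minimally. This is not the paper's transformation, only an illustration of place/transition-net firing with multiset markings, where token multiplicity models multiple process instances; the place and token names are invented.

```python
# Minimal place/transition-net firing with multiset (Counter) markings.
from collections import Counter

def enabled(marking, consume):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in consume.items())

def fire(marking, consume, produce):
    """Fire a transition: remove consumed tokens, add produced ones."""
    if not enabled(marking, consume):
        raise ValueError("transition not enabled")
    result = Counter(marking)
    result.subtract(consume)
    result.update(produce)
    return +result  # unary plus drops zero-count places
```

Starting from two tokens on a place such as `order_received` (two running instances) and firing a transition that consumes one of them while producing a token on `invoice_sent` leaves one instance still waiting, which is the kind of multiplicity behavior the intermediary nets make explicit.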
Literature on the move
(2003)
Writing-between-worlds
(2016)
The first section of this paper presents a methodological introduction to consumer price statistics in Georgia. The second section gives a general idea of the development of consumer prices from January 1994 to September 1999. A detailed regional analysis is added in section 3. The fourth section analyses the development of consumer prices for the eight main groups included in the total CPI. Section 5 compares the changes in the Georgian CPI with the movements of foreign exchange rates of the Georgian lari. The paper ends with a summary, including a brief outlook on the coming years.