To meet the rapidly increasing demand for computational resources from various scientific and engineering applications, one has to find new ways to make more efficient use of the world's existing computational resources. In this respect, the growing speed of wide area networks has made a new kind of distributed computing possible: Metacomputing, or (distributed) Grid computing. This is a rather new and uncharted field in computational science. The rapidly increasing speed of networks even outpaces the average increase in processor speed: processor speeds double on average every 18 months, whereas network bandwidths double every 9 months. Owing to this development of local and wide area networks, Grid computing will certainly play a key role in the future of parallel computing. This type of distributed computing, however, differs from traditional parallel computing in many ways, since it has to deal with many problems that do not occur in classical parallel computing, for example heterogeneity, authentication, and slow networks, to mention only a few. Some of these problems, e.g. the allocation of distributed resources along with the provision of information about these resources to the application, have already been addressed by the Globus software. Unfortunately, as far as we know, hardly any application or middleware software takes advantage of this information, since most parallelizing algorithms for finite-differencing codes are implicitly designed for execution on a single supercomputer or cluster. We show that although it is possible to apply classical parallelizing algorithms in a Grid environment, in most cases the observed efficiency of the executed code is very poor. In this work we close this gap.
In our thesis, we will
- show that the execution of classical parallel codes in Grid environments is possible but very slow
- analyze the causes of this poor performance, pinpoint communication bottlenecks, and remove unnecessary overhead and other sources of low performance
- develop new, advanced parallelization algorithms that are aware of the Grid environment, generalizing the traditional parallelization schemes
- implement and test these new methods, replacing the classical ones and comparing against them
- introduce dynamic strategies that automatically adapt the running code to the nature of the underlying Grid environment.
The higher the performance one can achieve for a single application by manual tuning for a Grid environment, the lower the chance that those changes are widely applicable to other programs. In our analysis as well as in our implementation, we tried to keep the balance between high performance and generality. None of our changes directly affects code at the application level, which makes our algorithms applicable to a whole class of real-world applications. The implementation of our work is done within the Cactus framework using the Globus toolkit, since we consider these the most reliable and advanced programming frameworks for supporting computations in Grid environments. On the other hand, however, we tried to be as general as possible, i.e. all methods and algorithms discussed in this thesis are independent of Cactus and Globus.
The prevalence of Achilles tendinopathy increases with age, leading to a weaker tendon predisposed to rupture. Conclusive evidence of the influence of age and pathology on Achilles tendon (AT) properties remains limited, as previous studies are based on standardized isometric conditions. This study investigates the influence of age and pathology on AT properties during a single-leg vertical jump (SLVJ). 10 children (C), 10 asymptomatic adults (A), and 10 tendinopathic patients (T) were included. AT elongation [mm] from rest to maximal displacement during a SLVJ on a force plate was assessed sonographically. AT compliance [mm/N] and strain [%] were calculated by dividing elongation by peak ground reaction force [N] and by resting length, respectively. One-way ANOVA followed by Bonferroni post-hoc correction (α=0.05) was used to compare C with A and A with T. AT elongation (p=0.004), compliance (p=0.001), and strain were found to be statistically significantly higher in C (27 +/- 3 mm, 0.026 +/- 0.006 mm/N, 13 +/- 2%) than in A (21 +/- 4 mm, 0.017 +/- 0.005 mm/N, 10 +/- 2%). No statistically significant differences (p>0.05) were found between A and T (25 +/- 5 mm, 0.019 +/- 0.004 mm/N, 12 +/- 3%). During SLVJ, the tendon responded differently with regard to age and pathology, with children having the most compliant AT. The higher compliance found in healthy tendons might be considered a protective factor against load-related injuries.
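The two derived measures follow directly from their definitions above; a minimal sketch (the input values below are illustrative, not the study's raw data):

```python
def compliance_mm_per_N(elongation_mm: float, peak_grf_N: float) -> float:
    """Tendon compliance: elongation divided by peak ground reaction force."""
    return elongation_mm / peak_grf_N

def strain_percent(elongation_mm: float, resting_length_mm: float) -> float:
    """Tendon strain: elongation as a percentage of resting tendon length."""
    return 100.0 * elongation_mm / resting_length_mm

# Hypothetical values for one jump (not measured data):
c = compliance_mm_per_N(27.0, 1200.0)  # -> 0.0225 mm/N
s = strain_percent(27.0, 210.0)        # elongation relative to resting length
```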
In this study, the synthesis of new 5-(2-X-phenyl)-N,N-dimethyl-2H-tetrazole-2-carboxamides (X = H and Cl) is reported, together with an investigation of their dynamic 1H NMR behavior arising from rotation about the C-N bonds in the urea moiety [a; CO-NMe2] in DMSO (298-373 K). Accordingly, activation free energies of 17.32 and 17.50 kcal mol(-1) were obtained for X = H and Cl, respectively, for the conformational isomerization about the Me2N-C=O bond (a rotation). Moreover, the a and b [b; 2-tetrazolyl-CO] barriers to rotation in 5-(2-X-phenyl)-N,N-dimethyl-2H-tetrazole-2-carboxamides were also calculated by the B3LYP/6-311++G** procedure. The optimized geometry parameters agree well with the X-ray data. Computed rotational energy barriers (X = Cl) for a and b were estimated to be 17.52 and 2.53 kcal mol(-1), respectively, the former in agreement with the dynamic NMR results. The X-ray structures confirm that only the 2-acylated tetrazoles are formed in the case of 5-(2-X-phenyl)-N,N-dimethyl-2H-tetrazole-2-carboxamides. A planar trigonal orientation of the Me2N group, coplanar with the carbonyl group and indicative of partial C-N double-bond character, was established by the X-ray data. This also illustrates the syn-periplanar position of the tetrazolyl ring relative to the C=O group. In solution, the planes containing the tetrazolyl ring and the carbonyl bond are almost perpendicular to each other (because of steric effects, as confirmed by calculations), while the planes containing the carbonyl bond and the Me2N group are coplanar. This behavior contrasts with that of similar urea derivatives and explains the unusually high rotational energy barrier of these compounds. (C) 2020 Elsevier B.V. All rights reserved.
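For context (standard dynamic-NMR methodology, not stated in the abstract): free-energy barriers of this magnitude are conventionally obtained from temperature-dependent exchange rate constants $k$ via the Eyring equation,

```latex
k \;=\; \frac{k_{\mathrm{B}} T}{h}\,
\exp\!\left(-\frac{\Delta G^{\ddagger}}{RT}\right)
\quad\Longleftrightarrow\quad
\Delta G^{\ddagger} \;=\; RT \ln\!\left(\frac{k_{\mathrm{B}} T}{h\,k}\right),
```

where $k_{\mathrm{B}}$ is Boltzmann's constant, $h$ Planck's constant, and $R$ the gas constant, with $k$ extracted from line-shape changes over the stated 298-373 K range.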
RangeShifter 2.0
(2021)
Process-based models are increasingly used as tools for understanding how species are likely to respond to environmental changes and to potential management options. RangeShifter is one such modelling platform, which has been used to address a range of questions including identifying effective reintroduction strategies, understanding patterns of range expansion and assessing population viability of species across complex landscapes. Here we introduce a new version, RangeShifter 2.0, which incorporates important new functionality. It is now possible to simulate dynamics over user-specified, temporally changing landscapes. Additionally, we integrated a new genetic module, notably introducing an explicit genetic modelling architecture, which allows for simulation of neutral and adaptive genetic processes. Furthermore, emigration, transfer and settlement traits can now all evolve, allowing for sophisticated simulation of the evolution of dispersal. We illustrate the potential application of RangeShifter 2.0's new functionality with two examples. The first illustrates the range expansion of a virtual species across a dynamically changing UK landscape. The second demonstrates how the software can be used to explore the concept of evolving connectivity in response to land-use modification, by examining how movement rules come under selection over landscapes of different structure and composition. RangeShifter 2.0 is built using object-oriented C++, providing computationally efficient simulation of complex individual-based, eco-evolutionary models. The code has been redeveloped to enable use across operating systems, including on high-performance computing clusters, and the Windows graphical user interface has been enhanced. RangeShifter 2.0 will facilitate the development of in-silico assessments of how species will respond to environmental changes and to potential management options for conserving or controlling them.
By making the code available open source, we hope to inspire further collaborations and extensions by the ecological community.
In many revenue management applications, risk-averse decision-making is crucial. In dynamic settings, however, it is challenging to find the right balance between maximizing expected rewards and minimizing various kinds of risk. In existing approaches, utility functions, chance constraints, or (conditional) value-at-risk considerations are used to influence the distribution of rewards in a preferred way. Nevertheless, common techniques are not flexible enough and are typically numerically complex. In our model, we exploit the fact that a distribution is characterized by its mean and higher moments. We present a multi-valued dynamic programming heuristic to compute risk-sensitive feedback policies that are able to directly control the moments of future rewards. Our approach is based on recursive formulations of higher moments and does not require an extension of the state space. Finally, we propose a self-tuning algorithm, which makes it possible to identify feedback policies that approximate predetermined (risk-sensitive) target distributions. We illustrate the effectiveness and the flexibility of our approach for different dynamic pricing scenarios. (C) 2020 Elsevier Ltd. All rights reserved.
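The idea of recursing on moments rather than extending the state space can be conveyed in a toy setting (our own sketch, not the paper's model): if the per-period reward r is independent of the reward-to-go R, then E[r + R] = E[r] + E[R] and E[(r + R)^2] = E[r^2] + 2 E[r] E[R] + E[R^2], which can be evaluated backwards in time:

```python
def reward_moments(prices):
    """Backward recursion for the first two moments of cumulative reward
    under a fixed price path. Toy demand model (an assumption, not from
    the paper): a sale occurs with probability q(p) = max(0, 1 - p/10),
    earning reward p; sales in different periods are independent."""
    m1, m2 = 0.0, 0.0                       # moments of the reward-to-go
    for p in reversed(prices):
        q = max(0.0, 1.0 - p / 10.0)
        er, er2 = q * p, q * p * p          # E[r], E[r^2] for this period
        # tuple assignment: RHS uses the *old* m1, as the recursion requires
        m1, m2 = er + m1, er2 + 2.0 * er * m1 + m2
    variance = m2 - m1 ** 2
    return m1, m2, variance

mean, second, var = reward_moments([5.0, 5.0])  # -> (5.0, 37.5, 12.5)
```

Because per-period sales are independent in this toy model, the variance of two identical periods is exactly twice that of one, which makes the recursion easy to check by hand.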
In many businesses, firms sell different types of products that share mutual substitution effects in demand. Computing effective pricing strategies is challenging, as the sales probabilities of each of a firm's products can also be affected by the prices of potential substitutes. In this paper, we analyze stochastic dynamic multi-product pricing models for the sale of perishable goods. To circumvent the limitations of time-consuming optimal solutions for highly complex models, we propose different relaxation techniques, which make it possible to reduce the size of critical model components, such as the state space, the action space, or the set of potential sales events. Our heuristics decrease the size of those components by forming corresponding clusters and using subsets of representative elements. Using numerical examples, we verify that our heuristics make it possible to dramatically reduce the computation time while still obtaining close-to-optimal expected profits. Further, we show that our heuristics are (i) flexible, (ii) scalable, and (iii) can be arbitrarily combined in a mutually supportive way.
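The flavor of such relaxations can be illustrated with a deliberately simple, single-product example (our own sketch, not the paper's heuristic): coarsen a fine action grid to a small set of representative prices and compare the best achievable single-period expected profit, assuming a linear sale probability q(p) = max(0, 1 - p/10):

```python
def expected_profit(p):
    """Single-period expected profit under an assumed linear demand model."""
    q = max(0.0, 1.0 - p / 10.0)   # sale probability (our assumption)
    return q * p

fine_grid = [i / 100.0 for i in range(1001)]   # 1001 candidate prices in [0, 10]
coarse_grid = [i / 2.0 for i in range(21)]     # 21 representative prices

best_fine = max(expected_profit(p) for p in fine_grid)
best_coarse = max(expected_profit(p) for p in coarse_grid)
# With ~2% of the actions, the coarse grid still attains the optimum
# (both grids contain the maximizer p = 5.0 of this toy profit curve).
```

In the actual multi-product setting the savings compound, since the action space grows exponentially in the number of products, while a well-chosen set of representatives can stay small.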
Was ist HipHop?
(2021)
This dissertation is an investigative research study dealing with the dynamically changing phenomenon of HipHop. The author explains the enduring attractiveness of the cultural phenomenon of HipHop and attempts to account more precisely for its continual reproducibility. He therefore begins with a historical discourse analysis of HipHop culture, analyzing the forms, the protagonists, and the discourses of HipHop in order to understand it better. By working out HipHop's genuine property of multiple codability, common explanatory patterns from academia and the media are put into perspective and criticized. In his study, the author combines literature from cultural studies and educational science with a variety of current and historical depictions and images. Above all, image-based self-presentations of HipHoppers and personal testimonies from narrative interviews that he himself conducted with various HipHoppers in Germany are analyzed. Alongside the narrative interviews, image interpretation following Bohnsack serves as the principal source for forming the thesis of multiple codability. Two images of the HipHoppers Lady Bitch Ray and Kollegah are interpreted following Bohnsack (2014), showing how HipHop is staged and produced not only lyrically and sonically but also visually. From this it is concluded that within HipHop it is possible to represent and convey contrary viewpoints while simultaneously employing typical cultural practices such as boasting. HipHop's continual openness becomes evident through practices such as sampling or the battle, and the author explains that these techniques produce the generative property of multiple codability.
He thus advocates a kind of construction-kit theory, which holds that, in principle, anyone can draw from the HipHop kit according to preference, interest, and affinity. The variety of opinions on HipHop that the author obtains by coding the narrative interviews underscores this thesis and makes clear that HipHop is more than just a fashion. Through the openness it carries within itself, HipHop has the fundamental capacity to continually reinvent itself and thereby grow in popularity. The present work thus extends the ever-growing body of research in HipHop studies and sets important accents for further research and for making HipHop better understood.