Many perceptual and cognitive tasks permit or require the integrated cooperation of specialized sensory channels, detectors, or other functionally separate units. In compound detection or discrimination tasks, one prominent general mechanism for modeling the combination of the outputs of different processing channels is probability summation. The classical example is the binocular summation model of Pirenne (1943), according to which a weak visual stimulus is detected if at least one of the two eyes detects it; as we review briefly, exactly the same reasoning is applied in numerous other fields. It is generally accepted that this mechanism necessarily predicts performance based on two (or more) channels to be superior to single-channel performance, because two separate channels provide "two chances" to succeed at the task. We argue that this reasoning is misleading because it neglects the increased opportunity with two channels not just for hits but also for false alarms, and that there may well be no redundancy gain at all when performance is measured in terms of receiver operating characteristic (ROC) curves. We illustrate and support these arguments with a visual detection experiment involving different spatial uncertainty conditions. Our arguments and findings have important implications for all models that, in one way or another, rest on, or incorporate, the notion of probability summation for the analysis of detection tasks, two-alternative forced-choice tasks, and psychometric functions.
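The core of this argument can be made concrete with a short numerical sketch (our illustration, not the authors' analysis; the response rates and the equal-variance Gaussian sensitivity measure are hypothetical choices): under an "OR" combination rule, two independent channels raise the hit rate, but they raise the false-alarm rate as well, so any apparent redundancy gain must be judged against the full ROC rather than hit rates alone.

```python
from statistics import NormalDist

def or_rule(p: float) -> float:
    """Probability summation: P(at least one of two independent channels responds)."""
    return 1.0 - (1.0 - p) ** 2

def d_prime(hit: float, fa: float) -> float:
    """Equal-variance Gaussian sensitivity: z(H) - z(F)."""
    z = NormalDist().inv_cdf
    return z(hit) - z(fa)

p_hit, p_fa = 0.60, 0.20                    # hypothetical single-channel rates
h2, f2 = or_rule(p_hit), or_rule(p_fa)      # two-channel rates under the OR rule

print(f"two channels: H={h2:.2f}, F={f2:.2f}")   # hits rise to 0.84, but false alarms rise to 0.36
print(f"d' one channel:  {d_prime(p_hit, p_fa):.3f}")
print(f"d' two channels: {d_prime(h2, f2):.3f}")
```

Note that the OR rule moves the operating point along a different ROC curve rather than simply boosting hits, which is exactly why a hit-rate comparison alone can be misleading.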
We argue that the theories of Volokitin and Persson (2014 New J. Phys. 16 118001), Dedkov and Kyasov (2008 J. Phys.: Condens. Matter 20 354006), and Pieplow and Henkel (2013 New J. Phys. 15 023027) agree on the electromagnetic force on a small, polarizable particle that is moving parallel to a planar, macroscopic body, as far as the contribution of evanescent waves is concerned. The apparent differences are discussed in detail and explained by choices of units and integral transformations. We point out in particular the role of the Lorentz contraction in the procedure used by Volokitin and Persson, where a macroscopic body is 'diluted' to obtain the force on a small particle. Differences that appear in the contribution of propagating photons are briefly mentioned.
Removing spatial responses reveals spatial concepts even in a culture with mixed reading habits
(2014)
Embedded systems are ubiquitous in today's world, but they differ from traditional desktop systems in many aspects: predictable timing behavior (real-time), the management of scarce resources (memory, network), reliable communication protocols, energy management, special-purpose user interfaces (headless operation), system configuration, programming languages (to support software/hardware co-design), and modeling techniques. In this technical report, the authors present results from the lecture "Operating Systems for Embedded Computing" that was offered by the "Operating Systems and Middleware" group at HPI in the winter term 2013/14. The focus of the lecture and the accompanying projects was on principles of real-time computing. Students had the chance to gather practical experience with a number of different operating systems and applications and to present their experiences with near-hardware programming. The projects address the entire spectrum, from bare-metal programming to harnessing a real-time OS to exercising the full software/hardware co-design cycle. Three outstanding projects are at the heart of this technical report. Project 1 focuses on the development of a bare-metal operating system for the LEGO Mindstorms EV3. While still a toy, the EV3 comes with a powerful ARM processor, 64 MB of main memory, and standard interfaces such as Bluetooth, together with network protocol stacks. The EV3 runs a version of Linux; sources are available from Lego's web site. However, many devices and their driver software are proprietary and not well documented. Developing a new, bare-metal OS for the EV3 requires an understanding of the EV3 boot process. Since no standard input/output devices are available, the initial debugging steps are tedious. After managing these initial steps, the project was able to adapt device drivers for a few Lego devices to the extent that a demonstrator (the Segway application) could be run successfully on the new OS. Project 2 looks at the EV3 from a different angle.
The EV3 runs a fairly capable version of Linux; in principle, the RT_PREEMPT patch can turn any Linux system into a real-time OS by modifying the behavior of a number of synchronization constructs at the heart of the OS. Priority inversion is a problem that is solved by protocols such as priority inheritance or priority ceiling, and real-time OSes implement at least one of these protocols. The central idea of the project was to compare non-real-time and real-time variants of Linux on the EV3 hardware: a task set that shows the effects of priority inversion on standard EV3 Linux should operate flawlessly on a Linux version with the RT_PREEMPT patch applied. If only patching Lego's version of Linux were that easy... Project 3 takes the notion of real-time computing more seriously. Its application scenario was centered around our Carrera Digital 132 racetrack. Obtaining position information from the track, controlling individual cars, and detecting and modifying the Carrera Digital protocol required the design and implementation of custom controller hardware. What to implement in hardware, what in firmware, and what in application software: this was the central question addressed by the project.
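The priority-inversion scenario such a task set provokes can be sketched with a toy tick-based scheduler simulation (a simplified illustration, not RT_PREEMPT itself; the task names, priorities, and durations are hypothetical): a low-priority task L holds a lock that the high-priority task H needs, and a medium-priority task M preempts L, delaying H for as long as M runs. With priority inheritance, L temporarily runs at H's priority while holding the lock, so M cannot preempt it and H's blocking time is bounded by L's critical section.

```python
def h_lock_time(inheritance: bool, cs: int = 5, h_arrival: int = 2,
                m_arrival: int = 3, m_work: int = 10) -> int:
    """Return the tick at which H obtains the lock in a toy fixed-priority scheduler.

    L (priority 1) holds the lock for `cs` ticks of critical section.
    H (priority 3) arrives at `h_arrival` and blocks on the lock.
    M (priority 2) arrives at `m_arrival` and runs for `m_work` ticks.
    """
    prio = {"L": 1, "M": 2, "H": 3}
    done_cs = done_m = 0
    t = 0
    while True:
        ready = ["L"]                        # L holds the lock and is always ready
        if t >= m_arrival and done_m < m_work:
            ready.append("M")
        if t >= h_arrival and inheritance:
            prio["L"] = prio["H"]            # priority inheritance: L inherits H's priority
        # H itself is blocked on the lock, so it never competes for the CPU.
        running = max(ready, key=lambda task: prio[task])
        if running == "L":
            done_cs += 1
            if done_cs == cs:
                return t + 1                 # lock released; H acquires it
        else:
            done_m += 1                      # M preempts L and delays H indirectly
        t += 1

print("without inheritance:", h_lock_time(False))  # H waits through all of M's work
print("with inheritance:   ", h_lock_time(True))   # H waits only for L's critical section
```

With these (made-up) parameters, the unbounded-inversion case makes H wait for M's entire ten-tick burst, while priority inheritance caps H's wait at the length of L's critical section, which is the effect the RT_PREEMPT comparison was meant to demonstrate.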
In this paper we establish the existence of weak solutions of infinite-dimensional, shift-invariant stochastic differential equations driven by a Brownian term. The drift function is very general, in the sense that it is assumed to be neither small, nor continuous, nor Markov. On the initial law we only assume that it admits a finite specific entropy. Our result substantially improves on previous ones obtained for free dynamics with a small perturbative drift. The originality of our method lies in the use of the specific entropy as a tightness tool and in the description of such stochastic differential equations as solutions of a variational problem on the path space.
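A representative equation of the type studied might be written as follows (our notation as a sketch; the abstract itself does not display the equation):

```latex
\mathrm{d}X_t^i \;=\; b_i\big(t,\,(X_s)_{s\le t}\big)\,\mathrm{d}t \;+\; \mathrm{d}W_t^i,
\qquad i \in \mathbb{Z}^d,
```

where the \((W^i)\) are independent Brownian motions indexed over the lattice and the drift \(b\) is assumed neither small, nor continuous, nor Markov, so it may depend on the whole past of the solution.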
The zero-noise limit of differential equations with singular coefficients is investigated for the first time in the case where the noise is a general alpha-stable process. It is proved that extremal solutions are selected, and the probability of selection is computed. A detailed analysis of the characteristic function of an exit time from the half-line is performed, with a suitable decomposition into small and large jumps adapted to the singular drift.
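A standard model problem of this kind (our notation and choice of singular drift as an illustration; the abstract does not specify the equation) is

```latex
\mathrm{d}X_t^{\varepsilon} \;=\; b(X_t^{\varepsilon})\,\mathrm{d}t \;+\; \varepsilon\,\mathrm{d}L_t^{\alpha},
\qquad X_0^{\varepsilon} = 0,
\qquad b(x) = \operatorname{sign}(x)\,|x|^{\beta},\ \ \beta \in (0,1),
```

where \(L^{\alpha}\) is an \(\alpha\)-stable process. The deterministic equation \(\dot{x} = b(x)\) has non-unique solutions through the origin, and as \(\varepsilon \to 0\) the noise singles out the extremal ones with computable probabilities.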