By means of an intriguing physical example, magnetic surface swimmers, which can be described in terms of Dennett's intentional stance, I reconstruct a hierarchy of necessary and sufficient conditions for the applicability of the intentional strategy. It turns out that the different levels of the intentional hierarchy are contextually emergent from their respective subjacent levels through the imposition of stability constraints upon them. At the lowest level of the hierarchy, phenomenal physical laws emerge for the coarse-grained description of open, nonlinear, dissipative non-equilibrium systems in critical states. One level higher, dynamic patterns, such as magnetic surface swimmers, are contextually emergent, as they are invariant under certain symmetry operations. Another level up, these patterns behave apparently rationally by selecting optimal pathways for the dissipation of energy delivered by external gradients, in accordance with the restated Second Law of thermodynamics as a stability criterion. At the highest level, true believers are intentional systems that remain stable under an exchange of their observation conditions.

Syntactic theory provides a rich array of representational assumptions about linguistic knowledge and processes. Such detailed and independently motivated constraints on grammatical knowledge ought to play a role in sentence comprehension. However, most grammar-based explanations of processing difficulty in the literature have attempted to use grammatical representations and processes per se to explain processing difficulty. They did not take into account that the description of higher cognition in mind and brain encompasses two levels: at the macrolevel, symbolic computation is performed, while at the microlevel, computation is achieved through processes within a dynamical system. One critical question is therefore how linguistic theory and dynamical systems can be unified to provide an explanation for processing effects. Here, we present such a unification for a particular account of syntactic theory, namely a parser for Stabler's Minimalist Grammars, in the framework of Smolensky's Integrated Connectionist/Symbolic architectures. In simulations we demonstrate that the connectionist minimalist parser produces predictions that mirror global empirical findings from psycholinguistic research.
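The macrolevel/microlevel split described above can be illustrated with the central device of Smolensky's architecture, the tensor product representation: a symbolic structure of filler/role pairs is embedded as a sum of outer products, so that symbolic bindings live as vector operations in a dynamical substrate. The following is a minimal, self-contained sketch of that idea only; the symbols, vector dimensions, and function names are illustrative assumptions, not the parser described in the abstract.

```python
import numpy as np

# Orthonormal role vectors (e.g., positions in a string) -- an assumption
# made here so that unbinding is exact.
roles = np.eye(3)

# Random filler vectors for three hypothetical syntactic symbols.
rng = np.random.default_rng(0)
fillers = {"C": rng.normal(size=4), "DP": rng.normal(size=4), "V": rng.normal(size=4)}

def bind(structure):
    """Embed a list of (filler, role_index) pairs as one tensor:
    the sum of outer products filler (x) role."""
    return sum(np.outer(fillers[f], roles[r]) for f, r in structure)

def unbind(tensor, r):
    """Recover the filler bound to role r; exact because the roles
    are orthonormal."""
    return tensor @ roles[r]

# Bind a toy structure and read one constituent back out.
T = bind([("C", 0), ("DP", 1), ("V", 2)])
recovered = unbind(T, 1)
assert np.allclose(recovered, fillers["DP"])
```

With non-orthogonal or noisy roles, unbinding yields only an approximation of the filler, which is one way graded, dynamics-level effects can coexist with discrete symbolic descriptions.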

In most experiments using event-related brain potentials (ERPs), there is a straightforward way to define, on theoretical grounds, which of the conditions tested is the experimental condition and which is the control condition. If, however, theoretical assumptions do not give sufficient and unambiguous information to decide this question, then the interpretation of an ERP effect becomes difficult, especially if one takes into account that certain effects can appear either as a positivity or as a negativity, both on the basis of the morphology of the pattern and with respect to peak latency (consider, for example, the N400 and the P345). Using an ERP experiment on language processing as an example, we present such a critical case and offer a possible solution on the basis of nonlinear data analysis. We show that a generalized polarity histogram, the word statistics of symbolic dynamics, is in principle able to distinguish negative-going ERP components from positive-going ones when an appropriate encoding strategy, the half-wave encoding, is employed. We propose statistical criteria which allow one to determine ERP components on purely methodological grounds.
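The encoding step mentioned above can be sketched in a few lines: a signal is segmented into half waves (maximal runs of constant sign), each half wave is mapped to a polarity symbol, and word statistics are then computed over the resulting symbolic sequence. This is a hedged toy illustration of the general idea, assuming a plain sign-based segmentation; the function names and the word length are choices made here, not the authors' analysis pipeline.

```python
from collections import Counter

import numpy as np

def half_wave_encode(x):
    """Map a signal to one symbol per half wave: '+' for a run of
    positive samples, '-' for a run of negative samples."""
    symbols = []
    for s in np.sign(x):
        if s == 0:
            continue  # skip exact zero crossings
        sym = "+" if s > 0 else "-"
        if not symbols or symbols[-1] != sym:
            symbols.append(sym)
    return "".join(symbols)

def word_statistics(seq, n=2):
    """Count the length-n words occurring in the symbolic sequence
    (a generalized polarity histogram)."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

# Toy 'ERP': three oscillation cycles yield six alternating half waves.
t = np.linspace(0, 2 * np.pi, 200)
code = half_wave_encode(np.sin(3 * t))
hist = word_statistics(code)
```

A genuinely negative-going component would skew such word counts toward words dominated by '-', which is the kind of asymmetry the statistical criteria in the abstract are meant to detect.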

We describe a part of the stimulus sentences of a German language-processing ERP experiment using a context-free grammar and represent different processing preferences by its unambiguous partitions. The processing is modeled by deterministic pushdown automata. Using a theorem proven by Moore, we map these automata onto discrete-time dynamical systems acting on the unit square, where the processing preferences are represented by a control parameter. The actual states of the automata are rectangles lying in the unit square that can be interpreted as cylinder sets in the context of symbolic dynamics theory. We show that applying a wrong processing preference to a certain input string leads to an unwanted invariant set in the parser's dynamics. Syntactic reanalysis and repair can then be modeled by a switching of the control parameter, in analogy to phase transitions observed in brain dynamics. We argue that ERP components are indicators of these bifurcations and propose an ERP-like measure of the parsing model.
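The encoding that makes automaton states into rectangles can be sketched as follows: a pushdown automaton configuration (stack contents, remaining input) is mapped to a point in the unit square by reading each symbol string as a base-g expansion, so all configurations sharing a prefix land in a common rectangle, i.e., a cylinder set. This is a minimal sketch of the Gödelization idea behind Moore's construction, under assumptions made here (two-symbol alphabets, a particular symbol ordering), not the paper's actual parser.

```python
def encode(word, alphabet):
    """Read a symbol string as a base-g expansion in [0, 1),
    where g is the alphabet size."""
    g = len(alphabet)
    index = {a: i for i, a in enumerate(alphabet)}
    return sum(index[sym] * g ** (-k) for k, sym in enumerate(word, start=1))

stack_alphabet = ["Z", "A"]   # assumed stack symbols, bottom marker 'Z'
input_alphabet = ["a", "b"]   # assumed input symbols

# One configuration: stack "AZ" (top symbol first), remaining input "ab".
point = (encode("AZ", stack_alphabet), encode("ab", input_alphabet))

# Every configuration whose stack top is 'A' falls in the vertical strip
# [1/2, 1) x [0, 1): a cylinder set fixed by the first symbol.
assert 0.5 <= point[0] < 1.0
```

Under this encoding, automaton transitions (push, pop, read) become piecewise-affine maps on such rectangles, which is what allows a control parameter over the map to stand in for a processing preference.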