Many perceptual and cognitive tasks permit or require the integrated cooperation of specialized sensory channels, detectors, or other functionally separate units. In compound detection or discrimination tasks, one prominent general mechanism to model the combination of the output of different processing channels is probability summation. The classical example is the binocular summation model of Pirenne (1943), according to which a weak visual stimulus is detected if at least one of the two eyes detects this stimulus; as we review briefly, exactly the same reasoning is applied in numerous other fields. It is generally accepted that this mechanism necessarily predicts performance based on two (or more) channels to be superior to single-channel performance, because two separate channels provide "two chances" to succeed with the task. We argue that this reasoning is misleading because it neglects the increased opportunity with two channels not just for hits but also for false alarms, and that there may well be no redundancy gain at all when performance is measured in terms of receiver operating characteristic curves. We illustrate and support these arguments with a visual detection experiment involving different spatial uncertainty conditions. Our arguments and findings have important implications for all models that, in one way or another, rest on, or incorporate, the notion of probability summation for the analysis of detection tasks, two-alternative forced-choice tasks, and psychometric functions.
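The core point of the argument above can be illustrated numerically. The following minimal sketch (with assumed single-channel rates, not the authors' experimental values) shows that probability summation over two independent channels raises the hit rate and the false-alarm rate:

```python
# Minimal sketch (assumed rates, not the authors' model or data): under
# probability summation, a two-channel observer responds "yes" if at
# least one channel does, so both hits AND false alarms become more
# likely; the gain in hits alone overstates any real redundancy benefit.

def prob_summation(p_single):
    """P(at least one of two independent channels fires)."""
    return 1.0 - (1.0 - p_single) ** 2

hit_1, fa_1 = 0.60, 0.10          # assumed single-channel hit / false-alarm rates
hit_2 = prob_summation(hit_1)     # 0.84
fa_2 = prob_summation(fa_1)       # 0.19

print(f"one channel:  hits={hit_1:.2f}, false alarms={fa_1:.2f}")
print(f"two channels: hits={hit_2:.2f}, false alarms={fa_2:.2f}")
```

Whether the two-channel ROC point (0.19, 0.84) actually represents better sensitivity than the one-channel point (0.10, 0.60) is exactly the question the abstract raises.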

We consider the specific transformation of a Wiener process {X(t), t >= 0} in the presence of an absorbing barrier a that results when this process is "time-locked" with respect to its first-passage time T_a through a criterion level a, and the evolution of X(t) is considered backwards (retrospectively) from T_a. Formally, we study the random variables defined by Y(t) = X(T_a - t) and derive explicit results for their density and mean, and also for their asymptotic forms. We discuss how our results can aid interpretations of time series "response-locked" to their times of crossing a criterion level.
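The time-locking construction is easy to illustrate by simulation. This sketch (assumed drift, barrier, and step size; it illustrates the definition of Y(t), not the paper's analytic results) simulates a discretized Wiener process until its first passage through a, then reads the path backwards from the crossing:

```python
import random

# Monte Carlo sketch (assumed parameters): simulate a drifting Wiener
# process until it first crosses the barrier a, then index the same path
# backwards from the crossing time T_a, i.e. Y(t) = X(T_a - t).

def time_locked_path(mu=0.3, sigma=1.0, a=2.0, dt=0.001, rng=None):
    rng = rng or random.Random(1)
    x, path = 0.0, [0.0]
    while x < a:
        # Euler step of the Wiener process with drift mu
        x += mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path[::-1]  # reversed path: Y(0) is the value at the crossing

y = time_locked_path()
print(f"Y(0)   = {y[0]:.3f} (value at the crossing, just above a = 2.0)")
print(f"Y(T_a) = {y[-1]:.3f} (the starting point)")
```

Averaging many such reversed paths at each lag t gives a Monte Carlo estimate of the mean of Y(t), which the paper derives in closed form.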

We present a new quantitative process model (GSDT) of visual search that seeks to integrate various processing mechanisms suggested by previous studies within a single, coherent conceptual frame. It incorporates and combines four distinct model components: guidance (G), a serial (S) item inspection process, diffusion (D) modeling of individual item inspections, and a strategic termination (T) rule. For this model, we derive explicit closed-form results for response probability and mean search time (reaction time [RT]) as a function of display size and target presence/absence. The fit of the model is compared in detail to data from four visual search experiments in which the effects of target/distractor discriminability and of target prevalence on performance (present/absent display size functions for mean RT and error rate) are studied. We describe how GSDT accounts for various detailed features of our results such as the probabilities of hits and correct rejections and their mean RTs; we also apply the model to explain further aspects of the data, such as RT variance and mean miss RT.
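The four GSDT ingredients can be mocked up in a toy trial simulation. This is a deliberately simplified sketch with assumed parameters (a discrete random walk in place of a continuous diffusion, a crude guidance rule, and a simple quit-after-k termination rule), not the paper's closed-form model:

```python
import random

# Toy sketch of the four GSDT ingredients (assumed parameters): guided
# ordering (G), serial inspection (S), a random-walk "diffusion" decision
# per item (D), and a strategic termination rule (T).

def inspect(is_target, drift=0.4, bound=10, rng=None):
    """Random walk for one item; returns (says_target, inspection_time)."""
    x, t = 0, 0
    p_up = 0.5 + (drift if is_target else -drift) / 2
    while abs(x) < bound:
        x += 1 if rng.random() < p_up else -1
        t += 1
    return x >= bound, t

def gsdt_trial(display_size=8, target_present=True, quit_after=None, rng=None):
    rng = rng or random.Random()
    items = [False] * display_size
    if target_present:
        items[rng.randrange(display_size)] = True
    # Guidance: the target tends (noisily) to be inspected earlier.
    items.sort(key=lambda it: rng.random() + (0.5 if it else 0.0), reverse=True)
    quit_after = quit_after or display_size        # termination rule
    rt = 0
    for it in items[:quit_after]:                  # serial inspections
        says_target, t = inspect(it, rng=rng)
        rt += t
        if says_target:
            return "present", rt
    return "absent", rt

resp, rt = gsdt_trial(rng=random.Random(7))
print(resp, rt)
```

Even this toy version reproduces the qualitative signature that the paper models quantitatively: target-absent RT grows steeply with display size because every item must be inspected, while guidance flattens the target-present slope.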

Delta plots (DPs) graphically compare reaction time (RT) quantiles obtained under two experimental conditions. In some research areas (e.g., Simon effects), negative-going (decreasing) delta plots (nDPs) have consistently been found, indicating that the experimental effect is largest at low quantiles and decreases for higher quantiles. nDPs are unusual and intriguing: they imply that RT in the faster condition is more variable, a pattern predicted by few standard RT models. We describe and analyze five classes of well-established latency mechanisms that are consistent with nDPs (exhaustive processing models, correlated stage models, mixture models, cascade models, and parallel channels models) and discuss the implications of our analyses for the interpretation of DPs. DPs generally do not imply any specific processing model; therefore, it is more fruitful to start from a specific quantitative model and to compare the DP it predicts with empirical data.
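How a delta plot is computed, and why lower variability in the slower condition produces a decreasing one, can be shown with simulated toy RTs (assumed normal distributions and parameters, not data from the paper):

```python
import random

# Sketch of delta-plot computation on simulated toy RTs (assumed
# parameters): at each quantile level, the effect (condition B minus
# condition A) is plotted against the mean of the two quantiles.

def quantiles(rts, probs):
    rts = sorted(rts)
    return [rts[min(int(p * len(rts)), len(rts) - 1)] for p in probs]

def delta_plot(rt_a, rt_b, probs=(0.1, 0.3, 0.5, 0.7, 0.9)):
    qa, qb = quantiles(rt_a, probs), quantiles(rt_b, probs)
    return [((a + b) / 2, b - a) for a, b in zip(qa, qb)]

# Condition B is slower on average but LESS variable than A, so the
# effect shrinks toward high quantiles: a negative-going delta plot.
rng = random.Random(0)
rt_a = [rng.gauss(450, 80) for _ in range(2000)]
rt_b = [rng.gauss(480, 40) for _ in range(2000)]
for x, d in delta_plot(rt_a, rt_b):
    print(f"mean RT quantile {x:6.1f} ms   effect {d:6.1f} ms")
```

Note that the causal arrow runs exactly as the abstract argues: the nDP here follows from a specific distributional assumption, not the other way around.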

We present three experiments in which observers searched for a target digit among distractor digits in displays in which the mean numerical target-distractor distance was varied. Search speed and accuracy increased with numerical distance in both target-present and target-absent trials (Exp. 1A). In Experiment 1B, the target 5 was replaced with the letter S. The results suggest that the findings of Experiment 1A do not simply reflect the fact that digits that were numerically closer to the target coincidentally also shared more physical features with it. In Experiment 2, the numerical distance effect increased with set size in both target-present and target-absent trials. These findings are consistent with the view that increasing numerical target-distractor distance affords faster nontarget rejection and target identification times. Recent neurobiological findings on the neuronal coding of numerosity (e.g., Nieder, 2011) report tuning curves of numerosity-selective neurons whose width suggests graded, distance-dependent coactivation of the representations of adjacent numbers; in visual search, such coactivation would make numerically closer distractors harder to reject as nontargets.
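The coactivation idea in the final sentence can be sketched with an assumed Gaussian tuning curve (the functional form and width here are illustrative assumptions, not the neurophysiological measurements cited):

```python
import math

# Illustrative sketch (assumed Gaussian tuning, hypothetical width): a
# detector tuned to the target number 5 is partially coactivated by
# distractor digits, the more so the numerically closer they are, which
# would slow their rejection as nontargets.

def coactivation(distractor, target=5, tuning_width=1.5):
    """Response of the target-tuned detector to a distractor digit."""
    return math.exp(-((distractor - target) ** 2) / (2 * tuning_width ** 2))

for d in (4, 3, 2, 1):  # numerical distances 1..4 from the target 5
    print(f"distractor {d}: coactivation {coactivation(d):.3f}")
```

The monotonic fall-off with numerical distance is what links the tuning-curve data to the distance effect on search times reported above.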

Comparing continuous and discrete birthday coincidences: "Same-Day" versus "Within 24 Hours"
(2010)

In its classical form the famous birthday problem (Feller 1968; Mosteller 1987) addresses coincidences within a discrete sample space, looking at births that fall on the same calendar day. However, coincidence phenomena often arise in situations in which it is more natural to consider a continuous-time parameter. We first describe an elementary variant of the classical problem in continuous time, and then derive and illustrate close approximate relations that exist between the discrete and the continuous formulations.
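The two formulations can be compared directly. The sketch below computes the exact discrete probability and a Monte Carlo estimate of the continuous one (assuming a 365-day year and uniformly distributed births; this is a simplification for illustration, not the paper's approximate relations):

```python
import random

# Discrete vs. continuous birthday coincidences (assumed 365-day year,
# uniform births). Discrete: exact P(some shared calendar day).
# Continuous: Monte Carlo estimate of P(some pair born within 24 hours).

def p_same_day(n, days=365):
    p_no_match = 1.0
    for i in range(n):
        p_no_match *= (days - i) / days
    return 1.0 - p_no_match

def p_within_24h(n, trials=20000, rng=None):
    rng = rng or random.Random(0)
    hits = 0
    for _ in range(trials):
        births = sorted(rng.uniform(0, 365) for _ in range(n))
        if any(b - a < 1.0 for a, b in zip(births, births[1:])):
            hits += 1
    return hits / trials

n = 23
print(f"same calendar day, n={n}: {p_same_day(n):.3f}")   # ~0.507
print(f"within 24 hours,  n={n}: {p_within_24h(n):.3f}")
```

The continuous probability comes out noticeably higher: two uniform birth times fall within 24 hours of each other roughly twice as often as they fall on the same calendar day, which is the kind of discrete-continuous relation the paper makes precise.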

Neuroscientific studies have shown that brain activity correlated with a decision to move can be observed before a person reports being consciously aware of having made that decision (e.g., Libet, Gleason, Wright, & Pearl, 1983; Soon, Brass, Heinze, & Haynes, 2008). Given that a later event (i.e., conscious awareness) cannot cause an earlier one (i.e., decision-related brain activity), such results have been interpreted as evidence that decisions are made unconsciously (e.g., Libet, 1985). We argue that this interpretation depends upon an all-or-none view of consciousness, and we offer an alternative interpretation of the early decision-related brain activity based on models in which conscious awareness of the decision to move develops gradually up to the level of a reporting criterion. Under this interpretation, the early brain activity reflects sub-criterion levels of awareness rather than complete absence of awareness and thus does not suggest that decisions are made unconsciously.

Aggregate and individual replication probability within an explicit model of the research process
(2011)

We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by obtaining either a statistically significant result in the same direction or any effect in that direction. We analyze both the probability of successfully replicating a particular experimental effect (i.e., the individual replication probability) and the average probability of successful replication across different studies within some research context (i.e., the aggregate replication probability), and we identify the conditions under which the latter can be approximated using the formulas of Killeen (2005a, 2007). We show how both of these probabilities depend on parameters of the research context that would rarely be known in practice. In addition, we show that the statistical uncertainty associated with the size of an initial observed effect would often prevent accurate estimation of the desired individual replication probability even if these research context parameters were known exactly. We conclude that accurate estimates of replication probability are generally unattainable.
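The aggregate replication probability in this model is straightforward to approximate by simulation. All parameter values below are hypothetical placeholders (the abstract's point is precisely that such research-context parameters are rarely known), and the sketch uses the weaker "any effect in the same direction" criterion rather than statistical significance:

```python
import random

# Monte Carlo sketch of the model described above, with hypothetical
# parameters: the true effect varies normally across studies in a
# research context, and original and replication measurements add
# normally distributed jitter and error to it. We estimate the aggregate
# probability that a replication shows an effect in the same direction
# as the original.

def aggregate_replication_prob(mu_context=0.3, sd_context=0.15,
                               sd_jitter=0.1, sd_error=0.2,
                               trials=50000, rng=None):
    rng = rng or random.Random(42)
    successes = 0
    for _ in range(trials):
        true_effect = rng.gauss(mu_context, sd_context)   # study-specific true effect
        original = true_effect + rng.gauss(0, sd_error)   # initial measurement
        replication = (true_effect + rng.gauss(0, sd_jitter)
                       + rng.gauss(0, sd_error))          # jittered replication
        if original > 0 and replication > 0:
            successes += 1
    return successes / trials

print(f"aggregate same-direction replication probability: "
      f"{aggregate_replication_prob():.3f}")
```

Varying `sd_context`, `sd_jitter`, and `sd_error` shows how strongly the answer depends on research-context parameters that, as the abstract notes, would rarely be known in practice.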