The present study aimed to integrate findings from technology acceptance research with research on applicant reactions to new technology for the emerging selection procedure of asynchronous video interviewing. One hundred six volunteers experienced asynchronous video interviewing and filled out several questionnaires, including one on the applicants' personalities. In line with previous technology acceptance research, the data revealed that perceived usefulness and perceived ease of use predicted attitudes toward asynchronous video interviewing. Furthermore, openness was found to moderate the relation between perceived usefulness and attitudes toward this particular selection technology. No significant effects emerged for computer self-efficacy, job interview self-efficacy, extraversion, neuroticism, and conscientiousness. Theoretical and practical implications are discussed.
It has been proposed that in online sentence comprehension the dependency between a reflexive pronoun such as himself/herself and its antecedent is resolved using exclusively syntactic constraints. Under this strictly syntactic search account, Principle A of the binding theory, which requires that the antecedent c-command the reflexive within the same clause in which the reflexive occurs, constrains the parser's search for an antecedent. The parser thus ignores candidate antecedents that might match agreement features of the reflexive (e.g., gender) but are ineligible as potential antecedents because they are in structurally illicit positions. An alternative possibility accords no special status to structural constraints: in addition to using Principle A, the parser also uses non-structural cues such as gender to access the antecedent. According to cue-based retrieval theories of memory (e.g., Lewis and Vasishth, 2005), the use of non-structural cues should result in increased retrieval times and occasional errors when candidates partially match the cues, even if the candidates are in structurally illicit positions. In this paper, we first show how the retrieval processes that underlie reflexive binding are naturally realized in the Lewis and Vasishth (2005) model. We present the predictions of the model under the assumption that both structural and non-structural cues are used during retrieval, and provide a critical analysis of previous empirical studies that failed to find evidence for the use of non-structural cues, suggesting that these failures may be Type II errors. We use this analysis and the results of further modeling to motivate a new empirical design that we use in an eye-tracking study. The results of this study confirm the key predictions of the model concerning the use of non-structural cues, and are inconsistent with the strictly syntactic search account.
These results present a challenge for theories advocating the infallibility of the human parser in the case of reflexive resolution, and provide support for the inclusion of agreement features such as gender in the set of retrieval cues.
Do properties of individual languages shape the mechanisms by which they are processed? By virtue of their non-concatenative morphological structure, the recognition of complex words in Semitic languages has been argued to rely strongly on morphological information and on decomposition into root and pattern constituents. Here, we report results from a masked priming experiment in Hebrew in which we contrasted verb forms belonging to two morphological classes, Paal and Piel, which display similar properties, but crucially differ on whether they are extended to novel verbs. Verbs from the open-class Piel elicited familiar root priming effects, but verbs from the closed-class Paal did not. Our findings indicate that, similarly to other (e.g., Indo-European) languages, down-to-the-root decomposition in Hebrew does not apply to stems of non-productive verbal classes. We conclude that the Semitic word processor is less unique than previously thought: Although it operates on morphological units that are combined in a non-linear way, it engages the same universal mechanisms of storage and computation as those seen in other languages.
Background: The goal of this study was to estimate the prevalence of and risk factors for diagnosed depression in heart failure (HF) patients in German primary care practices.
Methods: This study was a retrospective database analysis in Germany utilizing the Disease Analyzer® database (IMS Health, Germany). The study population included 132,994 patients between 40 and 90 years of age from 1,072 primary care practices. The observation period was between 2004 and 2013. Follow-up lasted up to five years and ended in April 2015. A total of 66,497 HF patients were selected after applying exclusion criteria. An equal number of controls (66,497) was chosen and matched (1:1) to HF patients on the basis of age, sex, health insurance, depression diagnosis in the past, and follow-up duration after the index date.
Results: HF was a strong risk factor for diagnosed depression (p < 0.0001). A total of 10.5% of HF patients and 6.3% of matched controls developed depression after one year of follow-up (p < 0.001). Depression was documented in 28.9% of the HF group and 18.2% of the control group after the five-year follow-up (p < 0.001). Cancer, dementia, osteoporosis, stroke, and osteoarthritis were associated with a higher risk of developing depression. Male gender and private health insurance were associated with lower risk of depression.
Conclusions: The risk of diagnosed depression is significantly increased in patients with HF compared to patients without HF in primary care practices in Germany.
Changes in free symptom attributions in hypochondriasis after cognitive therapy and exposure therapy
(2016)
Background: Cognitive-behavioural therapy can change dysfunctional symptom attributions in patients with hypochondriasis. Past research has used forced-choice answer formats, such as questionnaires, to assess these misattributions; however, with this approach, idiosyncratic attributions cannot be assessed. Free associations are an important complement to existing approaches that assess symptom attributions. Aims: With this study, we contribute to the current literature by using an open-response instrument to investigate changes in freely associated attributions after exposure therapy (ET) and cognitive therapy (CT) compared with a wait list (WL). Method: The current study is a re-examination of a formerly published randomized controlled trial (Weck, Neng, Richtberg, Jakob and Stangier, 2015) that investigated the effectiveness of CT and ET. Seventy-three patients with hypochondriasis were randomly assigned to CT, ET or a WL, and completed a 12-week treatment (or waiting period). Before and after the treatment or waiting period, patients completed an Attribution task in which they had to spontaneously attribute nine common bodily sensations to possible causes in an open-response format. Results: Compared with the WL, both CT and ET reduced the frequency of somatic attributions regarding severe diseases (CT: Hedges's g = 1.12; ET: Hedges's g = 1.03) and increased the frequency of normalizing attributions (CT: Hedges's g = 1.17; ET: Hedges's g = 1.24). Only CT changed the attributions regarding moderate diseases (Hedges's g = 0.69). Changes in somatic attributions regarding mild diseases and psychological attributions were not observed. Conclusions: Both CT and ET are effective for treating freely associated misattributions in patients with hypochondriasis. This study supplements research that used a forced-choice assessment.
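The abstract above reports treatment effects as Hedges's g, a standardized mean difference with a small-sample bias correction. As a minimal illustrative sketch (the group means, SDs, and sample sizes below are hypothetical, not taken from the study), the statistic can be computed like this:

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d on the pooled SD)
    multiplied by the small-sample correction J = 1 - 3/(4*(n1+n2) - 9)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical change scores for a treatment group vs. a wait-list group
g = hedges_g(mean1=1.8, sd1=1.0, n1=25, mean2=0.6, sd2=1.1, n2=24)
print(round(g, 2))
```

For the moderate group sizes typical of such trials, the correction factor J is close to 1, so g differs only slightly from Cohen's d.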
Much research on language control in bilinguals has relied on the interpretation of the costs of switching between two languages. Of the two types of costs that are linked to language control, switching costs are assumed to be transient in nature and modulated by trial-specific manipulations (e.g., by preparation time), while mixing costs are supposed to be more stable and less affected by trial-specific manipulations. The present study investigated the effect of preparation time on switching and mixing costs, revealing that both types of costs can be influenced by trial-specific manipulations.
Rhythm perception is assumed to be guided by a domain-general auditory principle, the Iambic/Trochaic Law, stating that sounds varying in intensity are grouped as strong-weak, and sounds varying in duration are grouped as weak-strong. Recently, Bhatara et al. (2013) showed that rhythmic grouping is influenced by native language experience, French listeners having weaker grouping preferences than German listeners. This study explores whether L2 knowledge and musical experience also affect rhythmic grouping. In a grouping task, French late learners of German listened to sequences of coarticulated syllables varying in either intensity or duration. Data on their language and musical experience were obtained by a questionnaire. Mixed-effect model comparisons showed influences of musical experience as well as L2 input quality and quantity on grouping preferences. These results imply that adult French listeners' sensitivity to rhythm can be enhanced through L2 and musical experience.
Background: Dementia is a psychiatric condition whose development is associated with numerous aspects of life. Our aim was to estimate dementia risk factors in German primary care patients.
Methods: The case-control study included primary care patients (70-90 years) with first diagnosis of dementia (all-cause) during the index period (01/2010-12/2014) (Disease Analyzer, Germany), and controls without dementia matched (1:1) to cases on the basis of age, sex, type of health insurance, and physician. Practice visit records were used to verify that there had been 10 years of continuous follow-up prior to the index date. Multivariate logistic regression models were fitted with dementia as a dependent variable and the potential predictors.
Results: The mean age for the 11,956 cases and the 11,956 controls was 80.4 (SD: 5.3) years; 39.0% of them were male and 1.9% had private health insurance. In the multivariate regression model, the following variables were significantly associated with an increased risk of dementia: diabetes (OR: 1.17; 95% CI: 1.10-1.24), lipid metabolism (1.07; 1.00-1.14), stroke incl. TIA (1.68; 1.57-1.80), Parkinson's disease (PD) (1.89; 1.64-2.19), intracranial injury (1.30; 1.00-1.70), coronary heart disease (1.06; 1.00-1.13), mild cognitive impairment (MCI) (2.12; 1.82-2.48), and mental and behavioral disorders due to alcohol use (1.96; 1.50-2.57). Use of statins (OR: 0.94; 0.90-0.99), proton-pump inhibitors (PPI) (0.93; 0.90-0.97), and antihypertensive drugs (0.96; 0.94-0.99) was associated with a decreased risk of developing dementia.
Conclusions: The risk factors for dementia found in this study are consistent with the literature. Nevertheless, the associations between statin, PPI, and antihypertensive drug use and a decreased risk of dementia require further investigation.
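The odds ratios and 95% confidence intervals reported above are the standard output of multivariate logistic regression. As a hedged sketch of the underlying arithmetic (the coefficient and standard error below are hypothetical values, chosen only to yield numbers on the scale of the estimates reported), a coefficient β and its standard error translate into an OR with a Wald 95% CI as follows:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    or_point = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return or_point, lower, upper

# Hypothetical coefficient for a binary predictor (e.g., a comorbidity flag)
or_point, lower, upper = odds_ratio_ci(beta=0.157, se=0.030)
print(f"OR: {or_point:.2f}; 95% CI: {lower:.2f}-{upper:.2f}")
```

Because the CI is computed on the log-odds scale and then exponentiated, it is asymmetric around the OR, and an interval whose lower bound touches 1.00 corresponds to a p-value at the 0.05 boundary.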
Background: Given the well-established association between perceived stress and quality of life (QoL) in dementia patients and their partners, our goal was to identify whether relationship quality and dyadic coping would operate as mediators between perceived stress and QoL.
Methods: 82 dyads of dementia patients and their spousal caregivers were included in a cross-sectional assessment from a prospective study. QoL was assessed with the Quality of Life in Alzheimer's Disease scale (QoL-AD) for dementia patients and the WHO Quality of Life-BREF for spousal caregivers. Perceived stress was measured with the Perceived Stress Scale (PSS-14). Both partners were assessed with the Dyadic Coping Inventory (DCI). Analyses of correlation as well as regression models including mediator analyses were performed.
Results: We found negative correlations between stress and QoL in both partners (QoL-AD: r = -0.62; p < 0.001; WHO-QOL Overall: r = -0.27; p = 0.02). Spousal caregivers had a significantly lower DCI total score than dementia patients (p < 0.001). Dyadic coping was a significant mediator of the relationship between stress and QoL in spousal caregivers (z = 0.28; p = 0.02), but not in dementia patients. Likewise, relationship quality significantly mediated the relationship between stress and QoL in caregivers only (z = -2.41; p = 0.02).
Conclusions: This study identified dyadic coping as a mediator of the relationship between stress and QoL in (caregiving) partners of dementia patients. In patients, however, we found a direct negative effect of stress on QoL. The findings suggest the importance of stress-reducing and dyadic interventions for dementia patients and their partners, respectively.
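The mediation z statistics reported above are typically obtained with a Sobel-type test on the indirect effect a·b. As a minimal sketch (the path coefficients and standard errors below are hypothetical, not values from this study):

```python
import math

def sobel_z(a, sa, b, sb):
    """Sobel test for an indirect effect a*b, where a is the path from
    predictor to mediator and b the path from mediator to outcome
    (controlling for the predictor); sa and sb are their standard errors."""
    se_ab = math.sqrt(b**2 * sa**2 + a**2 * sb**2)
    return (a * b) / se_ab

# Hypothetical paths: stress -> dyadic coping (a), dyadic coping -> QoL (b)
z = sobel_z(a=-0.40, sa=0.12, b=0.50, sb=0.15)
print(round(z, 2))
```

The sign of z follows the sign of the product a·b, which is why mediation z values can be negative, as in the relationship-quality result above; |z| > 1.96 corresponds to p < 0.05.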
Several personality dispositions with common features capturing sensitivities to negative social cues have recently been introduced into psychological research. To date, however, little is known about their interrelations, their conjoint effects on behavior, or their interplay with other risk factors. We asked N = 349 adults from Germany to rate their justice, rejection, moral disgust, and provocation sensitivity, hostile attribution bias, trait anger, and forms and functions of aggression. The sensitivity measures were mostly positively correlated, particularly those with an egoistic focus (such as victim justice, rejection, and provocation sensitivity, hostile attributions, and trait anger) and those with an altruistic focus (such as observer justice, perpetrator justice, and moral disgust sensitivity). The sensitivity measures had independent and differential effects on forms and functions of aggression when considered simultaneously and when controlling for hostile attributions and anger. They could not be integrated into a single factor of interpersonal sensitivity or reduced to other well-known risk factors for aggression. The sensitivity measures, therefore, require consideration in predicting and preventing aggression.
Background: Deception can distort psychological tests on socially sensitive topics. Understanding the cerebral processes involved in such faking can be useful for detecting and preventing deception. Previous research shows that faking a brief implicit association test (BIAT) evokes a characteristic ERP response. It is not yet known whether temporarily available self-control resources moderate this response. We randomly assigned 22 participants (15 females, 24.23 ± 2.91 years old) to a counterbalanced repeated-measurements design. Participants first completed a Brief-IAT (BIAT) on doping attitudes as a baseline measure and were then instructed to fake a negative doping attitude both when self-control resources were depleted and when they were not. Cerebral activity during BIAT performance was assessed using high-density EEG.
Results: Compared to the baseline BIAT, event-related potentials showed a first interaction at the parietal P1, while significant post hoc differences were found only at the later-occurring late positive potential. Here, significantly decreased amplitudes were recorded for 'normal' faking, but not in the depletion condition. In source space, enhanced activity was found for 'normal' faking in the bilateral temporoparietal junction. Behaviorally, participants faked the BIAT successfully in both conditions.
Conclusions: Results indicate that temporarily available self-control resources do not affect overt faking success on a BIAT. However, differences were found at the electrophysiological level. This indicates that, while self-control resources play a negligible role in deliberate test faking at the phenotypical level, the underlying cerebral processes are markedly different.
Aim: We aimed to identify patient characteristics and comorbidities that correlate with the initial exercise capacity of cardiac rehabilitation (CR) patients, and to study the significance of patient characteristics, comorbidities, and training methods for training achievements and the final fitness of CR patients.
Methods: We studied 557 consecutive patients (51.7 ± 6.9 years; 87.9% men) admitted to a three-week in-patient CR. Cardiopulmonary exercise testing (CPX) was performed at discharge. Exercise capacity (watts) at entry, gain in training volume, and final physical fitness (assessed by peak O2 utilization (VO2peak)) were analysed using analysis of covariance (ANCOVA) models.
Results: Mean training intensity was 90.7 ± 9.7% of maximum heart rate (81% continuous/19% interval training; 64% additional strength training). A total of 12.2 ± 2.6 bicycle exercise training sessions were performed. Training volume increased by an average of more than 100% (difference between end and beginning of CR: 784 ± 623 watt-minutes). In the multivariate model, the gain in training volume was significantly associated with smoking, age, and exercise capacity at entry to CR. The physical fitness level achieved at discharge from CR, as assessed by VO2peak, was mainly dependent on age, but also on various factors related to training, namely exercise capacity at entry, increase of training volume, and training method.
Conclusion: CR patients were trained in line with current guidelines at moderate-to-high intensity and achieved a considerable increase in their training volume. The physical fitness level achieved at discharge from CR depended on various factors associated with training, which supports the recommendation that CR should be offered to all cardiac patients.
Editorial
(2016)
Infants start learning the prosodic properties of their native language before 12 months, as shown by the emergence of a trochaic bias in English-learning infants between 6 and 9 months (Jusczyk et al., 1993) and in German-learning infants between 4 and 6 months (Hohle et al., 2009, 2014), while French-learning infants do not show a bias at 6 months (Hohle et al., 2009). This language-specific emergence of a trochaic bias is supported by the fact that English and German are languages with trochaic predominance in their lexicons, while French is a language with phrase-final lengthening but lacking lexical stress. We explored the emergence of a trochaic bias in bilingual French/German infants, to study whether the developmental trajectory would be similar to that of monolingual infants and whether the amount of relative exposure to the two languages has an impact on the emergence of the bias. Accordingly, we replicated Hohle et al. (2009) with 24 bilingual 6-month-olds learning French and German simultaneously. All infants had been exposed to both languages for 30 to 70% of the time from birth. Using the Head Preference Procedure, infants were presented with two lists of stimuli, one made up of several occurrences of the pseudoword /GAba/ with word-initial stress (trochaic pattern), the second made up of several occurrences of the pseudoword /gaBA/ with word-final stress (iambic pattern). The stimuli were recorded by a native German female speaker. Results revealed that these French/German bilingual 6-month-olds have a trochaic bias (as evidenced by a preference to listen to the trochaic pattern). Hence, their listening preference is comparable to that of monolingual German-learning 6-month-olds, but differs from that of monolingual French-learning 6-month-olds, who did not show any preference (Hohle et al., 2009). Moreover, the size of the trochaic bias in the bilingual infants was not correlated with their amount of exposure to German.
The present results thus establish that the development of a trochaic bias in simultaneous bilinguals is not delayed compared to monolingual German-learning infants (Hohle et al., 2009) and is rather independent of the amount of exposure to German relative to French.
Drugs as instruments
(2016)
Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, however, empirical investigations of individuals' motivation for NE have been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of the substances used for NE. In the empirical study we explored user behavior by analyzing relationships between drug options (use of over-the-counter products, prescription drugs, or illicit drugs) and postulated drug instrumentalization goals (e.g., improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users are aiming to achieve a certain goal and choose their drug accordingly, or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared "goal × drug option" configuration. Our results indicate, first, that individuals' decisions about NE are ultimately based on personal attitudes toward drug options (e.g., willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g., fighting tiredness) for which different drug options might be tried. Second, data analyses suggested two qualitatively different classes of users.
Both predominantly used over-the-counter products, but "neuroenhancers" might be characterized by a higher propensity to instrumentalize over-the-counter products for virtually all investigated goals whereas "fatigue-fighters" might be inclined to use over-the-counter products exclusively to fight fatigue. We believe that psychological investigations like these are essential, especially for designing programs to prevent risky behavior.
Calcularis is a computer-based training program which focuses on basic numerical skills, spatial representation of numbers and arithmetic operations. The program includes a user model allowing flexible adaptation to the child's individual knowledge and learning profile. The study design to evaluate the training comprises three conditions (Calcularis group, waiting control group, spelling training group). One hundred and thirty-eight children from second to fifth grade participated in the study. Training duration comprised a minimum of 24 training sessions of 20 min within a time period of 6-8 weeks. Compared to the group without training (waiting control group) and the group with an alternative training (spelling training group), the children of the Calcularis group demonstrated a higher benefit in subtraction and number line estimation with medium to large effect sizes. Therefore, Calcularis can be used effectively to support children in arithmetic performance and spatial number representation.
Effects of resistance training in youth athletes on muscular fitness and athletic performance
(2016)
During the stages of long-term athlete development (LTAD), resistance training (RT) is an important means for (i) stimulating athletic development, (ii) tolerating the demands of long-term training and competition, and (iii) inducing long-term health-promoting effects that are robust over time and track into adulthood. However, there is a gap in the literature with regard to optimal RT methods during LTAD and how RT is linked to biological age. Thus, the aims of this scoping review were (i) to describe and discuss the effects of RT on muscular fitness and athletic performance in youth athletes, (ii) to introduce a conceptual model on how to appropriately implement different types of RT within LTAD stages, and (iii) to identify research gaps in the existing literature by deducing implications for future research. In general, RT produced small-to-moderate effects on muscular fitness and athletic performance in youth athletes, with muscular strength showing the largest improvement. Free-weight, complex, and plyometric training appear to be well-suited to improve muscular fitness and athletic performance. In addition, balance training appears to be an important preparatory (facilitating) training program during all stages of LTAD, but particularly during the early stages. As youth athletes become more mature, the specificity and intensity of RT methods increase. This scoping review identified research gaps that are summarized in the following and should be addressed in future studies: (i) to elucidate the influence of gender and biological age on the adaptive potential following RT in youth athletes (especially in females), (ii) to describe RT protocols in more detail (i.e., always report stress- and strain-based parameters), and (iii) to examine neuromuscular and tendomuscular adaptations following RT in youth athletes.
Previous research on the interplay between static manual postures and visual attention revealed enhanced visual selection near the hands (near-hand effect). During active movements there is also superior visual performance when moving toward, compared to away from, the stimulus (direction effect). The "modulated visual pathways" hypothesis argues that differential involvement of the magno- and parvocellular visual processing streams causes the near-hand effect. The key finding supporting this hypothesis is an increase in temporal and a reduction in spatial processing in near-hand space (Gozli et al., 2012). Since this hypothesis has, so far, only been tested with static hand postures, we provide a conceptual replication of Gozli et al.'s (2012) result with moving hands, thus also probing the generality of the direction effect. Participants performed temporal or spatial gap discriminations while their right hand was moving below the display. In contrast to Gozli et al. (2012), temporal gap discrimination was superior at intermediate rather than near hand proximity. In spatial gap discrimination, a direction effect without a hand proximity effect suggests that pragmatic attentional maps overshadowed temporal/spatial processing biases for far/near-hand space.