Frontiers
Objective: The aim of the present study was to examine the effect of Cold Water Immersion (CWI) on the recovery of physical performance, hematological stress markers and perceived wellness (i.e., Hooper scores) following a simulated Mixed Martial Arts (MMA) competition.
Methods: Participants completed two experimental sessions in a counterbalanced order (CWI or passive recovery as a control condition: CON) after a simulated MMA competition (3 x 5-min MMA rounds separated by 1 min of passive rest). During CWI, athletes were required to submerge their bodies, except the trunk, neck, and head, in a seated position in a temperature-controlled bath (~10°C) for 15 min. During CON, athletes remained seated for 15 min in the same room at ambient temperature. Venous blood samples (creatine kinase, cortisol, and testosterone concentrations) were collected at rest (PRE-EX, i.e., before the simulated competition), immediately following the competition (POST-EX), immediately following recovery (POST-R), and 24 h post-competition (POST-24), whilst physical fitness measures (squat jump, countermovement jump, and 5- and 10-m sprints) were collected at PRE-EX, POST-R, and POST-24, and perceptual measures (well-being Hooper index: fatigue, stress, delayed onset muscle soreness (DOMS), and sleep) at PRE-EX and POST-24.
Results: The main results indicate that POST-R sprint (5- and 10-m) performances were 'likely to very likely' impaired by prior CWI (d = 0.64 and 0.65, respectively). However, moderate improvements in 10-m sprint performance were 'likely' evident at POST-24 after CWI compared with CON (d = 0.53). Additionally, the use of CWI 'almost certainly' resulted in a large overall improvement in Hooper scores (d = 1.93). Specifically, CWI 'almost certainly' resulted in improved sleep quality (d = 1.36), stress (d = 1.56), and perceived fatigue (d = 1.51), and 'likely' resulted in a moderate decrease in DOMS (d = 0.60).
Conclusion: The use of CWI resulted in an enhanced recovery of 10-m sprint performance, as well as improved perceived wellness, 24 h following the simulated MMA competition.
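The effect sizes reported in this abstract are Cohen's d values (standardized mean differences). As a minimal sketch of how such a value is obtained — using the pooled-standard-deviation form and entirely hypothetical sprint-time data, not the study's measurements:

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    s_a, s_b = stdev(group_a), stdev(group_b)  # sample standard deviations
    pooled_sd = sqrt(((n_a - 1) * s_a ** 2 + (n_b - 1) * s_b ** 2) / (n_a + n_b - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical 10-m sprint times (s) at POST-24; lower is better,
# so a negative d here favours the CWI condition.
cwi_times = [1.92, 1.95, 1.90, 1.93, 1.96]
con_times = [1.98, 2.01, 1.97, 2.00, 2.02]
d = cohens_d(cwi_times, con_times)  # negative: CWI group was faster
```

Conventionally, |d| around 0.2, 0.5, and 0.8 are read as small, moderate, and large effects, which is how the abstract's d = 0.53 qualifies as "moderate".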
From a health- and performance-related perspective, it is crucial to evaluate subjective symptoms and objective signs of acute training-induced immunological responses in young athletes. The few available studies have focused on immunological adaptations following aerobic training; hardly any have examined resistance-training-induced stress responses. Therefore, the aim of this observational study was to investigate subjective symptoms and objective signs of immunological stress responses following resistance training in young athletes. Fourteen track and field athletes (7 females and 7 males) with a mean age of 16.4 years and without any symptoms of upper or lower respiratory tract infections participated in this study. Over a period of 7 days, subjective symptoms were assessed with the Acute Recovery and Stress Scale (ARSS), and objective signs of immunological responses were measured from capillary blood samples, each morning and after the last training session. Differences between morning and evening sessions and associations between subjective and objective parameters were analyzed using generalized estimating equations (GEE). In post hoc analyses, daily change-scores of the ARSS dimensions were compared between participants and related to specific changes in the capillary blood markers. In the GEE models, recovery (ARSS) decreased significantly while stress (ARSS) increased significantly between morning and evening training sessions. A concomitant increase in white blood cell count (WBC), granulocyte count (GRAN), and the percentage share of granulocytes (GRAN%) was found between morning and evening sessions. Of note, the percentage share of lymphocytes (LYM%) decreased significantly. Furthermore, using multivariate regression analyses, we identified that recovery was significantly associated with LYM%, while stress was significantly associated with WBC and GRAN%.
Post hoc analyses revealed significantly larger increases in the stress dimensions among participants who showed increases in GRAN%, and significantly larger decreases in the recovery dimensions among participants who showed decreases in LYM%. More specifically, daily change-scores of the recovery and stress dimensions of the ARSS were associated with specific changes in objective immunological markers (GRAN%, LYM%) between morning and evening training sessions. Our results indicate that changes in subjective symptoms of recovery and stress, as measured by the ARSS, were associated with specific changes in objectively measured immunological markers.
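The post hoc idea — pairing each athlete's morning-to-evening change in a subjective ARSS dimension with the change in an immunological marker — can be illustrated with a toy computation. This is a minimal stdlib sketch with invented numbers and a plain Pearson correlation, not the study's GEE analysis:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical morning-to-evening change-scores for 7 athletes
stress_change = [0.5, 1.0, 0.8, 1.5, 0.2, 1.1, 0.9]  # ARSS stress dimension
gran_change   = [2.0, 4.5, 3.1, 6.0, 0.5, 4.8, 3.5]  # GRAN% (percentage points)

r = pearson_r(stress_change, gran_change)  # positive: larger stress rise, larger GRAN% rise
```

GEE would additionally account for the repeated measurements nested within athletes across the 7 days; the sketch only shows the direction of association the post hoc analysis describes.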
The present study aimed to integrate findings from technology acceptance research with research on applicant reactions to new technology for the emerging selection procedure of asynchronous video interviewing. One hundred six volunteers experienced asynchronous video interviewing and filled out several questionnaires, including one on their personality. In line with previous technology acceptance research, the data revealed that perceived usefulness and perceived ease of use predicted attitudes toward asynchronous video interviewing. Furthermore, openness was found to moderate the relation between perceived usefulness and attitudes toward this particular selection technology. No significant effects emerged for computer self-efficacy, job-interview self-efficacy, extraversion, neuroticism, or conscientiousness. Theoretical and practical implications are discussed.
It has been proposed that in online sentence comprehension the dependency between a reflexive pronoun such as himself/herself and its antecedent is resolved using exclusively syntactic constraints. Under this strictly syntactic search account, Principle A of binding theory, which requires that the antecedent c-command the reflexive within the same clause in which the reflexive occurs, constrains the parser's search for an antecedent. The parser thus ignores candidate antecedents that might match agreement features of the reflexive (e.g., gender) but are ineligible as potential antecedents because they are in structurally illicit positions. An alternative possibility accords no special status to structural constraints: in addition to using Principle A, the parser also uses non-structural cues such as gender to access the antecedent. According to cue-based retrieval theories of memory (e.g., Lewis and Vasishth, 2005), the use of non-structural cues should result in increased retrieval times and occasional errors when candidates partially match the cues, even if those candidates are in structurally illicit positions. In this paper, we first show how the retrieval processes that underlie reflexive binding are naturally realized in the Lewis and Vasishth (2005) model. We present the predictions of the model under the assumption that both structural and non-structural cues are used during retrieval, and provide a critical analysis of previous empirical studies that failed to find evidence for the use of non-structural cues, suggesting that these failures may be Type II errors. We use this analysis and the results of further modeling to motivate a new empirical design that we use in an eye-tracking study. The results of this study confirm the key predictions of the model concerning the use of non-structural cues, and are inconsistent with the strictly syntactic search account.
These results present a challenge for theories advocating the infallibility of the human parser in the case of reflexive resolution, and provide support for the inclusion of agreement features such as gender in the set of retrieval cues.
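The cue-based retrieval mechanism at issue can be illustrated with a toy sketch. This is not the full Lewis and Vasishth ACT-R model (which uses continuous activations, noise, and latency equations); the candidate names, cue set, and equal cue weights below are invented for illustration. The point it shows: a structurally illicit candidate that matches the gender cue still receives partial activation, which is what predicts slowed and occasionally erroneous retrieval.

```python
# Toy cue-based retrieval: every candidate is scored against the full cue set,
# so illicit but feature-matching candidates produce interference.

CUES = {"c_command": True, "gender": "masc"}  # cues projected by e.g. "himself"

candidates = [
    {"name": "surgeon", "c_command": True,  "gender": "masc"},  # licit, full match
    {"name": "visitor", "c_command": False, "gender": "masc"},  # illicit, partial match
    {"name": "nurse",   "c_command": False, "gender": "fem"},   # illicit, no match
]

def activation(cand, cues):
    """One unit of activation per matching cue (equal weights, an assumption)."""
    return sum(cand[key] == value for key, value in cues.items())

scores = {c["name"]: activation(c, CUES) for c in candidates}
# "visitor" scores above "nurse" despite being equally illicit structurally:
# that partial match is the predicted source of interference.
```

A strictly syntactic search account would instead filter on the `c_command` cue first, leaving "visitor" and "nurse" indistinguishable from the parser's point of view.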
Understanding a sentence and integrating it into the discourse depends upon the identification of its focus, which, in spoken German, is marked by accentuation. In the case of written language, which lacks explicit cues to accent, readers have to draw on other kinds of information to determine the focus. We study the joint or interactive effects of two kinds of information that have no direct representation in print but have each been shown to be influential in the reader's text comprehension: (i) the (low-level) rhythmic-prosodic structure that is based on the distribution of lexically stressed syllables, and (ii) the (high-level) discourse context that is grounded in the memory of previous linguistic content. Systematically manipulating these factors, we examine the way readers resolve a syntactic ambiguity involving the scopally ambiguous focus operator auch (English "too") in both oral (Experiment 1) and silent reading (Experiment 2). The results of both experiments attest that discourse context and local linguistic rhythm conspire to guide the syntactic and, concomitantly, the focus-structural analysis of ambiguous sentences. We argue that reading comprehension requires the (implicit) assignment of accents according to the focus structure and that, by establishing a prominence profile, the implicit prosodic rhythm directly affects accent assignment.
Several personality dispositions with common features capturing sensitivities to negative social cues have recently been introduced into psychological research. To date, however, little is known about their interrelations, their conjoint effects on behavior, or their interplay with other risk factors. We asked N = 349 adults from Germany to rate their justice, rejection, moral disgust, and provocation sensitivity, hostile attribution bias, trait anger, and forms and functions of aggression. The sensitivity measures were mostly positively correlated, particularly among those with an egoistic focus (victim justice, rejection, and provocation sensitivity, hostile attributions, and trait anger) and among those with an altruistic focus (observer justice, perpetrator justice, and moral disgust sensitivity). The sensitivity measures had independent and differential effects on forms and functions of aggression when considered simultaneously and when controlling for hostile attributions and anger. They could not be integrated into a single factor of interpersonal sensitivity or reduced to other well-known risk factors for aggression. The sensitivity measures, therefore, require consideration in predicting and preventing aggression.
Editorial (2016)
Infants start learning the prosodic properties of their native language before 12 months, as shown by the emergence of a trochaic bias in English-learning infants between 6 and 9 months (Jusczyk et al., 1993) and in German-learning infants between 4 and 6 months (Höhle et al., 2009, 2014), while French-learning infants do not show a bias at 6 months (Höhle et al., 2009). This language-specific emergence of a trochaic bias is supported by the fact that English and German are languages with trochaic predominance in their lexicons, while French is a language with phrase-final lengthening but lacking lexical stress. We explored the emergence of a trochaic bias in bilingual French/German infants to study whether the developmental trajectory would be similar to that of monolingual infants and whether the relative amount of exposure to the two languages has an impact on the emergence of the bias. Accordingly, we replicated Höhle et al. (2009) with 24 bilingual 6-month-olds learning French and German simultaneously. All infants had been exposed to both languages for 30 to 70% of the time from birth. Using the Head Preference Procedure, infants were presented with two lists of stimuli, one made up of several occurrences of the pseudoword /GAba/ with word-initial stress (trochaic pattern), the second made up of several occurrences of the pseudoword /gaBA/ with word-final stress (iambic pattern). The stimuli were recorded by a female native speaker of German. Results revealed that these French/German bilingual 6-month-olds have a trochaic bias, as evidenced by a preference for listening to the trochaic pattern. Hence, their listening preference is comparable to that of monolingual German-learning 6-month-olds, but differs from that of monolingual French-learning 6-month-olds, who did not show any preference (Höhle et al., 2009). Moreover, the size of the trochaic bias in the bilingual infants was not correlated with their amount of exposure to German.
The present results thus establish that the development of a trochaic bias in simultaneous bilinguals is not delayed compared to monolingual German-learning infants (Höhle et al., 2009) and is largely independent of the amount of exposure to German relative to French.
Drugs as instruments (2016)
Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, however, empirical investigations of individuals' motivation for NE have been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of the substances used for NE. In the empirical study, we explored user behavior by analyzing relationships between drug options (use of over-the-counter products, prescription drugs, or illicit drugs) and postulated drug instrumentalization goals (e.g., improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users aim to achieve a certain goal and choose their drug accordingly, or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared "goal × drug option" configuration. Our results indicate, first, that individuals' decisions about NE are ultimately based on personal attitudes toward drug options (e.g., willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g., fighting tiredness) for which different drug options might be tried. Second, data analyses suggested two qualitatively different classes of users.
Both predominantly used over-the-counter products, but "neuroenhancers" might be characterized by a higher propensity to instrumentalize over-the-counter products for virtually all investigated goals whereas "fatigue-fighters" might be inclined to use over-the-counter products exclusively to fight fatigue. We believe that psychological investigations like these are essential, especially for designing programs to prevent risky behavior.