Introduction
To date, several meta-analyses have clearly demonstrated that resistance and plyometric training are effective in improving physical fitness in children and adolescents. However, a methodological limitation of meta-analyses is that they synthesize results from different studies and hence ignore important differences across studies (i.e., mixing apples and oranges). Therefore, we aimed to examine comparative intervention studies that assessed the effects of age, sex, maturation, and resistance or plyometric training descriptors (e.g., training intensity, volume) on measures of physical fitness while holding other variables constant.
Methods
To identify relevant studies, we systematically searched multiple electronic databases (e.g., PubMed) from inception to March 2018. We included resistance and plyometric training studies in healthy young athletes and non-athletes aged 6 to 18 years that investigated the effects of moderator variables (e.g., age, maturity, sex, etc.) on components of physical fitness (i.e., muscle strength and power).
Results
Our systematic literature search revealed a total of 75 eligible resistance and plyometric training studies, including 5,138 participants. Mean duration of resistance and plyometric training programs amounted to 8.9 ± 3.6 weeks and 7.1 ± 1.4 weeks, respectively. Our findings showed that maturation affects plyometric and resistance training outcomes differently, with the former eliciting greater adaptations pre-peak height velocity (PHV) and the latter around- and post-PHV. Sex has no major impact on resistance training-related outcomes (e.g., maximal strength, 10 repetition maximum). In terms of plyometric training, around-PHV boys appear to respond with larger performance improvements (e.g., jump height, jump distance) compared with girls. Different types of resistance training (e.g., body weight, free weights) are effective in improving measures of muscle strength (e.g., maximum voluntary contraction) in untrained children and adolescents. Effects of plyometric training in untrained youth primarily follow the principle of training specificity. Despite the fact that only 6 out of 75 comparative studies investigated resistance or plyometric training in trained individuals, positive effects were reported in all 6 studies (e.g., maximum strength and vertical jump height, respectively).
Conclusions
The present review article identified research gaps (e.g., training descriptors, modern alternative training modalities) that should be addressed in future comparative studies.
Accuracy of training recommendations based on a treadmill multistage incremental exercise test
(2018)
Competitive runners will occasionally undergo exercise in a laboratory setting to obtain predictive and prescriptive information regarding their performance. The present research aimed to assess whether the physiological demands of lab-based treadmill running (TM) can simulate those of over-ground (OG) running using a commonly used protocol. Fifteen healthy volunteers with a weekly mileage of ≥ 20 km over the past 6 months and treadmill experience participated in this cross-sectional study. Two stepwise incremental tests to volitional exhaustion were performed in a fixed order within one week, one in an outpatient clinic research laboratory and one on an outdoor athletic track. Running velocity (IATspeed), heart rate (IATHR) and lactate concentration at the individual anaerobic threshold (IATbLa) were the primary endpoints. Additionally, distance covered (DIST), maximal heart rate (HRmax), maximal blood lactate concentration (bLamax) and rate of perceived exertion (RPE) at IATspeed were analyzed. IATspeed, DIST and HRmax were not statistically significantly different between conditions, whereas bLamax and RPE at IATspeed differed significantly (p < 0.05). Apart from RPE at IATspeed, IATspeed, DIST, HRmax and bLamax correlated strongly between conditions (r = 0.815–0.988). The high reliability between conditions provides strong evidence that running on a treadmill is physiologically comparable to OG running and that training recommendations can be made with assurance.
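The strong between-condition agreement above (r = 0.815–0.988) is a Pearson product-moment correlation across paired laboratory and track values. A minimal sketch of that computation follows; the paired IATspeed values are illustrative placeholders, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical IATspeed (km/h) per runner: treadmill (TM) vs. over ground (OG).
tm = [11.8, 12.4, 13.1, 13.9, 14.6, 15.2]
og = [11.6, 12.5, 13.0, 14.1, 14.4, 15.3]
r = pearson_r(tm, og)
```

With closely tracking pairs like these, r lands near the upper end of the range the study reports.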
Recent research indicates that affective responses during exercise are an important determinant of future exercise and physical activity. Thus far these responses have been measured with standardized self-report scales, but this study used biometric software for automated facial action analysis to analyze the changes that occur during physical exercise. A sample of 132 young, healthy individuals performed an incremental test on a cycle ergometer. During that test the participants’ faces were video-recorded and the changes were algorithmically analyzed at frame rate (30 fps). Perceived exertion and affective valence were measured every two minutes with established psychometric scales. Taking into account anticipated inter-individual variability, multilevel regression analysis was used to model how affective valence and ratings of perceived exertion (RPE) covaried with movement in 20 facial action areas. We found the expected quadratic decline in self-reported affective valence (more negative) as exercise intensity increased. Repeated measures correlation showed that the facial action mouth open was linked to changes in (highly intercorrelated) affective valence and RPE. Multilevel trend analyses were calculated to investigate whether facial actions were typically linked to either affective valence or RPE. These analyses showed that mouth open and jaw drop predicted RPE, whereas (additional) nose wrinkle was indicative for the decline in affective valence. Our results contribute to the view that negative affect, escalating with increasing exercise intensity, may be the body’s essential warning signal that physiological overload is imminent. We conclude that automated facial action analysis provides new options for researchers investigating feelings during exercise. In addition, our findings offer physical educators and coaches a new way of monitoring the affective state of exercisers, without interrupting and asking them.
Hatred directed at members of groups due to their origin, race, gender, religion, or sexual orientation is not new, but it has taken on a new dimension in the online world. To date, very little is known about online hate among adolescents. It is also unknown how online disinhibition might influence the association between being bystanders and being perpetrators of online hate. Thus, the present study focused on examining the associations among being bystanders of online hate, being perpetrators of online hate, and the moderating role of toxic online disinhibition in the relationship between being bystanders and perpetrators of online hate. In total, 1480 students aged between 12 and 17 years old were included in this study. Results revealed positive associations between being online hate bystanders and perpetrators, regardless of whether adolescents had or had not been victims of online hate themselves. The results also showed an association between toxic online disinhibition and online hate perpetration. Further, toxic online disinhibition moderated the relationship between being bystanders of online hate and being perpetrators of online hate. Implications for prevention programs and future research are discussed.
The objective of the study is to develop a better understanding of the capillary circulation in contracting muscles. Ten subjects were measured during a submaximal fatiguing isometric muscle action using the O2C spectrophotometer. In all measurements the capillary-venous oxygen saturation of hemoglobin (SvO2) decreases immediately after the start of loading and levels off into a steady state. However, two different patterns (type I and type II) emerged. They differ in the extent of deoxygenation (−10.37 ± 2.59 percentage points (pp) vs. −33.86 ± 17.35 pp, P = .008) and the behavior of the relative hemoglobin amount (rHb). Type I reveals a positive rank correlation of SvO2 and rHb (ρ = 0.735, P < .001), whereas a negative rank correlation (ρ = −0.522, P < .001) occurred in type II, since rHb decreases until a reversal point, then increases to an average of 13% above the baseline value and levels off into a steady state. The results reveal that a homeostasis of oxygen delivery and consumption during isometric muscle actions is possible. A rough distinction into two types of regulation is suggested.
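The rank correlations reported above are Spearman's ρ between the SvO2 and rHb time courses. A self-contained sketch of that statistic follows; the two series are invented values that merely mimic the type I pattern (both signals falling together):

```python
def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    rk = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            rk[order[k]] = avg
        i = j + 1
    return rk

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Invented type-I-like traces: SvO2 (%) and rHb (a.u.) declining together.
svo2 = [62, 58, 55, 53, 52, 51]
rhb = [30, 28, 26, 25, 24, 23]
rho = spearman_rho(svo2, rhb)
```

Reversing one series flips the sign of ρ, which is the qualitative difference between the type I and type II patterns described above.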
Direct assessment of attitudes toward socially sensitive topics can be affected by deception attempts. Reaction-time based indirect measures, such as the Implicit Association Test (IAT), are less susceptible to such biases. Neuroscientific evidence shows that deception can evoke characteristic ERP differences. However, the cerebral processes involved in faking an IAT are still unknown. We randomly assigned 20 university students (15 females, 24.65 ± 3.50 years of age) to a counterbalanced repeated-measurements design, requesting them to complete a Brief-IAT (BIAT) on attitudes toward doping without deception instruction, and with the instruction to fake positive and negative doping attitudes. Cerebral activity during BIAT completion was assessed using high-density EEG. Event-related potentials during faking revealed enhanced frontal and reduced occipital negativity, starting around 150 ms after stimulus presentation. Further, a decrease in the P300 and LPP components was observed. Source analyses showed enhanced activity in the right inferior frontal gyrus between 150 and 200 ms during faking, thought to reflect the suppression of automatic responses. Further, more activity was found for faking in the bilateral middle occipital gyri and the bilateral temporoparietal junction. Results indicate that faking reaction-time based tests alters brain processes from early stages of processing and reveal the cortical sources of the effects. Analyzing the EEG helps to uncover response patterns in indirect attitude tests and broadens our understanding of the neural processes involved in such faking. This knowledge might be useful for uncovering faking in socially sensitive contexts, where attitudes are likely to be concealed.
The purpose of this study was to compare the effects of combined resistance and plyometric/sprint training with plyometric/sprint training or typical soccer training alone on muscle strength and power, speed, and change-of-direction ability in young soccer players. Thirty-one young (14.5 ± 0.52 years; Tanner stage 3–4) soccer players were randomly assigned to either a combined- (COMB, n = 14), plyometric-training (PLYO, n = 9) or an active control group (CONT, n = 8). Two training sessions were added to the regular soccer training, consisting of one session of light-load high-velocity resistance exercises combined with one session of plyometric/sprint training (COMB), two sessions of plyometric/sprint training (PLYO) or two soccer training sessions (CONT). Training volume was similar between the experimental groups. Before and after 7 weeks of training, peak torque, as well as absolute and relative (normalized to peak torque; RTDr) rate of torque development (RTD) during maximal voluntary isometric contraction of the knee extensors (KE) were monitored at time intervals from the onset of contraction to 200 ms. Jump height, sprinting speed at 5, 10 and 20 m, and change-of-direction performance were also assessed. There were no significant between-group baseline differences. Both COMB and PLYO significantly increased their jump height (Δ14.3%; ES = 0.94; Δ12.1%; ES = 0.54, respectively) and RTD at mid to late phases, but with greater within-group effect sizes in COMB in comparison with PLYO. However, significant increases in peak torque (Δ16.9%; p < 0.001; ES = 0.58), RTD (Δ44.3%; ES = 0.71), RTDr (Δ27.3%; ES = 0.62) and sprint performance at 5 m (Δ−4.7%; p < 0.001; ES = 0.73) were found in COMB without any significant pre-to-post change in the PLYO and CONT groups. Our results suggest that COMB is more effective than PLYO or CONT for enhancing strength, sprint and jump performances.
Social comparison processes and the social position within a school class already play a major role in performance evaluation as early as in elementary school. The influence of contrast and assimilation effects on self-evaluation of performance as well as task interest has been widely researched in observational studies under the labels big-fish-little-pond and basking-in-reflected-glory effect. This study examined the influence of similar contrast and assimilation effects in an experimental paradigm. Fifth and sixth grade students (n = 230) completed a computer-based learning task during which they received social comparative feedback based on 2 × 2 experimentally manipulated feedback conditions: social position (high vs. low) and peer performance (high vs. low). Results show a more positive development of task interest and self-evaluation of performance in both the high social position and the high peer performance condition. When applied to the school setting, results of this study suggest that students who already perform well in comparison to their peer group are also the ones who profit most from social comparative feedback, given that they are the ones who usually receive the corresponding positive performance feedback.
There is evidence for cortical contribution to the regulation of human postural control. Interference from concurrently performed cognitive tasks supports this notion, and the lateral prefrontal cortex (lPFC) has been suggested to play a prominent role in the processing of purely cognitive as well as cognitive-postural dual tasks. The degree of cognitive-motor interference varies greatly between individuals, but it is unresolved whether individual differences in the recruitment of specific lPFC regions during cognitive dual tasking are associated with individual differences in cognitive-motor interference. Here, we investigated inter-individual variability in a cognitive-postural multitasking situation in healthy young adults (n = 29) in order to relate it to inter-individual variability in lPFC recruitment during cognitive multitasking. For this purpose, a one-back working memory task was performed either as single task or as dual task in order to vary cognitive load. Participants performed these cognitive single and dual tasks either during upright stance on a balance pad that was placed on top of a force plate or during fMRI measurement with little to no postural demands. We hypothesized dual one-back task performance to be associated with lPFC recruitment when compared to single one-back task performance. In addition, we expected individual variability in lPFC recruitment to be associated with postural performance costs during concurrent dual one-back performance. As expected, behavioral performance costs in postural sway during dual one-back performance varied largely between individuals, and so did lPFC recruitment during dual one-back performance. Most importantly, individuals who recruited the right mid-lPFC to a larger degree during dual one-back performance also showed greater postural sway, as measured by larger performance costs in total center of pressure displacements.
This effect was selective to the high-load dual one-back task and suggests a crucial role of the right lPFC in allocating resources during cognitive-motor interference. Our study provides further insight into the mechanisms underlying cognitive-motor multitasking and its impairments.
Purpose: The acquisition of skills is essential to the conceptualization of cognitive-behavioural therapy (CBT). Yet what experiences are encountered and what skills are actually learned during therapy, and whether patients and therapists hold concordant views of this, remain poorly understood. Method: An explorative pilot study with semi-structured, corresponding interview guides was conducted. Pilot data from our outpatient unit were transcribed and content-analyzed following current guidelines. Results: The responses of 18 participants (patients and their psychotherapists) were assigned to six main categories. Educational and cognitive aspects were mentioned most frequently and consistently by both groups. Having learned behavioural alternatives attained the second-highest agreement between perspectives. Conclusions: Patients and therapists valued CBT as an opportunity to learn new skills, which is also an important prerequisite for the maintenance of therapeutic change. We discuss limitations to generalizability as well as theoretical and therapeutic implications.
While the consequences of cyberbullying victimization have received some attention in the literature, to date, little is known about the multiple types of strains in adolescents’ lives, such as whether cyberbullying victimization and peer rejection increase their vulnerability to depression and anxiety. Even though some research found that adolescents with disabilities show higher risk for cyberbullying victimization, most research has focused on typically developing adolescents. Thus, the present study focused on examining the moderating effect of peer rejection in the relationships between cyberbullying victimization, depression, and anxiety among adolescents with autism spectrum disorder. There were 128 participants (89% male; ages ranging from 11–16 years old) with autism spectrum disorder in the sixth, seventh, or eighth grade at 16 middle schools in the United States. Participants completed questionnaires on cyberbullying victimization, peer rejection, depression, and anxiety. Results revealed that cyberbullying victimization was associated positively with peer rejection, anxiety, and depression among adolescents with autism spectrum disorder. Further, peer rejection was linked positively with depression and anxiety. Peer rejection moderated the positive relationship between cyberbullying victimization and depression, but not anxiety. Implications for prevention programs and future research are discussed.
Background: Core-specific sensorimotor exercises have been shown to enhance neuromuscular activity of the trunk, improve athletic performance and prevent back pain. However, the dose-response relationship and, therefore, the dose required to improve trunk function are still under debate. The purpose of the present trial will be to compare four different sensorimotor exercise intervention strategies with regard to their effects on trunk function.
Methods/design: A single-blind, four-armed, randomized controlled trial with a 3-week (home-based) intervention phase and two measurement days pre and post intervention (M1/M2) is designed. Experimental procedures on both measurement days will include evaluation of maximum isokinetic and isometric trunk strength (extension/flexion, rotation) including perturbations, as well as neuromuscular trunk activity while performing strength testing. The primary outcome is trunk strength (peak torque). Neuromuscular activity (amplitude, latencies as a response to perturbation) serves as secondary outcome. The control group will perform a standardized exercise program of four sensorimotor exercises (three sets of 10 repetitions) in each of six training sessions (30 min duration) over 3 weeks. The intervention groups’ programs differ in the number of exercises, sets per exercise and, therefore, overall training amount (group I: six sessions, three exercises, two sets; group II: six sessions, two exercises, two sets; group III: six sessions, one exercise, three sets). The intervention programs of groups I, II and III include additional perturbations for all exercises to increase both the difficulty and the efficacy of the exercises performed. Statistical analysis will be performed after examining the underlying assumptions for parametric and non-parametric testing.
Discussion: The results of the study will be clinically relevant, not only for researchers but also for (sports) therapists, physicians, coaches, athletes and the general population who have the aim of improving trunk function.
Objective: To investigate associations between socioeconomic status (SES) indicators (education, job position, income, multidimensional index) and the genesis of chronic low back pain (CLBP).
Design: Longitudinal field study (baseline and 6-month follow-up).
Setting: Four medical clinics across Germany.
Participants: 352 people were included according to the following criteria: (1) between 18 and 65 years of age, (2) intermittent pain and (3) an understanding of the study and the ability to answer a questionnaire without help. Exclusion criteria were: (1) pregnancy, (2) inability to stand upright, (3) inability to give sick leave information, (4) signs of serious spinal pathology, (5) acute pain in the past 7 days or (6) an incomplete SES indicators questionnaire.
Outcome measures: Subjective intensity and disability of CLBP.
Results: Analysis showed that job position was the best single predictor of CLBP intensity, followed by a multidimensional index. Education and income had no significant association with intensity. Subjective disability was best predicted by job position, succeeded by the multidimensional index and education, while income again had no significant association.
Conclusion: The results showed that SES indicators differ in the strength of their associations with the genesis of CLBP and should therefore not be used interchangeably. Job position was found to be the single most important indicator. These results could be helpful in the planning of back pain care programmes, but in general, more research on the relationship between SES and health outcomes is needed.
Objective: To examine the effect of plyometric jump training on skeletal muscle hypertrophy in healthy individuals.
Methods: A systematic literature search was conducted in the databases PubMed, SPORTDiscus, Web of Science, and Cochrane Library up to September 2021.
Results: Fifteen studies met the inclusion criteria. The main overall finding (44 effect sizes across 15 clusters; median = 2, range = 1–15 effects per cluster) indicated that plyometric jump training had small to moderate effects [standardised mean difference (SMD) = 0.47 (95% CIs = 0.23–0.71); p < 0.001] on skeletal muscle hypertrophy. Subgroup analyses for training experience revealed trivial to large effects in non-athletes [SMD = 0.55 (95% CIs = 0.18–0.93); p = 0.007] and trivial to moderate effects in athletes [SMD = 0.33 (95% CIs = 0.16–0.51); p = 0.001]. Regarding muscle groups, results showed moderate effects for the knee extensors [SMD = 0.72 (95% CIs = 0.66–0.78), p < 0.001] and equivocal effects for the plantar flexors [SMD = 0.65 (95% CIs = −0.25 to 1.55); p = 0.143]. As to the assessment methods of skeletal muscle hypertrophy, findings indicated trivial to small effects for prediction equations [SMD = 0.29 (95% CIs = 0.16–0.42); p < 0.001] and moderate to large effects for ultrasound imaging [SMD = 0.74 (95% CIs = 0.59–0.89); p < 0.001]. Meta-regression analysis indicated that the weekly session frequency moderates the effect of plyometric jump training on skeletal muscle hypertrophy, with a higher weekly session frequency inducing larger hypertrophic gains [β = 0.3233 (95% CIs = 0.2041–0.4425); p < 0.001]. We found no clear evidence that age, sex, total training period, single session duration, or the number of jumps per week moderate the effect of plyometric jump training on skeletal muscle hypertrophy [β = −0.0133 to 0.0433 (95% CIs = −0.0387 to 0.1215); p = 0.101–0.751].
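The pooled effects above are standardised mean differences. As a sketch of how a single study's SMD enters such a meta-analysis, here is Hedges' g (Cohen's d with a small-sample correction) computed from invented group summaries; the means, SDs, and sample sizes are assumptions, not data from the review:

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference (Cohen's d) with Hedges'
    small-sample correction, from group summary statistics."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)
    return d * correction

# Invented group summaries: muscle-thickness change (mm), training vs. control.
g = hedges_g(0.9, 1.2, 15, 0.2, 1.1, 15)
```

The correction factor shrinks d slightly, which matters most for small trials like those typical in this literature.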
Conclusion: Plyometric jump training can induce skeletal muscle hypertrophy, regardless of age and sex. There is evidence for relatively larger effects in non-athletes compared with athletes. Further, the weekly session frequency seems to moderate the effect of plyometric jump training on skeletal muscle hypertrophy, whereby more frequent weekly plyometric jump training sessions elicit larger hypertrophic adaptations.
Background: Infection with human immunodeficiency virus (HIV) affects muscle mass, limiting the independent activities of people living with HIV (PLWH). Resistance training alone (RT) or combined with aerobic exercise (AE) is linked to improved muscle mass and strength maintenance in PLWH. These exercise benefits have been the focus of several meta-analyses, although those identified only a limited number of studies published up to 2013/14. An up-to-date systematic review and meta-analysis of the effect of RT alone or combined with AE on strength parameters and hormones is therefore of high value, since additional, more recent studies dealing with these types of exercise in PLWH have been published.
Methods: A search for randomized controlled trials evaluating the effects of RT alone, AE alone or the combination of both (AERT) in PLWH was performed in five web databases, covering the period up to December 2017. Risk of bias and study quality were assessed using the PEDro scale. Weighted mean differences (WMD) between baseline and post-intervention values were calculated, and the I2 statistic was used to quantify heterogeneity.
Results: Thirteen studies reported strength outcomes. Eight studies presented a low risk of bias. The overall change in upper body strength was 19.3 kg (95% CI: 9.8–28.8, p < 0.001) after AERT and 17.5 kg (95% CI: 16–19.1, p < 0.001) for RT. The lower body change was 29.4 kg (95% CI: 18.1–40.8, p < 0.001) after RT and 10.2 kg (95% CI: 6.7–13.8, p < 0.001) for AERT. Changes were larger after controlling for risk of bias in upper and lower body strength and for supervised exercise in lower body strength. A significant change towards lower levels of IL-6 was found (−2.4 ng/dl, 95% CI: −2.6 to −2.1, p < 0.001).
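The overall changes above are inverse-variance pooled weighted mean differences. A minimal fixed-effect pooling sketch follows; the per-study (difference, standard error) pairs are invented for illustration, not taken from the review:

```python
def pooled_wmd(estimates):
    """Fixed-effect inverse-variance pooling of per-study
    (mean difference, standard error) pairs."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    total = sum(weights)
    wmd = sum(w * md for (md, _), w in zip(estimates, weights)) / total
    pooled_se = (1.0 / total) ** 0.5
    return wmd, pooled_se

# Invented per-study lower-body strength changes (kg) and standard errors.
wmd, se = pooled_wmd([(25.0, 6.0), (32.0, 8.0), (28.0, 5.0)])
```

More precise studies (smaller SEs) dominate the pooled estimate, and the pooled SE is always smaller than any single study's SE.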
Conclusion: Both resistance training alone and combined with aerobic exercise showed a positive change when studies with low risk of bias and professional supervision were analyzed, improving upper and, more critically, lower body muscle strength. Also, this study found that exercise had a lowering effect on IL-6 levels in PLWH.
Effects of intermittent hypoxia-hyperoxia on performance- and health-related outcomes in humans
(2022)
Background:
Intermittent hypoxia applied at rest or in combination with exercise promotes multiple beneficial adaptations with regard to performance and health in humans. It was hypothesized that replacing normoxia by moderate hyperoxia can increase the adaptive response to the intermittent hypoxic stimulus.
Objective:
Our objective was to systematically review the current state of the literature on the effects of chronic intermittent hypoxia-hyperoxia (IHH) on performance- and health-related outcomes in humans.
Methods:
PubMed, Web of Science (TM), Scopus, and Cochrane Library databases were searched in accordance with PRISMA guidelines (January 2000 to September 2021) using the following inclusion criteria: (1) original research articles involving humans, (2) investigation of the chronic effect of IHH, (3) inclusion of a control group being not exposed to IHH, and (4) articles published in peer-reviewed journals written in English.
Results:
Of 1085 articles initially found, eight studies were included. IHH was performed solely at rest in different populations, including geriatric patients (n = 1), older patients with cardiovascular (n = 3) and metabolic disease (n = 2) or cognitive impairment (n = 1), and young athletes with overtraining syndrome (n = 1). The included studies confirmed the beneficial effects of chronic exposure to IHH, showing improvements in exercise tolerance, peak oxygen uptake, and global cognitive functions, as well as lowered blood glucose levels. A trend was discernible that chronic exposure to IHH can trigger a reduction in systolic and diastolic blood pressure. The evidence of whether IHH exerts beneficial effects on blood lipid levels and haematological parameters is currently inconclusive. A meta-analysis was not possible because the reviewed studies showed considerable heterogeneity concerning the investigated populations and outcome parameters.
Conclusion:
Based on the published literature, it can be suggested that chronic exposure to IHH might be a promising non-pharmacological intervention strategy for improving peak oxygen consumption, exercise tolerance, and cognitive performance as well as reducing blood glucose levels, and systolic and diastolic blood pressure in older patients with cardiovascular and metabolic diseases or cognitive impairment. However, further randomized controlled trials with adequate sample sizes are needed to confirm and extend the evidence. This systematic review was registered on the international prospective register of systematic reviews (PROSPERO-ID: CRD42021281248) (https://www.crd.york.ac.uk/prospero/).
Instructions given prior to extinction training facilitate the extinction of conditioned skin conductance responses (SCRs) and fear-potentiated startle responses (FPSs) and serve as laboratory models for cognitive interventions implemented in exposure-based treatments of pathological anxiety. Here, we investigated how instructions given prior to extinction training, with or without the additional removal of the electrode used to deliver the unconditioned stimulus (US), affect the return of fear assessed 24 hours later. We replicated previous instruction effects on extinction and found that the additional removal of the US electrode slightly enhanced the facilitating effects on the extinction of conditioned FPSs. In contrast, extinction instructions hardly affected the return of conditioned fear responses. These findings suggest that instruction effects observed during extinction training do not extend to tests of the return of fear 24 hours later, which serve as laboratory models of relapse and of the stability of improvements from exposure-based treatments.
Introduction
We investigated blood glucose (BG) and hormone response to aerobic high-intensity interval exercise (HIIE) and moderate continuous exercise (CON) matched for mean load and duration in type 1 diabetes mellitus (T1DM).
Material and Methods
Seven trained male subjects with T1DM performed a maximal incremental exercise test and then HIIE and CON at 3 different mean intensities: below (A) and above (B) the first lactate turn point and below the second lactate turn point (C), on a cycle ergometer. Subjects were adjusted to the ultra-long-acting insulin degludec (Tresiba; Novo Nordisk, Denmark). Before exercise, standardized meals were administered, and the short-acting insulin dose was reduced by 25% (A), 50% (B), or 75% (C), depending on mean exercise intensity. During exercise, BG, adrenaline, noradrenaline, dopamine, cortisol, glucagon, insulin-like growth factor-1, blood lactate, heart rate, and gas exchange variables were measured. For 24 h after exercise, interstitial glucose was measured by a continuous glucose monitoring system.
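The intensity-dependent dose-reduction rule described above can be expressed as a small lookup. This is only a sketch of the study protocol as summarised in the abstract, not dosing guidance:

```python
# Short-acting insulin dose reduction by mean exercise intensity zone, as
# described in the study protocol: A = below the first lactate turn point,
# B = above it, C = below the second lactate turn point.
REDUCTION = {"A": 0.25, "B": 0.50, "C": 0.75}

def adjusted_bolus(usual_dose_units, zone):
    """Pre-exercise short-acting insulin dose after the protocol's reduction."""
    return usual_dose_units * (1.0 - REDUCTION[zone])
```

For example, a usual 8-unit bolus would be halved before a zone B session under this protocol.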
Results
BG decrease during HIIE was significantly smaller for B (p = 0.024) and tended to be smaller for A and C compared to CON. No differences were found for post-exercise interstitial glucose, acute hormone response, and carbohydrate utilization between HIIE and CON for A, B, and C. In HIIE, blood lactate for A (p = 0.006) and B (p = 0.004) and respiratory exchange ratio for A (p = 0.003) and B (p = 0.003) were significantly higher compared to CON but not for C.
Conclusion
Hypoglycemia did not occur during or after HIIE and CON when ultra-long-acting insulin was used and our methodological approach for exercise prescription was applied. HIIE led to a smaller BG decrease than CON, although both exercise modes were matched for mean load and duration, and despite the markedly higher peak workloads applied in HIIE. Therefore, HIIE and CON can be performed safely in T1DM.
Sarcopenic obesity is increasingly found in youth, but its health consequences remain unclear.
Therefore, we studied the prevalence of sarcopenia and its association with cardiometabolic risk factors as well as muscular and cardiorespiratory fitness using data from the German Children's Health InterventionaL Trial (CHILT III) programme.
In addition to anthropometric data and blood pressure, muscle and fat mass were determined with bioelectrical impedance analysis.
Sarcopenia was classified via muscle-to-fat ratio. A fasting blood sample was taken, muscular fitness was determined using the standing long jump, and cardiorespiratory fitness was determined using bicycle ergometry. Of the 119 obese participants included in the analysis (47.1% female, mean age 12.2 years), 83 (69.7%) had sarcopenia. Affected individuals had higher gamma-glutamyl transferase, higher glutamate pyruvate transaminase, higher high-sensitivity C-reactive protein, higher diastolic blood pressure, and lower muscular and cardiorespiratory fitness (each p < 0.05) compared to participants who were 'only' obese.
No differences were found in other parameters. In our study, sarcopenic obesity was associated with various disorders in children and adolescents.
However, the clinical value must be tested with larger samples and reference populations to develop a unique definition and appropriate methods in terms of identification but also related preventive or therapeutic approaches.
The aim was to analyze the risk of hip fracture in German primary care patients with dementia. This study included patients aged 65-90 from 1072 primary care practices who were first diagnosed with dementia between 2010 and 2013. Controls were matched (1:1) to cases for age, sex, and type of health insurance. The primary outcome was the diagnosis of hip fracture during the three-year follow-up period. A total of 53,156 dementia patients and 53,156 controls were included. After three years, 5.3% of patients and 0.7% of controls had sustained a hip fracture. Hip fracture occurred more frequently in dementia subjects living in nursing homes than in those living at home (9.2% versus 4.3%). Dementia, residence in nursing homes, and osteoporosis were risk factors for fracture development. Antidementia, antipsychotic, and antidepressant drugs generally had no significant impact on hip fracture risk when prescribed for less than six months. Dementia increased hip fracture risk in German primary care practices.
The current study investigates to what extent masked morphological priming is modulated by language-particular properties, specifically by a language's writing system. We present results from two masked priming experiments investigating the processing of complex Japanese words written in less common (moraic) scripts. In Experiment 1, participants performed lexical decisions on target verbs; these were preceded by primes which were (i) a past-tense form of the same verb, (ii) a stem-related form with the epenthetic vowel -i, (iii) a semantically related form, or (iv) a phonologically related form. Significant priming effects were obtained for prime types (i), (ii), and (iii), but not for (iv). This pattern of results differs from previous findings on languages with alphabetic scripts, which found reliable masked priming effects for morphologically related prime/target pairs of type (i), but not for non-affixal and semantically related primes of types (ii) and (iii). In Experiment 2, we measured priming effects for prime/target pairs which are neither morphologically, semantically, phonologically, nor (as presented in their moraic scripts) orthographically related, but which, in their commonly written form, share the same kanji, which are logograms adopted from Chinese. The results showed a significant priming effect, with faster lexical-decision times for kanji-related prime/target pairs relative to unrelated ones. We conclude that affix-stripping is insufficient to account for masked morphological priming effects across languages, and that language-particular properties (in the case of Japanese, the writing system) affect the processing of (morphologically) complex words.
The increasing application of intersectionality to the psychological study of identity development raises questions regarding how we as researchers construct and operationalize social identity categories, as well as how we best capture and address systems of oppression and privilege within our work. In the continental European context, the use of the intersectionality paradigm raises additional issues, since “race” was officially removed from the vernacular following the atrocities of WWII, yet racialized oppression continues to occur at every level of society. Within psychological research, participants are often divided into those with and without “migration background,” which can reiterate inequitable norms of national belonging while washing over salient lived experiences in relation to generation status, citizenship, religion, gender, and the intersection between these and other social locations. Although discrimination is increasingly examined in identity development research, rarely are the history and impact of colonialism and related socio-historical elements acknowledged. In the current paper, we aim to address these issues by reviewing previous research and discussing theoretical and practical possibilities for the future. In doing so, we delve into the problems of trading in one static social identity category (e.g., “race”) for another (e.g., “migration background/migrant”) without examining the power structures inherent in the creation of these top-down categories, or the lived experiences of those navigating what it means to be marked as a racialized Other. Focusing primarily on contextualized ethno-cultural identity development, we discuss relevant examples from the continental European context, highlighting research gaps, points for improvement, and best practices.
Objective: This study investigated intraindividual differences of intratendinous blood flow (IBF) in response to running exercise in participants with Achilles tendinopathy.
Design: This is a cross-sectional study.
Setting: The study was conducted at the University Outpatient Clinic.
Participants: Sonographic detectable intratendinous blood flow was examined in symptomatic and contralateral asymptomatic Achilles tendons of 19 participants (42 ± 13 years, 178 ± 10 cm, 76 ± 12 kg, VISA-A 75 ± 16) with clinically diagnosed unilateral Achilles tendinopathy and sonographic evident tendinosis.
Intervention: IBF was assessed using Doppler ultrasound “Advanced Dynamic Flow” before (Upre) and 5, 30, 60, and 120 min (U5–U120) after a standardized submaximal constant load run.
Main Outcome Measure: IBF was quantified by counting the number (n) of vessels in each tendon.
Results: At Upre, IBF was higher in symptomatic compared with asymptomatic tendons [mean 6.3 (95% CI: 2.8–9.9) vs. 1.7 (0.4–2.9) vessels, p < 0.01]. Overall, 63% of symptomatic and 47% of asymptomatic Achilles tendons responded to exercise, while 16% and 11%, respectively, showed persisting IBF, and 21% and 42% remained avascular throughout the investigation. At U5, IBF increased in both symptomatic and asymptomatic tendons [difference to baseline: 2.4 (0.3–4.5) and 0.9 (0.5–1.4), p = 0.05]. At U30 to U120, IBF was still increased in symptomatic but not in asymptomatic tendons [mean difference to baseline: 1.9 (0.8–2.9) and 0.1 (-0.9 to 1.2), p < 0.01].
Conclusion: Irrespective of pathology, 47–63% of Achilles tendons responded to exercise with an immediate acute physiological IBF increase by an average of one to two vessels (“responders”). A higher amount of baseline IBF (approximately five vessels) and a prolonged exercise-induced IBF response found in symptomatic ATs indicate a pain-associated altered intratendinous “neovascularization.”
Rats are a reservoir of human- and livestock-associated methicillin-resistant Staphylococcus aureus (MRSA). However, the composition of the natural S. aureus population in wild and laboratory rats is largely unknown. Here, 144 nasal S. aureus isolates from free-living wild rats, captive wild rats and laboratory rats were genotyped and profiled for antibiotic resistances and human-specific virulence genes. The nasal S. aureus carriage rate was higher among wild rats (23.4%) than laboratory rats (12.3%). Free-living wild rats were primarily colonized with isolates of clonal complex (CC) 49 and CC130 and maintained these strains even in husbandry. Moreover, upon livestock contact, CC398 isolates were acquired. In contrast, laboratory rats were colonized with many different S. aureus lineages—many of which are commonly found in humans. Five captive wild rats were colonized with CC398-MRSA. Moreover, a single CC30-MRSA and two CC130-MRSA were detected in free-living or captive wild rats. Rat-derived S. aureus isolates rarely harbored the phage-carried immune evasion gene cluster or superantigen genes, suggesting long-term adaptation to their host. Taken together, our study revealed a natural S. aureus population in wild rats, as well as a colonization pressure on wild and laboratory rats by exposure to livestock- and human-associated S. aureus, respectively.
Humans generate internal models of their environment to predict events in the world. As the environments change, our brains adjust to these changes by updating their internal models. Here, we investigated whether and how 9-month-old infants differentially update their models to represent a dynamic environment. Infants observed a predictable sequence of stimuli, which were interrupted by two types of cues. Following the update cue, the pattern was altered, thus, infants were expected to update their predictions for the upcoming stimuli. Because the pattern remained the same after the no-update cue, no subsequent updating was required. Infants showed an amplified negative central (Nc) response when the predictable sequence was interrupted. Late components such as the positive slow wave (PSW) were also evoked in response to unexpected stimuli; however, we found no evidence for a differential response to the informational value of surprising cues at later stages of processing. Infants rather learned that surprising cues always signal a change in the environment that requires updating. Interestingly, infants responded with an amplified neural response to the absence of an expected change, suggesting a top-down modulation of early sensory processing in infants. Our findings corroborate emerging evidence showing that infants build predictive models early in life.
Background
Multi-component cardiac rehabilitation (CR) is performed to achieve an improved prognosis, superior health-related quality of life (HRQL) and occupational resumption through the management of cardiovascular risk factors, as well as improvement of physical performance and patients’ subjective health. Out of a multitude of variables gathered at CR admission and discharge, we aimed to identify predictors of returning to work (RTW) and HRQL 6 months after CR.
Design
Prospective observational multi-centre study, enrolment in CR between 05/2017 and 05/2018.
Method
Besides general data (e.g. age, sex, diagnoses), parameters of risk factor management (e.g. smoking, hypertension), physical performance (e.g. maximum exercise capacity, endurance training load, 6-min walking distance) and patient-reported outcome measures (e.g. depression, anxiety, HRQL, subjective well-being, somatic and mental health, pain, lifestyle change motivation, general self-efficacy, pension desire and self-assessment of the occupational prognosis) were documented with several questionnaires at CR admission and discharge. These variables (at both measurement times and as changes during CR) were analysed using multiple regression models regarding their predictive value for RTW status and HRQL (SF-12) six months after CR.
Results
Out of 1262 patients (54±7 years, 77% men), 864 patients (69%) returned to work. Predictors of failed RTW were primarily the desire to receive pension (OR = 0.33, 95% CI: 0.22–0.50) and negative self-assessed occupational prognosis (OR = 0.34, 95% CI: 0.24–0.48) at CR discharge, acute coronary syndrome (OR = 0.64, 95% CI: 0.47–0.88) and comorbid heart failure (OR = 0.51, 95% CI: 0.30–0.87). High educational level, stress at work and physical and mental HRQL were associated with successful RTW. HRQL was determined predominantly by patient-reported outcome measures (e.g. pension desire, self-assessed health prognosis, anxiety, physical/mental HRQL/health, stress, well-being and self-efficacy) rather than by clinical parameters or physical performance.
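The odds ratios above come from regression models for the binary RTW outcome (presumably logistic regression, given that odds ratios with confidence intervals are reported). As a minimal sketch, this is how a logistic-regression coefficient maps onto an odds ratio with a 95% confidence interval; the coefficient and standard error below are back-calculated for illustration and are not taken from the study data:

```python
import math

def odds_ratio_ci(b, se, z=1.96):
    """Convert a logistic-regression coefficient (b) and its standard
    error (se) into an odds ratio with a 95% confidence interval."""
    return (math.exp(b), math.exp(b - z * se), math.exp(b + z * se))

# Hypothetical coefficient roughly matching the shape of the
# "pension desire" estimate (illustration only):
or_, lo, hi = odds_ratio_ci(b=-1.11, se=0.21)
```

With these invented inputs, the function returns approximately OR = 0.33 (0.22–0.50), i.e. an odds ratio below 1 indicating a reduced chance of returning to work.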
Conclusion
Patient-reported outcome measures predominantly influenced return to work and HRQL in patients with heart disease. Therefore, the multi-component CR approach focussing on psychosocial support is crucial for subjective health prognosis and occupational resumption.
Findings on the perceptual reorganization of lexical tones are mixed. Some studies report good tone discrimination abilities for all tested age groups, others report decreased or enhanced discrimination with increasing age, and still others report U-shaped developmental curves. Since prior studies have used a wide range of contrasts and experimental procedures, it is unclear how specific task requirements interact with discrimination abilities at different ages. In the present work, we tested German and Cantonese adults on their discrimination of Cantonese lexical tones, as well as German-learning infants between 6 and 18 months of age on their discrimination of two specific Cantonese tones using two different types of experimental procedures. The adult experiment showed that German native speakers can discriminate between lexical tones, but native Cantonese speakers show significantly better performance. The results from German-learning infants suggest that 6- and 18-month-olds discriminate tones, while 9-month-olds do not, supporting a U-shaped developmental curve. Furthermore, our results revealed an effect of methodology, with good discrimination performance at 6 months after habituation but not after familiarization. These results support three main conclusions. First, habituation can be a more sensitive procedure for measuring infants' discrimination than familiarization. Second, the previous finding of a U-shaped curve in the discrimination of lexical tones is further supported. Third, discrimination abilities at 18 months appear to reflect mature perceptual sensitivity to lexical tones, since German adults also discriminated the lexical tones with high accuracy.
Introduction: Adequate cognitive function in patients is a prerequisite for successful implementation of patient education and lifestyle coping in comprehensive cardiac rehabilitation (CR) programs. Although the association between cardiovascular diseases and cognitive impairments (CIs) is well known, the prevalence particularly of mild CI in CR and the characteristics of affected patients have been insufficiently investigated so far.
Methods: In this prospective observational study, 496 patients (54.5 ± 6.2 years, 79.8% men) with coronary artery disease following an acute coronary event (ACE) were analyzed. Patients were enrolled within 14 days of discharge from the hospital in a 3-week inpatient CR program. Patients were tested for CI using the Montreal Cognitive Assessment (MoCA) upon admission to and discharge from CR. Additionally, sociodemographic, clinical, and physiological variables were documented. The data were analyzed descriptively and in a multivariate stepwise backward elimination regression model with respect to CI.
Results: At admission to CR, CI (MoCA score < 26) was identified in 182 patients (36.7%). Significant differences between the CI and no-CI groups were found: the CI group showed a higher prevalence of smoking (65.9 vs 56.7%, P = 0.046), heavy (physically demanding) workloads (26.4 vs 17.8%, P < 0.001), and sick leave longer than 1 month prior to CR (28.6 vs 18.5%, P = 0.026), as well as reduced exercise capacity (102.5 vs 118.8 W, P = 0.006) and a shorter 6-min walking distance (401.7 vs 421.3 m, P = 0.021). The age- and education-adjusted model showed positive associations with CI only for sick leave of more than 1 month prior to ACE (odds ratio [OR] 1.673, 95% confidence interval 1.07–2.79; P = 0.03) and heavy workloads (OR 2.18, 95% confidence interval 1.42–3.36; P < 0.01).
Conclusion: The prevalence of CI in CR was considerably high, affecting more than one-third of cardiac patients. Besides age and education level, CI was associated with heavy workloads and a longer sick leave before ACE.
In a self-paced reading study on German sluicing, Paape (2016) found that reading times were shorter at the ellipsis site when the antecedent was a temporarily ambiguous garden-path structure. As a post-hoc explanation of this finding, Paape assumed that the antecedent’s memory representation was reactivated during syntactic reanalysis, making it easier to retrieve. In two eye tracking experiments, we subjected the reactivation hypothesis to further empirical scrutiny. Experiment 1, carried out in French, showed no evidence in favor of the reactivation hypothesis. Instead, results for one of the three types of garden-path sentences that were tested suggest that subjects sometimes failed to resolve the temporary ambiguity in the antecedent clause, and subsequently failed to resolve the ellipsis. The results of Experiment 2, a conceptual replication of Paape’s (2016) original study carried out in German, are compatible with the reactivation hypothesis, but leave open the possibility that the observed speedup for ambiguous antecedents may be due to occasional retrievals of an incorrect structure.
Understanding a sentence and integrating it into the discourse depends upon the identification of its focus, which, in spoken German, is marked by accentuation. In the case of written language, which lacks explicit cues to accent, readers have to draw on other kinds of information to determine the focus. We study the joint or interactive effects of two kinds of information that have no direct representation in print but have each been shown to be influential in the reader’s text comprehension: (i) the (low-level) rhythmic-prosodic structure that is based on the distribution of lexically stressed syllables, and (ii) the (high-level) discourse context that is grounded in the memory of previous linguistic content. Systematically manipulating these factors, we examine the way readers resolve a syntactic ambiguity involving the scopally ambiguous focus operator auch (Engl. “too”) in both oral (Experiment 1) and silent reading (Experiment 2). The results of both experiments attest that discourse context and local linguistic rhythm conspire to guide the syntactic and, concomitantly, the focus-structural analysis of ambiguous sentences. We argue that reading comprehension requires the (implicit) assignment of accents according to the focus structure and that, by establishing a prominence profile, the implicit prosodic rhythm directly affects accent assignment.
The effects of exercise interventions on unspecific chronic low back pain (CLBP) have been investigated in many studies, but the results are inconclusive regarding exercise types, efficiency, and sustainability. This may be because the influence of psychosocial factors on exercise-induced adaptation in CLBP is neglected. Therefore, this study assessed psychosocial characteristics which moderate and mediate the effects of sensorimotor exercise on LBP. A single-blind 3-arm multicenter randomized controlled trial was conducted for 12 weeks. Three exercise groups, sensorimotor training (SMT), sensorimotor and behavioral training (SMT-BT), and regular routines (CG), were randomly assigned to 662 volunteers. Primary outcomes (pain intensity and disability) and psychosocial characteristics were assessed at baseline (M1) and follow-up (3/6/12/24 weeks, M2-M5). Multiple regression models were used to analyze whether psychosocial characteristics are moderators of the relationship between exercise and pain, meaning that psychosocial factors and exercise interact. Causal mediation analyses were conducted to analyze whether psychosocial characteristics mediate the exercise effect on pain. A total of 453 participants with intermittent pain (mean age = 39.5 ± 12.2 years, 62% female) completed the training. It was shown that depressive symptomatology (at M4, M5), vital exhaustion (at M4), and perceived social support (at M5) are significant moderators of the relationship between exercise and the reduction of pain intensity. Further, depressive mood (at M4), social satisfaction (at M4), and anxiety (at M5, SMT) significantly moderate the exercise effect on pain disability. The amount of moderation was of clinical relevance. In contrast, there were no psychosocial variables which mediated exercise effects on pain.
In conclusion, psychosocial variables can moderate the effect of sensorimotor exercise on CLBP, which may explain conflicting results in the past regarding the merit of exercise interventions in CLBP. The results further suggest an early identification of psychosocial risk factors with diagnostic tools, which may essentially support the planning of personalized exercise therapy.
Level of Evidence: Level I.
Clinical Trial Registration: DRKS00004977, LOE: I, MiSpEx: grant-number: 080102A/11-14. https://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00004977.
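Moderation in the sense used in this study means that the exercise effect on pain depends on the level of a psychosocial variable, i.e. on an interaction term in the regression model. A toy sketch with entirely hypothetical coefficients (none of these values come from the trial):

```python
def pain_change(exercise, depression,
                b0=0.0, b_ex=-8.0, b_dep=2.0, b_int=4.0):
    """Toy moderation model (all coefficients hypothetical):
    predicted change in pain intensity as a function of exercise
    (0 = control, 1 = training) and depressive symptomatology,
    with the interaction term b_int carrying the moderation."""
    return (b0 + b_ex * exercise + b_dep * depression
            + b_int * exercise * depression)

# Simple slopes: the exercise effect at low vs. high depression scores
effect_low  = pain_change(1, 0) - pain_change(0, 0)   # exercise reduces pain
effect_high = pain_change(1, 2) - pain_change(0, 2)   # effect vanishes
```

In this sketch the exercise effect is -8 pain points at a depression score of 0 but shrinks to 0 at a score of 2, which is the pattern a significant interaction term describes.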
There is controversy in the literature regarding the link between training load and injury rate. Thus, the aims of this non-interventional study were to evaluate relationships of pre-season training load with biochemical markers, injury incidence, and performance during the first month of the competitive period in professional soccer players.
Static (one-legged stance) and dynamic (star excursion balance) postural control tests were performed by 14 adolescent athletes with and 17 without back pain to determine reproducibility. The total, mediolateral, and anterior-posterior displacements of the centre of pressure in mm were analysed for the static test, and the normalized and composite reach distances for the dynamic test. Intraclass correlation coefficients, 95% confidence intervals, and a Bland-Altman analysis were calculated for reproducibility. On the static test, intraclass correlation coefficients of 0.54 to 0.65 (right leg) and 0.61 to 0.69 (left leg) were obtained for subjects with back pain, and 0.45 to 0.49 (right leg) and 0.52 to 0.60 (left leg) for those without. On the dynamic test, coefficients of 0.79 to 0.88 (right leg) and 0.75 to 0.93 (left leg) were obtained for subjects with back pain, and 0.61 to 0.82 (right leg) and 0.60 to 0.85 (left leg) for those without. Systematic bias was not observed between test and retest on either the static or the dynamic test. The one-legged stance and star excursion balance tests thus have fair to excellent reliability and can be used as measures of postural control in adolescent athletes with and without back pain.
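Intraclass correlation coefficients of this kind are typically derived from a two-way ANOVA of the test-retest data. A self-contained sketch of one common form, ICC(3,1); the sway values are invented for illustration, and the specific ICC form used in the study is an assumption on our part:

```python
def icc_3_1(scores):
    """ICC(3,1): two-way mixed-effects, consistency, single measure,
    for n subjects x k sessions (e.g., test and retest)."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)      # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)      # between sessions
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical test-retest sway displacements (mm) for five athletes:
data = [(210, 205), (180, 186), (240, 232), (200, 208), (190, 188)]
icc = icc_3_1(data)
```

With this invented data the ICC is about 0.94, which would fall in the "excellent" band of the fair-to-excellent scale mentioned above.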
Sexual aggression victimization and perpetration among female and male university students in Poland
(2015)
This study examined the prevalence of victimization and perpetration of sexual aggression since age 15 in a convenience sample of 565 Polish university students (356 females). The prevalence of sexual aggression was investigated for both males and females from the perspectives of both victims and perpetrators in relation to three coercive strategies, three different victim–perpetrator relationships, and four types of sexual acts. We also examined the extent to which alcohol was consumed in the context of sexually aggressive incidents. The overall self-reported victimization rate was 34.3% for females and 28.4% for males. The overall perpetration rate was 11.7% for males and 6.5% for females. The gender difference was significant only for perpetration. Prevalence rates of both victimization and perpetration were higher for people known to each other than for strangers. In the majority of victimization and perpetration incidents, alcohol was consumed by one or both parties involved. The findings are discussed in relation to the international evidence and the need for tailored risk prevention and reduction programs.
Due to maturation of the postural control system and secular declines in motor performance, adolescents experience deficits in postural control during standing and walking while concurrently performing cognitive interference tasks. Thus, adequately designed balance training programs may help to counteract these deficits. While the general effectiveness of youth balance training is well-documented, there is hardly any information available on the specific effects of single-task (ST) versus dual-task (DT) balance training. Therefore, the objectives of this study were (i) to examine static/dynamic balance performance under ST and DT conditions in adolescents and (ii) to study the effects of ST versus DT balance training on static/dynamic balance under ST and DT conditions in adolescents. Twenty-eight healthy girls and boys aged 12–13 years were randomly assigned to either 8 weeks of ST or DT balance training. Before and after training, postural sway and spatio-temporal gait parameters were registered under ST (standing/walking only) and DT conditions (standing/walking while concurrently performing an arithmetic task). At baseline, significantly slower gait speed (p < 0.001, d = 5.1), shorter stride length (p < 0.001, d = 4.8), and longer stride time (p < 0.001, d = 3.8) were found for DT compared to ST walking but not standing. Training resulted in significant pre–post decreases in DT costs for gait velocity (p < 0.001, d = 3.1), stride length (-45%, p < 0.001, d = 2.4), and stride time (-44%, p < 0.01, d = 1.9). Training did not induce any significant changes (p > 0.05, d = 0–0.1) in DT costs for all parameters of secondary task performance during standing and walking. Training produced significant pre–post increases (p = 0.001; d = 1.47) in secondary task performance while sitting. The observed increase was significantly greater for the ST training group (p = 0.04; d = 0.81). 
For standing, no significant changes were found over time irrespective of the experimental group. We conclude that adolescents showed impaired DT compared to ST walking but not standing. ST and DT balance training resulted in significant and similar changes in DT costs during walking. Thus, there appears to be no preference for either ST or DT balance training in adolescents.
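The dual-task costs analyzed above are conventionally computed as the relative performance change from the single-task to the dual-task condition. A minimal sketch; the gait velocities are hypothetical, not values from the study:

```python
def dual_task_cost(st_value, dt_value):
    """Relative dual-task cost in percent: the performance change
    from single-task (ST) to dual-task (DT) conditions. Negative
    values indicate a DT-related decline for measures where higher
    is better (e.g., gait velocity)."""
    return (dt_value - st_value) / st_value * 100.0

# Hypothetical gait velocities (m/s) under ST and DT walking:
cost = dual_task_cost(st_value=1.30, dt_value=1.04)
```

Here the invented DT velocity of 1.04 m/s against an ST velocity of 1.30 m/s yields a dual-task cost of -20%; training-related decreases in such costs are what the pre-post comparison above quantifies.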
The development of phonological awareness, the knowledge of the structural combinatoriality of a language, has been widely investigated in relation to reading (dis)ability across languages. However, the extent to which knowledge of phonemic units may interact with spoken language organization in (transparent) alphabetical languages has hardly been investigated. The present study examined whether phonemic awareness correlates with coarticulation degree, commonly used as a metric for estimating the size of children’s production units. A speech production task was designed to test for developmental differences in intra-syllabic coarticulation degree in 41 German children from 4 to 7 years of age. The technique of ultrasound imaging allowed for comparing the articulatory foundations of children’s coarticulatory patterns. Four behavioral tasks assessing various levels of phonological awareness from large to small units and expressive vocabulary were also administered. Generalized additive modeling revealed strong interactions between children’s vocabulary and phonological awareness with coarticulatory patterns. Greater knowledge of sub-lexical units was associated with lower intra-syllabic coarticulation degree and greater differentiation of articulatory gestures for individual segments. This interaction was mostly nonlinear: an increase in children’s phonological proficiency was not systematically associated with an equivalent change in coarticulation degree. Similar findings were drawn between vocabulary and coarticulatory patterns. Overall, results suggest that the process of developing spoken language fluency involves dynamical interactions between cognitive and speech motor domains. Arguments for an integrated-interactive approach to skill development are discussed.
The genesis of chronic pain is explained by a biopsychosocial model. It hypothesizes an interdependency between environmental and genetic factors provoking aberrant long-term changes in biological and psychological regulatory systems. Physiological effects of psychological and physical stressors may play a crucial role in these maladaptive processes. Specifically, long-term demands on the stress response system may moderate central pain processing and influence descending serotonergic and noradrenergic signals from the brainstem, regulating nociceptive processing at the spinal level. However, the underlying mechanisms of this pathophysiological interplay still remain unclear. This paper aims to shed light on possible pathways between physical (exercise) and psychological stress and the potential neurobiological consequences in the genesis and treatment of chronic pain, highlighting evolving concepts and promising research directions in the treatment of chronic pain. Two treatment forms (exercise and mindfulness-based stress reduction as exemplary therapies), their interaction, and the dose-response relationship will be discussed in more detail, which might pave the way to a better understanding of alterations in the pain matrix and help to develop future prevention and therapeutic concepts.
Purpose: Psychosocial variables are known risk factors for the development and chronification of low back pain (LBP). Psychosocial stress is one of these risk factors. Therefore, this study aims to identify the most important types of stress predicting LBP. Self-efficacy was included as a potential protective factor related to both stress and pain.
Participants and Methods: This prospective observational study assessed n = 1071 subjects with low back pain over 2 years. Psychosocial stress was evaluated in a broad manner using instruments assessing perceived stress, stress experiences in work and social contexts, vital exhaustion, and life-event stress. Further, self-efficacy and pain (characteristic pain intensity and disability) were assessed. Using least absolute shrinkage and selection operator (LASSO) regression, important predictors of characteristic pain intensity and pain-related disability at 1-year and 2-year follow-up were analyzed.
Results: The final sample for the statistical analysis consisted of 588 subjects (age: 39.2 (± 13.4) years; baseline pain intensity: 27.8 (± 18.4); disability: 14.3 (± 17.9)). In the 1-year follow-up, the stress types “tendency to worry”, “social isolation”, and “work discontent” as well as vital exhaustion and negative life events were identified as risk factors for both pain intensity and pain-related disability. In the 2-year follow-up, LASSO models identified the stress types “tendency to worry”, “social isolation”, “social conflicts”, and “perceived long-term stress” as potential risk factors for both pain intensity and disability. Furthermore, “self-efficacy” (“internality”, “self-concept”) and “social externality” play a role in reducing pain-related disability.
Conclusion: Stress experiences in social and work-related contexts were identified as important risk factors for LBP 1 or 2 years in the future, even in subjects with low initial pain levels. Self-efficacy turned out to be a protective factor for pain development, especially in the long-term follow-up. Results suggest a differentiation of stress types in addressing psychosocial factors in research, prevention and therapy approaches.
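LASSO selects predictors by shrinking regression coefficients and setting weak ones exactly to zero via the soft-thresholding operator at the core of its coordinate-descent updates. A minimal sketch of that operator; the input values are invented and are not coefficients from this study:

```python
def soft_threshold(rho, lam):
    """Soft-thresholding operator used in LASSO coordinate descent:
    shrinks a coefficient toward zero and sets it exactly to zero
    when its univariate signal (rho) is weaker than the penalty lam."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

# Hypothetical univariate signals for a strong and a weak stress predictor:
strong = soft_threshold(rho=0.62, lam=0.15)   # retained, shrunk toward zero
weak   = soft_threshold(rho=0.08, lam=0.15)   # dropped from the model
```

This zeroing-out behavior is what lets LASSO return a sparse set of "most important" stress types rather than small nonzero weights for every candidate predictor.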
Syntactic priming is known to facilitate comprehension of the target sentence if the syntactic structure of the target sentence aligns with the structure of the prime (Branigan et al., 2005; Tooley and Traxler, 2010). Such processing facilitation is understood to be constrained by factors such as lexical overlap between the prime and the target, frequency of the prime structure, etc. Syntactic priming in SOV languages is also understood to be influenced by similar constraints (Arai, 2012). Sentence comprehension in SOV languages is known to be incremental and predictive. Such a top-down parsing process involves establishing various syntactic relations based on the linguistic cues of a sentence, and the role of preverbal case-markers in achieving this is known to be critical. Given the evidence of syntactic priming during comprehension in these languages, this aspect of the comprehension process and its effect on syntactic priming becomes important. In this work, we show that syntactic priming during comprehension is affected by the probability of using the prime structure while parsing the target sentence. If the prime structure has a low probability given the sentential cues (e.g., nominal case-markers) in the target sentence, then the chances of persisting with the prime structure in the target are reduced. Our work demonstrates the role of structural complexity of the target with regard to syntactic priming during comprehension and highlights that syntactic priming is modulated by an overarching preference of the parser to avoid rare structures.
The regular monitoring of physical fitness and sport-specific performance is important in elite sports to increase the likelihood of success in competition. This study aimed to systematically review and critically appraise the methodological quality, validation data, and feasibility of sport-specific performance assessments in Olympic combat sports such as amateur boxing, fencing, judo, karate, taekwondo, and wrestling. A systematic search was conducted in the electronic databases PubMed, Google Scholar, and ScienceDirect up to October 2017. Studies in combat sports were included if they reported validation data (e.g., reliability, validity, sensitivity) of sport-specific tests. Overall, 39 studies were eligible for inclusion in this review. The majority of studies (74%) had sample sizes <30 subjects. Nearly one-third of the reviewed studies lacked a sufficient description (e.g., anthropometrics, age, expertise level) of the included participants. Seventy-two percent of studies did not sufficiently report the inclusion/exclusion criteria for their participants. In 62% of the included studies, the description and/or inclusion of familiarization session(s) was either incomplete or absent. Sixty percent of studies did not report any details about the stability of testing conditions. Approximately half of the studies examined reliability measures of the included sport-specific tests (intraclass correlation coefficient [ICC] = 0.43–1.00). Content validity was addressed in all included studies, and criterion validity (only its concurrent aspect) in approximately half of the studies, with correlation coefficients ranging from r = −0.41 to 0.90. Construct validity was reported in 31% of the included studies and predictive validity in only one. Test sensitivity was addressed in 13% of the included studies. The majority of studies (64%) ignored and/or provided incomplete information on test feasibility and the methodological limitations of the sport-specific test.
In 28% of the included studies, information on test application was insufficient or entirely missing. Several methodological gaps exist in studies that used sport-specific performance tests in Olympic combat sports. Future research should adopt more rigorous validation procedures in the application and description of sport-specific performance tests in Olympic combat sports.
Concrete-operational thinking depicts an important aspect of cognitive development. A promising approach to promoting these skills is the instruction of strategies. The construction of such instructional programs requires insights into the mental operations involved in problem-solving. In the present paper, we address the question of the extent to which variations in the effect of isolated and combined mental operations (strategies) on the correct solution of concrete-operational concepts can be observed. To this end, a cross-sectional design was applied. The use of mental operations was measured via thinking-aloud reports from 80 first- and second-graders (N = 80) while they solved tasks depicting concrete-operational thinking. Concrete-operational thinking was assessed using the TEKO subscales conservation of numbers, classification, and sequences. The verbal reports were transcribed and coded with regard to the mental operations applied per task. Data analyses focused on the task level, resulting in the analysis of N = 240 tasks per subscale. Differences regarding the contribution of isolated and combined mental operations (strategies) to correct solution were observed. The results indicate the necessity of selecting and integrating appropriate mental operations as strategies. They offer insights into the mental operations involved in solving concrete-operational tasks and contribute to the construction of instructional programs.
According to recent literature, sodium bicarbonate (NaHCO3) has been proposed as a performance-enhancing aid because it reduces acidosis during exercise. The aim of the current review is to investigate whether the duration of exercise is an essential factor for the effect of NaHCO3. To collect the latest studies from the electronic database PubMed, publication time was restricted from December 2006 to December 2016; the search was updated in July 2018. The studies were divided into exercise durations of >4 or ≤4 minutes for easier comparability of effects across different exercises. Only randomized controlled trials were included in this review. Of the 775 studies, 35 met the inclusion criteria. Study design, subjects, effects, and outcome criteria were inconsistent across the studies. Seventeen of these studies reported performance-enhancing effects after supplementing NaHCO3. Eleven of twenty studies with an exercise duration of ≤4 minutes showed positive results and four showed mixed results after supplementing NaHCO3. In contrast, six of fifteen studies with an exercise duration of >4 minutes showed performance-enhancing effects and two showed mixed results. Consequently, the duration of exercise might influence whether NaHCO3 supplementation enhances performance, but to what extent remains unclear due to the inconsistencies in the study results.
Ultraschall Berlin
(2014)
Do properties of individual languages shape the mechanisms by which they are processed? By virtue of their nonconcatenative morphological structure, the recognition of complex words in Semitic languages has been argued to rely strongly on morphological information and on decomposition into root and pattern constituents. Here, we report results from a masked priming experiment in Hebrew in which we contrasted verb forms belonging to two morphological classes, Paal and Piel, which display similar properties but crucially differ in whether they are extended to novel verbs. Verbs from the open-class Piel elicited familiar root priming effects, but verbs from the closed-class Paal did not. Our findings indicate that, similarly to other (e.g., Indo-European) languages, down-to-the-root decomposition in Hebrew does not apply to stems of non-productive verbal classes. We conclude that the Semitic word processor is less unique than previously thought: although it operates on morphological units that are combined in a non-linear way, it engages the same universal mechanisms of storage and computation as those seen in other languages.
Background: Although the health benefits of physical activity (PA) are well documented, the majority of the population is unable to implement current recommendations into their daily routine. Mobile health (mHealth) apps could help increase the level of PA. However, this is contingent on the interest of potential users.
Objective: The aim of this study was to explore, in a nuanced way, the interest in mHealth apps with respect to PA among students and staff of a university.
Methods: We conducted a Web-based survey from June to July 2015 in which students and employees from the University of Potsdam were asked about their activity level, interest in mHealth fitness apps, chronic diseases, and sociodemographic parameters.
Results: A total of 1217 students (67.30% [819/1217] female; mean age 26.0 years [SD 4.9]) and 485 employees (67.5% [327/485] female; mean age 42.7 years [SD 11.7]) participated in the survey. The recommendation for PA (3 times per week) was not met by 70.1% (340/485) of employees and 52.67% (641/1217) of students. Within these groups, 53.2% (341/641) of students and 44.2% (150/340) of employees indicated an interest in mHealth fitness apps, independent of age, sex, body mass index (BMI), and level of education or professional qualification.
Conclusions: Even in a younger, highly educated population, the majority of respondents reported an insufficient level of PA. About half of them indicated their interest in training support. This suggests that the use of personalized mobile fitness apps may become increasingly significant for a positive change of lifestyle.
Previous research has shown that high phonotactic frequencies facilitate the production of regularly inflected verbs in English-learning children with specific language impairment (SLI) but not in children with typical development (TD). We asked whether this finding can be replicated for German, a language with a much more complex inflectional verb paradigm than English. Using an elicitation task, the production of inflected nonce verb forms (3rd person singular with -t suffix) with either high- or low-frequency subsyllables was tested in sixteen German-learning children with SLI (ages 4;1-5;1), sixteen TD children matched for chronological age (CA), and fourteen TD children matched for verbal age (VA) (ages 3;0-3;11). The findings revealed that children with SLI, but not CA- or VA-matched children, showed differential performance between the two types of verbs, producing more inflectional errors when the verb forms resulted in low-frequency subsyllables than when they resulted in high-frequency subsyllables, replicating the results from English-learning children.