Two of a kind? (2014)
School attacks are attracting increasing attention in aggression research. Recent systematic analyses have provided new insights into offense and offender characteristics. Less is known about attacks in institutes of higher education (e.g., universities). It is therefore questionable whether the term “school attack” should be limited to institutions of general education or could be extended to institutions of higher education. The scientific literature is divided on distinguishing or unifying these two groups and reports similarities as well as differences. We examined 232 school attacks and 45 attacks in institutes of higher education throughout the world and conducted systematic comparisons between the two groups. The analyses yielded differences in offender (e.g., age, migration background) and offense characteristics (e.g., weapons, suicide rates), as well as some similarities (e.g., gender). Most differences can apparently be accounted for by offenders’ age and situational influences. We discuss the implications of our findings for future research and the development of preventive measures.
Background
Outcome quality management requires the consecutive registration of defined variables. The aim was to identify relevant parameters in order to objectively assess the in-patient rehabilitation outcome.
Methods
From February 2009 to June 2010, 1253 patients (70.9 ± 7.0 years, 78.1% men) at 12 rehabilitation clinics were enrolled. Items concerning sociodemographic data, the impairment group (surgery, conservative/interventional treatment), cardiovascular risk factors, structural and functional parameters and subjective health were tested in respect of their measurability, sensitivity to change and their propensity to be influenced by rehabilitation.
Results
The majority of patients (61.1%) were referred for rehabilitation after cardiac surgery, 38.9% after conservative or interventional treatment for an acute coronary syndrome. Functionally relevant comorbidities were seen in 49.2% (diabetes mellitus, stroke, peripheral artery disease, chronic obstructive lung disease). In three key areas 13 parameters were identified as being sensitive to change and subject to modification by rehabilitation: cardiovascular risk factors (blood pressure, low-density lipoprotein cholesterol, triglycerides), exercise capacity (resting heart rate, maximal exercise capacity, maximal walking distance, heart failure, angina pectoris) and subjective health (IRES-24 (indicators of rehabilitation status): pain, somatic health, psychological well-being and depression as well as anxiety on the Hospital Anxiety and Depression Scale).
Conclusion
The outcome of in-patient rehabilitation in elderly patients can be comprehensively assessed by the identification of appropriate key areas, that is, cardiovascular risk factors, exercise capacity and subjective health. This may well serve as a benchmark for internal and external quality management.
OCP-Place, a cross-linguistically well-attested constraint against pairs of consonants with shared [place], is psychologically real. Studies have shown that the processing of words violating OCP-Place is inhibited. Functionalists assume that OCP arises as a consequence of low-level perception: a consonant following another with the same [place] cannot be faithfully perceived as an independent unit. If functionalist theories were correct, then lexical access would be inhibited if two homorganic consonants conjoin at word boundaries, a problem that can only be solved with lexical feedback.
Here, we experimentally challenge the functional account by showing that OCP-Place can be used as a speech segmentation cue during pre-lexical processing without lexical feedback, and that the use relates to distributions in the input.
In Experiment 1, native listeners of Dutch located word boundaries between two labials when segmenting an artificial language. This indicates a use of OCP-Labial as a segmentation cue, implying a full perception of both labials. Experiment 2 shows that segmentation performance cannot solely be explained by well-formedness intuitions. Experiment 3 shows that knowledge of OCP-Place depends on language-specific input: in Dutch, co-occurrences of labials are under-represented, but co-occurrences of coronals are not. Accordingly, Dutch listeners fail to use OCP-Coronal for segmentation.
Despite recent growth of research on the effects of prosocial media, processes underlying these effects are not well understood. Two studies explored theoretically relevant mediators and moderators of the effects of prosocial media on helping. Study 1 examined associations among prosocial- and violent-media use, empathy, and helping in samples from seven countries. Prosocial-media use was positively associated with helping. This effect was mediated by empathy and was similar across cultures. Study 2 explored longitudinal relations among prosocial-video-game use, violent-video-game use, empathy, and helping in a large sample of Singaporean children and adolescents measured three times across 2 years. Path analyses showed significant longitudinal effects of prosocial- and violent-video-game use on prosocial behavior through empathy. Latent-growth-curve modeling for the 2-year period revealed that change in video-game use significantly affected change in helping, and that this relationship was mediated by change in empathy.
Leaking comprises observable behavior or statements that signal intentions of committing a violent offense and is considered an important warning sign for school shootings. School staff who are confronted with leaking have to assess its seriousness and react appropriately - a difficult task, because knowledge about leaking is sparse. The present study, therefore, examined how frequently leaking occurs in schools and how teachers identify leaking and respond to it. To achieve this aim, we informed teachers from eight schools in Germany about the definition of leaking and other warning signs and risk factors for school shootings in a one-hour information session. Teachers were then asked to report cases of leaking over a six- to nine-month period and to answer a questionnaire on leaking and its treatment after the information session and six to nine months later. Our results suggest that leaking is a relevant problem in German schools. Teachers mostly rated the information session positively and benefited in several aspects (e.g. reported more perceived courses of action or improved knowledge about leaking), but also expressed a constant need for support. Our findings highlight teachers' needs for further support and training and may be used in the planning of prevention measures for school shootings.
Although politicization is a perennial research topic in public administration to investigate relationships between ministers and civil servants, the concept still lacks clarification. This article contributes to this literature by systematically identifying different conceptualizations of politicization and suggests a typology including three politicization mechanisms to strengthen the political responsiveness of the ministerial bureaucracy: formal, functional and administrative politicization. The typology is empirically validated through a comparative case analysis of politicization mechanisms in Germany, Belgium, the UK and Denmark. The empirical analysis further refines the general idea of Western democracies becoming ‘simply’ more politicized, by illustrating how some politicization mechanisms do not continue to increase, but stabilize – at least for the time being.
Early acquisition of a second language influences the development of language abilities and cognitive functions. In the present study, we used functional Magnetic Resonance Imaging (fMRI) to investigate the impact of early bilingualism on the organization of the cortical language network during sentence production. Two groups of adult multilinguals, proficient in three languages, were tested on a narrative task; early multilinguals acquired the second language before the age of three years, late multilinguals after the age of nine. All participants learned a third language after nine years of age. Comparison of the two groups revealed substantial differences in language-related brain activity for early as well as late acquired languages. Most importantly, early multilinguals preferentially activated a fronto-striatal network in the left hemisphere, whereas the left posterior superior temporal gyrus (pSTG) was activated to a lesser degree than in late multilinguals. The same brain regions were highlighted in previous studies when a non-target language had to be controlled. Hence the engagement of language control in adult early multilinguals appears to be influenced by the specific learning and acquisition conditions during early childhood. Remarkably, our results reveal that the functional control of early and subsequently later acquired languages is similarly affected, suggesting that language experience has a pervasive influence into adulthood. As such, our findings extend the current understanding of control functions in multilinguals.
Background: Transcatheter aortic-valve implantation (TAVI) is an established alternative therapy in patients with severe aortic stenosis and a high surgical risk. Despite the rapid growth in its use, very few data exist on the efficacy of cardiac rehabilitation (CR) in these patients. We tested the hypothesis that patients after TAVI benefit from CR, compared to patients after surgical aortic-valve replacement (sAVR).
Methods: From September 2009 to August 2011, 442 consecutive patients after TAVI (n=76) or sAVR (n=366) were referred to a 3-week CR. Data regarding patient characteristics as well as changes in functional status (6-min walk test, 6-MWT; bicycle exercise test) and emotional status (Hospital Anxiety and Depression Scale) were retrospectively evaluated and compared between groups after propensity score adjustment.
Results: Patients after TAVI were significantly older (p<0.001), more often female (p<0.001), and more often had coronary artery disease (p=0.027), renal failure (p=0.012) and a pacemaker (p=0.032). During CR, distance in the 6-MWT (both groups p<0.001) and exercise capacity (sAVR p<0.001, TAVI p<0.05) increased significantly in both groups. Only patients after sAVR demonstrated a significant reduction in anxiety and depression (p<0.001). After propensity score adjustment, changes were not significantly different between sAVR and TAVI, with the exception of the 6-MWT (p=0.004).
Conclusions: Patients after TAVI benefit from cardiac rehabilitation despite their older age and comorbidities. CR is a helpful tool to maintain independence in daily life activities and participation in socio-cultural life.
Background: Chronic kidney disease (CKD) is a frequent comorbidity among elderly patients and those with cardiovascular disease. CKD carries prognostic relevance. We aimed to describe patient characteristics, risk factor management and control status of patients in cardiac rehabilitation (CR), differentiated by presence or absence of CKD.
Design and methods: Data from 92,071 inpatients with adequate information to calculate glomerular filtration rate (GFR) based on the Cockcroft-Gault formula were analyzed at the beginning and the end of a 3-week CR stay. CKD was defined as an estimated GFR <60 ml/min/1.73 m².
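The study's CKD criterion combines the Cockcroft-Gault estimate with the <60 ml/min/1.73 m² cut-off. As a rough illustration of how such a classification could be computed, the sketch below implements the standard Cockcroft-Gault formula (which strictly yields creatinine clearance in ml/min; the function and variable names, and the direct application of the 60-unit cut-off without body-surface normalization, are illustrative assumptions, not details from the study):

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault formula."""
    crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def has_ckd(egfr):
    # CKD as defined in the study: estimated GFR < 60 ml/min/1.73 m^2
    return egfr < 60.0

# Example: a 72-year-old man, 80 kg, serum creatinine 1.4 mg/dl
print(round(cockcroft_gault(72, 80, 1.4, female=False), 1))  # → 54.0
```

In practice the estimate would be computed once per patient at admission and at discharge, matching the study's pre/post design.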
Results: Compared with non-CKD patients, CKD patients were significantly older (72.0 versus 58.0 years) and more often had diabetes mellitus, arterial hypertension, and atherothrombotic manifestations (previous stroke, peripheral arterial disease), but fewer were current or previous smokers or had a CHD family history. Exercise capacity was much lower in CKD (59 versus 92 W). Fewer patients with CKD were treated with percutaneous coronary intervention (PCI), but more had coronary artery bypass graft (CABG) surgery. Patients with CKD, compared with non-CKD patients, less frequently received statins, acetylsalicylic acid (ASA), clopidogrel, beta blockers, and angiotensin converting enzyme (ACE) inhibitors, and more frequently received angiotensin receptor blockers, insulin and oral anticoagulants. In CKD, mean low density lipoprotein cholesterol (LDL-C), total cholesterol, and high density lipoprotein cholesterol (HDL-C) were slightly higher at baseline, while triglycerides were substantially lower. This lipid pattern did not change at the discharge visit, but overall control rates for all described parameters (with the exception of HDL-C) improved substantially. At discharge, systolic blood pressure (BP) was higher in CKD (124 versus 121 mmHg) and diastolic BP was lower (72 versus 74 mmHg). At discharge, 68.7% of CKD versus 71.9% of non-CKD patients had LDL-C <100 mg/dl. Physical fitness on exercise testing improved substantially in both groups. When the Modification of Diet in Renal Disease (MDRD) formula was used for CKD classification, there was no clinically relevant change in these results.
Conclusion: Within a short period of 3-4 weeks, CR led to substantial improvements in key risk factors such as lipid profile, blood pressure, and physical fitness for all patients, even if CKD was present.
The aim of the present study was to examine how different types of tracking—between-school streaming, within-school streaming, and course-by-course tracking—shape students’ mathematics self-concept. This was done in an internationally comparative framework using data from the Programme for International Student Assessment (PISA). After controlling for individual and track mean achievement, results indicated that, in course-by-course tracking, high-track students generally had higher mathematics self-concepts and low-track students lower mathematics self-concepts. For students in between-school and within-school streaming, the reverse pattern was found. These findings suggest a solution to the ongoing debate about the effects of tracking on students’ academic self-concept and suggest that the reference groups to which students compare themselves differ according to the type of tracking.
We report findings from psycholinguistic experiments investigating the detailed timing of processing morphologically complex words by proficient adult second (L2) language learners of English in comparison to adult native (L1) speakers of English. The first study employed the masked priming technique to investigate -ed forms with a group of advanced Arabic-speaking learners of English. The results replicate previously found L1/L2 differences in morphological priming, even though in the present experiment an extra temporal delay was offered after the presentation of the prime words.
The second study examined the timing of constraints against inflected forms inside derived words in English using the eye-movement monitoring technique and an additional acceptability judgment task with highly advanced Dutch L2 learners of English in comparison to adult L1 English controls. Whilst offline the L2 learners performed native-like, the eye-movement data showed that their online processing was not affected by the morphological constraint against regular plurals inside derived words in the same way as in native speakers. Taken together, these findings indicate that L2 learners are not just slower than native speakers in processing morphologically complex words, but that the L2 comprehension system employs real-time grammatical analysis (in this case, morphological information) less than the L1 system.
Background: Although the benefits for health of physical activity (PA) are well documented, the majority of the population is unable to implement present recommendations into daily routine. Mobile health (mHealth) apps could help increase the level of PA. However, this is contingent on the interest of potential users.
Objective: The aim of this study was the explorative, nuanced determination of the interest in mHealth apps with respect to PA among students and staff of a university.
Methods: We conducted a Web-based survey from June to July 2015 in which students and employees from the University of Potsdam were asked about their activity level, interest in mHealth fitness apps, chronic diseases, and sociodemographic parameters.
Results: A total of 1217 students (67.3% [819/1217] female; mean age 26.0 years [SD 4.9]) and 485 employees (67.5% [327/485] female; mean age 42.7 years [SD 11.7]) participated in the survey. The recommendation for PA (3 times per week) was not met by 70.1% (340/485) of employees and 52.7% (641/1217) of students. Within these groups, 53.2% of students (341/641) and 44.2% of employees (150/340), independent of age, sex, body mass index (BMI), and level of education or professional qualification, indicated an interest in mHealth fitness apps.
Conclusions: Even in a younger, highly educated population, the majority of respondents reported an insufficient level of PA. About half of them indicated their interest in training support. This suggests that the use of personalized mobile fitness apps may become increasingly significant for a positive change of lifestyle.
This study aimed to determine the relative and absolute reliability of ultrasound (US) measurements of the thickness and echogenicity of the plantar fascia (PF) at different measurement stations along its length using a standardized protocol. Twelve healthy subjects (24 feet) were enrolled. The PF was imaged in the longitudinal plane. Subjects were assessed twice to evaluate intra-rater reliability. Thickness and echogenicity of the PF were quantified using ImageJ, a digital image analysis software. Sonographic evaluation of PF thickness and echogenicity showed high relative reliability, with intraclass correlation coefficients (ICCs) of 0.88 at all measurement stations. However, the measurement stations with the highest ICCs for PF thickness and echogenicity did not have the highest absolute reliability. Compared to other measurement stations, measuring PF thickness at 3 cm distal, and echogenicity in a region of interest 1 cm to 2 cm distal, from the insertion at the medial calcaneal tubercle showed the highest absolute reliability with the least systematic bias and random error. Reliability was also higher when using the mean of three measurements rather than a single measurement. To reduce discrepancies in the interpretation of PF thickness and echogenicity measurements, the absolute reliability of the different measurement stations should be considered in clinical practice and research, rather than relative reliability alone as expressed by the ICC.
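For context, the relative-reliability statistic reported above is typically an intraclass correlation coefficient from a two-way model. A minimal sketch of a two-way mixed, consistency ICC(3,1) for a subjects-by-sessions data layout might look as follows (the specific ICC model used in the study is not stated in the abstract, and all names here are illustrative):

```python
def icc_3_1(data):
    """ICC(3,1), two-way mixed, consistency: (MSB - MSE) / (MSB + (k-1)*MSE).

    data: one row per subject, each row holding k repeated measurements.
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    sess_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_sess = n * sum((m - grand) ** 2 for m in sess_means)
    ms_subj = ss_subj / (n - 1)
    ms_err = (ss_total - ss_subj - ss_sess) / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Perfectly consistent repeated measurements give ICC = 1
print(icc_3_1([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]))  # → 1.0
```

The study's point that high ICCs do not imply high absolute reliability follows from the formula: the ICC scales error against between-subject variance, so a heterogeneous sample can yield a high ICC despite sizable measurement error.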
Continuous exercise (CON) and high-intensity interval exercise (HIIE) can be safely performed with type 1 diabetes mellitus (T1DM). Additionally, continuous glucose monitoring (CGM) systems may serve as a tool to reduce the risk of exercise-induced hypoglycemia. It is unclear if CGM is accurate during CON and HIIE at different mean workloads. Seven T1DM patients performed CON and HIIE at 5% below (L) and above (M) the first lactate turn point (LTP1), and 5% below the second lactate turn point (LTP2) (H) on a cycle ergometer. Glucose was measured via CGM and in capillary blood (BG). Differences between CGM and BG were found in three of the six tests (p < 0.05). In CON, bias and levels of agreement for L, M, and H were 0.85 (−3.44, 5.15) mmol·L−1, −0.45 (−3.95, 3.05) mmol·L−1, and −0.31 (−8.83, 8.20) mmol·L−1, and in HIIE 1.17 (−2.06, 4.40) mmol·L−1, 0.11 (−5.79, 6.01) mmol·L−1, and 1.48 (−2.60, 5.57) mmol·L−1 for the same intensities. CGM thus estimated BG with clinically acceptable accuracy in all conditions except CON H. CGM may additionally help to avoid exercise-induced hypoglycemia, but usual BG control should still be performed during intense exercise.
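The bias and levels of agreement quoted above are Bland-Altman statistics: the mean of the paired CGM−BG differences and that mean plus or minus 1.96 sample standard deviations. A minimal sketch, with hypothetical paired readings in mmol/L (all values and names here are illustrative, not the study's data):

```python
from statistics import mean, stdev

def bland_altman(cgm, bg):
    """Bias and 95% limits of agreement between paired CGM and blood-glucose values."""
    diffs = [c - b for c, b in zip(cgm, bg)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings (mmol/L)
bias, (lower, upper) = bland_altman([5.0, 6.0, 7.2, 8.1], [4.8, 6.3, 7.0, 8.4])
```

Wide limits of agreement, as reported for CON H above, indicate that individual CGM readings can deviate substantially from capillary blood glucose even when the average bias is small.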
The Star Excursion Balance Test (SEBT) is effective in measuring dynamic postural control (DPC). This research aimed to determine whether DPC measured by the SEBT in young athletes (YA) with back pain (BP) is different from those without BP (NBP). Fifty-three BP YA and 53 NBP YA matched for age, height, weight, training years, training sessions/week and training minutes/session were studied. Participants performed 4 practice trials after which 3 measurements in the anterior, posteromedial and posterolateral SEBT reach directions were recorded. Normalized reach distance was analyzed using the mean of all 3 measurements. There was no statistically significant difference (p > 0.05) between the reach distances of BP (87.2 ± 5.3, 82.4 ± 8.2, 78.7 ± 8.1) and NBP (87.8 ± 5.6, 82.4 ± 8.0, 80.0 ± 8.8) in the anterior, posteromedial and posterolateral directions, respectively. DPC in YA with BP, as assessed by the SEBT, was not different from NBP YA.
There is growing evidence to support a change in the rehabilitation strategy for patellofemoral pain syndrome (PFPS) from traditional quadriceps strengthening exercises to the inclusion of hip musculature strengthening in individuals with PFPS. Several studies have evaluated effects of quadriceps and hip musculature strengthening on PFPS with varying outcomes on pain and function. This systematic review and meta-analysis aims to synthesize outcomes of pain and function post-intervention and at follow-up to determine whether outcomes vary depending on the exercise strategy in both the short and long term. Electronic databases including MEDLINE, EMBASE, CINAHL, Web of Science, PubMed, PEDro, ProQuest, ScienceDirect, and EBSCOhost were searched for randomized controlled trials published between 1st of January 2005 and 31st of June 2015, comparing the outcomes of pain and function following quadriceps strengthening and hip musculature strengthening exercises in patients with PFPS. Two independent reviewers assessed each paper for inclusion and quality. Means and SDs were extracted from each included study to allow effect size calculations and comparison of results. Six randomized controlled trials met the inclusion criteria. Limited-to-moderate evidence indicates that hip abductor strengthening was associated with significantly lower pain post-intervention (SMD −0.88, 95% CI −1.28 to −0.47) and at 12 months (SMD −3.10, 95% CI −3.71 to −2.50), with large effect sizes (greater than 0.80) compared to quadriceps strengthening. Our findings suggest that incorporating hip musculature strengthening in the management of PFPS, tailored to individual ability, will improve short-term and long-term rehabilitation outcomes. Further research evaluating the effects of quadriceps and hip abductor strengthening, focusing on the reduction of anterior knee pain and improvement in function in the management of PFPS, is needed.
Cytochrome P450 17A1 (CYP17A1) catalyses the formation and metabolism of steroid hormones, which are involved in blood pressure (BP) regulation and in the pathogenesis of left ventricular hypertrophy. Therefore, altered function of CYP17A1 due to genetic variants may influence BP and left ventricular mass. Notably, genome-wide association studies have supported the role of this enzyme in BP control. Against this background, we investigated associations between single nucleotide polymorphisms (SNPs) in or near the CYP17A1 gene and BP and left ventricular mass in patients with arterial hypertension and associated cardiovascular organ damage treated according to guidelines. Patients (n = 1007, mean age 58.0 ± 9.8 years, 83% men) with arterial hypertension and a cardiac left ventricular ejection fraction (LVEF) ≥40% were enrolled in the study. Cardiac parameters of left ventricular mass, geometry and function were determined by echocardiography. The cohort comprised patients with coronary heart disease (n = 823; 81.7%) and myocardial infarction (n = 545; 54.1%), with a mean LVEF of 59.9% ± 9.3%. The mean left ventricular mass index (LVMI) was 52.1 ± 21.2 g/m^2.7, and 485 (48.2%) patients had left ventricular hypertrophy. There was no significant association of any investigated SNP (rs619824, rs743572, rs1004467, rs11191548, rs17115100) with mean 24-h systolic or diastolic BP. However, carriers of the rs11191548 C allele demonstrated a 7% increase in LVMI (95% CI: 1%–12%, p = 0.017) compared to non-carriers. The CYP17A1 polymorphism rs11191548 demonstrated a significant association with LVMI in patients with arterial hypertension and preserved LVEF. Thus, CYP17A1 may contribute to cardiac hypertrophy in this clinical condition.
Background: Age-related postural misalignment, balance deficits and strength/power losses are associated with impaired functional mobility and an increased risk of falling in seniors. Core instability strength training (CIT) involves exercises that are challenging for both trunk muscles and postural control and may thus have the potential to induce benefits in trunk muscle strength, spinal mobility and balance performance. Objective: The objective was to investigate the effects of CIT on measures of trunk muscle strength, spinal mobility, dynamic balance and functional mobility in seniors. Methods: Thirty-two older adults were randomly assigned to an intervention group (INT; n = 16, aged 70.8 ± 4.1 years) that conducted a 9-week progressive CIT or to a control group (n = 16, aged 70.2 ± 4.5 years). Maximal isometric strength of the trunk flexors/extensors/lateral flexors (right, left)/rotators (right, left) as well as spinal mobility in the sagittal and the coronal plane was measured before and after the intervention program. Dynamic balance (i.e. walking 10 m on an optoelectric walkway, the Functional Reach test) and functional mobility (Timed Up and Go test) were additionally tested. Results: Program compliance was excellent, with participants of the INT group completing 92% of the training sessions. Significant group × test interactions were found for the maximal isometric strength of the trunk flexors (34%, p < 0.001), extensors (21%, p < 0.001), lateral flexors (right: 48%, p < 0.001; left: 53%, p < 0.001) and left rotators (42%, p < 0.001) in favor of the INT group. Further, training-related improvements were found for spinal mobility in the sagittal (11%, p < 0.001) and coronal plane (11%, p = 0.06), for stride velocity (9%, p < 0.05), the coefficient of variation in stride velocity (31%, p < 0.05), the Functional Reach test (20%, p < 0.05) and the Timed Up and Go test (4%, p < 0.05) in favor of the INT group.
Conclusion: CIT proved to be a feasible exercise program for seniors with a high adherence rate. Age-related deficits in measures of trunk muscle strength, spinal mobility, dynamic balance and functional mobility can be mitigated by CIT. This training regimen could be used as an adjunct or even alternative to traditional balance and/or resistance training.
Background: Deficits in strength, power and balance represent important intrinsic risk factors for falls in seniors. Objective: The purpose of this study was to investigate the relationship between variables of lower extremity muscle strength/power and balance, assessed under various task conditions. Methods: Twenty-four healthy and physically active older adults (mean age: 70 ± 5 years) were tested for their isometric strength (i.e. maximal isometric force of the leg extensors) and muscle power (i.e. countermovement jump height and power) as well as for their steady-state (i.e. unperturbed standing, 10-meter walk), proactive (i.e. Timed Up & Go test, Functional Reach Test) and reactive (i.e. perturbed standing) balance. Balance tests were conducted under single (i.e. standing or walking alone) and dual task conditions (i.e. standing or walking plus cognitive and motor interference task). Results: Significant positive correlations were found between measures of isometric strength and muscle power of the lower extremities (r values ranged between 0.608 and 0.720, p < 0.01). Hardly any significant associations were found between variables of strength, power and balance (i.e. no significant association in 20 out of 21 cases). Additionally, no significant correlations were found between measures of steady-state, proactive and reactive balance or between balance tests performed under single and dual task conditions (all p > 0.05). Conclusion: The predominantly nonsignificant correlations between different types of balance imply that balance performance is task specific in healthy and physically active seniors. Further, strength, power and balance as well as balance under single and dual task conditions seem to be independent of each other and may have to be tested and trained complementarily.
Background: Deficits in static and particularly dynamic postural control and force production have frequently been associated with an increased risk of falling in older adults. Objective: The objectives of this study were to investigate the effects of salsa dancing on measures of static/dynamic postural control and leg extensor power in seniors. Methods: Twenty-eight healthy older adults were randomly assigned to an intervention group (INT, n = 14, age 71.6 ± 5.3 years) to conduct an 8-week progressive salsa dancing programme or a control group (CON, n = 14, age 68.9 ± 4.7 years). Static postural control was measured during one-legged stance on a balance platform and dynamic postural control was obtained while walking on an instrumented walkway. Leg extensor power was assessed during a countermovement jump on a force plate. Results: Programme compliance was excellent, with participants of the INT group completing 92.5% of the dancing sessions. A tendency towards an improvement in the selected measures of static postural control was observed in the INT group as compared to the CON group. Significant group × test interactions were found for stride velocity, length and time. Post hoc analyses revealed significant increases in stride velocity and length, and concomitant decreases in stride time. However, salsa dancing did not have significant effects on various measures of gait variability and leg extensor power. Conclusion: Salsa proved to be a safe and feasible exercise programme for older adults, accompanied by a high adherence rate. Age-related deficits in measures of static and particularly dynamic postural control can be mitigated by salsa dancing in older adults. The high physical activity and fitness/mobility levels of our participants could be responsible for the nonsignificant findings in gait variability and leg extensor power.
Background: Athletes may differ in their resting metabolic rate (RMR) from the general population. However, to estimate the RMR in athletes, prediction equations that have not been validated in athletes are often used. The purpose of this study was therefore to verify the applicability of commonly used RMR predictions for use in athletes. Methods: The RMR was measured by indirect calorimetry in 17 highly trained rowers and canoeists of the German national teams (BMI 24 ± 2 kg/m2, fat-free mass 69 ± 15 kg). In addition, the RMR was predicted using Cunningham (CUN) and Harris-Benedict (HB) equations. A two-way repeated measures ANOVA was calculated to test for differences between predicted and measured RMR (α = 0.05). The root mean square percentage error (RMSPE) was calculated and the Bland-Altman procedure was used to quantify the bias for each prediction. Results: Prediction equations significantly underestimated the RMR in males (p < 0.001). The RMSPE was calculated to be 18.4% (CUN) and 20.9% (HB) in the entire group. The bias was 133 kcal/24 h for CUN and 202 kcal/24 h for HB. Conclusions: Predictions significantly underestimate the RMR in male heavyweight endurance athletes but not in females. In athletes with a high fat-free mass, prediction equations might therefore not be applicable to estimate energy requirements. Instead, measurement of the resting energy expenditure or specific prediction equations might be needed for the individual heavyweight athlete.
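The Cunningham and Harris-Benedict equations named in the abstract are standard published formulas (Cunningham 1980: RMR = 500 + 22 × FFM; the original Harris-Benedict coefficients for men, of which slightly revised variants also exist), and RMSPE and Bland-Altman bias are simple summary statistics. A minimal sketch of the comparison, using illustrative inputs rather than the study data:

```python
import math

def rmr_cunningham(ffm_kg):
    """Cunningham (1980): predicted RMR [kcal/24 h] from fat-free mass [kg]."""
    return 500 + 22 * ffm_kg

def rmr_harris_benedict_male(weight_kg, height_cm, age_y):
    """Original Harris-Benedict equation for men [kcal/24 h]."""
    return 66.473 + 13.752 * weight_kg + 5.003 * height_cm - 6.755 * age_y

def rmspe(measured, predicted):
    """Root mean square percentage error of predictions vs. measured RMR."""
    errors = [((p - m) / m) ** 2 for m, p in zip(measured, predicted)]
    return math.sqrt(sum(errors) / len(errors)) * 100

def bland_altman_bias(measured, predicted):
    """Bland-Altman bias: mean difference (measured - predicted)."""
    diffs = [m - p for m, p in zip(measured, predicted)]
    return sum(diffs) / len(diffs)

# Illustrative example (not the study data): an athlete with 69 kg fat-free mass,
# matching the mean FFM reported in the abstract.
print(rmr_cunningham(69))  # 2018 kcal/24 h
```

A positive bias (measured minus predicted), as reported for both equations, corresponds to underestimation by the prediction.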
The present study focuses on A-scrambling in Dutch, a local word-order alternation that typically signals the discourse-anaphoric status of the scrambled constituent. We use cross-modal priming to investigate whether an A-scrambled direct object gives rise to antecedent reactivation effects in the position where a movement theory would postulate a trace. Our results indicate that this is not the case, thereby providing support for a base-generation analysis of A-scrambling in Dutch.
Introduction: Adequate cognitive function in patients is a prerequisite for successful implementation of patient education and lifestyle coping in comprehensive cardiac rehabilitation (CR) programs. Although the association between cardiovascular diseases and cognitive impairments (CIs) is well known, the prevalence particularly of mild CI in CR and the characteristics of affected patients have been insufficiently investigated so far.
Methods: In this prospective observational study, 496 patients (54.5 ± 6.2 years, 79.8% men) with coronary artery disease following an acute coronary event (ACE) were analyzed. Patients were enrolled in a 3-week inpatient CR program within 14 days of discharge from the hospital. Patients were tested for CI using the Montreal Cognitive Assessment (MoCA) upon admission to and discharge from CR. Additionally, sociodemographic, clinical, and physiological variables were documented. The data were analyzed descriptively and in a multivariate stepwise backward elimination regression model with respect to CI.
Results: At admission to CR, CI (MoCA score < 26) was identified in 182 patients (36.7%). Significant differences between the CI and no-CI groups were found: compared to the no-CI group, the CI group showed a higher prevalence of smoking (65.9 vs 56.7%, P = 0.046), of heavy (physically demanding) workloads (26.4 vs 17.8%, P < 0.001), and of sick leave longer than 1 month prior to CR (28.6 vs 18.5%, P = 0.026), as well as reduced exercise capacity (102.5 vs 118.8 W, P = 0.006) and a shorter 6-min walking distance (401.7 vs 421.3 m, P = 0.021). The age- and education-adjusted model showed positive associations with CI only for sick leave of more than 1 month prior to the ACE (odds ratio [OR] 1.673, 95% confidence interval 1.07–2.79; P = 0.03) and heavy workloads (OR 2.18, 95% confidence interval 1.42–3.36; P < 0.01).
Conclusion: The prevalence of CI at admission to CR was considerable, affecting more than one-third of cardiac patients. Besides age and education level, CI was associated with heavy workloads and a longer sick leave before the ACE.
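The odds ratios above come from the age- and education-adjusted regression model; for orientation, the arithmetic behind an unadjusted odds ratio from the reported exposure prevalences can be sketched as follows (illustration only, not a reproduction of the model):

```python
def odds_ratio(p_cases, p_controls):
    """Unadjusted odds ratio from two exposure proportions:
    odds of exposure in cases divided by odds in controls."""
    odds_cases = p_cases / (1 - p_cases)
    odds_controls = p_controls / (1 - p_controls)
    return odds_cases / odds_controls

# Reported prevalence of sick leave > 1 month: 28.6% (CI) vs 18.5% (no CI)
print(round(odds_ratio(0.286, 0.185), 2))  # 1.76 unadjusted
```

The unadjusted value (about 1.76) is close to the adjusted OR of 1.673 reported by the model, which additionally controls for age and education.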
The current study investigates to what extent masked morphological priming is modulated by language-particular properties, specifically by its writing system. We present results from two masked priming experiments investigating the processing of complex Japanese words written in less common (moraic) scripts. In Experiment 1, participants performed lexical decisions on target verbs; these were preceded by primes which were either (i) a past-tense form of the same verb, (ii) a stem-related form with the epenthetic vowel -i, (iii) a semantically-related form, or (iv) a phonologically-related form. Significant priming effects were obtained for prime types (i), (ii), and (iii), but not for (iv). This pattern of results differs from previous findings on languages with alphabetic scripts, which found reliable masked priming effects for morphologically related prime/target pairs of type (i), but not for non-affixal and semantically-related primes of types (ii) and (iii). In Experiment 2, we measured priming effects for prime/target pairs which are neither morphologically, semantically, phonologically, nor (as presented in their moraic scripts) orthographically related, but which, in their commonly written form, share the same kanji, which are logograms adopted from Chinese. The results showed a significant priming effect, with faster lexical-decision times for kanji-related prime/target pairs relative to unrelated ones. We conclude that affix-stripping is insufficient to account for masked morphological priming effects across languages, but that language-particular properties (in the case of Japanese, the writing system) affect the processing of (morphologically) complex words.
Introduction
We investigated blood glucose (BG) and hormone response to aerobic high-intensity interval exercise (HIIE) and moderate continuous exercise (CON) matched for mean load and duration in type 1 diabetes mellitus (T1DM).
Material and Methods
Seven trained male subjects with T1DM performed a maximal incremental exercise test and HIIE and CON at 3 different mean intensities: below (A) and above (B) the first lactate turn point and below the second lactate turn point (C), on a cycle ergometer. Subjects were adjusted to the ultra-long-acting insulin Degludec (Tresiba/Novo Nordisk, Denmark). Before exercise, standardized meals were administered, and the short-acting insulin dose was reduced by 25% (A), 50% (B), and 75% (C) depending on mean exercise intensity. During exercise, BG, adrenaline, noradrenaline, dopamine, cortisol, glucagon, insulin-like growth factor-1, blood lactate, heart rate, and gas exchange variables were measured. For 24 h after exercise, interstitial glucose was measured by a continuous glucose monitoring system.
Results
BG decrease during HIIE was significantly smaller for B (p = 0.024) and tended to be smaller for A and C compared to CON. No differences were found for post-exercise interstitial glucose, acute hormone response, and carbohydrate utilization between HIIE and CON for A, B, and C. In HIIE, blood lactate for A (p = 0.006) and B (p = 0.004) and respiratory exchange ratio for A (p = 0.003) and B (p = 0.003) were significantly higher compared to CON but not for C.
Conclusion
Hypoglycemia did not occur during or after HIIE and CON when using ultra-long-acting insulin and applying our methodological approach for exercise prescription. HIIE led to a smaller BG decrease than CON, even though both exercise modes were matched for mean load and duration and despite the markedly higher peak workloads applied in HIIE. Therefore, HIIE and CON could be safely performed in T1DM.
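The intensity-dependent insulin reduction described in the methods (25%, 50%, and 75% for the mean-intensity zones A, B, and C) amounts to a simple lookup; the sketch below illustrates the scheme, with hypothetical function names and an assumed example dose of 8 IU:

```python
# Short-acting insulin dose reduction by mean exercise intensity zone,
# as described in the study's methods:
#   A: below the first lactate turn point   -> reduce by 25%
#   B: above the first lactate turn point   -> reduce by 50%
#   C: below the second lactate turn point  -> reduce by 75%
REDUCTION = {"A": 0.25, "B": 0.50, "C": 0.75}

def adjusted_bolus(usual_dose_iu, zone):
    """Pre-exercise short-acting insulin dose [IU] after the
    zone-dependent reduction (illustrative helper, not a clinical tool)."""
    return usual_dose_iu * (1 - REDUCTION[zone])

print(adjusted_bolus(8, "B"))  # 4.0 IU
```

The higher the planned mean intensity, the larger the reduction, reflecting the greater glucose-lowering effect of the exercise bout.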