Background: Although the health benefits of physical activity (PA) are well documented, the majority of the population is unable to implement present recommendations into their daily routine. Mobile health (mHealth) apps could help increase the level of PA. However, this is contingent on the interest of potential users.
Objective: The aim of this study was the explorative, nuanced determination of the interest in mHealth apps with respect to PA among students and staff of a university.
Methods: We conducted a Web-based survey from June to July 2015 in which students and employees from the University of Potsdam were asked about their activity level, interest in mHealth fitness apps, chronic diseases, and sociodemographic parameters.
Results: A total of 1217 students (67.3% female, 819/1217; mean age 26.0 years [SD 4.9]) and 485 employees (67.5% female, 327/485; mean age 42.7 years [SD 11.7]) participated in the survey. The recommendation for PA (3 times per week) was not met by 70.1% (340/485) of employees and 52.7% (641/1217) of students. Within these groups, 53.2% of students (341/641) and 44.2% of employees (150/340)—independent of age, sex, body mass index (BMI), and level of education or professional qualification—indicated an interest in mHealth fitness apps.
Conclusions: Even in a younger, highly educated population, the majority of respondents reported an insufficient level of PA. About half of them indicated their interest in training support. This suggests that the use of personalized mobile fitness apps may become increasingly significant for a positive change of lifestyle.
Do properties of individual languages shape the mechanisms by which they are processed? By virtue of their non-concatenative morphological structure, the recognition of complex words in Semitic languages has been argued to rely strongly on morphological information and on decomposition into root and pattern constituents. Here, we report results from a masked priming experiment in Hebrew in which we contrasted verb forms belonging to two morphological classes, Paal and Piel, which display similar properties, but crucially differ on whether they are extended to novel verbs. Verbs from the open-class Piel elicited familiar root priming effects, but verbs from the closed-class Paal did not. Our findings indicate that, similarly to other (e.g., Indo-European) languages, down-to-the-root decomposition in Hebrew does not apply to stems of non-productive verbal classes. We conclude that the Semitic word processor is less unique than previously thought: Although it operates on morphological units that are combined in a non-linear way, it engages the same universal mechanisms of storage and computation as those seen in other languages.
Background: Deception can distort psychological tests on socially sensitive topics. Understanding the cerebral processes involved in such faking can be useful for detecting and preventing deception. Previous research shows that faking a brief implicit association test (BIAT) evokes a characteristic ERP response. It is not yet known whether temporarily available self-control resources moderate this response. We randomly assigned 22 participants (15 females; 24.23 ± 2.91 years old) to a counterbalanced repeated-measurements design. Participants first completed a Brief-IAT (BIAT) on doping attitudes as a baseline measure and were then instructed to fake a negative doping attitude both when self-control resources were depleted and when they were not. Cerebral activity during BIAT performance was assessed using high-density EEG.
Results: Compared with the baseline BIAT, event-related potentials showed a first interaction at the parietal P1, while significant post hoc differences were found only at the later occurring late positive potential. Here, significantly decreased amplitudes were recorded for 'normal' faking, but not in the depletion condition. In source space, enhanced activity was found for 'normal' faking in the bilateral temporoparietal junction. Behaviorally, participants faked the BIAT successfully in both conditions.
Conclusions: Results indicate that temporarily available self-control resources do not affect overt faking success on a BIAT. However, differences were found on an electrophysiological level. This indicates that while self-control resources play a negligible role in deliberate test faking on a phenotypical level, the underlying cerebral processes are markedly different.
Previous research offers equivocal results regarding the effect of social networking site use on individuals' self-esteem. We conduct a systematic literature review to examine the existing literature and develop a theoretical framework in order to classify the results. The framework proposes that self-esteem is affected by three distinct processes that incorporate self-evaluative information: social comparison processes, social feedback processing, and self-reflective processes. Due to particularities of the social networking site environment, the accessibility and quality of self-evaluative information is altered, which leads to online-specific effects on users' self-esteem. Results of the reviewed studies suggest that when a social networking site is used to compare oneself with others, it mostly results in decreases in users' self-esteem. On the other hand, receiving positive social feedback from others or using these platforms to reflect on one's own self is mainly associated with benefits for users' self-esteem. Nevertheless, inter-individual differences and the specific activities performed by users on these platforms should be considered when predicting individual effects.
Ultraschall Berlin
(2014)
Two of a kind?
(2014)
School attacks are attracting increasing attention in aggression research. Recent systematic analyses provided new insights into offense and offender characteristics. Less is known about attacks in institutes of higher education (e.g., universities). It is therefore questionable whether the term “school attack” should be limited to institutions of general education or could be extended to institutions of higher education. Scientific literature is divided in distinguishing or unifying these two groups and reports similarities as well as differences. We researched 232 school attacks and 45 attacks in institutes of higher education throughout the world and conducted systematic comparisons between the two groups. The analyses yielded differences in offender (e.g., age, migration background) and offense characteristics (e.g., weapons, suicide rates), and some similarities (e.g., gender). Most differences can apparently be accounted for by offenders’ age and situational influences. We discuss the implications of our findings for future research and the development of preventative measures.
Many studies demonstrated interactions between number processing and either spatial codes (effects of spatial-numerical associations) or visual size-related codes (size-congruity effect). However, the interrelatedness of these two number couplings is still unclear. The present study examines the simultaneous occurrence of space- and size-numerical congruency effects and their interactions both within and across trials. In a magnitude judgment task, physically small or large digits were presented left or right of the screen center. The reaction time analysis revealed that space- and size-congruency effects coexisted in parallel and combined additively. Moreover, a selective sequential modulation of the two congruency effects was found. The size-congruency effect was reduced after size-incongruent trials. The space-congruency effect, however, was only affected by the previous space congruency. The observed independence of spatial-numerical and within-magnitude associations is interpreted as evidence that the two couplings reflect different attributes of numerical meaning, possibly related to ordinality and cardinality.
In Germany, active bat rabies surveillance was conducted between 1993 and 2012. A total of 4546 oropharyngeal swab samples from 18 bat species were screened for the presence of EBLV-1-, EBLV-2- and BBLV-specific RNA. Overall, 0.15% of oropharyngeal swab samples tested EBLV-1 positive, with the majority originating from Eptesicus serotinus. Interestingly, out of seven RT-PCR-positive oropharyngeal swabs subjected to virus isolation, viable virus was isolated from a single serotine bat (E. serotinus). Additionally, 1226 blood samples were tested serologically, and varying virus-neutralizing antibody titres were found in at least eight different bat species. The detection of viral RNA and seroconversion in repeatedly sampled serotine bats indicates long-term circulation of the virus in a particular bat colony. The limitations of random-based active bat rabies surveillance compared with passive bat rabies surveillance, and the possible application of targeted approaches in future research on bat lyssavirus dynamics and maintenance, are discussed.
Background: Chronic kidney disease (CKD) is a frequent comorbidity among elderly patients and those with cardiovascular disease. CKD carries prognostic relevance. We aimed to describe patient characteristics, risk factor management and control status of patients in cardiac rehabilitation (CR), differentiated by presence or absence of CKD.
Design and methods: Data from 92,071 inpatients with adequate information to calculate glomerular filtration rate (GFR) based on the Cockcroft-Gault formula were analyzed at the beginning and the end of a 3-week CR stay. CKD was defined as estimated GFR <60 ml/min/1.73 m(2).
Results: Compared with non-CKD patients, CKD patients were significantly older (72.0 versus 58.0 years) and more often had diabetes mellitus, arterial hypertension, and atherothrombotic manifestations (previous stroke, peripheral arterial disease), but fewer were current or previous smokers or had a CHD family history. Exercise capacity was much lower in CKD patients (59 versus 92 Watts). Fewer patients with CKD were treated with percutaneous coronary intervention (PCI), but more had coronary artery bypass graft (CABG) surgery. Patients with CKD, compared with non-CKD patients, less frequently received statins, acetylsalicylic acid (ASA), clopidogrel, beta blockers, and angiotensin converting enzyme (ACE) inhibitors, and more frequently received angiotensin receptor blockers, insulin, and oral anticoagulants. In CKD, mean low density lipoprotein cholesterol (LDL-C), total cholesterol, and high density lipoprotein cholesterol (HDL-C) were slightly higher at baseline, while triglycerides were substantially lower. This lipid pattern did not change at the discharge visit, but overall control rates for all described parameters (with the exception of HDL-C) improved substantially. At discharge, systolic blood pressure (BP) was higher in CKD (124 versus 121 mmHg) and diastolic BP was lower (72 versus 74 mmHg). At discharge, 68.7% of CKD versus 71.9% of non-CKD patients had LDL-C <100 mg/dl. Physical fitness on exercise testing improved substantially in both groups. When the Modification of Diet in Renal Disease (MDRD) formula was used for CKD classification, there was no clinically relevant change in these results.
Conclusion: Within a short period of 3-4 weeks, CR led to substantial improvements in key risk factors such as lipid profile, blood pressure, and physical fitness for all patients, even if CKD was present.
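The CKD cutoff in the study above is derived from creatinine clearance estimated with the Cockcroft-Gault formula. As a minimal sketch of that classification step (not the study's actual analysis code; the function names and example patient values are illustrative), it could look like:

```python
def cockcroft_gault(age_years: float, weight_kg: float,
                    serum_creatinine_mg_dl: float, sex: str) -> float:
    """Estimate creatinine clearance (mL/min) via Cockcroft-Gault.

    Note: Cockcroft-Gault estimates creatinine clearance rather than a
    body-surface-normalized GFR; the study applies its <60 cutoff to
    this estimate.
    """
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    if sex == "female":
        crcl *= 0.85  # standard correction factor for women
    return crcl


def has_ckd(estimated_gfr: float) -> bool:
    """Apply the study's CKD definition: estimated GFR < 60."""
    return estimated_gfr < 60


# Illustrative patient: 72-year-old man, 80 kg, serum creatinine 1.4 mg/dL
crcl = cockcroft_gault(72, 80, 1.4, "male")
print(round(crcl, 1), has_ckd(crcl))  # ~54.0 mL/min -> classified as CKD
```

The abstract notes that reclassifying patients with the MDRD formula (which estimates GFR per 1.73 m² directly from serum creatinine, age, sex, and ethnicity) did not materially change the results.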
The aim of the present study was to examine how different types of tracking— between-school streaming, within-school streaming, and course-by-course tracking—shape students’ mathematics self-concept. This was done in an internationally comparative framework using data from the Programme for International Student Assessment (PISA). After controlling for individual and track mean achievement, results indicated that generally for students in course-by-course tracking, high-track students had higher mathematics self-concepts and low-track students had lower mathematics self-concepts. For students in between-school and within-school streaming, the reverse pattern was found. These findings suggest a solution to the ongoing debate about the effects of tracking on students’ academic self-concept and suggest that the reference groups to which students compare themselves differ according to the type of tracking.
With its transparent orthography, Standard Indonesian is spoken by over 160 million inhabitants and is the primary language of instruction in education and the government in Indonesia. An assessment battery of reading and reading-related skills was developed as a starting point for the diagnosis of dyslexia in beginner learners. Founded on the International Dyslexia Association’s definition of dyslexia, the test battery comprises nine empirically motivated reading and reading-related tasks assessing word reading, pseudoword reading, arithmetic, rapid automatized naming, phoneme deletion, forward and backward digit span, verbal fluency, orthographic choice (spelling), and writing. The test was validated by computing the relationships between the outcomes on the reading-skills and reading-related measures by means of correlation and factor analyses. External variables, i.e., school grades and teacher ratings of the reading and learning abilities of individual students, were also utilized to provide evidence of its construct validity. Four variables were found to be significantly related with reading-skill measures: phonological awareness, rapid naming, spelling, and digit span. The current study on reading development in Standard Indonesian confirms findings from other languages with transparent orthographies and suggests a test battery including preliminary norm scores for screening and assessment of elementary school children learning to read Standard Indonesian.
It is a common finding across languages that young children have problems in understanding patient-initial sentences. We used Tagalog, a verb-initial language with a reliable voice-marking system and highly frequent patient voice constructions, to test the predictions of several accounts that have been proposed to explain this difficulty: the frequency account, the Competition Model, and the incremental processing account. Study 1 presents an analysis of Tagalog child-directed speech, which showed that the dominant argument order is agent-before-patient and that morphosyntactic markers are highly valid cues to thematic role assignment. In Study 2, we used a combined self-paced listening and picture verification task to test how Tagalog-speaking adults and 5- and 7-year-old children process reversible transitive sentences. Results showed that adults performed well in all conditions, while children's accuracy and listening times for the first noun phrase indicated more difficulty in interpreting patient-initial sentences in the agent voice compared to the patient voice. The patient voice advantage is partly explained by both the frequency account and incremental processing account.
Using the eye-movement monitoring technique in two reading comprehension experiments, this study investigated the timing of constraints on wh-dependencies (so-called island constraints) in first- and second-language (L1 and L2) sentence processing. The results show that both L1 and L2 speakers of English are sensitive to extraction islands during processing, suggesting that memory storage limitations affect L1 and L2 comprehenders in essentially the same way. Furthermore, these results show that the timing of island effects in L1 compared to L2 sentence comprehension is affected differently by the type of cue (semantic fit versus filled gaps) signaling whether dependency formation is possible at a potential gap site. Even though L1 English speakers showed immediate sensitivity to filled gaps but not to lack of semantic fit, proficient German-speaking learners of English as a L2 showed the opposite sensitivity pattern. This indicates that initial wh-dependency formation in L2 processing is based on semantic feature matching rather than being structurally mediated as in L1 comprehension.
We report findings from psycholinguistic experiments investigating the detailed timing of processing morphologically complex words by proficient adult second (L2) language learners of English in comparison to adult native (L1) speakers of English. The first study employed the masked priming technique to investigate -ed forms with a group of advanced Arabic-speaking learners of English. The results replicate previously found L1/L2 differences in morphological priming, even though in the present experiment an extra temporal delay was offered after the presentation of the prime words.
The second study examined the timing of constraints against inflected forms inside derived words in English using the eye-movement monitoring technique and an additional acceptability judgment task with highly advanced Dutch L2 learners of English in comparison to adult L1 English controls. Whilst offline the L2 learners performed native-like, the eye-movement data showed that their online processing was not affected by the morphological constraint against regular plurals inside derived words in the same way as in native speakers. Taken together, these findings indicate that L2 learners are not just slower than native speakers in processing morphologically complex words, but that the L2 comprehension system employs real-time grammatical analysis (in this case, morphological information) less than the L1 system.
Background The prognostic effect of multi-component cardiac rehabilitation (CR) in the modern era of statins and acute revascularisation remains controversial. Focusing on actual clinical practice, the aim was to evaluate the effect of CR on total mortality and other clinical endpoints after an acute coronary event.
Design Structured review and meta-analysis.
Methods Randomised controlled trials (RCTs), retrospective controlled cohort studies (rCCSs) and prospective controlled cohort studies (pCCSs) evaluating patients after acute coronary syndrome (ACS), coronary artery bypass grafting (CABG) or mixed populations with coronary artery disease (CAD) were included, provided the index event was in 1995 or later.
Results Out of n=18,534 abstracts, 25 studies were identified for final evaluation (RCT: n=1; pCCS: n=7; rCCS: n=17), including n=219,702 patients (after ACS: n=46,338; after CABG: n=14,583; mixed populations: n=158,781; mean follow-up: 40 months). Heterogeneity in design, biometrical assessment of results and potential confounders was evident. CCSs evaluating ACS patients showed a significantly reduced mortality for CR participants (pCCS: hazard ratio (HR) 0.37, 95% confidence interval (CI) 0.20-0.69; rCCS: HR 0.64, 95% CI 0.49-0.84; odds ratio 0.20, 95% CI 0.08-0.48), but the single RCT fulfilling Cardiac Rehabilitation Outcome Study (CROS) inclusion criteria showed neutral results. CR participation was also associated with reduced mortality after CABG (rCCS: HR 0.62, 95% CI 0.54-0.70) and in mixed CAD populations.
Conclusions CR participation after ACS and CABG is associated with reduced mortality even in the modern era of CAD treatment. However, the heterogeneity of study designs and CR programmes highlights the need for defining internationally accepted standards in CR delivery and scientific evaluation.
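The pooled effect measures reported above (hazard ratios and odds ratios with 95% confidence intervals) follow standard definitions. As an illustrative sketch with purely hypothetical 2×2 counts (not data from the review), an odds ratio and its Wald-type 95% CI can be computed like this:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """Odds ratio and 95% CI from a 2x2 table.

    a, b: deaths / survivors in the exposed (e.g., CR) group
    c, d: deaths / survivors in the control group
    CI uses the normal approximation on the log odds ratio.
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

# Hypothetical example: 10/100 deaths with CR vs. 30/100 without
or_, lo, hi = odds_ratio_ci(10, 90, 30, 70)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR (or HR) below 1 with a CI that excludes 1, as in the ACS and CABG results above, indicates a statistically significant mortality reduction for CR participants.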
A particular form of social pain is invalidation. Therefore, this study (a) investigates whether patients with chronic low back pain experience invalidation, (b) whether it has an influence on their pain, and (c) explores whether various social sources (e.g. partner and work) influence physical pain differentially. A total of 92 patients completed questionnaires; for analysis, Pearson's correlation coefficients and hierarchical linear regression analyses were conducted. These indicated a significant association between discounting and disability due to pain (β = .29, p < .05). In particular, discounting by the partner was linked to higher disability (β = .28, p < .05).
According to recent literature, sodium bicarbonate (NaHCO3) has been proposed as a performance-enhancing aid that works by reducing acidosis during exercise. The aim of the current review is to investigate whether the duration of exercise is an essential factor for the effect of NaHCO3. To collect the latest studies from the PubMed electronic database, publication time was restricted from December 2006 to December 2016; the search was updated in July 2018. The studies were divided into exercise durations of >4 or ≤4 minutes for easier comparability of their effects in different exercises. Only randomized controlled trials were included in this review. Of the 775 studies, 35 met the inclusion criteria. Study design, subjects, effects, and outcome criteria were inconsistent across the studies. Seventeen of these studies reported performance-enhancing effects after supplementing NaHCO3. Eleven of twenty studies with an exercise duration of ≤4 minutes showed positive results and four showed diverse results after supplementing NaHCO3; on the other hand, six of fifteen studies with an exercise duration of >4 minutes showed performance-enhancing effects and two showed diverse results. Consequently, the duration of exercise might influence whether NaHCO3 supplementation enhances performance, but to what extent remains unclear due to the inconsistencies in the study results.