Strukturbereich Kognitionswissenschaften
Various behavioural studies show that semantic typicality (TYP) and age of acquisition (AOA) of a specific word influence processing time and accuracy during the performance of lexical-semantic tasks. This study examines the influence of TYP and AOA on semantic processing at the behavioural (response times and accuracy data) and electrophysiological levels using an auditory category-member-verification task. Reaction time data reveal independent TYP and AOA effects, while the accuracy data and the event-related potentials predominantly show effects of TYP. The present study thus confirms previous findings and extends evidence from the visual modality to the auditory modality, demonstrating a modality-independent influence on semantic word processing. However, with regard to the influence of AOA, the diverging results raise questions about the origin of AOA effects as well as about the interpretation of offline and online data. Hence, the results are discussed against the background of recent theories on N400 correlates in semantic processing. In addition, an argument is made in favour of a complementary use of research techniques. (C) 2015 Elsevier Ltd. All rights reserved.
Continuous treatment with antidementia drugs in Germany 2003-2013: a retrospective database analysis
(2015)
Background: Continuous treatment is an important indicator of medication adherence in dementia. However, long-term studies in larger clinical settings are lacking, and little is known about moderating effects of patient and service characteristics.
Methods: Data from 12,910 outpatients with dementia (mean age 79.2 years; SD = 7.6 years) treated between January 2003 and December 2013 in Germany were included. Continuous treatment was analysed using Kaplan-Meier curves and log-rank tests. In addition, multivariate Cox regression models were fitted with continuous treatment as dependent variable and the predictors antidementia agent, age, gender, medical comorbidities, physician specialty, and health insurance status.
Results: After one year of follow-up, nearly 60% of patients continued drug treatment. Donepezil (HR: 0.88; 95% CI: 0.82-0.95) and memantine (HR: 0.85; 0.79-0.91) patients were less likely to discontinue treatment than rivastigmine users. Patients were also less likely to discontinue if they were treated by specialist physicians rather than by general practitioners (HR: 0.44; 0.41-0.48). Younger male patients and patients with private health insurance had a lower discontinuation risk. Regarding comorbidity, patients were more likely to be continuously treated with the index substance if heart failure or hypertension had been diagnosed at baseline.
Conclusions: Our results imply that besides type of antidementia agent, involvement of a specialist in the complex process of prescribing antidementia drugs can provide meaningful benefits to patients, in terms of more disease-specific and continuous treatment.
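The Kaplan-Meier analysis named in the Methods can be illustrated with a minimal sketch. This is generic code, not the study's actual analysis; the function name and toy data are my own:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times: follow-up time per patient; events: 1 = discontinuation
    observed, 0 = censored. Returns a list of (time, S(t)) pairs,
    one per distinct event time.
    """
    data = sorted(zip(times, events))  # sort patients by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Count events (d) and all exits (m) at this time point.
        d = sum(e for tt, e in data if tt == t)
        m = sum(1 for tt, e in data if tt == t)
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # step down at each event time
            curve.append((t, surv))
        n_at_risk -= m
        i += m
    return curve
```

The group-wise curves produced this way are what a log-rank test then compares.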
Cytochrome P450 17A1 (CYP17A1) catalyses the formation and metabolism of steroid hormones, which are involved in blood pressure (BP) regulation and in the pathogenesis of left ventricular hypertrophy. Therefore, altered function of CYP17A1 due to genetic variants may influence BP and left ventricular mass. Notably, genome-wide association studies support the role of this enzyme in BP control. Against this background, we investigated associations of single nucleotide polymorphisms (SNPs) in or near the CYP17A1 gene with BP and left ventricular mass in patients with arterial hypertension and associated cardiovascular organ damage treated according to guidelines. Patients (n = 1007, mean age 58.0 +/- 9.8 years, 83% men) with arterial hypertension and a left ventricular ejection fraction (LVEF) of at least 40% were enrolled in the study. Cardiac parameters of left ventricular mass, geometry and function were determined by echocardiography. The cohort comprised patients with coronary heart disease (n = 823; 81.7%) and myocardial infarction (n = 545; 54.1%) with a mean LVEF of 59.9% +/- 9.3%. The mean left ventricular mass index (LVMI) was 52.1 +/- 21.2 g/m^2.7, and 485 (48.2%) patients had left ventricular hypertrophy. There was no significant association of any investigated SNP (rs619824, rs743572, rs1004467, rs11191548, rs17115100) with mean 24 h systolic or diastolic BP. However, carriers of the rs11191548 C allele demonstrated a 7% increase in LVMI (95% CI: 1%-12%, p = 0.017) compared to non-carriers. The CYP17A1 polymorphism rs11191548 demonstrated a significant association with LVMI in patients with arterial hypertension and preserved LVEF. Thus, CYP17A1 may contribute to cardiac hypertrophy in this clinical condition.
Background: Evidence that home telemonitoring for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health economic advantage.
Methods: Between January 2010 and June 2013, patients with a confirmed diagnosis of CHF were enrolled and randomly assigned to 2 study groups comprising usual care with and without an interactive bi-directional remote monitoring system (Motiva (R)). The primary endpoint in CardioBBEAT is the Incremental Cost-Effectiveness Ratio (ICER) established by the groups' difference in total cost and in the combined clinical endpoint "days alive and not in hospital nor inpatient care per potential days in study" within the follow-up of 12 months.
Results: A total of 621 predominantly male patients were enrolled, of whom 302 were assigned to the intervention group and 319 to the control group. Ischemic cardiomyopathy was the leading cause of heart failure. Despite randomization, subjects in the control group were more often in NYHA functional class III-IV and exhibited peripheral edema and renal dysfunction more often. Additionally, the control and intervention groups differed in heart rhythm disorders. No differences existed regarding risk factor profile, comorbidities, echocardiographic parameters (especially left ventricular diastolic diameter and ejection fraction), functional test results, medication, or quality of life. While the observed baseline differences may well be due to chance, they are of clinical relevance. Therefore, the statistical analysis plan was extended to include analyses adjusted for the baseline imbalances.
Conclusions: CardioBBEAT provides prospective outcome data on both the clinical and the health economic impact of home telemonitoring in CHF. The study is distinguished by its use of a randomized controlled trial (RCT) design, providing a high level of evidence, along with actual cost data obtained from health insurance companies. Its results are conducive to informed political and economic decision-making with regard to home telemonitoring solutions as an option for health care. Overall, it contributes to developing advanced health economic evaluation instruments to be deployed within the specific context of the German Health Care System.
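The ICER that serves as the primary endpoint is, by definition, the between-group difference in total cost divided by the between-group difference in the combined clinical endpoint. A minimal sketch with hypothetical numbers (the function name and values are illustrative, not study data):

```python
def icer(cost_intervention, cost_control, effect_intervention, effect_control):
    """Incremental Cost-Effectiveness Ratio: extra cost per extra unit
    of effect (here the effect would be 'days alive and not in hospital
    nor inpatient care per potential days in study')."""
    delta_cost = cost_intervention - cost_control
    delta_effect = effect_intervention - effect_control
    if delta_effect == 0:
        raise ValueError("ICER undefined: no difference in effect")
    return delta_cost / delta_effect

# Hypothetical example: telemonitoring costs 2000 more per patient and
# yields 20 additional days alive and out of hospital.
extra_cost_per_day = icer(12000.0, 10000.0, 320.0, 300.0)
```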
Accentual preferences and predictability: An acceptability study on split intransitivity in German
(2015)
The difference in the default prosodic realization of simple sentences with unergative vs. unaccusative/passive verbs (assigning early nuclear accent with unaccusative/passive verbs but late nuclear accent with unergative verbs) is often related to the syntactic distinction of their nominative arguments as starting off in different hierarchical positions. Alternative accounts try to trace this prosodic variation back to asymmetries in the semantic or pragmatic contribution of the verb to an utterance. The present article investigates the interaction of the assignment of default nuclear accent with the predictability of the verb. In an experimental study testing the acceptability of nuclear accent assignment, we confirmed that the predictability of the verb influences accentual preferences (such that highly predictable verbs are preferably not accented). However, the experiment also reveals that the unaccusativity distinction cannot be accounted for by means of pragmatic phenomena of this type: the two verb classes are associated with distinct accentual patterns in the baseline condition, that is, without the predictability manipulation. (C) 2014 Elsevier B.V. All rights reserved.
This paper reports the results of a production experiment that explores the prosodic realization of focus in Hungarian, a language that is characterized by obligatory syntactic focus marking. Our study investigates narrow focus in sentences in which focus is unambiguously marked by syntactic means, comparing it to broad focus sentences. Potential independent effects of the salience (textual givenness) of the background of the narrow focus and the contrastiveness of the focus are controlled for and are also examined.
The results show that both continuous phonetic measures and categorical factors such as the distribution of contour types are affected by the focus-related factors, despite the presence of syntactic focus marking. The phonetic effects found are mostly parallel to those of typical prosodic focus marking languages like English. The prosodic prominence required of focus is realized through changes to the scaling and slope of F0 targets and contours. The asymmetric prominence relation between the focus and the background can be expressed not only by the phonetic marking of the prominence of the focused element, but also by the phonetic marking of the reduced prominence of the background. Furthermore, contrastiveness of focus and (textual) givenness of the background show independent phonetic effects, both of them affecting the realization of the background. These results are argued to shed light on alternative approaches to the information structural notion of contrastive focus and the relation between the notions of focus and givenness. (C) 2014 Elsevier B.V. All rights reserved.
Validation of two accelerometers to determine mechanical loading of physical activities in children
(2015)
The purpose of this study was to assess the validity of accelerometers against force plates (i.e., ground reaction force (GRF)) during the performance of different tasks of daily physical activity in children. Thirteen children (mean age 10.1 years (range 5.4-15.7), 3 girls) wore two accelerometers (ActiGraph GT3X+ (ACT), GENEA (GEN)) at the hip, both providing raw acceleration signals at 100 Hz. Participants completed different tasks (walking, jogging, running, landings from boxes of different heights, rope skipping, dancing) on a force plate. GRF was collected for one step per trial (10 trials) for ambulatory movements and for all landings (10 trials), rope skips and dance procedures. Accelerometer outputs as peak loading (g) per activity were averaged. ANOVA, correlation analyses and Bland-Altman plots were computed to determine the validity of the accelerometers against GRF. There was a main effect of task, with increasing acceleration values in tasks with increasing locomotion speed and landing height (P<0.001). Data from ACT and GEN correlated with GRF (r=0.90 and 0.89, respectively) and with each other (r=0.98), but both accelerometers consistently overestimated GRF. The new generation of accelerometer models that allow raw signal detection are reasonably accurate for measuring impact loading of bone in children, although they systematically overestimate GRF.
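Bland-Altman analysis, as used here for agreement between accelerometer output and GRF, reduces to the mean difference (bias) and the 95% limits of agreement. A generic sketch (function name and example values are illustrative, not study data):

```python
from statistics import mean, stdev

def bland_altman_limits(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement for paired
    measurements (e.g. accelerometer peak loading vs. force-plate GRF).

    Returns (bias, lower_limit, upper_limit). A positive bias means
    method_a systematically overestimates method_b.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)            # systematic over-/underestimation
    spread = 1.96 * stdev(diffs)  # ~95% of differences fall within bias +/- spread
    return bias, bias - spread, bias + spread
```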
The direction of object enumeration reflects children's enculturation, but previous work on the development of such spatial preferences has been inconsistent. Therefore, we documented directional preferences in finger counting, object counting, and picture naming for children (4 groups from 3 to 6 years, N = 104) and adults (N = 56). We found a right-side preference for finger counting in 3- to 6-year-olds and a left-side preference for counting objects and naming pictures by 6 years of age. Children were consistent in their spatial preferences when comparing object counting and picture naming, but not in other task pairings. Finally, spatial preferences were not related to cardinality comprehension. These results, together with other recent work, suggest a gradual development of spatial-numerical associations from early non-directional mappings into culturally constrained directional mappings.
The interruption of learning processes by breaks filled with diverse activities is common in everyday life. This study investigated the effects of active computer gaming and passive relaxation (rest and music) breaks on auditory versus visual memory performance. Young adults were exposed to breaks involving (a) open eyes resting, (b) listening to music, and (c) playing a video game, immediately after memorizing auditory versus visual stimuli. To assess learning performance, words were recalled directly after the break (an 8:30 minute delay) and were recalled and recognized again after 7 days. Based on linear mixed-effects modeling, it was found that playing the Angry Birds video game during a short learning break impaired long-term retrieval in auditory learning but enhanced long-term retrieval in visual learning compared with the music and rest conditions. These differential effects of video games on visual versus auditory learning suggest specific interference of common break activities on learning.
There is evidence of substantial benefit of cardiac rehabilitation (CR) for patients with low exercise capacity at admission. Nevertheless, some patients are not able to perform an initial exercise stress test (EST). We aimed to describe this group using data of 1094 consecutive patients after a cardiac event (71 +/- 7 years, 78% men) enrolled in nine centres for inpatient CR. We analysed sociodemographic and clinical variables (e.g. cardiovascular risk factors, comorbidities, complications at admission), amount of therapy (e.g. exercise training, nursing care) and the results of the initial and the final 6-min walking test (6MWT) with respect to the application of an EST. Fifteen per cent of patients did not undergo an EST (non-EST group). In multivariable analysis, the probability of obtaining an EST was higher for men [odds ratio (OR) 1.89, P=0.01], a 6MWT (per 10 m, OR 1.07, P<0.01) and lower for patients with diabetes mellitus (OR 0.48, P<0.01), NYHA-class III/IV (OR 0.27, P<0.01), osteoarthritis (OR 0.39, P<0.01) and a longer hospital stay (per 5 days, OR 0.87, P=0.02). The non-EST group received fewer therapy units of exercise training, but more units of nursing care and physiotherapy than the EST group. However, there were no significant differences between both groups in the increase of the 6MWT during CR (123 vs. 108 m, P=0.122). The present study confirms the feasibility of an EST at the start of CR as an indicator of disease severity. Nevertheless, patients without EST benefit from CR even if exercising less. Thus, there is a justified need for individualized, comprehensive and interdisciplinary CR.
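The reported odds ratios come from a multivariable model; as a simpler illustration of what an odds ratio expresses, here is the unadjusted OR from a 2x2 table (the function name and counts are invented for illustration, not study data):

```python
def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Unadjusted odds ratio from a 2x2 table.

    OR > 1: the exposure (e.g. male sex) is associated with higher odds
    of the event (e.g. receiving an exercise stress test); OR < 1 with
    lower odds.
    """
    odds_exposed = exposed_event / exposed_no_event
    odds_unexposed = unexposed_event / unexposed_no_event
    return odds_exposed / odds_unexposed

# Hypothetical counts: 30/100 exposed and 20/100 unexposed had the event.
example_or = odds_ratio(30, 70, 20, 80)
```

In the multivariable setting, each reported OR is additionally adjusted for the other predictors in the model.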
Working memory load-dependent brain response predicts behavioral training gains in older adults
(2014)
In the domain of working memory (WM), a sigmoid-shaped relationship between WM load and brain activation patterns has been demonstrated in younger adults. It has been suggested that age-related alterations of this pattern are associated with changes in neural efficiency and capacity. At the same time, WM training studies have shown that some older adults are able to increase their WM performance through training. In this study, functional magnetic resonance imaging during an n-back WM task at different WM load levels was applied to compare blood oxygen level-dependent (BOLD) responses between younger and older participants and to predict gains in WM performance after a subsequent 12-session WM training procedure in older adults. We show that increased neural efficiency and capacity, as reflected by more "youth-like" brain response patterns in regions of interest of the frontoparietal WM network, were associated with better behavioral training outcome beyond the effects of age, sex, education, gray matter volume, and baseline WM performance. Furthermore, at low difficulty levels, decreases in BOLD response were found after WM training. Results indicate that both neural efficiency (i.e., decreased activation at comparable performance levels) and capacity (i.e., increasing activation with increasing WM load) of a WM-related network predict plasticity of the WM system, whereas WM training may specifically increase neural efficiency in older adults.
Introduction: Cardiac rehabilitation is designed for patients suffering from cardiovascular diseases or functional disabilities. The aim of cardiac rehabilitation is to improve overall physical health, psychological well-being, physical function and the ability to participate in social life, and to help patients change their habits. Given the heterogeneity of these aims, measuring the effect of cardiac rehabilitation remains a challenge. This study proposes a concept for assessing the effects of cardiac rehabilitation in terms of individual change in relevant quality indicators.
Methods: With EVA-Reha cardiac rehabilitation, the Medical Advisory Service of the Statutory Health Insurance Funds in Rhineland-Palatinate, Alzey (MDK Rheinland-Pfalz) developed software to collect a data set including sociodemographic and diagnostic data as well as the results of specific assessments. The project was funded by the Techniker Krankenkasse, Hamburg, and supported by participating rehabilitation centers. From 1 July 2010 to 30 June 2011, 1309 patients (mean age 71.5 years, 76.1% men) from 13 rehabilitation centers were consecutively enrolled. Thirteen quality indicators in 3 scales were developed for the evaluation of cardiac rehabilitation: 1) cardiovascular risk factors (blood pressure, LDL cholesterol, triglycerides), 2) exercise capacity (resting heart rate, maximal exercise capacity, maximal walking distance, heart failure [NYHA classification], and angina pectoris [CCS classification]) and 3) subjective health (IRES-24: pain, somatic health, psychological well-being and depression, as well as anxiety on the HADS). The study was prospective; patient data were assessed at entry to and discharge from rehabilitation. To measure the success of rehabilitation, each parameter was graded into severity classes at entry and discharge. For each of the 13 quality indicators, changes in severity class were rated in a rating matrix. Indicators with no requirement for medical care either at entry or at discharge were not rated.
Results: The grading into severity classes as well as the minimal important differences are given for the 13 quality indicators. The outcome of rehabilitation can be demonstrated in a suitable form by rating the 13 quality indicators in a clinical population. The rating model discriminates well between clinically changed and unchanged patients on the quality indicators.
Conclusion: The result of cardiac rehabilitation can be assessed with 13 quality indicators measured at entry and discharge of the rehabilitation program. A change into a more favorable category by the end of rehabilitation was counted as a success. The 13 quality indicators can be used to assess the individual result as well as the result of a population (e.g., all patients of a clinic in a specific time period). In addition, the assessment and rating of relevant quality indicators can be used for comparisons between rehabilitation centers.
Background: Outcome quality management requires the consecutive registration of defined variables. The aim was to identify relevant parameters in order to objectively assess the in-patient rehabilitation outcome.
Methods: From February 2009 to June 2010, 1253 patients (70.9 +/- 7.0 years, 78.1% men) at 12 rehabilitation clinics were enrolled. Items concerning sociodemographic data, the impairment group (surgery, conservative/interventional treatment), cardiovascular risk factors, structural and functional parameters and subjective health were tested with respect to their measurability, sensitivity to change and their propensity to be influenced by rehabilitation.
Results: The majority of patients (61.1%) were referred for rehabilitation after cardiac surgery, 38.9% after conservative or interventional treatment for an acute coronary syndrome. Functionally relevant comorbidities were seen in 49.2% (diabetes mellitus, stroke, peripheral artery disease, chronic obstructive lung disease). In three key areas 13 parameters were identified as being sensitive to change and subject to modification by rehabilitation: cardiovascular risk factors (blood pressure, low-density lipoprotein cholesterol, triglycerides), exercise capacity (resting heart rate, maximal exercise capacity, maximal walking distance, heart failure, angina pectoris) and subjective health (IRES-24 (indicators of rehabilitation status): pain, somatic health, psychological well-being and depression as well as anxiety on the Hospital Anxiety and Depression Scale).
Conclusion: The outcome of in-patient rehabilitation in elderly patients can be comprehensively assessed by the identification of appropriate key areas, that is, cardiovascular risk factors, exercise capacity and subjective health. This may well serve as a benchmark for internal and external quality management.
Background: Travel-related conditions have an impact on the quality of oral anticoagulation therapy (OAT) with vitamin K antagonists. No predictors of travel activity or of travel-associated haemorrhagic or thromboembolic complications in patients on OAT are known.
Methods: A standardised questionnaire was sent to 2500 patients on long-term OAT in Austria, Switzerland and Germany; 997 questionnaires were returned (response rate 39.9%). Ordinal or logistic regression models were applied, with travel activity before and after the onset of OAT or travel-associated haemorrhages and thromboembolic complications as outcome measures.
Results: 43.4% changed travel habits since the onset of OAT, with 24.9% and 18.5% reporting decreased or increased travel activity, respectively. Long-distance worldwide travel before OAT or having suffered thromboembolic complications was associated with reduced travel activity. Increased travel activity was associated with more intensive travel experience, increased duration of OAT, higher education, or performing patient self-management (PSM). Travel-associated haemorrhages or thromboembolic complications were reported by 6.5% and 0.9% of the patients, respectively. Former thromboembolic complications, former bleedings and PSM were significant predictors of travel-associated complications.
Conclusions: OAT also increases travel intensity. Specific medical advice prior to travelling to prevent complications should be given especially to patients with former bleedings or thromboembolic complications and to those performing PSM. (C) 2014 Elsevier Ltd. All rights reserved.
The purpose of this study was to compare static balance performance and muscle activity during one-leg standing on the dominant and nondominant leg under various sensory conditions with increased levels of task difficulty. Thirty healthy young adults (age: 23 +/- 2 years) performed one-leg standing tests for 30 s under three sensory conditions (i.e., eyes open/firm ground; eyes open/foam ground [elastic pad on top of the balance plate]; eyes closed/firm ground). Center of pressure displacements and activity of four lower leg muscles (i.e., m. tibialis anterior [TA], m. soleus [SOL], m. gastrocnemius medialis [GAS], m. peroneus longus [PER]) were analyzed. An increase in sensory task difficulty resulted in deteriorated balance performance (P < .001, effect size [ES] = .57-2.54) and increased muscle activity (P < .001, ES = .50-1.11) for all but two muscles (i.e., GAS, PER). However, regardless of the sensory condition, one-leg standing on the dominant as compared with the nondominant limb did not produce statistically significant differences in various balance (P > .05, ES = .06-.22) and electromyographic (P > .05, ES = .03-.13) measures. This indicates that the dominant and the nondominant leg can be used interchangeably during static one-leg balance testing in healthy young adults.
Computer-aided dosage management of phenprocoumon anticoagulation therapy: clinical validation
(2014)
A recently developed multiparameter computer-aided expert system (TheMa) for guiding anticoagulation with phenprocoumon (PPC) was validated in a prospective investigation in 22 patients. The PPC-INR response curve resulting from physician-guided dosage was compared to INR values calculated by "twin calculation" from TheMa-recommended dosage. Additionally, TheMa was used to predict the optimal time to perform surgery or invasive procedures after interruption of anticoagulation therapy. Results: Comparison of physician-guided and TheMa-guided anticoagulation showed almost identical accuracy on three quantitative measures: polygon integration method (area around the INR target) 616.17 vs. 607.86, INR hits in the target range 166 vs. 161, and TTR (time in therapeutic range) 63.91% vs. 62.40%. After discontinuation of anticoagulation therapy, calculation of the INR phase-out curve with TheMa allowed the time to reach an INR of 1.8 to be predicted with a deviation of 0.50 +/- 0.59 days. Conclusion: Guiding anticoagulation with TheMa was as accurate as physician-guided therapy. After interruption of anticoagulant therapy, TheMa may be used to calculate the optimal time for performing operations or initiating bridging therapy.
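TTR is commonly computed by linear interpolation between consecutive INR measurements (the Rosendaal method); whether TheMa uses exactly this method is an assumption on my part. A minimal sketch with a standard target range of 2.0-3.0:

```python
def ttr_rosendaal(days, inrs, low=2.0, high=3.0):
    """Fraction of time in the therapeutic INR range [low, high],
    assuming linear interpolation between measurements (Rosendaal method).

    days: measurement days in ascending order; inrs: INR at each day.
    Note: the target range and interpolation method are standard
    assumptions, not taken from the validated system.
    """
    in_range = 0.0
    total = days[-1] - days[0]
    for (t0, y0), (t1, y1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = t1 - t0
        if y0 == y1:
            # Flat segment: entirely in or entirely out of range.
            in_range += span if low <= y0 <= high else 0.0
            continue
        # Times at which the interpolated INR crosses the range limits.
        slope = (y1 - y0) / span
        t_lo = t0 + (low - y0) / slope
        t_hi = t0 + (high - y0) / slope
        lo_t, hi_t = sorted((t_lo, t_hi))
        # Overlap of the in-range window with this segment.
        in_range += max(0.0, min(t1, hi_t) - max(t0, lo_t))
    return in_range / total
```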
The proportion of elderly people in societies of western industrialized countries is continuously rising. Biologic aging induces deficits in balance and muscle strength/power in old age, which are responsible for an increased prevalence of falls. Therefore, nationwide and easy-to-administer fall prevention programs have to be developed in order to contribute to the autonomy and quality of life in old age and to help reduce the financial burden on the public health care system due to the treatment of fall-related injuries. This narrative (qualitative) literature review deals with a) the reasons for an increased prevalence of falls in old age, b) important clinical tests for fall-risk assessment, and c) evidence-based intervention/training programs for fall prevention in old age. The findings of this literature review are based on a cost-free practice guide that is available to the public (via the internet) and that was created by an expert panel (i.e., geriatricians, exercise scientists, physiotherapists, geriatric therapists). The present review provides the scientific foundation of the practice guide.
Numerical cognitions such as spatial-numerical associations have been observed to be influenced by grounded, embodied and situated factors. For the case of finger counting, grounded and embodied influences have been reported. However, situated influences, e.g., that reported counting habits change with perception and action within a given situation, have not been systematically examined. To pursue the issue of situatedness of reported finger-counting habits, 458 participants were tested in three separate groups: (1) spontaneous condition: counting with both hands available; (2) perceptual condition: counting with a horizontal (left-to-right) perceptual arrangement of the fingers; (3) perceptual and proprioceptive condition: counting with a horizontal (left-to-right) perceptual arrangement of the fingers and with the dominant hand occupied. Reports of typical counting habits differed strongly between the three conditions: 28% reported starting to count with the left hand in the spontaneous counting condition (1), 54% in the perceptual condition (2), and 62% in the perceptual and proprioceptive condition (3). Additionally, all participants in the spontaneous counting group showed a symmetry-based counting pattern (with the thumb as number 6), while in the two other groups, a considerable number of participants exhibited a spatially continuous counting pattern (with the pinkie as number 6). Taken together, the study shows that reported finger-counting habits depend on the perceptual and proprioceptive situation and thus are strongly influenced by situated cognition. We suggest that this account reconciles apparently contradictory previous findings of different counting preferences regarding the starting hand in different examination situations.
In this article we report on early rhythmic discrimination performance of children who participated in a longitudinal study following them from birth to their 6th year of life. Thirty-four children, including 8 children with a family risk for developmental language impairment, were tested on the discrimination of trochaic and iambic disyllabic sequences when they were 4 months old. At 5 years of age, standardized measures of language performance (SETK3-5) and nonverbal intelligence (SON-R) were obtained. Overall, evidence of discrimination of the rhythmic patterns was found only for children without a family risk. The performance in early rhythmic discrimination correlated with the later outcomes in SETK3-5 subtests on sentence comprehension and morphological skills, but not with subtests related to memory performance or with nonverbal intelligence. Our results suggest that indicators of language development can be discovered as early as 4 months of age, and seem to correlate with later outcomes in rather specific language skills.
Background: Chronic kidney disease (CKD) is a frequent comorbidity among elderly patients and those with cardiovascular disease. CKD carries prognostic relevance. We aimed to describe patient characteristics, risk factor management and control status of patients in cardiac rehabilitation (CR), differentiated by presence or absence of CKD.
Design and methods: Data from 92,071 inpatients with adequate information to calculate glomerular filtration rate (GFR) based on the Cockcroft-Gault formula were analyzed at the beginning and the end of a 3-week CR stay. CKD was defined as estimated GFR <60 ml/min/1.73 m(2).
Results: Compared with non-CKD patients, CKD patients were significantly older (72.0 versus 58.0 years) and more often had diabetes mellitus, arterial hypertension, and atherothrombotic manifestations (previous stroke, peripheral arterial disease), but fewer were current or previous smokers or had a family history of CHD. Exercise capacity was much lower in CKD (59 vs. 92 watts). Fewer patients with CKD were treated with percutaneous coronary intervention (PCI), but more had coronary artery bypass graft (CABG) surgery. Patients with CKD less frequently received statins, acetylsalicylic acid (ASA), clopidogrel, beta blockers, and angiotensin converting enzyme (ACE) inhibitors than non-CKD patients, and more frequently received angiotensin receptor blockers, insulin and oral anticoagulants. In CKD, mean low density lipoprotein cholesterol (LDL-C), total cholesterol, and high density lipoprotein cholesterol (HDL-C) were slightly higher at baseline, while triglycerides were substantially lower. This lipid pattern did not change at the discharge visit, but overall control rates for all described parameters (with the exception of HDL-C) improved substantially. At discharge, systolic blood pressure (BP) was higher in CKD (124 versus 121 mmHg) and diastolic BP was lower (72 versus 74 mmHg). At discharge, 68.7% of CKD versus 71.9% of non-CKD patients had LDL-C <100 mg/dl. Physical fitness on exercise testing improved substantially in both groups. When the Modification of Diet in Renal Disease (MDRD) formula was used for CKD classification, there was no clinically relevant change in these results.
Conclusion: Within a short period of 3-4 weeks, CR led to substantial improvements in key risk factors such as lipid profile, blood pressure, and physical fitness for all patients, even if CKD was present.
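The Cockcroft-Gault estimate mentioned in the Methods follows a simple closed-form formula. The sketch below uses the standard formula for creatinine clearance in ml/min; the study's additional normalisation to 1.73 m^2 body surface area is omitted, and the function names are my own:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault
    formula: ((140 - age) * weight) / (72 * serum creatinine),
    multiplied by 0.85 for women."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def has_ckd(gfr_ml_min):
    """CKD as defined in the study: estimated GFR below 60."""
    return gfr_ml_min < 60
```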