Refine
Document Type
- Article (53)
- Postprint (15)
- Doctoral Thesis (2)
- Monograph/Edited Volume (1)
- Conference Proceeding (1)
- Habilitation Thesis (1)
- Master's Thesis (1)
Is part of the Bibliography
- yes (74)
Keywords
- allostatic load (5)
- bone remodeling (3)
- cortisol (3)
- microRNA (3)
- neuroplasticity (3)
- osteoblast (3)
- osteoclast (3)
- sonography (3)
- Advanced Dynamic Flow (2)
- Aging (2)
Institute
- Fakultät für Gesundheitswissenschaften (74)
Older adults with amnestic mild cognitive impairment (aMCI) who, in addition to their memory deficits, also suffer from frontal-executive dysfunction have a higher risk of developing dementia later in life than older adults with aMCI without executive deficits and older adults with non-amnestic MCI (naMCI). Handgrip strength (HGS) is also correlated with the risk of cognitive decline in the elderly. Hence, the current study aimed to investigate the associations between HGS and executive functioning in individuals with aMCI, naMCI, and healthy controls. Older, right-handed adults with aMCI, naMCI, and healthy controls (HC) underwent handgrip strength measurement with a handheld dynamometer. Executive functions were assessed with the Trail Making Test (TMT A&B). Normalized handgrip strength (nHGS, normalized to Body Mass Index (BMI)) was calculated, and its associations with executive functions (operationalized through z-scores of the TMT B/A ratio) were investigated through partial correlation analyses (i.e., accounting for age, sex, and severity of depressive symptoms). A positive, low-to-moderate correlation between right nHGS (rp(22) = 0.364; p = 0.063) and left nHGS (rp(22) = 0.420; p = 0.037) and executive functioning was observed in older adults with aMCI but not in those with naMCI or HC. Our results suggest that higher levels of nHGS are linked to better executive functioning in aMCI but not in naMCI or HC. This relationship is perhaps driven by alterations in the integrity of the hippocampal-prefrontal network occurring in older adults with aMCI. Further research is needed to provide empirical evidence for this assumption.
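Partial correlations of the kind used in this study (controlling for age, sex, and depressive symptoms) can be computed by residualizing both variables on the covariates and correlating the residuals. A minimal sketch with simulated, entirely hypothetical data; only the procedure mirrors the abstract:

```python
import numpy as np

def partial_corr(x, y, covars):
    """Correlation of x and y after regressing out the covariates.

    Equivalent to a partial correlation controlling for the columns
    of `covars` (here standing in for age, sex, depressive symptoms).
    """
    Z = np.column_stack([np.ones(len(x)), covars])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(0)
n = 25
covars = rng.normal(size=(n, 3))           # hypothetical standardized covariates
grip = rng.normal(size=n)                  # hypothetical normalized handgrip strength
exec_fn = 0.4 * grip + rng.normal(size=n)  # hypothetical TMT B/A z-score
print(round(partial_corr(grip, exec_fn, covars), 3))
```

With real data, `grip` and `exec_fn` would be the measured nHGS and TMT B/A z-scores per participant.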
Although chronic aluminum neurotoxicity is well documented, there are no well-established experimental protocols of Al exposure. In the current study, the toxic effects of sub-chronic Al exposure were evaluated in outbred male rats (gastrointestinal administration). Forty animals were used: 10 were administered an AlCl3 water solution (2 mg/kg Al per day) for 1 month, 10 received the same concentration of AlCl3 for 3 months, and 20 (10 per observation period) received saline as controls. After 30 and 90 days, the animals underwent behavioral tests: open field, passive avoidance, extrapolation escape task, and grip strength. At the end of the study, the blood, liver, kidney, and brain were excised for analytical and morphological studies. The Al content was measured by inductively coupled plasma mass spectrometry. Essential trace elements (Co, Cr, Cu, Fe, Mg, Mn, Mo, Se, and Zn) were measured in whole blood samples. Although no morphological changes were observed in the brain, liver, or kidney for either exposure term, dose-dependent Al accumulation and behavioral differences (increased locomotor activity after 30 days) between treatment and control groups were found. Moreover, for the 30-day exposure, a strong positive correlation between Al content in the brain and blood of individual animals was established, which surprisingly disappeared by the third month. This may indicate neural barrier adaptation to Al exposure or saturation of Al transport into the brain. Notably, we did not observe a clear neurodegenerative process after this rather prolonged sub-chronic Al exposure, so longer exposure periods are probably required.
Given the aging of society and the high costs of support and care in private households, the question arises as to what role assistive robots can play. This contribution addresses the question of the extent to which care robots are currently accepted by the adult population in Germany, and to what extent gender, age, and experience (professional, personal) influence the degree of this acceptance. The analyses are based on three representative surveys with a total of more than 7000 respondents. Two surveys were conducted in the second half of 2017 on behalf of the German Academy of Science and Engineering (acatech) and the life insurer ERGO; the third was conducted on behalf of the German Advisory Council for Consumer Affairs (Sachverständigenrat für Verbraucherfragen, SVRV) in spring 2018. An in-depth, cumulative analysis of these surveys and datasets, which the authors helped design, with regard to assistive robotics has not yet been published. Despite the different deployment scenarios for care robots covered by the surveys, the results of all three are remarkably consistent: in Germany, there is a significant minority of people who would already accept functioning care provided by robots, provided that human care would not be replaced but merely supported. A good third of respondents, varying by age and gender, fundamentally rejects assistance by robots.
Background
The anticancer compound 3-bromopyruvate (3-BrPA) suppresses cancer cell growth by targeting glycolytic and mitochondrial metabolism. The malignant peripheral nerve sheath tumor (MPNST), a very aggressive, therapy-resistant, Neurofibromatosis type 1-associated neoplasia, shows high metabolic activity, and affected patients may therefore benefit from 3-BrPA treatment. To elucidate the specific mode of action, we used a controlled cell model overexpressing proteasome activator (PA) 28, which subsequently leads to p53 inactivation and oncogenic transformation, thereby reproducing an important pathway in MPNST and overall tumor pathogenesis.
Methods
Viability of the MPNST cell lines S462, NSF1, and T265 in response to increasing doses (0–120 µM) of 3-BrPA was analyzed by CellTiter-Blue® assay. Additionally, we investigated viability, reactive oxygen species (ROS) production (dihydroethidium assay), nicotinamide adenine dinucleotide dehydrogenase activity (NADH-TR assay), and lactate production (lactate assay) in mouse B8 fibroblasts overexpressing PA28 in response to 3-BrPA application. For all experiments, normal and nutrient-deficient conditions were tested. The MPNST cell lines were furthermore characterized immunohistochemically for Ki67, p53, bcl2, bcl6, cyclin D1, and p21.
Results
MPNST cells responded to 3-BrPA application in a significant, dose-dependent manner, with S462 cells being the most responsive. Human control cells showed reduced sensitivity. In the PA28-overexpressing cancer cell model, 3-BrPA application mildly impaired mitochondrial NADH dehydrogenase activity and significantly failed to inhibit lactate production. PA28 overexpression was associated with functional glycolysis as well as partial resistance to stress provoked by nutrient deprivation. 3-BrPA treatment was not associated with an increase in ROS. Starvation sensitized MPNST cells to treatment.
Conclusions
Aggressive MPNST cells are sensitive to 3-BrPA therapy in vitro, with and without starvation. In a PA28-overexpression cancer cell model leading to p53 inactivation, thereby reflecting a key molecular feature of human NF1-associated MPNST, the known functions of 3-BrPA in blocking mitochondrial activity and glycolysis were reproduced; however, oncogenic cells displayed a partial resistance. In conclusion, 3-BrPA was sufficient to reduce the viability of NF1-associated MPNST cells, potentially due to inhibition of glycolysis, which should prompt further studies and promises a potential benefit for NF1 patients.
Association of primary allostatic load mediators and metabolic syndrome (MetS): A systematic review
(2022)
Allostatic load (AL) exposure may cause detrimental effects on the neuroendocrine system, leading to metabolic syndrome (MetS). The primary mediators of AL include serum dehydroepiandrosterone sulfate (DHEAS; a functional HPA axis antagonist) as well as cortisol, urinary norepinephrine (NE), and epinephrine (EPI) excretion levels (assessed in 12-h urine as the gold standard for evaluating HPA axis and sympathetic nervous system activity). However, the evidence of an association between the primary mediators of AL and MetS is limited. This systematic review aimed to critically examine the association between the primary mediators of AL and MetS. PubMed and Web of Science were searched for articles published in English from January 2010 to December 2021. The search strategy focused on cross-sectional and case–control studies comprising adult participants with MetS, obesity, or overweight and without chronic diseases. The STROBE checklist was used for study quality control. Of 770 studies, twenty-one with a total sample size of n = 10,666 met the eligibility criteria. Eighteen studies were cross-sectional and three were case–control studies. The included studies had a completeness of reporting score of COR % = 87.0 ± 6.4%. Notably, cortisol as a primary mediator of AL showed an association with MetS in 50% (urinary cortisol), 40% (serum cortisol), 60% (salivary cortisol), and 100% (hair cortisol) of the studies. For DHEAS, 60% of the studies showed an association with MetS. In contrast, urinary EPI and urinary NE showed no association with MetS in any of the studies. In summary, there is a tendency for higher serum, salivary, urinary, and hair cortisol and lower levels of DHEAS to be associated with MetS. Future studies focusing on longitudinal data are warranted to clarify and understand the association between the primary mediators of AL and MetS.
Background:
From birth to young adulthood, health and development of young people are strongly linked to their living situation, including their family's socioeconomic position (SEP) and living environment. The impact of regional characteristics on development in early childhood beyond family SEP has been rarely investigated. This study aimed to identify regional predictors of global developmental delay at school entry taking family SEP into consideration.
Method:
We used representative, population-based data from the mandatory school entry examinations of the German federal state of Brandenburg in 2018/2019, covering n = 22,801 preschool children. Applying binary multilevel models, we hierarchically analyzed the effect of regional deprivation, defined by the German Index of Socioeconomic Deprivation (GISD), and rurality, operationalized as the inverted population density of the children's school district, on global developmental delay (GDD), while adjusting for family SEP (low, medium, and high).
Results:
Family SEP was significantly and strongly linked to GDD. Children with high family SEP showed lower odds of GDD than children with medium SEP (female: OR = 4.26, male: OR = 3.46) and low SEP (female: OR = 16.58, male: OR = 12.79). Furthermore, we found a smaller but additional and independent effect of regional socioeconomic deprivation on GDD, with higher odds for children from more deprived school districts (female: OR = 1.35, male: OR = 1.20). However, rurality did not show a significant link to GDD in preschool children beyond family SEP and regional deprivation.
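The odds ratios above are exponentiated coefficients from the binary multilevel model. As an interpretation aid, an OR can be converted into the probability it implies for a given baseline risk; in the sketch below, the 5% baseline prevalence is purely hypothetical, and only the OR of 16.58 (girls, low family SEP) is taken from the Results:

```python
import math

def apply_odds_ratio(p_baseline, oratio):
    """Probability implied by scaling the baseline odds with an odds ratio."""
    odds = p_baseline / (1 - p_baseline) * oratio
    return odds / (1 + odds)

# Hypothetical 5% GDD prevalence in the high-SEP reference group;
# OR = 16.58 is the reported estimate for girls with low family SEP.
p_high_sep = 0.05
print(round(apply_odds_ratio(p_high_sep, 16.58), 3))  # → 0.466
```

This illustrates why a large OR does not map linearly onto risk: the implied probability depends on the baseline prevalence.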
Conclusion:
Family SEP and regional deprivation are risk factors for child development and of particular interest for promoting children's health in early childhood and over the life course.
This study sought to analyze the relationship between in-season training workload and changes in aerobic power (VO2max), maximum and resting heart rate (HRmax and HRrest), and linear sprint medium (LSM) and short test (LSS) performance in soccer players younger than 16 years (under-16 soccer players). We additionally aimed to explain changes in fitness levels during the in-season through regression models, considering accumulated load, baseline levels, and peak height velocity (PHV) as predictors. Twenty-three male sub-elite soccer players aged 15.5 ± 0.2 years (PHV: 13.6 ± 0.4 years; body height: 172.7 ± 4.2 cm; body mass: 61.3 ± 5.6 kg; body fat: 13.7% ± 3.9%; VO2max: 48.4 ± 2.6 mL·kg⁻¹·min⁻¹) were tested three times across the season (i.e., early-season (EaS), mid-season (MiS), and end-season (EnS)) for VO2max, HRmax, LSM, and LSS. Aerobic and speed variables gradually improved over the season and had a strong association with PHV. Moreover, HRmax improved from EaS to EnS; this was more evident in the intermediate period (from EaS to MiS) and had a strong association with VO2max. Regression analysis showed significant predictions for VO2max [F(2, 20) = 8.18, p ≤ 0.001] with an R² of 0.45. In conclusion, meaningful variation in youth players' fitness levels can be observed across the season, and such changes can be partially explained by the load imposed.
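A regression of the kind reported here — an F(2, 20) test with an R² — can be sketched with ordinary least squares. The data below are simulated placeholders; only the sample size (23 players) and the two-predictor design, and hence the degrees of freedom, mirror the abstract:

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 23, 2                                   # 23 players, 2 predictors
# Hypothetical standardized predictors (e.g., accumulated load, baseline level)
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
# Hypothetical VO2max values generated from an assumed true model plus noise
y = X @ np.array([48.0, 1.5, 0.8]) + rng.normal(scale=1.0, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficients
resid = y - X @ beta
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot                       # coefficient of determination
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))   # F statistic with (2, 20) df
print(round(r2, 2), round(f_stat, 2))
```

The F statistic tests whether the two predictors jointly explain a significant share of the variance, which is how the reported F(2, 20) = 8.18 with R² = 0.45 should be read.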
Background: The enzyme-linked immunosorbent assay (ELISA) is an indispensable tool in clinical diagnostics, used to identify or differentiate diseases such as autoimmune illnesses, but also to monitor their progression or assess the efficacy of drugs. One use case of ELISA is to differentiate between different states (e.g., healthy vs. diseased). Another goal is to quantitatively assess the biomarker in question, such as autoantibodies. Thus, ELISA technology is also used for the discovery and verification of new autoantibodies. Of key interest, however, is the development of immunoassays for the sensitive and specific detection of such biomarkers at early disease stages. Users therefore have to deal with many parameters, such as buffer systems or antigen-autoantibody interactions, to successfully establish an ELISA. Often, fine-tuning, such as testing several blocking substances, is performed to yield high signal-to-noise ratios.
Methods: We developed an ELISA to detect IgA and IgG autoantibodies against chitinase-3-like protein 1 (CHI3L1), a newly identified autoantigen in inflammatory bowel disease (IBD), in the serum of control and disease groups (n = 23 each). Microwell plates with different surface modifications (PolySorp and MaxiSorp coating) were tested to detect reproducibility problems.
Results: We found a significant impact of the surface properties of the microwell plates. IgA antibody reactivity was significantly lower, being in the range of background noise, when measured on MaxiSorp-coated plates (p < 0.0001). IgG antibody reactivity did not differ between the various plates, but the plate surface had a significant influence on the test result (p = 0.0005).
Conclusion: With this report, we want to draw readers' attention to the properties of solid phases and their effects on the detection of autoantibodies by ELISA. We want to sensitize the reader to the fact that choosing the wrong plate can lead to a false-negative test result, which in turn has serious consequences for the discovery of autoantibodies.
Electroencephalographic (EEG) research indicates changes in the low-frequency bands of frontoparietal brain areas in adults executing different balance tasks with increasing postural demands. However, this issue is unresolved for adolescents performing the same balance task with increasing difficulty. Therefore, we examined the effects of progressively increasing balance task difficulty on balance performance and brain activity in adolescents. Thirteen healthy adolescents aged 16–17 years performed tests in bipedal upright stance on a balance board with six progressively increasing levels of task difficulty. Postural sway and cortical activity were recorded simultaneously using a pressure-sensitive measuring system and EEG. The power spectrum was analyzed for theta (4–7 Hz) and alpha-2 (10–12 Hz) frequency bands in pre-defined frontal, central, and parietal clusters of electrocortical sources. Repeated measures analysis of variance (rmANOVA) showed a significant main effect of task difficulty for postural sway (p < 0.001; d = 6.36). Concomitantly, the power spectrum changed in frontal, bilateral central, and bilateral parietal clusters. RmANOVAs revealed significant main effects of task difficulty for theta band power in the frontal (p < 0.001, d = 1.80) and both central clusters (left: p < 0.001, d = 1.49; right: p < 0.001, d = 1.42), as well as for alpha-2 band power in both parietal clusters (left: p < 0.001, d = 1.39; right: p < 0.001, d = 1.05) and in the central right cluster (p = 0.005, d = 0.92). Increases in theta band power (frontal, central) and decreases in alpha-2 power (central, parietal) with increasing balance task difficulty may reflect increased attentional processes and/or error monitoring as well as increased sensory information processing due to increasing postural demands. In general, our findings are mostly in agreement with studies conducted in adults. Similar to adult studies, our data from adolescents indicate the involvement of frontoparietal brain areas in the regulation of postural control. In addition, we found that the activity of selected brain areas (e.g., bilateral central) changed with increasing postural demands.
Basic psychological needs theory postulates that a social environment satisfying individuals' three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. Conversely, the frustration of these needs is associated with ill-being and depressive symptoms, which has foremost been investigated in non-clinical samples; there is a paucity of research on need frustration in clinical samples. Survey data from adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) were compared with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs than non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than for satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of self-determination theory and have preliminary implications for tailoring therapy for depression.
Genetic engineering has given humans the ability to transform organisms by direct manipulation of genomes within a broad range of applications, including agriculture (e.g., GM crops) and the pharmaceutical industry (e.g., insulin production). Developments within the last 10 years have produced new tools for genome editing (e.g., CRISPR/Cas9) that can achieve much greater precision than previous forms of genetic engineering. Moreover, these tools could enable interventions on humans for both clinical and non-clinical purposes, resulting in a broad scope of applicability. However, their promising abilities and potential uses (including their applicability in humans for either somatic or heritable genome editing interventions) greatly increase their potential societal impacts and, as such, have brought an urgency to ethical and regulatory discussions about the application of such technology in our society. In this article, we explore different arguments (pragmatic, sociopolitical, and categorical) that have been made in support of or in opposition to the new technologies of genome editing and their impact on the debate about the permissibility or otherwise of human heritable genome editing interventions in the future. For this purpose, reference is made to discussions on genetic engineering that have taken place in the field of bioethics since the 1980s. Our analysis shows that the dominance of categorical arguments has been reversed in favour of pragmatic arguments such as safety concerns. However, when it comes to involving the public in ethical discourse, we consider it crucial to widen the debate beyond such pragmatic considerations. In this article, we explore some of the key categorical as well as sociopolitical considerations raised by the potential uses of heritable genome editing interventions, as these considerations underlie many of the societal concerns and values crucial for public engagement. We also highlight that pragmatic considerations, despite their increasing importance in the work of recent authoritative sources, are unlikely to be the result of progress on outstanding categorical issues, but rather reflect the limited progress on these aspects and/or pressures in regulating the use of the technology.
Cardiac rehabilitation
(2021)
The investigation of protein structures, functions and interactions often requires modifications to adapt protein properties to the specific application. Among many possible methods to equip proteins with new chemical groups, the utilization of orthogonal aminoacyl-tRNA synthetase/tRNA pairs enables the site-specific incorporation of non-canonical amino acids at defined positions in the protein. The open nature of cell-free protein synthesis reactions provides an optimal environment, as the orthogonal components do not need to be transported across the cell membrane and the impact on cell viability is negligible. In the present work, it was shown that the expression of orthogonal aminoacyl-tRNA synthetases in CHO cells prior to cell disruption enhanced the modification of the pharmaceutically relevant adenosine A2a receptor. For this purpose, in complement to transient transfection of CHO cells, an approach based on CRISPR/Cas9 technology was selected to generate a translationally active cell lysate harboring endogenous orthogonal aminoacyl-tRNA synthetase.
Die interventionelle Behandlung des Vorhofflimmerns verursacht häufiger als in der Vergangenheit wahrgenommen eine Beeinträchtigung benachbarter Gewebe und Organe. Im Vordergrund der Betrachtungen dieser Arbeit stehen Schäden des Oesophagus, die aufgrund der schlechten Vorhersagbarkeit, des zeitlich verzögerten Auftretens und der fatalen Prognose bei Ausbildung einer atrio-oesophagealen Fistel besondere Relevanz haben.
Das Vorhofflimmern selbst ist nicht mit einer unmittelbaren vitalen Bedrohung verbunden, aber durch seine Komplikationen (z.B. Herzinsuffizienz, Schlaganfall) dennoch prognostisch relevant. Durch Antiarrhythmika gelingt keine Verbesserung der Rhythmuskontrolle (Arrhythmie-Freiheit), eine katheterinterventionelle Behandlung ist der medikamentösen Therapie überlegen. Durch eine frühzeitige und erfolgreiche Behandlung des Vorhofflimmerns konnte eine Verbesserung klinischer Endpunkte und der Prognose erreicht werden. Das Risiko einer invasiven Behandlung (insbesondere hinsichtlich des Auftretens prognoserelevanter Komplikationen) muss jedoch bei der Indikationsstellung und der Prozedur-Durchführung bedacht und gegenüber den günstigen Effekten der Behandlung abgewogen werden.
Untersuchungen zur Vermeidung der sehr seltenen atrio-oesophagealen Fisteln bedienen sich Surrogat-Parametern, hier bisher ausschließlich den ablationsinduzierten Schleimhaut-Läsionen des Oesophagus. Die Untersuchungen dieser Arbeit zeigen ein komplexeres Bild der (peri)-oesophagealen Schädigungen nach Vorhofflimmern-Ablation mit thermischen Energiequellen.
(1) A new definition of esophageal damage: Esophageal and periesophageal injuries occur very frequently (under the extended definition used here, in two thirds of patients) and are independent of the ablation energy used. The manifestations of esophageal damage differ between the energy protocols, although the mechanism behind this has not been elucidated. This work describes the different forms of thermal esophageal damage, their determinants, and their pathophysiological relevance.
(2) The detection of (sometimes subtle) esophageal damage depends decisively on the intensity of follow-up. Restricting follow-up to subjective reports (e.g., pain on swallowing, heartburn) is misleading: the majority of changes remain asymptomatic, and by the time a fully developed atrio-esophageal fistula becomes symptomatic (usually after several weeks), the prognosis is already very poor. In most electrophysiology centers, endoscopy of the esophagus is performed either not at all or only for persistent symptoms, and it can detect only mucosal lesions. The extent of esophageal and periesophageal damage is thereby substantially underestimated. Changes in the periesophageal space, whose clinical relevance is (still) unclear, are not captured, and wall edema and damage to the tissue between the left atrium and the esophagus (including nerves and vessels) are thus ignored.
The studies also contribute to a reappraisal of established measures and risk factors for esophageal damage.
(3) Temperature monitoring in the esophagus based on maximum deviations is informative only for extreme values and is therefore not helpful for avoiding esophageal lesions. A more complex analysis of the raw temperature data (so far possible only offline) yields, via the AUC for RF ablations, a predictive parameter for esophageal damage that allows the subsequent endoscopic work-up to be structured. No comparable value could be found in the analyses for cryoablations.
(4) Chronic inflammation of the lower third of the esophagus not only impedes the healing of a thermal esophageal lesion but can also promote the occurrence of such lesions during ablation. The large number of pre-existing esophageal changes, which indicate increased vulnerability, and their role in the development of thermal lesions may serve as a starting point for preventive measures.
In addition, manifestations of esophageal damage that may be relevant on pathophysiological grounds are captured and described through comprehensive diagnostics.
(5) The systematic extension of imaging diagnostics to the periesophageal space by endosonography showed that mucosal lesions alone account for only a small portion of esophageal damage. Mucosal lesions resulting from instrumental injury are not associated with the risk of developing an atrio-esophageal fistula, which underscores the pathophysiological relevance of the periesophageal changes.
(6) Functional diagnostics of thermal damage to the periesophageal vagal plexus identifies patients with esophageal damage not captured by imaging, whose effects (food retention and gastro-esophageal reflux) may nevertheless contribute to lesion progression.
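Point (3) derives a predictive parameter from the AUC of the raw luminal temperature data. The thesis does not specify the computation; a minimal sketch, assuming "AUC" denotes the area of the temperature curve above a baseline during an RF application (baseline, sampling rate, and readings below are invented illustration values, not study data), might look like this:

```python
# Hypothetical AUC-based temperature parameter: area (degC * s) between the
# luminal temperature curve and a baseline. Baseline of 37 degC and the
# synthetic probe readings are assumptions for illustration only.
import numpy as np

def temperature_auc(t_s: np.ndarray, temp_c: np.ndarray, baseline_c: float = 37.0) -> float:
    """Trapezoidal area of the above-baseline part of the temperature curve."""
    excess = np.clip(temp_c - baseline_c, 0.0, None)  # keep only above-baseline excursion
    # version-safe trapezoidal rule (avoids np.trapz/np.trapezoid naming differences)
    return float(np.sum((excess[:-1] + excess[1:]) / 2.0 * np.diff(t_s)))

# Synthetic probe readings: 1 Hz sampling over a 30 s RF application
t = np.arange(0.0, 31.0, 1.0)
temp = 37.0 + 4.0 * np.exp(-((t - 15.0) ** 2) / 50.0)  # transient peak near 41 degC

auc = temperature_auc(t, temp)
print(f"AUC above baseline: {auc:.1f} degC*s")
```

Such a cumulative measure would respond to both the height and the duration of a temperature excursion, unlike the maximum-deviation monitoring criticized in point (3).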
Background: The relationship between exercise-induced intratendinous blood flow (IBF) and tendon pathology or training exposure is unclear.
Objective: This study investigates the acute effect of running exercise on sonographic detectable IBF in healthy and tendinopathic Achilles tendons (ATs) of runners and recreational participants.
Methods: 48 participants (43 ± 13 years, 176 ± 9 cm, 75 ± 11 kg) performed a standardized submaximal 30-min constant-load treadmill run with Doppler ultrasound "Advanced Dynamic Flow" examinations before (Upre) and 5, 30, 60, and 120 min (U5–U120) afterward. Included were runners (>30 km/week) and recreational participants (<10 km/week) with healthy (Hrun, n = 10; Hrec, n = 15) or tendinopathic (Trun, n = 13; Trec, n = 10) ATs. IBF was assessed by counting the number [n] of intratendinous vessels. IBF data are presented descriptively (%, median [minimum to maximum range] for baseline IBF and the post-exercise IBF difference). Statistical differences between groups and time points in IBF and IBF changes were analyzed with Friedman and Kruskal-Wallis ANOVA (α = 0.05).
Results: At baseline, IBF was detected in 40% (3 [1–6]) of Hrun, in 53% (4 [1–5]) of Hrec, in 85% (3 [1–25]) of Trun, and in 70% (10 [2–30]) of Trec. At U5, IBF responded to exercise in 30% (3 [−1–9]) of Hrun, in 53% (4 [−2–6]) of Hrec, in 70% (4 [−10–10]) of Trun, and in 80% (5 [1–10]) of Trec. While IBF in 80% of healthy responding ATs had returned to baseline at U30, IBF remained elevated until U120 in 60% of tendinopathic ATs. Within groups, IBF changes from Upre to U120 were significant for Hrec (p < 0.01), Trun (p = 0.05), and Trec (p < 0.01). Between groups, IBF changes between consecutive examinations did not differ significantly (p > 0.05), but the IBF level was significantly higher at all measurement time points in tendinopathic versus healthy ATs (p < 0.05).
Conclusion: Irrespective of training status and tendon pathology, running leads to an immediate increase of IBF in responding tendons. This increase is short-lived in healthy and prolonged in tendinopathic ATs. Training exposure does not alter IBF occurrence, but the IBF level is elevated in tendon pathology. While an immediate exercise-induced IBF increase is a physiological response, prolonged IBF is considered a pathological finding associated with Achilles tendinopathy.
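The nonparametric comparisons named in the Methods (Friedman ANOVA within groups across time points, Kruskal-Wallis ANOVA between groups) can be sketched with `scipy.stats`. The vessel counts below are invented illustration data, not the study's data, and the group sizes are only loosely modeled on the abstract:

```python
# Sketch of the two nonparametric tests named in the Methods:
# - Friedman ANOVA: repeated IBF measurements of one group across time points
# - Kruskal-Wallis ANOVA: IBF of two independent groups at one time point
# All vessel counts are synthetic illustration data.
import numpy as np
from scipy.stats import friedmanchisquare, kruskal

rng = np.random.default_rng(0)

# Within-group: IBF vessel counts of 10 tendons at Upre, U5, U30, U60, U120
u_pre, u5, u30, u60, u120 = rng.poisson(lam=[2, 5, 4, 3, 2], size=(10, 5)).T
stat_f, p_f = friedmanchisquare(u_pre, u5, u30, u60, u120)

# Between-group: IBF at a single time point, healthy vs. tendinopathic tendons
healthy = rng.poisson(2, size=15)
tendinopathic = rng.poisson(8, size=13)
stat_k, p_k = kruskal(healthy, tendinopathic)

alpha = 0.05
print(f"Friedman: chi2 = {stat_f:.2f}, p = {p_f:.3f}, significant: {p_f < alpha}")
print(f"Kruskal-Wallis: H = {stat_k:.2f}, p = {p_k:.3f}, significant: {p_k < alpha}")
```

Rank-based tests like these are a common choice for small samples of count data (vessel numbers), where normality cannot be assumed.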