The main intention of this contribution is to discuss different nonlinear approaches to heart rate and blood pressure variability analysis for a better understanding of cardiovascular regulation. We investigate measures of complexity based on symbolic dynamics, renormalised entropy, and finite-time growth rates. The dual sequence method to estimate baroreflex sensitivity and the maximal correlation method to estimate the nonlinear coupling between time series are employed for analysing bivariate data. The latter appears to be a suitable method for estimating both the strength and the direction of the nonlinear coupling. Heart rate and blood pressure data from clinical pilot studies and from very large clinical studies are analysed. We demonstrate that parameters from nonlinear dynamics are useful for risk stratification after myocardial infarction, for the prediction of life-threatening cardiac events even in short time series, and for modelling the relationship between heart rate and blood pressure regulation. These findings could be of importance for clinical diagnostics, for algorithms for risk stratification, and for therapeutic and preventive tools in next-generation implantable cardioverter defibrillators.
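As an illustration of the symbolic-dynamics idea mentioned in this abstract, the sketch below encodes a beat-to-beat (RR) interval series into a small alphabet and computes the Shannon entropy of short symbol words. The 4-symbol quantisation around the series mean, the tolerance parameter `alpha`, and the word length are illustrative assumptions, not the exact scheme used in the study.

```python
import numpy as np
from collections import Counter

def symbolic_word_entropy(rr, word_len=3, alpha=0.05):
    """Shannon entropy of overlapping words from a 4-symbol encoding of RR intervals."""
    rr = np.asarray(rr, dtype=float)
    mu = rr.mean()
    # 4 symbols: far below / slightly below / slightly above / far above the mean
    bins = [(1 - alpha) * mu, mu, (1 + alpha) * mu]
    symbols = np.digitize(rr, bins)
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    p = np.array(list(Counter(words).values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)))

rr = np.random.normal(800, 50, 1000)        # synthetic RR intervals in ms
print(symbolic_word_entropy(rr))            # lower entropy = less complex dynamics
```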
Online Laddering (2007)
Objectives. Ventricular tachycardias (VT) provoking sudden cardiac death (SCD) are a major cause of mortality in developed countries. The most efficient therapy for SCD prevention is the implantable cardioverter defibrillator (ICD). In this study, heart rate variability (HRV) measures were analyzed for short-term forecasting of VT in order to improve VT sensing and to enable warning patients of forthcoming shocks. Methods. The last 1000 normal beat-to-beat intervals before 50 VT episodes stored by the ICD were analyzed and compared to individually acquired control time series (CON). HRV analysis was performed with standard parameters of the time and frequency domain as suggested by the HRV Task Force and, furthermore, with a newly developed and optimized nonlinear parameter that assesses the compression entropy of heart rate (Hc). Results. Except for meanNN (p = 0.02), we found no significant differences in standard HRV parameters. In contrast, Hc revealed highly significant (p = 0.007) alterations in VT compared with CON, suggesting a decreased complexity before the onset of VT. Conclusion. Compression entropy might be a suitable parameter for short-term forecasting of life-threatening tachycardias in ICD patients.
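For readers unfamiliar with compression entropy, the following hedged sketch approximates an Hc-like measure as the ratio of compressed to uncompressed size of a quantised RR-interval series. The authors used an optimised compression algorithm; zlib and the 8 ms quantisation step here are stand-in assumptions.

```python
import zlib
import numpy as np

def compression_entropy(rr, quant_ms=8):
    """Hc approximated as compressed size / original size of quantised RR data."""
    q = np.round(np.asarray(rr, dtype=float) / quant_ms).astype(np.uint8)  # coarse quantisation
    raw = q.tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

rr = np.random.normal(800, 50, 1000)        # synthetic RR intervals in ms
print(compression_entropy(rr))              # lower values = more compressible = less complex
```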
In the last decades, an increasing number of studies have analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect, and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by the method of moments (non-robust and robust estimators) and by residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances, and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the numbers recommended by studies dealing with Gaussian data by up to 100%. Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪200, currently available data are prone to large uncertainties.
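As background on the method-of-moments estimators compared in this study, the sketch below computes both the classical (Matheron) and the robust (Cressie-Hawkins) empirical variogram from irregularly spaced data. The pairwise distance binning and the function name `empirical_variograms` are illustrative assumptions.

```python
import numpy as np

def empirical_variograms(coords, values, bin_edges):
    """Bin centres plus Matheron and Cressie-Hawkins variogram estimates."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    diff = values[:, None] - values[None, :]
    iu = np.triu_indices(len(values), k=1)              # count each pair of points once
    dist, diff = dist[iu], diff[iu]
    centres, matheron, robust = [], [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        sel = (dist >= lo) & (dist < hi)
        n = sel.sum()
        if n == 0:
            continue
        centres.append(0.5 * (lo + hi))
        # classical estimator: half the mean squared difference in the bin
        matheron.append(np.mean(diff[sel] ** 2) / 2.0)
        # robust estimator: fourth power of the mean root absolute difference,
        # with the Cressie-Hawkins bias-correction factor
        ch = np.mean(np.sqrt(np.abs(diff[sel]))) ** 4
        robust.append(ch / (2.0 * (0.457 + 0.494 / n)))
    return np.array(centres), np.array(matheron), np.array(robust)
```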
estimating mean throughfall (2016)
The selection of an appropriate spatial extent of a sampling plot is one among several important decisions involved in planning a throughfall sampling scheme. In fact, the choice of the extent may determine whether or not a study can adequately characterize the hydrological fluxes of the studied ecosystem. Previous attempts to optimize throughfall sampling schemes focused on the selection of an appropriate sample size, support, and sampling design, while comparatively little attention has been given to the role of the extent. In this contribution, we investigated the influence of the extent on the representativeness of mean throughfall estimates for three forest ecosystems of varying stand structure. Our study is based on virtual sampling of simulated throughfall fields. We derived these fields from throughfall data sampled in a simply structured forest (young tropical forest) and two heterogeneous forests (old tropical forest, unmanaged mixed European beech forest). We then sampled the simulated throughfall fields with three common extents and various sample sizes for a range of events and for accumulated data. Our findings suggest that the size of the study area should be carefully adapted to the complexity of the system under study and to the required temporal resolution of the throughfall data (i.e., event-based versus accumulated). Generally, event-based sampling in forests with complex structure (conditions that favor comparatively long autocorrelations in throughfall) requires the largest extents. For event-based sampling, the choice of an appropriate extent can be as important as the use of an adequate sample size.
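The virtual-sampling approach described above can be illustrated with a small Monte Carlo experiment: repeatedly draw sampling locations of a given sample size from a simulated field and record how far the sample mean deviates from the true field mean. The lognormal stand-in field below is an assumption; the study derived its fields from measured throughfall data.

```python
import numpy as np

rng = np.random.default_rng(42)

def relative_error_of_mean(field, n_samples, n_repeats=1000):
    """Monte Carlo distribution of |sample mean - field mean| / field mean."""
    flat = field.ravel()
    true_mean = flat.mean()
    errs = [abs(rng.choice(flat, size=n_samples, replace=False).mean() - true_mean)
            for _ in range(n_repeats)]
    return np.array(errs) / true_mean

# skewed stand-in field on a 100 x 100 grid (one cell per potential collector position)
field = rng.lognormal(mean=0.0, sigma=0.8, size=(100, 100))
for n in (50, 100, 200):
    print(n, round(float(np.quantile(relative_error_of_mean(field, n), 0.95)), 4))
```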
Standard time and frequency parameters of heart rate variability (HRV) describe only linear and periodic behaviour, whereas more complex relationships cannot be recognised. A method that may be capable of assessing more complex properties is the non-linear measure of 'renormalised entropy'. A new variant of the method, RE(AR), has been developed, based on a non-linear renormalisation of autoregressive spectral distributions. To test the hypothesis that renormalised entropy may improve the results of high-risk stratification after myocardial infarction, it was applied to a clinical pilot study (41 subjects) and to prospective data of the St George's Hospital post-infarction database (572 patients). The study shows that the new RE(AR) method is more reproducible and more stable in time than a previously introduced method (p<0.001). Moreover, the results of the study confirm the hypothesis that, on average, survivors have negative values of RE(AR) (-0.11±0.18), whereas non-survivors have positive values (0.03±0.22, p<0.01). Further, the study shows that the combination of the HRV triangular index and RE(AR) leads to a better prediction of sudden arrhythmic death than standard measurements of HRV. In summary, the new RE(AR) method is an independent measure in HRV analysis that may be suitable for risk stratification in patients after myocardial infarction.
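The full non-linear renormalisation behind RE(AR) is not reproduced here; as a simplified, hedged illustration of its ingredients, the sketch below fits an autoregressive model via the Yule-Walker equations and computes the Shannon entropy of the normalised AR spectral density, the distribution that RE(AR) renormalises against a reference. Model order and frequency grid are illustrative assumptions.

```python
import numpy as np

def ar_spectral_entropy(x, order=8, n_freq=256):
    """Shannon entropy of the normalised spectral density of a Yule-Walker AR fit."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    # biased autocovariance estimates r_0 ... r_order
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    # Toeplitz autocovariance matrix; solve the Yule-Walker equations for AR coefficients
    toeplitz = r[np.abs(np.subtract.outer(np.arange(order), np.arange(order)))]
    a = np.linalg.solve(toeplitz, r[1:order + 1])
    # AR spectral density on normalised frequencies [0, 0.5], up to a constant factor
    freqs = np.linspace(0.0, 0.5, n_freq)
    phases = np.exp(-2j * np.pi * np.outer(np.arange(1, order + 1), freqs))
    psd = 1.0 / np.abs(1.0 - a @ phases) ** 2
    p = psd / psd.sum()                                 # renormalise to a distribution
    return float(-np.sum(p * np.log(p)))
```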
Ventricular tachycardia or fibrillation (VT-VF), as fatal cardiac arrhythmias, are the main factors triggering sudden cardiac death. The objective of this study is to find early signs of sustained VT-VF in patients with an implanted cardioverter-defibrillator (ICD). These devices are able to safeguard patients by returning their hearts to a normal rhythm via strong defibrillatory shocks; additionally, they store the 1000 beat-to-beat intervals immediately preceding the onset of a life-threatening arrhythmia. We study these 1000 beat-to-beat intervals of 17 chronic heart failure ICD patients before the onset of a life-threatening arrhythmia and at a control time, i.e., without a VT-VF event. To characterize these rather short data sets, we calculate heart rate variability parameters from the time and frequency domain, from symbolic dynamics, as well as finite-time growth rates. We find that neither the time nor the frequency domain parameters show significant differences between the VT-VF and the control time series. However, two parameters from symbolic dynamics as well as the finite-time growth rates discriminate significantly between both groups. These findings could be of importance for algorithms in next-generation ICDs to improve the diagnostics and therapy of VT-VF.
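As an illustration of the finite-time growth rates mentioned in this abstract, the sketch below delay-embeds the interval series, finds each point's nearest neighbour, and averages the logarithmic divergence of neighbouring trajectory segments over a short horizon (a Rosenstein-style short-horizon estimate; the embedding dimension, lag, and horizon are illustrative assumptions, not the authors' exact settings).

```python
import numpy as np

def finite_time_growth_rate(x, dim=3, lag=1, horizon=5):
    """Mean log-divergence of nearest-neighbour trajectory pairs after `horizon` steps."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * lag
    emb = np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])
    rates = []
    for i in range(n - horizon):
        d0 = np.linalg.norm(emb[:n - horizon] - emb[i], axis=1)
        d0[max(0, i - 1):i + 2] = np.inf                # exclude self and direct neighbours
        j = int(np.argmin(d0))
        if np.isfinite(d0[j]) and d0[j] > 0:
            dh = np.linalg.norm(emb[i + horizon] - emb[j + horizon])
            if dh > 0:
                rates.append(np.log(dh / d0[j]) / horizon)
    return float(np.mean(rates))

rr = np.random.normal(800, 50, 1000)        # synthetic RR intervals in ms
print(finite_time_growth_rate(rr))          # larger values = faster local divergence
```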
Recent studies have claimed the existence of very massive stars (VMS) up to 300 M⊙ in the local Universe. As this finding may represent a paradigm shift for the canonical stellar upper-mass limit of 150 M⊙, it is timely to discuss the status of the data, as well as the far-reaching implications of such objects. We held a Joint Discussion at the General Assembly in Beijing to discuss (i) the determination of the current masses of the most massive stars, (ii) the formation of VMS, (iii) their mass loss, and (iv) their evolution and final fate. The prime aim was to reach broad consensus between observers and theorists on how to identify and quantify the dominant physical processes.