Stripe rust (Pst) is a major disease of wheat crops that, if untreated, leads to severe yield losses. The use of fungicides is often essential to control Pst when sudden outbreaks are imminent. Sensors capable of detecting Pst in wheat crops could optimize the use of fungicides and improve disease monitoring in high-throughput field phenotyping. Deep learning now provides new tools for image recognition and may pave the way for new camera-based sensors that can identify symptoms in the early stages of a disease outbreak within the field. The aim of this study was to train an image classifier, based on a deep residual neural network (ResNet), to detect Pst symptoms in winter wheat canopies. For this purpose, a large annotation database was created from images taken by a standard RGB camera mounted on a platform at a height of 2 m. Images were acquired while the platform was moved over a randomized field experiment with Pst-inoculated and Pst-free plots of winter wheat. The image classifier was trained with 224 x 224 px patches tiled from the original, unprocessed camera images and was tested on different stages of the disease outbreak. At patch level, the classifier reached a total accuracy of 90%. At image level, it was evaluated with a sliding window using a large stride of 224 px, allowing for fast test performance, and reached a total accuracy of 77%. Even at a stage with very little disease spread (0.5%), at the very beginning of the Pst outbreak, a detection accuracy of 57% was obtained. In the initial phase of the outbreak, with 2 to 4% disease spread, a detection accuracy of 76% was attained. With further optimization, the image classifier could be implemented in embedded systems and deployed on drones, vehicles or scanning systems for fast mapping of Pst outbreaks.
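The tiling and sliding-window evaluation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: `classify_patch` is a hypothetical stand-in for the trained ResNet (its red-vs-green heuristic merely mimics having a per-patch decision), and the image-level aggregation rule is an assumption.

```python
import numpy as np

PATCH = 224   # patch edge length in pixels, as in the study
STRIDE = 224  # non-overlapping stride, as in the study

def tile_patches(image: np.ndarray):
    """Yield (row, col, patch) for every full 224 x 224 px tile of an RGB image."""
    h, w = image.shape[:2]
    for r in range(0, h - PATCH + 1, STRIDE):
        for c in range(0, w - PATCH + 1, STRIDE):
            yield r, c, image[r:r + PATCH, c:c + PATCH]

def classify_patch(patch: np.ndarray) -> bool:
    """Hypothetical stand-in for the trained ResNet: True = Pst symptoms detected.
    (A real deployment would run the network on the patch instead.)"""
    return bool(patch[..., 0].mean() > patch[..., 1].mean())

def image_has_pst(image: np.ndarray, min_positive: int = 1) -> bool:
    """Assumed image-level decision rule: positive if at least
    `min_positive` patches fire."""
    hits = sum(classify_patch(p) for _, _, p in tile_patches(image))
    return hits >= min_positive

# Example: a synthetic 448 x 672 px image tiles into 2 x 3 = 6 patches.
img = np.zeros((448, 672, 3), dtype=np.uint8)
print(sum(1 for _ in tile_patches(img)))  # 6
```

Because the stride equals the patch size, patches do not overlap, so an image is visited exactly once per pixel; this is what makes the image-level evaluation fast.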
This study aimed to compare the training load of a professional under-19 soccer team (U-19) to that of an elite adult team (EAT), from the same club, during the in-season period. Thirty-nine healthy soccer players (EAT [n = 20]; U-19 [n = 19]) were involved in the study, which spanned four weeks. Training load (TL) was monitored as external TL, using a global positioning system (GPS), and internal TL, using a rating of perceived exertion (RPE). TL data were recorded after each training session. During soccer matches, players’ RPEs were recorded. The internal TL was quantified daily by means of the session rating of perceived exertion (session-RPE) using Borg’s 0–10 scale. For GPS data, the selected running speed intensities (over 0.5 s time intervals) were 12–15.9 km/h; 16–19.9 km/h; 20–24.9 km/h; >25 km/h (sprint). Distances covered between 16 and 19.9 km/h, >20 km/h and >25 km/h were significantly higher in U-19 compared to EAT over the course of the study (p = 0.023, d = 0.243, small; p = 0.016, d = 0.298, small; and p = 0.001, d = 0.564, small, respectively). EAT players performed significantly fewer sprints per week compared to U-19 players (p = 0.002, d = 0.526, small). RPE was significantly higher in U-19 compared to EAT (p = 0.001, d = 0.188, trivial). The external and internal measures of TL were significantly higher in the U-19 group compared to the EAT soccer players. In conclusion, the results show that training load is greater in U-19 compared to EAT.
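The session-RPE quantification of internal load mentioned above is commonly computed as the Borg CR-10 rating multiplied by session duration in minutes (Foster's method), summed over a week. A minimal sketch under that assumption; the example values are illustrative, not data from the study:

```python
def session_rpe_load(rpe: float, duration_min: float) -> float:
    """Internal training load for one session, in arbitrary units (AU):
    Borg CR-10 rating multiplied by session duration in minutes."""
    if not 0 <= rpe <= 10:
        raise ValueError("RPE must be on Borg's 0-10 scale")
    return rpe * duration_min

def weekly_load(sessions) -> float:
    """Sum of session loads over a training week.
    `sessions` is an iterable of (rpe, duration_min) pairs."""
    return sum(session_rpe_load(r, d) for r, d in sessions)

# Illustrative week (hypothetical values):
week = [(6, 90), (4, 60), (7, 75), (5, 80)]
print(weekly_load(week))  # 6*90 + 4*60 + 7*75 + 5*80 = 1705 AU
```

Summing per-session products rather than averaging RPE preserves the contribution of session duration, which is why the method is used for weekly load monitoring.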
The aim of this study is to monitor short-term seasonal development of young Olympic weightlifters’ anthropometry, body composition, physical fitness, and sport-specific performance. Fifteen male weightlifters aged 13.2 ± 1.3 years participated in this study. Tests for the assessment of anthropometry (e.g., body-height, body-mass), body-composition (e.g., lean-body-mass, relative fat-mass), muscle strength (grip-strength), jump performance (drop-jump (DJ) height, countermovement-jump (CMJ) height, DJ contact time, DJ reactive-strength-index (RSI)), dynamic balance (Y-balance-test), and sport-specific performance (i.e., snatch and clean-and-jerk) were conducted at different time-points (i.e., T1 (baseline), T2 (9 weeks), T3 (20 weeks)). Strength tests (i.e., grip strength, clean-and-jerk and snatch) and training volume were normalized to body mass. Results showed small-to-large increases in body-height, body-mass, lean-body-mass, and lower-limbs lean-mass from T1-to-T2 and T2-to-T3 (∆0.7–6.7%; 0.1 ≤ d ≤ 1.2). For fat-mass, a significant small-sized decrease was found from T1-to-T2 (∆13.1%; d = 0.4) and a significant increase from T2-to-T3 (∆9.1%; d = 0.3). A significant main effect of time was observed for DJ contact time (d = 1.3) with a trend toward a significant decrease from T1-to-T2 (∆–15.3%; d = 0.66; p = 0.06). For RSI, significant small increases from T1-to-T2 (∆9.9%, d = 0.5) were noted. Additionally, a significant main effect of time was found for snatch (d = 2.7) and clean-and-jerk (d = 3.1) with significant small-to-moderate increases for both tests from T1-to-T2 and T2-to-T3 (∆4.6–11.3%, d = 0.33 to 0.64). The other tests did not change significantly over time (0.1 ≤ d ≤ 0.8). Results showed significantly higher training volume for sport-specific training during the second period compared with the first period (d = 2.2). 
Five months of Olympic weightlifting contributed to significant changes in anthropometry, body-composition, and sport-specific performance. However, hardly any significant gains were observed for measures of physical fitness. Coaches are advised to design training programs that target a variety of fitness components to lay an appropriate foundation for later performance as an elite athlete.
The link between training load and injury rate remains controversial in the literature. Thus, the aims of this non-interventional study were to evaluate the relationships of pre-season training load with biochemical markers, injury incidence and performance during the first month of the competitive period in professional soccer players.
The 933 km² Bengue catchment in northeastern Brazil is characterized by distinct rainy and dry seasons. Precipitation is stored in variously sized reservoirs, which are essential for the local population. In this study, we used TerraSAR-X SM(HH) data for one-year monitoring of seasonal changes in the reservoir areas from July 2011 to July 2012. The monitoring was based on acquisitions in the ascending pass direction, complemented by occasional descending-pass images. To detect water surface areas, a histogram analysis followed by a global threshold classification was performed, and the results were validated using in situ GPS data. Distinguishing between small reservoirs and similar-looking dark areas was difficult. Therefore, we tested several approaches for identifying misclassified areas. An analysis of the surface area dynamics of the reservoirs indicated high spatial and temporal heterogeneities and a large decrease in the total water surface area of the reservoirs in the catchment by approximately 30% within one year.
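The histogram analysis followed by global threshold classification described above can be sketched with a standard histogram-based threshold. The abstract does not specify the exact thresholding rule, so the sketch assumes Otsu's method, and the synthetic backscatter values (dark water near -18 dB, land near -6 dB) are illustrative, not from the study:

```python
import numpy as np

def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Global threshold from the histogram (Otsu's method): pick the level
    that maximizes the between-class variance of the two resulting classes."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()          # bin probabilities
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                            # class-0 (dark) weight
    mu = np.cumsum(p * centers)                  # cumulative mean
    mu_t = mu[-1]                                # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    between = np.nan_to_num(between)             # empty classes -> 0
    return float(centers[int(np.argmax(between))])

# Synthetic backscatter: dark water pixels around -18 dB, land around -6 dB.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(-18, 1, 5000), rng.normal(-6, 1, 15000)])
t = otsu_threshold(img)
water = img < t  # pixels darker than the threshold are classified as water
```

In SAR imagery, smooth water surfaces reflect the signal away from the sensor and appear dark, which is why a low-backscatter threshold separates reservoirs from land; the misclassification problem noted in the abstract arises because other smooth surfaces can be just as dark.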