SpineMan is a prototype of a soft robotic manipulator constructed of alternating hard and soft segments, similar to the human spine. Such soft segments overcome the rigidity of conventional robots and enable safer workspaces in which humans and machines can work side by side under less stringent safety restrictions. As the viscoelastic material, we used a hydrogel consisting of poly(vinyl alcohol) and borax. The mechanical properties of the hydrogel were tailored by embedding silica particles of various particle sizes and at different mass fractions. Increased mass contents as well as larger particle sizes strongly enhanced the rigidity, with the storage modulus of the composite more than doubling compared to the pure hydrogel. Furthermore, specific functionalities were introduced by incorporating superparamagnetic Fe3O4 nanoparticles, which can in principle be used to sense robotic motion and detect malfunctions. To this end, we precisely adjusted the saturation magnetization of the soft segments using defined mass contents of the nanoparticles. To ensure long-term shape stability and to shield the prepared composites from atmospheric influences, a silicone skin of specific Shore hardness was used. The composites and the soft segments were characterized by oscillation measurements, cryo-SEM, bending tests and SQUID measurements, which give insights into their properties in both the passive and the moving state of SpineMan. The tailored composites yielded highly flexible, reinforced and functional soft segments, which ensure stability, easy movability by springs of the shape memory alloy nitinol, and prevention of total failure.
Biomarkers are used to predict phenotypic properties before these features become apparent and are therefore valuable tools for both fundamental and applied research. Diagnostic biomarkers were discovered in medicine many decades ago and are now commonly applied. While this is routine in the field of medicine, it is surprising that this approach has never been investigated in agriculture. Until now, the prediction of phenotypes in plants has relied on growing the plants and assaying the organs of interest, a time-intensive process. In this study, we demonstrate for the first time the application of metabolomics to predict agronomically important phenotypes of a crop plant grown in different environments. Our procedure combines established techniques for untargeted, parallel screening of a large number of metabolites with machine learning methods. Using this combination of metabolomics and biomathematical tools, we identified metabolites that can be used as biomarkers to improve the prediction of traits. The predictive metabolites can subsequently be selected and used to develop fast, targeted and low-cost diagnostic biomarker assays that can be implemented in breeding programs or quality assessment analyses. The identified metabolic biomarkers allow the prediction of crop product quality. Furthermore, marker-assisted selection can benefit from the discovery of metabolic biomarkers where other molecular markers reach their limits. The described marker selection method was developed for potato tubers but is generally applicable to any crop and trait, as it functions independently of genomic information.
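The marker selection step can be illustrated with a minimal sketch: rank metabolite features by how strongly they covary with a measured trait and keep the top candidates as biomarker hypotheses. The function name, the synthetic data, and the use of plain Pearson correlation (in place of the full machine learning pipeline described above) are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def rank_metabolite_markers(X, y, top_k=3):
    """Rank metabolite features by absolute Pearson correlation with a trait.

    X: (samples, metabolites) intensity matrix; y: trait values per sample.
    Returns the indices of the top_k candidate biomarkers, strongest first.
    """
    Xc = X - X.mean(axis=0)          # centre each metabolite column
    yc = y - y.mean()                # centre the trait
    # Pearson correlation of every metabolite column with the trait at once
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    return np.argsort(-np.abs(r))[:top_k]

# Synthetic example: metabolite 0 tracks the trait, the rest are noise.
rng = np.random.default_rng(0)
y = rng.normal(size=50)
X = rng.normal(size=(50, 10))
X[:, 0] = y + 0.1 * rng.normal(size=50)   # strongly predictive metabolite
print(rank_metabolite_markers(X, y))       # metabolite 0 ranks first
```

In practice such a ranking would be followed by validation across environments before a metabolite is promoted to a targeted diagnostic assay.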
While the Intergovernmental Panel on Climate Change (IPCC) physical science reports usually assess a handful of future scenarios, the Working Group III contribution on climate mitigation to the IPCC's Sixth Assessment Report (AR6 WGIII) assesses hundreds to thousands of future emissions scenarios. A key task in WGIII is to assess the global mean temperature outcomes of these scenarios in a consistent manner, given the challenge that the emissions scenarios from different integrated assessment models (IAMs) come with different sectoral and gas coverage and cannot all be assessed consistently by complex Earth system models. In this work, we describe the “climate-assessment” workflow and its methods, including infilling of missing emissions and emissions harmonisation, as applied to 1202 mitigation scenarios in AR6 WGIII. We evaluate the global mean temperature projections and effective radiative forcing (ERF) characteristics of the climate emulators FaIRv1.6.2 and MAGICCv7.5.3 and use the CICERO simple climate model (CICERO-SCM) for sensitivity analysis. We discuss the implied overshoot severity of the mitigation pathways using overshoot degree years and examine the emissions and temperature characteristics of scenarios compatible with one possible interpretation of the Paris Agreement. We find that the lowest class of emissions scenarios that limit global warming to “1.5 ∘C (with a probability of greater than 50 %) with no or limited overshoot” includes 97 scenarios for MAGICCv7.5.3 and 203 for FaIRv1.6.2. For the MAGICCv7.5.3 results, “limited overshoot” typically implies exceedance of median temperature projections of up to about 0.1 ∘C for up to a few decades before returning to below 1.5 ∘C by or before the year 2100.
For more than half of the scenarios in this category that comply with three criteria for being “Paris-compatible”, including net-zero or net-negative greenhouse gas (GHG) emissions, median temperatures decline by about 0.3–0.4 ∘C after peaking at 1.5–1.6 ∘C in 2035–2055. We compare the methods applied in AR6 with those used for SR1.5 and discuss their implications. This article also introduces a “climate-assessment” Python package which allows the IPCC AR6 WGIII temperature assessment to be fully reproduced. This work provides a community tool for assessing the temperature outcomes of emissions pathways and a basis for further work, such as extending the workflow to include downscaling of climate characteristics to a regional level and calculating impacts.
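The emissions harmonisation step mentioned above aligns each model's reported emissions with a common historical estimate in a reference year before climate assessment. A minimal sketch of one common approach, a constant-ratio correction that converges linearly back to the model trajectory, is shown below; the function name, the data, and the convergence choice are illustrative assumptions rather than the exact AR6 configuration.

```python
def harmonise_ratio(model, historical, harm_year, converge_year):
    """Scale a modelled emissions trajectory to match a historical value.

    The trajectory is multiplied by a correction ratio that equals
    historical/model in harm_year and decays linearly to 1 (no correction)
    by converge_year, so the far future follows the model unchanged.

    model: dict {year: emissions}; historical: emissions in harm_year.
    """
    ratio0 = historical / model[harm_year]
    out = {}
    for year, value in model.items():
        if year >= converge_year:
            ratio = 1.0  # fully converged back to the model trajectory
        else:
            frac = (year - harm_year) / (converge_year - harm_year)
            ratio = ratio0 + (1.0 - ratio0) * frac
        out[year] = value * ratio
    return out

# Hypothetical CO2 pathway (GtCO2/yr) harmonised to 38.0 in 2015:
pathway = {2015: 40.0, 2050: 20.0, 2100: 10.0}
print(harmonise_ratio(pathway, 38.0, 2015, 2100))
```

The harmonised series matches the historical value in 2015 and the original model value from 2100 onwards, with intermediate years smoothly interpolated.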
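The overshoot degree years metric used above to quantify overshoot severity can be read as a discrete time integral of warming in excess of a threshold. A minimal sketch, assuming annual median temperature values (the function name and the numbers are illustrative):

```python
def overshoot_degree_years(temps, threshold=1.5):
    """Time-integrated exceedance above a warming threshold, in degree-years.

    temps: iterable of annual global mean temperature anomalies (degC).
    Sums max(T - threshold, 0) over the years, i.e. a discrete integral
    of the exceedance with a one-year time step.
    """
    return sum(max(t - threshold, 0.0) for t in temps)

# A pathway briefly peaking at 1.6 degC before declining below 1.5 degC:
temps = [1.45, 1.55, 1.6, 1.6, 1.5, 1.4]
print(round(overshoot_degree_years(temps), 2))  # 0.05 + 0.1 + 0.1 = 0.25
```

By this measure, a pathway that overshoots by 0.1 ∘C for a few decades accumulates a few degree-years, which is how "limited overshoot" scenarios can be compared quantitatively.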