SCAMP: standardised, concentrated, additional macronutrients, parenteral nutrition in very preterm infants: a phase IV randomised, controlled exploratory study of macronutrient intake, growth and other aspects of neonatal care
Background: Infants born <29 weeks gestation are at high risk of neurocognitive disability. Early postnatal growth failure, particularly of head growth, is an important and potentially reversible risk factor for impaired neurodevelopmental outcome. Inadequate nutrition is a major factor in this postnatal growth failure; optimal protein and calorie (macronutrient) intakes are rarely achieved, especially in the first week. Infants <29 weeks depend on parenteral nutrition for the bulk of their nutrient needs for the first 2-3 weeks of life to allow gut adaptation to milk digestion. The prescription, formulation and administration of neonatal parenteral nutrition are critical to achieving optimal protein and calorie intake but have received little scientific evaluation. Current neonatal parenteral nutrition regimens often rely on individualised prescription to manage the labile, unpredictable biochemical and metabolic control characteristic of the early neonatal period. Individualised prescription frequently fails to translate into optimal macronutrient delivery. We have previously shown that a standardised, concentrated neonatal parenteral nutrition regimen can optimise macronutrient intake.
Methods: We propose a single-centre, randomised controlled exploratory trial of two standardised, concentrated neonatal parenteral nutrition regimens comparing a standard macronutrient content (maximum protein 2.8 g/kg/day; lipid 2.8 g/kg/day; dextrose 10%) with a higher macronutrient content (maximum protein 3.8 g/kg/day; lipid 3.8 g/kg/day; dextrose 12%) over the first 28 days of life. 150 infants of 24-28 completed weeks gestation and birthweight <1200 g will be recruited. The primary outcome will be head growth velocity in the first 28 days of life. Secondary outcomes will include a) auxological data between birth and 36 weeks corrected gestational age, b) actual macronutrient intake in the first 28 days, c) biomarkers of biochemical and metabolic tolerance, d) infection biomarkers and other intravascular line complications, e) incidence of major complications of prematurity including mortality, and f) neurodevelopmental outcome at 2 years corrected gestational age.
Trial registration: Current Controlled Trials ISRCTN76597892 (http://www.controlled-trials.com/ISRCTN76597892); EudraCT Number: 2008-008899-14
The evolution of the terrestrial-terminating Irish Sea glacier during the last glaciation
Here we reconstruct the last advance to maximum limits and retreat of the Irish Sea Glacier (ISG), the only land‐terminating ice lobe of the western British Irish Ice Sheet. A series of reverse bedrock slopes rendered proglacial lakes endemic, forming time‐transgressive moraine‐ and bedrock‐dammed basins that evolved with ice-marginal retreat. Combining, for the first time on glacial sediments, optically stimulated luminescence (OSL) bleaching profiles for cobbles with single-grain and small-aliquot OSL measurements on sands has produced a coherent chronology from these heterogeneously bleached samples. This chronology constrains what is globally an early build‐up of ice during late Marine Isotope Stage 3 and Greenland Stadial (GS) 5, with ice margins reaching south Lancashire by 30 ± 1.2 ka, followed by a 120‐km advance at 28.3 ± 1.4 ka, reaching its 26.5 ± 1.1 ka maximum extent during GS‐3. Early retreat during GS‐3 reflects piracy of ice sources shared with the Irish Sea Ice Stream (ISIS), starving the ISG. With ISG retreat, an opportunistic readvance of Welsh ice during GS‐2 rode over the ISG moraines, occupying the space vacated, with ice margins oscillating within a substantial glacial over‐deepening. Our geomorphological chronosequence shows a glacial system forced by climate but mediated by piracy of ice sources shared with the ISIS, changing flow regimes and fronting environments.
Application of the speed-duration relationship to normalize the intensity of high-intensity interval training
The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, which was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with bouts of 4 min recovery, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of the sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of each of the 4 bouts (Bout 1: 229±27 s; Bout 2: 262±37 s; Bout 3: 235±49 s; Bout 4: 235±53 s; P>0.050). However, there was significantly less high-intensity work completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m), and 4 (136.7±39.3 m) compared with bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol that allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols.
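The WR4, WR6 and WR8 speeds follow from the hyperbolic S-tLIM relationship, whose linear distance-time form is d = CS·t + D' (critical speed CS and curvature constant D'). The Python sketch below shows one way such predictions could be derived from constant-speed test data; the fitting approach, variable names and all numerical values are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np

# Hypothetical constant-speed test results (illustrative values only):
# treadmill speeds (km/h) and the measured times to intolerance (s).
speeds_kmh = np.array([16.0, 17.0, 18.0, 19.0])
t_lim_s = np.array([720.0, 480.0, 330.0, 240.0])

# Linear distance-time form of the hyperbolic relationship: d = CS*t + D'.
speeds_ms = speeds_kmh / 3.6
distances_m = speeds_ms * t_lim_s
cs_ms, d_prime_m = np.polyfit(t_lim_s, distances_m, 1)  # slope = CS (m/s), intercept = D' (m)

def predicted_speed(t_target_s):
    """Speed (m/s) predicted to bring exercise to intolerance at t_target_s."""
    return cs_ms + d_prime_m / t_target_s

# Speeds predicted to induce intolerance at 4, 6 and 8 min (WR4, WR6, WR8).
for label, t in [("WR4", 240), ("WR6", 360), ("WR8", 480)]:
    print(f"{label}: {predicted_speed(t) * 3.6:.2f} km/h")
```

Under this model the predicted speed falls as the target duration lengthens, which is why WR6 rather than WR4 ends up sustainable across the full 4×4 min session.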
Neurospora from natural populations: population genomics insights into the life history of a model microbial eukaryote
The ascomycete filamentous fungus Neurospora crassa played a historic role in experimental biology and became a model system for genetic research. Stimulated by a systematic effort to collect wild strains initiated by Stanford geneticist David Perkins, the genus Neurospora has also become a basic model for the study of evolutionary processes, speciation, and population biology. In this chapter, we will first trace the history that brought Neurospora into the era of population genomics. We will then cover the major contributions of population genomic investigations using Neurospora to our understanding of microbial biogeography and speciation, and review recent work using population genomics and genome-wide association mapping that illustrates the unique potential of Neurospora as a model for identifying the genetic basis of (potentially adaptive) phenotypes in filamentous fungi. The advent of population genomics has helped to firmly establish Neurospora as a complete model system, and we hope our review will entice biologists to include Neurospora in their research.
Reconstructing the three-dimensional GABAergic microcircuit of the striatum
A system's wiring constrains its dynamics, yet modelling of neural structures often overlooks the specific networks formed by their neurons. We developed an approach for constructing anatomically realistic networks and reconstructed the GABAergic microcircuit formed by the medium spiny neurons (MSNs) and fast-spiking interneurons (FSIs) of the adult rat striatum. We grew dendrite and axon models for these neurons and extracted probabilities for the presence of these neurites as a function of distance from the soma. From these, we found the probabilities of intersection between the neurites of two neurons given their inter-somatic distance, and used these to construct three-dimensional striatal networks. The MSN dendrite models predicted that half of all dendritic spines are within 100 μm of the soma. The constructed networks predict distributions of gap junctions between FSI dendrites, synaptic contacts between MSNs, and synaptic inputs from FSIs to MSNs that are consistent with current estimates. The models predict that to achieve this, FSIs should be at most 1% of the striatal population. They also show that the striatum is sparsely connected: FSI-MSN and MSN-MSN contacts respectively form 7% and 1.7% of all possible connections. The models predict two striking network properties: the dominant GABAergic input to a MSN arises from neurons with somas at the edge of its dendritic field; and FSIs are interconnected on two different spatial scales: locally by gap junctions and distally by synapses. We show that both properties influence striatal dynamics: the most potent inhibition of a MSN arises from a region of striatum at the edge of its dendritic field; and the combination of local gap junction and distal synaptic networks between FSIs sets a robust input-output regime for the MSN population. Our models thus intimately link striatal micro-anatomy to its dynamics, providing a biologically grounded platform for further study.
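As a rough illustration of the network-construction step described above, the sketch below places somas in a volume and draws connections from a distance-dependent probability. The volume, neuron count and probability curve are placeholder assumptions; in the paper the probabilities come from the grown dendrite and axon models, not from an analytic function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameters: cube side (um), neuron count, and a toy
# distance-dependent connection probability standing in for the
# neurite-intersection curves derived in the study.
side_um, n_neurons = 300.0, 500
somas = rng.uniform(0.0, side_um, size=(n_neurons, 3))

def p_connect(d_um):
    """Toy probability of a contact between two neurons at inter-somatic
    distance d_um; the real curves come from grown neurite models."""
    return 0.3 * np.exp(-d_um / 100.0)

# Pairwise inter-somatic distances, then independent Bernoulli draws
# give a directed connection matrix for the constructed network.
diff = somas[:, None, :] - somas[None, :, :]
dist = np.linalg.norm(diff, axis=-1)
adjacency = rng.random((n_neurons, n_neurons)) < p_connect(dist)
np.fill_diagonal(adjacency, False)  # no self-connections

print(f"connection density: {adjacency.mean():.3%}")
```

Sparse densities of the order reported in the abstract (a few percent of all possible contacts) emerge naturally once the probability curve decays over distances comparable to a dendritic field.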
Rethinking Serious Games Design in the Age of COVID-19: Setting the Focus on Wicked Problems
We live in a complex world, in which our existence is defined by forces that we cannot fully comprehend, predict, or control. This is the world of wicked problems, of which the situation triggered by the COVID-19 pandemic is a notable example. Wicked problems are complex scenarios defined by the interplay of multiple environmental, social and economic factors. They are ever-changing, and largely unpredictable and uncontrollable. As a consequence, wicked problems cannot be definitively solved through traditional problem-solving approaches. Instead, they should be iteratively managed, recognizing and valuing our connectedness with each other and the environment, and engaging in joint thinking and action to identify and pursue the common good. Serious games can be key to foster wicked problem management abilities. To this end, they should engage players in collective activities set in contexts simulating real-world wicked problem scenarios. These should require the continuous interpretation of changing circumstances to identify and pursue shared goals, promoting the development of knowledge, attitudes and skill sets relevant to tackle real-world situations. In this paper we outline the nature, implications and challenges of wicked problems, highlighting why games should be leveraged to foster wicked problem management abilities. Then, we propose a theory-based framework to support the design of games for this purpose.
Detection of subclinical keratoconus using biometric parameters
The validation of innovative methodologies for diagnosing keratoconus in its earliest stages is of major interest in ophthalmology. So far, subclinical keratoconus diagnosis has been made by combining several clinical criteria that allowed the definition of indices and decision trees, which proved to be valuable diagnostic tools. However, further improvements need to be made in order to reduce the risk of ectasia in patients who undergo corneal refractive surgery. The purpose of this work is to report a new subclinical keratoconus detection method based on the analysis of certain biometric parameters extracted from a custom 3D corneal model. This retrospective study includes two groups: the first composed of 67 patients with healthy eyes and normal vision, and the second composed of 24 patients with subclinical keratoconus and normal vision as well. The proposed detection method generates a 3D custom corneal model using computer-aided graphic design (CAGD) tools and corneal surfaces' data provided by a corneal tomographer. Defined bio-geometric parameters are then derived from the model and statistically analysed to detect any minimal corneal deformation. The metric which showed the highest area under the receiver operating characteristic (ROC) curve was the posterior apex deviation. This new method detected differences between healthy corneas and subclinical keratoconus corneas with abnormal corneal topography but normal spectacle-corrected vision, enabling an integrated tool that facilitates an easier diagnosis and follow-up of keratoconus.
This publication has been carried out in the framework of the Thematic Network for Co-Operative Research in Health (RETICS) reference number RD16/0008/0012 financed by the Carlos III Health Institute-General Subdirection of Networks and Cooperative Investigation Centers (R&D&I National Plan 2013-2016) and the European Regional Development Fund (FEDER).
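For the ROC step, the sketch below shows how a single biometric parameter (here labelled posterior apex deviation) could be scored against the two groups. All values are synthetic, the unit is assumed, and the scikit-learn workflow is an assumption standing in for whatever statistical software the authors actually used.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# Hypothetical posterior apex deviation values (mm) for the two groups;
# real values would come from the custom 3D corneal model.
healthy = rng.normal(0.10, 0.03, 67)       # 67 healthy eyes
subclinical = rng.normal(0.18, 0.05, 24)   # 24 subclinical keratoconus eyes

values = np.concatenate([healthy, subclinical])
labels = np.concatenate([np.zeros(67), np.ones(24)])  # 1 = subclinical keratoconus

auc = roc_auc_score(labels, values)
fpr, tpr, thresholds = roc_curve(labels, values)
best = np.argmax(tpr - fpr)  # Youden's J: maximise sensitivity + specificity - 1
print(f"AUC = {auc:.3f}, candidate cut-off = {thresholds[best]:.3f} mm")
```

Ranking each candidate biometric parameter by its AUC in this way is one standard route to identifying the most discriminative metric, consistent with the abstract's report that posterior apex deviation scored highest.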
Extragalactic Radio Continuum Surveys and the Transformation of Radio Astronomy
Next-generation radio surveys are about to transform radio astronomy by discovering and studying tens of millions of previously unknown radio sources. These surveys will provide new insights into the evolution of galaxies, measure the evolution of the cosmic star formation rate, and rival traditional techniques in the measurement of fundamental cosmological parameters. By observing a new volume of observational parameter space, they are also likely to discover unexpected new phenomena. This review traces the evolution of extragalactic radio continuum surveys from the earliest days of radio astronomy to the present, and identifies the challenges that must be overcome to achieve this transformational change.
Comment: To be published in Nature Astronomy 18 Sept 201
Internal representations, external representations and ergonomics: towards a theoretical integration