
    Universal features of correlated bursty behaviour

    Inhomogeneous temporal processes, like those appearing in human communications, neuron spike trains, and seismic signals, consist of high-activity bursty intervals alternating with long low-activity periods. In recent studies such bursty behaviour has been characterized by a fat-tailed inter-event time distribution, while temporal correlations were measured by the autocorrelation function. However, these characteristic functions cannot fully characterize temporally correlated heterogeneous behaviour. Here we show that the distribution of the number of events in a bursty period serves as a good indicator of the dependencies, leading to the universal observation of a power-law distribution in a broad class of phenomena. We find that the correlations in these quite different systems can be commonly interpreted as memory effects and described by a simple phenomenological model, which displays temporal behaviour qualitatively similar to that of real systems.
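The central quantity in the abstract above, the number of events in a bursty period, can be illustrated with a short sketch. The threshold definition used below (a bursty period as a maximal run of events whose inter-event gaps stay at or below a cutoff Δt) is a common convention assumed here for illustration, and the event times are fabricated:

```python
import numpy as np

def burst_sizes(times, dt):
    """Count events per bursty period: consecutive events whose gaps are <= dt
    belong to the same burst; a larger gap starts a new burst."""
    times = np.sort(np.asarray(times, dtype=float))
    sizes, current = [], 1
    for gap in np.diff(times):
        if gap <= dt:
            current += 1
        else:
            sizes.append(current)
            current = 1
    sizes.append(current)
    return sizes

# Fabricated event timestamps: a burst of 3, a burst of 2, then a lone event.
events = [0.0, 0.1, 0.3, 5.0, 5.2, 12.0]
print(burst_sizes(events, dt=1.0))  # [3, 2, 1]
```

The distribution of these burst sizes over many events is the quantity whose power-law form the paper reports across different systems.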

    Genetic inhibition of neurotransmission reveals role of glutamatergic input to dopamine neurons in high-effort behavior

    Midbrain dopamine neurons are crucial for many behavioral and cognitive functions. As the major excitatory input, glutamatergic afferents are important for controlling the activity and plasticity of dopamine neurons. However, the overall role of glutamatergic input onto dopamine neurons remains unclear. Here we developed a mouse line in which glutamatergic inputs onto dopamine neurons are specifically impaired, and used this genetic model to directly test the role of glutamatergic inputs in dopamine-related functions. We found that while motor coordination and reward learning were largely unchanged, these animals showed prominent deficits in effort-related behavioral tasks. These results provide genetic evidence that glutamatergic transmission onto dopaminergic neurons underlies incentive motivation, the willingness to exert high levels of effort to obtain reinforcers, and have important implications for understanding the normal function of the midbrain dopamine system.
    Authors: Hutchison, M. A.; Gu, X.; Adrover, Martín Federico; Lee, M. R.; Hnasko, T. S.; Alvarez, V. A.; Lu, W. (National Institutes of Health, United States; University of California San Diego, United States; Consejo Nacional de Investigaciones Científicas y Técnicas, Instituto de Investigaciones en Ingeniería Genética y Biología Molecular "Dr. Héctor N. Torres", Argentina)

    Quantitative Changes in Hydrocarbons over Time in Fecal Pellets of Incisitermes minor May Predict Whether Colonies Are Alive or Dead

    Hydrocarbon mixtures extracted from fecal pellets of drywood termites are species-specific and can be characterized to identify the termites responsible for damage, even when termites are no longer present or cannot easily be recovered. In structures infested by drywood termites, it is common to find fecal pellets but difficult to sample termites from the wood. When fecal pellets appear after remedial treatment of a structure, it is difficult to determine whether termites in the structure are still alive and active. We examined the hydrocarbon composition of workers, alates, and soldiers of Incisitermes minor (Hagen) (family Kalotermitidae) and of fecal pellets of workers. Hydrocarbons were qualitatively similar among castes and pellets. Fecal pellets aged for 0, 30, 90, and 365 days after collection were qualitatively similar across all time periods; however, the relative quantities of certain individual hydrocarbons changed over time, with 19 of the 73 hydrocarbon peaks increasing or decreasing in relative abundance. When the sums of the positive and negative slopes of these 19 hydrocarbons were indexed, they produced a highly significant linear correlation (R² = 0.89). Consequently, the quantitative differences in these hydrocarbon peaks can be used to estimate the age of worker fecal pellets, and thus help determine whether the colony that produced them is alive or dead.
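The slope-indexing idea described above can be sketched as follows. The peak abundances, ages, and slope threshold below are fabricated placeholders, not the study's data; the sketch only shows how per-peak slopes over pellet age might be fitted and summed into positive and negative indices:

```python
import numpy as np

ages = np.array([0, 30, 90, 365])  # days since collection, as in the study design

# Rows = hydrocarbon peaks, columns = relative abundance at each age (fabricated).
peaks = np.array([
    [10.0, 10.5, 11.2, 13.0],   # increases with age
    [8.0,  7.6,  7.1,  5.9],    # decreases with age
    [5.0,  5.1,  4.9,  5.0],    # roughly stable -> excluded from the index
])

# Fit a degree-1 least-squares line per peak; polyfit returns [slope, intercept].
slopes = np.array([np.polyfit(ages, p, 1)[0] for p in peaks])
changing = np.abs(slopes) > 0.001           # arbitrary cutoff for "changing" peaks
pos_index = slopes[changing & (slopes > 0)].sum()
neg_index = slopes[changing & (slopes < 0)].sum()
print(pos_index > 0 and neg_index < 0)  # True
```

In the study itself, such indices over the 19 changing peaks correlated strongly with pellet age (R² = 0.89), which is what makes age estimation from pellets feasible.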

    Quantitative cross-species extrapolation between humans and fish: The case of the anti-depressant fluoxetine

    This article has been made available through the Brunel Open Access Publishing Fund. Fish are an important model for the pharmacological and toxicological characterization of human pharmaceuticals in drug discovery, drug safety assessment, and environmental toxicology. However, do fish respond to pharmaceuticals as humans do? To address this question, we provide a novel quantitative cross-species extrapolation approach (qCSE) based on the hypothesis that similar plasma concentrations of pharmaceuticals cause comparable target-mediated effects in both humans and fish at similar levels of biological organization (Read-Across Hypothesis). To validate this hypothesis, the behavioural effects of the anti-depressant drug fluoxetine on the fish model fathead minnow (Pimephales promelas) were used as a test case. Fish were exposed for 28 days to a range of measured water concentrations of fluoxetine (0.1, 1.0, 8.0, 16, 32, 64 μg/L) to produce plasma concentrations below, within, and above the range of Human Therapeutic Plasma Concentrations (HTPCs). Fluoxetine and its metabolite, norfluoxetine, were quantified in the plasma of individual fish and linked to behavioural anxiety-related endpoints. The minimum drug plasma concentrations that elicited anxiolytic responses in fish were above the upper value of the HTPC range, whereas no effects were observed at plasma concentrations below the HTPCs. In vivo metabolism of fluoxetine was similar in humans and fish, displaying bi-phasic concentration-dependent kinetics driven by the auto-inhibitory dynamics and saturation of the enzymes that convert fluoxetine into norfluoxetine. The sensitivity of fish to fluoxetine was not dissimilar from that of patients affected by general anxiety disorders. These results represent the first direct evidence of a measured internal dose-response effect of a pharmaceutical in fish, validating the Read-Across Hypothesis as applied to fluoxetine. Overall, this study demonstrates that the qCSE approach, anchored to internal drug concentrations, is a powerful tool to guide the assessment of the sensitivity of fish to pharmaceuticals, and strengthens the translational power of cross-species extrapolation.
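A minimal sketch of the read-across comparison at the heart of the qCSE approach: classify a fish's measured plasma concentration against the HTPC range and flag effects only when it exceeds the upper bound. The HTPC bounds and fish plasma values below are illustrative assumptions, not the study's measurements:

```python
# Assumed HTPC bounds (ng/mL), chosen purely for illustration.
HTPC_RANGE_NG_ML = (30.0, 500.0)

def above_htpc(plasma_ng_ml: float, htpc=HTPC_RANGE_NG_ML) -> bool:
    """True when a measured plasma level exceeds the upper HTPC bound,
    the region where the study observed anxiolytic responses in fish."""
    return plasma_ng_ml > htpc[1]

# Fabricated per-fish plasma concentrations spanning below, within, and above range.
fish_plasma = [5.0, 120.0, 800.0]
print([above_htpc(c) for c in fish_plasma])  # [False, False, True]
```

The study's finding that only above-HTPC plasma levels elicited behavioural effects is what this kind of per-individual classification is meant to expose.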

    Comparison of anaemia and parasitaemia as indicators of malaria control in household and EPI-health facility surveys in Malawi

    Background: The World Health Organization has recommended that anaemia be used as an additional indicator to monitor malaria burden at the community level as malaria interventions are nationally scaled up. To date, there are no published evaluations of this recommendation.

    Methods: To evaluate this recommendation, anaemia and parasitaemia among children aged 6-30 months were compared during two repeated cross-sectional household (HH) and health facility (HF) surveys in six districts across Malawi, at baseline (2005) and in a follow-up survey (2008) after a scale-up of malaria control interventions.

    Results: HH net ownership did not increase between the years (50.5% vs. 49.8%), but insecticide-treated net (ITN) ownership increased modestly from 41.5% (95% CI: 37.2%-45.8%) in 2005 to 45.3% (95% CI: 42.6%-48.0%) in 2008. ITN use by children 6-30 months old living in HHs with at least one net increased from 73.6% (95% CI: 68.2%-79.1%) to 80.0% (95% CI: 75.9%-84.1%) over the three-year period. This modest increase in ITN use was associated with a decrease in moderate to severe anaemia (Hb <8 g/dl) from 18.4% (95% CI: 14.9%-21.8%) in 2005 to 15.4% (95% CI: 13.2%-17.7%) in 2008, while parasitaemia, measured by slide microscopy, decreased from 18.9% (95% CI: 14.7%-23.2%) to 16.9% (95% CI: 13.8%-20.0%), relative reductions of 16% and 11%, respectively. In HF surveys, anaemia prevalence decreased from 18.3% (95% CI: 14.9%-21.7%) to 15.4% (95% CI: 12.7%-18.2%), while parasitaemia decreased from 30.6% (95% CI: 25.7%-35.5%) to 13.2% (95% CI: 10.6%-15.8%), relative reductions of 15% and 57%, respectively.

    Conclusion: Increasing access to effective malaria prevention was associated with a reduced burden of malaria in young Malawian children. Anaemia measured at the HF level at the time of routine vaccination may be a good surrogate for HH-level measurement in evaluating national malaria control programmes.
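The relative reductions quoted above follow from the prevalence estimates by simple arithmetic, as this small helper shows using the abstract's own HF and HH parasitaemia figures:

```python
def relative_reduction(baseline_pct: float, followup_pct: float) -> float:
    """Relative reduction (%) from a baseline prevalence to a follow-up prevalence."""
    return 100.0 * (baseline_pct - followup_pct) / baseline_pct

# HF parasitaemia: 30.6% (2005) -> 13.2% (2008), quoted as a 57% relative reduction.
print(round(relative_reduction(30.6, 13.2)))  # 57
# HH parasitaemia: 18.9% -> 16.9%, quoted as an 11% relative reduction.
print(round(relative_reduction(18.9, 16.9)))  # 11
```

Note the distinction this makes plain: a 17.4-percentage-point absolute drop at the HF level translates to a 57% relative reduction because the baseline was higher.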

    Emergency Portacaval Shunt Versus Rescue Portacaval Shunt in a Randomized Controlled Trial of Emergency Treatment of Acutely Bleeding Esophageal Varices in Cirrhosis—Part 3

    Emergency treatment of bleeding esophageal varices in cirrhosis is of singular importance because of the high mortality rate. Emergency portacaval shunt is rarely used today because of the belief, unsubstantiated by long-term randomized trials, that it causes frequent portal-systemic encephalopathy and liver failure. Consequently, portacaval shunt has been relegated solely to salvage therapy when endoscopic and pharmacologic therapies have failed. Question: Is the regimen of endoscopic sclerotherapy with rescue portacaval shunt for failure to control bleeding varices superior to emergency portacaval shunt? A unique opportunity to answer this question was provided by a randomized controlled trial of endoscopic sclerotherapy versus emergency portacaval shunt conducted from 1988 to 2005. Unselected consecutive cirrhotic patients with acute bleeding esophageal varices were randomized to endoscopic sclerotherapy (n = 106) or emergency portacaval shunt (n = 105). Diagnostic workup was completed and treatment was initiated within 8 h. Failure of endoscopic sclerotherapy was defined by strict criteria and treated by rescue portacaval shunt (n = 50) whenever possible. Ninety-six percent of patients had more than 10 years of follow-up or were followed until death. Comparison of emergency portacaval shunt and endoscopic sclerotherapy followed by rescue portacaval shunt showed the following differences in outcome measures: (1) survival at 5 years (72% versus 22%), 10 years (46% versus 16%), and 15 years (46% versus 0%); (2) median post-shunt survival (6.18 versus 1.99 years); (3) mean requirement of packed red blood cell units (17.85 versus 27.80); (4) incidence of recurrent portal-systemic encephalopathy (15% versus 43%); (5) 5-year change in Child’s class showing improvement (59% versus 19%) or worsening (8% versus 44%); (6) mean quality-of-life score, in which lower is better (13.89 versus 27.89); and (7) mean cost of care per year ($39,200 versus $216,700). These differences were highly significant in favor of emergency portacaval shunt (all p < 0.001). Emergency portacaval shunt was strikingly superior to endoscopic sclerotherapy, as well as to the combination of endoscopic sclerotherapy and rescue portacaval shunt, in regard to all outcome measures, specifically bleeding control, survival, incidence of portal-systemic encephalopathy, improvement in liver function, quality of life, and cost of care. These results strongly support the use of emergency portacaval shunt as the first-line emergency treatment of bleeding esophageal varices in cirrhosis.

    Effect of a multi-faceted quality improvement intervention on inappropriate antibiotic use in children with non-bloody diarrhoea admitted to district hospitals in Kenya

    BACKGROUND: There are few reports of interventions to reduce the common but irrational use of antibiotics for acute non-bloody diarrhoea amongst hospitalised children in low-income settings. We undertook a secondary analysis of data from an intervention comprising training of health workers, facilitation, supervision, and face-to-face feedback, to assess whether it reduced inappropriate use of antibiotics in children with non-bloody diarrhoea and no co-morbidities requiring antibiotics, compared to a partial intervention comprising didactic training and written feedback only. This outcome was not a pre-specified end-point of the main trial. METHODS: Repeated cross-sectional survey data from a cluster-randomised controlled trial of an intervention to improve management of common childhood illnesses in Kenya were used to describe the prevalence of inappropriate antibiotic use over a 7-day period in children aged 2-59 months with acute non-bloody diarrhoea. Logistic regression models with random effects for hospital were then used to identify patient- and clinician-level factors associated with inappropriate antibiotic use and to assess the effect of the intervention. RESULTS: 9,459 admission records of children were reviewed for this outcome. Of these, 4,232 (44.7%) were diagnosed with diarrhoea, 130 of them bloody (dysentery) and therefore requiring antibiotics. 1,160 children had non-bloody diarrhoea and no co-morbidities requiring antibiotics; these were the focus of the analysis. 750 (64.7%) of them received antibiotics inappropriately, 313 in the intervention hospitals vs. 437 in the controls. The adjusted logistic regression model showed that the baseline-adjusted odds of inappropriate antibiotic prescription in the intervention hospitals were 0.30 times those in the control hospitals (95% CI 0.09-1.02).
CONCLUSION: We found some evidence that the multi-faceted, sustained intervention described in this paper led to a reduction in the inappropriate use of antibiotics in treating children with non-bloody diarrhoea. TRIAL REGISTRATION: International Standard Randomised Controlled Trial Number Register ISRCTN42996612.
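As a reading aid for the adjusted odds ratio of 0.30 reported above, the sketch below shows how an odds ratio relates to two raw prevalences. The group prevalences used are hypothetical, chosen only so the ratio comes out near 0.30; they are not the trial's arm-level rates:

```python
def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1.0 - p)

def odds_ratio(p_intervention: float, p_control: float) -> float:
    """Odds ratio for an event in the intervention group relative to control."""
    return odds(p_intervention) / odds(p_control)

# Hypothetical prevalences of inappropriate prescribing: 35% vs. 64%.
print(round(odds_ratio(0.35, 0.64), 2))  # 0.3
```

An odds ratio below 1 favours the intervention; the wide confidence interval in the trial (0.09-1.02) is why the authors describe the evidence as "some" rather than conclusive.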

    Soil Respiration in Tibetan Alpine Grasslands: Belowground Biomass and Soil Moisture, but Not Soil Temperature, Best Explain the Large-Scale Patterns

    The Tibetan Plateau is an essential area for studying the potential feedback of soils to climate change, owing to the rapid rise in its air temperature over the past several decades and its large stocks of soil organic carbon (SOC), particularly in the permafrost. Yet it is one of the most under-investigated regions in soil respiration (Rs) studies. Here, Rs rates were measured at 42 sites in alpine grasslands (including alpine steppes and meadows) along a transect across the Tibetan Plateau during the peak growing seasons of 2006 and 2007, in order to test whether: (1) belowground biomass (BGB) is most closely related to spatial variation in Rs due to high root biomass density, and (2) soil temperature significantly influences the spatial pattern of Rs owing to metabolic limitation from the low temperature in cold, high-altitude ecosystems. The average daily mean Rs of the alpine grasslands at peak growing season was 3.92 µmol CO2 m−2 s−1, ranging from 0.39 to 12.88 µmol CO2 m−2 s−1, with average daily mean Rs of 2.01 and 5.49 µmol CO2 m−2 s−1 for steppes and meadows, respectively. By regression tree analysis, BGB, aboveground biomass (AGB), SOC, soil moisture (SM), and vegetation type were selected, out of the 15 variables examined, as the factors influencing large-scale variation in Rs. With a structural equation modelling approach, we found that only BGB and SM had direct effects on Rs, while the other factors affected Rs indirectly through BGB or SM. Most (80%) of the variation in Rs could be attributed to differences in BGB among sites. BGB and SM together accounted for the majority (82%) of the spatial pattern of Rs. Our results support only the first hypothesis, suggesting that models incorporating BGB and SM can improve Rs estimation at the regional scale.
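The "variance explained" figures above come from model fitting; as a generic illustration (not the study's regression tree or structural equation model), the sketch below computes R² for a simple linear fit of Rs against BGB using synthetic data for 42 sites:

```python
import numpy as np

# Synthetic data standing in for the 42 transect sites; units and the BGB-Rs
# relationship are assumptions for illustration only.
rng = np.random.default_rng(0)
bgb = rng.uniform(100, 2000, 42)            # belowground biomass, g m^-2
rs = 0.004 * bgb + rng.normal(0, 0.8, 42)   # soil respiration, µmol CO2 m^-2 s^-1

# Degree-1 least-squares fit and the fraction of variance it explains.
slope, intercept = np.polyfit(bgb, rs, 1)
pred = slope * bgb + intercept
ss_res = np.sum((rs - pred) ** 2)
ss_tot = np.sum((rs - rs.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(0.0 <= r2 <= 1.0)  # True
```

In the study, the analogous quantity attributed 80% of the spatial variation in Rs to BGB alone and 82% to BGB and SM together.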