54 research outputs found

    Distribution and seasonality of rhinovirus and other respiratory viruses in a cross-section of asthmatic children in Trinidad, West Indies

    Background: Childhood asthma in the Caribbean is advancing in prevalence and morbidity. Although viral respiratory tract infections are reported triggers for exacerbations, information on these infections in asthma is sparse in Caribbean territories. We examined the distribution of respiratory viruses and their association with seasons in acute and stable asthmatic children in Trinidad. Methods: In a cross-sectional study of 70 wheezing children attending the emergency department for nebulisation and 80 stable control subjects (2 to 16 yr of age) attending the asthma clinic, nasal specimens were collected during the dry (n = 38, January to May) and rainy (n = 112, June to December) seasons. A multitarget, sensitive, specific, high-throughput Respiratory MultiCode assay tested for respiratory-virus sequences from eight distinct groups: human rhinovirus, respiratory syncytial virus, parainfluenza virus, influenza virus, metapneumovirus, adenovirus, coronavirus, and enterovirus. Results: Wheezing children had a higher prevalence of respiratory viruses than stable asthmatics (34.3% (24) vs. 17.5% (14); χ² = 5.561, p = 0.018), and the odds of respiratory-virus infection were 2.5 times higher in acute asthmatics (OR = 2.5, 95% CI = 1.2 – 5.3). The predominant pathogens detected in acute versus stable asthmatics were rhinovirus (RV) (n = 18, 25.7% vs. n = 7, 8.8%; p = 0.005), respiratory syncytial virus B (RSV B) (n = 2, 2.9% vs. n = 4, 5.0%), and enterovirus (n = 1, 1.4% vs. n = 2, 2.5%). Strong odds of rhinoviral infection were observed among nebulised children compared with stable asthmatics (OR = 3.6, 95% CI = 1.4 – 9.3, p = 0.005). RV was prevalent throughout the year (dry: n = 6, 15.8%; rainy: n = 19, 17.0%) without seasonal association (χ² = 0.028, p = 0.867), and it was the most frequently detected virus in both seasons (dry: 6/10, 60.0%; rainy: 19/28, 67.9%). Conclusion: Emergent wheezing illnesses during childhood can be linked to rhinovirus infection in Trinidad's tropical environment, and virus-induced exacerbations of asthma occur independently of season in this tropical climate. Further clinical and virological investigation of the role of rhinovirus infection in Caribbean childhood wheeze is recommended.
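    As a check on the comparison reported above, the 2 × 2 table implied by the stated counts (24 of 70 acute vs. 14 of 80 stable children virus-positive) reproduces the quoted χ² and odds ratio. The snippet below is a minimal sketch assuming a standard Pearson chi-square test without continuity correction; it is an illustration, not the authors' analysis code.

```python
# Hypothetical reconstruction of the 2x2 table from the counts in the abstract:
# 24/70 acute wheezing children and 14/80 stable asthmatics were virus-positive.
from scipy.stats import chi2_contingency

table = [[24, 70 - 24],   # acute: virus-positive, virus-negative
         [14, 80 - 14]]   # stable: virus-positive, virus-negative

chi2, p, dof, expected = chi2_contingency(table, correction=False)
odds_ratio = (24 * 66) / (46 * 14)

print(f"chi2 = {chi2:.3f}, p = {p:.3f}")   # ~5.56 and ~0.018, as quoted
print(f"odds ratio = {odds_ratio:.2f}")    # ~2.5, as quoted
```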

    Estimating the NIH Efficient Frontier

    Background: The National Institutes of Health (NIH) is among the world’s largest investors in biomedical research, with a mandate to “lengthen life, and reduce the burdens of illness and disability.” Its funding decisions have been criticized as insufficiently focused on disease burden. We hypothesize that modern portfolio theory can create a closer link between basic research and outcome, and offer insight into basic-science-related improvements in public health. We propose portfolio theory as a systematic framework for making biomedical funding allocation decisions, one that is directly tied to the risk/reward trade-off of burden-of-disease outcomes. Methods and Findings: Using data from 1965 to 2007, we provide estimates of the NIH “efficient frontier”, the set of funding allocations across 7 groups of disease-oriented NIH institutes that yield the greatest expected return on investment for a given level of risk, where return on investment is measured by the subsequent impact on U.S. years of life lost (YLL). The results suggest that NIH may be actively managing its research risk, given that the volatility of its current allocation is 17% less than that of an equal-allocation portfolio with similar expected returns. The estimated efficient frontier suggests that further improvements in expected return (89% to 119% vs. current) or reductions in risk (22% to 35% vs. current) are available holding risk or expected return, respectively, constant, and that a 28% to 89% greater decrease in average years of life lost per unit of risk may be achievable. However, these results also reflect the imprecision of YLL as a measure of disease burden, the noisy statistical link between basic research and YLL, and other known limitations of portfolio theory itself. Conclusions: Our analysis is intended to serve as a proof-of-concept and starting point for applying quantitative methods to allocating biomedical research funding that are objective, systematic, transparent, repeatable, and expressly designed to reduce the burden of disease. By approaching funding decisions in a more analytical fashion, it may be possible to improve their ultimate outcomes while reducing unintended consequences.
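    To illustrate the kind of computation an “efficient frontier” involves, the sketch below traces a mean-variance frontier over long-only allocations across 7 groups. The returns matrix is synthetic; the paper instead estimates returns from historical NIH budgets and subsequent changes in U.S. years of life lost, so this is only a minimal illustration of the optimization mechanics, not the authors' method.

```python
# Minimal sketch: trace a mean-variance "efficient frontier" over hypothetical
# long-only funding allocations across 7 institute groups. The annual "returns"
# below are synthetic stand-ins for the paper's estimated reductions in years of
# life lost; only the mechanics of the frontier are illustrated.
import numpy as np

rng = np.random.default_rng(0)
n_groups = 7
returns = rng.normal(0.03, 0.02, size=(43, n_groups))      # 43 hypothetical years, 1965-2007
mu, cov = returns.mean(axis=0), np.cov(returns, rowvar=False)

# Sample many random allocations summing to 1, then keep the best return in each risk bin.
w = rng.dirichlet(np.ones(n_groups), size=100_000)
ret = w @ mu
vol = np.sqrt(np.einsum("ij,jk,ik->i", w, cov, w))

edges = np.linspace(vol.min(), vol.max(), 30)
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (vol >= lo) & (vol < hi)
    if in_bin.any():
        print(f"risk {(lo + hi) / 2:.4f} -> best expected return {ret[in_bin].max():.4f}")
```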

    The emergence of modern statistics in agricultural science: Analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919–1933

    During the twentieth century, statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper investigates this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher’s methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians’ tools and expertise into the station research programme. Fisher’s statistical methods did not remain confined within agricultural research, and by the end of the 1950s they had become established in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines that adopted them.
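    To make the method concrete, the sketch below runs a one-way analysis of variance of the kind Fisher introduced for field trials, on invented plot yields for three treatments; the numbers and treatment names are purely illustrative and do not come from Rothamsted data.

```python
# Minimal sketch of a one-way analysis of variance on hypothetical plot yields
# from a fertiliser trial (all values invented for illustration).
from scipy.stats import f_oneway

control = [4.1, 3.8, 4.4, 4.0]   # plot yields, e.g. tonnes per hectare
manure  = [4.9, 5.2, 4.7, 5.0]
nitrate = [5.4, 5.1, 5.6, 5.3]

f_stat, p_value = f_oneway(control, manure, nitrate)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # tests whether the treatment means differ
```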

    Scintillation light detection in the 6-m drift-length ProtoDUNE Dual Phase liquid argon TPC

    DUNE is a dual-site experiment for long-baseline neutrino oscillation studies, neutrino astrophysics and nucleon decay searches. ProtoDUNE Dual Phase (DP) is a 6 × 6 × 6 m³ liquid argon time projection chamber (LArTPC) that recorded cosmic-muon data at the CERN Neutrino Platform in 2019-2020 as a prototype of the DUNE Far Detector. Charged particles propagating through the LArTPC produce ionization and scintillation light. The scintillation light signal in these detectors can provide the trigger for non-beam events. In addition, it adds precise timing capabilities and improves the calorimetry measurements. In ProtoDUNE-DP, scintillation and electroluminescence light produced by cosmic muons in the LArTPC is collected by photomultiplier tubes placed up to 7 m away from the ionizing track. In this paper, the ProtoDUNE-DP photon detection system performance is evaluated with a particular focus on the different wavelength shifters, such as PEN and TPB, and the use of Xe-doped LAr, considering its future use in giant LArTPCs. The scintillation light production and propagation processes are analyzed and a comparison of simulation to data is performed, improving understanding of the liquid argon properties.
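    As background to the light-signal discussion above, liquid-argon scintillation is commonly described by a fast and a slow emission component. The sketch below evaluates such a two-exponential time profile; the decay times and fast-light fraction are approximate literature values chosen for illustration, not measurements from this paper.

```python
# Minimal sketch of the standard two-component liquid-argon scintillation time
# profile (fast "singlet" plus slow "triplet" emission). The time constants and
# the fast-light fraction below are assumed approximate values, not results
# from this paper.
import numpy as np

TAU_FAST, TAU_SLOW = 6e-9, 1.5e-6   # decay times in seconds (approximate)
F_FAST = 0.3                        # assumed fraction of light in the fast component

def fraction_emitted_by(t):
    """Fraction of scintillation photons emitted within time t after the energy deposit."""
    return (F_FAST * (1 - np.exp(-t / TAU_FAST))
            + (1 - F_FAST) * (1 - np.exp(-t / TAU_SLOW)))

for t in (100e-9, 1e-6, 5e-6):
    print(f"within {t * 1e9:7.0f} ns: {fraction_emitted_by(t):.2f}")
```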

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate updated guidelines for monitoring autophagy in different organisms on a regular basis. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    Design, construction and operation of the ProtoDUNE-SP Liquid Argon TPC

    The ProtoDUNE-SP detector is a single-phase liquid argon time projection chamber (LArTPC) that was constructed and operated in the CERN North Area at the end of the H4 beamline. This detector is a prototype for the first far detector module of the Deep Underground Neutrino Experiment (DUNE), which will be constructed at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, U.S.A. The ProtoDUNE-SP detector incorporates full-size components as designed for DUNE and has an active volume of 7 × 6 × 7.2 m³. The H4 beam delivers incident particles with well-measured momenta and high-purity particle identification. ProtoDUNE-SP's successful operation between 2018 and 2020 demonstrates the effectiveness of the single-phase far detector design. This paper describes the design, construction, assembly and operation of the detector components.

    Searching for solar KDAR with DUNE


    Low exposure long-baseline neutrino oscillation sensitivity of the DUNE experiment

    The Deep Underground Neutrino Experiment (DUNE) will produce world-leading neutrino oscillation measurements over the lifetime of the experiment. In this work, we explore DUNE's sensitivity to observe charge-parity violation (CPV) in the neutrino sector, and to resolve the mass ordering, for exposures of up to 100 kiloton-megawatt-years (kt-MW-yr). The analysis includes detailed uncertainties on the flux prediction, the neutrino interaction model, and detector effects. We demonstrate that DUNE will be able to unambiguously resolve the neutrino mass ordering at a 3σ (5σ) level with a 66 (100) kt-MW-yr far detector exposure, and has the ability to make strong statements at significantly shorter exposures depending on the true value of other oscillation parameters. We also show that DUNE has the potential to make a robust measurement of CPV at a 3σ level with a 100 kt-MW-yr exposure for the maximally CP-violating values δ_CP = ±π/2. Additionally, the dependence of DUNE's sensitivity on the exposure taken in neutrino-enhanced and antineutrino-enhanced running is discussed. An equal fraction of exposure taken in each beam mode is found to be close to optimal when considered over the entire space of interest.
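    For a sense of scale, exposure in kt-MW-yr is the product of fiducial detector mass, beam power, and running time, so the thresholds quoted above translate into calendar years once a mass and beam power are assumed. The sketch below uses illustrative staging values that are not taken from this paper.

```python
# Minimal sketch of the exposure bookkeeping: exposure = mass x beam power x time.
# The 20 kt fiducial mass and 1.2 MW beam power are assumed values for illustration only.
def years_needed(exposure_kt_mw_yr, fiducial_mass_kt, beam_power_mw):
    """Calendar years of running needed to reach a given exposure."""
    return exposure_kt_mw_yr / (fiducial_mass_kt * beam_power_mw)

for exposure in (66, 100):   # kt-MW-yr values quoted above for mass ordering at 3 and 5 sigma
    print(f"{exposure} kt-MW-yr at 20 kt, 1.2 MW -> {years_needed(exposure, 20, 1.2):.1f} yr")
```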