71 research outputs found

    Risk factors for methamphetamine use in youth: a systematic review

Abstract

Background
Methamphetamine (MA) is a potent and readily available stimulant. Its effects are similar to those of cocaine, but the drug is associated with greater acute and chronic toxicity. The objective of this systematic review was to identify and synthesize the literature on risk factors associated with MA use among youth.

More than 40 electronic databases, websites, and key journals/meeting abstracts were searched. We included studies that compared children and adolescents (≤ 18 years) who used MA with those who did not. One reviewer extracted the data and a second checked for completeness and accuracy. For discrete risk factors, odds ratios (OR) were calculated and, when appropriate, a pooled OR with 95% confidence intervals (95% CI) was calculated. For continuous risk factors, mean differences and 95% CI were calculated and, when appropriate, a weighted mean difference (WMD) and 95% CI was calculated. Results were presented separately by comparison group: low-risk children (no previous drug abuse) and high-risk children (reported previous drug abuse or were recruited from a juvenile detention center).

Results
Twelve studies were included. Among low-risk youth, factors associated with MA use were: history of heroin/opiate use (OR = 29.3; 95% CI: 9.8–87.8), family history of drug use (OR = 4.7; 95% CI: 2.8–7.9), risky sexual behavior (OR = 2.79; 95% CI: 2.25–3.46), and some psychiatric disorders. History of alcohol use and smoking were also significantly associated with MA use. Among high-risk youth, factors associated with MA use were: family history of crime (OR = 2.0; 95% CI: 1.2–3.3), family history of drug use (OR = 4.7; 95% CI: 2.8–7.9), family history of alcohol abuse (OR = 3.2; 95% CI: 1.8–5.6), and psychiatric treatment (OR = 6.8; 95% CI: 3.6–12.9). Female sex was also significantly associated with MA use.

Conclusion
Among low-risk youth, a history of engaging in a variety of risky behaviors was significantly associated with MA use. A history of a psychiatric disorder was a risk factor for MA use among both low- and high-risk youth. Family environment was also associated with MA use. Many of the included studies were cross-sectional, making it difficult to assess causation. Future research should use prospective study designs so that temporal relationships between risk factors and MA use can be established.
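The pooling described above is standard inverse-variance meta-analysis. As a rough illustration (not code from the review, and with made-up study counts), a fixed-effect pooled OR and its 95% CI can be computed from per-study 2x2 tables like this:

```python
# Hypothetical sketch of fixed-effect (inverse-variance) pooling of odds ratios;
# the study counts below are invented for illustration only.
import math

# (exposed users, exposed non-users, unexposed users, unexposed non-users) per study
studies = [(20, 80, 10, 190), (15, 45, 25, 215)]

log_ors, weights = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))   # log odds ratio for this study
    var = 1 / a + 1 / b + 1 / c + 1 / d    # variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)                # inverse-variance weight

pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
print(f"pooled OR = {math.exp(pooled):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```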

    Accounting for Ecosystem Alteration Doubles Estimates of Conservation Risk in the Conterminous United States

Previous national and global conservation assessments have relied on habitat conversion data to quantify conservation risk. However, in addition to habitat conversion to crop production or urban uses, ecosystem alteration (e.g., from logging, conversion to plantations, biological invasion, or fire suppression) is a large source of conservation risk. We add data quantifying ecosystem alteration on unconverted lands to arrive at a more accurate depiction of conservation risk for the conterminous United States. We quantify ecosystem alteration using a recent national assessment based on remote sensing of current vegetation compared with modeled reference natural vegetation conditions. Highly altered (but not converted) ecosystems comprise 23% of the conterminous United States, such that the number of critically endangered ecoregions in the United States is 156% higher than when calculated using habitat conversion data alone. Increased attention to natural resource management will be essential to address widespread ecosystem alteration and reduce conservation risk.
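As a rough, hypothetical illustration of the arithmetic behind the headline result (the thresholds and numbers below are invented, not the paper's), adding the altered-but-unconverted fraction to the converted fraction can push an ecoregion into a higher risk category:

```python
# Illustrative sketch only: how counting ecosystem alteration alongside
# conversion can change an ecoregion's risk category. Thresholds and the
# example ecoregion are hypothetical, not taken from the assessment.
def risk_category(lost_fraction):
    if lost_fraction >= 0.90:
        return "critically endangered"
    if lost_fraction >= 0.70:
        return "endangered"
    if lost_fraction >= 0.50:
        return "vulnerable"
    return "lower risk"

ecoregion = {"converted": 0.55, "altered_unconverted": 0.38}

conversion_only = risk_category(ecoregion["converted"])
with_alteration = risk_category(ecoregion["converted"] + ecoregion["altered_unconverted"])
print(conversion_only, "->", with_alteration)  # vulnerable -> critically endangered
```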

    A deep learning approach to photo–identification demonstrates high performance on two dozen cetacean species

We thank the countless individuals who collected and/or processed the nearly 85,000 images used in this study and those who assisted, particularly those who sorted these images from the millions that did not end up in the catalogues. Additionally, we thank the other Kaggle competitors who helped develop the ideas, models and data used here, particularly those who released their datasets to the public. The graduate assistantship for Philip T. Patton was funded by the NOAA Fisheries QUEST Fellowship. This paper represents HIMB and SOEST contribution numbers 1932 and 11679, respectively. The technical support and advanced computing resources from University of Hawaii Information Technology Services—Cyberinfrastructure, funded in part by the National Science Foundation CC* awards # 2201428 and # 2232862, are gratefully acknowledged. Every photo–identification image was collected under permits according to relevant national guidelines, regulation and legislation. Peer reviewed. Publisher PDF.

    Proceedings of Patient Reported Outcome Measure’s (PROMs) Conference Oxford 2017: Advances in Patient Reported Outcomes Research

A33-Effects of Out-of-Pocket (OOP) Payments and Financial Distress on Quality of Life (QoL) of People with Parkinson’s (PwP) and their Carers

    Identification and reconstruction of low-energy electrons in the ProtoDUNE-SP detector

Measurements of electrons from ν_e interactions are crucial for the Deep Underground Neutrino Experiment (DUNE) neutrino oscillation program, as well as for searches for physics beyond the standard model, supernova neutrino detection, and solar neutrino measurements. This article describes the selection and reconstruction of low-energy (Michel) electrons in the ProtoDUNE-SP detector. ProtoDUNE-SP is one of the prototypes for the DUNE far detector, built and operated at CERN as a charged-particle test beam experiment. A sample of low-energy electrons produced by the decay of cosmic-ray muons is selected with a purity of 95%. This sample is used to calibrate the low-energy electron energy scale with two techniques. The first uses calibration constants derived from measured and simulated cosmic-ray muon events. The second exploits the theoretically well-understood Michel electron energy spectrum to convert reconstructed charge to electron energy. In addition, the effects of the detector response, including readout-electronics threshold effects, on the low-energy electron energy scale and its resolution are quantified. Finally, the relation between the theoretical and reconstructed low-energy electron energy spectra is derived and the energy resolution is characterized. The low-energy electron selection presented here accounts for about 75% of the total electron deposited energy. After the lost energy is added back using a Monte Carlo simulation, the energy resolution improves from about 40% to 25% at 50 MeV. These results are used to validate the expected capabilities of the DUNE far detector to reconstruct low-energy electrons. Comment: 19 pages, 10 figures
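A minimal sketch of the second calibration idea (not the ProtoDUNE-SP analysis code; the charge values and the crude endpoint estimator are assumptions) uses the known ~52.8 MeV endpoint of the Michel spectrum to convert reconstructed charge into electron energy:

```python
# Hypothetical sketch: derive a charge-to-energy calibration constant from the
# Michel-spectrum endpoint. Reconstructed charges are faked; the percentile
# endpoint estimate stands in for a proper fit of the spectrum shape.
import numpy as np

reconstructed_charge = np.random.gamma(shape=4.0, scale=2.0e5, size=5000)  # fake charge sample

michel_endpoint_mev = 52.8                                    # kinematic endpoint of the Michel spectrum
charge_endpoint = np.percentile(reconstructed_charge, 99.5)   # crude estimate of the charge-spectrum endpoint

mev_per_charge = michel_endpoint_mev / charge_endpoint        # calibration constant
electron_energy_mev = reconstructed_charge * mev_per_charge   # calibrated electron energies
print(f"calibration constant: {mev_per_charge:.3e} MeV per unit charge")
```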

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova if one should occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the ν_e component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(E_ν) for charged-current ν_e absorption on argon. In the context of a simulated extraction of supernova ν_e spectral parameters from a toy analysis, we investigate the impact of σ(E_ν) modeling uncertainties on DUNE's supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(E_ν) must be substantially reduced before the ν_e flux parameters can be extracted reliably: in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(E_ν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(E_ν). A direct measurement of low-energy ν_e-argon scattering would be invaluable for improving the theoretical precision to the needed level. Comment: 25 pages, 21 figures
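A toy sketch of the effect described above (not the DUNE analysis; the spectral shape, cross-section model, and numbers are purely illustrative) shows how a mis-scaled σ(E_ν) is absorbed directly into the fitted luminosity, since the expected event count scales as luminosity times the flux-averaged cross section:

```python
# Toy illustration of cross-section scale bias propagating into the extracted
# neutrino luminosity. Everything here is an assumption for illustration.
import numpy as np

def pinched_spectrum(E, mean_E=11.0, alpha=2.3):
    """'Alpha-fit' spectral shape often used for supernova nu_e fluxes (normalized)."""
    f = (E / mean_E) ** alpha * np.exp(-(alpha + 1.0) * E / mean_E)
    return f / np.trapz(f, E)

E = np.linspace(1.0, 60.0, 600)              # neutrino energy grid [MeV]
true_sigma = 1e-44 * (E / 10.0) ** 2         # toy nu_e-Ar cross-section shape
true_luminosity = 1.0                        # arbitrary units

expected_events = true_luminosity * np.trapz(pinched_spectrum(E) * true_sigma, E)

# Analyst assumes a cross section whose overall scale is 20% too high.
assumed_sigma = 1.2 * true_sigma
fitted_luminosity = expected_events / np.trapz(pinched_spectrum(E) * assumed_sigma, E)

print(f"luminosity bias: {100 * (fitted_luminosity / true_luminosity - 1):.1f}%")  # about -16.7%
```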

Fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin with gemtuzumab ozogamicin improves event-free survival in younger patients with newly diagnosed AML and overall survival in patients with NPM1 and FLT3 mutations

Purpose
To determine the optimal induction chemotherapy regimen for younger adults with newly diagnosed AML without known adverse-risk cytogenetics.

Patients and Methods
One thousand thirty-three patients were randomly assigned to intensified (fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin [FLAG-Ida]) or standard (daunorubicin and Ara-C [DA]) induction chemotherapy, with one or two doses of gemtuzumab ozogamicin (GO). The primary end point was overall survival (OS).

Results
There was no difference in remission rate after two courses between FLAG-Ida + GO and DA + GO (complete remission [CR] + CR with incomplete hematologic recovery, 93% v 91%) or in day 60 mortality (4.3% v 4.6%). There was no difference in OS (66% v 63%; P = .41); however, the risk of relapse was lower with FLAG-Ida + GO (24% v 41%; P < .001) and 3-year event-free survival was higher (57% v 45%; P < .001). In patients with an NPM1 mutation (30%), 3-year OS was significantly higher with FLAG-Ida + GO (82% v 64%; P = .005). NPM1 measurable residual disease (MRD) clearance was also greater, with 88% versus 77% becoming MRD-negative in peripheral blood after cycle 2 (P = .02). Three-year OS was also higher in patients with a FLT3 mutation (64% v 54%; P = .047). Fewer transplants were performed in patients receiving FLAG-Ida + GO (238 v 278; P = .02). There was no difference in outcome according to the number of GO doses, although NPM1 MRD clearance was higher with two doses in the DA arm. Patients with core binding factor AML treated with DA and one dose of GO had a 3-year OS of 96%, with no survival benefit from FLAG-Ida + GO.

Conclusion
Overall, FLAG-Ida + GO significantly reduced relapse without improving OS. However, exploratory analyses show that patients with NPM1 and FLT3 mutations had substantial improvements in OS. By contrast, in patients with core binding factor AML, outcomes were excellent with DA + GO, with no benefit from FLAG-Ida.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
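A minimal sketch of the general pattern described above (not the actual DUNE near-detector simulator; the induced-current model, array shapes, and numbers are invented) compiles a Python function into a CUDA kernel with Numba, with one GPU thread per pixel-time element:

```python
# Hypothetical sketch of a Numba CUDA kernel for a pixelated-readout current
# simulation: each pixel has one toy charge deposit that induces a
# Gaussian-shaped current pulse centred at its arrival time t0.
import math
import numpy as np
from numba import cuda

@cuda.jit
def induced_current_kernel(charge, t0, pulse_width, current):
    # current has shape (n_pixels, n_ticks); one thread per (pixel, tick) element
    ipix, itick = cuda.grid(2)
    if ipix < current.shape[0] and itick < current.shape[1]:
        dt = itick - t0[ipix]
        current[ipix, itick] = charge[ipix] * math.exp(-0.5 * (dt / pulse_width) ** 2)

n_pixels, n_ticks = 1000, 2000
charge = cuda.to_device(np.random.rand(n_pixels).astype(np.float32))
t0 = cuda.to_device(np.random.uniform(0, n_ticks, n_pixels).astype(np.float32))
current = cuda.device_array((n_pixels, n_ticks), dtype=np.float32)

threads = (16, 16)
blocks = ((n_pixels + threads[0] - 1) // threads[0], (n_ticks + threads[1] - 1) // threads[1])
induced_current_kernel[blocks, threads](charge, t0, np.float32(50.0), current)
result = current.copy_to_host()  # induced current per pixel and time tick
```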

    Information and Blind Spots: Satellite-imaging technology and contending arguments for nuclear disarmament, arms control, and modernization in U.S. policymaking

In the United States in the late 1950s and early 1960s, arms control gained traction as the preferred option for dealing with Soviet nuclear advances. The development of the first imaging satellites is often credited with enabling nuclear arms control, on the conventional wisdom that more information enabled verification and thus facilitated restraint and cooperation. This research shows that while satellites did provide information to verify arms control and contributed to its rise, both the information and the blind spots of the high-resolution satellites fielded by the United States to monitor Soviet nuclear weapon developments were often leveraged to pursue broader and more expansive U.S. modernization efforts, including the retention of nascent weapon programs whose banning or cancellation was under debate. New information and remaining blind spots from satellites were thus co-constitutive of arms control limits (rather than bans), but also of qualitative arms build-ups. As a result, arms control, initially framed by U.S. political leaders as a step toward disarmament, became increasingly interwoven with modernization, and this combination was often positioned as the policy alternative to a potential ban. The advancement and use of high-resolution satellites thus helped shape U.S. nuclear policy by encouraging arms control while creating new opportunities to pursue nuclear superiority and further diminishing disarmament as a policy choice.