
    Trends in the use of disease-modifying therapies among reproductive-aged women with multiple sclerosis in the United States from 2010 to 2019

    Purpose: Multiple sclerosis (MS) is a chronic disease of the central nervous system that disproportionately affects women, with typical onset during reproductive age. Several disease-modifying therapies (DMTs) are FDA-approved to slow disease progression, but are not indicated for use during pregnancy. Our objective was to describe trends over time (2010–2019) in monthly point prevalence of DMT use among reproductive-aged women, overall and by generic name. Methods: This study used administrative claims data from the US during 2009–2019 to identify women aged 15–44 with MS and continuous insurance coverage for ≥12 months. DMTs were identified using prescription fills and procedural claims for alemtuzumab, daclizumab, dimethyl fumarate, fingolimod, glatiramer acetate, interferon beta, mitoxantrone, natalizumab, ocrelizumab, and teriflunomide. Monthly prevalent use was defined as ≥1 days' supply of a DMT in the month. Age- and region-standardized monthly prevalence was estimated nonparametrically. Results: Among 42 281 reproductive-aged women contributing 818 179 person-months, DMT use increased from a minimum monthly prevalence of 49.3% (February 2011) to a maximum of 58.7% (April 2019). In 2010, prevalence of injectable DMTs was 43.1% compared to 2.5% for oral DMTs; by 2014, however, oral DMTs (26.5%) surpassed injectable DMTs (23.7%) as the most common route of administration. In the most recent data available (December 2019), the most common DMTs were dimethyl fumarate, glatiramer acetate, and fingolimod. Conclusions: DMT use among reproductive-aged women has evolved rapidly during the past decade. Collaborative treatment decision making between women with MS and clinicians may help optimize MS care and improve DMT uptake during the reproductive years.
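    The monthly point-prevalence measure described above can be sketched in a few lines of Python. This is a toy illustration with hypothetical claims records and field names (`person_id`, `fill_date`, `days_supply`), not the study's actual code: a person counts as a prevalent user in any calendar month covered by at least one day's supply, and the denominator is the population enrolled that month.

```python
from datetime import date, timedelta
from collections import defaultdict

# Hypothetical claims: (person_id, fill_date, days_supply) for any DMT.
claims = [
    ("p1", date(2019, 1, 10), 30),
    ("p1", date(2019, 2, 12), 30),
    ("p2", date(2019, 1, 25), 90),
    ("p3", date(2019, 3, 5), 30),
]
# Months of continuous enrollment per person, as (year, month) pairs.
enrollment = {
    "p1": [(2019, m) for m in (1, 2, 3)],
    "p2": [(2019, m) for m in (1, 2, 3)],
    "p3": [(2019, m) for m in (1, 2, 3)],
}

def months_covered(fill, days):
    """Calendar months that receive >= 1 day's supply from this fill."""
    months, d = set(), fill
    for _ in range(days):
        months.add((d.year, d.month))
        d += timedelta(days=1)
    return months

users = defaultdict(set)   # (year, month) -> persons with >= 1 day's supply
for pid, fill, days in claims:
    for ym in months_covered(fill, days):
        users[ym].add(pid)

denom = defaultdict(set)   # (year, month) -> enrolled persons
for pid, months in enrollment.items():
    for ym in months:
        denom[ym].add(pid)

# Monthly point prevalence = prevalent users / enrolled population.
prevalence = {ym: len(users[ym] & denom[ym]) / len(denom[ym])
              for ym in sorted(denom)}
```

(The published estimates were additionally age- and region-standardized; that step is omitted here.)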

    The Timing, the Treatment, the Question: Comparison of Epidemiologic Approaches to Minimize Immortal Time Bias in Real-World Data Using a Surgical Oncology Example

    Background: Studies evaluating the effects of cancer treatments are prone to immortal time bias that, if unaddressed, can make treatments appear more beneficial than they are. Methods: To demonstrate the impact of immortal time bias, we compared results across several analytic approaches (dichotomous exposure, dichotomous exposure excluding immortal time, time-varying exposure, landmark analysis, clone-censor-weight method), using surgical resection among women with metastatic breast cancer as an example. All adult women diagnosed with incident metastatic breast cancer from 2013–2016 in the National Cancer Database were included. To quantify immortal time bias, we also conducted a simulation study in which the “true” relationship between surgical resection and mortality was known. Results: 24,329 women (median age 61, IQR 51–71) were included, and 24% underwent surgical resection. The largest association between resection and mortality was observed when using a dichotomized exposure [HR, 0.54; 95% confidence interval (CI), 0.51–0.57], followed by dichotomous with exclusion of immortal time (HR, 0.62; 95% CI, 0.59–0.65). Results from the time-varying exposure, landmark, and clone-censor-weight method analyses were closer to the null (HR, 0.67–0.84). The plasmode simulation found that the time-varying exposure, landmark, and clone-censor-weight models all produced unbiased HRs (bias, -0.003 to 0.016). Both the standard dichotomous exposure (HR, 0.84; bias, -0.177) and dichotomous with exclusion of immortal time (HR, 0.93; bias, -0.074) produced meaningfully biased estimates. Conclusions: Researchers should use time-varying exposures with a treatment assessment window or the clone-censor-weight method when immortal time is present. Impact: Using methods that appropriately account for immortal time will improve evidence and decision making from research using real-world data.
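    The mechanics of immortal time bias are easy to reproduce in a small null simulation (a stdlib sketch in the spirit of the paper's plasmode study, not its actual code): treatment has no effect on mortality, yet a patient can only be classified as treated by surviving to the treatment time, so an "ever-treated" analysis credits that guaranteed-alive time to the treated group and makes treatment look protective.

```python
import random

random.seed(7)

N = 20000
FOLLOWUP = 5.0  # years

deaths_naive = {"treated": 0, "untreated": 0}
time_naive = {"treated": 0.0, "untreated": 0.0}
deaths_tv = {"treated": 0, "untreated": 0}
time_tv = {"treated": 0.0, "untreated": 0.0}

for _ in range(N):
    t_death = random.expovariate(0.3)   # same hazard for everyone (null effect)
    t_rx = random.expovariate(1.0)      # scheduled treatment time
    t_end = min(t_death, FOLLOWUP)
    died = t_death <= FOLLOWUP
    treated = t_rx < t_end              # treated only if still alive/in study

    # Dichotomous "ever-treated" analysis: all person-time, including the
    # immortal time before treatment, goes to the final exposure group.
    g = "treated" if treated else "untreated"
    deaths_naive[g] += died
    time_naive[g] += t_end

    # Time-varying analysis: person-time before treatment is unexposed.
    if treated:
        time_tv["untreated"] += t_rx
        time_tv["treated"] += t_end - t_rx
        deaths_tv["treated"] += died
    else:
        time_tv["untreated"] += t_end
        deaths_tv["untreated"] += died

def rate_ratio(d, t):
    return (d["treated"] / t["treated"]) / (d["untreated"] / t["untreated"])

naive_rr = rate_ratio(deaths_naive, time_naive)  # spuriously protective
tv_rr = rate_ratio(deaths_tv, time_tv)           # approximately 1 (the truth)
```

The time-varying rate ratio lands near the true null, while the dichotomous version is well below 1 even though treatment does nothing, which is the same pattern the paper reports for the HR estimates.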

    Missing data approaches in longitudinal studies of aging: A case example using the National Health and Aging Trends Study

    Purpose: Missing data are a key methodological consideration in longitudinal studies of aging. We described missing data challenges and potential methodological solutions using a case example describing five-year frailty state transitions in a cohort of older adults. Methods: We used longitudinal data from the National Health and Aging Trends Study, a nationally representative cohort of Medicare beneficiaries. We assessed the five components of the Fried frailty phenotype and classified frailty based on the number of components present (robust: 0, prefrail: 1–2, frail: 3–5). One-, two-, and five-year frailty state transitions were defined as movements between frailty states or death. Missing frailty components were imputed using hot deck imputation. Inverse probability weights were used to account for potentially informative loss to follow-up. We conducted scenario analyses to test a range of assumptions related to missing data. Results: Missing data were common for frailty components measured using physical assessments (walking speed, grip strength). At five years, 36% of individuals were lost to follow-up, differentially with respect to baseline frailty status. Assumptions about missing data mechanisms affected inference regarding individuals improving or worsening in frailty. Conclusions: Missing data and loss to follow-up are common in longitudinal studies of aging. Robust epidemiologic methods can improve the rigor and interpretability of aging-related research.
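    Hot deck imputation of a missing frailty component can be sketched with the standard library (hypothetical records and a single sex stratum for brevity; NHATS analyses use more refined donor classes): a missing value is filled from a randomly chosen donor in the same stratum, after which the Fried count (robust: 0, prefrail: 1–2, frail: 3–5) is computed as usual.

```python
import random

random.seed(1)

# Hypothetical records: five binary Fried deficits (1 = present, None = missing).
people = [
    {"id": 1, "sex": "F", "weakness": 1, "slowness": None, "exhaustion": 0,
     "low_activity": 0, "weight_loss": 0},
    {"id": 2, "sex": "F", "weakness": 0, "slowness": 1, "exhaustion": 1,
     "low_activity": 1, "weight_loss": 0},
    {"id": 3, "sex": "M", "weakness": None, "slowness": 0, "exhaustion": 0,
     "low_activity": 0, "weight_loss": 0},
    {"id": 4, "sex": "M", "weakness": 1, "slowness": 1, "exhaustion": 0,
     "low_activity": 1, "weight_loss": 1},
]
COMPONENTS = ["weakness", "slowness", "exhaustion", "low_activity", "weight_loss"]

def hot_deck(people, var, strata="sex"):
    """Fill missing values of `var` from a random donor in the same stratum."""
    donors = {}
    for p in people:
        if p[var] is not None:
            donors.setdefault(p[strata], []).append(p[var])
    for p in people:
        if p[var] is None:
            p[var] = random.choice(donors[p[strata]])

for var in COMPONENTS:
    hot_deck(people, var)

def frailty_state(p):
    n = sum(p[c] for c in COMPONENTS)
    return "robust" if n == 0 else ("prefrail" if n <= 2 else "frail")

states = {p["id"]: frailty_state(p) for p in people}
```

Inverse probability weighting for loss to follow-up (the study's other tool) would then upweight retained participants by 1 / P(retained | baseline covariates); that modeling step is omitted here.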

    Volume I. Introduction to DUNE

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.

    Deep Underground Neutrino Experiment (DUNE), far detector technical design report, volume III: DUNE far detector technical coordination

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. Volume III of this TDR describes how the activities required to design, construct, fabricate, install, and commission the DUNE far detector modules are organized and managed. This volume details the organizational structures that will carry out and/or oversee the planned far detector activities safely, successfully, on time, and on budget. It presents overviews of the facilities, supporting infrastructure, and detectors for context, and it outlines the project-related functions and methodologies used by the DUNE technical coordination organization, focusing on the areas of integration engineering, technical reviews, quality assurance and control, and safety oversight. Because of its more advanced stage of development, functional examples presented in this volume focus primarily on the single-phase (SP) detector module.

    Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE) Conceptual Design Report Volume 2: The Physics Program for DUNE at LBNF

    The Physics Program for the Deep Underground Neutrino Experiment (DUNE) at the Fermilab Long-Baseline Neutrino Facility (LBNF) is described.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speedup of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
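    The data-parallel structure that makes this simulation GPU-friendly can be illustrated with a toy model in plain Python (an illustration of the pattern only; the toy current falloff below is invented, and the actual simulator's microphysics and Numba/CUDA kernels are far more involved): each pixel's induced current depends only on its own coordinates, so the loop body is exactly what becomes a CUDA kernel with one thread per pixel.

```python
# Toy model: "current" induced on each pixel by a point charge drifting past.
def induced_current(pixel_x, pixel_y, charge_x, charge_y, q):
    # Arbitrary smooth falloff with distance; stands in for the real
    # charge-induction microphysics, which is much more detailed.
    r2 = (pixel_x - charge_x) ** 2 + (pixel_y - charge_y) ** 2
    return q / (1.0 + r2)

# Serial map over a 32 x 32 pixel plane (0.4 cm pitch, hypothetical numbers).
# On a GPU this loop disappears: with Numba, the body becomes an @cuda.jit
# kernel and each thread handles one pixel index from cuda.grid(1).
pixels = [(ix * 0.4, iy * 0.4) for ix in range(32) for iy in range(32)]
currents = [induced_current(px, py, 6.0, 6.0, 1.0) for px, py in pixels]

peak = max(currents)  # largest response is at the pixel under the charge
```

Because each output element is independent, the computation scales to thousands of pixels with no synchronization, which is why the GPU version reported above reaches a four-order-of-magnitude speedup.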

    The Single-Phase ProtoDUNE Technical Design Report

    ProtoDUNE-SP is the single-phase DUNE Far Detector prototype that is under construction and will be operated at the CERN Neutrino Platform (NP) starting in 2018. ProtoDUNE-SP, a crucial part of the DUNE effort towards the construction of the first DUNE 10-kt fiducial mass far detector module (17 kt total LAr mass), is a significant experiment in its own right. With a total liquid argon (LAr) mass of 0.77 kt, it represents the largest monolithic single-phase LArTPC detector to be built to date. Its technical design is given in this report.

    Long-baseline neutrino oscillation physics potential of the DUNE experiment: DUNE Collaboration

    The sensitivity of the Deep Underground Neutrino Experiment (DUNE) to neutrino oscillation is determined, based on a full simulation, reconstruction, and event selection of the far detector and a full simulation and parameterized analysis of the near detector. Detailed uncertainties due to the flux prediction, neutrino interaction model, and detector effects are included. DUNE will resolve the neutrino mass ordering to a precision of 5σ, for all δCP values, after 2 years of running with the nominal detector design and beam configuration. It has the potential to observe charge-parity violation in the neutrino sector to a precision of 3σ (5σ) after an exposure of 5 (10) years, for 50% of all δCP values. It will also make precise measurements of other parameters governing long-baseline neutrino oscillation, and after an exposure of 15 years will achieve a sensitivity to sin²2θ₁₃ similar to that of current reactor experiments.
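    For context on the sin²2θ₁₃ sensitivity quoted above, the textbook two-flavor vacuum approximation (not DUNE's full three-flavor analysis, which also includes matter effects and the δCP phase) relates an oscillation probability to the mixing angle, the mass-squared splitting Δm², the baseline L, and the neutrino energy E:

```latex
% Two-flavor vacuum oscillation probability (standard approximation):
P(\nu_\alpha \to \nu_\beta)
  = \sin^2 2\theta \,
    \sin^2\!\left( \frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;
                         L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right)
```

The amplitude term sin²2θ is the quantity whose 13-sector value, sin²2θ₁₃, reactor experiments currently constrain best.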