Missing data in randomised controlled trials: a practical guide
Objective: Missing data are ubiquitous in clinical trials, yet recent research suggests many statisticians
and investigators appear uncertain how to handle them. The objective is to set out a principled
approach for handling missing data in clinical trials, and provide examples and code to facilitate
its adoption.
Data sources: An asthma trial from GlaxoSmithKline, an asthma trial from AstraZeneca, and a
dental pain trial from GlaxoSmithKline.
Methods: Part I gives a non-technical review of how missing data are typically handled in clinical
trials, and the issues raised by missing data. When faced with missing data, we show no analysis
can avoid making additional untestable assumptions. This leads to a proposal for a systematic,
principled approach for handling missing data in clinical trials, which in turn informs a critique of
current Committee for Proprietary Medicinal Products guidelines for missing data, together with
many of the ad-hoc statistical methods currently employed.
Part II shows how primary analyses in a range of settings can be carried out under the so-called
missing at random assumption. This key assumption has a central role in underpinning the most
important classes of primary analysis, such as those based on likelihood. However, its validity cannot
be assessed from the data under analysis, so in Part III, two main approaches are developed and
illustrated for the assessment of the sensitivity of the primary analyses to this assumption.
Results: The literature review revealed that missing data are often ignored or poorly handled in the
analysis. Current guidelines and frequently used ad-hoc statistical methods are shown to be flawed.
A principled, yet practical, alternative approach is developed, which examples show leads to inferences
with greater validity. SAS code is given to facilitate its direct application.
Conclusions: From the design stage onwards, a principled approach to handling missing data should
be adopted. Such an approach follows well-defined and accepted statistical arguments, using models
and assumptions that are transparent, and hence open to criticism and debate. This monograph
outlines how this principled approach can be practically, and directly, applied to the majority of
trials with longitudinal follow-up.
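The monograph's own code is in SAS; purely as a language-neutral sketch of the kind of MAR-consistent primary analysis it advocates, the Python fragment below imputes a partially observed trial outcome by chained equations and pools the analysis across imputations with Rubin's rules. The data frame and column names (outcome, treat, baseline) are hypothetical, and the simulated data stand in for a real trial.

# Minimal sketch of an MAR-consistent analysis via multiple imputation.
# Hypothetical columns:
#   outcome  - continuous trial outcome (partially missing)
#   treat    - randomised treatment indicator (0/1)
#   baseline - baseline measurement of the outcome
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),
    "baseline": rng.normal(size=n),
})
df["outcome"] = 0.5 * df["treat"] + 0.8 * df["baseline"] + rng.normal(size=n)
df.loc[rng.random(n) < 0.2, "outcome"] = np.nan   # ~20% missing outcomes

imp = mice.MICEData(df)                           # chained-equations imputer
model = mice.MICE("outcome ~ treat + baseline", sm.OLS, imp)
fit = model.fit(n_burnin=10, n_imputations=20)    # pooled by Rubin's rules
print(fit.summary())

Under MAR, the likelihood-based and multiple-imputation routes sketched here target the same estimand; the Part III sensitivity analyses described above then probe departures from MAR.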
Bayesian models for weighted data with missing values: a bootstrap approach
Many data sets, especially from surveys, are made available to users with weights. Where the derivation of such weights is known, this information can often be incorporated in the user's substantive model (model of interest). When the derivation is unknown, the established procedure is to carry out a weighted analysis. However, with non-trivial proportions of missing data this is inefficient and may be biased when data are not missing at random. Bayesian approaches provide a natural framework for the imputation of missing data, but it is unclear how to handle the weights. We propose a weighted bootstrap Markov chain Monte Carlo algorithm for estimation and inference. A simulation study shows that it has good inferential properties. We illustrate its utility with an analysis of data from the Millennium Cohort Study.
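The algorithm itself embeds this resampling step inside a full MCMC for imputation and analysis; the toy Python sketch below shows only the weighted-bootstrap idea for a posterior mean, resampling units with probability proportional to their weights and pooling draws across replicates. The simulated data, the flat-prior conjugate draw, and all names are illustrative assumptions, not the authors' implementation.

# Toy illustration of the weighted-bootstrap idea: resample units with
# probability proportional to their survey weights, draw from a simple
# posterior on each replicate, and pool the draws.
import numpy as np

rng = np.random.default_rng(7)
n = 500
y = rng.normal(1.0, 2.0, n)               # outcome
w = rng.uniform(0.5, 3.0, n)              # survey weights (known up to scale)
p = w / w.sum()

draws = []
for _ in range(1000):                     # bootstrap replicates
    idx = rng.choice(n, size=n, p=p)      # weighted resample of units
    yb = y[idx]
    # Conjugate-normal posterior draw for the mean with a flat prior and
    # a known-variance approximation (a deliberate simplification):
    draws.append(rng.normal(yb.mean(), yb.std(ddof=1) / np.sqrt(n)))

draws = np.asarray(draws)
print(f"posterior mean ~ {draws.mean():.3f}, 95% interval "
      f"({np.quantile(draws, 0.025):.3f}, {np.quantile(draws, 0.975):.3f})")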
Reference-based sensitivity analysis for longitudinal trials with protocol deviation via multiple imputation
Randomised controlled trials provide essential evidence for the evaluation of new and existing medical treatments. Unfortunately, the statistical analysis is often complicated by the occurrence of protocol deviations, which mean we cannot always measure the intended outcomes for individuals who deviate, resulting in a missing data problem. In such settings, however one approaches the analysis, an untestable assumption about the distribution of the unobserved data must be made. To understand how far the results depend on these assumptions, the primary analysis should be supplemented by a range of sensitivity analyses, which explore how the conclusions vary over a range of different credible assumptions for the missing data. In this article we describe a new Stata command, mimix, that can be used to perform reference-based sensitivity analyses for randomised controlled trials with longitudinal quantitative outcome data, using the approach proposed by Carpenter, Roger, and Kenward (2013). Under this approach, we make qualitative assumptions about how individuals' missing outcomes relate to those observed in relevant groups in the trial, based on plausible clinical scenarios. Statistical analysis then proceeds using the method of multiple imputation.
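mimix implements the full reference-based machinery (multivariate normal fits by arm, several reference-based assumptions, Rubin's-rules pooling). As a toy, language-neutral illustration of the "jump to reference" idea for a single follow-up visit, the Python sketch below imputes a deviator's missing outcome from a model fitted to the reference arm; the column names are hypothetical and, for brevity, the fitted parameters are not redrawn between imputations as a proper multiple imputation would require.

# Toy "jump to reference" imputation for a single follow-up visit.
# Deviators with a missing outcome in EITHER arm are imputed from a
# model fitted to the REFERENCE (control) arm only, given baseline.
# Hypothetical columns: arm (0 = reference), baseline, outcome.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({"arm": rng.integers(0, 2, n),
                   "baseline": rng.normal(size=n)})
df["outcome"] = 1.0 + 0.7 * df["baseline"] + 0.5 * df["arm"] + rng.normal(size=n)
df.loc[rng.random(n) < 0.25, "outcome"] = np.nan   # protocol deviators

ref = df[(df["arm"] == 0) & df["outcome"].notna()]
fit = sm.OLS(ref["outcome"], sm.add_constant(ref[["baseline"]])).fit()
sigma = np.sqrt(fit.scale)                         # residual SD, reference arm

miss = df["outcome"].isna()
Xmiss = sm.add_constant(df.loc[miss, ["baseline"]])
for m in range(20):                                # 20 imputed data sets
    dfi = df.copy()
    dfi.loc[miss, "outcome"] = (fit.predict(Xmiss)
                                + rng.normal(scale=sigma, size=miss.sum()))
    # ...analyse each dfi and pool the results with Rubin's rules.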
Information anchored reference‐based sensitivity analysis for truncated normal data with application to survival analysis
The primary analysis of time-to-event data typically makes the censoring at random assumption, that is, that, conditional on covariates in the model, the distribution of event times is the same whether they are observed or unobserved. In such cases, we need to explore the robustness of inference to more pragmatic assumptions about patients post-censoring in sensitivity analyses. Reference-based multiple imputation, which avoids analysts explicitly specifying the parameters of the unobserved data distribution, has proved attractive to researchers. Building on results for longitudinal continuous data, we show that inference using a Tobit regression imputation model for reference-based sensitivity analysis with right-censored log-normal data is information anchored, meaning the proportion of information lost due to missing data under the primary analysis is held constant across the sensitivity analyses. We illustrate our theoretical results using simulation and a clinical trial case study.
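As a rough illustration of the Tobit ingredient (not of the paper's information-anchoring results), the Python sketch below fits a log-normal model to right-censored times by censored maximum likelihood and then imputes each censored subject's log event time from the correspondingly truncated normal. All names and the simulated data are assumptions; a reference-based variant would take the imputation parameters from the reference arm instead.

# Sketch: Tobit-style imputation for right-censored log-normal times.
# Fit (mu, sigma) by censored maximum likelihood, then impute each
# censored subject's log event time from the fitted normal truncated
# below at that subject's log censoring time.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(11)
n = 400
logt = rng.normal(2.0, 0.8, n)            # true log event times
logc = rng.normal(2.3, 0.5, n)            # log censoring times
obs = logt <= logc
y = np.where(obs, logt, logc)             # observed data

def negloglik(par):
    mu, logsig = par
    sig = np.exp(logsig)
    ll_obs = stats.norm.logpdf(y[obs], mu, sig)    # observed events
    ll_cen = stats.norm.logsf(y[~obs], mu, sig)    # right-censored
    return -(ll_obs.sum() + ll_cen.sum())

res = optimize.minimize(negloglik, x0=[np.mean(y), 0.0])
mu_hat, sig_hat = res.x[0], np.exp(res.x[1])

# Impute censored log-times from N(mu, sig) truncated below at logc:
a = (y[~obs] - mu_hat) / sig_hat          # standardised lower bounds
imp = stats.truncnorm.rvs(a, np.inf, loc=mu_hat, scale=sig_hat,
                          random_state=rng)
print(f"fitted mu={mu_hat:.2f}, sigma={sig_hat:.2f}; "
      f"imputed {imp.size} censored times")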
Assessing metabolism and injury in acute human traumatic brain injury with magnetic resonance spectroscopy: current and future applications
Traumatic brain injury triggers a series of complex pathophysiological processes. These include abnormalities in brain energy metabolism, consequent to reduced tissue pO₂ arising from ischaemia or abnormal tissue oxygen diffusion, or due to a failure of mitochondrial function. In-vivo magnetic resonance spectroscopy (MRS) allows non-invasive interrogation of brain tissue metabolism in patients with acute brain injury. Nuclei with ‘spin’, e.g. ¹H, ³¹P and ¹³C, are detectable using MRS and are found in metabolites at various stages of energy metabolism, possessing unique signatures due to their chemical shift or spin-spin interactions (J-coupling).
The most commonly used clinical MRS technique, ¹H MRS, exploits the abundance of hydrogen atoms within molecules in brain tissue. Spectra acquired at longer echo times include signals from N-acetylaspartate, creatine and choline. N-acetylaspartate, a marker of neuronal mitochondrial activity related to ATP, is reported to be lower in patients with TBI than in healthy controls, and the ratio of N-acetylaspartate/creatine at early time points may correlate with clinical outcome. ¹H MRS acquired at shorter echo times produces a more complex spectrum, allowing detection of a wider range of metabolites.
³¹P MRS detects high-energy phosphate species, the end-products of cellular respiration: adenosine triphosphate (ATP) and phosphocreatine (PCr). ATP is the principal form of chemical energy in living organisms, and PCr is regarded as a readily mobilised reserve for its replenishment during periods of high utilisation. The ratios of high-energy phosphates are thought to represent the balance between energy generation, reserve and use in the brain. Additionally, the chemical shift difference between inorganic phosphate (Pi) and PCr enables calculation of intracellular pH.
¹³C MRS detects the ¹³C isotope of carbon in brain metabolites. As the natural abundance of ¹³C is low (1.1%), ¹³C MRS is typically performed following administration of ¹³C-enriched substrates, which permits tracking of the metabolic fate of the infused ¹³C in the brain over time, and calculation of metabolic rates in a range of biochemical pathways, including glycolysis, the tricarboxylic acid (TCA) cycle, and glutamate-glutamine cycling. The advent of new hyperpolarisation techniques to transiently boost signal in ¹³C-enriched in-vivo MRS studies shows promise in this field, and further developments are expected.
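The intracellular pH calculation mentioned above has a Henderson-Hasselbalch form; the Python sketch below uses one commonly quoted ³¹P calibration (pKa 6.77, acid and base chemical-shift limits 3.29 and 5.68 ppm), which is not taken from this paper, with delta the Pi chemical shift in ppm relative to PCr.

# Intracellular pH from the Pi-PCr chemical-shift difference (delta, ppm).
# Henderson-Hasselbalch form; the calibration constants (pKa 6.77, shift
# limits 3.29 and 5.68 ppm) are one commonly quoted 31P-MRS calibration,
# not values taken from this paper.
import math

def phosphorus_ph(delta_ppm: float) -> float:
    return 6.77 + math.log10((delta_ppm - 3.29) / (5.68 - delta_ppm))

print(round(phosphorus_ph(4.83), 2))  # ~7.03 for a typical resting shift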
The funding bodies acknowledged on the paper are:
PJAH is supported by a National Institute for Health Research (NIHR) Research Professorship, an Academy of Medical Sciences/Health Foundation Senior Surgical Scientist Fellowship and the NIHR Biomedical Research Centre, Cambridge.
PJAH and KLHC are supported by the NIHR Biomedical Research Centre, Cambridge.
MGS is supported by PJAH's NIHR Research Professorship.
AS is funded by the NIHR via an award to the Cambridge NIHR/Wellcome Trust Clinical Research Facility.
Beetle (Coleoptera: Scirtidae) Facilitation of Larval Mosquito Growth in Tree Hole Habitats is Linked to Multitrophic Microbial Interactions
Container-breeding mosquitoes, such as Aedes triseriatus, ingest biofilms and filter water column microorganisms directly to obtain the bulk of their nutrition. Scirtid beetles often co-occur with A. triseriatus and may facilitate the production of mosquito adults under low-resource conditions. Using molecular genetic techniques and quantitative assays, we observed changes in the dynamics and composition of bacterial and fungal communities present on leaf detritus and in the water column when scirtid beetles co-occur with A. triseriatus. Data from terminal restriction fragment length polymorphism analysis indicated scirtid presence alters the structure of fungal communities in the water column but not leaf-associated fungal communities. Similar changes in leaf and water bacterial communities occurred in response to mosquito presence. In addition, we observed increased processing of leaf detritus, higher leaf-associated enzyme activity, higher bacterial productivity, and higher leaf-associated fungal biomass when scirtid beetles were present. Such shifts suggest beetle feeding facilitates mosquito production indirectly through the microbial community rather than directly through an increase in available fine particulate organic matter.
Axion Protection from Flavor
The QCD axion fails to solve the strong CP problem unless all explicit PQ-violating, Planck-suppressed, dimension n < 10 operators are forbidden or have exponentially small coefficients. We show that all theories with a QCD axion contain an irreducible source of explicit PQ violation which is proportional to the determinant of the Yukawa interaction matrix of colored fermions. Generically, this contribution is of low operator dimension and will drastically destabilize the axion potential, so its suppression is a necessary condition for solving the strong CP problem. We propose a mechanism whereby the PQ symmetry is kept exact up to n = 12 with the help of the very same flavor symmetries which generate the hierarchical quark masses and mixings of the SM. This "axion flavor protection" is straightforwardly realized in theories which employ radiative fermion mass generation and grand unification. A universal feature of this construction is that the heavy quark Yukawa couplings are generated at the PQ breaking scale.
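For orientation, the n < 10 threshold quoted above traces back to the standard axion-quality estimate, sketched here in generic notation (f_a, M_Pl, Lambda_QCD and the coupling g are placeholders, not values taken from the paper): a Planck-suppressed PQ-violating operator of dimension n tilts the QCD-generated potential and displaces the axion minimum by

\[
V_{\text{PQV}} \sim \frac{|g|\,\Phi^{n}}{M_{\text{Pl}}^{\,n-4}} + \text{h.c.},
\qquad \Phi = \frac{f_a}{\sqrt{2}}\,e^{i a/f_a},
\qquad \Delta\theta \sim \frac{|g|\, f_a^{\,n}}{M_{\text{Pl}}^{\,n-4}\,\Lambda_{\text{QCD}}^{4}},
\]

so demanding \(\Delta\theta \lesssim 10^{-10}\) (the neutron electric-dipole-moment bound) with \(|g| \sim 1\) and \(f_a \sim 10^{11}\,\text{GeV}\) forces roughly n ≥ 10, the origin of the dimension count in the abstract.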