    The arts therapist in public: The dichotomy of clinical and performative improvising

    This paper presents a video of a performance at 'Concurrent♯2' in Edinburgh, 2017. It is followed by artist statements made in response to watching the piece. The authors set the work and its development in context. https://doi.org/10.15845/voices.v17i3.927

    Estimation of treatment policy estimands for continuous outcomes using off-treatment sequential multiple imputation

    The estimands framework outlined in ICH E9 (R1) describes the components needed to precisely define the effects to be estimated in clinical trials, including how post-baseline "intercurrent" events (IEs) are to be handled. In late-stage clinical trials, it is common to handle intercurrent events such as treatment discontinuation using the treatment policy strategy and to target the treatment effect on all outcomes regardless of treatment discontinuation. For continuous repeated measures, this type of effect is often estimated from all observed data before and after discontinuation, using either a mixed model for repeated measures (MMRM) or multiple imputation (MI) to handle any missing data. In basic form, both of these estimation methods ignore treatment discontinuation in the analysis and may therefore be biased if patient outcomes after treatment discontinuation differ from those of patients still assigned to treatment, and if missing data are more common among patients who have discontinued. We therefore propose and evaluate a set of MI models that can accommodate differences between outcomes before and after treatment discontinuation. The models are evaluated in the context of planning a phase 3 trial for a respiratory disease. We show that analyses ignoring treatment discontinuation can introduce substantial bias and can sometimes underestimate variability. We also show that some of the proposed MI models can successfully correct the bias but inevitably increase variance. We conclude that some of the proposed MI models are preferable to the traditional analysis that ignores treatment discontinuation, but the precise choice of MI model will likely depend on the trial design, the disease of interest, and the amount of observed and missing data following treatment discontinuation.
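
    The core idea, imputing post-discontinuation values from models fitted only to off-treatment observations, can be sketched in a few lines. The following Python fragment is a hypothetical, much-simplified illustration: the function name, the single-lag linear imputation model, and the plug-in parameter estimates are assumptions made here for illustration, not the MI models proposed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)

    def off_treatment_sequential_mi(y, off_trt, n_imp=50):
        """y: (subjects, visits) outcomes, NaN where missing.
        off_trt: boolean (subjects, visits), True at visits after
        treatment discontinuation.
        Missing off-treatment values are imputed visit by visit from a
        regression fitted only to subjects observed off treatment, so
        imputations reflect off-treatment rather than on-treatment
        behaviour."""
        n_sub, n_vis = y.shape
        completed = []
        for _ in range(n_imp):
            yc = y.copy()
            for j in range(1, n_vis):
                fit = off_trt[:, j] & ~np.isnan(y[:, j]) & ~np.isnan(yc[:, j - 1])
                miss = off_trt[:, j] & np.isnan(yc[:, j]) & ~np.isnan(yc[:, j - 1])
                if fit.sum() < 3 or miss.sum() == 0:
                    continue
                # simple linear regression of visit j on visit j - 1;
                # a full MI would also draw beta and sigma from their posterior
                X = np.column_stack([np.ones(fit.sum()), yc[fit, j - 1]])
                beta, rss, *_ = np.linalg.lstsq(X, y[fit, j], rcond=None)
                sigma = np.sqrt(rss[0] / (fit.sum() - 2)) if rss.size else 1.0
                yc[miss, j] = (beta[0] + beta[1] * yc[miss, j - 1]
                               + rng.normal(0.0, sigma, miss.sum()))
            completed.append(yc)
        return completed
    ```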

    Cenozoic evolution of the eastern Black Sea: a test of depth-dependent stretching models

    Subsidence analysis of the eastern Black Sea basin suggests that the stratigraphy of this deep, extensional basin can be explained by a predominantly pure-shear stretching history. A strain-rate inversion method that assumes pure-shear extension obtains good fits between observed and predicted stratigraphy. A relatively pure-shear strain distribution is also obtained when a strain-rate inversion algorithm is applied that allows extension to vary with depth without assuming its existence or form. The timing of opening of the eastern Black Sea, which occupied a back-arc position during the closure of the Tethys Ocean, has also been a subject of intense debate; competing theories called for basin opening during the Jurassic, Cretaceous or Paleocene/Eocene. Our work suggests that extension likely continued into the early Cenozoic, in agreement with stratigraphic relationships onshore and with estimates for the timing of arc magmatism. Further basin deepening also appears to have occurred in the last 20 Myr. This anomalous subsidence event is focused in the northern part of the basin and reaches its peak at 15–10 Ma. We suggest that this comparatively localized shortening is associated with the northward movement of the Arabian plate. We also explore the effects of paleowater depth and elastic thickness on the results. These parameters are controversial, particularly for deep-water basins and margins, but their estimation is a necessary step in any analysis of the tectonic subsidence record stored in stratigraphy.
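
    For orientation, the reference curve that such subsidence analyses test against is the classical pure-shear (McKenzie, 1978) model, in which post-rift thermal subsidence decays exponentially with a fixed time constant. The Python sketch below uses standard textbook parameter values (a maximum water-loaded subsidence of roughly 3.2 km and a thermal time constant of roughly 63 Myr) rather than the paper's calibration; the strain-rate inversion itself is considerably more involved.

    ```python
    import numpy as np

    def postrift_subsidence_km(t_myr, beta, e0_km=3.2, tau_myr=63.0):
        """Water-loaded post-rift thermal subsidence at time t_myr after
        instantaneous pure-shear stretching by a factor beta
        (McKenzie-style model with textbook constants)."""
        return (e0_km * (beta / np.pi) * np.sin(np.pi / beta)
                * (1.0 - np.exp(-t_myr / tau_myr)))

    # Subsidence history for a stretching factor of 2, sampled every 10 Myr:
    print(postrift_subsidence_km(np.linspace(0.0, 60.0, 7), beta=2.0))
    ```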

    A novel equivalence probability weighted power prior for using historical control data in an adaptive clinical trial design: a comparison to standard methods

    A standard two‐arm randomised controlled trial usually compares an intervention to a control treatment, with equal numbers of patients randomised to each treatment arm, and only data from within the current trial are used to assess the treatment effect. Historical data are used when designing new trials and have recently been considered for use in the analysis when the required number of patients under a standard trial design cannot be achieved. Incorporating historical control data could lead to more efficient trials, reducing the number of controls required in the current study when the historical and current control data agree. However, when the data are inconsistent, there is potential for biased treatment effect estimates, inflated type I error and reduced power. We introduce two novel approaches for binary data that discount historical data based on their agreement with the current trial controls: an equivalence approach and an approach based on tail area probabilities. An adaptive design is used in which the allocation ratio is adapted at the interim analysis, randomising fewer patients to control when there is agreement. The historical data are down‐weighted in the analysis using the power prior approach with a fixed power. We compare the operating characteristics of the proposed design to historical data methods in the literature: the modified power prior, the commensurate prior, and the robust mixture prior. The equivalence probability weight approach is intuitive and its operating characteristics can be calculated exactly. Furthermore, the equivalence bounds can be chosen to control the maximum possible inflation in type I error.
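
    The fixed-power power prior used here has a closed form for binary data, which is why the operating characteristics can be computed exactly. The Python sketch below shows the conjugate beta-binomial version; the function name and numbers are illustrative, and the weight a0, taken as a plain argument here, would in the paper's design come from the equivalence-probability or tail-area agreement measure, which is not reproduced.

    ```python
    from scipy import stats

    def power_prior_posterior(y, n, y0, n0, a0, a=1.0, b=1.0):
        """Posterior for the control response rate under a Beta(a, b) prior,
        current controls y/n, and historical controls y0/n0 whose likelihood
        is raised to the fixed power a0 in [0, 1]."""
        return stats.beta(a + y + a0 * y0, b + (n - y) + a0 * (n0 - y0))

    # Full borrowing versus heavy discounting of 30/100 historical controls,
    # given 12/40 controls in the current trial:
    print(power_prior_posterior(12, 40, 30, 100, a0=1.0).mean())
    print(power_prior_posterior(12, 40, 30, 100, a0=0.1).mean())
    ```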

    Emulation and History Matching using the hmer Package

    Modelling complex real-world situations such as infectious diseases, geological phenomena, and biological processes can present a dilemma: the computer model (referred to as a simulator) needs to be complex enough to capture the dynamics of the system, but each increase in complexity increases the evaluation time of such a simulation, making it difficult to obtain an informative description of the parameter choices that would be consistent with observed reality. While methods for identifying acceptable matches to real-world observations exist, for example optimisation or Markov chain Monte Carlo methods, they may result in non-robust inferences or may be infeasible for computationally intensive simulators. The techniques of emulation and history matching can make such determinations feasible, efficiently identifying regions of parameter space that produce acceptable matches to data while also providing valuable information about the simulator's structure, but the mathematical considerations required to perform emulation can present a barrier for makers and users of such simulators compared to other methods. The hmer package provides an accessible framework for using history matching and emulation on simulator data, leveraging the computational efficiency of the approach while enabling users to easily match to, visualise, and robustly predict from their complex simulators.
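
    At the heart of history matching is the implausibility measure, which compares an emulator's prediction with the observation, normalised by all the relevant variances. The Python sketch below shows only this criterion; hmer itself is an R package with far more machinery (multi-wave emulation, diagnostics, visualisation), and the toy numbers here are assumptions for illustration.

    ```python
    import numpy as np

    def implausibility(em_mean, em_var, z, obs_var, disc_var=0.0):
        """I(x) = |z - E[f(x)]| / sqrt(emulator var + observation var
        + model discrepancy var); points with I(x) above a cutoff
        (commonly 3) are ruled out of the next wave."""
        return np.abs(z - em_mean) / np.sqrt(em_var + obs_var + disc_var)

    # Toy example: pretend an emulator returned these means and variances
    # over a 1-D parameter grid.
    x = np.linspace(0.0, 1.0, 101)
    em_mean = np.sin(6.0 * x)            # emulator expectation
    em_var = 0.02 * np.ones_like(x)      # emulator uncertainty
    z, obs_var = 0.3, 0.01               # observation and its variance

    keep = implausibility(em_mean, em_var, z, obs_var) < 3.0
    print(x[keep])                       # non-implausible region retained
    ```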

    Automated methods to test connectedness and quantify indirectness of evidence in network meta‐analysis

    Network meta-analysis compares multiple treatments from studies that form a connected network of evidence. However, for complex networks it is not easy to see whether the network is connected. We use simple techniques from graph theory to test the connectedness of evidence networks in network meta-analysis. The method is to build the adjacency matrix for a network, with rows and columns corresponding to the treatments in the network, entries equal to one or zero depending on whether the treatments have been compared or not, and zeros along the diagonal. Manipulation of this matrix gives the indirect connection matrix, whose entries determine whether two treatments can be compared, directly or indirectly. We also describe the distance matrix, which gives the minimum number of steps in the network required to compare a pair of treatments. This is a useful assessment of an indirect comparison because each additional step requires further assumptions of homogeneity in, for example, the design and target populations of the included trials. If there are no loops in the network, the distance is an exact measure of the degree of assumptions needed; with loops, it is approximate. We illustrate our methods using several constructed examples and give R code for computation. We have also implemented the techniques in the Stata package "network." The methods provide a fast way to ensure comparisons are only made between connected treatments and to assess the degree of indirectness of a comparison.
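
    The matrix manipulations involved are short enough to sketch directly. The paper provides R code and a Stata implementation; the Python version below is an illustrative equivalent, using the fact that (I + A)^(n-1) has a positive (i, j) entry exactly when treatment j is reachable from treatment i.

    ```python
    import numpy as np
    from scipy.sparse.csgraph import shortest_path

    # Adjacency matrix: entry [i, j] = 1 if treatments i and j have been
    # directly compared in at least one study; zeros on the diagonal.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 0]])          # treatment 3 is disconnected

    n = A.shape[0]
    # Indirect connection matrix: positive entries mark pairs that can be
    # compared directly or indirectly.
    R = np.linalg.matrix_power(np.eye(n, dtype=int) + A, n - 1)
    print(np.all(R > 0))                  # False: network is not connected

    # Distance matrix: minimum number of steps needed for each comparison;
    # inf marks pairs that cannot be compared at all.
    print(shortest_path(A, unweighted=True))
    ```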

    Causes and Consequences of Diachronous V-Shaped Ridges in the North Atlantic Ocean

    In the North Atlantic Ocean, the geometry of diachronous V‐shaped features that straddle the Reykjanes Ridge is often attributed to thermal pulses which advect away from the center of the Iceland plume. Recently, two alternative hypotheses have been proposed: rift propagation and buoyant mantle upwelling. Here we evaluate these different proposals using basin‐wide geophysical and geochemical observations. The centerpiece of our analysis is a pair of seismic reflection profiles oriented parallel to flow lines that span the North Atlantic Ocean. V‐shaped ridges and troughs are mapped on both Neogene and Paleogene oceanic crust, enabling a detailed chronology of activity to be established for the last 50 million years. Estimates of the cumulative horizontal displacement across normal faults help to discriminate between brittle and magmatic modes of plate separation, suggesting that crustal architecture is sensitive to the changing planform of the plume. Water‐loaded residual depth measurements are used to estimate crustal thickness and to infer mantle potential temperature, which varies by ±25°C on timescales of 3–8 Ma. This variation is consistent with the range of temperatures inferred from geochemical modeling of dredged basaltic rocks along the ridge axis itself, from changes in Neogene deep-water circulation, and from the regional record of episodic Cenozoic magmatism. We conclude that radial propagation of transient thermal anomalies within an asthenospheric channel that is 150 ± 50 km thick best accounts for the available geophysical and geochemical observations.
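
    To make the residual-depth step concrete: under Airy isostasy, a water-loaded residual depth anomaly maps to an excess crustal thickness through a simple density ratio. The Python sketch below uses generic layer densities rather than the paper's values, and the onward inference from crustal thickness to mantle potential temperature, which needs a melting parameterisation, is not reproduced.

    ```python
    def crustal_thickness_anomaly_km(residual_depth_km,
                                     rho_m=3.30, rho_c=2.86, rho_w=1.03):
        """Airy-isostatic excess crustal thickness implied by a water-loaded
        residual depth anomaly (positive = anomalously shallow seafloor).
        Densities in g/cm^3 are generic, not the paper's calibration."""
        return residual_depth_km * (rho_m - rho_w) / (rho_m - rho_c)

    # A 0.5 km shallowing implies roughly 2.6 km of extra crust:
    print(crustal_thickness_anomaly_km(0.5))
    ```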
