
    A New Perspective on the Nonextremal Enhancon Solution

    We discuss the nonextremal generalisation of the enhancon mechanism. We find that the nonextremal shell branch solution does not violate the Weak Energy Condition when the nonextremality parameter is small, in contrast to earlier discussions of this subject. We show that this physical shell branch solution fills the mass gap between the extremal enhancon solution and the nonextremal horizon branch solution. Comment: 10 pages, 3 figures, reference added

    Aspects of D-Branes as BPS monopoles

    We investigate some of the properties of D-brane configurations which behave as BPS monopoles. The two D-brane configurations we will study are the enhançon and D-strings attached to D3-branes. We will start by investigating D3-branes wrapped on a K3 manifold, which are known as enhançons. They look like regions of enhanced gauge symmetry in the directions transverse to the branes, and therefore behave as BPS monopoles. We calculate the metric on moduli space for n enhançons, following the methods used by Ferrell and Eardley for black holes. We expect the result to be the higher-dimensional generalisation of the Taub-NUT metric, which is the metric on moduli space for n BPS monopoles. Next we will study D-strings attached to D3-branes; the ends of the D-strings behave as BPS monopoles of the world-volume gauge theory living on the D3-branes. In fact, the D-string/D3-brane system is a physical realisation of the ADHMN construction for BPS monopoles. We aim to test this correspondence by calculating the energy radiated during D-string scattering, working with the non-Abelian Born-Infeld action for D-strings. We will then compare our result to the equivalent monopole calculation of Manton and Samols.

    Employing a latent variable framework to improve efficiency in composite endpoint analysis.

    Composite endpoints that combine multiple outcomes on different scales are common in clinical trials, particularly in chronic conditions. In many of these cases, patients will have to cross a predefined responder threshold in each of the outcomes to be classed as a responder overall. One instance of this occurs in systemic lupus erythematosus, where the responder endpoint combines two continuous, one ordinal and one binary measure. The overall binary responder endpoint is typically analysed using logistic regression, resulting in a substantial loss of information. We propose a latent variable model for the systemic lupus erythematosus endpoint, which assumes that the discrete outcomes are manifestations of latent continuous measures, allowing the components of the composite to be modelled jointly. We perform a simulation study and find that the method offers large efficiency gains over the standard analysis, the magnitude of which is highly dependent on the components driving response. Bias is introduced when joint normality assumptions are not satisfied, which we correct for using a bootstrap procedure. The method is applied to the Phase IIb MUSE trial in patients with moderate to severe systemic lupus erythematosus. We show that it estimates the treatment effect 2.5 times more precisely, offering a 60% reduction in required sample size.
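    The intuition behind the efficiency gain — that dichotomising a continuous measure at a responder threshold discards information — can be illustrated with a minimal simulation. This is not the paper's model: a single normal outcome and an arbitrary threshold stand in for the full four-component endpoint, and a simple plug-in normal estimator stands in for the joint latent variable fit.

    ```python
    import math
    import random
    import statistics

    random.seed(7)

    def normal_sf(x, mu, sigma):
        """P(N(mu, sigma^2) > x), i.e. the response probability on the latent scale."""
        return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2)))

    def compare_estimators(n=200, mu=0.3, threshold=0.0, reps=1000):
        """Compare two estimators of the responder rate P(Y > threshold):
        (a) the raw binary responder proportion (dichotomised analysis), and
        (b) a model-based estimate using the continuous measure's mean and SD."""
        binary_est, latent_est = [], []
        for _ in range(reps):
            y = [random.gauss(mu, 1.0) for _ in range(n)]        # latent continuous outcome
            binary_est.append(sum(v > threshold for v in y) / n)  # information discarded
            m, s = statistics.fmean(y), statistics.stdev(y)
            latent_est.append(normal_sf(threshold, m, s))         # latent-scale estimate
        return statistics.variance(binary_est), statistics.variance(latent_est)

    v_binary, v_latent = compare_estimators()
    print(v_latent < v_binary)  # the latent-scale estimator is more efficient
    ```

    Both estimators target the same responder rate, but the latent-scale estimator has visibly smaller variance across simulation replicates, mirroring (in toy form) the precision gain reported for the joint model.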

    Use of Modeling and Simulation in the Design and Conduct of Pediatric Clinical Trials and the Optimization of Individualized Dosing Regimens

    Mathematical models of drug action and disease progression can inform pediatric pharmacotherapy. In this tutorial, we explore the key issues that differentiate pediatric from adult pharmacokinetic (PK)/pharmacodynamic (PD) studies, describe methods to calculate the number of participants to be enrolled and the optimal times at which blood samples should be collected, and discuss therapeutic drug monitoring methods for individualizing pharmacotherapy. The development of pediatric-specific drug dosing dashboards is also highlighted, with an emphasis on clinical relevance and ease of use.
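    As a rough illustration of the kind of model such PK/PD work builds on — not a method specific to this tutorial — here is a standard one-compartment IV bolus model together with the conventional allometric weight scaling (exponent 0.75) often used to adjust adult clearance values for pediatric patients; parameter values are purely illustrative:

    ```python
    import math

    def concentration(dose, volume, clearance, t):
        """One-compartment IV bolus model: C(t) = (dose/V) * exp(-(CL/V) * t)."""
        k = clearance / volume          # elimination rate constant
        return (dose / volume) * math.exp(-k * t)

    def scale_clearance(adult_cl, weight_kg, adult_weight=70.0):
        """Allometric scaling of clearance with the conventional 0.75 exponent."""
        return adult_cl * (weight_kg / adult_weight) ** 0.75

    # Illustrative values: 100 mg dose, V = 10 L, CL = 1 L/h
    c0 = concentration(100, 10, 1.0, 0)                  # initial concentration, 10 mg/L
    t_half = math.log(2) * 10 / 1.0                      # half-life = ln(2) * V / CL
    c_half = concentration(100, 10, 1.0, t_half)         # should be c0 / 2
    child_cl = scale_clearance(1.0, 20)                  # clearance scaled to a 20 kg child
    ```

    The half-life check (concentration halves after ln(2)·V/CL hours) is a quick way to sanity-test any implementation of this model.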

    The use of repeated blood pressure measures for cardiovascular risk prediction: a comparison of statistical models in the ARIC study.

    Many prediction models have been developed for the risk assessment and the prevention of cardiovascular disease in primary care. Recent efforts have focused on improving the accuracy of these prediction models by adding novel biomarkers to a common set of baseline risk predictors. Few have considered incorporating repeated measures of the common risk predictors. Through application to the Atherosclerosis Risk in Communities study and simulations, we compare models that use simple summary measures of the repeat information on systolic blood pressure, such as (i) baseline only; (ii) last observation carried forward; and (iii) cumulative mean, against more complex methods that model the repeat information using (iv) ordinary regression calibration; (v) risk-set regression calibration; and (vi) joint longitudinal and survival models. In comparison with the baseline-only model, we observed modest improvements in discrimination and calibration using the cumulative mean of systolic blood pressure, but little further improvement from any of the complex methods. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

    J.K.B. was supported by the Medical Research Council grant numbers G0902100 and MR/K014811/1. This work was funded by the UK Medical Research Council (G0800270), British Heart Foundation (SP/09/002), UK National Institute for Health Research Cambridge Biomedical Research Centre, European Research Council (268834) and European Commission Framework Programme 7 (HEALTH-F2-2012-279233). The ARIC study is carried out as a collaborative study supported by the National Heart, Lung, and Blood Institute contracts (HHSN268201100005C, HHSN268201100006C, HHSN268201100007C, HHSN268201100008C, HHSN268201100009C, HHSN268201100010C, HHSN268201100011C and HHSN268201100012C). This is the final version of the article. It first appeared from Wiley via https://doi.org/10.1002/sim.714
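    The simple summary measures compared in the abstract are easy to state concretely. As a minimal sketch (the blood pressure values below are illustrative, not ARIC data), the cumulative mean (iii) at each visit is just the running average of all measurements observed so far, while baseline-only (i) and last observation carried forward (ii) keep the first and most recent values respectively:

    ```python
    def cumulative_means(measurements):
        """Running (cumulative) mean of repeated measurements up to each visit."""
        out, total = [], 0.0
        for i, x in enumerate(measurements, start=1):
            total += x
            out.append(total / i)
        return out

    sbp = [120, 130, 125, 140]          # illustrative systolic BP readings (mmHg)
    baseline_only = sbp[0]              # summary (i): first measurement
    locf = sbp[-1]                      # summary (ii): last observation carried forward
    cum_mean = cumulative_means(sbp)    # summary (iii), updated at each visit
    print(cum_mean)  # [120.0, 125.0, 125.0, 128.75]
    ```

    In a time-updated survival analysis, the value of the chosen summary at each visit would enter the risk model as the current covariate value.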

    Dynamic Prediction of Survival in Cystic Fibrosis: A Landmarking Analysis Using UK Patient Registry Data.

    BACKGROUND: Cystic fibrosis (CF) is an inherited, chronic, progressive condition affecting around 10,000 individuals in the United Kingdom and over 70,000 worldwide. Survival in CF has improved considerably over recent decades, and it is important to provide up-to-date information on patient prognosis. METHODS: The UK Cystic Fibrosis Registry is a secure centralized database, which collects annual data on almost all CF patients in the United Kingdom. Data from 43,592 annual records from 2005 to 2015 on 6181 individuals were used to develop a dynamic survival prediction model that provides personalized estimates of survival probabilities given a patient's current health status using 16 predictors. We developed the model using the landmarking approach, giving predicted survival curves up to 10 years from 18 to 50 years of age. We compared several models using cross-validation. RESULTS: The final model has good discrimination (C-indexes: 0.873, 0.843, and 0.804 for 2-, 5-, and 10-year survival prediction) and low prediction error (Brier scores: 0.036, 0.076, and 0.133). It identifies individuals at low and high risk of short- and long-term mortality based on their current status. For patients 20 years of age during 2013-2015, for example, over 80% had a greater than 95% probability of 2-year survival and 40% were predicted to survive 10 years or more. CONCLUSIONS: Dynamic personalized prediction models can guide treatment decisions and provide personalized information for patients. Our application illustrates the utility of the landmarking approach for making the best use of longitudinal and survival data and shows how models can be defined and compared in terms of predictive performance. US NIH Grant K25 HL12595
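    A minimal sketch of the landmark-dataset construction the abstract's landmarking approach relies on (field names and values here are hypothetical, not the registry's schema): at each landmark age, only patients still at risk are selected, and their follow-up is administratively censored at the end of the prediction window.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Patient:
        entry_age: float    # age at registry entry
        event_age: float    # age at death or censoring
        died: bool

    def landmark_dataset(patients, landmark_age, horizon):
        """Build the risk set at a landmark age: keep patients at risk at that age
        and administratively censor follow-up at landmark_age + horizon."""
        rows = []
        for p in patients:
            if p.entry_age <= landmark_age < p.event_age:       # at risk at the landmark
                end = min(p.event_age, landmark_age + horizon)  # censor at the horizon
                event = p.died and p.event_age <= landmark_age + horizon
                rows.append((landmark_age, end - landmark_age, event))
        return rows

    pts = [Patient(18, 25, True), Patient(18, 40, False), Patient(30, 45, True)]
    rows = landmark_dataset(pts, landmark_age=20, horizon=10)
    print(rows)  # [(20, 5, True), (20, 10, False)]
    ```

    A survival model fitted to each landmark dataset then yields the dynamic, age-specific predictions described in the abstract; in the full method, current covariate values at the landmark are attached to each row.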

    Bayesian blockwise inference for joint models of longitudinal and multistate data with application to longitudinal multimorbidity analysis

    Multistate models provide a useful framework for modelling complex event history data in clinical settings and have recently been extended to the joint modelling framework to appropriately handle endogenous longitudinal covariates, such as repeatedly measured biomarkers, which are informative about health status and disease progression. However, the practical application of such joint models faces considerable computational challenges. Motivated by a longitudinal multimorbidity analysis of large-scale UK health records, we introduce novel Bayesian inference approaches for these models that are capable of handling complex multistate processes and large datasets with straightforward implementation. These approaches decompose the original estimation task into smaller inference blocks, leveraging parallel computing and facilitating flexible model specification and comparison. Using extensive simulation studies, we show that the proposed approaches achieve satisfactory estimation accuracy, with notable gains in computational efficiency compared to the standard Bayesian estimation strategy. We illustrate our approaches by analysing the coevolution of routinely measured systolic blood pressure and the progression of three important chronic conditions, using a large dataset from the Clinical Practice Research Datalink Aurum database. Our analysis reveals distinct and previously lesser-known association structures between systolic blood pressure and different disease transitions.
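    The blockwise idea can be sketched generically: each transition's submodel is treated as an independent inference block, so the blocks can be fitted in parallel. In the toy version below, the transition names and data are hypothetical, and a simple exponential-rate estimate (events per unit person-time) stands in for a full Bayesian fit of each block:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical per-transition blocks: (person-time at risk, number of events)
    blocks = {
        "healthy->hypertension": (500.0, 25),
        "hypertension->CHD": (300.0, 9),
        "CHD->death": (120.0, 12),
    }

    def fit_block(item):
        """Fit one inference block independently of the others; here a crude
        exponential-hazard estimate stands in for a Bayesian submodel fit."""
        name, (exposure, events) = item
        return name, events / exposure

    # Blocks share no state, so they can be dispatched to parallel workers
    with ThreadPoolExecutor() as pool:
        rates = dict(pool.map(fit_block, blocks.items()))

    print(rates["healthy->hypertension"])  # 0.05
    ```

    The point of the decomposition is exactly this independence: because no block needs another block's posterior to run, the estimation task scales with the number of available workers rather than with the full joint model's complexity.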