
    Statistical methods for automated drug susceptibility testing: Bayesian minimum inhibitory concentration prediction from growth curves

    Determination of the minimum inhibitory concentration (MIC) of a drug that prevents microbial growth is an important step for managing patients with infections. In this paper we present a novel probabilistic approach that accurately estimates MICs based on a panel of multiple curves reflecting features of bacterial growth. We develop a probabilistic model for determining whether a given dilution of an antimicrobial agent is the MIC, given features of the growth curves over time. Because of the potentially large collection of features, we utilize Bayesian model selection to narrow the collection of predictors to the most important variables. In addition to point estimates of MICs, we are able to provide posterior probabilities that each dilution is the MIC based on the observed growth curves. The methods are easily automated and have been incorporated into the Becton Dickinson PHOENIX automated susceptibility system, which rapidly and accurately classifies the resistance of a large number of microorganisms in clinical samples. Over seventy-five studies to date have shown this new method provides improved estimation of MICs over existing approaches. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/), http://dx.doi.org/10.1214/08-AOAS217, by the Institute of Mathematical Statistics (http://www.imstat.org).
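
    To make the posterior idea concrete, here is a minimal sketch of computing posterior probabilities that each dilution in a series is the MIC from a single growth-curve feature per well. The two-level Normal likelihood, the flat prior over cut points, and all names are illustrative assumptions, not the PHOENIX model or the authors' feature set.

```python
# Sketch: posterior probability that each dilution is the MIC, given one
# growth feature per well (e.g. final optical density). Illustrative only.
import numpy as np
from scipy.stats import norm

def mic_posterior(od, mu_growth=1.0, mu_inhibited=0.1, sigma=0.15):
    """od[j]: growth feature for dilution j, in ascending concentration.
    Candidate MIC index m means wells 0..m-1 grew and wells m.. did not."""
    K = len(od)
    log_post = np.empty(K + 1)
    for m in range(K + 1):                    # m = K: no dilution inhibits
        mu = np.where(np.arange(K) < m, mu_growth, mu_inhibited)
        log_post[m] = norm.logpdf(od, mu, sigma).sum()   # flat prior over m
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

od = np.array([0.95, 0.90, 0.85, 0.12, 0.08, 0.05])      # six dilutions
print(mic_posterior(od))       # mass concentrates at m = 3 (4th dilution)
```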

    Optimal design of dilution experiments under volume constraints

    The paper develops methods to construct a one-stage optimal design of dilution experiments under the total-available-volume constraint typical of biomedical applications. We consider various design criteria based on the Fisher information, in both Bayesian and non-Bayesian settings, and show that the optimal design is typically one-atomic, meaning that all the dilutions should be of the same size. The main tool is variational analysis of functions of a measure and the corresponding steepest-descent-type numerical methods. Our approach is generic in the sense that it allows for the inclusion of additional constraints and cost components, such as the cost of materials and of the experiment itself. Comment: 29 pages, 10 figures
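
    As a toy illustration of the one-atomic phenomenon, consider the classical Poisson dilution model in which a well of volume v stays sterile with probability exp(−λv). For n replicate wells of a common volume, the total Fisher information about λ can be maximized numerically under the volume cap; the optimum satisfies λv ≈ 1.594. This sketch assumes the Poisson model and a fixed number of wells, not the paper's general variational machinery.

```python
# Sketch: for n wells of common volume v, the Fisher information about lam
# is n * v^2 * exp(-lam*v) / (1 - exp(-lam*v)); maximize over v under the
# volume cap n*v <= V. Names and numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def total_info(v, lam, n):
    return n * v**2 * np.exp(-lam * v) / (1.0 - np.exp(-lam * v))

lam, n, V = 2.0, 20, 50.0                    # true rate, wells, total volume
res = minimize_scalar(lambda v: -total_info(v, lam, n),
                      bounds=(1e-6, V / n), method="bounded")
print(res.x, lam * res.x)                    # lam * v_opt ~ 1.594
```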

    Variable Selection and Model Averaging in Semiparametric Overdispersed Generalized Linear Models

    We express the mean and variance terms in a double exponential regression model as additive functions of the predictors and use Bayesian variable selection to determine which predictors enter the model, and whether they enter linearly or flexibly. When the variance term is null, we obtain a generalized additive model, which becomes a generalized linear model if the predictors enter the mean linearly. The model is estimated using Markov chain Monte Carlo simulation, and the methodology is illustrated using real and simulated data sets. Comment: 35 pages, 8 graphs
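
    A crude stand-in for the linear-versus-flexible decision (a BIC comparison rather than the paper's Bayesian MCMC) can be sketched as follows: fit a predictor linearly and again with an added spline basis, then compare fits. The knot choice, the truncated-power basis, and all names are illustrative assumptions.

```python
# Sketch: decide whether a predictor enters linearly or flexibly by
# comparing BIC of a linear fit and a spline-augmented fit. Illustrative.
import numpy as np

def bic(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, k = X.shape
    return n * np.log(resid @ resid / n) + k * np.log(n)

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = np.sin(2 * x) + rng.normal(0, 0.3, 300)      # truly nonlinear signal

X_lin = np.column_stack([np.ones_like(x), x])
knots = np.quantile(x, [0.25, 0.5, 0.75])
X_flex = np.column_stack([X_lin] + [np.maximum(x - t, 0) ** 3 for t in knots])
print("linear BIC:", bic(y, X_lin), "flexible BIC:", bic(y, X_flex))
```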

    The Effects of Statistical Multiplicity of Infection on Virus Quantification and Infectivity Assays

    Many biological assays are employed in virology to quantify parameters of interest. Two such classes of assays, virus quantification assays (VQA) and infectivity assays (IA), aim to estimate the number of viruses present in a solution and the ability of a viral strain to successfully infect a host cell, respectively. VQAs operate at extremely dilute concentrations, and results can be subject to stochastic variability in virus-cell interactions. At the other extreme, high viral particle concentrations are used in IAs, resulting in large numbers of viruses infecting each cell, enough for a measurable change in total transcription activity. Furthermore, host cells can be infected at any concentration regime by multiple particles, resulting in a statistical multiplicity of infection (SMOI) and yielding potentially significant variability in the assay signal and parameter estimates. We develop probabilistic models for SMOI at low and high viral particle concentration limits and apply them to the plaque (VQA), endpoint dilution (VQA), and luciferase reporter (IA) assays. A web-based tool implementing our models and analysis is also developed and presented. We test our proposed methods for inferring experimental parameters from data using numerical simulations and show improvement on existing procedures in all limits. Comment: 19 pages, 11 figures, 1 table
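
    The core of the SMOI reasoning at dilute concentrations is Poisson: if the multiplicity of infection is m, the number of virions reaching a cell is Poisson(m), so a cell escapes infection with probability exp(−m). A minimal sketch (illustrative names, not the authors' full assay models):

```python
# Sketch: Poisson multiplicity of infection. P(infected) = 1 - exp(-m);
# inverting the observed infected fraction gives an MOI estimate.
import numpy as np

def p_infected(moi):
    return 1.0 - np.exp(-moi)

def moi_from_fraction(frac_infected):
    return -np.log(1.0 - frac_infected)

rng = np.random.default_rng(1)
true_moi, n_cells = 0.7, 100_000
hits = rng.poisson(true_moi, n_cells)          # virions per cell
frac = np.mean(hits > 0)
print(frac, moi_from_fraction(frac))           # ~ p_infected(0.7), ~0.7
```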

    The Effect of IMF Lending on the Probability of Sovereign Debt Crises

    This paper explores empirically how the adoption of IMF programs affects sovereign risk over the medium term. We find that IMF programs significantly increase the probability of subsequent sovereign defaults by approximately 1.5 to 2 percentage points. These results cannot be attributed to endogeneity bias, as they are supported by specifications that explain sovereign defaults and program participation simultaneously. Furthermore, IMF programs turn out to be especially detrimental to fiscal solvency when the Fund distributes its resources to countries whose economic fundamentals are already weak. Our evidence is therefore consistent with the hypothesis that debtor moral hazard is most likely to occur in these circumstances. Other explanations that point to the effects of debt dilution and the possibility of IMF-triggered debt runs, however, are also possible. Keywords: IMF programs, sovereign defaults, bivariate probit, international financial architecture
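
    The simultaneous specification referred to here is a bivariate probit, whose log-likelihood sums bivariate Normal orthant probabilities with signs set by the two observed binary outcomes. Below is a minimal sketch on simulated data; the variable names, data-generating process, and tanh reparameterization of the correlation are illustrative assumptions, not the paper's specification.

```python
# Sketch: bivariate probit log-likelihood and its maximization. Illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def neg_loglik(params, X, y1, y2):
    k = X.shape[1]
    b1, b2 = params[:k], params[k:2 * k]
    rho = np.tanh(params[-1])                  # keeps |rho| < 1
    ll = 0.0
    for xi, a, b in zip(X, y1, y2):
        q1, q2 = 2 * a - 1, 2 * b - 1          # sign flips for 0/1 outcomes
        r = q1 * q2 * rho
        p = multivariate_normal.cdf([q1 * (xi @ b1), q2 * (xi @ b2)],
                                    mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]])
        ll += np.log(p)
    return -ll

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)
y1 = (X @ [0.2, 1.0] + e[:, 0] > 0).astype(int)    # e.g. program participation
y2 = (X @ [-0.3, 0.8] + e[:, 1] > 0).astype(int)   # e.g. subsequent default
res = minimize(neg_loglik, np.zeros(5), args=(X, y1, y2), method="BFGS")
print(res.x[:4], np.tanh(res.x[-1]))               # slopes and correlation
```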

    The epidemiology of canine leishmaniasis: transmission rates estimated from a cohort study in Amazonian Brazil

    We estimate the incidence rate, serological conversion rate and basic case reproduction number (R0) of Leishmania infantum from a cohort study of 126 domestic dogs exposed to natural infection rates over 2 years on Marajó Island, Pará State, Brazil. The analysis includes new methods for (1) determining the number of seropositives in cross-sectional serological data, (2) identifying seroconversions in longitudinal studies, based on both the number of antibody units and their rate of change through time, (3) estimating incidence and serological pre-patent periods and (4) calculating R0 for a potentially fatal, vector-borne disease under seasonal transmission. Longitudinal and cross-sectional serological (ELISA) analyses gave similar estimates of the proportion of dogs positive. However, longitudinal analysis allowed the calculation of pre-patent periods, and hence the more accurate estimation of incidence: an infection–conversion model fitted by maximum likelihood to serological data yielded seasonally varying per capita incidence rates with a mean of 8.66×10⁻³/day (mean time to infection 115 days, 95% C.L. 107–126 days), and a median pre-patent period of 94 (95% C.L. 82–111) days. These results were used in conjunction with theory and dog demographic data to estimate the basic reproduction number, R0, as 5.9 (95% C.L. 4.4–7.4). R0 is a determinant of the scale of the leishmaniasis control problem, and we comment on the options for control.
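
    As a quick numerical check of the reported figures under a constant-hazard simplification (the paper's fitted model allows seasonal variation): with per capita incidence λ per day, time to infection is Exponential(λ), so the mean time to infection is 1/λ ≈ 115 days for λ = 8.66×10⁻³/day.

```python
# Sketch: constant-hazard check of the reported incidence figures.
import numpy as np

lam = 8.66e-3                        # mean per capita incidence per day
print(1 / lam)                       # mean time to infection: ~115.5 days
t = np.array([115.0, 115.0 + 94.0])  # adding the median pre-patent period
print(1 - np.exp(-lam * t))          # cumulative infection risk by day t
```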

    Contrast-ultrasound dispersion imaging for prostate cancer localization

    Bayesian correction for covariate measurement error: a frequentist evaluation and comparison with regression calibration

    Bayesian approaches for handling covariate measurement error are well established, yet arguably still relatively little used by researchers. For some, this is likely due to unfamiliarity or disagreement with the Bayesian inferential paradigm. For others, a contributory factor is the inability of standard statistical packages to perform such Bayesian analyses. In this paper we first give an overview of the Bayesian approach to handling covariate measurement error, and contrast it with regression calibration (RC), arguably the most commonly adopted approach. We then argue why the Bayesian approach has a number of statistical advantages compared to RC, and demonstrate that implementing the Bayesian approach is usually quite feasible for the analyst. Next we describe the closely related maximum likelihood and multiple imputation approaches, and explain why we believe the Bayesian approach to generally be preferable. We then empirically compare the frequentist properties of RC and the Bayesian approach through simulation studies. The flexibility of the Bayesian approach to handle both measurement error and missing data is then illustrated through an analysis of data from the Third National Health and Nutrition Examination Survey.
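
    For readers unfamiliar with the comparator, regression calibration replaces the error-prone measurement W = X + U by E[X|W] and runs the outcome model on that proxy. A minimal sketch with classical Normal error and a known attenuation factor (in practice estimated, e.g. from replicate measurements; all names are illustrative):

```python
# Sketch: regression calibration under classical measurement error.
import numpy as np

rng = np.random.default_rng(3)
n, beta = 5000, 1.5
x = rng.normal(0, 1, n)            # true covariate (unobserved)
w = x + rng.normal(0, 0.8, n)      # error-prone measurement
y = beta * x + rng.normal(0, 1, n)

naive = np.polyfit(w, y, 1)[0]     # attenuated toward zero
lam = 1.0 / (1.0 + 0.8**2)         # var(X) / (var(X) + var(U))
x_hat = lam * w                    # E[X|W] with zero means
rc = np.polyfit(x_hat, y, 1)[0]    # approximately recovers beta
print(naive, rc)
```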

    A Structural Econometric Investigation of the Agency Theory of Financial Structure

    We estimate a structural model of financing choices in the presence of managerial moral hazard, financial distress costs and taxes. In the theoretical model, firms with a low cost of managerial effort, and high financial distress costs and non-debt tax shields, find it optimal to issue equity. Correspondingly, the likelihood that a given firm issues equity is the probability that its managerial cost of effort is below an upper bound reflecting its financial distress costs and non-debt tax shields, as well as the other deep parameters of the model. Similarly, we characterize the likelihood of issues of debt and convertible bonds. Using maximum likelihood analysis, we confront this theoretical model with data on financing choices by French firms in 1996. We find large costs of financial distress, equal on average to 41.2% of the value of the firm when it is in distress. We also find large agency costs, equal to 40.26% of the value of the investment project. In contrast, we find that tax shields do not play a significant role in the financing decision.
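
    The likelihood construction described above can be sketched in a few lines: a firm issues equity iff its latent managerial effort cost falls below a bound driven by its distress costs and non-debt tax shields. The lognormal cost distribution and the linear bound below are purely illustrative assumptions, not the paper's structural specification.

```python
# Sketch: P(issue equity) = P(c < c_bar) for a latent effort cost c.
import numpy as np
from scipy.stats import lognorm

def p_issue_equity(distress_cost, tax_shield, a0=0.1, a1=0.5, a2=0.3,
                   sigma=0.6):
    c_bar = a0 + a1 * distress_cost + a2 * tax_shield  # illustrative bound
    return lognorm.cdf(c_bar, s=sigma)                 # P(c < c_bar)

print(p_issue_equity(distress_cost=0.41, tax_shield=0.2))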