    Statistical methods for automated drug susceptibility testing: Bayesian minimum inhibitory concentration prediction from growth curves

    Determination of the minimum inhibitory concentration (MIC) of a drug that prevents microbial growth is an important step in managing patients with infections. In this paper we present a novel probabilistic approach that accurately estimates MICs from a panel of multiple curves reflecting features of bacterial growth. We develop a probabilistic model for determining whether a given dilution of an antimicrobial agent is the MIC, based on features of the growth curves over time. Because of the potentially large collection of features, we use Bayesian model selection to narrow the collection of predictors to the most important variables. In addition to point estimates of MICs, we provide posterior probabilities that each dilution is the MIC given the observed growth curves. The methods are easily automated and have been incorporated into the Becton-Dickinson PHOENIX automated susceptibility system, which rapidly and accurately classifies the resistance of a large number of microorganisms in clinical samples. Over seventy-five studies to date have shown that this new method provides improved estimation of MICs over existing approaches.
    Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org) at http://dx.doi.org/10.1214/08-AOAS217.
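
    The abstract does not spell out the model, but the idea of turning growth-curve features into posterior probabilities over candidate dilutions can be illustrated with a minimal sketch. The snippet below is not the published PHOENIX model: it assumes a single summary feature per dilution (e.g. area under the optical-density curve), Gaussian feature distributions for growing and inhibited wells, and a uniform prior over candidate MICs, all of which are invented for the example.

        # Minimal sketch (not the published model): posterior probability that each
        # dilution is the MIC, given one summary feature per dilution.
        # The MIC hypothesis "d" says dilutions below d show growth and dilutions
        # at or above d are inhibited.
        import numpy as np
        from scipy.stats import norm

        def mic_posterior(features, mu_growth=1.0, mu_inhib=0.1, sigma=0.15):
            """features[i] is the growth-curve summary at the i-th (increasing) dilution."""
            features = np.asarray(features, dtype=float)
            n = len(features)
            log_g = norm.logpdf(features, mu_growth, sigma)   # log-likelihood under "growth"
            log_i = norm.logpdf(features, mu_inhib, sigma)    # log-likelihood under "inhibited"
            # hypothesis d: wells 0..d-1 grow, wells d..n-1 are inhibited (d = 0..n)
            log_post = np.array([log_g[:d].sum() + log_i[d:].sum() for d in range(n + 1)])
            log_post -= log_post.max()
            post = np.exp(log_post)
            return post / post.sum()          # uniform prior over candidate MICs

        # Example: growth collapses between the 3rd and 4th dilution.
        print(mic_posterior([0.95, 0.90, 0.85, 0.12, 0.08, 0.05]).round(3))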

    Universality of Bayesian mixture predictors

    The problem is that of sequential probability forecasting for finite-valued time series. The data are generated by an unknown probability distribution over the space of all one-way infinite sequences. It is known that this measure belongs to a given set C, but the latter is completely arbitrary (uncountably infinite, without any structure given). The performance is measured by asymptotic average log loss. In this work it is shown that the minimax asymptotic performance is always attainable, and that it is attained by a convex combination of countably many measures from the set C (a Bayesian mixture). This was previously known only for the case when the best achievable asymptotic error is 0. It also contrasts with previous results showing that, in the non-realizable case, all Bayesian mixtures may be suboptimal even though there is a predictor that achieves the optimal performance.
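
    As a toy illustration of a Bayesian mixture predictor evaluated by average log loss, the sketch below mixes over a small finite family of i.i.d. Bernoulli measures. The grid of parameters and the evaluation setup are invented for the example; the arbitrary, possibly uncountable sets C treated in the paper are far more general than this.

        # Minimal sketch (illustrative only): a Bayesian mixture over a finite
        # family of i.i.d. Bernoulli measures used as a sequential forecaster,
        # evaluated by average log loss.
        import numpy as np

        rng = np.random.default_rng(0)
        ps = np.linspace(0.05, 0.95, 19)          # the family C for this toy example
        w = np.full(len(ps), 1.0 / len(ps))       # mixture weights (prior)

        true_p = 0.3                              # unknown source, a member of C
        x = rng.random(5000) < true_p
        loss_mix = 0.0
        for xt in x:
            q = float(np.dot(w, ps))              # mixture's predictive prob. of a 1
            loss_mix += -np.log(q if xt else 1.0 - q)
            like = ps if xt else 1.0 - ps         # Bayesian update of the weights
            w = w * like
            w /= w.sum()

        entropy = -(true_p * np.log(true_p) + (1 - true_p) * np.log(1 - true_p))
        print(f"avg log loss of mixture: {loss_mix / len(x):.4f}, source entropy: {entropy:.4f}")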

    Identifying the consequences of dynamic treatment strategies: A decision-theoretic overview

    We consider the problem of learning about and comparing the consequences of dynamic treatment strategies on the basis of observational data. We formulate this within a probabilistic decision-theoretic framework. Our approach is compared with related work by Robins and others: in particular, we show how Robins's 'G-computation' algorithm arises naturally from this decision-theoretic perspective. Careful attention is paid to the mathematical and substantive conditions required to justify the use of this formula. These conditions revolve around a property we term stability, which relates the probabilistic behaviours of observational and interventional regimes. We show how an assumption of 'sequential randomization' (or 'no unmeasured confounders'), or an alternative assumption of 'sequential irrelevance', can be used to infer stability. Probabilistic influence diagrams are used to simplify manipulations, and their power and limitations are discussed. We compare our approach with alternative formulations based on causal DAGs or potential response models. We aim to show that formulating the problem of assessing dynamic treatment strategies as a problem of decision analysis brings clarity, simplicity and generality.
    Comment: 49 pages, 15 figures.
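
    For readers unfamiliar with the G-computation formula mentioned above, the sketch below applies it to a simulated two-stage example with a time-varying confounder. The data-generating process and variable names are invented for illustration and are not taken from the paper.

        # Minimal sketch (illustrative only): the G-computation formula for the
        # two-stage strategy "always treat", estimated from simulated observational
        # data with a time-varying confounder.  Variables: L1 -> A1 -> L2 -> A2 -> Y.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        n = 200_000
        L1 = rng.binomial(1, 0.5, n)
        A1 = rng.binomial(1, 0.3 + 0.4 * L1)                # treatment depends on L1
        L2 = rng.binomial(1, 0.2 + 0.3 * L1 + 0.3 * A1)     # confounder affected by A1
        A2 = rng.binomial(1, 0.2 + 0.5 * L2)                # treatment depends on L2
        Y = rng.binomial(1, 0.1 + 0.2 * A1 + 0.3 * A2 + 0.2 * L2)
        df = pd.DataFrame(dict(L1=L1, A1=A1, L2=L2, A2=A2, Y=Y))

        def g_formula_always_treat(df):
            """E_g[Y] = sum_{l1} P(l1) sum_{l2} P(l2 | A1=1, l1) E[Y | A1=1, l1, A2=1, l2]."""
            est = 0.0
            for l1, p_l1 in df["L1"].value_counts(normalize=True).items():
                d1 = df[(df.L1 == l1) & (df.A1 == 1)]
                for l2, p_l2 in d1["L2"].value_counts(normalize=True).items():
                    d2 = d1[(d1.L2 == l2) & (d1.A2 == 1)]
                    est += p_l1 * p_l2 * d2["Y"].mean()
            return est

        print("naive E[Y | A1=1, A2=1]:", df[(df.A1 == 1) & (df.A2 == 1)]["Y"].mean().round(3))
        print("G-formula estimate:     ", round(g_formula_always_treat(df), 3))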

    Optimal Sequential Investigation Rules in Competition Law

    Although in both US antitrust and European competition law there is a clear evolution toward a much broader application of the "rule of reason" (instead of per-se rules), there is also an increasing awareness of the problems of a case-by-case approach. The "error costs approach" (minimizing the sum of the welfare costs of decision errors and administrative costs) allows one not only to decide between these two extremes, but also to design optimally differentiated rules (with an optimal depth of investigation) as intermediate solutions between simple per-se rules and a full-scale rule of reason. In this paper we present a decision-theoretic model that can be used as an instrument for deriving optimal rules for a sequential investigation process in competition law. Such a sequential investigation can be interpreted as a step-by-step sorting process into ever smaller subclasses of cases that help to discriminate better between pro- and anticompetitive cases. We analyze both the problem of optimal stopping of the investigation and the optimal sequencing of the assessment criteria within an investigation. To illustrate, we show how a more differentiated rule on resale price maintenance could be derived after the rejection of its per-se prohibition by the US Supreme Court in the "Leegin" case in 2007.
    Keywords: Law Enforcement, Decision-Making, Competition Law, Antitrust Law
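
    A minimal numerical sketch of the error-costs logic, with invented cost and test-accuracy parameters rather than anything taken from the paper: at each stage the investigator compares the expected error cost of deciding now with the administrative cost of applying one more assessment criterion plus the expected cost after a Bayesian update on its outcome.

        # Minimal sketch (illustrative, not the paper's model): a one-step-lookahead
        # stopping rule for a sequential investigation.  p is the current probability
        # that the conduct is anticompetitive.
        def decide_now_cost(p, c_fp=100.0, c_fn=100.0):
            # "prohibit" wrongly condemns procompetitive conduct w.p. (1 - p);
            # "allow" wrongly clears anticompetitive conduct w.p. p.
            return min((1 - p) * c_fp, p * c_fn)

        def continue_cost(p, sens=0.8, spec=0.8, k=5.0, c_fp=100.0, c_fn=100.0):
            p_pos = sens * p + (1 - spec) * (1 - p)         # prob. the next criterion is "positive"
            post_pos = sens * p / p_pos                      # Bayes update after a positive finding
            post_neg = (1 - sens) * p / (1 - p_pos)          # ... and after a negative finding
            return k + p_pos * decide_now_cost(post_pos, c_fp, c_fn) \
                     + (1 - p_pos) * decide_now_cost(post_neg, c_fp, c_fn)

        for p in (0.1, 0.3, 0.5, 0.7, 0.9):
            action = "investigate further" if continue_cost(p) < decide_now_cost(p) else "decide now"
            print(f"p = {p:.1f}: decide-now cost {decide_now_cost(p):6.1f}, "
                  f"continue cost {continue_cost(p):6.1f} -> {action}")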