Bayesian sequential change diagnosis
Sequential change diagnosis is the joint problem of detection and
identification of a sudden and unobservable change in the distribution of a
random sequence. In this problem, the common probability law of a sequence of
i.i.d. random variables suddenly changes at some disorder time to one of
finitely many alternatives. This disorder time marks the start of a new regime,
whose fingerprint is the new law of observations. Both the disorder time and
the identity of the new regime are unknown and unobservable. The objective is
to detect the regime-change as soon as possible, and, at the same time, to
determine its identity as accurately as possible. Prompt and correct diagnosis
is crucial for quick execution of the most appropriate measures in response to
the new regime, as in fault detection and isolation in industrial processes,
and target detection and identification in national defense. The problem is
formulated in a Bayesian framework. An optimal sequential decision strategy is
found, and an accurate numerical scheme is described for its implementation.
Geometrical properties of the optimal strategy are illustrated via numerical
examples. The traditional problems of Bayesian change-detection and Bayesian
sequential multi-hypothesis testing are solved as special cases. In addition, a
solution is obtained for the problem of detection and identification of
component failure(s) in a system with suspended animation.
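To illustrate the kind of recursion such a strategy builds on, here is a minimal sketch (not the paper's optimal strategy or numerical scheme) of a Shiryaev-style posterior update for joint change detection and regime identification, assuming unit-variance Gaussian observations, a geometric prior on the disorder time, and a uniform prior over two candidate regimes:

```python
import math

def normal_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def diagnose(observations, mus, p=0.05, threshold=0.95):
    """Joint Bayesian change detection and identification (sketch).

    Pre-change observations are N(0, 1); regime k post-change is
    N(mus[k], 1); the disorder time has a geometric(p) prior."""
    K = len(mus)
    pi0 = 1.0                 # posterior P(no change yet)
    pik = [0.0] * K           # posterior P(change occurred, regime = k)
    nu = [1.0 / K] * K        # uniform prior over regime identities
    for n, x in enumerate(observations, start=1):
        # prediction step: the change may occur at this instant
        new0 = pi0 * (1.0 - p)
        newk = [pik[k] + pi0 * p * nu[k] for k in range(K)]
        # update step: weight by the likelihood of the new observation
        new0 *= normal_pdf(x, 0.0)
        newk = [newk[k] * normal_pdf(x, mus[k]) for k in range(K)]
        z = new0 + sum(newk)
        pi0, pik = new0 / z, [w / z for w in newk]
        # stop and identify once some regime's posterior is high enough
        for k in range(K):
            if pik[k] >= threshold:
                return n, k, pik[k]
    return None

# change at observation 31: the mean shifts from 0 to +2 (regime 1)
obs = [0.0] * 30 + [2.0] * 70
result = diagnose(obs, mus=[-2.0, 2.0])
```

In this toy run the strategy flags regime 1 a few observations after the disorder time; the trade-off between detection delay and identification accuracy is governed by the stopping threshold.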
Modeling long-term longitudinal HIV dynamics with application to an AIDS clinical study
A virologic marker, the number of HIV RNA copies or viral load, is currently
used to evaluate antiretroviral (ARV) therapies in AIDS clinical trials. This
marker can be used to assess the ARV potency of therapies, but is easily
affected by drug exposures, drug resistance and other factors during the
long-term treatment evaluation process. HIV dynamic studies have significantly
contributed to the understanding of HIV pathogenesis and ARV treatment
strategies. However, the models of these studies are used to quantify
short-term HIV dynamics (less than one month), and cannot describe
long-term virological response to ARV treatment due to the difficulty of
establishing a relationship of antiviral response with multiple treatment
factors such as drug exposure and drug susceptibility during long-term
treatment. Long-term therapy with ARV agents in HIV-infected patients often
results in failure to suppress the viral load. Pharmacokinetics (PK), drug
resistance and imperfect adherence to prescribed antiviral drugs are important
factors explaining the resurgence of virus. To better understand the factors
responsible for the virological failure, this paper develops the
mechanism-based nonlinear differential equation models for characterizing
long-term viral dynamics with ARV therapy. The models directly incorporate drug
concentration, adherence and drug susceptibility into a function of treatment
efficacy and, hence, fully integrate virologic, PK, drug adherence and
resistance from an AIDS clinical trial into the analysis. A Bayesian nonlinear
mixed-effects modeling approach in conjunction with the rescaled version of
dynamic differential equations is investigated to estimate dynamic parameters
and make inference. In addition, the correlations of baseline factors with
estimated dynamic parameters are explored and some biologically meaningful
correlation results are presented. Further, the estimated dynamic parameters in
patients with virologic success were compared to those in patients with
virologic failure, and significant findings were summarized. These
results suggest that viral dynamic parameters may play an important role in
understanding HIV pathogenesis and in designing new treatment strategies for
long-term care of AIDS patients.

Comment: Published at http://dx.doi.org/10.1214/08-AOAS192 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
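To make the modeling idea concrete, the following is a minimal sketch of a target-cell-limited viral dynamics ODE in which drug concentration, adherence, and phenotypic susceptibility (IC50) enter through a time-varying treatment efficacy. The Emax-type efficacy function, the adherence/resistance schedule, and all parameter values are illustrative assumptions, not the paper's models or Bayesian estimates:

```python
def efficacy(conc, adherence, ic50):
    """Emax-type drug efficacy in [0, 1), combining concentration,
    adherence, and phenotypic susceptibility (IC50)."""
    effective = conc * adherence
    return effective / (ic50 + effective)

def simulate(days=100.0, dt=0.01):
    # illustrative rates (not fitted values): target-cell production,
    # death, infection, infected-cell death, burst size, viral clearance
    lam, d, k, delta, N, c = 1.0e4, 0.01, 2.4e-8, 1.0, 3000.0, 23.0
    T, I, V = 3.0e5, 30.0, 1.0e5   # target cells, infected cells, virus
    traj = []
    steps = int(round(days / dt))
    for step in range(steps):
        t = step * dt
        # imperfect adherence after day 50; IC50 drifts up as resistance emerges
        adherence = 1.0 if t < 50.0 else 0.3
        ic50 = 0.5 * (1.0 + 0.02 * t)
        gamma = efficacy(conc=1.0, adherence=adherence, ic50=ic50)
        # mechanism-based ODEs with time-varying efficacy gamma(t)
        dT = lam - d * T - (1.0 - gamma) * k * T * V
        dI = (1.0 - gamma) * k * T * V - delta * I
        dV = N * delta * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV   # forward Euler
        traj.append((t, V))
    return traj

trajectory = simulate()
viral_load = [v for _, v in trajectory]
```

Even this crude sketch reproduces the qualitative behavior the abstract describes: the viral load is suppressed while efficacy is high, then rebounds as adherence lapses and resistance accumulates.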
Expert Elicitation for Reliable System Design
This paper reviews the role of expert judgement to support reliability
assessments within the systems engineering design process. Generic design
processes are described to give the context and a discussion is given about the
nature of the reliability assessments required in the different systems
engineering phases. It is argued that, as far as meeting reliability
requirements is concerned, the whole design process is more akin to a
statistical control process than to a straightforward statistical problem of
assessing an unknown distribution. This leads to features of the expert
judgement problem in the design context which are substantially different from
those seen, for example, in risk assessment. In particular, the role of experts
in problem structuring and in developing failure mitigation options is much
more prominent, and there is a need to take into account the reliability
potential for future mitigation measures downstream in the system life cycle.
An overview is given of the stakeholders typically involved in large scale
systems engineering design projects, and this is used to argue the need for
methods that expose potential judgemental biases in order to generate analyses
that can be said to provide rational consensus about uncertainties. Finally, a
number of key points are developed with the aim of moving toward a framework
that provides a holistic method for tracking reliability assessment through the
design process.

Comment: This paper is commented on in [arXiv:0708.0285], [arXiv:0708.0287],
and [arXiv:0708.0288]; rejoinder in [arXiv:0708.0293]. Published at
http://dx.doi.org/10.1214/088342306000000510 in Statistical Science
(http://www.imstat.org/sts/) by the Institute of Mathematical Statistics
(http://www.imstat.org)
Bayesian subset simulation
We consider the problem of estimating a probability of failure α,
defined as the volume of the excursion set of a function f above a given threshold, under a given
probability measure on the input space. In this article, we combine the popular
subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our
sequential Bayesian approach for the estimation of a probability of failure
(Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it
possible to estimate α when the number of evaluations of f is very
limited and α is very small. The resulting algorithm is called Bayesian
subset simulation (BSS). A key idea, as in the subset simulation algorithm, is
to estimate the probabilities of a sequence of excursion sets of f above
intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A
Gaussian process prior on f is used to define the sequence of densities
targeted by the SMC algorithm, and drive the selection of evaluation points of
f to estimate the intermediate probabilities. Adaptive procedures are
proposed to determine the intermediate thresholds and the number of evaluations
to be carried out at each stage of the algorithm. Numerical experiments
illustrate that BSS achieves significant savings in the number of function
evaluations with respect to other Monte Carlo approaches.
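For context, here is a minimal sketch of the classical subset simulation baseline (Au and Beck, 2001) that BSS builds on: the small failure probability is written as a product of larger conditional probabilities over adaptive intermediate thresholds, each estimated by MCMC. The Gaussian-process surrogate and adaptive evaluation-point selection that distinguish BSS are not shown, and the limit-state function and parameters are illustrative:

```python
import math
import random

random.seed(42)

def limit_state(x):
    # toy limit-state function on R^2; "failure" means limit_state(x) > t
    return x[0] + x[1]

def subset_simulation(f, dim, threshold, n=2000, p0=0.1, step=1.0):
    """Classical subset simulation: estimate the small probability
    P(f(X) > threshold) for X ~ N(0, I) as a product of larger
    conditional probabilities over adaptive intermediate thresholds."""
    samples = [[random.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n)]
    values = [f(x) for x in samples]
    prob = 1.0
    for _ in range(50):                        # safety cap on levels
        order = sorted(range(len(values)),
                       key=lambda i: values[i], reverse=True)
        n_keep = int(p0 * n)
        level = values[order[n_keep]]          # adaptive intermediate threshold
        if level >= threshold:
            # final level: count samples beyond the target threshold
            return prob * sum(v > threshold for v in values) / len(values)
        prob *= n_keep / len(values)
        seeds = [samples[i] for i in order[:n_keep]]
        # Metropolis moves targeting N(0, I) restricted to {f > level}
        samples, values = [], []
        for seed in seeds:
            x, v = seed, f(seed)
            for _ in range(n // n_keep):
                cand = [xi + random.gauss(0.0, step) for xi in x]
                fc = f(cand)
                logr = sum(0.5 * xi * xi - 0.5 * ci * ci
                           for xi, ci in zip(x, cand))
                if fc > level and math.log(random.random()) < logr:
                    x, v = cand, fc
                samples.append(list(x))
                values.append(v)
    return prob

# P(x1 + x2 > 4.37) for independent standard normals is about 1e-3
estimate = subset_simulation(limit_state, dim=2, threshold=4.37)
```

Each level costs n evaluations of f; BSS targets exactly this cost by evaluating f only at points chosen via the Gaussian process model.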