Bayesian T-optimal discriminating designs
The problem of constructing Bayesian optimal discriminating designs for a
class of regression models with respect to the T-optimality criterion
introduced by Atkinson and Fedorov (1975a) is considered. It is demonstrated
that the discretization of the integral with respect to the prior distribution
leads to locally T-optimal discriminating design problems with a large number
of model comparisons. Current methodology for the numerical construction of
discrimination designs can only deal with a few comparisons, but the
discretization of the Bayesian prior easily yields discrimination design
problems with more than 100 competing models. A new
efficient method is developed to deal with problems of this type. It combines
features of classical exchange-type algorithms with gradient methods.
Convergence is proved and it is demonstrated that the new method can
find Bayesian optimal discriminating designs in situations where all currently
available procedures fail.
Comment: 25 pages, 3 figures
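The abstract does not spell out the hybrid algorithm, but the classical first-order (Fedorov-Wynn-type) scheme it builds on can be sketched for a locally T-optimal toy problem. Everything below is an illustrative assumption, not the paper's method: the rival models (a fixed quadratic against the linear class), the grid on [-1, 1], the iteration count, and the step-size rule.

```python
import numpy as np

# Illustrative first-order (Fedorov-Wynn-type) sketch for a *locally*
# T-optimal design: discriminate eta1(x) = x^2 (fixed) from the linear
# class eta2(x, theta) = theta0 + theta1 * x on [-1, 1].
grid = np.linspace(-1.0, 1.0, 201)
w = np.full(grid.size, 1.0 / grid.size)          # start from the uniform design
X = np.column_stack([np.ones_like(grid), grid])  # regressors of the rival model

for k in range(2000):
    # Inner problem: best weighted least-squares fit of the rival model
    sw = np.sqrt(w)
    theta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * grid**2, rcond=None)
    # Directional derivative of the T-criterion: squared model deviation psi(x)
    psi = (grid**2 - X @ theta) ** 2
    # Exchange step: move mass toward the most-violated design point
    alpha = 1.0 / (k + 2)
    w *= 1.0 - alpha
    w[np.argmax(psi)] += alpha

# Mass concentrates near the known optimal support {-1, 0, 1}
support = grid[w > 0.05]
```

For this toy problem the locally T-optimal design is known to be supported at {-1, 0, 1} with weights 1/4, 1/2, 1/4, which the iteration approaches only slowly, as is typical for first-order schemes; this is exactly the weakness the abstract's hybrid method is meant to address.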
KL-optimum designs: theoretical properties and practical computation
In this paper some new properties and computational tools for finding
KL-optimum designs are provided. KL-optimality is a general criterion useful to
select the best experimental conditions to discriminate between statistical
models. A KL-optimum design is obtained from a minimax optimization problem,
which is defined on an infinite-dimensional space. In particular, continuity of
the KL-optimality criterion is proved under mild conditions; as a consequence,
the first-order algorithm converges to the set of KL-optimum designs for a
large class of models. It is also shown that KL-optimum designs are invariant
to any scale-position transformation. Some examples are given and discussed,
together with some practical implications for numerical computation purposes.
Comment: The final publication is available at Springer via
http://dx.doi.org/10.1007/s11222-014-9515-
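The minimax structure mentioned above can be made explicit. As a sketch, in the notation commonly used for KL-optimality (with f1 the fixed density, f2(.,theta2) the rival density, and xi a design measure on the design space X):

```latex
% KL-optimality (sketch): the design maximizes the worst-case average
% Kullback-Leibler divergence between the two candidate densities
I_{2,1}(x,\theta_2)
  = \int f_1(y \mid x)\,
        \log\frac{f_1(y \mid x)}{f_2(y \mid x,\theta_2)}\, dy,
\qquad
\xi^{*} \in \arg\max_{\xi}\; \min_{\theta_2 \in \Theta_2}
      \int_{\mathcal{X}} I_{2,1}(x,\theta_2)\, \xi(dx).
```

The inner minimization over theta2 is what makes the problem minimax, and the infinite-dimensional space referred to in the abstract is the space of design measures xi.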
Robust T-optimal discriminating designs
This paper considers the problem of constructing optimal discriminating
experimental designs for competing regression models on the basis of the
T-optimality criterion introduced by Atkinson and Fedorov [Biometrika 62 (1975)
57-70]. T-optimal designs depend on unknown model parameters and it is
demonstrated that these designs are sensitive with respect to misspecification.
As a solution to this problem we propose a Bayesian and standardized maximin
approach to construct robust and efficient discriminating designs on the basis
of the T-optimality criterion. It is shown that the corresponding Bayesian and
standardized maximin optimality criteria are closely related to linear
optimality criteria. For the problem of discriminating between two polynomial
regression models which differ in the degree by two the robust T-optimal
discriminating designs can be found explicitly. The results are illustrated in
several examples.
Comment: Published in at http://dx.doi.org/10.1214/13-AOS1117 the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
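The two robustified criteria described above can be sketched as follows, with eta1(x, theta) the first model, eta2(x, theta2) the rival, pi a prior on theta, and xi a design measure (notation is ours, not quoted from the paper):

```latex
% Locally T-optimal criterion for a fixed parameter theta (sketch)
T_{\theta}(\xi) = \inf_{\theta_2 \in \Theta_2}
  \int_{\mathcal{X}} \bigl(\eta_1(x,\theta) - \eta_2(x,\theta_2)\bigr)^2\, \xi(dx),
% Bayesian version: average the local criterion over the prior pi
T_{\pi}(\xi) = \int T_{\theta}(\xi)\, \pi(d\theta),
% Standardized maximin version: worst-case efficiency over theta
T_{\min}(\xi) = \inf_{\theta}
  \frac{T_{\theta}(\xi)}{\sup_{\eta} T_{\theta}(\eta)}.
```

Both robust versions remove the dependence on a single guessed theta: the Bayesian criterion averages the local criterion, while the standardized maximin criterion guards the worst-case efficiency.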
T-optimal designs for multi-factor polynomial regression models via a semidefinite relaxation method
We consider T-optimal experiment design problems for discriminating multi-factor polynomial regression models where the design space is defined by polynomial inequalities and the regression parameters are constrained to given convex sets. Our proposed optimality criterion is formulated as a convex optimization problem with a moment cone constraint. When the regression models have one factor, an exact semidefinite representation of the moment cone constraint can be applied to obtain an equivalent semidefinite program. When there are two or more factors in the models, we apply a moment relaxation technique and approximate the moment cone constraint by a hierarchy of semidefinite-representable outer approximations. When the relaxation hierarchy converges, an optimal discrimination design can be recovered from the optimal moment matrix, and its optimality can be additionally confirmed by an equivalence theorem. The methodology is illustrated with several examples.
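For the one-factor case, the exact semidefinite representation mentioned above is the classical Hankel-matrix characterization of the truncated moment problem. As a sketch on the illustrative design space [-1, 1] (the paper allows general polynomial-inequality spaces), a design xi enters the problem only through its moment vector y, and y belongs to the moment cone precisely when the Hankel and localizing matrices are positive semidefinite:

```latex
% Moment cone on [-1,1] via Hankel matrices (classical characterization; sketch)
y_k = \int_{-1}^{1} x^k\, \xi(dx), \qquad k = 0, \dots, 2d,
\\
H_d(y) = \bigl(y_{i+j}\bigr)_{i,j=0}^{d} \succeq 0,
\qquad
H_{d-1}\bigl((1 - x^2)\, y\bigr)
  = \bigl(y_{i+j} - y_{i+j+2}\bigr)_{i,j=0}^{d-1} \succeq 0.
```

With two or more factors no such exact finite characterization exists in general, which is why the abstract resorts to a hierarchy of semidefinite-representable outer approximations.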
Optimal Discrimination Designs for Exponential Regression Models
We investigate optimal designs for discriminating between exponential regression models of different complexity, which are widely used in the biological sciences; see, e.g., Landaw (1995) or Gibaldi and Perrier (1982). We discuss different approaches for the construction of appropriate optimality criteria, and find sharper upper bounds on the number of support points of locally optimal discrimination designs than those given by Carathéodory's Theorem. These results greatly facilitate the numerical construction of optimal designs. Various examples of optimal designs are then presented and compared to several other designs. Moreover, to protect the experiment against misspecifications of the nonlinear model parameters, we adapt the design criteria such that the resulting designs are robust with respect to such misspecifications and, again, provide several examples, which demonstrate the advantages of our approach.
Keywords: Compartmental Model, Model Discrimination, Discrimination Design, Locally Optimal Design, Robust Optimal Design, Maximin Optimal Design
T-optimal designs for discrimination between two polynomial models
This paper is devoted to the explicit construction of optimal designs for
discrimination between two polynomial regression models of degree n-2 and
n. In a fundamental paper, Atkinson and Fedorov [Biometrika 62 (1975a)
57--70] proposed the T-optimality criterion for this purpose. Recently,
Atkinson [MODA 9, Advances in Model-Oriented Design and Analysis (2010) 9--16]
determined T-optimal designs for polynomials up to degree 6 numerically and
based on these results he conjectured that the support points of the optimal
design are cosines of the angles that divide half of the circle into equal
parts if the coefficient of x^{n-1} in the polynomial of larger degree
vanishes. In the present paper we give a strong justification of the conjecture
and determine all T-optimal designs explicitly for any degree n. In
particular, we show that there exists a one-dimensional class of T-optimal
designs. Moreover, we also present a generalization to the case when the ratio
between the coefficients of x^{n-1} and x^n is smaller than a certain critical
value. Because of the complexity of the optimization problem, T-optimal
designs have only been determined numerically so far, and this paper provides
the first explicit solution of the T-optimal design problem since its
introduction by Atkinson and Fedorov [Biometrika 62 (1975a) 57--70]. Finally,
for the remaining cases (where the ratio of coefficients is larger than the
critical value), we propose a numerical procedure to calculate the T-optimal
designs. The results are also illustrated in an example.
Comment: Published in at http://dx.doi.org/10.1214/11-AOS956 the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
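The conjectured support has a simple computational form: splitting the half circle [0, pi] into m equal arcs and taking cosines gives the Chebyshev extreme points on [-1, 1]. A minimal sketch (m = 4 is an arbitrary illustration; the paper determines the correct number of arcs for each degree):

```python
import numpy as np

# Cosines of the angles dividing the half circle [0, pi] into m equal arcs;
# these are the Chebyshev extreme points cos(j*pi/m) on [-1, 1].
# m = 4 is an arbitrary illustrative choice, not taken from the paper.
m = 4
angles = np.arange(m + 1) * np.pi / m
points = np.cos(angles)   # approx [1, 0.707, 0, -0.707, -1]
```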
Optimal discrimination designs
We consider the problem of constructing optimal designs for model
discrimination between competing regression models. Various new properties of
optimal designs with respect to the popular T-optimality criterion are
derived, which in many circumstances allow an explicit determination of
T-optimal designs. It is also demonstrated that in nested linear models the
number of support points of T-optimal designs is usually too small to
estimate all parameters in the extended model. In many cases T-optimal
designs are not unique, and in this situation we give a characterization of
all T-optimal designs. Finally, T-optimal designs are compared with optimal
discriminating designs with respect to alternative criteria by means of a
small simulation study.
Comment: Published in at http://dx.doi.org/10.1214/08-AOS635 the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)