Comparing algorithms and criteria for designing Bayesian conjoint choice experiments.
The recent algorithm to find efficient conjoint choice designs, the RSC-algorithm developed by Sándor and Wedel (2001), uses Bayesian design methods that integrate the D-optimality criterion over a prior distribution of likely parameter values. A characteristic of this algorithm is that its designs satisfy the minimal level overlap property provided the starting design complies with it. Another, more established algorithm in the literature, developed by Zwerina et al. (1996), involves an adaptation of the modified Fedorov exchange algorithm to the multinomial logit choice model. However, it does not take into account the uncertainty about the assumed parameter values. In this paper, we adjust the modified Fedorov choice algorithm in a Bayesian fashion and compare its designs to those produced by the RSC-algorithm. Additionally, we introduce a measure to investigate the utility balances of the designs. Besides the widely used D-optimality criterion, we also implement the A-, G- and V-optimality criteria and look for the criterion that is most suitable for prediction purposes and that offers the best quality in terms of computational effectiveness. The comparison study reveals that the Bayesian modified Fedorov choice algorithm provides more efficient designs than the RSC-algorithm and that the D- and V-optimality criteria are the best criteria for prediction, but the computation time with the V-optimality criterion is longer.
Keywords: A-Optimality; Algorithms; Bayesian design; Bayesian modified Fedorov choice algorithm; Choice; Conjoint choice experiments; Criteria; D-Optimality; Design; Discrete choice experiments; Distribution; Effectiveness; Fashion; G-optimality; Logit; Methods; Model; Multinomial logit; Predictive validity; Quality; Research; RSC-algorithm; Studies; Time; Uncertainty; V-optimality; Value
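The Bayesian D-optimality criterion described here integrates the D-error of a design over a prior on the logit parameters. A minimal sketch of that computation, assuming a simple Monte Carlo approximation and hypothetical helper names (`mnl_info`, `bayesian_d_error` are illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def mnl_info(X_sets, beta):
    """Fisher information of a choice design under the multinomial logit model.
    X_sets: list of (alternatives x parameters) arrays, one per choice set."""
    p = len(beta)
    M = np.zeros((p, p))
    for X in X_sets:
        u = X @ beta
        prob = np.exp(u - u.max())
        prob /= prob.sum()
        # Per-set contribution X' (diag(p) - p p') X
        M += X.T @ (np.diag(prob) - np.outer(prob, prob)) @ X
    return M

def bayesian_d_error(X_sets, beta_draws):
    """Monte Carlo estimate of the Bayesian D-error,
    E_beta[ det M(X, beta)^(-1/p) ]; smaller is better."""
    p = beta_draws.shape[1]
    errs = [np.linalg.det(mnl_info(X_sets, b)) ** (-1.0 / p)
            for b in beta_draws]
    return float(np.mean(errs))

# Toy design: 2 effects-coded attributes, 4 paired choice sets
X_sets = [np.array([[1.0, 1.0], [-1.0, -1.0]]),
          np.array([[1.0, -1.0], [-1.0, 1.0]]),
          np.array([[1.0, 1.0], [1.0, -1.0]]),
          np.array([[-1.0, 1.0], [-1.0, -1.0]])]
draws = rng.normal(0.0, 0.5, size=(200, 2))  # prior draws for beta
err = bayesian_d_error(X_sets, draws)
```

An exchange algorithm (RSC or modified Fedorov) would use such an estimate as the objective when swapping profiles in and out of the design.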
A comparison of general-purpose optimization algorithms for finding optimal approximate experimental designs
Several common general-purpose optimization algorithms are compared for finding A- and D-optimal designs for different types of statistical models of varying complexity, including high-dimensional models with five and more factors. The algorithms of interest include exact methods, such as the interior point method, the Nelder–Mead method, the active set method, and sequential quadratic programming, and metaheuristic algorithms, such as particle swarm optimization, simulated annealing and genetic algorithms. Several simulations are performed, which provide general recommendations on the utility and performance of each method, including hybridized versions of metaheuristic algorithms for finding optimal experimental designs. A key result is that general-purpose optimization algorithms, both exact methods and metaheuristic algorithms, perform well for finding optimal approximate experimental designs.
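One way a general-purpose optimizer can be applied to approximate designs is to fix a support grid and search over the weight simplex. A minimal sketch, assuming SciPy's Nelder–Mead and a quadratic regression model on [-1, 1] (the function names and the softmax parameterization are illustrative choices, not taken from the paper):

```python
import numpy as np
from scipy.optimize import minimize

def d_criterion(weights, grid):
    """Negative log-determinant of the information matrix for the quadratic
    model y = b0 + b1*x + b2*x^2 with design weights on the support grid."""
    F = np.vander(grid, 3, increasing=True)       # rows f(x) = (1, x, x^2)
    M = F.T @ (weights[:, None] * F)
    sign, logdet = np.linalg.slogdet(M)
    return -logdet if sign > 0 else np.inf

def nm_optimal_weights(grid):
    """Search for D-optimal weights with Nelder-Mead, mapping unconstrained
    variables to the weight simplex through a softmax."""
    def objective(z):
        w = np.exp(z - z.max())
        w /= w.sum()
        return d_criterion(w, grid)
    res = minimize(objective, np.zeros(len(grid)), method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-10})
    w = np.exp(res.x - res.x.max())
    return w / w.sum()

grid = np.linspace(-1.0, 1.0, 5)
w = nm_optimal_weights(grid)
```

For this model the D-optimal approximate design is known to put weight 1/3 on each of {-1, 0, 1}, so the recovered weights give a quick sanity check on the optimizer.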
Bayesian and maximin optimal designs for heteroscedastic regression models
The problem of constructing standardized maximin D-optimal designs for weighted polynomial regression models is addressed. In particular it is shown that, by following the broad approach to the construction of maximin designs introduced recently by Dette, Haines and Imhof (2003), such designs can be obtained as weak limits of the corresponding Bayesian Φq-optimal designs.  The approach is illustrated for two specific weighted polynomial models and also for a particular growth model.
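The limit connecting Bayesian Φq-criteria to maximin criteria rests on a property of power means: as the exponent tends to minus infinity, the q-mean of the efficiencies under the prior tends to the worst-case (minimum) efficiency. A minimal numerical sketch of that limit, with made-up efficiency values rather than anything from the paper:

```python
import numpy as np

def power_mean(eff, prior, q):
    """Generalised (power) mean of design efficiencies under a prior.
    As q -> -inf this tends to min(eff), i.e. the maximin criterion."""
    return float((prior @ eff ** q) ** (1.0 / q))

# Hypothetical efficiencies of one design at three parameter values
eff = np.array([0.9, 0.6, 0.8])
prior = np.ones(3) / 3.0

values = [power_mean(eff, prior, q) for q in (-1, -10, -200)]
```

As q decreases, the criterion value approaches the minimum efficiency 0.6, which is the sense in which maximin designs arise as limits of the Bayesian Φq-optimal ones.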