Bayesian computational methods
In this chapter, we first present the most standard computational
challenges met in Bayesian statistics, focusing primarily on mixture
estimation and on model choice issues, and then relate these problems to
computational solutions. Of course, this chapter is only a terse introduction
to the problems and solutions associated with Bayesian computation. For more
complete references, see Robert and Casella (2004, 2009) or Marin and Robert
(2007), among others. We also refrain from providing an introduction to
Bayesian statistics per se and, for comprehensive coverage, refer the reader
to Robert (2007), (again) among others.
Comment: This is a revised version of a chapter written for the Handbook of
Computational Statistics, edited by J. Gentle, W. Hardle and Y. Mori in 2003,
in preparation for the second edition.
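The mixture-estimation problem singled out in this abstract is the textbook use case for Gibbs sampling. The following minimal sketch targets a two-component normal mixture with unit variances; the priors, initial values, and simulated data are illustrative assumptions, not taken from the chapter itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from a two-component Gaussian mixture (illustrative values).
n = 200
z_true = rng.random(n) < 0.4
x = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def gibbs_mixture(x, iters=1000):
    """Gibbs sampler for a two-component N(mu_k, 1) mixture with weight p.
    Assumed priors: p ~ Beta(1, 1), mu_k ~ N(0, 10^2)."""
    n = len(x)
    p, mu = 0.5, np.array([-1.0, 1.0])
    draws = []
    for _ in range(iters):
        # 1. Sample allocations z_i given the current parameters.
        w1 = p * np.exp(-0.5 * (x - mu[0]) ** 2)
        w2 = (1 - p) * np.exp(-0.5 * (x - mu[1]) ** 2)
        z = rng.random(n) < w1 / (w1 + w2)   # True -> component 1
        n1 = z.sum()
        # 2. Sample the weight: p | z ~ Beta(1 + n1, 1 + n - n1).
        p = rng.beta(1 + n1, 1 + n - n1)
        # 3. Sample each mean via the conjugate normal update (prior var 100).
        for k, mask in enumerate([z, ~z]):
            nk, sk = mask.sum(), x[mask].sum()
            var = 1.0 / (nk + 1.0 / 100.0)
            mu[k] = rng.normal(var * sk, np.sqrt(var))
        draws.append((p, mu.copy()))
    return draws

draws = gibbs_mixture(x)
```

After a burn-in period, the draws of (p, mu) concentrate around the values used to simulate the data, up to the well-known label-switching ambiguity of mixture posteriors.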
Cross-classification analysis in the field of management and organization : comments on the DEL-technique
The DEL-technique, a proportionate-reduction-in-error measure developed by Hildebrand, Laing and Rosenthal, has been applied and portrayed as a promising prediction-analysis technique for evaluating theory on the basis of cross-classification data, though it was controversial from its introduction in the early 1970s. According to its opponents, Goodman and Kruskal, the interpretation of DEL as a proportionate reduction in error from knowing a prediction rule over not knowing it cannot be sustained, because the measure is benchmarked against independence instead of ignorance. Even setting this criticism aside, the DEL measure can easily be misinterpreted as a measure of acceptance of the specified customized hypothesis as the only and best relationship between two categorical variables when the context for interpretation is not carefully stated in terms of the research paradigm adhered to: theory testing versus prediction logic. Taking the criticism into account, the researchers deserve credit for clearly addressing some of the methodological problems in prediction research; nevertheless, an alternative proportionate-reduction-in-error measure may generate unequivocally interpretable results and outperform the DEL-technique.
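The benchmarking issue at the heart of this debate is easy to make concrete: DEL compares the observed probability mass in the cells a prediction rule forbids ("error cells") against the mass those cells would receive under independence. The table and the prediction rule below are entirely hypothetical, chosen only to show the computation.

```python
import numpy as np

# Hypothetical 2x3 cross-classification: rows = predictor categories,
# columns = outcome categories; the counts are illustrative only.
counts = np.array([[40, 10, 5],
                   [8, 30, 7]], dtype=float)

# Error-cell indicator: 1 marks cells the customized hypothesis forbids.
# Hypothetical rule: row 1 predicts column 1; row 2 predicts columns 2 or 3.
errors = np.array([[0, 1, 1],
                   [1, 0, 0]], dtype=float)

def del_measure(counts, errors):
    """DEL = 1 - (observed error mass) / (error mass expected under
    independence of rows and columns) -- the benchmark Goodman and
    Kruskal objected to, since it is independence, not ignorance."""
    p = counts / counts.sum()
    observed = (errors * p).sum()
    expected = (errors * np.outer(p.sum(axis=1), p.sum(axis=0))).sum()
    return 1.0 - observed / expected

print(round(del_measure(counts, errors), 3))  # prints 0.542
```

A DEL of 0.542 here says only that the rule's error cells hold about 54% less mass than independence would put there; it does not certify the rule as the best description of the table, which is precisely the misinterpretation the abstract warns against.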
A Quality Systems Economic-Risk Design Theoretical Framework
Quality systems, including control chart theory and sampling plans, have become essential tools for developing business processes. Since 1928, research has been conducted on economic-risk designs for specific types of control charts or sampling plans. However, there have been no theoretical or applied research attempts to combine these related theories into a synthesized theoretical framework of quality-systems economic-risk design. This research proposes to develop such a theoretical framework through a qualitative research synthesis of economic-risk design models for sampling plans and control charts. The framework will be useful in guiding future research into economic-risk quality-systems design theory and application.
Hyper-g Priors for Generalized Linear Models
We develop an extension of the classical Zellner's g-prior to generalized
linear models. The prior on the hyperparameter g is handled in a flexible way,
so that any continuous proper hyperprior f(g) can be used, giving rise to a
large class of hyper-g priors. Connections with the literature are described in
detail. A fast and accurate integrated Laplace approximation of the marginal
likelihood makes inference in large model spaces feasible. For posterior
parameter estimation we propose an efficient and tuning-free
Metropolis-Hastings sampler. The methodology is illustrated with variable
selection and automatic covariate transformation in the Pima Indians diabetes
data set.
Comment: 30 pages, 12 figures, poster contribution at ISBA 201
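The paper's own sampler is tuning-free by construction and is not reproduced here; as a generic stand-in, the sketch below runs a plain random-walk Metropolis-Hastings algorithm on a logistic-regression posterior, replacing the hyper-g prior with a fixed-variance normal prior (a simplifying assumption). The simulated data, step size, and prior scale are all illustrative, not the Pima Indians setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated logistic-regression data (illustrative; not the Pima data set).
n, beta_true = 500, np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))

def log_post(beta, g=100.0):
    """Log-posterior: Bernoulli log-likelihood plus a N(0, g) prior on each
    coefficient -- a fixed-g stand-in for the hyper-g prior, for illustration."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    return loglik - 0.5 * beta @ beta / g

def rw_metropolis(iters=5000, step=0.15):
    """Random-walk Metropolis-Hastings (requires step-size tuning, unlike
    the sampler proposed in the paper)."""
    beta = np.zeros(2)
    lp = log_post(beta)
    draws = []
    for _ in range(iters):
        prop = beta + step * rng.normal(size=2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept with prob. min(1, ratio)
            beta, lp = prop, lp_prop
        draws.append(beta.copy())
    return np.array(draws)

draws = rw_metropolis()
```

The contrast with this sketch is the point: the `step` parameter here must be tuned by hand, whereas the paper's Metropolis-Hastings sampler removes that burden.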
Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers - specific application to Listeria monocytogenes and ready-to-eat meat products
The presence of Listeria monocytogenes in a sliced, cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support the growth of pathogens (high initial pH of 6.2–6.4 and water activity of 0.98–0.99) and have a relatively long period of storage at chilled temperatures, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in the case of post-process contamination, even with a low number of cells, the microorganism is able to reach unacceptable levels by the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk posed by the presence of L. monocytogenes in RTE meat products. This may help risk managers make decisions and apply control measures with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk-management strategies based on the results obtained from the QMRA model developed specifically for this pathogen/food product combination.
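The central hazard the abstract describes, a low post-process contamination growing over the 60-day shelf life, can be sketched with a standard log-linear growth model capped at a stationary level. The growth rate, initial level, and maximum density below are illustrative assumptions, not the parameters of the study's QMRA model.

```python
def cells_at_consumption(log10_n0, mu_max, t_storage, log10_nmax=8.0):
    """Log-linear growth on the log10 scale, capped at a stationary level.
    log10_n0  : initial contamination (log10 CFU/g)
    mu_max    : growth rate (log10 CFU/g per day) at storage temperature
    t_storage : days between packaging and consumption
    All default and example values are hypothetical."""
    return min(log10_n0 + mu_max * t_storage, log10_nmax)

# Hypothetical scenario: 1 cell per 100 g (-2 log10 CFU/g), slow chilled
# growth of 0.1 log10/day, product eaten at the 60-day shelf-life limit.
print(round(cells_at_consumption(-2.0, 0.1, 60), 2))  # -2 + 6 = 4.0 log10 CFU/g
```

Even this crude sketch shows the point the abstract makes: a contamination level far below any detection limit can grow by several orders of magnitude before consumption, which is why shelf life and storage temperature are natural levers for the risk-management strategies the QMRA is meant to inform.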
Intelligent systems in manufacturing: current developments and future prospects
Global competition and rapidly changing customer requirements are demanding increasing changes in manufacturing environments. Enterprises are required to constantly redesign their products and continuously reconfigure their manufacturing systems. Traditional approaches to manufacturing systems do not fully satisfy this new situation. Many authors have proposed that artificial intelligence will bring the flexibility and efficiency needed by manufacturing systems. This paper is a review of artificial intelligence techniques used in manufacturing systems. The paper first defines the components of a simplified intelligent manufacturing system (IMS) and the different artificial intelligence (AI) techniques to be considered, and then shows how these AI techniques are used for the components of an IMS.