
    Determination of the Joint Confidence Region of Optimal Operating Conditions in Robust Design by Bootstrap Technique

    Robust design has been widely recognized as a leading method for reducing variability and improving quality. The engineering statistics literature focuses mainly on finding "point estimates" of the optimum operating conditions for robust design, and various procedures for calculating such point estimates have been considered. Although point estimation is important for continuous quality improvement, the immediate question is "how accurate are these optimum operating conditions?" The answer is to consider interval estimation for a single variable or joint confidence regions for multiple variables. In this paper, with the help of the bootstrap technique, we develop procedures for obtaining joint "confidence regions" for the optimum operating conditions. Two different procedures, using Bonferroni and multivariate normal approximations, are introduced. The proposed methods are illustrated and substantiated using a numerical example.
    Comment: Two tables, three figures
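The bootstrap step described above can be sketched in a few lines. Everything below is illustrative: the quadratic response surface, the data-generating optimum at (1.0, -0.5), and the case-resampling scheme are assumptions, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: response y observed at settings (x1, x2); the
# true optimum of the underlying quadratic surface is (1.0, -0.5).
n = 60
X = rng.uniform(-2, 2, size=(n, 2))
y = -(X[:, 0] - 1.0) ** 2 - (X[:, 1] + 0.5) ** 2 + rng.normal(0, 0.3, n)

def fit_optimum(X, y):
    """Fit a full quadratic model and return its stationary point."""
    x1, x2 = X[:, 0], X[:, 1]
    D = np.column_stack([np.ones(len(y)), x1, x2, x1**2, x2**2, x1 * x2])
    b = np.linalg.lstsq(D, y, rcond=None)[0]
    # Setting the gradient to zero gives a 2x2 linear system for the optimum.
    H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    return np.linalg.solve(H, -b[1:3])

# Case-resampling bootstrap of the optimum operating conditions.
B = 2000
opts = np.empty((B, 2))
for i in range(B):
    idx = rng.integers(0, n, n)
    opts[i] = fit_optimum(X[idx], y[idx])

# Bonferroni joint region: a (1 - alpha/k) percentile interval per variable.
alpha, k = 0.05, 2
lo, hi = np.percentile(opts, [100 * alpha / (2 * k),
                              100 * (1 - alpha / (2 * k))], axis=0)
print("joint 95% Bonferroni box:", list(zip(lo, hi)))
```

The Bonferroni box is conservative; the multivariate-normal variant mentioned in the abstract would instead build an ellipsoid from the bootstrap covariance of `opts`.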

    Analyzing the House Fly's Exploratory Behavior with Autoregression Methods

    This paper presents a detailed characterization of the trajectory of a single housefly with free range of a square cage. The trajectory of the fly was recorded and transformed into a time series, which was analyzed using an autoregressive (AR) model: a model that describes a stationary time series as a linear regression on its prior values plus white noise. The main discovery was that the fly switched styles of motion from a low-dimensional regular pattern to a higher-dimensional disordered pattern. This exploratory behavior is, irrespective of the presence of food, characterized by anomalous diffusion.
    Comment: 20 pages, 9 figures, 1 table, full paper
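A least-squares AR fit of the kind described can be sketched as follows. The AR(2) series standing in for one detrended coordinate of the trajectory, and its coefficients (1.2, -0.5), are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for one detrended coordinate of the trajectory:
# a stationary AR(2) process with coefficients (1.2, -0.5).
T = 2000
x = np.zeros(T)
for t in range(2, T):
    x[t] = 1.2 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()

def fit_ar(x, p):
    """Least-squares fit of x_t = a_1 x_{t-1} + ... + a_p x_{t-p} + noise."""
    y = x[p:]
    lags = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    coefs = np.linalg.lstsq(lags, y, rcond=None)[0]
    resid = y - lags @ coefs
    return coefs, resid

coefs, resid = fit_ar(x, p=2)
print("estimated AR coefficients:", coefs)   # close to [1.2, -0.5]
```

Sliding such a fit along the series and tracking the residual variance (or the selected order) is one simple way to detect the kind of switching between regular and disordered motion the paper reports.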

    Ensemble Sales Forecasting Study in Semiconductor Industry

    Sales forecasting plays a prominent role in business planning and business strategy. The value and importance of advance information is a cornerstone of planning activity, and a well-set forecast goal can guide the sales force more efficiently. In this paper, CPU sales forecasting at Intel Corporation, a multinational semiconductor company, was considered. Past sales, future bookings, exchange rates, gross domestic product (GDP) forecasts, seasonality, and other indicators were innovatively incorporated into the quantitative modeling. Benefiting from recent advances in computation power and software development, millions of models built upon multiple regression, time series analysis, random forests, and boosted trees were executed in parallel. The models with smaller validation errors were selected to form the ensemble model. To better capture distinct characteristics, forecasting models were implemented at the lead-time and line-of-business levels. The moving-window validation process automatically selected the models that most closely represent current market conditions. The weekly-cadence forecasting schema allowed the model to respond effectively to market fluctuations. A generic variable importance analysis was also developed to increase model interpretability. Rather than assuming a fixed distribution, this non-parametric permutation variable importance analysis provides a general framework across methods for evaluating variable importance. The framework can be further extended to classification problems by replacing the mean absolute percentage error (MAPE) with misclassification error. Please find the demo code at: https://github.com/qx0731/ensemble_forecast_methods
    Comment: 14 pages, Industrial Conference on Data Mining 2017 (ICDM 2017)
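The permutation variable importance described above can be sketched as follows. The data, the linear stand-in model, and the feature layout are invented; the scheme itself only needs a fitted predictor and a validation set.

```python
import numpy as np

rng = np.random.default_rng(2)

def mape(y_true, y_pred):
    """Mean absolute percentage error, the metric named in the abstract."""
    return np.mean(np.abs((y_true - y_pred) / y_true))

# Hypothetical validation data: "sales" driven mainly by feature 0.
n = 500
X = rng.uniform(1, 2, size=(n, 3))
y = 10 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.1, n)

# Stand-in model: a linear fit.  The permutation scheme is model-agnostic,
# so any of the paper's regressors (random forest, boosted trees) works too.
beta = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)[0]

def predict(X):
    return beta[0] + X @ beta[1:]

base = mape(y, predict(X))
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # break the feature-target link
    importance.append(mape(y, predict(Xp)) - base)

print("MAPE increase per permuted feature:", np.round(importance, 4))
```

Swapping `mape` for a misclassification rate gives the classification extension the abstract mentions.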

    Stochastic simulations of conditional states of partially observed systems, quantum and classical

    In a partially observed quantum or classical system, the information that we cannot access results in our description of the system becoming mixed even if we have perfect initial knowledge. That is, if the system is quantum the conditional state will be given by a state matrix \rho_r(t), and if classical the conditional state will be given by a probability distribution P_r(x,t), where r is the result of the measurement. Determining the evolution of this conditional state under continuous-in-time monitoring therefore requires an expensive numerical calculation. In this paper we demonstrate a numerical technique based on linear measurement theory that allows us to determine the conditional state using only pure states. That is, our technique reduces the problem size by a factor of N, the number of basis states for the system. Furthermore, we show that our method can be applied to joint classical and quantum systems, as arise in modeling realistic measurements.
    Comment: 16 pages, 11 figures

    Design and analysis of fractional factorial experiments from the viewpoint of computational algebraic statistics

    We give an expository review of applications of computational algebraic statistics to design and analysis of fractional factorial experiments based on our recent works. For the purpose of design, the techniques of Gr\"obner bases and indicator functions allow us to treat fractional factorial designs without distinction between regular designs and non-regular designs. For the purpose of analysis of data from fractional factorial designs, the techniques of Markov bases allow us to handle discrete observations. Thus the approach of computational algebraic statistics greatly enlarges the scope of fractional factorial designs.
    Comment: 16 pages

    Quantum trajectories for the realistic measurement of a solid-state charge qubit

    We present a new model for the continuous measurement of a coupled quantum dot charge qubit. We model the effects of a realistic measurement, namely adding noise to, and filtering, the current through the detector. This is achieved by embedding the detector in an equivalent circuit for measurement. Our aim is to describe the evolution of the qubit state conditioned on the macroscopic output of the external circuit. We achieve this by generalizing a recently developed quantum trajectory theory for realistic photodetectors [P. Warszawski, H. M. Wiseman and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to treat solid-state detectors. This yields stochastic equations whose (numerical) solutions are the "realistic quantum trajectories" of the conditioned qubit state. We derive our general theory in the context of a low-transparency quantum point contact. Areas of application for our theory and its relation to previous work are discussed.
    Comment: 7 pages, 2 figures. Shorter, significantly modified, updated version

    Approximate Bayesian Computation: a nonparametric perspective

    Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well-suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate Bayesian Computation proceeds by computing summary statistics s_obs from the data and simulating summary statistics for different values of the parameter theta. The posterior distribution is then approximated by an estimator of the conditional density g(theta|s_obs). In this paper, we derive the asymptotic bias and variance of the standard estimators of the posterior distribution, which are based on rejection sampling and linear adjustment. Additionally, we introduce an original estimator of the posterior distribution based on quadratic adjustment, and we show that its bias contains fewer terms than that of the estimator with linear adjustment. Although we find that the estimators with adjustment are not universally superior to the estimator based on rejection sampling, they can achieve better performance when there is a nearly homoscedastic relationship between the summary statistics and the parameter of interest. To make this relationship as homoscedastic as possible, we propose to use transformations of the summary statistics. In different examples borrowed from the population genetics and epidemiological literature, we show the potential of the methods with adjustment and of the transformations of the summary statistics. Supplemental materials containing the details of the proofs are available online.
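The rejection and linear-adjustment estimators can be sketched on a toy model where the exact posterior is known. The Gaussian model, uniform prior, and 1% acceptance quantile below are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: data are n iid N(theta, 1) draws; the summary statistic
# s is the sample mean, so s | theta ~ N(theta, 1/n).
n = 50
s_obs = rng.normal(2.0, 1.0, n).mean()   # "observed" summary, true theta = 2

# Simulate (theta, s) pairs with theta drawn from the prior Uniform(-5, 5).
N = 200_000
theta = rng.uniform(-5, 5, N)
s = rng.normal(theta, 1.0 / np.sqrt(n))

# Rejection step: keep the 1% of draws whose summary is closest to s_obs.
eps = np.quantile(np.abs(s - s_obs), 0.01)
keep = np.abs(s - s_obs) <= eps
theta_acc, s_acc = theta[keep], s[keep]

# Linear adjustment: regress theta on s among the accepted draws,
# then project each accepted theta to the observed summary.
slope = np.polyfit(s_acc, theta_acc, 1)[0]
theta_adj = theta_acc - slope * (s_acc - s_obs)

print("rejection mean:", theta_acc.mean())
print("adjusted  mean:", theta_adj.mean())   # exact posterior mean ~ s_obs
```

The quadratic adjustment proposed in the paper would add a second-order term in (s_acc - s_obs) to the same regression.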

    Factor and Simplex Models for Repeated Measures: Application to Two Psychomotor Measures of Alcohol Sensitivity in Twins

    As part of a larger study, data on arithmetic computation and motor coordination were obtained from 206 twin pairs. The twins were measured once before and three times after ingesting a standard dose of alcohol. Previous analyses ignored the time-series structure of these data. Here we illustrate the application of simplex models for the genetic analysis of covariance structures in a repeated-measures design and compare the results with factor models for the two psychomotor measures. We then present a bivariate analysis incorporating simplex processes common and specific to the two measures. Our analyses confirm the notion that there is genetic variation affecting psychomotor performance which is "switched on" in the presence of alcohol. We compare the merits of analysis of mean products versus covariance matrices and confront some practical problems that may arise in situations where the number of subjects is relatively small and where the causal structure among the latent variables places a heavy demand on the data. © 1989 Plenum Publishing Corporation