
    Stochastic frontier models: a Bayesian perspective

    A Bayesian approach to estimation, prediction and model comparison in composed error production models is presented. A broad range of distributions on the inefficiency term define the contending models, which can be treated either separately or pooled. Posterior results are derived for the individual efficiencies as well as for the parameters, and the differences from the usual sampling-theory approach are highlighted. The required numerical integrations are handled by Monte Carlo methods with importance sampling, and an empirical example illustrates the procedures.
    Keywords: Efficiency; Composed error models; Production frontier; Prior elicitation
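    The importance-sampling step mentioned in the abstract can be sketched for a stylised composed-error posterior. The target density, the residual and noise parameters, and the exponential proposal below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: unnormalised posterior of the inefficiency u >= 0,
# combining an exponential prior exp(-u) with a Gaussian likelihood term
# for a single composed-error residual r (a stylised example).
def unnorm_post(u, r=0.3, sigma_v=0.5):
    return np.exp(-u) * np.exp(-0.5 * ((r + u) / sigma_v) ** 2) * (u >= 0)

# Proposal: Exponential(1), which is easy to sample and covers the support.
u = rng.exponential(1.0, size=100_000)
w = unnorm_post(u) / np.exp(-u)   # importance weights: target / proposal
w /= w.sum()                      # self-normalise the weights

# Posterior mean technical efficiency E[exp(-u) | data].
eff = np.sum(w * np.exp(-u))
```

    Self-normalised importance sampling avoids needing the posterior's normalising constant, which is the usual situation in these composed-error models.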

    The approximate coordinate exchange algorithm for Bayesian optimal design of experiments

    Optimal Bayesian experimental design typically involves maximising the expectation, with respect to the joint distribution of parameters and responses, of an appropriately chosen utility function. This objective function is usually not available in closed form, and the design space can be high-dimensional. The approximate coordinate exchange algorithm is proposed for this maximisation problem, with a Gaussian process emulator used to approximate the objective function. The algorithm can be applied to arbitrary utility functions, meaning fully Bayesian optimal design can be considered, as well as to utility functions that result in pseudo-Bayesian designs, such as the popular Bayesian D-optimality. The algorithm is demonstrated on a range of examples.
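    A minimal coordinate-exchange loop can illustrate the idea, with the caveat that in the approximate coordinate exchange algorithm the expensive, noisy expected-utility evaluations along each coordinate are replaced by a cheap Gaussian process emulator; here a toy, deterministic pseudo-Bayesian D-optimality utility is evaluated directly, and the model and settings are illustrative assumptions.

```python
import numpy as np

def utility(design):
    # Pseudo-Bayesian D-optimality for a toy model E[y] = b1*x + b2*x^2:
    # log-determinant of the information matrix X'X.
    X = np.column_stack([design, design ** 2])
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

def coordinate_exchange(n_points=4, n_sweeps=10,
                        grid=np.linspace(-1.0, 1.0, 41)):
    rng = np.random.default_rng(1)
    d = rng.uniform(-1.0, 1.0, n_points)
    for _ in range(n_sweeps):
        for i in range(n_points):
            # In ACE, this 1-D search would query a GP emulator of the
            # expected utility; here we evaluate the toy utility directly.
            cand = d.copy()
            vals = []
            for g in grid:
                cand[i] = g
                vals.append(utility(cand))
            d[i] = grid[int(np.argmax(vals))]
    return d, utility(d)
```

    Each coordinate update can only improve the objective, so the loop converges to a (locally) optimal design on the grid.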

    Decision-based genetic algorithms for solving multi-period project scheduling with dynamically experienced workforce

    The flexibility of resources has become increasingly important amid turbulent changes in the industrial context and the need to meet customers' requirements. Among all resources, human resources are the most important and the hardest to manage, owing to availability constraints and labour agreements. In this article, we present an approach to project scheduling with multi-period human resource allocation that takes two flexibility levers into account: annualised hours with working-time regulation, and the actors' multi-skills. The productivity of each operator is treated as dynamic, improving or degrading depending on prior allocation decisions. The solving approach relies on decision-based genetic algorithms, in which chromosomes do not represent the problem solution directly; they encode three decisions: the tasks' priorities for execution, the actors' priorities for carrying out these tasks, and the priority of the working-time strategy applied during the specified working period. The principle of critical skill is also taken into account. Based on these decisions, a serial schedule generation scheme sequentially builds the project schedule and the corresponding workforce allocations.
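    The decoding of such decision chromosomes by a serial schedule generation scheme can be sketched as follows. The toy instance, actor skills, and priority lists are illustrative assumptions, and the working-time-strategy decision and dynamic productivity are omitted for brevity.

```python
# Hypothetical toy instance: task durations, precedence relations, and the
# set of tasks each actor's skills allow (all names are illustrative).
durations = {"A": 3, "B": 2, "C": 4}
preds = {"A": [], "B": ["A"], "C": ["A"]}
can_do = {"w1": {"A", "B"}, "w2": {"A", "C"}}

def decode(task_prio, actor_prio):
    """Serial schedule generation: repeatedly pick the highest-priority
    eligible task and assign the highest-priority capable actor."""
    finish = {}
    busy_until = {a: 0 for a in actor_prio}
    done = set()
    while len(done) < len(durations):
        eligible = [t for t in task_prio
                    if t not in done and all(p in done for p in preds[t])]
        t = eligible[0]                      # highest GA-assigned priority
        a = next(x for x in actor_prio if t in can_do[x])
        start = max(busy_until[a],
                    max((finish[p] for p in preds[t]), default=0))
        finish[t] = start + durations[t]
        busy_until[a] = finish[t]
        done.add(t)
    return finish

makespan = max(decode(["A", "C", "B"], ["w2", "w1"]).values())
```

    A genetic algorithm would evolve the two priority lists (plus the working-time strategy), evaluating each chromosome by the makespan or cost of the schedule this decoder produces.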

    Risk-Averse Model Predictive Operation Control of Islanded Microgrids

    In this paper we present a risk-averse model predictive control (MPC) scheme for the operation of islanded microgrids with a very high share of renewable energy sources. The proposed scheme mitigates the effect of errors in the determination of the probability distribution of renewable infeed and load. This makes it possible to use less complex and less accurate forecasting methods and to formulate low-dimensional scenario-based optimisation problems that are suitable for control applications. Additionally, the designer may trade performance for safety by interpolating between the conventional stochastic and worst-case MPC formulations. The risk-averse MPC problem is formulated as a mixed-integer quadratically constrained quadratic program, and its favourable characteristics are demonstrated in a case study, including a sensitivity analysis that illustrates robustness to load and renewable power prediction errors.
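    The interpolation between stochastic and worst-case formulations is typically achieved with a coherent risk measure such as the average value-at-risk (AV@R, also called CVaR). A minimal sketch on a discrete scenario set follows; the scenario costs and probabilities are illustrative, not the paper's case study.

```python
import numpy as np

def avar(costs, probs, alpha):
    """Average value-at-risk of a discrete scenario cost distribution.
    alpha = 1 recovers the expectation (stochastic MPC); alpha -> 0
    recovers the worst case (min-max MPC), so alpha interpolates between
    the two formulations."""
    # Rockafellar-Uryasev form: AV@R_a[Z] = min_t { t + E[max(Z-t, 0)]/a }.
    if alpha == 0:
        return costs.max()
    ts = np.unique(costs)  # the minimiser lies at a support point
    vals = [t + (probs * np.maximum(costs - t, 0)).sum() / alpha for t in ts]
    return min(vals)
```

    In a scenario-based MPC problem this risk measure is applied to the stage costs, which keeps the optimisation problem convex in the continuous variables.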

    Methodological Advances in DEA

    We survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools currently available for exploiting its full potential. Our main points are illustrated by the case of the DEA study used by the regulatory office of the Dutch electricity sector (Dienst Toezicht Elektriciteitswet; Dte) for setting price caps.
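    For readers unfamiliar with DEA, the basic input-oriented CCR efficiency score of a decision-making unit is obtained from a small linear program. A sketch using scipy follows; the two-unit example data are illustrative, and this is the standard envelopment form, not the exact model of the Dte study.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (m, n) inputs, Y: (s, n) outputs; columns are the n units.
    Solves: min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,
    lam >= 0 (constant returns to scale)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # decision vars: [theta, lam]
    A_ub = np.block([[-X[:, [o]], X],          # X @ lam - theta * x_o <= 0
                     [np.zeros((s, 1)), -Y]])  # -Y @ lam <= -y_o
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun
```

    A unit with score 1 lies on the efficient frontier; a score below 1 gives the proportional input reduction needed to reach it.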

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case-studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
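    The sequential bifurcation idea (repeatedly splitting any group of factors whose aggregated effect is non-negligible) can be sketched as follows, under the technique's usual assumption that all effects are known to be non-negative. The effect values and the stand-in for a group simulation run are illustrative assumptions.

```python
def sequential_bifurcation(effects, threshold=0.0):
    """Toy sequential bifurcation: identify factors with a non-negligible
    effect using group evaluations. Summing a slice of 'effects' stands in
    for one simulation run in which that whole group of factors is switched
    from its low to its high level."""
    important = []

    def split(lo, hi):
        group_effect = sum(effects[lo:hi])   # one 'group run' in real SB
        if group_effect <= threshold:
            return                           # whole group declared unimportant
        if hi - lo == 1:
            important.append(lo)             # single important factor found
            return
        mid = (lo + hi) // 2
        split(lo, mid)                       # bifurcate and recurse
        split(mid, hi)

    split(0, len(effects))
    return important
```

    With k important factors among n, this needs on the order of k*log(n) group runs rather than one run per factor, which is why it suits the "hundreds of potentially important factors" setting described above.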