
    Theory and Applications of Robust Optimization

    In this paper we survey the primary research, both theoretical and applied, in the area of Robust Optimization (RO). Our focus is on the computational attractiveness of RO approaches, as well as the modeling power and broad applicability of the methodology. In addition to surveying prominent theoretical results of RO, we also present some recent results linking RO to adaptable models for multi-stage decision-making problems. Finally, we highlight applications of RO across a wide spectrum of domains, including finance, statistics, learning, and various areas of engineering.
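
    As a rough illustration of the robust-counterpart idea surveyed here (a minimal sketch in standard notation, not an excerpt from the paper): a linear constraint $a^\top x \le b$ whose coefficient vector $a$ is only known to lie in an uncertainty set $\mathcal{U}$ is replaced by the requirement that it hold for every admissible realization,

        $a^\top x \le b \quad \forall\, a \in \mathcal{U}.$

    For a box set $\mathcal{U} = \{\bar a + \delta : \|\delta\|_\infty \le \rho\}$ the worst case is attained in closed form and the robust constraint becomes the tractable convex constraint $\bar a^\top x + \rho \|x\|_1 \le b$; reformulations of this kind are what make RO computationally attractive.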

    Non-Equilibrium Statistical Physics of Currents in Queuing Networks

    We consider a stable open queuing network as a steady non-equilibrium system of interacting particles. The network is completely specified by its underlying graphical structure, the type of interaction at each node, and the Markovian transition rates between nodes. For such systems we ask the question "What is the most likely way for large currents to accumulate over time in a network?", where time is large compared to the system's correlation time scale. We identify two interesting regimes. In the first regime, in which the accumulated currents exceed their expected value by a small to moderate amount (moderate large deviation), we find that the large-deviation distribution of currents is universal (independent of the interaction details) and that there is no long-time, time-averaged accumulation of particles (condensation) at any node. In the second regime, in which the accumulated currents exceed their expected value by a large amount (severe large deviation), we find that the large-deviation current distribution is sensitive to interaction details and that there is a long-time accumulation of particles (condensation) at some nodes. The transition between the two regimes can be described as a dynamical second-order phase transition. We illustrate these ideas using the simple, yet non-trivial, example of a single node with feedback.
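
    As a concrete, hedged sketch of the single-node-with-feedback example (illustrative code only, with made-up rates; not taken from the paper), one can simulate the queue and look at the empirical distribution of the time-averaged output current whose large-deviation tails the paper characterizes:

        import random

        def time_averaged_current(T, lam=1.0, mu=2.0, p=0.3, seed=0):
            """Single Markovian queue with feedback: external arrivals at rate lam,
            service at rate mu, and each served job re-enters the queue with
            probability p. Returns departures per unit time (the output current)."""
            rng = random.Random(seed)
            t, n, departures = 0.0, 0, 0
            while t < T:
                rate = lam + (mu if n > 0 else 0.0)
                t += rng.expovariate(rate)          # time to the next event
                if rng.random() < lam / rate:
                    n += 1                          # external arrival
                elif rng.random() < p:
                    pass                            # served job fed back; stays in the system
                else:
                    n -= 1                          # served job leaves; counts toward the current
                    departures += 1
            return departures / T

        # Empirical current distribution over many long runs; the tails of this
        # histogram are what the moderate/severe large-deviation regimes describe.
        currents = [time_averaged_current(2000.0, seed=s) for s in range(200)]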

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans, because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans are able to do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes as a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. With this purpose in mind, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity.
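
    In Wald's decision-theoretic terms, the kind of problem the paper seeks to make computable can be sketched as a minimax program (the notation here is generic, not the paper's): with $\mathcal{A}$ the set of admissible scenarios consistent with the available data, assumptions and beliefs, $\Theta$ the set of candidate estimators/models, and $\mathcal{E}(\theta,\mu)$ a risk functional, an optimal statistical estimator is

        $\theta^\star \in \arg\min_{\theta \in \Theta}\ \sup_{\mu \in \mathcal{A}} \mathcal{E}(\theta, \mu).$

    The two challenges listed above then read as: (1) making this program well posed, and (2) reducing the infinite-dimensional supremum over $\mathcal{A}$ to a finite computation a machine can actually perform.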

    A combined approach for analysing heuristic algorithms

    When developing optimisation algorithms, the focus often lies on obtaining an algorithm that outperforms existing algorithms on some performance measure. It is not common practice to question the reasons for the performance differences observed; such questions relate to evaluating the impact of the various heuristic parameters and often remain unanswered. In this paper the focus is on gaining insight into the behaviour of a heuristic algorithm by investigating how the various elements operating within the algorithm correlate with performance, obtaining indications of which combinations work well and which do not, and how all these effects are influenced by the specific problem instance the algorithm is solving. We consider two approaches for analysing algorithm parameters and components, functional analysis of variance and multilevel regression analysis, and study the benefits of using both approaches jointly. We present a combined methodology that provides more insight than either approach used separately. The illustrative case studies in this paper analyse a large neighbourhood search algorithm applied to the vehicle routing problem with time windows and an iterated local search algorithm for the unrelated parallel machine scheduling problem with sequence-dependent setup times.
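
    As a hedged illustration of the multilevel-regression half of such an analysis (hypothetical column names and file; this is not the authors' code), one can fit a mixed-effects model in which the heuristic components enter as fixed effects and the problem instance as a grouping factor, so that component effects are separated from instance difficulty:

        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per algorithm run: which operators/parameters were used,
        # which instance was solved, and the resulting performance measure.
        runs = pd.read_csv("runs.csv")  # columns: instance, destroy_op, repair_op, perf

        # Fixed effects for the components and their interaction,
        # random intercepts per problem instance.
        model = smf.mixedlm("perf ~ destroy_op * repair_op",
                            data=runs, groups=runs["instance"])
        result = model.fit()
        print(result.summary())

    The functional-ANOVA view would instead decompose the variance of the performance measure into contributions of individual parameters and their interactions; pairing the two views is the point of the combined methodology.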

    Finding Common Ground When Experts Disagree: Robust Portfolio Decision Analysis
