
    Contingency Model Predictive Control for Automated Vehicles

    We present Contingency Model Predictive Control (CMPC), a novel and implementable control framework which tracks a desired path while simultaneously maintaining a contingency plan -- an alternate trajectory to avert an identified potential emergency. In this way, CMPC anticipates events that might take place, instead of reacting when emergencies occur. We accomplish this by adding a prediction horizon in parallel to the classical receding MPC horizon. The contingency horizon is constrained to maintain a feasible avoidance solution; as such, CMPC is selectively robust to this emergency while tracking the desired path as closely as possible. After defining the framework mathematically, we demonstrate its effectiveness experimentally by comparing its performance to a state-of-the-art deterministic MPC. The controllers drive an automated research platform through a left-hand turn which may be covered by ice. Contingency MPC prepares for the potential loss of friction by purposefully and intuitively deviating from the prescribed path to approach the turn more conservatively; this deviation significantly mitigates the consequences of encountering ice.
    Comment: American Control Conference, July 2019; 6 pages
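    As a toy illustration of the idea (our own sketch under simplifying assumptions, not the authors' controller), the code below picks a first control input that minimizes nominal tracking cost subject to a contingency plan -- full braking before a hazard -- remaining feasible:

```python
# Minimal Contingency-MPC-flavored sketch (illustrative assumption, not the
# paper's implementation): a 1-D vehicle chooses an acceleration that tracks
# a reference speed while keeping a hard-braking contingency plan feasible.

def contingency_mpc_step(pos, vel, v_ref, hazard, dt=0.1, a_max=2.0):
    """Return the first input shared by the nominal and contingency horizons."""
    best_u, best_cost = None, float("inf")
    for k in range(-10, 11):
        u = a_max * k / 10                    # candidate first acceleration
        p1 = pos + vel * dt                   # Euler step of position
        v1 = vel + u * dt                     # Euler step of speed
        if v1 < 0:
            continue
        # Contingency horizon: brake at -a_max from (p1, v1); the plan is
        # feasible only if the vehicle stops before reaching the hazard.
        stop_dist = v1 * v1 / (2 * a_max)
        if p1 + stop_dist >= hazard:
            continue
        cost = (v1 - v_ref) ** 2              # nominal tracking cost
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# A distant hazard permits aggressive tracking; a nearer one forces a
# conservative deviation, analogous to the behavior the abstract describes.
print(contingency_mpc_step(0.0, 10.0, 12.0, hazard=30.0))  # accelerates
print(contingency_mpc_step(0.0, 10.0, 12.0, hazard=27.0))  # backs off
```

Shrinking the distance to the hazard pulls the chosen input away from pure tracking even though no emergency has occurred yet, mirroring CMPC's conservative approach to the possibly icy turn.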

    Bayesian Approximate Kernel Regression with Variable Selection

    Nonlinear kernel regression models are often used in statistics and machine learning because they are more accurate than linear models. Variable selection for kernel regression models is a challenge partly because, unlike the linear regression setting, there is no clear concept of an effect size for regression coefficients. In this paper, we propose a novel framework that provides an effect size analog for each explanatory variable in Bayesian kernel regression models when the kernel is shift-invariant -- for example, the Gaussian kernel. We use function analytic properties of shift-invariant reproducing kernel Hilbert spaces (RKHS) to define a linear vector space that: (i) captures nonlinear structure, and (ii) can be projected onto the original explanatory variables. The projection onto the original explanatory variables serves as an analog of effect sizes. The specific function analytic property we use is that shift-invariant kernel functions can be approximated via random Fourier bases. Based on the random Fourier expansion, we propose a computationally efficient class of Bayesian approximate kernel regression (BAKR) models for both nonlinear regression and binary classification for which one can compute an analog of effect sizes. We illustrate the utility of BAKR by examining two important problems in statistical genetics: genomic selection (i.e., phenotypic prediction) and association mapping (i.e., inference of significant variants or loci). State-of-the-art methods for genomic selection and association mapping are based on kernel regression and linear models, respectively. BAKR is the first method that is competitive in both settings.
    Comment: 22 pages, 3 figures, 3 tables; theory added; new simulations presented; references added
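    The random Fourier approximation the abstract relies on can be sketched as follows (a generic illustration, not the BAKR code). For a unit-bandwidth Gaussian kernel, Bochner's theorem gives k(x, y) = E[2 cos(wx + b) cos(wy + b)] with w ~ N(0, 1) and b ~ Uniform[0, 2π], so a Monte Carlo average of random cosine features approximates the kernel:

```python
# Random Fourier feature sketch for a 1-D Gaussian kernel (illustrative;
# not the BAKR implementation). z(x) . z(y) approximates exp(-(x - y)^2 / 2).
import math
import random

def rff(x, ws, bs):
    """Random Fourier feature map for the Gaussian kernel, unit bandwidth."""
    D = len(ws)
    return [math.sqrt(2.0 / D) * math.cos(w * x + b) for w, b in zip(ws, bs)]

random.seed(0)
D = 5000
ws = [random.gauss(0.0, 1.0) for _ in range(D)]              # spectral samples
bs = [random.uniform(0.0, 2.0 * math.pi) for _ in range(D)]  # random phases

x, y = 0.3, 1.1
approx = sum(a * b for a, b in zip(rff(x, ws, bs), rff(y, ws, bs)))
exact = math.exp(-((x - y) ** 2) / 2.0)
print(abs(approx - exact))  # Monte Carlo error, shrinking as 1/sqrt(D)
```

Because the model then lives in this finite-dimensional feature space, its fitted linear coefficients can be mapped back onto the original explanatory variables, which is the mechanism behind the paper's effect size analogs.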

    Randomized Dimension Reduction on Massive Data

    Scalability of statistical estimators is of increasing importance in modern applications, and dimension reduction is often used to extract relevant information from data. A variety of popular dimension reduction approaches can be framed as symmetric generalized eigendecomposition problems. In this paper we outline how taking into account the low-rank structure assumption implicit in these dimension reduction approaches provides both computational and statistical advantages. We adapt recent randomized low-rank approximation algorithms to provide efficient solutions to three dimension reduction methods: Principal Component Analysis (PCA), Sliced Inverse Regression (SIR), and Localized Sliced Inverse Regression (LSIR). A key observation in this paper is that randomization serves a dual role, improving both computational and statistical performance. This point is highlighted in our experiments on real and simulated data.
    Comment: 31 pages, 6 figures. Key words: dimension reduction, generalized eigendecomposition, low-rank, supervised, inverse regression, random projections, randomized algorithms, Krylov subspace methods
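    A minimal sketch of the randomized flavor of these algorithms (our illustration under simplifying assumptions, not the paper's method): a random probe vector followed by subspace (power) iteration recovers the top eigenpair of a symmetric matrix, the rank-1 case of the randomized eigendecompositions used for PCA-style problems:

```python
# Rank-1 randomized subspace iteration (illustrative sketch, not the paper's
# algorithm): a random probe plus repeated multiplication by a symmetric
# matrix converges to its dominant eigenvector.
import random

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def top_eigpair(A, iters=50, seed=0):
    rng = random.Random(seed)
    v = [rng.gauss(0.0, 1.0) for _ in A]       # random probe (rank-1 sketch)
    for _ in range(iters):
        v = matvec(A, v)                       # power / subspace iteration
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]              # renormalize each step
    lam = sum(a * b for a, b in zip(matvec(A, v), v))  # Rayleigh quotient
    return lam, v

A = [[4.0, 1.0], [1.0, 3.0]]                   # small symmetric test matrix
lam, v = top_eigpair(A)
print(lam)  # ~ (7 + sqrt(5)) / 2, the largest eigenvalue of A
```

For rank k > 1 one instead probes with a random n-by-k matrix, orthonormalizes, and diagonalizes the small projected matrix; the paper adapts this template to the generalized eigenproblems underlying PCA, SIR, and LSIR.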

    Integrated design and control of chemical processes. Part I: Review and classification

    This work presents a comprehensive classification of the different methods and procedures for the integrated synthesis, design, and control of chemical processes, based on an extensive review of recent literature. The classification fundamentally differentiates between “projecting methods”, in which controllability is monitored during process design to predict the trade-offs between design and control, and “integrated-optimization methods”, which solve the process design and the control-system design at once within an optimization framework. The latter are reviewed and categorized according to the methods used to evaluate controllability and other related properties, the scope of the design problem, the treatment of uncertainties and perturbations, and, finally, the type of optimization problem formulated and the methods for its resolution.