11 research outputs found

    Sign-Perturbed Sums (SPS) with Asymmetric Noise: Robustness Analysis and Robustification Techniques

    Sign-Perturbed Sums (SPS) is a recently developed finite-sample system identification method that can build exact confidence regions for linear regression problems under mild statistical assumptions. The regions are well-shaped: e.g., they are centred around the least-squares (LS) estimate, star convex, and strongly consistent. One of the main assumptions of SPS is that the distribution of the noise terms is symmetric about zero. This paper analyses how robust SPS is with respect to the violation of this assumption and how it can be robustified against non-symmetric noises. First, some alternative solutions are reviewed; then a robustness analysis is performed, resulting in a robustified version of SPS. We also suggest a modification of SPS, called LAD-SPS, which builds exact confidence regions around the least-absolute-deviations (LAD) estimate instead of the LS estimate. LAD-SPS requires fewer assumptions, as the noise needs only to have a conditionally zero median (w.r.t. the past). Furthermore, that approach can also be robustified using ideas similar to those in the LS-SPS case. Finally, some numerical experiments are presented.
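    As a rough illustration of the mechanism (not the authors' exact construction: the full method uses a shaping matrix and randomized tie-breaking, both omitted here), a minimal sketch of the SPS membership test for a candidate parameter:

    ```python
    import numpy as np

    def sps_indicator(theta, X, y, m=100, q=5, seed=0):
        """Sketch of the SPS membership test for a candidate parameter theta.

        Returns True if theta lies inside the SPS confidence region of
        confidence level 1 - q/m (exact under noise symmetric about zero).
        """
        rng = np.random.default_rng(seed)
        eps = y - X @ theta                       # residuals at the candidate theta
        s0 = np.linalg.norm(X.T @ eps)            # reference (unperturbed) sum
        perturbed = []
        for _ in range(m - 1):
            signs = rng.choice([-1.0, 1.0], size=len(eps))
            perturbed.append(np.linalg.norm(X.T @ (signs * eps)))
        # theta is accepted iff s0 is NOT among the q largest of the m sums
        return bool(sum(s > s0 for s in perturbed) >= q)
    ```

    Under symmetric noise, the reference sum at the true parameter is exchangeable with the perturbed ones, which is what makes the coverage exact for any sample size.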

    On the Poisson Equation of Parameter-Dependent Markov Chains

    The objective of the paper is to revisit a key mathematical technology within the theory of stochastic approximation in a Markovian framework, elaborated in much detail by Benveniste, Métivier, and Priouret (1990): the existence, uniqueness and Lipschitz-continuity of the solutions of parameter-dependent Poisson equations associated with parameter-dependent Markov chains on general state spaces. The setup and the methodology of our investigation are based on a new, elegant stability theory for Markov chains developed by Hairer and Mattingly (2011).
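    Concretely, for a transition kernel P_θ with invariant distribution π_θ and a target function f, the parameter-dependent Poisson equation takes the standard form (sketched here for orientation, not quoted from the paper):

    ```latex
    % Poisson equation associated with the kernel P_\theta and a target f:
    u_\theta(x) - \int u_\theta(y)\, P_\theta(x, \mathrm{d}y)
        = f(x) - \pi_\theta(f),
    \qquad \pi_\theta(f) := \int f(y)\, \pi_\theta(\mathrm{d}y).
    ```

    Under geometric ergodicity, a solution is given by the series u_θ = Σ_{n≥0} (P_θ^n f − π_θ(f)), and the paper's concern is how u_θ behaves as θ varies.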

    Control of rivers with flood avoidance


    Randomized min-max optimization: the exact risk of multiple cost levels

    In this paper, we present a theoretical result that applies to convex optimization problems in the presence of an uncertain stochastic parameter. We consider the min-max sample-based solution, i.e. the min-max solution computed over a finite sample of instances of the uncertain stochastic parameter, and the costs incurred by this solution at the sampled parameter instances. Our goal is to evaluate the risks associated with the various costs, where the risk associated with a cost is the probability that the cost is exceeded when a new uncertainty instance is seen. The theoretical result proven in this paper is that the risks form a random vector whose probability distribution is always an ordered Dirichlet distribution, irrespective of the probability measure of the uncertain stochastic parameter. This evaluation completely characterizes the risks associated with the costs, and represents a full-fledged result on the reliability of the min-max sample-based solution.
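    The flavour of this result can be checked by Monte Carlo on a one-dimensional toy problem; the quadratic cost and the Beta(2, N-1) marginal below are specific to this toy (where exactly two scenarios are active at the min-max solution), not the paper's general statement:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def worst_cost_risk(deltas, new_deltas):
        """Risk of the largest sampled cost for the toy cost c(x, d) = (x - d)**2.

        The min-max solution over the sampled deltas is the midpoint of their
        range; the risk is the fraction of fresh uncertainty instances whose
        cost exceeds the largest cost seen on the sample.
        """
        x_star = (deltas.min() + deltas.max()) / 2.0
        worst = ((x_star - deltas) ** 2).max()
        return ((x_star - new_deltas) ** 2 > worst).mean()

    N, trials = 20, 500
    risks = [worst_cost_risk(rng.normal(size=N), rng.normal(size=20_000))
             for _ in range(trials)]
    mean_risk = float(np.mean(risks))
    # in this toy the marginal of the worst-cost risk is Beta(2, N - 1),
    # so its mean is 2 / (N + 1), whatever the distribution of delta
    ```

    Exceeding the worst sampled cost here means the new delta falls outside the sampled range, so the risk is distribution-free, in the spirit of the paper's theorem.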

    Classification of Solar Wind With Machine Learning

    We present a four-category classification algorithm for the solar wind, based on Gaussian Processes. The four categories are the ones previously adopted in Xu & Borovsky [2015]: ejecta, coronal-hole-origin plasma, streamer-belt-origin plasma, and sector-reversal-origin plasma. The algorithm is trained and tested on a labeled portion of the OMNI dataset. It uses seven inputs: the solar wind speed V_sw, the temperature standard deviation σ_T, the sunspot number R, the f10.7 index, the Alfven speed v_A, the proton specific entropy S_p, and the proton temperature T_p compared to a velocity-dependent expected temperature. The output of the Gaussian Process classifier is a four-element vector containing the probabilities that an event (one reading from the hourly-averaged OMNI database) belongs to each category. The probabilistic nature of the prediction allows for a more informative and flexible interpretation of the results, for instance making it possible to classify events as 'undecided'. The new method has a median accuracy larger than 90% for all categories, even when a small set of data is used for training. The Receiver Operating Characteristic curve and the reliability diagram also demonstrate the excellent quality of this new method. Finally, we use the algorithm to classify a large portion of the OMNI dataset, and we present for the first time transition probabilities between different solar wind categories. Such probabilities represent the 'climatological' statistics that determine the solar wind baseline.
    Comment: accepted in J. Geophys. Re
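    A minimal sketch of such a pipeline, with synthetic stand-ins for the seven OMNI inputs and placeholder labels; the kernel choice, the 0.5 'undecided' threshold, and all data below are illustrative assumptions, not the paper's configuration:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    CLASSES = ["ejecta", "coronal-hole", "streamer-belt", "sector-reversal"]

    # synthetic stand-ins for the seven OMNI inputs
    # (V_sw, sigma_T, R, f10.7, v_A, S_p, T_p ratio) -- placeholder data
    X = StandardScaler().fit_transform(rng.normal(size=(120, 7)))
    labels = rng.integers(0, 4, size=120)        # placeholder labels

    gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                    optimizer=None,  # skip hyperparameter fit for speed
                                    random_state=0).fit(X, labels)

    proba = gpc.predict_proba(X[:5])             # one 4-element probability row per event
    decisions = [CLASSES[k] if p.max() >= 0.5 else "undecided"
                 for k, p in zip(proba.argmax(axis=1), proba)]
    ```

    The per-class probability rows are what enables the 'undecided' option: an event is only labeled when the classifier is sufficiently confident.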

    The wait-and-judge scenario approach applied to antenna array design

    The scenario optimisation approach is a methodology for finding solutions to uncertain convex problems by resorting to a sample of data, whose elements are called “scenarios”. In a min–max set-up, the solution delivered by the scenario approach comes with tight probabilistic guarantees on its risk, defined as the probability that an empirical cost threshold will be exceeded when the scenario-based solution is adopted. While the standard theory of scenario optimisation has related the risk of the data-based solution to the number of optimisation variables, a more recent approach, called the “wait-and-judge” scenario approach, enables the user to assess the risk of the solution in a data-dependent way, based on the number of decisive scenarios (“support scenarios”). The aim of this paper is to illustrate the potential of the wait-and-judge approach for min–max sample-based design, and we shall consider to this purpose an antenna array design problem.
    Carè, Algo; Garatti, Simone; Campi, Marco C.
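    The a-posteriori certificate can be sketched as follows. The polynomial equation below is the wait-and-judge relation from Campi and Garatti's theory as we recall it, so treat the exact coefficients as an assumption to be checked against the original paper:

    ```python
    import math

    def wait_and_judge_eps(k, N, beta=1e-6):
        """Wait-and-judge a-posteriori risk bound (sketch).

        Given k observed support scenarios out of N, returns eps(k) such that
        the risk of the scenario solution exceeds eps(k) with probability at
        most beta.  eps(k) = 1 - t(k), where t(k) is the unique root in (0, 1)
        of  C(N,k) t^(N-k) - beta/(N+1) * sum_{m=k}^{N} C(m,k) t^(m-k) = 0,
        found here by bisection.
        """
        def poly(t):
            lhs = math.comb(N, k) * t ** (N - k)
            rhs = beta / (N + 1) * sum(math.comb(m, k) * t ** (m - k)
                                       for m in range(k, N + 1))
            return lhs - rhs

        lo, hi = 0.0, 1.0        # poly is negative at the root's left, positive right
        for _ in range(100):
            mid = (lo + hi) / 2
            if poly(mid) < 0:
                lo = mid
            else:
                hi = mid
        return 1.0 - lo
    ```

    The bound grows with the number of support scenarios, which is what makes the assessment data-dependent: a solution supported by few scenarios earns a smaller risk certificate.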

    Old and New Challenges in Finite-Sample System Identification

    In 2005, with the publication of the LSCR algorithm (Leave-out Sign-dominant Correlation Regions), a new class of system identification algorithms for constructing confidence regions around the unknown model parameters was introduced. These algorithms had two characterising features that, together, set them apart from the previous literature in system identification: first, the constructed regions were accompanied by probabilistic results, certifying the inclusion of the true system parameters, that were rigorous for any number of data points, that is, the results were non-asymptotic in nature; second, these inclusion results were proven under very limited prior knowledge of the noise affecting the data. In this talk, we outline a few fundamental, general ideas that are at the core of LSCR and its successors, which are known under the names of SPS (Sign-Perturbed Sums), PDMs (Perturbed Dataset Methods) and, most recently, SPCR (Sign-Perturbed Correlation Regions). In the course of the presentation, the main design and application challenges in this field will be discussed. We will also mention some recent directions of investigation in which we are directly involved; these include the relaxation of traditional assumptions, such as knowledge of the true model order, and the exploitation of a-priori knowledge of the system parameter in constructing the confidence region.

    Deterministic continuous-time Virtual Reference Feedback Tuning (VRFT) with application to PID design

    In this paper, we introduce a data-driven control design method that does not rely on a model of the plant. The method is inspired by the Virtual Reference Feedback Tuning approach for data-driven controller tuning, but it is here developed entirely in a deterministic, continuous-time setting. A PID autotuner is then derived from the proposed approach and its effectiveness is tested on an experimental brake-by-wire facility. The final performance is shown to outperform that of a benchmark model-based design method. (C) 2019 Elsevier B.V. All rights reserved.
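    The VRFT idea can be sketched in discrete time (the paper works in continuous time; the first-order reference model, its inversion, and the PID regressor below are illustrative assumptions, not the paper's design):

    ```python
    import numpy as np

    def vrft_pid(u, y, a=0.9, Ts=0.01):
        """Discrete-time VRFT sketch for PID tuning (hypothetical reference model).

        Reference model M: y_m[k+1] = a*y_m[k] + (1-a)*r[k].  The virtual
        reference r is obtained by inverting M on the measured output,
        e = r - y is the virtual tracking error, and the PID gains are the
        least-squares fit of the measured input u onto the PID regressor of e.
        """
        r = (y[1:] - a * y[:-1]) / (1.0 - a)      # virtual reference: r[k] drives y[k+1]
        e = r - y[:-1]                            # virtual tracking error
        phi = np.column_stack([
            e,                                    # proportional term
            Ts * np.cumsum(e),                    # integral term
            np.gradient(e, Ts),                   # derivative term
        ])
        theta, *_ = np.linalg.lstsq(phi, u[:-1], rcond=None)
        return theta                              # (Kp, Ki, Kd)
    ```

    The point of VRFT is that no plant model is identified: the controller that would have reproduced the recorded input from the virtual error is read off directly from the data by least squares.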