
    Optimal Uncertainty Quantification

    We propose a rigorous framework for Uncertainty Quantification (UQ) in which the UQ objectives and the assumptions/information set are brought to the forefront. This framework, which we call \emph{Optimal Uncertainty Quantification} (OUQ), is based on the observation that, given a set of assumptions and information about the problem, there exist optimal bounds on uncertainties: these are obtained as values of well-defined optimization problems corresponding to extremizing probabilities of failure, or of deviations, subject to the constraints imposed by the scenarios compatible with the assumptions and information. In particular, this framework does not implicitly impose inappropriate assumptions, nor does it repudiate relevant information. Although OUQ optimization problems are extremely large, we show that under general conditions they have finite-dimensional reductions. As an application, we develop \emph{Optimal Concentration Inequalities} (OCI) of Hoeffding and McDiarmid type. Surprisingly, these results show that uncertainties in input parameters, which propagate to output uncertainties in the classical sensitivity analysis paradigm, may fail to do so if the transfer functions (or probability distributions) are imperfectly known. We show how, for hierarchical structures, this phenomenon may lead to the non-propagation of uncertainties or information across scales. In addition, a general algorithmic framework is developed for OUQ and is tested on the Caltech surrogate model for hypervelocity impact and on the seismic safety assessment of truss structures, suggesting the feasibility of the framework for important complex systems. The introduction of this paper provides both an overview of the paper and a self-contained mini-tutorial about basic concepts and issues of UQ.
    Comment: 90 pages. Accepted for publication in SIAM Review (Expository Research Papers). See SIAM Review for higher quality figures.
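
    As a schematic illustration of the optimization problems the abstract refers to (the notation here is illustrative rather than a reproduction of the paper's), the sharpest bounds on a probability of failure compatible with the information set take the form

        \mathcal{U}(\mathcal{A}) \;=\; \sup_{(f,\mu)\in\mathcal{A}} \mu\big[f(X)\ge a\big],
        \qquad
        \mathcal{L}(\mathcal{A}) \;=\; \inf_{(f,\mu)\in\mathcal{A}} \mu\big[f(X)\ge a\big],

    where \mathcal{A} is the set of (response function, input distribution) pairs compatible with the stated assumptions, a is the failure threshold, and the interval [\mathcal{L}(\mathcal{A}), \mathcal{U}(\mathcal{A})] is the tightest statement about the failure probability that the available information can justify.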

    Certifying and removing disparate impact

    What does it mean for an algorithm to be biased? In U.S. law, unintentional bias is encoded via disparate impact, which occurs when a selection process has widely different outcomes for different groups, even as it appears to be neutral. This legal determination hinges on a definition of a protected class (ethnicity, gender, religious practice) and an explicit description of the process. When the process is implemented using computers, determining disparate impact (and hence bias) is harder. It might not be possible to disclose the process. In addition, even if the process is open, it might be hard to elucidate in a legal setting how the algorithm makes its decisions. Instead of requiring access to the algorithm, we propose making inferences based on the data the algorithm uses. We make four contributions to this problem. First, we link the legal notion of disparate impact to a measure of classification accuracy that, while known, has received relatively little attention. Second, we propose a test for disparate impact based on analyzing the information leakage of the protected class from the other data attributes. Third, we describe methods by which data might be made unbiased. Finally, we present empirical evidence supporting the effectiveness of our test for disparate impact and our approach for both masking bias and preserving relevant information in the data. Interestingly, our approach resembles some actual selection practices that have recently received legal scrutiny.
    Comment: Extended version of paper accepted at the 2015 ACM SIGKDD Conference on Knowledge Discovery and Data Mining
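
    The legal notion referenced here is commonly operationalized as the "4/5 rule": a process is suspect if the protected group's selection rate falls below 80% of the unprotected group's. A minimal sketch of that ratio follows; it is background for the abstract, not the paper's certification test, which instead asks how well the protected attribute can be predicted from the remaining attributes.

        # Minimal sketch of the "4/5 rule" disparate-impact ratio; not the paper's
        # certification procedure. The example data below are hypothetical.
        from typing import Sequence

        def disparate_impact_ratio(selected: Sequence[int], protected: Sequence[bool]) -> float:
            """Protected group's selection rate divided by the unprotected group's;
            values below 0.8 are conventionally taken as evidence of disparate impact."""
            sel_prot = sum(s for s, p in zip(selected, protected) if p)
            sel_unprot = sum(s for s, p in zip(selected, protected) if not p)
            n_prot = sum(protected)
            n_unprot = len(protected) - n_prot
            return (sel_prot / n_prot) / (sel_unprot / n_unprot)

        # Hypothetical example: 3 of 10 selected in the protected group,
        # 6 of 10 in the unprotected group -> ratio 0.5, below the 0.8 threshold.
        selected  = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0] + [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
        protected = [True] * 10 + [False] * 10
        print(disparate_impact_ratio(selected, protected))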

    Active Sampling-based Binary Verification of Dynamical Systems

    Nonlinear, adaptive, or otherwise complex control techniques are increasingly relied upon to ensure the safety of systems operating in uncertain environments. However, the nonlinearity of the resulting closed-loop system complicates verification that the system does in fact satisfy those requirements at all possible operating conditions. While analytical proof-based techniques and finite abstractions can be used to provably verify the closed-loop system's response at different operating conditions, they often produce conservative approximations due to restrictive assumptions and are difficult to construct in many applications. In contrast, popular statistical verification techniques relax the restrictions and instead rely upon simulations to construct statistical or probabilistic guarantees. This work presents a data-driven statistical verification procedure that instead constructs statistical learning models from simulated training data to separate the set of possible perturbations into "safe" and "unsafe" subsets. Binary evaluations of closed-loop system requirement satisfaction at various realizations of the uncertainties are obtained through temporal logic robustness metrics, which are then used to construct predictive models of requirement satisfaction over the full set of possible uncertainties. As the accuracy of these predictive statistical models is inherently coupled to the quality of the training data, an active learning algorithm selects additional sample points in order to maximize the expected change in the data-driven model and thus, indirectly, minimize the prediction error. Various case studies demonstrate the closed-loop verification procedure and highlight improvements in prediction error over both existing analytical and statistical verification techniques.
    Comment: 23 pages
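
    The sketch below shows the general shape of such an active-sampling loop. It is illustrative only: the Gaussian-process classifier and the uncertainty-based query rule stand in for the paper's statistical models and expected-model-change criterion, and simulate_and_check is a hypothetical placeholder for running the closed-loop simulation and scoring the temporal logic requirement.

        # Illustrative active-sampling verification loop; the classifier and the
        # acquisition rule are simplified stand-ins for the paper's method, and
        # simulate_and_check is a hypothetical oracle.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessClassifier

        def simulate_and_check(theta: np.ndarray) -> int:
            """Placeholder for simulating the closed loop at perturbation theta and
            evaluating the requirement: 1 = satisfied ("safe"), 0 = violated."""
            return int(np.linalg.norm(theta) < 1.0)

        rng = np.random.default_rng(0)
        candidates = rng.uniform(-2.0, 2.0, size=(500, 2))   # possible perturbations
        candidates[0] = [0.0, 0.0]                            # guarantee one safe and
        candidates[1] = [2.0, 2.0]                            # one unsafe initial label
        idx = [0, 1] + list(rng.choice(np.arange(2, len(candidates)), 8, replace=False))
        labels = [simulate_and_check(candidates[i]) for i in idx]

        for _ in range(20):                                   # active-learning rounds
            clf = GaussianProcessClassifier().fit(candidates[idx], labels)
            proba = clf.predict_proba(candidates)[:, 1]
            unlabeled = [i for i in range(len(candidates)) if i not in idx]
            query = min(unlabeled, key=lambda i: abs(proba[i] - 0.5))  # most uncertain
            idx.append(query)
            labels.append(simulate_and_check(candidates[query]))

        print("predicted safe fraction:", float((clf.predict(candidates) == 1).mean()))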

    An improved approach for flight readiness assessment

    An improved methodology for quantitatively evaluating failure risk for a spaceflight system in order to assess flight readiness is presented. This methodology is of particular value when information relevant to failure prediction, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is limited. In this approach, engineering analysis models that characterize specific failure modes based on the physics and mechanics of the failure phenomena are used in a prescribed probabilistic structure to generate a failure probability distribution that is modified by test and flight experience in a Bayesian statistical procedure. The probabilistic structure and statistical methodology are generally applicable to any failure mode for which quantitative engineering analysis can be employed to characterize the failure phenomenon, and are particularly well suited for use under the constraints on information availability that are typical of such spaceflight systems as the Space Shuttle and planetary spacecraft.
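
    As an illustrative sketch of the Bayesian step described above (a conjugate Beta-Binomial example chosen for concreteness, not the report's specific probabilistic structure):

        p \sim \mathrm{Beta}(\alpha_0, \beta_0)
        \quad\Longrightarrow\quad
        p \mid k \text{ failures in } n \text{ trials} \;\sim\; \mathrm{Beta}(\alpha_0 + k,\ \beta_0 + n - k),

    where the prior (\alpha_0, \beta_0) encodes the failure probability p implied by the engineering analysis models, and the posterior mean (\alpha_0 + k)/(\alpha_0 + \beta_0 + n) shifts toward the observed failure rate as test and flight experience accumulate.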

    Rotorcraft aviation icing research requirements: Research review and recommendations

    The status of rotorcraft icing evaluation techniques and ice protection technology was assessed. Recommendations are made for near- and long-term icing programs that address the needs of industry. These recommended programs are based on a consensus of the major U.S. helicopter companies. Specific activities currently planned or underway by NASA, FAA, and DOD are reviewed to determine their relevance to the overall research requirements. New programs, taking advantage of current activities, are recommended to meet the long-term needs for rotorcraft icing certification.

    Effectiveness of organic certification: a study on an Italian organic certificator's data

    The aim of this paper is to implement risk-based models for the inspection procedures used in organic certification. In particular, we analyse the relationship between the type of sanction a farm receives and the farm's structure and production, with the goal of identifying potential risk factors.
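
    One simple way to screen for such risk factors (an illustrative sketch only; the abstract does not specify the model used, and all column names and values below are hypothetical) is a multinomial logistic regression of sanction type on structural covariates:

        # Hypothetical example: relate sanction type to farm structure to flag
        # potential risk factors. Data and variable names are made up for
        # illustration; this is not the paper's model.
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        farms = pd.DataFrame({
            "hectares":  [12, 45, 8, 60, 22, 5, 90, 33],
            "n_crops":   [3, 1, 4, 2, 2, 5, 1, 3],
            "livestock": [0, 1, 0, 1, 0, 0, 1, 1],
            "sanction":  ["none", "warning", "none", "suspension",
                          "none", "none", "warning", "none"],
        })
        X = farms[["hectares", "n_crops", "livestock"]]
        y = farms["sanction"]

        model = LogisticRegression(max_iter=1000).fit(X, y)
        # Coefficients indicate which structural features raise the odds of each
        # sanction class relative to the others.
        print(dict(zip(model.classes_, model.coef_.tolist())))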

    Multilevel Models with Stochastic Volatility for Repeated Cross-Sections: An Application to Tribal Art Prices

    In this paper we introduce a multilevel specification with stochastic volatility for repeated cross-sectional data. Modelling the time dynamics in repeated cross-sections requires a suitable adaptation of the multilevel framework in which the individuals/items are modelled at the first level and the time component appears at the second level. We perform maximum likelihood estimation by means of a nonlinear state space approach combined with Gauss-Legendre quadrature methods to approximate the likelihood function. We apply the model to the first database of tribal art items sold in the most important auction houses worldwide. The model properly accounts for the heteroscedastic and autocorrelated volatility observed in the data and has superior forecasting performance. It also provides valuable information on market trends and on the predictability of prices that can be used by art market stakeholders.
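
    A stylized version of such a specification (a sketch to fix ideas; the paper's exact equations may differ) is

        y_{it} = \alpha_t + x_{it}'\beta + \varepsilon_{it}, \qquad
        \varepsilon_{it} \sim N\!\big(0,\ e^{h_t}\big),

        \alpha_t = \alpha_{t-1} + u_t, \qquad
        h_t = \mu + \phi\,(h_{t-1} - \mu) + \eta_t,

    where the items i sold in cross-section t form the first level, the common time effect \alpha_t and the log-volatility h_t form the latent second level, and the resulting nonlinear state space is what must be integrated out (here approximated with Gauss-Legendre quadrature) to evaluate the likelihood.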

    A Quiet Helicopter for Air Taxi Operations

    NASA is exploring rotorcraft designs for VTOL air taxi operations, also known as urban air mobility (UAM) or on-demand mobility (ODM) applications. Several concept vehicles have been developed, intended to focus and guide NASA research activities in support of aircraft development for this emerging market. This paper examines a single main-rotor helicopter designed specifically for low-noise air taxi operations. Based on demonstrated technology, the aircraft uses a turboshaft engine with a sound-absorbing installation and the NOTAR anti-torque system to eliminate tail-rotor noise; consequently, the noise and annoyance of the aircraft are dominated by the main rotor. Several design parameters are explored to reduce the noise, including rotor tip speed, blade geometry, and higher-harmonic control. Commensurate with the level of design detail, the noise is calculated for compact loading and thickness sources on the rotating blades. The metric is the reduction of noise at the helicopter certification conditions (takeoff, flyover, and approach), relative to a baseline aircraft with typical (high) tip speed, conventional blade planform, and no higher-harmonic control.
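
    Two elementary relations behind the tip-speed trade studied here (standard rotor-acoustics background, not results from the paper) are the hover tip Mach number and the blade-passage frequency,

        M_{\mathrm{tip}} = \frac{\Omega R}{a_\infty}, \qquad
        f_{\mathrm{BPF}} = \frac{N_b\,\Omega}{2\pi},

    so reducing the tip speed \Omega R lowers both the thickness-noise source strength, which grows rapidly with M_{\mathrm{tip}}, and the blade-passage frequency at which the rotor's tonal noise is concentrated.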