    Online Detection and Modeling of Safety Boundaries for Aerospace Application Using Bayesian Statistics

    The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes that incorporates domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.
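
    The abstract gives no implementation detail, but the general idea of particle-filter-based boundary estimation with a boundary-aware (active) sampling rule can be sketched roughly as below. The 1-D threshold setting, the noise level, the acquisition rule and all names are illustrative assumptions, not the authors' actual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D setting: the true safety boundary is a threshold on a
# single parameter, and experiments return noisy safe/unsafe labels.
TRUE_BOUNDARY = 0.62
LABEL_NOISE = 0.05  # probability that a label is flipped

def query_system(x):
    """Simulated experiment: 1 = unsafe, 0 = safe (with label noise)."""
    label = 1.0 if x > TRUE_BOUNDARY else 0.0
    if rng.random() < LABEL_NOISE:
        label = 1.0 - label
    return label

# Particle filter over the unknown boundary location.
N = 2000
particles = rng.uniform(0.0, 1.0, N)   # prior: boundary anywhere in [0, 1]
weights = np.full(N, 1.0 / N)

for step in range(40):
    # Active learning: test near the current boundary estimate
    # (a simple stand-in for a boundary-aware acquisition metric).
    estimate = np.average(particles, weights=weights)
    x = float(np.clip(estimate + 0.05 * rng.standard_normal(), 0.0, 1.0))
    y = query_system(x)

    # Likelihood of the observed label under each candidate boundary.
    predicted = (x > particles).astype(float)
    lik = np.where(predicted == y, 1.0 - LABEL_NOISE, LABEL_NOISE)
    weights *= lik
    weights /= weights.sum()

    # Resample (with jitter) when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx] + 0.01 * rng.standard_normal(N)
        weights = np.full(N, 1.0 / N)

print(f"estimated boundary location: {np.average(particles, weights=weights):.3f}")
```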

    SIMULATION OF A SCHOOL CANTEEN TO UNDERSTAND MEAL DURATION IMPACT ON FOOD WASTE

    System simulation is one approach to understanding business processes or to explaining them to other people. It is an effective decision-making tool that provides data-driven conclusions based on system modelling and experiments. This paper presents simulation results for a school canteen. The aim of the research was to investigate the relation between the amount of food waste and meal duration. The proposed simulation was based on business process analysis, business process modelling, a Monte Carlo method and expert knowledge. The frequency distributions were constructed from observations of children's meal durations recorded by their mothers. Involving mothers in the research is a valuable citizen science approach, as it also helps them better understand their children's meal preferences and habits. Therefore, a questionnaire for citizens was developed, which can be applied to collect statistical data for improving the model's accuracy and extending it.
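
    As a rough illustration of the Monte Carlo step described above, the sketch below draws meal durations from an assumed frequency distribution and estimates the share of food left uneaten for different allotted meal times. The distribution and the waste rule are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical frequency distribution of required meal durations (minutes),
# e.g. as reported by parents in a questionnaire.
durations = np.array([10, 15, 20, 25, 30])
frequencies = np.array([0.10, 0.30, 0.35, 0.20, 0.05])

def simulate_waste(allotted_minutes, n_children=10_000):
    """Monte Carlo estimate of the mean fraction of a meal left uneaten."""
    needed = rng.choice(durations, size=n_children, p=frequencies)
    # Simple assumption: a child who runs out of time wastes the
    # uneaten fraction of the meal.
    wasted_fraction = np.clip(1.0 - allotted_minutes / needed, 0.0, 1.0)
    return wasted_fraction.mean()

for allotted in (15, 20, 25, 30):
    print(f"{allotted} min allotted -> mean wasted fraction "
          f"{simulate_waste(allotted):.2%}")
```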

    Systematic Correlation Matrix Evaluation (SCoMaE) – a bottom–up, science-led approach to identifying indicators

    This study introduces the Systematic Correlation Matrix Evaluation (SCoMaE) method, a bottom–up approach which combines expert judgment and statistical information to systematically select transparent, nonredundant indicators for a comprehensive assessment of the state of the Earth system. The method consists of two basic steps: (1) the calculation of a correlation matrix among variables relevant for a given research question and (2) the systematic evaluation of the matrix, to identify clusters of variables with similar behavior and respective mutually independent indicators. Optional further analysis steps include (3) the interpretation of the identified clusters, enabling a learning effect from the selection of indicators, (4) testing the robustness of identified clusters with respect to changes in forcing or boundary conditions, (5) enabling a comparative assessment of varying scenarios by constructing and evaluating a common correlation matrix, and (6) the inclusion of expert judgment, for example, to prescribe indicators, to allow for considerations other than statistical consistency. The example application of the SCoMaE method to Earth system model output forced by different CO2 emission scenarios reveals the necessity of reevaluating indicators identified in a historical scenario simulation for an accurate assessment of an intermediate–high, as well as a business-as-usual, climate change scenario simulation. This necessity arises from changes in prevailing correlations in the Earth system under varying climate forcing. For a comparative assessment of the three climate change scenarios, we construct and evaluate a common correlation matrix, in which we identify robust correlations between variables across the three considered scenarios.
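
    A minimal sketch of the two basic steps (correlation matrix, then clustering of strongly correlated variables and selection of one indicator per cluster) might look as follows. The synthetic data, the hierarchical clustering method and the distance threshold are assumptions made for illustration, not the SCoMaE implementation itself.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(2)

# Hypothetical model output: rows are time steps, columns are candidate
# indicator variables, with two deliberately redundant pairs.
n_time, n_vars = 200, 8
data = rng.standard_normal((n_time, n_vars))
data[:, 1] = data[:, 0] + 0.1 * rng.standard_normal(n_time)   # redundant pair
data[:, 4] = -data[:, 3] + 0.1 * rng.standard_normal(n_time)  # anti-correlated pair

# Step 1: correlation matrix among candidate variables.
corr = np.corrcoef(data, rowvar=False)

# Step 2: cluster variables whose |correlation| is high, then keep
# one representative (indicator) per cluster.
distance = 1.0 - np.abs(corr)
np.fill_diagonal(distance, 0.0)
link = linkage(squareform(distance, checks=False), method="average")
clusters = fcluster(link, t=0.3, criterion="distance")

indicators = [np.where(clusters == c)[0][0] for c in np.unique(clusters)]
print("cluster assignment:", clusters)
print("selected indicator columns:", indicators)
```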

    Identification and interpretation of patterns in rocket engine data

    A prototype software system was constructed to detect anomalous Space Shuttle Main Engine (SSME) behavior in the early stages of fault development, significantly earlier than the indication provided by either the redline detection mechanism or human expert analysis. The major task of the research project was to analyze ground test data, to identify patterns associated with anomalous engine behavior, and to develop a pattern identification and detection system on the basis of this analysis. A prototype expert system, developed on both a PC and a Symbolics 3670 Lisp machine for detecting anomalies in turbopump vibration data, was checked with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine. A neural network method was also applied to supplement the statistical method used in the prototype system and to investigate the feasibility of detecting anomalies in SSME turbopump vibration. In most cases the anomalies detected by the expert system agree with those reported by NASA. With the neural network approach, the results show a detection rate higher than 95 percent in identifying normal or abnormal running conditions, based on experimental data as well as numerical simulation.
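
    The abstract does not describe the detection logic, but the contrast between a fixed redline and a statistical early-warning detector can be illustrated with a hypothetical sketch like the one below. The synthetic vibration signal, thresholds and baseline window are invented for illustration and do not reproduce the prototype expert system or the neural network.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic turbopump-like vibration amplitude: steady noise with a
# slowly growing anomaly injected late in the simulated ground test.
n = 2000
signal = rng.normal(1.0, 0.05, n)
signal[1400:] += np.linspace(0.0, 0.6, n - 1400)   # incipient fault

# Statistical detector: flag the first sample whose z-score against a
# baseline window exceeds a threshold; compare with a fixed redline.
baseline = signal[:500]
mu, sigma = baseline.mean(), baseline.std()
z = (signal - mu) / sigma
alarm = int(np.argmax(z > 5.0))          # first exceedance of 5 sigma
redline = int(np.argmax(signal > 1.5))   # crude fixed redline level

print(f"statistical alarm at sample {alarm}, "
      f"fixed redline trips at sample {redline}")
```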

    Modelling the balance of care: Impact of an evidence-informed policy on a mental health ecosystem

    Major efforts worldwide have been made to provide balanced Mental Health (MH) care. Any integrated MH ecosystem includes hospital and community-based care, highlighting the role of outpatient care in reducing relapses and readmissions. This study aimed (i) to identify potential expert-based causal relationships between inpatient and outpatient care variables, (ii) to assess them by using statistical procedures, and finally (iii) to assess the potential impact of a specific policy enhancing the MH care balance on real ecosystem performance. Causal relationships (Bayesian network) between inpatient and outpatient care variables were defined by expert knowledge and confirmed by using multivariate linear regression (generalized least squares). Based on the Bayesian network and regression results, a decision support system that combines data envelopment analysis, Monte Carlo simulation and fuzzy inference was used to assess the potential impact of the designed policy. As expected, there were strong statistical relationships between outpatient and inpatient care variables, which preliminarily confirmed their potential and a priori causal nature. The global impact of the proposed policy on the ecosystem was positive in terms of efficiency assessment, stability and entropy. To the best of our knowledge, this is the first study that formalized expert-based causal relationships between inpatient and outpatient care variables. These relationships, structured by a Bayesian network, can be used for designing evidence-informed policies trying to balance MH care provision. By integrating causal models and statistical analysis, decision support systems are useful tools to support evidence-informed planning and decision making, as they allow us to predict the potential impact of specific policies on the ecosystem prior to their real application, reducing the risk and considering the population’s needs and scientific findings.
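
    The confirmatory regression step described above can be sketched roughly as follows; the variables, the effect size and the use of generalized least squares with an identity covariance (i.e. equivalent to ordinary least squares) are illustrative assumptions, not the study's actual model or data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Hypothetical small-area data: outpatient contact rate and inpatient
# readmission rate per catchment area (purely illustrative numbers).
n_areas = 60
outpatient_rate = rng.uniform(5, 25, n_areas)   # contacts per 1000 inhabitants
readmission_rate = 12.0 - 0.3 * outpatient_rate + rng.normal(0, 1.0, n_areas)

# Confirmatory step in the spirit of the study: regress an inpatient
# variable on an outpatient variable with GLS (identity covariance here,
# for simplicity of the sketch).
X = sm.add_constant(outpatient_rate)
model = sm.GLS(readmission_rate, X).fit()
print(model.summary().tables[1])
```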

    Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.
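
    For intuition about the system size expansion route mentioned above, here is a hypothetical sketch for a simple birth-death process, comparing the steady-state mean and variance predicted by the Linear Noise Approximation with a quick Gillespie (SSA) run. The rate constants and the process are illustrative and are unrelated to iNA's implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Birth-death process: 0 -> X at rate k0, X -> 0 at rate k1 * n.
k0, k1 = 20.0, 0.5

# Linear Noise Approximation: the rate equation gives the mean and a
# Lyapunov-type equation gives the variance of the fluctuations; at
# steady state both reduce to closed forms for this process.
phi_ss = k0 / k1                          # steady-state mean
var_lna = (k0 + k1 * phi_ss) / (2 * k1)   # steady-state LNA variance

def ssa(t_end=2000.0, t_burn=100.0):
    """Gillespie simulation returning time-averaged mean and variance."""
    t, n = 0.0, 0
    w = wn = wn2 = 0.0
    while t < t_end:
        birth, death = k0, k1 * n
        total = birth + death
        dt = rng.exponential(1.0 / total)
        if t > t_burn:                    # accumulate time-weighted moments
            w += dt
            wn += n * dt
            wn2 += n * n * dt
        t += dt
        n += 1 if rng.random() < birth / total else -1
    mean = wn / w
    return mean, wn2 / w - mean ** 2

mean_ssa, var_ssa = ssa()
print(f"LNA mean/variance: {phi_ss:.1f} / {var_lna:.1f}")
print(f"SSA mean/variance: {mean_ssa:.1f} / {var_ssa:.1f}")
```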

    Flight Guardian: Autonomous Flight Safety Improvement by Monitoring Aircraft Cockpit Instruments

    During in-flight emergencies, a pilot’s workload increases significantly, and it is often during this period of increased stress that human errors occur that consequently diminish flight safety. Research studies indicate that many plane crashes can be attributed to ineffective cockpit instrument monitoring by the pilot. This manuscript describes the development of the Flight Guardian (FG) system, the first of its kind, which aims to provide efficient flight-deck awareness to improve flight safety while assisting the pilot in abnormal situations. The system is intended to be used in older aircraft that cannot easily or cost-effectively be modified with modern digital avionic systems. An important feature of the FG system is that it is not physically connected to the aircraft, which avoids any impact on airworthiness or the need for re-certification. For the first time, a combination of techniques including video analysis, knowledge representation, and machine belief representation is used to build a novel flight-deck warning system. The prototype system is tested in both simulation-based laboratory and real flight environments under the guidance of expert pilots. The overall system performance is evaluated using statistical analysis of experimental results, which demonstrate the robustness of the proposed methodology in terms of automated warning generation in hazardous situations.

    Investigation of Air Transportation Technology at Princeton University, 1989-1990

    The Air Transportation Technology Program at Princeton University proceeded along six avenues during the past year: microburst hazards to aircraft; machine-intelligent, fault tolerant flight control; computer aided heuristics for piloted flight; stochastic robustness for flight control systems; neural networks for flight control; and computer aided control system design. These topics are briefly discussed, and an annotated bibliography of publications that appeared between January 1989 and June 1990 is given.

    Merging expert and empirical data for rare event frequency estimation: pool homogenisation for empirical Bayes models

    Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with greater pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
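
    A minimal numerical sketch of the idea, under the Homogeneous Poisson Process assumption stated above, is given below. The counts, exposures, elicited homogenisation factors, the direction in which the factors are applied, and the moment-matched gamma prior are all illustrative assumptions rather than the paper's exact estimator.

```python
import numpy as np

# Hypothetical pool of similar-precursor events: observed counts, exposure
# times (e.g. operating years) and expert-elicited homogenisation factors.
counts    = np.array([3.0, 1.0, 5.0, 2.0])
exposures = np.array([40.0, 25.0, 80.0, 30.0])
h_factors = np.array([1.0, 0.5, 2.0, 0.8])   # elicited scaling constants

# Target rare event: very little direct data of its own.
target_count, target_exposure = 1.0, 60.0

# Homogenised pool frequencies (the convention of multiplying by the
# factor is an assumption made purely for illustration).
pooled_rates = h_factors * counts / exposures

# Moment-matched gamma prior from the homogenised pool (a common
# Empirical Bayes device under a Homogeneous Poisson Process model).
m, v = pooled_rates.mean(), pooled_rates.var(ddof=1)
alpha, beta = m * m / v, m / v

# Posterior mean: a weighted average of the pool-based estimate and the
# target event's own maximum-likelihood estimate.
posterior_mean = (alpha + target_count) / (beta + target_exposure)
weight_on_data = target_exposure / (beta + target_exposure)

print(f"homogenised pool mean rate: {m:.4f} per unit exposure")
print(f"Empirical Bayes estimate:   {posterior_mean:.4f} "
      f"(weight on target data: {weight_on_data:.2f})")
```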