
    Statistical validation of simulation models: A case study

    Rigorous statistical validation requires that the responses of the model and the real system have the same expected values. However, the modeled and actual responses are not comparable if they are obtained under different scenarios (environmental conditions). Moreover, data on the real system may be unavailable; sensitivity analysis can then be applied to find out whether the model inputs affect the model outputs in ways that agree with the experts' intuition. Not only the total model but also its modules may be subjected to such sensitivity analyses. This article illustrates these issues through a case study, namely a simulation model for the use of sonar to search for mines on the sea bottom. The methodology, however, applies to models in general.
    Keywords: simulation models; statistical validation; statistics
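
    To make the two validation routes above concrete, here is a minimal Python sketch: a Welch two-sample t-test on the expected values when field data exist, and a one-at-a-time sensitivity check when they do not. The response arrays and the stand-in model function are hypothetical illustrations, not taken from the paper's sonar study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
real_responses = rng.normal(loc=5.0, scale=1.0, size=30)   # field measurements (hypothetical)
model_responses = rng.normal(loc=5.2, scale=1.1, size=30)  # simulation outputs (hypothetical)

# Route 1: test whether model and real system share the same expected value,
# using responses obtained under the same scenario.
t_stat, p_value = stats.ttest_ind(real_responses, model_responses, equal_var=False)
print(f"Welch t-test: t={t_stat:.3f}, p={p_value:.3f}")  # large p: no evidence of bias

# Route 2 (no field data): one-at-a-time sensitivity analysis -- perturb each
# input and check that the direction and size of the effect match expert intuition.
def model(depth, noise):
    """Stand-in for the sonar search model: detection rate vs. inputs (hypothetical)."""
    return 1.0 / (1.0 + 0.1 * depth + 0.5 * noise)

baseline = {"depth": 20.0, "noise": 1.0}
for name in baseline:
    bumped = dict(baseline, **{name: 1.10 * baseline[name]})  # +10% perturbation
    effect = model(**bumped) - model(**baseline)
    print(f"{name}: +10% input -> {effect:+.4f} change in output")
```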

    Variational cross-validation of slow dynamical modes in molecular kinetics

    Markov state models (MSMs) are a widely used method for approximating the eigenspectrum of the molecular dynamics propagator, yielding insight into the long-timescale statistical kinetics and slow dynamical modes of biomolecular systems. However, the lack of a unified theoretical framework for choosing between alternative models has hampered progress, especially for non-experts applying these methods to novel biological systems. Here, we consider cross-validation with a new objective function for estimators of these slow dynamical modes, a generalized matrix Rayleigh quotient (GMRQ), which measures the ability of a rank-m projection operator to capture the slow subspace of the system. It is shown that a variational theorem bounds the GMRQ from above by the sum of the first m eigenvalues of the system's propagator, but that this bound can be violated when the requisite matrix elements are estimated subject to statistical uncertainty. This overfitting can be detected and avoided through cross-validation. These results make it possible to construct Markov state models for protein dynamics in a way that appropriately captures the tradeoff between systematic and statistical errors.
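
    As a flavor of the scoring scheme, the sketch below computes a GMRQ-style cross-validation score under standard variational-approach conventions: C is the empirical time-lagged correlation matrix, S the instantaneous covariance, and V holds the m slow modes estimated on training data. The synthetic trajectories and lag value are assumptions for illustration, not the paper's protocol.

```python
import numpy as np
from scipy.linalg import eigh

def correlation_matrices(traj, lag):
    """Empirical time-lagged correlation C and covariance S from a (T x n) trajectory."""
    x0, xt = traj[:-lag], traj[lag:]
    C = x0.T @ xt / len(x0)
    S = x0.T @ x0 / len(x0)
    return 0.5 * (C + C.T), S  # symmetrize C

def gmrq(V, C, S):
    """Generalized matrix Rayleigh quotient of projection V: tr[(V'CV)(V'SV)^{-1}]."""
    return np.trace(np.linalg.solve(V.T @ S @ V, V.T @ C @ V))

rng = np.random.default_rng(1)
make_traj = lambda: 0.1 * rng.standard_normal((5000, 4)).cumsum(axis=0) + rng.standard_normal((5000, 4))
train, test = make_traj(), make_traj()

# Estimate the top-m slow modes on training data (generalized eigenproblem).
m, lag = 2, 10
C_tr, S_tr = correlation_matrices(train, lag)
vals, vecs = eigh(C_tr, S_tr)
V = vecs[:, np.argsort(vals)[::-1][:m]]

C_te, S_te = correlation_matrices(test, lag)
print("train GMRQ:", gmrq(V, C_tr, S_tr))  # equals the sum of the top-m estimated eigenvalues
print("test  GMRQ:", gmrq(V, C_te, S_te))  # a drop here signals statistical overfitting
```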

    A HIERARCHICAL FRAMEWORK FOR STATISTICAL MODEL VALIDATION OF ENGINEERED SYSTEMS

    As the role of computational models has increased, the accuracy of computational results has been of great concern to engineering decision-makers. To address a growing concern about the predictive capability of computational models, this dissertation proposes a generic model validation framework with four research objectives: Objective 1, to develop a hierarchical framework for statistical model validation that is applicable to various computational models of engineered products (or systems); Objective 2, to advance a model calibration technique that can help improve the predictive capability of computational models in a statistical manner; Objective 3, to build a validity check for a computational model with limited experimental data; and Objective 4, to demonstrate the feasibility and effectiveness of the proposed validation framework with five engineering problems requiring different experimental resources and predictive computational models: (a) cellular phone, (b) tire tread block, (c) thermal challenge problem, (d) constrained-layer damping structure and (e) energy harvesting device.
    The validation framework consists of three activities: validation planning (top-down), validation execution (bottom-up) and virtual qualification. The validation planning activity requires knowledge about physics-of-failure (PoF) mechanisms and/or system performances of interest. This knowledge helps decompose an engineered system into subsystems and/or components such that PoF mechanisms or system performances of interest can be decomposed accordingly. The validation planning activity takes a top-down approach and identifies vital tests and predictive computational models, which contain both known and unknown model input variables. The validation execution activity, in contrast, takes a bottom-up approach, improving the predictive capability of the computational models from the lowest level to the highest using a statistical calibration technique. This technique compares experimental results with predictions from the computational model to determine the best statistical distributions of unknown random variables while maximizing the likelihood function. As the predictive capability of a computational model at a lower hierarchical level is improved, the enhanced model can be fused into the model at the next higher hierarchical level, and the validation execution activity continues for the model at that level.
    After statistical model calibration, the validity of the calibrated model must be assessed; a hypothesis-test-based validity check was therefore developed to measure and evaluate the degree of mismatch between predicted and observed results while accounting for the uncertainty caused by limited experimental data. Once the model is deemed valid, virtual qualification can be executed in a statistical sense for new product development. With five case studies, this dissertation demonstrates that the validation framework is applicable to diverse classes of engineering problems for improving the predictive capability of computational models, assessing the fidelity of computational models, and assisting rational decision making on new design alternatives in the product development process.
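
    A minimal sketch of the statistical calibration step described above, assuming a hypothetical computational model g(theta) with a single unknown random input theta ~ N(mu, sigma): the parameters (mu, sigma) are chosen to maximize the likelihood of the experimental observations under the Monte Carlo propagated output distribution. Model, data and parameter names are all illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

def g(theta):
    """Stand-in computational model (hypothetical)."""
    return 3.0 * np.sqrt(np.abs(theta)) + 1.0

rng = np.random.default_rng(2)
true_theta = rng.normal(4.0, 0.5, size=40)                  # "true" unknown input
experiments = g(true_theta) + rng.normal(0, 0.05, size=40)  # synthetic test data

z = rng.standard_normal(4000)  # common random numbers keep the objective smooth

def neg_log_likelihood(params):
    mu, log_sigma = params
    y = g(mu + np.exp(log_sigma) * z)  # Monte Carlo propagation through the model
    # Approximate the output distribution as normal and score the experiments.
    return -stats.norm(y.mean(), y.std() + 1e-9).logpdf(experiments).sum()

result = optimize.minimize(neg_log_likelihood, x0=[1.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"calibrated theta ~ N({mu_hat:.2f}, {sigma_hat:.2f})")  # true values: N(4.00, 0.50)
```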

    Approximate cross-validation formula for Bayesian linear regression

    Cross-validation (CV) is a technique for evaluating the predictive ability of statistical models and learning systems on a given data set. Despite its wide applicability, its rather heavy computational cost can prevent its use as the system size grows. To resolve this difficulty in the case of Bayesian linear regression, we develop a formula for evaluating the leave-one-out CV error approximately, without actually performing CV. The usefulness of the developed formula is tested by a statistical mechanical analysis of a synthetic model and is confirmed by application to a real-world supernova data set as well.
    Comment: 5 pages, 2 figures; invited paper for the Allerton 2016 conference.
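
    The paper's approximate formula itself is not reproduced here; as a flavor of the idea, the sketch below shows the classical closed-form leave-one-out shortcut for ridge regression (the MAP estimate of Bayesian linear regression with a Gaussian prior), verified against brute-force refitting. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, lam = 50, 5, 1.0  # samples, features, prior precision / ridge penalty
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.3 * rng.standard_normal(n)

H = X @ np.linalg.inv(X.T @ X + lam * np.eye(p)) @ X.T  # ridge "hat" matrix
loo_fast = (y - H @ y) / (1.0 - np.diag(H))             # closed-form LOO residuals

# Brute-force check: refit n times, each time leaving one point out.
loo_slow = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    beta = np.linalg.solve(X[mask].T @ X[mask] + lam * np.eye(p), X[mask].T @ y[mask])
    loo_slow[i] = y[i] - X[i] @ beta

print("max |fast - slow|:", np.abs(loo_fast - loo_slow).max())  # ~1e-13: identical
```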

    Oscillations, metastability and phase transitions in brain and models of cognition

    Neuroscience is practiced in many different forms and at many different organizational levels of the nervous system. Which of these levels and associated conceptual frameworks is most informative for elucidating the association of neural processes with processes of cognition is an empirical question, subject to pragmatic validation. In this essay, I select the framework of dynamic system theory. In recent years, several investigators have applied tools and concepts of this theory to the interpretation of observational data and to the design of neuronal models of cognitive functions. I first trace the essentials of the conceptual developments and hypotheses separately, to discern observational tests and criteria for the functional realism and conceptual plausibility of the alternatives they offer. I then show that the statistical mechanics of phase transitions in brain activity, and some of its models, provides a new and possibly revealing perspective on brain events in cognition.

    Validation of Worst-Case and Statistical Models for an Automotive EMC Expert System

    Previous papers have presented algorithms for an EMC expert system used to predict potential electromagnetic compatibility problems in a vehicle early in the design process. Here, the accuracy of the inductive and capacitive coupling algorithms is verified through representative measurements of crosstalk within an automobile. Worst-case estimates used by the algorithms are compared both to measured values and to values estimated using statistical methods. The worst-case algorithms performed well up to 10-20 MHz, but overestimated measured results by several dB in some cases and by up to 10-15 dB in others. An approximate statistical variation of the current expert-system algorithms also worked well and can help avoid overestimation of problems; however, worst-case estimates better ensure that problems will not be missed, especially in the absence of complete system information.
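
    The contrast between the two estimation styles can be sketched as follows, with a made-up coupling model standing in for the expert-system algorithms: the worst case assumes the most pessimistic geometry, while the statistical estimate samples the uncertain wire spacing and reports a high percentile.

```python
import numpy as np

rng = np.random.default_rng(4)

def crosstalk_db(separation_mm):
    """Toy coupling model: crosstalk falls off with wire separation (hypothetical)."""
    return -20.0 * np.log10(separation_mm) - 10.0

# Worst case: assume the closest geometrically possible wire spacing.
worst_case = crosstalk_db(1.0)  # 1 mm minimum separation (assumed)

# Statistical estimate: sample the uncertain spacing over its plausible range
# and report a high percentile instead of the absolute maximum.
samples = crosstalk_db(rng.uniform(1.0, 30.0, size=100_000))
p99 = np.percentile(samples, 99)

print(f"worst case: {worst_case:.1f} dB, 99th percentile: {p99:.1f} dB")
# The percentile avoids flagging rare geometries as problems, at the risk of
# missing a true worst case -- the tradeoff the measurements above quantify.
```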

    Urban Spatial Pattern as Self-Organizing System: An Empirical Evaluation of Firm Location Decisions in Cleveland–Akron PMSA, Ohio

    Economic models of urban spatial patterns have largely ignored complexity as an attribute of urban systems. Complexity theorists, on the other hand, have not made sufficiently serious and sustained attempts to verify empirically the relevance of complex systems models for urban spatial patterns. This research bridges the gap by simulating the evolution of an urban employment pattern as a self-organizing complex system and seeking its empirical validation. It estimates the model's parameters using firm data aggregated to the level of municipalities in the Cleveland-Akron Consolidated Metropolitan Statistical Area in Ohio. The interaction among four parameters, the forces of attraction and dispersion and their respective rates of dissipation with distance, is modeled as a two-dimensional complex system. The research compares the states of the modeled system with empirical data to present viable methods for verification, calibration and validation of such models.
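
    A toy sketch of the four-parameter dynamic described above: attraction and dispersion forces with separate distance-decay rates acting on a 2-D employment density grid. The parameter names, kernel shapes and update rule are assumptions for illustration, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 32
density = rng.random((n, n))  # initial employment density (hypothetical)

# Four parameters: force strengths and their distance-decay rates.
a, b = 2.0, 0.6         # attraction / dispersion strength
alpha, beta = 0.8, 0.3  # respective rates of dissipation with distance

# Pairwise distances between all grid cells.
ix = np.indices((n, n)).reshape(2, -1).T.astype(float)
d = np.linalg.norm(ix[:, None, :] - ix[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)  # no self-interaction

# Net force kernel: attraction dominates nearby, dispersion at long range.
K = a * np.exp(-alpha * d) - b * np.exp(-beta * d)

for _ in range(50):
    force = K @ density.ravel()  # net pull on each cell from all others
    density = density.ravel() * np.exp(0.05 * (force - force.mean()))
    density = (density / density.sum()).reshape(n, n)  # conserve total employment

print("largest single-cell employment share:", round(density.max(), 4))
```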