    Validation of population-based disease simulation models: a review of concepts and methods

    Background: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results: Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion: As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.
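    One of the validation activities named above is predictive validity: comparing model output against external data not used to build the model. The sketch below illustrates that idea in a minimal way; the prevalence figures, age bands, and error metric are assumptions for illustration, not taken from the review.

```python
# Minimal sketch of a predictive-validity check for a population-based disease
# simulation model: compare model-predicted prevalence against an external
# (hold-out) data source. All names and figures are illustrative only.

def mean_absolute_error(predicted, observed):
    """Average absolute gap between predicted and observed prevalence."""
    return sum(abs(p - o) for p, o in zip(predicted, observed)) / len(predicted)

# Hypothetical prevalence by age band (proportion of population with disease)
predicted_prevalence = [0.02, 0.05, 0.11, 0.19]   # model output
observed_prevalence  = [0.03, 0.06, 0.10, 0.21]   # external survey data

mae = mean_absolute_error(predicted_prevalence, observed_prevalence)
print(f"Mean absolute error in prevalence: {mae:.3f}")
```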

    SHADHO: Massively Scalable Hardware-Aware Distributed Hyperparameter Optimization

    Computer vision is experiencing an AI renaissance, in which machine learning models are expediting important breakthroughs in academic research and commercial applications. Effectively training these models, however, is not trivial due in part to hyperparameters: user-configured values that control a model's ability to learn from data. Existing hyperparameter optimization methods are highly parallel but make no effort to balance the search across heterogeneous hardware or to prioritize searching high-impact spaces. In this paper, we introduce a framework for massively Scalable Hardware-Aware Distributed Hyperparameter Optimization (SHADHO). Our framework calculates the relative complexity of each search space and monitors performance on the learning task over all trials. These metrics are then used as heuristics to assign hyperparameters to distributed workers based on their hardware. We first demonstrate that our framework achieves double the throughput of a standard distributed hyperparameter optimization framework by optimizing SVM for MNIST using 150 distributed workers. We then conduct model search with SHADHO over the course of one week using 74 GPUs across two compute clusters to optimize U-Net for a cell segmentation task, discovering 515 models that achieve a lower validation loss than standard U-Net. (Comment: 10 pages, 6 figures)
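    The core scheduling idea in the abstract is to weight the search by space complexity and worker capability. The sketch below is an illustrative rendering of that heuristic, not the SHADHO implementation: the classes, the complexity and throughput measures, and the greedy pairing rule are all assumptions.

```python
# Illustrative sketch (not the SHADHO implementation) of hardware-aware
# scheduling: rank hyperparameter search spaces by a complexity heuristic
# and assign the most complex spaces to the most capable workers.

from dataclasses import dataclass

@dataclass
class SearchSpace:
    name: str
    complexity: float   # e.g. a heuristic size of the hyperparameter space

@dataclass
class Worker:
    host: str
    throughput: float   # e.g. measured trials per hour on this hardware

def assign(spaces, workers):
    """Greedily pair the most complex spaces with the fastest workers."""
    spaces = sorted(spaces, key=lambda s: s.complexity, reverse=True)
    workers = sorted(workers, key=lambda w: w.throughput, reverse=True)
    return list(zip(spaces, workers))

spaces = [SearchSpace("svm_rbf", 12.0), SearchSpace("svm_linear", 3.0)]
workers = [Worker("gpu-node-1", 40.0), Worker("cpu-node-7", 8.0)]
for space, worker in assign(spaces, workers):
    print(f"{space.name} -> {worker.host}")
```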

    A validated computational framework to evaluate the stiffness of 3D printed ankle foot orthoses

    The purpose of this study was to create and validate a standardized framework for the evaluation of the ankle stiffness of two designs of 3D printed ankle foot orthoses (AFOs). The creation of four finite element (FE) models allowed patient-specific quantification of the stiffness and stress distribution over their specific range of motion during the second rocker of the gait. Validation was performed by comparing the model outputs with the results obtained from a dedicated experimental setup, which showed an overall good agreement with a maximum relative error of 10.38% in plantarflexion and 10.66% in dorsiflexion. The combination of advanced computer modelling algorithms and 3D printing techniques clearly shows potential to further improve the manufacturing process of AFOs.
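    The validation step reported above amounts to computing the relative error between FE-predicted and experimentally measured stiffness. The snippet below shows that calculation in a minimal form; the stiffness values and units are made up for illustration and are not the study's data.

```python
# Minimal sketch of the validation comparison: relative error between
# FE-predicted and experimentally measured ankle stiffness at a few joint
# angles. Values below are illustrative only.

def max_relative_error(predicted, measured):
    return max(abs(p - m) / abs(m) for p, m in zip(predicted, measured))

# Hypothetical stiffness values (Nm/deg) over part of the dorsiflexion range
fe_stiffness  = [1.92, 2.05, 2.21]
exp_stiffness = [1.80, 2.10, 2.00]

err = max_relative_error(fe_stiffness, exp_stiffness)
print(f"Maximum relative error: {err:.2%}")
```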

    Inverse approach to Einstein's equations for fluids with vanishing anisotropic stress tensor

    We expand previous work on an inverse approach to Einstein Field Equations where we include fluids with energy flux and consider the vanishing of the anisotropic stress tensor. We consider the approach using warped product spacetimes of class B_1. Although restricted, these spacetimes include many exact solutions of interest to compact object studies and to cosmological model studies. The question explored here is as follows: given a spacetime metric, what fluid flow (timelike congruence), if any, could generate the spacetime via Einstein's equations. We calculate the flow from the condition of a vanishing anisotropic stress tensor and give results in terms of the metric functions in the three canonical types of coordinates. A condition for perfect fluid sources is also provided. The framework developed is algorithmic and suited for the study and validation of exact solutions using computer algebra systems. The framework can be applied to solutions in comoving and non-comoving frames of reference, and examples in different types of coordinates are worked out. (Comment: 15 pages; matches version to appear in Phys. Rev.)
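    Since the abstract emphasizes that the framework is algorithmic and suited to computer algebra systems, the sketch below shows the flavour of such a calculation with SymPy: it builds the Einstein tensor for a static, spherically symmetric warped-product ansatz and prints the combination whose vanishing corresponds to isotropic pressure. The metric ansatz is a generic textbook illustration, not the specific class B_1 form or the paper's algorithm.

```python
# Computer-algebra sketch: Einstein tensor of a static, spherically symmetric
# metric ds^2 = -e^{2Phi} dt^2 + e^{2Lambda} dr^2 + r^2 dOmega^2, and the
# mixed components that must agree for a perfect fluid (no anisotropic stress).

import sympy as sp

t, r, th, ph = sp.symbols('t r theta phi')
x = [t, r, th, ph]
Phi = sp.Function('Phi')(r)
Lam = sp.Function('Lambda')(r)

g = sp.diag(-sp.exp(2*Phi), sp.exp(2*Lam), r**2, r**2*sp.sin(th)**2)
ginv = g.inv()
n = 4

# Christoffel symbols Gamma^a_{bc} = (1/2) g^{ad} (d_c g_{db} + d_b g_{dc} - d_d g_{bc})
Gamma = [[[sum(ginv[a, d]*(sp.diff(g[d, b], x[c]) + sp.diff(g[d, c], x[b])
                           - sp.diff(g[b, c], x[d])) for d in range(n))/2
           for c in range(n)] for b in range(n)] for a in range(n)]

# Ricci tensor R_{bc} = d_a Gamma^a_{bc} - d_b Gamma^a_{ac} + Gamma Gamma terms
def ricci(b, c):
    expr = sum(sp.diff(Gamma[a][b][c], x[a]) - sp.diff(Gamma[a][a][c], x[b])
               + sum(Gamma[a][a][d]*Gamma[d][b][c] - Gamma[a][b][d]*Gamma[d][a][c]
                     for d in range(n))
               for a in range(n))
    return sp.simplify(expr)

Ric = sp.Matrix(n, n, lambda b, c: ricci(b, c))
Rscalar = sp.simplify(sum(ginv[a, b]*Ric[a, b] for a in range(n) for b in range(n)))
G_mixed = sp.simplify(ginv*Ric - Rscalar/2*sp.eye(n))   # G^a_b

# Isotropic pressure (perfect fluid) requires G^r_r == G^theta_theta here;
# the printed expression is the constraint on Phi(r) and Lambda(r).
print(sp.simplify(G_mixed[1, 1] - G_mixed[2, 2]))
```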

    Development, analysis, and implications of open-source simulations of remotely piloted aircraft

    In recent years, the use of Remotely Piloted Aircraft (RPAs) for diverse purposes has increased exponentially. As a consequence, the uncertainty created by situations turning into a threat for civilians has led to more restrictive regulations from national administrations such as Transport Canada. Their purpose is to safely integrate RPAs into the airspace currently used by piloted aviation by evaluating Sense and Avoid (SAA) strategies and close encounters. The difficulty lies in having to rely on simulated environments because of the risk to the human pilot in the piloted aircraft. In the first part of this research, the technical difficulties associated with the development and study of RPA computer models are discussed. It explores the rationale behind using Open-Source Software (OSS) platforms for simulating RPAs, as well as the challenges of working with OSS at the graduate-student level. A set of recommendations is proposed to improve the graduate student experience with OSS. In the second part, particular challenges related to the design of OSS computer models are addressed. Based on (1) the differences and similarities between piloted and RPA flight simulators and (2) existing Verification and Validation (V&V) approaches, a validation method is presented for developing fixed-wing RPA models in OSS environments. This method is used to design two flight dynamics models with SAA applications. The first computer model is presented in tutorial format as a case study for the validation procedure, whereas the second computer model is specific to testing SAA strategies. In the last part, one of the designed RPAs is integrated into a computer environment with a representative general aviation aircraft. From the simulated encounters, a diving avoidance manoeuvre for the RPA is developed, and its consequences for the airspace are analyzed. The implications of this research are seen from three perspectives: (1) the OSS challenges in graduate school are widespread across disciplines, (2) the proposed validation procedure is adaptable to fit any computer model and simulation scenario, and (3) the simulated OSS framework with an RPA computer model has served to test preliminary SAA methods in close encounters with manned aircraft.
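    The simulated close encounters mentioned above require some rule for deciding when the RPA should begin its avoidance manoeuvre. The sketch below is only a generic illustration of such a trigger, a closest-point-of-approach test; it is not the thesis's SAA logic, and the separation and time thresholds are invented.

```python
# Illustrative closest-point-of-approach (CPA) test of the kind a simulated
# Sense and Avoid encounter might use to decide when to start a dive.
# All threshold values and the relative-state numbers are assumptions.

import math

def time_and_distance_at_cpa(rel_pos, rel_vel):
    """Relative position (m) and velocity (m/s) of the intruder w.r.t. the RPA."""
    v2 = sum(v*v for v in rel_vel)
    if v2 == 0.0:
        return 0.0, math.sqrt(sum(p*p for p in rel_pos))
    t_cpa = max(-sum(p*v for p, v in zip(rel_pos, rel_vel)) / v2, 0.0)
    d_cpa = math.sqrt(sum((p + v*t_cpa)**2 for p, v in zip(rel_pos, rel_vel)))
    return t_cpa, d_cpa

t_cpa, d_cpa = time_and_distance_at_cpa((1500.0, -200.0, 30.0), (-60.0, 8.0, -1.0))
if d_cpa < 150.0 and t_cpa < 30.0:   # hypothetical separation/time thresholds
    print(f"Initiate dive: CPA {d_cpa:.0f} m in {t_cpa:.1f} s")
else:
    print(f"No action: CPA {d_cpa:.0f} m in {t_cpa:.1f} s")
```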

    The Validation of an Infrared Simulation System

    A commonly-used term in the simulation domain is ‘validation, verification and accreditation’ (VVA). When analysing simulation predictions for the purpose of system solution development and decision-making, one key question persists: “What confidence can I have in the simulation and its results?” Knowing the validation status of a simulation system is critical to expressing confidence in the simulation. A practical validation procedure must be simple and carried out in the regular course of work. A well-known and acknowledged validation model by Schlesinger depicts the interaction between three entities: Reality, Conceptual Model and Computer Model, and three processes: Analysis & Modelling, Programming and Verification, and Evaluation and Validation. We developed a systematic procedure in which each of these six elements is evaluated, investigated and then quantified in terms of a set of criteria (or model properties). Many techniques exist to perform the validation procedure, including comparison with other models, face validity, extreme-condition testing, historical data validation and predictive validation, to mention a few. The result is a two-dimensional matrix representing the confidence in validation of each of the criteria (model properties) along each of the verification and validation elements. Depending on the nature of the element, the quantification of each cell in this matrix is done numerically or heuristically. Most often, the literature on validation of simulation systems only provides guidance by means of a theoretical validation framework. This paper briefly describes the procedure used to validate software models in an infrared system simulation and provides application examples of this process. The discussion includes practical validation techniques, quantification, visualisation, summary reports, and lessons learned during the course of a validation process. The framework presented in this paper is sufficiently general that the concepts could be applied to other simulation environments as well.
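    The two-dimensional confidence matrix described above (criteria along one axis, the six Schlesinger-style V&V elements along the other) can be pictured with a small data structure, as sketched below. The criteria names, scores, and aggregation rule are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of the criteria-by-elements confidence matrix: each cell holds
# a confidence score in [0, 1], filled either numerically or heuristically.

elements = ["Reality", "Conceptual Model", "Computer Model",
            "Analysis & Modelling", "Programming & Verification",
            "Evaluation & Validation"]

# confidence[criterion] -> one score per element (illustrative values)
confidence = {
    "Radiometric accuracy":     [0.9, 0.8, 0.7, 0.8, 0.9, 0.6],
    "Atmospheric transmission": [0.8, 0.9, 0.8, 0.7, 0.8, 0.7],
    "Sensor noise model":       [0.7, 0.6, 0.8, 0.6, 0.7, 0.5],
}

for criterion, scores in confidence.items():
    worst = min(scores)
    weakest = elements[scores.index(worst)]
    overall = sum(scores) / len(scores)
    print(f"{criterion}: overall {overall:.2f}, weakest element: {weakest} ({worst:.1f})")
```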

    Network Simulation Cradle

    This thesis proposes the use of real world network stacks instead of protocol abstractions in a network simulator, bringing the actual code used in computer systems inside the simulator and allowing for greater simulation accuracy. Specifically, a framework called the Network Simulation Cradle is created that supports the kernel source code from FreeBSD, OpenBSD and Linux, making the network stacks from these systems available to the popular network simulator ns-2. Simulating with these real world network stacks reveals situations where the results differ significantly from ns-2's TCP models. The simulated network stacks can be directly compared to the same operating system running on an actual machine, making validation simple. When measuring the packet traces produced on a test network and in simulation, the results are nearly identical, a level of accuracy previously unavailable with traditional TCP simulation models. The results of simulations comparing ns-2 TCP models and our framework are presented in this dissertation, along with validation studies showing how closely simulation resembles real world computers. Using real world stacks to simulate TCP is a complementary approach to using the existing TCP models and provides an extra level of validation. This way of simulating TCP and other protocols offers the network researcher or engineer new possibilities. One example is using the framework as a protocol development environment, which allows user-level development of protocols with a standard set of reproducible tests, the ability to test scenarios that are costly or impossible to build physically, and the ability to trace and debug protocol code without affecting results.
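    The validation claim above rests on comparing packet traces captured on a test network with those produced in simulation. The sketch below shows one simple way such a comparison could be made; it is not part of the Network Simulation Cradle, and the (timestamp, sequence-number) samples are made up.

```python
# Illustrative trace comparison: align TCP sequence numbers observed on the
# test network and in simulation, and report the largest timestamp gap.

def max_time_offset(trace_a, trace_b):
    """Largest timestamp gap (seconds) between matching sequence numbers."""
    by_seq = {seq: ts for ts, seq in trace_b}
    gaps = [abs(ts - by_seq[seq]) for ts, seq in trace_a if seq in by_seq]
    return max(gaps) if gaps else None

# (timestamp_seconds, tcp_sequence_number) pairs -- invented sample data
testbed_trace   = [(0.010, 1), (0.012, 1449), (0.015, 2897), (0.019, 4345)]
simulated_trace = [(0.010, 1), (0.013, 1449), (0.015, 2897), (0.018, 4345)]

offset = max_time_offset(testbed_trace, simulated_trace)
print(f"Maximum timestamp offset between traces: {offset:.3f} s")
```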

    OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia

    Background: The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data and its automatic processing. Results: The following related ontologies have been developed for OpenTox: a) Toxicological ontology, listing the toxicological endpoints; b) Organs system and Effects ontology, addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology, representing a semi-automatic conversion of the ToxML schema; d) OpenTox ontology, a representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink–ToxCast assays ontology; and f) OpenToxipedia, a community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, and the seamless integration of new algorithms and scientifically sound validation routines, and they provide a flexible framework which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists). Availability: The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl; the ToxML-OWL conversion utility is an open-source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
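    The REST access pattern described above means each resource URI can be dereferenced with content negotiation to obtain its RDF representation. The sketch below only illustrates that pattern: the example URI is hypothetical, and the services may no longer be online.

```python
# Minimal sketch of retrieving an RDF representation of a REST resource via
# content negotiation. The URI below is a placeholder, not a real OpenTox one.

import urllib.request

resource_uri = "http://example.org/opentox/compound/1"   # hypothetical URI
req = urllib.request.Request(resource_uri,
                             headers={"Accept": "application/rdf+xml"})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        rdf_xml = resp.read().decode("utf-8")
        print(rdf_xml[:500])   # show the start of the RDF/XML document
except OSError as err:
    print(f"Request failed: {err}")
```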