
    A Review of Methods for the Analysis of the Expected Value of Information

    Over recent years, Value of Information analysis has become more widespread in health-economic evaluations, specifically as a tool to perform Probabilistic Sensitivity Analysis. This is largely due to methodological advances allowing fast computation of a typical summary known as the Expected Value of Partial Perfect Information (EVPPI). A recent review discussed some estimation methods for calculating the EVPPI, but, as research has remained active in the intervening years, it does not cover several key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present a case study to compare their estimation performance. We conclude that the most recent development, based on non-parametric regression, offers the best method for calculating the EVPPI efficiently. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with
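    For readers unfamiliar with the regression-based estimators this review compares, the sketch below illustrates the general idea under toy assumptions; the data, decision options, and the cubic-polynomial regression are placeholders, not the paper's method or case study. Simulated net benefits are regressed on the parameter of interest, and the EVPPI is the difference between the expected maximum of the fitted values and the maximum of their expectations.

```python
# Hypothetical sketch of regression-based EVPPI estimation: regress simulated
# net benefits on the parameter of interest, then compare the expected maximum
# of the fitted values with the maximum of their expectations.
import numpy as np

rng = np.random.default_rng(0)

# Toy PSA output (placeholders, not taken from the paper's case study)
n = 5000
phi = rng.normal(0.0, 1.0, n)                  # parameter of interest
psi = rng.normal(0.0, 1.0, n)                  # remaining (nuisance) parameters
nb = np.column_stack([                         # net benefit for two decision options
    1000 + 500 * phi + 300 * psi + rng.normal(0, 200, n),
    1200 + 200 * phi + 300 * psi + rng.normal(0, 200, n),
])

# Regression step: a cubic polynomial stands in for the flexible
# non-parametric regression (e.g. GAM or GP) used in the literature.
fitted = np.column_stack([
    np.polyval(np.polyfit(phi, nb[:, d], deg=3), phi) for d in range(nb.shape[1])
])

# EVPPI ~= E_phi[ max_d E[NB_d | phi] ] - max_d E[NB_d]
evppi = fitted.max(axis=1).mean() - fitted.mean(axis=0).max()
print(f"Estimated EVPPI: {evppi:.1f}")
```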

    Design, implementation, and testing of advanced virtual coordinate-measuring machines

    Copyright © 2011 IEEE. This article has been made available through the Brunel Open Access Publishing Fund. Advanced virtual coordinate-measuring machines (AVCMMs) have recently been developed at Brunel University; they provide vivid graphical representation and powerful simulation of coordinate-measuring machine (CMM) operations, together with Monte-Carlo-based uncertainty evaluation. In an integrated virtual environment, the user can plan an inspection strategy for a given task, carry out virtual measurements, and evaluate the uncertainty associated with the measurement results, all without the need to use a physical machine. The resulting uncertainty estimate can serve as rapid feedback for the user to optimize the inspection plan in the AVCMM before actual measurements are taken, or as an evaluation of measurements already performed. This paper details the methodology, design, and implementation of the AVCMM system, including CMM modeling, probe contact and collision detection, error modeling and simulation, and uncertainty evaluation. This paper further reports experimental results from testing of the AVCMM.
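    As a rough illustration of Monte-Carlo-based uncertainty evaluation of this kind (a generic, GUM-Supplement-1-style sketch, not the AVCMM implementation), one repeatedly perturbs the modelled error sources, re-evaluates the simulated measurand, and takes the spread of the results as the standard uncertainty:

```python
# Generic Monte Carlo uncertainty evaluation sketch (not the AVCMM code):
# perturb modelled error sources, re-evaluate the simulated measurand,
# and report the spread of the results as the standard uncertainty.
import numpy as np

rng = np.random.default_rng(42)

def simulated_measurement(true_diameter_mm, probe_error_sd, geometry_error_sd):
    """Toy measurand model: true value plus probing and geometric error terms."""
    probe_error = rng.normal(0.0, probe_error_sd)
    geometry_error = rng.normal(0.0, geometry_error_sd)
    return true_diameter_mm + probe_error + geometry_error

# Placeholder error-source magnitudes, chosen for illustration only.
samples = np.array([
    simulated_measurement(25.000, probe_error_sd=0.0008, geometry_error_sd=0.0015)
    for _ in range(10_000)
])

mean_value = samples.mean()
standard_uncertainty = samples.std(ddof=1)
print(f"measured diameter ~ {mean_value:.4f} mm, u ~ {standard_uncertainty:.4f} mm")
```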

    NNVA: Neural Network Assisted Visual Analysis of Yeast Cell Polarization Simulation

    Complex computational models are often designed to simulate real-world physical phenomena in many scientific disciplines. However, these simulation models tend to be computationally very expensive and involve a large number of simulation input parameters, which need to be analyzed and properly calibrated before the models can be applied to real scientific studies. We propose a visual analysis system to facilitate interactive exploratory analysis of the high-dimensional input parameter space of a complex yeast cell polarization simulation. The proposed system assists the computational biologists who designed the simulation model in visually calibrating the input parameters: they can modify parameter values and immediately visualize the predicted simulation outcome without needing to run the original expensive simulation for every instance. Our visual analysis system is driven by a trained neural-network-based surrogate model as the backend analysis framework. Surrogate models are widely used in the simulation sciences to efficiently analyze computationally expensive simulation models. In this work, we demonstrate the advantage of using neural networks as surrogate models for visual analysis by incorporating recent advances in uncertainty quantification, interpretability, and explainability of neural-network-based models. We use the trained network to perform interactive parameter sensitivity analysis of the original simulation at multiple levels of detail, and to recommend optimal parameter configurations using the activation maximization framework of neural networks. We also facilitate detailed analysis of the trained network to extract useful insights about the simulation model that the network learned during training. Comment: Published in IEEE Transactions on Visualization and Computer Graphics
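    The activation maximization step mentioned above can be sketched as gradient ascent on the surrogate's inputs. The example below is a minimal, hypothetical PyTorch version; the network shape, parameter bounds, and objective are assumptions, not the NNVA system. Starting from a random parameter vector, it adjusts the inputs to maximize a chosen surrogate output while keeping the parameters inside their valid range.

```python
# Minimal activation-maximization sketch over a surrogate's inputs
# (assumed architecture and bounds; not the NNVA implementation).
import torch
import torch.nn as nn

n_params, n_outputs = 35, 8          # placeholder dimensions

# Stand-in surrogate; in practice this would be the trained network.
surrogate = nn.Sequential(
    nn.Linear(n_params, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, n_outputs),
)
surrogate.eval()
for p in surrogate.parameters():
    p.requires_grad_(False)          # only the inputs are optimized

target_output = 0                                 # simulation quantity to maximize
x = torch.rand(1, n_params, requires_grad=True)   # parameters scaled to [0, 1]
optimizer = torch.optim.Adam([x], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    loss = -surrogate(x)[0, target_output]        # ascend on the chosen output
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        x.clamp_(0.0, 1.0)                        # keep parameters in valid range

print("recommended (normalized) parameters:", x.detach().numpy().round(3))
```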

    Rapid Computation of Thermodynamic Properties Over Multidimensional Nonbonded Parameter Spaces using Adaptive Multistate Reweighting

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost of estimating thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over a thousand CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, evaluated either from modified simulation code or as the difference in energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters, without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. Regions of poor configuration space overlap are detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap with the sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating, with high precision, the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form. We also compute entropies, enthalpies, and radial distribution functions of unsampled parameter combinations using only the data from the sampled states, and use the free energy estimates to examine the deviation of the simulations from the Born approximation to the solvation free energy.
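    The linear basis-function trick described above is what makes reweighting to unsampled parameter combinations cheap: if the potential is U(x; lambda) = sum_k lambda_k h_k(x), the per-snapshot basis energies h_k(x) are stored once, and the reduced potential at any new parameter set is just a dot product. The sketch below uses toy dimensions and placeholder data (not the paper's code) to build the u_kn-style matrix that an estimator such as MBAR would consume.

```python
# Sketch of the linear basis-function reweighting step (toy data, assumed
# shapes; not the paper's implementation). With U(x; lam) = sum_k lam_k*h_k(x),
# energies of every snapshot at *unsampled* parameter sets are dot products.
import numpy as np

rng = np.random.default_rng(1)

n_basis, n_snapshots = 4, 20_000
beta = 1.0 / (0.0083145 * 300.0)          # 1/(kB*T) in mol/kJ at 300 K

# Basis-function energies h_k(x_n), evaluated once per stored configuration.
h_kn = rng.normal(0.0, 1.0, size=(n_basis, n_snapshots))

# Parameter sets: rows are states (sampled + unsampled), columns are lambda_k.
lam_sampled = np.array([[1.0, 0.00, 0.5, 0.2],
                        [0.8, 0.10, 0.5, 0.2]])
lam_new = np.array([[0.9, 0.05, 0.5, 0.2]])     # an unsampled combination
lam_all = np.vstack([lam_sampled, lam_new])

# Reduced potential of every snapshot in every state: u[i, n] = beta * U_i(x_n).
u_in = beta * lam_all @ h_kn

print(u_in.shape)   # (n_states, n_snapshots): the matrix an MBAR-style
                    # estimator would take as input, with no new simulations.
```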

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review
