    Diffuse liver disease is a growing problem and a major cause of death worldwide. In the final stages, treatment often involves liver resection or transplantation, and in deciding which course of action to take it is crucial to have a correct assessment of liver function. The current “gold standard” for this assessment is a liver biopsy, which has a number of disadvantages. As an alternative, a method combining magnetic resonance imaging with mechanistic modeling of the liver has been developed at Linköping University. One of the obstacles this method must overcome to reach clinical implementation is the speed of its parameter estimation. In this project, the methodology of metamodeling is tested as a possible solution to this speed problem. Metamodeling involves making models of models using extensive model simulations and mathematical tools. Using regression methods, clustering algorithms, and optimization, different methods for parameter estimation have been evaluated. The results show that several, but not all, of the parameters could be accurately estimated using metamodeling, and that metamodeling could be a highly useful tool when modeling biological systems. With further development, metamodeling could bring this non-invasive method for estimating liver function a major step closer to clinical application.
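
    A rough sketch of the approach follows. It is an illustration only, assuming a hypothetical simulate() function with two made-up parameters in place of the mechanistic liver model; the general pattern is to sample the model extensively and then learn the inverse map from outputs to parameters by regression.

    ```python
    # Hedged sketch of metamodel-based parameter estimation: not the
    # Linköping pipeline, just the general idea. `simulate` and its two
    # parameters are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def simulate(params):
        # Placeholder for the expensive mechanistic liver model (assumption).
        k_uptake, k_efflux = params
        t = np.linspace(0, 30, 16)                   # acquisition times (min)
        return k_uptake * t * np.exp(-k_efflux * t)  # toy enhancement curve

    # 1. Extensive model simulations: sample parameters, run the model.
    thetas = rng.uniform([0.1, 0.01], [2.0, 0.5], size=(5000, 2))
    signals = np.array([simulate(th) for th in thetas])

    # 2. Metamodel: regress parameters on simulated outputs (inverse map).
    meta = RandomForestRegressor(n_estimators=200, random_state=0)
    meta.fit(signals, thetas)

    # 3. Estimation becomes a single fast prediction instead of a slow
    #    iterative optimization against the full model.
    observed = simulate((0.8, 0.2)) + rng.normal(0, 0.01, 16)
    print(meta.predict(observed[None, :]))  # roughly [0.8, 0.2]
    ```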

    Sensitivity analysis of expensive black-box systems using metamodeling

    Simulations are becoming ever more common as a tool for designing complex products. Sensitivity analysis techniques can be applied to these simulations to gain insight or to reduce the complexity of the problem at hand. However, these simulators are often expensive to evaluate, and sensitivity analysis typically requires a large number of evaluations. Metamodeling has been successfully applied in the past to reduce the number of required evaluations for design tasks such as optimization and design space exploration. In this paper, we propose a novel sensitivity analysis algorithm for variance- and derivative-based indices using sequential sampling and metamodeling. Several stopping criteria are proposed and investigated to keep the total number of evaluations minimal. The results show that both variance- and derivative-based indices can be accurately computed with a minimal number of evaluations using fast metamodels and the FLOLA-Voronoi or density sequential sampling algorithms. Comment: proceedings of the Winter Simulation Conference 201
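
    The following sketch illustrates the core idea under simplifying assumptions: a Gaussian-process metamodel is trained on a modest number of evaluations of a stand-in black box (the Ishigami test function), after which variance-based first-order indices are computed cheaply on the surrogate with the Saltelli pick-freeze estimator. The paper's FLOLA-Voronoi sequential sampling is replaced here by plain random sampling.

    ```python
    # Variance-based sensitivity analysis on a metamodel instead of the
    # expensive simulator. The black box and sampling scheme are stand-ins.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(1)

    def expensive_black_box(x):
        # Ishigami test function, standing in for a costly simulator.
        return (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2
                + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

    # A few "expensive" evaluations are enough to train a cheap metamodel.
    X_train = rng.uniform(-np.pi, np.pi, size=(300, 3))
    gp = GaussianProcessRegressor(normalize_y=True)
    gp.fit(X_train, expensive_black_box(X_train))

    # First-order Sobol indices via pick-freeze, using only the surrogate.
    N = 20000
    A = rng.uniform(-np.pi, np.pi, size=(N, 3))
    B = rng.uniform(-np.pi, np.pi, size=(N, 3))
    yA, yB = gp.predict(A), gp.predict(B)
    for i in range(3):
        AB = B.copy()
        AB[:, i] = A[:, i]  # freeze input i, resample the others
        S_i = np.mean(yA * (gp.predict(AB) - yB)) / yA.var()
        print(f"S_{i + 1} ~ {S_i:.2f}")  # approximate Sobol indices
    ```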

    Mechanical MNIST: A benchmark dataset for mechanical metamodels

    Metamodels, or models of models, map defined model inputs to defined model outputs. Typically, metamodels are constructed by generating a dataset through sampling a direct model and training a machine learning algorithm to predict a limited number of model outputs from varying model inputs. When metamodels are constructed to be computationally cheap, they are an invaluable tool for applications ranging from topology optimization, to uncertainty quantification, to multi-scale simulation. By nature, a given metamodel will be tailored to a specific dataset. However, the most pragmatic metamodel type and structure will often be general to larger classes of problems. At present, the most pragmatic metamodel selection for dealing with mechanical data has not been thoroughly explored. Drawing inspiration from the benchmark datasets available to the computer vision research community, we introduce a benchmark dataset (Mechanical MNIST) for constructing metamodels of a heterogeneous material undergoing large deformation. We then show examples of how our benchmark dataset can be used and establish baseline metamodel performance. Because our dataset is readily available, it will enable the direct quantitative comparison between different metamodeling approaches in a pragmatic manner. We anticipate that it will enable the broader community of researchers to develop improved metamodeling techniques for mechanical data that surpass the baseline performance we show here.
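
    As a hedged illustration of what a baseline metamodel for this kind of data might look like, the sketch below regresses a flattened material-property bitmap onto a scalar response with a small neural network. The data here are synthetic stand-ins, not Mechanical MNIST itself, whose inputs and outputs come from finite-element simulations.

    ```python
    # Baseline metamodel sketch: 28x28 per-pixel "material" fields mapped to
    # a scalar output. Synthetic data; shapes chosen to echo MNIST.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)

    # Fake inputs: per-pixel stiffness values, flattened to vectors.
    X = rng.uniform(1.0, 100.0, size=(2000, 28 * 28))
    # Fake scalar response: a smooth nonlinear functional of the field,
    # standing in for e.g. total strain energy (assumption).
    y = np.log(X).mean(axis=1) + 0.1 * np.tanh(X[:, :10].sum(axis=1) / 500.0)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    meta = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000,
                        random_state=0).fit(X_tr, y_tr)
    print("held-out R^2:", meta.score(X_te, y_te))
    ```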

    Comparing Strategies to Prevent Stroke and Ischemic Heart Disease in the Tunisian Population: Markov Modeling Approach Using a Comprehensive Sensitivity Analysis Algorithm.

    Background. Mathematical models offer the potential to analyze and compare the effectiveness of very different interventions to prevent future cardiovascular disease. We developed a comprehensive Markov model to assess the impact of three interventions to reduce ischemic heart disease (IHD) and stroke deaths: (i) improved medical treatments in the acute phase, (ii) secondary prevention by increasing the uptake of statins, and (iii) primary prevention using health promotion to reduce dietary salt consumption. Methods. We developed and validated a Markov model for the Tunisian population aged 35–94 years over a 20-year time horizon. We compared the impact of specific treatments for stroke, lifestyle, and primary prevention on both IHD and stroke deaths. We then undertook extensive sensitivity analyses using both a probabilistic multivariate approach and simple linear regression (metamodeling). Results. The model forecast a dramatic rise in mortality, with 111,134 IHD and stroke deaths (95% CI 106,567 to 115,048) predicted in 2025 in Tunisia. Salt reduction offered potentially the most powerful preventive intervention, which might reduce IHD and stroke deaths by 27% (−30,240 [−30,580 to −29,900]), compared with 1% for medical strategies and 3% for secondary prevention. The metamodeling highlighted that the initial development of a minor stroke substantially increased the subsequent probability of a fatal stroke or IHD death. Conclusions. The primary prevention of cardiovascular disease via a reduction in dietary salt consumption appeared much more effective than secondary or tertiary prevention approaches. Our simple but comprehensive model offers a potentially attractive methodological approach that might now be extended and replicated in other contexts and populations.
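
    A toy sketch of the methodological core follows; the states and transition probabilities are illustrative assumptions, not the published Tunisian parameters. A small Markov cohort model is run under sampled inputs, and a linear-regression metamodel then summarizes each input's leverage on total deaths, mirroring the probabilistic sensitivity analysis plus metamodeling step.

    ```python
    # Markov cohort model with probabilistic sensitivity analysis and a
    # linear-regression metamodel. All numbers are illustrative.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)

    def run_markov(p_stroke, p_fatal, years=20, cohort=100_000):
        # States: 0 = well, 1 = post-stroke, 2 = dead (absorbing).
        T = np.array([
            [1 - p_stroke, p_stroke,    0.0    ],
            [0.0,          1 - p_fatal, p_fatal],
            [0.0,          0.0,         1.0    ],
        ])
        pop = np.array([float(cohort), 0.0, 0.0])
        for _ in range(years):
            pop = pop @ T  # advance the cohort by one annual cycle
        return pop[2]      # cumulative deaths over the horizon

    # Probabilistic sensitivity analysis: sample inputs, rerun the model.
    inputs = rng.uniform([0.005, 0.02], [0.02, 0.10], size=(1000, 2))
    deaths = np.array([run_markov(*row) for row in inputs])

    # Metamodel: a simple linear regression of outputs on sampled inputs
    # exposes which transition probabilities drive mortality.
    meta = LinearRegression().fit(inputs, deaths)
    print(dict(zip(["p_stroke", "p_fatal"], meta.coef_)))
    ```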

    A Language Description is More than a Metamodel

    Within the context of (software) language engineering, language descriptions are considered first-class citizens. One way to describe a language is by means of a metamodel, which represents the abstract syntax of the language. Unfortunately, in this process many language engineers forget that a language also needs a concrete syntax and a semantics. In this paper I argue that neither of these can be discarded from a language description. In a good language description, the abstract syntax is the central element, functioning as a pivot between concrete syntax and semantics. Furthermore, both the concrete syntax and the semantics should be described in a well-defined formalism.
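
    A toy illustration of the argument, in code: the abstract syntax acts as the pivot, with a concrete syntax (parser) on one side and a semantics (interpreter) on the other. The mini expression language is invented for this sketch and is not from the paper.

    ```python
    # Abstract syntax as the pivot between concrete syntax and semantics.
    from dataclasses import dataclass
    from typing import Union

    # --- Abstract syntax: the central element of the language description.
    @dataclass
    class Num:
        value: int

    @dataclass
    class Add:
        left: "Expr"
        right: "Expr"

    Expr = Union[Num, Add]

    # --- Concrete syntax: one textual notation, mapped onto the abstract syntax.
    def parse(text: str) -> Expr:
        parts = text.split("+")
        node: Expr = Num(int(parts[0]))
        for part in parts[1:]:
            node = Add(node, Num(int(part)))
        return node

    # --- Semantics: meaning attached to abstract-syntax nodes, not to notation.
    def evaluate(node: Expr) -> int:
        if isinstance(node, Num):
            return node.value
        return evaluate(node.left) + evaluate(node.right)

    print(evaluate(parse("1+2+3")))  # -> 6
    ```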

    Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations

    To perform uncertainty, sensitivity, or optimization analysis on scalar variables calculated by a computationally expensive computer code, a widely accepted methodology consists of first identifying the most influential uncertain inputs (by screening techniques) and then replacing the expensive model with an inexpensive mathematical function, called a metamodel. This paper extends this methodology to the functional-output case, for instance when the model output variables are curves. The screening approach is based on analysis of variance and principal component analysis of the output curves. The functional metamodeling consists of a curve classification step, a dimension reduction step, and then a classical metamodeling step. An industrial nuclear reactor application (dealing with uncertainties in the pressurized thermal shock analysis) illustrates all these steps.
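
    A compact sketch of the three-step workflow, with a toy curve generator standing in for the thermal-hydraulic code: output curves are clustered, reduced by principal component analysis, and the retained components are then metamodeled from the inputs.

    ```python
    # Functional-output metamodeling: classify curves, reduce dimension,
    # then fit classical metamodels on the retained components.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(4)
    t = np.linspace(0.0, 1.0, 100)

    # Toy stand-in for the code: inputs -> output curves (e.g. transients).
    X = rng.uniform(0.5, 2.0, size=(300, 2))
    curves = np.array([a * np.exp(-b * t) for a, b in X])

    # Step 1: curve classification (labels computed only; the full method
    # would build separate metamodels per cluster).
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(curves)

    # Step 2: dimension reduction of the curves.
    pca = PCA(n_components=3).fit(curves)
    scores = pca.transform(curves)

    # Step 3: classical metamodeling of each retained component.
    metamodels = [GaussianProcessRegressor().fit(X, scores[:, k]) for k in range(3)]

    # Predict a full curve for a new input by inverting the PCA map.
    x_new = np.array([[1.2, 0.8]])
    pred = np.column_stack([m.predict(x_new) for m in metamodels])
    curve_hat = pca.inverse_transform(pred)  # shape (1, 100)
    print(curve_hat[0, :5])
    ```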