
    Confidence intervals of prediction accuracy measures for multivariable prediction models based on the bootstrap-based optimism correction methods

    In assessing the prediction accuracy of multivariable prediction models, optimism corrections are essential for preventing biased results. However, in most published papers on clinical prediction models, the point estimates of the prediction accuracy measures are corrected by adequate bootstrap-based correction methods, but their confidence intervals are not; e.g., DeLong's confidence interval is usually used for assessing the C-statistic. These naive methods do not adjust for the optimism bias and do not account for statistical variability in the estimation of the prediction model's parameters. Therefore, their coverage probabilities of the true value of the prediction accuracy measure can fall seriously below the nominal level (e.g., 95%). In this article, we provide two generic bootstrap methods, namely (1) location-shifted bootstrap confidence intervals and (2) two-stage bootstrap confidence intervals, that can be generally applied to the bootstrap-based optimism correction methods, i.e., Harrell's bias correction, 0.632, and 0.632+ methods. In addition, they can be widely applied to various methods for prediction model development, including modern shrinkage methods such as ridge and lasso regression. In numerical evaluations by simulation, the proposed confidence intervals showed favourable coverage performance, whereas the current standard practices based on the optimism-uncorrected methods showed serious undercoverage. To avoid erroneous results, the optimism-uncorrected confidence intervals should not be used in practice, and the adjusted methods are recommended instead. We also developed the R package predboot for implementing these methods (https://github.com/nomahi/predboot). The effectiveness of the proposed methods is illustrated via applications to the GUSTO-I clinical trial.
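    As a concrete illustration of the optimism correction these intervals build on, the following is a minimal Python sketch of Harrell's bootstrap bias correction for the C-statistic (the point estimate only, not the proposed confidence intervals). The logistic model, scikit-learn calls, and array inputs are illustrative assumptions; the authors' own implementation is the R package predboot.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score


def harrell_corrected_auc(X, y, n_boot=200, seed=0):
    """Harrell's bootstrap optimism correction for the C-statistic (AUC).

    X: (n_samples, n_features) numpy array; y: binary labels (0/1).
    """
    rng = np.random.default_rng(seed)
    n = len(y)

    # Apparent performance: fit and evaluate on the full data set.
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

    optimism = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)            # bootstrap resample
        Xb, yb = X[idx], y[idx]
        if len(np.unique(yb)) < 2:                  # skip degenerate resamples
            continue
        mb = LogisticRegression(max_iter=1000).fit(Xb, yb)
        auc_boot = roc_auc_score(yb, mb.predict_proba(Xb)[:, 1])  # on resample
        auc_orig = roc_auc_score(y, mb.predict_proba(X)[:, 1])    # on original data
        optimism.append(auc_boot - auc_orig)

    # Optimism-corrected point estimate of the C-statistic.
    return apparent - np.mean(optimism)
```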

    Modelling and simulation framework for reactive transport of organic contaminants in bed-sediments using a pure Java object-oriented paradigm

    Numerical modelling and simulation of organic contaminant reactive transport in the environment is increasingly relied upon for a wide range of tasks associated with risk-based decision-making, such as prediction of contaminant profiles, optimisation of remediation methods, and monitoring of changes resulting from an implemented remediation scheme. The lack of integration of multiple mechanistic models into a single modelling framework, however, has prevented the field of reactive transport modelling in bed-sediments from developing a cohesive understanding of contaminant fate and behaviour in the aquatic sediment environment. This paper investigates the problems involved in the model integration process, discusses modelling and software development approaches, and presents preliminary results from use of CORETRANS, a predictive modelling framework that simulates 1-dimensional organic contaminant reaction and transport in bed-sediments.
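    For orientation only, the sketch below shows a generic 1-D diffusion model with first-order decay for a contaminant in a sediment column, advanced with an explicit finite-difference scheme. It is not the CORETRANS formulation (which is implemented in Java as an object-oriented framework); the governing terms kept here, the boundary conditions, and all parameter values are illustrative assumptions.

```python
import numpy as np


def simulate_column(n_cells=100, depth=0.5, D=1e-9, k=1e-7,
                    c_top=1.0, t_end=3.15e7, dt=1e4):
    """Explicit finite-difference sketch of 1-D diffusion + first-order decay.

    depth [m], D diffusion coefficient [m^2/s], k decay rate [1/s],
    c_top fixed concentration at the sediment-water interface, t_end [s].
    Values are illustrative; dt is chosen so D*dt/dz**2 <= 0.5 for stability.
    """
    dz = depth / n_cells
    c = np.zeros(n_cells)                  # initial concentration profile
    for _ in range(int(t_end / dt)):
        c_new = c.copy()
        # Interior cells: diffusion plus first-order decay.
        c_new[1:-1] += dt * (D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2
                             - k * c[1:-1])
        c_new[0] = c_top                   # fixed concentration at the interface
        c_new[-1] = c_new[-2]              # zero-gradient bottom boundary
        c = c_new
    return c
```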

    Enhancing Energy Production with Exascale HPC Methods

    High Performance Computing (HPC) resources have become the key enabler for tackling more ambitious challenges in many disciplines. In this step beyond, an explosion in the available parallelism and the use of special-purpose processors are crucial. With such a goal, the HPC4E project applies new exascale HPC techniques to energy industry simulations, customizing them if necessary, and going beyond the state of the art in the HPC exascale simulations required for different energy sources. In this paper, a general overview of these methods is presented as well as some specific preliminary results. The research leading to these results has received funding from the European Union's Horizon 2020 Programme (2014-2020) under the HPC4E Project (www.hpc4e.eu), grant agreement n° 689772, the Spanish Ministry of Economy and Competitiveness under the CODEC2 project (TIN2015-63562-R), and from the Brazilian Ministry of Science, Technology and Innovation through Rede Nacional de Pesquisa (RNP). Computer time on the Endeavour cluster was provided by the Intel Corporation, which enabled us to obtain the presented experimental results in uncertainty quantification in seismic imaging.

    Review of the Synergies Between Computational Modeling and Experimental Characterization of Materials Across Length Scales

    With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain greater insight into structure-property relationships and to study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion on the perspective from various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than to "simply" support experimental work. This is illustrated by examples from several application areas on structural materials. The manuscript ends with a discussion of some problems and open scientific questions that are being explored in order to advance this relatively new field of research. Comment: 25 pages, 11 figures, review article accepted for publication in J. Mater. Sci.

    Integrated system to perform surrogate-based aerodynamic optimisation for a high-lift airfoil

    This work deals with the aerodynamic optimisation of a generic two-dimensional three-element high-lift configuration. Although the high-lift system is deployed only during take-off and landing in the low-speed phase of flight, the cost efficiency of the airplane is strongly influenced by it [1]. The ultimate goal of an aircraft high-lift system design team is to define the simplest configuration which, for prescribed constraints, will meet the take-off, climb, and landing requirements, usually expressed in terms of maximum L/D and/or maximum CL. The ability of the calculation method to accurately predict changes in the objective function value when gaps, overlaps, and element deflections are varied is therefore critical. Despite advances in computer capacity, the enormous computational cost of running complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimisation. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in place of the actual simulation models. This work outlines the development of an integrated system to perform aerodynamic multi-objective optimisation for a three-element airfoil test case in high-lift configuration, making use of the surrogate models available in MACROS Generic Tools, which has been integrated into our design tool. Different metamodelling techniques have been compared based on multiple performance criteria. With MACROS it is possible to perform either optimisation of a model built from a predefined training sample (GSO) or iterative surrogate-based optimisation (SBO). In the first case the model is built independently of the optimisation and then used as a black box in the optimisation process. In the second case the CFD code must be callable from the optimisation process, and there is no need to build a model beforehand: it is built internally during the optimisation. Both approaches have been applied. A detailed analysis of the integrated design system, the methods as well as th
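    To make the iterative surrogate-based optimisation (SBO) loop concrete, the sketch below replaces an expensive simulation with a Gaussian-process metamodel that is refined each iteration. It is a generic illustration, not the MACROS Generic Tools workflow: the stand-in objective, the Matern kernel, and the purely exploitative point-selection rule are assumptions, and practical SBO typically uses an acquisition function such as expected improvement to balance exploration and exploitation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expensive_simulation(x):
    # Stand-in for a costly CFD evaluation of the objective (e.g. -CL).
    return np.sin(3 * x) + 0.5 * x**2


def sbo(n_init=5, n_iter=15, bounds=(-2.0, 2.0), seed=0):
    """Minimal iterative surrogate-based optimisation with a GP metamodel."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(*bounds, size=(n_init, 1))           # initial training sample
    y = np.array([expensive_simulation(x[0]) for x in X])

    for _ in range(n_iter):
        # Rebuild the surrogate from all evaluations collected so far.
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                      normalize_y=True).fit(X, y)
        # Pick the candidate the surrogate predicts as best (pure exploitation).
        cand = np.linspace(*bounds, 200).reshape(-1, 1)
        x_next = cand[np.argmin(gp.predict(cand))]
        # Evaluate the expensive model only at the selected point and update.
        y = np.append(y, expensive_simulation(x_next[0]))
        X = np.vstack([X, x_next])

    best = np.argmin(y)
    return X[best], y[best]
```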

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.