
    MAESTRO, CASTRO, and SEDONA -- Petascale Codes for Astrophysical Applications

    Performing high-resolution, high-fidelity, three-dimensional simulations of Type Ia supernovae (SNe Ia) requires not only algorithms that accurately represent the correct physics, but also codes that effectively harness the resources of the most powerful supercomputers. We are developing a suite of codes that provides the capability to perform end-to-end simulations of SNe Ia: from the early convective phase leading up to ignition, to the explosion phase in which deflagration/detonation waves explode the star, to the computation of the light curves resulting from the explosion. In this paper we discuss these codes with an emphasis on the techniques needed to scale them to petascale architectures. We also demonstrate our ability to map data from a low Mach number formulation to a compressible solver.
    Comment: submitted to the Proceedings of the SciDAC 2010 meeting
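The final step mentioned above, handing a low Mach number state to a compressible solver, amounts to rebuilding the conservative variables (density, momentum, total energy). The sketch below is an illustrative toy under an assumed ideal-gas equation of state; the function name, variable layout, and adiabatic index are assumptions, not the MAESTRO/CASTRO data structures:

```python
import numpy as np

GAMMA = 5.0 / 3.0  # assumed ideal-gas adiabatic index, for illustration only

def low_mach_to_compressible(rho, u, p0):
    """Map a low Mach number state (density, velocity, background
    pressure) onto compressible conservative variables.
    Hypothetical helper, not the actual MAESTRO/CASTRO interface."""
    momentum = rho * u
    # Internal energy from the assumed ideal-gas EOS plus kinetic
    # energy gives the total energy density.
    energy = p0 / (GAMMA - 1.0) + 0.5 * rho * u**2
    return np.stack([rho, momentum, energy])

rho = np.array([1.0, 1.2, 0.9])
u = np.array([0.1, -0.2, 0.05])
p0 = np.full(3, 2.5)
U = low_mach_to_compressible(rho, u, p0)
```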

    On the predictivity of pore-scale simulations: estimating uncertainties with multilevel Monte Carlo

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors, including sample heterogeneity, computational and imaging limitations, model inadequacy, and imperfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity, and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). The stochastic nature can arise from the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is performed by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A fully automatic workflow is developed in an open-source code [1] that includes rigid-body physics and random-packing algorithms, unstructured mesh discretization, finite volume solvers, extrapolation, and post-processing techniques.
The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, and robust estimation of Representative Elementary Volume size for arbitrary physics.
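The telescoping estimator at the heart of multilevel Monte Carlo can be sketched with a toy surrogate. Everything below is hypothetical: `paired_samples` stands in for a pore-scale effective-property computation at two successive refinement levels, and the bias/noise model is invented for illustration, not taken from the paper:

```python
import numpy as np

def paired_samples(level, n, rng):
    """Toy stand-in for an effective-property estimate (e.g. permeability)
    computed at refinement levels `level` and `level-1` from the SAME
    random sample, so the difference has small variance.
    Toy model: true value 1.0, bias 0.5**level, noise shrinking with level."""
    z = rng.normal(0.0, 1.0, size=n)              # shared randomness
    q_fine = 1.0 + 0.5 ** (level + 1) + 0.1 * 0.5 ** level * z
    q_coarse = 1.0 + 0.5 ** level + 0.1 * 0.5 ** (level - 1) * z
    return q_fine, q_coarse

def mlmc_estimate(max_level, n_per_level, seed=0):
    """Telescoping MLMC estimator:
    E[Q_L] = E[Q_0] + sum_{l=1..L} E[Q_l - Q_{l-1}].
    Most samples are drawn on the cheap coarse level; only a few
    expensive fine-level corrections are needed."""
    rng = np.random.default_rng(seed)
    z0 = rng.normal(0.0, 1.0, size=n_per_level[0])
    estimate = np.mean(1.0 + 0.5 + 0.1 * z0)      # coarsest-level mean
    for level in range(1, max_level + 1):
        q_fine, q_coarse = paired_samples(level, n_per_level[level], rng)
        estimate += np.mean(q_fine - q_coarse)
    return estimate

est = mlmc_estimate(3, [4000, 2000, 1000, 500])   # expect ~1 + 0.5**4
```

Sample counts decrease with level, mirroring the MLMC cost balance: cheap coarse samples control the statistical error, while a handful of fine corrections remove the discretization bias.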

    Towards a Mini-App for Smoothed Particle Hydrodynamics at Exascale

    The smoothed particle hydrodynamics (SPH) technique is a purely Lagrangian method, used in numerical simulations of fluids in astrophysics and computational fluid dynamics, among many other fields. SPH simulations with detailed physics represent computationally demanding calculations. The parallelization of SPH codes is not trivial due to the absence of a structured grid. Additionally, the performance of SPH codes can, in general, be adversely impacted by several factors, such as multiple time-stepping, long-range interactions, and/or boundary conditions. This work presents insights into the current performance and functionalities of three SPH codes: SPHYNX, ChaNGa, and SPH-flow. These codes are the starting point of an interdisciplinary co-design project, SPH-EXA, for the development of an Exascale-ready SPH mini-app. To gain such insights, a rotating square patch test was implemented as a common test simulation for the three SPH codes and analyzed on two modern HPC systems. Furthermore, to highlight what distinguishes the codes stemming from the astrophysics community (SPHYNX and ChaNGa), an additional test case, the Evrard collapse, has also been carried out. This work extrapolates the common basic SPH features in the three codes for the purpose of consolidating them into a pure-SPH, Exascale-ready, optimized mini-app. Moreover, the outcome serves as direct feedback to the parent codes, to improve their performance and overall scalability.
    Comment: 18 pages, 4 figures, 5 tables, 2018 IEEE International Conference on Cluster Computing proceedings for WRAp1
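The basic SPH operation all three codes share is the kernel-weighted density summation. The cubic spline kernel below is the standard M4 form; the brute-force O(N^2) neighbor loop is a deliberate simplification of what production codes do with tree- or grid-based neighbor search, which is precisely where the gridless parallelization difficulty arises:

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard M4 cubic spline SPH kernel with 3D normalization 1/(pi h^3)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    return np.where(q < 1.0,
                    sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3),
                    np.where(q < 2.0, 0.25 * sigma * (2.0 - q)**3, 0.0))

def sph_density(pos, mass, h):
    """Brute-force density summation rho_i = sum_j m_j W(|r_i - r_j|, h).
    Illustrative only: real codes replace this double loop with a tree
    or grid neighbor search to avoid the O(N^2) cost."""
    rho = np.empty(len(pos))
    for i, p in enumerate(pos):
        r = np.linalg.norm(pos - p, axis=1)
        rho[i] = np.sum(mass * cubic_spline_w(r, h))
    return rho
```

The kernel has compact support (zero beyond r = 2h), so each particle only interacts with a bounded neighborhood, which is what makes tree and grid acceleration effective.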

    A bibliography on parallel and vector numerical algorithms

    This is a bibliography of numerical methods. It also includes a number of other references on machine architecture, programming languages, and other topics of interest in scientific computing. Certain conference proceedings and anthologies which have been published in book form are also listed.

    H theorem for contact forces in granular materials

    A maximum entropy theorem is developed and tested for granular contact forces. Although it is idealized, describing two-dimensional packings of round, rigid, frictionless, cohesionless disks with coordination number Z=4, it appears to describe a central part of the physics present in the more general cases. The theorem does not make the strong claims of Edwards' hypothesis, nor does it rely upon Edwards' hypothesis at any point. Instead, it begins solely from the physical assumption that closed loops of grains are unable to impose strong force correlations around the loop. This statement is shown to be a generalization of Boltzmann's Assumption of Molecular Chaos (his Stosszahlansatz), allowing for the extra symmetries of granular stress propagation compared to the more limited symmetries of momentum propagation in a thermodynamic system. The theorem that follows from this is similar to Boltzmann's H theorem and is presented as an alternative to Edwards' hypothesis for explaining some granular phenomena. It identifies a very interesting feature of granular packings: if the generalized Stosszahlansatz is correct, then the bulk of homogeneous granular packings must satisfy a maximum entropy condition simply by virtue of being stable, without any exploration of phase space required. This leads to an independent derivation of the contact force statistics, and these predictions have been compared to numerical simulation data in the isotropic case. The good agreement implies that the generalized Stosszahlansatz is indeed accurate, at least for the isotropic state of the idealized case studied here, and that it is the reductionist explanation for contact force statistics in this case.
    Comment: 15 pages, 8 figures, to appear in Phys. Rev.
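A loose numerical illustration of the maximum-entropy idea (not the paper's specific derivation or functional form): among nonnegative distributions with a fixed mean, the exponential maximizes differential entropy, which can be checked by comparing a crude entropy estimate against a same-mean gamma distribution:

```python
import numpy as np

def entropy(samples, bins=200):
    """Crude differential-entropy estimate from a normalized histogram."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * dx

rng = np.random.default_rng(1)
mean_f = 1.0
exp_forces = rng.exponential(mean_f, 200_000)
# A gamma distribution with the same mean but a different shape:
gamma_forces = rng.gamma(4.0, mean_f / 4.0, 200_000)

h_exp = entropy(exp_forces)      # analytically 1 + ln(mean_f) = 1.0
h_gamma = entropy(gamma_forces)  # analytically ~0.64 for this shape
```

Under the fixed-mean constraint the exponential's entropy is strictly larger, which is the variational mechanism behind maximum-entropy force statistics, even though the paper's actual constraints (stress, loop structure) are richer than a single mean.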

    Advanced Bayesian framework for uncertainty estimation of sediment transport models

    2018 Summer. Includes bibliographical references. Numerical sediment transport models are widely used to forecast the potential changes in rivers that might result from natural and/or human influences. Unfortunately, predictions from those models always possess uncertainty, so engineers interpret the model results very conservatively, which can lead to expensive over-design of projects. The Bayesian inference paradigm provides a formal way to evaluate the uncertainty in model forecasts originating from uncertain model elements. However, existing Bayesian methods have rarely been used for sediment transport models because they often require long computational times. In addition, past research has not sufficiently addressed ways to treat the uncertainty associated with diverse sediment transport variables. To resolve those limitations, this study establishes a formal and efficient Bayesian framework to assess uncertainty in the predictions from sediment transport models. Throughout this dissertation, new methodologies are developed to represent each of three main uncertainty sources: poorly specified model parameter values, measurement errors contained in the model input data, and imperfect sediment transport equations used in the model structure. The new methods characterize how those uncertain elements affect the model predictions. First, a new algorithm is developed to estimate the parameter uncertainty and its contribution to prediction uncertainty using fewer model simulations. Second, the uncertainties of various input data are described using simple error equations and evaluated within the parameter estimation framework. Lastly, an existing method that can assess the uncertainty related to the selection and application of a transport equation is modified to enable consideration of multiple model output variables. The new methodologies are tested with a one-dimensional sediment transport model that simulates flume experiments and a natural river.
Overall, the results show that the new approaches can reduce the computational time by about 16% to 55% and produce more accurate estimates (e.g., prediction ranges can cover about 6% to 46% more of the available observations) compared to existing Bayesian methods. Thus, this research enhances the applicability of Bayesian inference for sediment transport modeling. In addition, this study provides several avenues to improve the reliability of the uncertainty estimates, which can help guide interpretation of model results and strategies to reduce prediction uncertainty.
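The parameter-uncertainty step can be sketched with a random-walk Metropolis sampler on a hypothetical power-law transport relation. The model form, priors, noise level, and synthetic data below are illustrative stand-ins, not the dissertation's actual equations or algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical transport relation q = a * tau**b, standing in for a real
# sediment transport equation; data are synthetic with known noise.
def model(theta, tau):
    a, b = theta
    return a * tau**b

tau = np.linspace(1.0, 5.0, 20)
true_theta = (0.5, 1.5)
obs = model(true_theta, tau) + rng.normal(0.0, 0.2, tau.size)

def log_post(theta):
    """Gaussian likelihood with flat bounded priors on (a, b)."""
    a, b = theta
    if a <= 0 or not (0.0 < b < 3.0):
        return -np.inf
    resid = obs - model(theta, tau)
    return -0.5 * np.sum(resid**2) / 0.2**2

def metropolis(n, theta0, step=0.05):
    """Random-walk Metropolis sampler over the parameter vector."""
    chain = [np.asarray(theta0, float)]
    lp = log_post(chain[0])
    for _ in range(n):
        prop = chain[-1] + rng.normal(0.0, step, 2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            chain.append(prop); lp = lp_prop
        else:
            chain.append(chain[-1].copy())
    return np.array(chain)

chain = metropolis(20_000, (1.0, 1.0))[5_000:]     # discard burn-in
a_lo, a_hi = np.percentile(chain[:, 0], [2.5, 97.5])
```

The spread of the retained chain quantifies parameter uncertainty, and propagating the chain through the model yields the prediction ranges the results above evaluate against observations; the dissertation's contribution is doing this with far fewer model simulations than such a plain sampler needs.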