
    Monte Carlo domain decomposition for robust nuclear reactor analysis

    Monte Carlo (MC) neutral particle transport codes are considered the gold standard for nuclear simulations, but they cannot be robustly applied to high-fidelity nuclear reactor analysis without accommodating several terabytes of materials and tally data. While this is not a large amount of aggregate data for a typical high-performance computer, MC methods are only embarrassingly parallel when the key data structures are replicated for each processing element, an approach that is likely infeasible on future machines. The present work explores the use of spatial domain decomposition to make full-scale nuclear reactor simulations tractable with Monte Carlo methods, presenting a simple implementation in a production-scale code. Good performance is achieved for mesh tallies of up to 2.39 TB distributed across 512 compute nodes while running a full-core reactor benchmark on the Mira Blue Gene/Q supercomputer at Argonne National Laboratory. In addition, the effects of load imbalances are explored with an updated performance model that is empirically validated against observed timing results. Several load balancing techniques are also implemented to demonstrate that imbalances can be largely mitigated, including a new and efficient way to distribute extra compute resources across finer domain meshes.
    United States. Dept. of Energy. Center for Exascale Simulation of Advanced Reactor
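The spatial decomposition described above can be illustrated with a toy model: each domain owns a particle bank and its local tallies, and after every flight step the particles that crossed a boundary are handed to the bank of their new owner, which is the communication phase that replaces full data replication. A minimal single-process sketch, with an assumed 1D geometry and a random-walk stand-in for transport (not the production implementation):

```python
import random

NUM_DOMAINS = 4        # one spatial sub-domain per (simulated) compute node
DOMAIN_WIDTH = 1.0     # domain i owns positions in [i, i + 1)

def domain_of(x):
    """Index of the domain that owns position x."""
    return min(int(x // DOMAIN_WIDTH), NUM_DOMAINS - 1)

def run(n_particles, n_steps, seed=1):
    rng = random.Random(seed)
    # Each domain keeps its own particle bank and its own tally; in a real
    # domain-decomposed code these would live on separate nodes.
    banks = [[] for _ in range(NUM_DOMAINS)]
    tallies = [0] * NUM_DOMAINS
    for _ in range(n_particles):
        x = rng.uniform(0.0, NUM_DOMAINS * DOMAIN_WIDTH)
        banks[domain_of(x)].append(x)

    for _ in range(n_steps):
        outgoing = [[] for _ in range(NUM_DOMAINS)]
        for d in range(NUM_DOMAINS):
            for x in banks[d]:
                tallies[d] += 1                     # score the event locally
                x += rng.gauss(0.0, 0.3)            # toy random-walk "flight"
                x = min(max(x, 0.0), NUM_DOMAINS * DOMAIN_WIDTH - 1e-9)
                outgoing[domain_of(x)].append(x)    # route to the new owner
            banks[d] = []
        # Communication phase: every domain receives the particles it now owns.
        banks = outgoing
    return tallies

tallies = run(n_particles=200, n_steps=10)
```

Because every flight is scored exactly once on the domain where it starts, the tallies sum to particles times steps regardless of how imbalanced the per-domain banks become, which is exactly the load-imbalance effect the paper's performance model quantifies.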

    Data decomposition of Monte Carlo particle transport simulations via tally servers

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former simulate the movement of particles through the domain, while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
    United States. Dept. of Energy. Naval Reactors Division. Rickover Fellowship Program in Nuclear Engineering
    United States. Dept. of Energy. Office of Advanced Scientific Computing Research (Contract DE-AC02-06CH11357)
    United States. Dept. of Energy (Consortium for Advanced Simulation of Light Water Reactors, Contract DE-AC05-00OR22725)
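The tracker/server split can be mimicked in a few lines: tracking processors emit (bin, score) messages instead of writing tallies locally, and a server drains the message stream and accumulates. A minimal sketch with an in-process queue standing in for the network (an illustration of the idea, not the OpenMC implementation):

```python
import random
from collections import defaultdict
from queue import Queue

def tally_server_demo(n_trackers=3, n_events_each=100, n_bins=10, seed=7):
    rng = random.Random(seed)
    inbox = Queue()   # stand-in for the network link between trackers and server

    # "Tracking" phase: instead of updating a local tally array, each tracker
    # ships (tally_bin, score) messages to the server as particles collide.
    for _ in range(n_trackers):
        for _ in range(n_events_each):
            inbox.put((rng.randrange(n_bins), 1.0))
        inbox.put(None)   # end-of-stream marker for this tracker

    # "Server" phase: receive and accumulate until every tracker signs off.
    tallies = defaultdict(float)
    finished = 0
    while finished < n_trackers:
        msg = inbox.get()
        if msg is None:
            finished += 1
        else:
            bin_id, score = msg
            tallies[bin_id] += score
    return dict(tallies)

totals = tally_server_demo()
```

The design point is that the full tally array exists only on the server, so the trackers' memory footprint is independent of tally resolution.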

    Monte Carlo and Depletion Reactor Analysis for High-Performance Computing Applications

    This dissertation discusses the research and development of a coupled neutron transport/isotopic depletion capability for use in high-performance computing applications. Accurate neutronics modeling and simulation for "real" reactor problems has been a long-sought-after goal in the computational community. A complementary "stretch" goal is the ability to perform full-core depletion analysis and spent fuel isotopic characterization. This dissertation thus presents the research and development of a coupled Monte Carlo transport/isotopic depletion implementation within the Exnihilo framework, geared toward high-performance computing architectures, to enable neutronics analysis for full-core reactor problems. An in-depth case study of the current state of Monte Carlo neutron transport with respect to source sampling, source convergence, uncertainty underprediction, and biases associated with localized tallies in Monte Carlo eigenvalue calculations was performed using MCNP and KENO. This analysis is utilized in the design and development of the statistical algorithms for Exnihilo's Monte Carlo framework, Shift. To this end, a methodology has been developed to perform tally statistics in domain-decomposed environments. This methodology has been shown to produce accurate tally uncertainty estimates in domain-decomposed environments without a significant increase in memory requirements, processor-to-processor communication, or computational biases. With the addition of parallel, domain-decomposed tally uncertainty estimation, a depletion package was developed for the Exnihilo code suite to utilize the depletion capabilities of the Oak Ridge Isotope GENeration (ORIGEN) code. This interface was designed to be transport-agnostic, meaning that it can be used by any of the reactor analysis packages within Exnihilo, such as Denovo or Shift.
Extensive validation and testing of the ORIGEN interface and its coupling with the Shift Monte Carlo transport code is performed within this dissertation, and results are presented for the calculated eigenvalues, material powers, and nuclide concentrations of the depleted materials. These results are compared to ORIGEN and TRITON depletion calculations, and the analysis shows that the Exnihilo transport-depletion capability is in good agreement with these codes.
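For reference, the baseline that any domain-decomposed uncertainty scheme must reproduce is the standard batch-means tally estimator, which accumulates only running sums so that no per-history scores need to be stored. A generic sketch of that baseline (not Shift's specific algorithm):

```python
import math
import random

def batch_statistics(batch_means):
    """Sample mean and standard error of a tally from per-batch means,
    using only running sums (count, sum, sum of squares)."""
    n = s = s2 = 0.0
    for x in batch_means:
        n += 1.0
        s += x
        s2 += x * x
    mean = s / n
    var = (s2 / n - mean * mean) * n / (n - 1.0)   # unbiased sample variance
    return mean, math.sqrt(var / n)

# Fifty batches of 1000 uniform(0,1) "scores" each; the true mean is 0.5.
rng = random.Random(3)
batches = [sum(rng.random() for _ in range(1000)) / 1000 for _ in range(50)]
mean, stderr = batch_statistics(batches)
```

The subtlety the dissertation addresses is that when a single history scores on several domains, these batch sums are split across nodes and must be combined without biasing the variance estimate.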

    Global sensitivity analysis of computer models with functional inputs

    Global sensitivity analysis is used to quantify the influence of uncertain input parameters on the response variability of a numerical model. The common quantitative methods are applicable to computer codes with scalar input variables. This paper aims to illustrate different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some input variables are functional, such as stochastic processes or random spatial fields. In this work, we focus on CPU-intensive computer codes, which need a preliminary metamodeling step before the sensitivity analysis can be performed. We propose the use of a joint modeling approach, i.e., simultaneously modeling the mean and the dispersion of the code outputs using two interlinked Generalized Linear Models (GLMs) or Generalized Additive Models (GAMs). The "mean" model allows estimation of the sensitivity indices of each scalar input variable, while the "dispersion" model allows derivation of the total sensitivity index of the functional input variables. The proposed approach is compared to classical sensitivity analysis methodologies on an analytical function. Lastly, the proposed methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
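The variance-based indices underlying this approach can be estimated with the classical pick-freeze (Saltelli-type) scheme. A small sketch on an analytical additive function, independent of the paper's GLM/GAM metamodeling step:

```python
import random

def sobol_first_order(f, dim, n=20000, seed=0):
    """First-order Sobol indices via the pick-freeze estimator:
    S_i = Cov(f(A), f(B_i)) / Var(f(A)), where row j of B_i copies
    column i from sample A and all other columns from sample B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        Bi = [b[:i] + [a[i]] + b[i + 1:] for a, b in zip(A, B)]
        yBi = [f(x) for x in Bi]
        cov = sum(ya * yb for ya, yb in zip(yA, yBi)) / n - mean * (sum(yBi) / n)
        indices.append(cov / var)
    return indices

# Additive linear test function: x0 carries 4x the output variance of x1,
# so the true indices are S0 = 0.8 and S1 = 0.2.
S = sobol_first_order(lambda x: 2.0 * x[0] + x[1], dim=2)
```

For an expensive simulator, `f` would be replaced by the fitted "mean" metamodel, which is precisely why the paper introduces the joint modeling step.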

    Verification of the Parallel Transport Codes Parafish and AZTRAN with the TAKEDA Benchmarks

    With the increase in computational resources, parallel computation in neutron transport codes has become essential, since it allows simulations with high spatial-angular resolution. Among the different methodologies available for the solution of the neutron transport equation, the spherical harmonics (P_N) and discrete-ordinates (S_N) approximations have been widely used, as they are established classical methods for performing nuclear reactor calculations. This work focuses on describing and verifying two parallel deterministic neutron transport codes under development. The first is the Parafish code, based on the finite-element method and the P_N approximation. The second is the AZTRAN code, based on the RTN-0 nodal method and the S_N approximation. The capabilities of these two codes have been tested on the TAKEDA benchmarks, and the results obtained show good behavior and accuracy compared to the Monte Carlo reference solutions. Additionally, the speedup obtained by each code in parallel execution is acceptable. In general, the results encourage further improvement of the codes to make them comparable to other well-validated deterministic transport codes.
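As a toy illustration of the discrete-ordinates idea, a one-group S2 sweep over a homogeneous slab can be written in a few lines. This uses deliberately simple diamond differencing and source iteration, not the RTN-0 nodal scheme used by AZTRAN:

```python
def sn_slab(n_cells=50, width=10.0, sigma_t=1.0, sigma_s=0.5, q=1.0, tol=1e-8):
    """One-group S2 transport in a homogeneous slab with a uniform isotropic
    source, vacuum boundaries, diamond differencing, and source iteration."""
    mu = 0.5773502691896257            # S2 Gauss point; directions are +/- mu
    dx = width / n_cells
    a = mu / dx
    phi = [0.0] * n_cells
    for _ in range(500):
        # Isotropic angular source, normalized so phi sums the two directions.
        src = [0.5 * (sigma_s * p + q) for p in phi]
        phi_new = [0.0] * n_cells
        for direction in (+1, -1):
            psi_in = 0.0               # vacuum boundary condition
            cells = range(n_cells) if direction > 0 else reversed(range(n_cells))
            for i in cells:
                # Diamond difference: mu*(out - in)/dx + sigma_t*(in + out)/2 = src
                psi_out = (src[i] + (a - 0.5 * sigma_t) * psi_in) / (a + 0.5 * sigma_t)
                phi_new[i] += 0.5 * (psi_in + psi_out)   # cell-average angular flux
                psi_in = psi_out
        if max(abs(x - y) for x, y in zip(phi, phi_new)) < tol:
            return phi_new
        phi = phi_new
    return phi

phi = sn_slab()
# Deep inside a thick slab the flux approaches the infinite-medium balance
# value q / (sigma_t - sigma_s), which is 2.0 for these parameters.
```

The sweep's strict upwind ordering is also what makes S_N parallelization nontrivial, which is the motivation for the parallel algorithms verified in the paper.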

    On the use of tally servers in Monte Carlo simulations of light-water reactors

    An algorithm for decomposing tally data in Monte Carlo simulations using servers has recently been proposed and analyzed. In the present work, we make a number of refinements to a theoretical performance model of the tally server algorithm to better predict the performance of a realistic reactor simulation using Monte Carlo. The impact of subdividing fuel into annular segments on the parameters of the performance model is evaluated and shown to result in a predicted overhead of less than 20% for a PWR benchmark on the Mira Blue Gene/Q supercomputer. Additionally, a parameter space study is performed comparing tally server implementations using blocking and non-blocking communication. Non-blocking communication is shown to reduce the communication overhead relative to blocking communication, in some cases resulting in negative overhead.
    United States. Dept. of Energy. Office of Advanced Scientific Computing Research (Contract DE-AC02-06CH11357)
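The qualitative effect of non-blocking communication can be captured by a simple latency-bandwidth model: a blocking send always adds the full message cost to the tracking time, while a non-blocking send exposes only whatever cannot be hidden behind the next stretch of computation. An illustrative sketch with assumed toy parameters, not the paper's calibrated model:

```python
def predicted_overhead(alpha, beta, n_bytes, t_compute, blocking=True):
    """Fractional tally-communication overhead under a latency-bandwidth model:
    one message costs alpha + beta * n_bytes. A blocking send adds the full
    message cost to the tracking time; a non-blocking send is overlapped with
    the next stretch of computation and only the un-hidden remainder counts."""
    t_msg = alpha + beta * n_bytes
    extra = t_msg if blocking else max(0.0, t_msg - t_compute)
    return extra / t_compute

# Toy numbers (assumed, not taken from the paper): 1 us latency, 1 ns/byte,
# 4 kB tally batches, 50 us of tracking work between consecutive sends.
b  = predicted_overhead(1e-6, 1e-9, 4096, 50e-6, blocking=True)
nb = predicted_overhead(1e-6, 1e-9, 4096, 50e-6, blocking=False)
```

In this toy model the non-blocking variant hides the message entirely; it cannot produce the negative overhead the paper observes, which arises from second-order effects outside a simple cost model.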

    A high-fidelity multiphysics system for neutronic, thermalhydraulic and fuel-performance analysis of Light Water Reactors

    The behavior of the core in a light-water reactor (LWR) is dominated by neutronic, thermal-hydraulic, and thermomechanical phenomena, which are connected by complex feedback mechanisms. One current trend in reactor physics is therefore the implementation of multiphysics methods that capture these interactions in order to provide a consistent description of the core. Another important line of work is the development of high-fidelity codes that increase the modeling resolution and eliminate the strong simplifications used in spatially homogenized simulations. Multiphysics and high-fidelity methods depend on the availability of high-performance computers, which limits the feasibility and scope of this kind of simulation. The goal of this work is the development of a multiphysics simulation system capable of performing coupled neutronic, thermal-hydraulic, and thermomechanical analyses of LWR cores with a high-fidelity methodology. To achieve this, the Monte Carlo particle transport method is used to simulate the neutronics without resorting to major physical approximations. For full-core burnup calculations, a domain-based data decomposition of the particle tracking is proposed and implemented. Combining the Monte Carlo method with subchannel-level thermal hydraulics and a complete fuel-performance analysis of all fuel rods yields an extremely detailed representation of the core, whose computational demands reach the limits of current high-performance computers. On the software side, an innovative object-oriented coupling approach is used to increase the modularity, flexibility, and maintainability of the code.
The accuracy of this coupled three-code system is assessed against experimental data from two operating plants, a pre-Konvoi PWR and the Temelín II VVER-1000 reactor. For both cases, the full-core burnup results are validated against measurements of the critical boron concentration and the fuel-rod neutron flux. These simulations demonstrate the state-of-the-art modeling capabilities of the developed tool and show the feasibility of this methodology for industrial applications.

    Multi-core performance studies of a Monte Carlo neutron transport code

    Performance results are presented for a multi-threaded version of the OpenMC Monte Carlo neutronics code using OpenMP in the context of nuclear reactor criticality calculations. Our main interest is production computing, and thus we limit our approach to threading strategies that both require reasonable levels of development effort and preserve the code features necessary for robust application to real-world reactor problems. Several approaches are developed and the results compared on several multi-core platforms using a popular reactor physics benchmark. A broad range of performance studies are distilled into a simple, consistent picture of the empirical performance characteristics of reactor Monte Carlo algorithms on current multi-core architectures.
    United States. Dept. of Energy. Office of Advanced Scientific Computing Research (Contract DE-AC02-06CH11357)