The European Journal of Physics N (EPJ-N)
    401 research outputs found

    Enhancing severe accident management through research

    After the Fukushima Daiichi accident, a new wave of research projects aiming at enhancing severe accident (SA) management was launched under different international frameworks. This was the case for MUSA (Management and Uncertainties of Severe Accidents), AMHYCO (Towards an enhanced Accident Management of the H2 and CO combustion risk) and SOCRATES (Assessment of liquid Source Term for accidental post management phase), which, under the H2020 and Horizon Europe EURATOM frameworks, were devised to optimize different aspects of SA management. MUSA explored how bringing uncertainty quantification into SA analysis might provide sounder insights into the effects and timing of accident management actions. AMHYCO brought new insights into combustion risk management, particularly during the ex-vessel phase of the accident, by combining in a selective manner different analytical approaches and data on the recombination and combustion of gas mixtures (i.e., H2/CO/air/steam). SOCRATES addresses accident management related to liquid source terms, with emphasis on the long-term phase of the accident. This paper describes the major outcomes of the projects and outlines what should come next for an efficient application of the insights gained to accident management.
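
    As a rough illustration of the kind of uncertainty-quantification workflow MUSA explored, the Python sketch below samples uncertain inputs and propagates them through a stand-in severe-accident model to obtain statistics on a figure of merit. The parameter names, ranges and model are hypothetical placeholders for a real SA code driven in batch mode; nothing here is taken from the project itself.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical uncertain inputs (names and ranges are illustrative only):
        # core relocation temperature [K] and containment spray flow rate [kg/s].
        N_SAMPLES = 500
        relocation_temp = rng.normal(2500.0, 100.0, N_SAMPLES)   # K
        spray_flow = rng.uniform(20.0, 60.0, N_SAMPLES)          # kg/s

        def toy_severe_accident_model(t_reloc, q_spray):
            """Placeholder for one run of a real severe-accident code.

            Returns a fictitious time window available for an accident
            management action, in hours.
            """
            return 4.0 + 0.002 * (t_reloc - 2400.0) + 0.05 * q_spray

        grace_time = np.array([toy_severe_accident_model(t, q)
                               for t, q in zip(relocation_temp, spray_flow)])

        # Statistics of the figure of merit across the sampled inputs indicate
        # how robust the timing of a management action is to the uncertainties.
        print(f"mean grace time : {grace_time.mean():.2f} h")
        print(f"5th-95th pctile : {np.percentile(grace_time, [5, 95])} h")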

    Status of GPU capabilities within the Shift Monte Carlo radiation transport code

    Shift is a general-purpose Monte Carlo (MC) radiation transport code for fission, fusion, and national security applications. Shift has been adapted to run efficiently on GPUs in order to leverage leadership-class supercomputers. This work presents Shift’s current GPU capabilities. These include core radiation transport capabilities for eigenvalue and fixed-source simulations, and support for non-uniform domain decomposition, Doppler broadening, free-gas elastic scattering, general-purpose geometry, hybrid MC/deterministic transport, and depletion. Transport results demonstrate a 2–5× GPU-to-CPU speedup on a per-node basis for an eigenvalue problem on the Frontier supercomputer and a 28× speedup for a fixed-source problem on the Summit supercomputer.
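
    To make the fixed-source transport terminology concrete, the toy Python sketch below runs analog particle histories through a 1D homogeneous slab and tallies the transmission probability. The cross sections and geometry are invented for illustration, and the sketch bears no relation to Shift's actual GPU implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical problem data (illustrative values, not from Shift):
        SIGMA_T = 0.5      # total macroscopic cross section [1/cm]
        P_ABSORB = 0.3     # probability that a collision is an absorption
        THICKNESS = 10.0   # slab thickness [cm]
        N_HIST = 100_000   # number of particle histories

        transmitted = 0
        for _ in range(N_HIST):
            x, mu = 0.0, 1.0                              # born on the left face, moving right
            while True:
                x += mu * rng.exponential(1.0 / SIGMA_T)  # sample distance to next collision
                if x >= THICKNESS:                        # leaked through the slab
                    transmitted += 1
                    break
                if x < 0.0 or rng.random() < P_ABSORB:    # leaked backwards or absorbed
                    break
                mu = rng.uniform(-1.0, 1.0)               # isotropic scattering in the cosine

        p = transmitted / N_HIST
        sigma = (p * (1.0 - p) / N_HIST) ** 0.5
        print(f"transmission probability = {p:.4f} +/- {sigma:.4f}")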

    Capability overview of the DIANE multiparticle transport code

    The DIANE code is simulation software that solves the transport equation for neutrons, photons, electrons and light ions using the Monte Carlo method. The DIANE code can perform various kinds of calculations, such as criticality or shielding simulations. This paper presents an overview of the DIANE code's capabilities, covering the description of input data, the transport simulation and some examples of applications.
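
    As a companion to the fixed-source sketch above, the toy Python example below illustrates the other calculation mode mentioned here: a criticality (k-effective) estimate obtained by iterating fission generations in a bare 1D slab. The one-group cross sections and dimensions are invented for illustration and have nothing to do with DIANE itself.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical one-group data for a bare slab (illustrative only):
        SIGMA_T, SIGMA_A, SIGMA_F, NU = 1.0, 0.4, 0.15, 2.5   # [1/cm] and [-]
        HALF_WIDTH = 5.0                                       # slab half-width [cm]
        N_PER_GEN, N_GEN, N_SKIP = 5_000, 60, 10

        source = rng.uniform(-HALF_WIDTH, HALF_WIDTH, N_PER_GEN)
        k_history = []

        for generation in range(N_GEN):
            fission_sites = []
            for x in source:
                while True:
                    mu = rng.uniform(-1.0, 1.0)
                    x = x + mu * rng.exponential(1.0 / SIGMA_T)
                    if abs(x) > HALF_WIDTH:                    # leakage out of the slab
                        break
                    if rng.random() < SIGMA_A / SIGMA_T:       # collision is an absorption
                        # expected fission-neutron yield per absorption
                        n_new = rng.poisson(NU * SIGMA_F / SIGMA_A)
                        fission_sites.extend([x] * n_new)
                        break
                    # otherwise the particle scatters and the flight continues
            k_history.append(len(fission_sites) / len(source))
            # resample the next fission source to keep the population constant
            source = rng.choice(fission_sites, size=N_PER_GEN, replace=True)

        k_eff = float(np.mean(k_history[N_SKIP:]))
        print(f"estimated k_eff = {k_eff:.3f}")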

    ARCHER: a Monte Carlo code for multi-particle radiotherapy through GPU-accelerated simulation and DL-based denoising

    The ARCHER project was initiated about 14 years ago to explore the use of emerging GPU technologies for fast Monte Carlo (MC) calculations. This paper presents the latest work to integrate a newly developed deep convolutional neural network (dCNN) based MC denoising method with a GPU-based MC multi-particle radiation transport simulation method, to demonstrate a real-time dose computing capability for clinically realistic radiotherapy examples. The computing process involves GPU-based dose calculation followed by dCNN-based denoising. The dCNN-based dose denoiser is designed and employed to reduce the statistical uncertainty in dose distributions in patient anatomy defined by 3D computed tomography (CT) images. The training data include a range of dose distributions covering low-count/high-noise (DoseLCHN) and high-count/low-noise (DoseHCLN) cases. The extremely large DoseLCHN and DoseHCLN datasets were generated with ARCHER. A DoseLCHN dataset is input into the trained model to output a predicted DoseHCLN dataset. For the evaluation, the DoseHCLN dataset produced by ARCHER is considered to be the ground truth. Experimental results show that the dose distributions generated by the newly proposed method agreed consistently with the DoseHCLN distributions produced by ARCHER. For hundreds of patient radiation treatment cases involving photons and protons, the average running time for one patient (GPU-based dose simulation followed by dCNN-based denoising) is about 200 ms. These preliminary results demonstrate the feasibility of real-time Monte Carlo dose computing using an integrated dCNN-based denoising and GPU-based dose calculation approach. On-going studies involving more radiation types and clinical procedures are expected to facilitate the use of real-time MC dose planning and verification in the clinical workflow.
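
    The pairing of noisy and clean dose volumes described here can be made concrete with a minimal sketch of a convolutional denoiser. The architecture, layer sizes and training loop below are purely illustrative (a generic residual 3D CNN in PyTorch) and are not the actual ARCHER dCNN or its training data.

        import torch
        import torch.nn as nn

        class DoseDenoiser(nn.Module):
            """Minimal 3D convolutional denoiser sketch (not the actual ARCHER dCNN).

            Maps a low-count/high-noise dose volume to a prediction of the
            high-count/low-noise dose volume; layer sizes are illustrative.
            """
            def __init__(self, channels: int = 16):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv3d(1, channels, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv3d(channels, channels, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv3d(channels, 1, kernel_size=3, padding=1),
                )

            def forward(self, x):
                # Residual formulation: the network predicts the noise to subtract.
                return x - self.net(x)

        # Illustrative training step on synthetic data (a real study would use
        # paired DoseLCHN/DoseHCLN volumes computed by the Monte Carlo engine).
        model = DoseDenoiser()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        dose_lchn = torch.rand(2, 1, 32, 32, 32)                     # noisy input batch
        dose_hcln = dose_lchn + 0.01 * torch.randn_like(dose_lchn)   # stand-in clean target

        optimizer.zero_grad()
        loss = loss_fn(model(dose_lchn), dose_hcln)
        loss.backward()
        optimizer.step()
        print(f"training loss: {loss.item():.5f}")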

    Reactor performance, system reliability: instrumentation and control

    The safe operation of nuclear power plants relies on Non-destructive Evaluation (NDE) of safety-critical components, both in the initial manufacturing phase and over the reactor’s lifecycle. The conventional approach consists of in-service inspections scheduled at regular intervals, with a periodicity adapted to expected or observed failure mechanisms and their kinetics. The three projects discussed in this paper address and challenge this approach in different ways. iWeld focuses on the inspection of welds and aims to take information about the microstructure of the material under inspection into account to improve the performance of ultrasound inspections. El-Peacetolero designed a hand-held, low-power embedded optoelectronic system for in-situ, real-time assessment of aging polymers. FIND aims to develop in-situ instrumentation adapted to the specific requirements of the nuclear power industry and introduces continuous monitoring of metallic pipes to prevent their failure and optimise maintenance. The three projects are at different stages: El-Peacetolero ends in February 2025, FIND has just kicked off, and iWeld is halfway in between. In this review, we give a high-level introduction to each project and discuss particular challenges and achievements.

    Link between material properties and integrity assessment of NPP components within EU funded projects APAL, INCEFA-SCALE and FRACTESUS

    Deep understanding of the aging of the most important nuclear power plant (NPP) components and their material degradation on the one hand, and the development of advanced methods for assessing those components' integrity and lifetime on the other, is the only way to ensure safe long-term operation (LTO) of NPPs. The most significant degradation mechanisms are fatigue and irradiation embrittlement. Within the Euratom research and training programme of HORIZON 2020, several projects have been running in recent years focused on research into these degradation mechanisms and on ways of assessing their impact. Three such projects are described in this paper. APAL (Advanced PTS Analysis for LTO) addresses challenges associated with the multidisciplinary character of pressurised thermal shock (PTS) analyses (both deterministic and probabilistic) and the quantification of safety margins. INCEFA-SCALE (INcreasing safety in NPPs by Covering gaps in Environmental Fatigue Assessment – focusing on gaps between laboratory data and component SCALE) aims to improve assessments of the fatigue lifetime of NPP components subjected to environmentally assisted fatigue (EAF) loading and to provide guidance on the transferability of laboratory-scale testing results to component scale. FRACTESUS (Fracture mechanics testing of irradiated RPV steels by means of sub-sized specimens) aims to determine the effect of specimen size on fracture toughness properties; large inter-laboratory testing is included to prove the repeatability and reproducibility of small-scale fracture toughness testing. Finite element models (FEM) are used to support the experimental results.

    European collaborations for safe and efficient dismantling: digital twins, ontology and data exchange

    Due to economic considerations and political decisions, an increasing number of nuclear facilities are to be dismantled in the coming decades. The large number of nuclear decommissioning projects must comply with reliability and safety requirements while making dismantling operations more efficient, safer and more cost-effective. This paper gives an overview of European coordinated efforts to develop and demonstrate the use of digital tools and methods for safe and efficient decommissioning activities through the projects PLEIADES (PLatform based on Emerging and Interoperable Applications for enhanced Decommissioning processES) and DORADO (Digital twins and Ontology for Robot Assisted Decommissioning Operations). Completed by the end of 2023, the PLEIADES project defined a common ontology specifically designed for nuclear decommissioning projects, developed a central server for combining data while ensuring compatibility, and provided the first pilot integration of digital decommissioning and waste management support tools. PLEIADES demonstrated the usefulness and efficiency of this concept using data from three real nuclear sites. Starting in the second half of 2024, the DORADO project will continue this work by creating a holistic digital data-driven platform as a BIM/DT (Building-Information-Model/Digital-Twin) and by integrating new digital tools into a coherent suite customized for decommissioning applications. Eight digital technologies will be integrated, including point-cloud data, 3D models and change detection, sensor data fusion, ALARA (As Low As Reasonably Achievable) dose estimation, robot mission optimization, and a smart voice assistant interface.

    Development of TopMC 1.0 for nuclear technology applications

    Particle transport plays an important role in nuclear technology applications. As a generalized methodology, the Monte Carlo method is widely employed for particle transport. We investigated several key difficulties in the field, specifically addressing aspects such as voxel modeling, coupled photon-electron transport, and advanced pulse-height tallying methodologies. We have developed some essential technologies to enhance the capabilities of the Multi-functional Program for Neutronics Calculation, Nuclear Design and Safety Evaluation (TopMC). This contribution presents the progress in TopMC’s R&D, including voxel model establishment based on medical image data and a fast particle tracking method, an electron transport mechanism grounded in the condensed history approach, and a variance reduction strategy to improve the efficiency of pulse-height tallies. Moreover, a series of applications in the nuclear technology field were used to validate and verify TopMC, demonstrating its accuracy and efficiency. TopMC can be applied to particle transport in Boron Neutron Capture Therapy (BNCT), nuclear logging, gamma radiation detection systems, electron accelerators, etc.
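
    The voxel-model capability mentioned here generally amounts to turning an imaging volume into per-voxel material and density assignments. The short Python sketch below illustrates that generic idea on a synthetic CT array, with made-up Hounsfield-unit thresholds and a made-up density calibration; it is not TopMC's actual scheme.

        import numpy as np

        # Synthetic CT volume in Hounsfield units (a real workflow would load a
        # DICOM series here); the shape and values are purely illustrative.
        hu = np.random.default_rng(7).integers(-1000, 1500, size=(64, 64, 32))

        # Illustrative HU-to-material segmentation thresholds (not TopMC's scheme).
        MATERIALS = {0: "air", 1: "lung", 2: "soft tissue", 3: "bone"}
        material_id = np.digitize(hu, bins=[-850, -200, 300])

        # Simple piecewise-linear HU-to-density calibration [g/cm^3] (illustrative).
        density = np.interp(hu, [-1000, 0, 1500], [0.001, 1.0, 1.9])

        # Each voxel now carries a material index and a mass density, which is the
        # minimal information a Monte Carlo code needs to build a voxel geometry.
        for mid, name in MATERIALS.items():
            frac = np.mean(material_id == mid)
            print(f"{name:12s}: {frac:5.1%} of voxels")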

    The PIANOFORTE partnership: Elevating European research for enhanced radiation protection

    The PIANOFORTE partnership (2022–2029) aims to enhance radiation protection for the public, patients, workers, and the environment across various exposure scenarios. This European initiative addresses key barriers in health and environmental risk research related to ionising radiation and promotes findings that support effective radiation protection policies. By building a comprehensive pan-European scientific and technological foundation, PIANOFORTE ensures that the radiation protection system remains fit-for-purpose and delivers science-based policy recommendations and improved practices across sectors using nuclear technology and ionising radiation, including both energy-related and non-energy applications. In the medical field, PIANOFORTE works to reduce uncertainties in health risk estimates and to support innovations in cancer diagnosis and therapies. Other key priorities include developing reliable methods for evaluating radiation protection related to new technologies and managing radiation emergencies, improving strategies for both immediate response and long-term recovery. The Partnership's multi-stage mechanism for prioritising research needs ensures that the efforts undertaken reflect the perspectives of a broad range of stakeholders, including researchers, policy makers, regulators, implementers and practitioners. This inclusiveness aligns research priorities with pressing societal challenges, such as climate change impacts and nuclear technology safety. PIANOFORTE's open call process funds research projects that align with its strategic goals; the network has expanded from 58 to 108 partners with the inclusion of new partners from projects granted during the first two open calls. Additional calls will continue to foster collaboration and increase research capacity across Europe. By adopting FAIR (Findable, Accessible, Interoperable, and Reusable) data management practices and embracing open science, PIANOFORTE supports the broader radiation protection community in sharing infrastructure and research outcomes. Educational initiatives are central to PIANOFORTE's mission, as it builds Europe's expertise in radiation protection through training programmes for current and next-generation scientists. Structured dialogue with stakeholders strengthens the Partnership's impact, bridging research and policy and helping to create a well-informed, resilient society capable of making sound, risk-aware decisions about nuclear and radiation-related issues.

    Representativity studies of GEN-III large cores to ZPR experiments with respect to nuclear data

    Uncertainty quantification plays a crucial role in demonstrating the safety of nuclear reactors by assessing and accounting for the various sources of uncertainty in reactor performance predictions. This process helps establish safety margins, which are essential for ensuring that the reactor operates safely under a wide range of conditions. For existing reactors, it is mainly based on comparisons between calculations and measurements. However, the lack of experimental data in some cases (new reactor concepts, accidental conditions, …) has made the so-called “transposition”, at the very least, a complement to the latter. The most commonly used methods for this purpose rely on Bayesian inference and require a high degree of similarity between the integral parameters of the different configurations, also called representativity. This paper presents the methodology and some results of representativity factors evaluated between ZPR experiments and a Gen-III+ target core derived from the UAM benchmark, at different scales and throughout the fuel cycle, using the industrial state-of-the-art code COCAGNE. The goal is to study the relevance of such an approach in an industrial context. The paper focuses on the effective multiplication factor and the center-over-periphery fission rate ratio. Standard (SPT) and generalized (GPT) perturbation theories are employed to determine sensitivities with respect to nuclear data, and their uncertainties are propagated to the outputs through the sandwich rule, with covariance data collapsed from a fine to a coarse energy mesh.
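
    For reference, the two quantities at the heart of this approach are conventionally written as follows (standard sensitivity/uncertainty notation; the symbols are ours and not necessarily those of the paper). With S_R the vector of sensitivities of an integral parameter R to the nuclear data and M the nuclear-data covariance matrix,

        \operatorname{var}(R) \simeq S_R^{\top} M \, S_R \quad \text{(sandwich rule)},
        \qquad
        r_{12} = \frac{S_1^{\top} M S_2}{\sqrt{\left(S_1^{\top} M S_1\right)\left(S_2^{\top} M S_2\right)}}

    where r_{12} is the representativity factor between the experiment (subscript 1) and the target application (subscript 2); a value of r_{12} close to 1 indicates that the experiment constrains essentially the same nuclear-data-induced uncertainty as the application.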

    0 full texts, 341 metadata records. Updated in last 30 days.