4,691 research outputs found

    ASCR/HEP Exascale Requirements Review Report

    Full text link
    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of demand on the 2025 timescale is at least two orders of magnitude greater than that available currently -- and in some cases more. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To make the best use of ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) the ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems. Comment: 77 pages, 13 figures; draft report, subject to further revision

    Unraveling Diffusion in Fusion Plasma: A Case Study of In Situ Processing and Particle Sorting

    Full text link
    This work develops an in situ processing capability to study a particular diffusion process in magnetic confinement fusion. This diffusion process involves plasma particles that are likely to escape confinement. Such particles carry a significant amount of energy from the burning plasma inside the tokamak to the divertor and can damage the divertor plate. This study requires in situ processing because of the fast-changing nature of the particle diffusion process. However, the in situ approach is challenging because the amount of data to be retained for the diffusion calculations increases over time, unlike in other in situ processing cases where the amount of data to be processed is constant over time. Here we report our preliminary efforts to control the memory usage while ensuring the necessary analysis tasks are completed in a timely manner. Compared with an earlier naive attempt to directly compute the same diffusion displacements in the simulation code, this in situ version reduces the memory usage for particle information by nearly 60% and the computation time by about 20%.
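The memory-control idea described in this abstract can be sketched in a few lines. The sketch below is a hypothetical illustration, not the authors' code: instead of retaining every particle position at every timestep, it keeps only one reference position per particle plus running displacement statistics, and retires particles once they escape, which bounds memory growth.

```python
import numpy as np

class DiffusionTracker:
    """Bounded-memory tracker for per-particle displacements.

    Rather than storing full particle trajectories, keep each particle's
    reference position and accumulate the squared-displacement statistics
    needed for a diffusion estimate.
    """

    def __init__(self, ids, positions):
        # reference (initial) positions, one entry per tracked particle
        self.ref = {pid: p.copy() for pid, p in zip(ids, positions)}
        self.sum_sq = 0.0   # running sum of squared displacements
        self.count = 0      # number of samples accumulated

    def update(self, ids, positions):
        for pid, p in zip(ids, positions):
            ref = self.ref.get(pid)
            if ref is None:
                continue  # particle no longer tracked (e.g., escaped)
            d = p - ref
            self.sum_sq += float(d @ d)
            self.count += 1

    def retire(self, ids):
        # drop particles that left the confinement region to cap memory use
        for pid in ids:
            self.ref.pop(pid, None)

    def mean_square_displacement(self):
        return self.sum_sq / self.count if self.count else 0.0
```

The running sums are constant-size per particle, so memory no longer scales with simulation length, only with the number of live particles.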

    Purdue Contribution of Fusion Simulation Program

    Full text link

    Scientific workflow orchestration interoperating HTC and HPC resources

    Get PDF
    8 pages, 7 figures. The PDF of the article is the pre-print version. In this work we describe our developments towards the provision of a unified access method to different types of computing infrastructures at the interoperation level. To that end, we have developed a middleware suite which bridges the non-interoperable middleware stacks used for building distributed computing infrastructures, UNICORE and gLite. Our solution allows users to transparently access and operate on HPC and HTC resources from a single interface. Using Kepler as the workflow manager, we provide users with the integration of codes needed to create scientific workflows that access both types of infrastructures. Peer reviewed.
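The bridging approach described here is essentially an adapter pattern: one interface, two incompatible middleware stacks behind it. The sketch below is a minimal illustration with stub backends (the class and method names are hypothetical and do not reflect the UNICORE or gLite APIs):

```python
from abc import ABC, abstractmethod

class ComputeBackend(ABC):
    """Common interface over otherwise incompatible middleware stacks."""
    @abstractmethod
    def submit(self, job_script: str) -> str:
        ...

class UnicoreBackend(ComputeBackend):
    """HPC side (hypothetical stub; real UNICORE submission differs)."""
    def submit(self, job_script):
        return f"unicore-job-{hash(job_script) & 0xffff}"

class GliteBackend(ComputeBackend):
    """HTC side (hypothetical stub; real gLite submission differs)."""
    def submit(self, job_script):
        return f"glite-job-{hash(job_script) & 0xffff}"

def run_workflow(steps):
    """Dispatch each workflow step to the right backend via one interface,
    as a workflow manager such as Kepler would."""
    backends = {"hpc": UnicoreBackend(), "htc": GliteBackend()}
    return [backends[kind].submit(script) for kind, script in steps]
```

A workflow can then mix HPC and HTC steps freely, since every step goes through the same `submit` call regardless of the underlying stack.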

    Shape: A 3D Modeling Tool for Astrophysics

    Full text link
    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data types, and amount of data required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model predictions with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects. Comment: 13 pages, 11 figures, accepted for publication in "IEEE Transactions on Visualization and Computer Graphics"
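The "optimize model parameters against the observation" step can be illustrated with a minimal least-squares sketch. This is a hypothetical stand-in (the `render` callable and the grid-search strategy are assumptions, not Shape's actual optimizer):

```python
import numpy as np

def fit_model(observed, render, param_grid):
    """Pick the parameter set whose rendered model best matches the data.

    `render` maps a parameter value to a model image/array; the residual
    is the sum of squared differences against the observation. A real
    tool would use a proper optimizer instead of a coarse grid search.
    """
    best, best_err = None, np.inf
    for params in param_grid:
        err = float(np.sum((render(params) - observed) ** 2))
        if err < best_err:
            best, best_err = params, err
    return best, best_err
```

The interactively defined 3D structure fixes the model family; the automatic step only tunes its free parameters against the data.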

    Development of a tokamak integrated simulation code and studies of its application to various devices

    Get PDF
    Doctoral dissertation (Ph.D.) -- Seoul National University Graduate School: College of Engineering, Department of Energy Systems Engineering, August 2022. Yong-Su Na. The in-depth design and implementation of a newly developed integrated suite of codes, TRIASSIC (tokamak reactor integrated automated suite for simulation and computation), are reported. The suite comprises existing plasma simulation codes, including equilibrium solvers, 1.5D and 2D plasma transport solvers, neoclassical and anomalous transport models, current drive and heating (cooling) models, and 2D grid generators. The components in TRIASSIC could be fully modularized by adopting a generic data structure as its internal data. Owing to a unique interfacing method that does not depend on the generic data itself, legacy codes that are no longer maintained by their original authors were easily interfaced. The graphical user interface and the parallel computing of the framework and its components are also addressed. The verification of TRIASSIC in terms of equilibrium, transport, and heating is also shown. Following the data model and definition of the data structure, a declarative programming method was adopted in the core part of the framework. The method was used to keep the internal consistency of the data by enforcing the reciprocal relations between the data nodes, contributing to extra flexibility and explicitness of the simulations. TRIASSIC was applied to various devices, including KSTAR, VEST, and KDEMO, owing to its flexibility in composing a workflow. TRIASSIC was validated against KSTAR plasmas in terms of interpretive and predictive modeling. The prediction and validation on the VEST device using TRIASSIC are also shown. For application to the upcoming KDEMO device, the machine design parameters were optimized, targeting an economical fusion demonstration reactor.
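The declarative consistency mechanism described in this abstract can be illustrated with a minimal sketch (all names are hypothetical; this is not the TRIASSIC implementation): relations between data nodes are declared once, and the framework re-applies them whenever a source node changes, so dependent nodes can never drift out of sync.

```python
class DataTree:
    """Minimal declarative-relation store: each relation recomputes a
    dependent node from its source nodes whenever a source is updated."""

    def __init__(self):
        self._data = {}
        self._relations = []  # (target, sources, function)

    def declare(self, target, sources, fn):
        """Declare once that `target` is always fn(*sources)."""
        self._relations.append((target, sources, fn))

    def set(self, node, value):
        self._data[node] = value
        self._propagate()

    def get(self, node):
        return self._data[node]

    def _propagate(self):
        # re-apply relations until a fixed point (simple, non-optimized scheme)
        changed = True
        while changed:
            changed = False
            for target, sources, fn in self._relations:
                if all(s in self._data for s in sources):
                    new = fn(*(self._data[s] for s in sources))
                    if self._data.get(target) != new:
                        self._data[target] = new
                        changed = True
```

The payoff is the one claimed in the abstract: physics components read and write nodes without knowing about each other, and the declared relations keep the shared data internally consistent.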

    Reconstruction of tokamak plasma safety factor profile using deep learning

    Full text link
    In tokamak operations, accurate equilibrium reconstruction is essential for reliable real-time control and realistic post-shot instability analysis. The safety factor (q) profile defines the magnetic field line pitch angle and is the central element in equilibrium reconstruction. The motional Stark effect (MSE) diagnostic has been a standard measurement of the magnetic field line pitch angle in tokamaks equipped with neutral beams. However, MSE data are not always available due to experimental constraints, especially in future devices without neutral beams. Here we develop a deep learning-based surrogate model of the gyrokinetic toroidal code for q profile reconstruction (SGTC-QR) that can reconstruct the q profile from measurements without MSE, mimicking the traditional equilibrium reconstruction with the MSE constraint. The model demonstrates promising performance, and its sub-millisecond inference time is compatible with the real-time plasma control system.
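The surrogate described here is, structurally, a learned map from a vector of non-MSE measurements to a q profile on a radial grid. The sketch below shows only that structure as a tiny forward pass with randomly initialized weights; the layer sizes, input/output dimensions, and architecture are assumptions for illustration, not those of SGTC-QR:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 32 non-MSE measurements in, q on 65 radial points out.
N_IN, N_HID, N_OUT = 32, 64, 65

# Randomly initialized two-layer MLP standing in for a trained surrogate.
W1 = rng.normal(0.0, 0.1, (N_IN, N_HID)); b1 = np.zeros(N_HID)
W2 = rng.normal(0.0, 0.1, (N_HID, N_OUT)); b2 = np.zeros(N_OUT)

def reconstruct_q(measurements):
    """Map a measurement vector to a q profile (forward pass only).

    In a real surrogate the weights come from training against
    MSE-constrained equilibrium reconstructions; inference is then a
    handful of matrix products, which is why sub-millisecond latency
    is achievable for real-time control.
    """
    h = np.tanh(measurements @ W1 + b1)
    return h @ W2 + b2

q = reconstruct_q(rng.normal(size=N_IN))
```

The speed claim in the abstract follows from this shape: once trained, inference avoids the iterative equilibrium solve entirely.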
