29 research outputs found

    Fine-grained parallelization of a Vlasov-Poisson application on GPU

    Understanding turbulent transport in magnetised plasmas is a subject of major importance for optimising experiments in tokamak fusion reactors, and simulations of fusion plasma consume a great amount of CPU time on today's supercomputers. The Vlasov equation provides a useful framework to model such plasmas. In this paper, we focus on the parallelization of a 2D semi-Lagrangian Vlasov solver on GPGPU. The originality of the approach lies in the overhaul of both the numerical scheme and the algorithms that is needed to compute accurately and efficiently within the CUDA framework. First, we show how to deal with 32-bit floating-point precision and examine the resulting accuracy issues. Second, we exhibit a very fine-grained parallelization that fits well on a many-core architecture. A speed-up of almost 80 has been obtained by using a GPU instead of one CPU core. As far as we know, this work presents the first semi-Lagrangian Vlasov solver ported to a GPU.
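
    As a rough illustration of the semi-Lagrangian idea referred to above, the following sketch traces the characteristics of a 1D advection equation backwards over one time step and interpolates the distribution function at the departure points. It is a minimal NumPy sketch, not the paper's CUDA implementation: the grid size, function names and the use of linear (rather than higher-order) interpolation are illustrative assumptions, and single precision is used only to echo the 32-bit setting discussed in the abstract.

```python
# Minimal sketch of one semi-Lagrangian advection step on a periodic 1D grid:
# trace characteristics backwards, then interpolate at the departure points.
import numpy as np

def advect_periodic(f, velocity, dt, dx):
    """Advance f(x) by one step of constant-velocity advection."""
    n = f.size
    shift = velocity * dt / dx              # displacement in grid units
    j = np.arange(n) - shift                # departure points (grid units)
    j0 = np.floor(j).astype(int)            # left neighbour of each departure point
    w = j - j0                              # fractional distance to that neighbour
    # Linear interpolation for brevity; the paper's solver uses a higher-order scheme.
    return (1.0 - w) * f[j0 % n] + w * f[(j0 + 1) % n]

# Example: advect a Gaussian bump once around a periodic unit domain.
x = np.linspace(0.0, 1.0, 256, endpoint=False)
dx = x[1] - x[0]
f = np.exp(-200.0 * (x - 0.5) ** 2).astype(np.float32)   # 32-bit, as on the GPU
for _ in range(256):
    f = advect_periodic(f, velocity=1.0, dt=dx, dx=dx)
```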

    Solving the Vlasov equation for one-dimensional models with long range interactions on a GPU

    We present a GPU parallel implementation of the numerical integration of the Vlasov equation in one spatial dimension, based on a second-order time-split algorithm with a local modified cubic-spline interpolation. We apply our approach to three different systems with long-range interactions: the Hamiltonian Mean Field, the Ring, and the self-gravitating sheet models. Speedups and accuracy for each model and different grid resolutions are presented.
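
    As an illustration of the time-split scheme described above, the sketch below advances a 1D-1V distribution f(theta, v) with second-order (Strang) splitting, using the Hamiltonian Mean Field force as the long-range interaction. It is a hedged NumPy sketch, not the authors' GPU code: np.interp (linear, clamped at the velocity boundaries) stands in for their local modified cubic-spline interpolation, and the grid sizes, time step and variable names are illustrative assumptions.

```python
# Second-order time-split (Strang) step for a 1D-1V Vlasov equation,
# with the Hamiltonian Mean Field (HMF) mean-field force.
import numpy as np

ntheta, nv, dt = 128, 129, 0.05
theta = np.linspace(0.0, 2.0 * np.pi, ntheta, endpoint=False)
v = np.linspace(-4.0, 4.0, nv)
dtheta, dv = theta[1] - theta[0], v[1] - v[0]

def stream(f, dt):
    """Advection in theta: f(theta, v) <- f(theta - v*dt, v), periodic in theta."""
    out = np.empty_like(f)
    for j in range(nv):
        out[:, j] = np.interp(theta - v[j] * dt, theta, f[:, j],
                              period=2.0 * np.pi)
    return out

def kick(f, dt):
    """Advection in v using the HMF mean-field force."""
    mx = np.sum(f * np.cos(theta)[:, None]) * dtheta * dv
    my = np.sum(f * np.sin(theta)[:, None]) * dtheta * dv
    force = -mx * np.sin(theta) + my * np.cos(theta)
    out = np.empty_like(f)
    for i in range(ntheta):
        # np.interp clamps at the v-boundaries; assumes f is negligible there.
        out[i, :] = np.interp(v - force[i] * dt, v, f[i, :])
    return out

def strang_step(f, dt):
    """One time-split step: half stream, full kick, half stream."""
    return stream(kick(stream(f, 0.5 * dt), dt), 0.5 * dt)

# Example: a water-bag-like initial condition evolved for a few steps.
f = np.where(np.abs(v)[None, :] < 1.0, 1.0 + 0.1 * np.cos(theta)[:, None], 0.0)
for _ in range(10):
    f = strang_step(f, dt)
```

    In the same spirit, swapping the force computation in kick() would cover the Ring and self-gravitating sheet models.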

    Efficient algorithms for the fast computation of space charge effects caused by charged particles in particle accelerators

    In this dissertation, a Poisson solver is improved in three respects: an efficient integrated Green's function; the discrete cosine transform of the efficient integrated Green's function values; and an implicitly zero-padded fast Fourier transform of the charge density. In addition, high-performance computing technology is utilized to further improve efficiency, including OpenMP, OpenMP+CUDA, MPI, and MPI+OpenMP parallelizations. The examples and simulation results are compared with those of a commonly used Poisson solver to demonstrate the accuracy of the approach.
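
    For orientation, the sketch below implements the plain zero-padded FFT convolution (Hockney's method) on which such free-space space-charge solvers are built; the dissertation's specific contributions (the efficient integrated Green's function, the discrete cosine transform of its values, and the implicitly zero-padded FFT) refine this baseline and are not reproduced here. It is a NumPy sketch with illustrative names and grid parameters, not the author's code.

```python
# Zero-padded FFT convolution of a 2D charge density with the free-space
# Green's function, giving the potential with open boundary conditions.
import numpy as np

def free_space_potential_2d(rho, dx, dy, eps0=8.8541878128e-12):
    """Potential of a 2D charge density rho (C/m^2) on an nx-by-ny grid."""
    nx, ny = rho.shape
    # Separations on the doubled (zero-padded) grid, wrapped so that indices
    # n and 2*N - n correspond to the same physical distance.
    sx = np.minimum(np.arange(2 * nx), 2 * nx - np.arange(2 * nx)) * dx
    sy = np.minimum(np.arange(2 * ny), 2 * ny - np.arange(2 * ny)) * dy
    r = np.hypot(sx[:, None], sy[None, :])
    r[0, 0] = 0.5 * min(dx, dy)            # soften the singular self term
    green = -np.log(r) / (2.0 * np.pi * eps0)
    # Zero-pad the charge, convolve via FFT, keep the physical quadrant.
    rho_pad = np.zeros((2 * nx, 2 * ny))
    rho_pad[:nx, :ny] = rho * dx * dy      # charge per cell
    phi = np.fft.irfft2(np.fft.rfft2(rho_pad) * np.fft.rfft2(green),
                        s=(2 * nx, 2 * ny))
    return phi[:nx, :ny]

# Example: potential of a single charged cell at the grid centre.
rho = np.zeros((64, 64))
rho[32, 32] = 1.0 / (1e-3 * 1e-3)          # 1 C spread over one 1 mm^2 cell
phi = free_space_potential_2d(rho, dx=1e-3, dy=1e-3)
```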

    Numerical Simulations of the Dark Universe: State of the Art and the Next Decade

    We present a review of the current state of the art of cosmological dark matter simulations, with particular emphasis on the implications for dark matter detection efforts and studies of dark energy. This review is intended both for particle physicists, who may find the cosmological simulation literature opaque or confusing, and for astrophysicists, who may not be familiar with the role of simulations for observational and experimental probes of dark matter and dark energy. Our work is complementary to the contribution by M. Baldi in this issue, which focuses on the treatment of dark energy and cosmic acceleration in dedicated N-body simulations. Truly massive dark matter-only simulations are being conducted on national supercomputing centers, employing from several billion to over half a trillion particles to simulate the formation and evolution of cosmologically representative volumes (cosmic scale) or to zoom in on individual halos (cluster and galactic scale). These simulations cost millions of core-hours, require tens to hundreds of terabytes of memory, and use up to petabytes of disk storage. The field is quite internationally diverse, with top simulations having been run in China, France, Germany, Korea, Spain, and the USA. Predictions from such simulations touch on almost every aspect of dark matter and dark energy studies, and we give a comprehensive overview of this connection. We also discuss the limitations of the cold and collisionless DM-only approach, and describe in some detail efforts to include different particle physics as well as baryonic physics in cosmological galaxy formation simulations, including a discussion of recent results highlighting how the distribution of dark matter in halos may be altered. We end with an outlook for the next decade, presenting our view of how the field can be expected to progress. (abridged) Comment: 54 pages, 4 figures, 3 tables; invited contribution to the special issue "The next decade in Dark Matter and Dark Energy" of the new Open Access journal "Physics of the Dark Universe". Replaced with accepted version.

    ASCR/HEP Exascale Requirements Review Report

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand on the 2025 timescale is at least two orders of magnitude greater than what is available currently, and in some cases more. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems. Comment: 77 pages, 13 figures; draft report, subject to further revision.

    Software for Exascale Computing - SPPEXA 2016-2019

    This open access book summarizes the research done and results obtained in the second funding phase of the Priority Program 1648 "Software for Exascale Computing" (SPPEXA) of the German Research Foundation (DFG), presented at the SPPEXA Symposium in Dresden during October 21-23, 2019. In that respect, it both represents a continuation of Vol. 113 in Springer's series Lecture Notes in Computational Science and Engineering, the corresponding report of SPPEXA's first funding phase, and provides an overview of SPPEXA's contributions towards exascale computing in today's supercomputer technology. The individual chapters address one or more of the research directions (1) computational algorithms, (2) system software, (3) application software, (4) data management and exploration, (5) programming, and (6) software tools. The book has an interdisciplinary appeal: scholars from computational sub-fields in computer science, mathematics, physics, or engineering will find it of particular interest.

    Development, validation and use of computational space weather models (Laskennallisten avaruussäämallien kehittäminen, validointi ja käyttö)

    Currently the majority of space-based assets are located inside the Earth's magnetosphere, where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of near-Earth space, similarly to ordinary weather, which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and, as a result, the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed-memory hardware is also becoming crucial. This thesis consists of an introduction and four peer-reviewed articles, and describes the process of developing numerical space environment/weather models and the use of such models to study near-Earth space. A complete model development chain is presented, starting from initial planning and design, through distributed-memory parallelization and optimization, to testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed-memory hardware and several novel features, the distributed Cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test-particle simulation based on the Vlasov description of plasma is carried out using the Vlasiator model. The test shows that the Vlasov equation for plasma in six-dimensional phase space is solved correctly by Vlasiator, that results are obtained beyond those of the magnetohydrodynamic (MHD) description of plasma, and that global magnetospheric simulations using a hybrid-Vlasov model are feasible on current hardware. For the first time, four global magnetospheric models using the MHD description of plasma (BATS-R-US, GUMICS, OpenGGCM, LFM) are run with identical solar wind input and the results are compared to observations in the ionosphere and outer magnetosphere. Based on the results of the global magnetospheric MHD model GUMICS, a hypothesis is formulated for a new mechanism of plasmoid formation in the Earth's magnetotail.