The ESCAPE project : Energy-efficient Scalable Algorithms for Weather Prediction at Exascale
In the simulation of complex multi-scale flows arising in weather and climate modelling, one of the biggest challenges is to satisfy strict service requirements in terms of time to solution and to satisfy budgetary constraints in terms of energy to solution, without compromising the accuracy and stability of the application. These simulations require algorithms that minimise the energy footprint along with the time required to produce a solution, maintain the physically required level of accuracy, are numerically stable, and are resilient in case of hardware failure.
The European Centre for Medium-Range Weather Forecasts (ECMWF) led the ESCAPE (Energy-efficient Scalable Algorithms for Weather Prediction at Exascale) project, funded by Horizon 2020 (H2020) under the FET-HPC (Future and Emerging Technologies in High Performance Computing) initiative. The goal of ESCAPE was to develop a sustainable strategy to evolve weather and climate prediction models to next-generation computing technologies. The project partners incorporate the expertise of leading European regional forecasting consortia, university research, experienced high-performance computing centres, and hardware vendors.
This paper presents an overview of the ESCAPE strategy: (i) identify domain-specific key algorithmic motifs in weather prediction and climate models (which we term Weather & Climate Dwarfs), (ii) categorise them in terms of computational and communication patterns while (iii) adapting them to different hardware architectures with alternative programming models, (iv) analyse the challenges in optimising them, and (v) find alternative algorithms for the same scheme. The participating weather prediction models are the following: IFS (Integrated Forecasting System); ALARO, a combination of AROME (Application de la Recherche à l'Opérationnel à Meso-Echelle) and ALADIN (Aire Limitée Adaptation Dynamique Développement International); and COSMO-EULAG, a combination of COSMO (Consortium for Small-scale Modeling) and EULAG (Eulerian and semi-Lagrangian fluid solver). For many of the weather and climate dwarfs, ESCAPE provides prototype implementations on different hardware architectures (mainly Intel Skylake CPUs, NVIDIA GPUs, Intel Xeon Phi, and the Optalysys optical processor) with different programming models. The spectral transform dwarf serves as a detailed example of the co-design cycle of an ESCAPE dwarf.
The dwarf concept has proven to be extremely useful for the rapid prototyping of alternative algorithms and their interaction with hardware, e.g. through the use of a domain-specific language (DSL). Manual adaptations have led to substantial accelerations of key algorithms in numerical weather prediction (NWP) but are not a general recipe for the performance portability of complex NWP models. Existing DSLs are found to require further evolution but are promising tools for achieving the latter. Measurements of energy and time to solution suggest that a future focus needs to be on exploiting the simultaneous use of all available resources in hybrid CPU-GPU arrangements.
Local strategy for managing patients with suspected Covid-19 infection in Cantal during the first lockdown
BACKGROUND: in Cantal, the Covid-19 pandemic led to the creation of consultation centres dedicated to Covid-19 during the first lockdown: the CMAS. These structures are the result of a collaboration between community medicine, hospital medicine, and local authorities. Their purpose is to examine patients with suspected Covid-19 infection under safe conditions, in order to protect medical practices in Cantal and to pool human and material resources. OBJECTIVE: to evaluate the strategy for managing patients with suspected Covid-19 infection, in particular by analysing data from the CMAS and the impact of the Covid-19 epidemic on the population. METHOD: this is a retrospective quantitative analytical study conducted from 16 March to 10 May 2020, comparing data from Cantal with data from the Auvergne-Rhône-Alpes region and from France as a whole. The data come from the CMAS, the Cantal DIM, Geodes Santé Publique France, the DREES, and INSEE. RESULTS: the CMAS were beneficial in the management of patients with suspected Covid-19 infection, reducing the number of people attending emergency departments and efficiently triaging severe or potentially severe cases. This protected the medical practices, the population, and the hospitals of Cantal. The strategy arose from a collaboration between community medicine, hospital medicine, and local authorities. Some of the data suggest that Covid-19 did not have a severe impact on the population: low critical-care bed occupancy, low cumulative death rates, and low hospitalisation rates per 100,000 inhabitants. CONCLUSION: the people of Cantal applied the national guidelines (COREB), adapted them to local requirements, and responded quickly and effectively to the Covid-19 epidemic by creating the CMAS.
Single-centre study investigating an indicator of exposure to nanoparticles and evaluating its relationship with idiopathic infiltrative lung diseases (the NanoPI study)
Introduction: nanomaterials are a rapidly expanding technology. Despite a growing number of publications, the available data on their potential impact on human health are very limited, particularly with regard to the respiratory system. Objective: the aim of this study is to obtain qualitative and quantitative information on the exposure to nanoelements (NEs) of patients presenting with diffuse infiltrative lung disease (ILD). The secondary objective is to look for a possible link between the level of exposure and the type of ILD, depending on whether it is idiopathic or has an identified aetiology. Materials and methods: this is a prospective, cross-sectional, single-centre study conducted in the Pneumology department of the CHU de St Etienne. Biological samples (blood, urine, bronchial aspirates, bronchoalveolar lavage (BAL), external auditory canal samples) from 100 patients were analysed to determine their NE load. Results: an interim analysis of 45 patients showed a highly variable NE load in the bronchial aspirates, with three exposure levels identified. No significant difference was found between the two groups. Only two BAL samples showed a notable level of NEs. The mineralogical analyses are pending. Conclusion: exposure to NEs appears to be highly variable but real. It was not possible to establish a link between this exposure and the type of ILD presented by the patient. The near-complete absence of NEs in the BAL raises many questions about the biodistribution and biopersistence of these NEs, both in the respiratory system and systemically. The mineralogical analyses should improve our knowledge in the field of nanotoxicity.
Fast Sky to Sky Interpolation for Radio Interferometric Imaging
Reconstruction of radio interferometric images requires processing Fourier-space data that do not lie on regular coordinates, preventing direct use of the Fast Fourier Transform. The most common solution is to rely on interpolation algorithms, called gridding, that are computationally expensive. In this paper, we propose an algorithmic reinterpretation, named the sky to sky method, to reduce the computational cost of the gridding operation and its adjoint, the degridding, when they are used successively. We analyse the impact of the interpolation step size on the computational cost and the reconstruction error. We also illustrate this optimisation on a full reconstruction with gradient descent and the CLEAN algorithm. Finally, we obtain acceleration factors between 1.2 and 16.4 without additional approximation.
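The abstract does not spell out what the gridding operation looks like; a minimal sketch of conventional convolutional gridding may help fix ideas. This is an illustrative reconstruction, not the paper's sky to sky method: the function name `grid_visibilities`, the Gaussian kernel, and all parameter choices are assumptions made for the example.

```python
import numpy as np

def grid_visibilities(uv, vis, n, cell, half_width=3, sigma=1.0):
    """Convolutional gridding: spread irregular Fourier-plane samples
    (visibilities) onto a regular n x n grid so the FFT becomes usable.

    uv         : (m, 2) array of Fourier-plane (u, v) coordinates
    vis        : (m,) complex visibilities
    cell       : grid cell size, in the same units as uv
    half_width : half-width of the convolution kernel support, in cells
    sigma      : width of the (illustrative) Gaussian kernel, in cells
    """
    grid = np.zeros((n, n), dtype=complex)
    for (u, v), s in zip(uv, vis):
        # Nearest grid cell, with the grid centre at index n // 2.
        iu = int(round(u / cell)) + n // 2
        iv = int(round(v / cell)) + n // 2
        for du in range(-half_width, half_width + 1):
            for dv in range(-half_width, half_width + 1):
                ju, jv = iu + du, iv + dv
                if 0 <= ju < n and 0 <= jv < n:
                    # Weight by kernel value at the distance from the sample.
                    d2 = ((ju - n // 2 - u / cell) ** 2
                          + (jv - n // 2 - v / cell) ** 2)
                    grid[jv, ju] += s * np.exp(-d2 / (2 * sigma ** 2))
    return grid

# Toy usage: grid three visibilities, then form a dirty image by inverse FFT.
uv = np.array([[1.3, -2.7], [0.2, 4.1], [-3.6, 0.9]])
vis = np.array([1 + 0j, 0.5 - 0.2j, -0.3 + 0.8j])
g = grid_visibilities(uv, vis, n=64, cell=1.0)
image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(g))).real
```

The cost driver the paper targets is visible here: each sample touches `(2 * half_width + 1)**2` grid cells, and degridding (the adjoint) repeats the same kernel evaluations in reverse, which is why chaining the two operations invites reuse.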