19 research outputs found

    Assessment of desertification sensitivity in Algeria

    Get PDF
    The map of desertification sensitivity is established with the MEDALUS approach, developed for the Mediterranean region, on the basis of a desertification sensitivity index, itself built from several relevant factors known to influence degradation processes (climate, vegetation, soil, human activities). Cartographic and alphanumeric data are captured and structured in a database, then managed and analysed with a Geographic Information System (GIS). Each of the layers feeding the GIS (climate, soil, vegetation) is obtained from the geometric mean of the weighted values of the parameters chosen to characterize and quantify the corresponding factor. The output data have enabled the production of small-scale maps carrying a quality index; these quality maps, together with the resulting map of desertification sensitivity, are discussed, illustrating both the interest and the limits of the approach.
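
    The MEDALUS aggregation described above reduces, in essence, to nested geometric means. As a rough illustration only, the sketch below shows that computation in Python; the factor names, parameter scores and the familiar MEDALUS 1 (best) to 2 (worst) scale are assumptions for illustration, not values from the study.

```python
import numpy as np

def quality_index(scores):
    """Geometric mean of the weighted parameter scores for one factor
    (climate, soil, vegetation or management), as described in the abstract."""
    scores = np.asarray(scores, dtype=float)
    return scores.prod() ** (1.0 / scores.size)

def sensitivity_index(quality_indices):
    """Desertification sensitivity index as the geometric mean of the
    factor quality indices (MEDALUS-style aggregation)."""
    qi = np.asarray(quality_indices, dtype=float)
    return qi.prod() ** (1.0 / qi.size)

# Illustrative scores on the usual MEDALUS 1 (best) to 2 (worst) scale.
climate_qi    = quality_index([1.2, 1.5, 1.1])        # e.g. rainfall, aridity, aspect
soil_qi       = quality_index([1.4, 1.3, 1.6, 1.2])
vegetation_qi = quality_index([1.7, 1.5, 1.3, 1.4])
management_qi = quality_index([1.3, 1.5])

esai = sensitivity_index([climate_qi, soil_qi, vegetation_qi, management_qi])
print(f"Desertification sensitivity index: {esai:.2f}")
```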

    Machine Learning Techniques for Fast Shower Simulation at the ATLAS Experiment

    No full text
    The simulation of the passage of particles through the detectors of the Large Hadron Collider (LHC) is a core component of any physics analysis. However, a detailed and accurate simulation of the detector response using the Geant4 toolkit is a time- and CPU-consuming process. This is especially intensified by the large number of simulated events that a typical physics analysis needs. This thesis documents Machine Learning (ML) based alternatives for a faster simulation of the showering of particles in the ATLAS calorimeter. An ML approach that extends the current parametrized simulation is also proposed. The work presented in this thesis follows three main stages: data preprocessing, ML model design and validation, and integration into the ATLAS simulation framework. For data preprocessing, the calorimeter cell information is used to derive a suitable data structure. A finer granularity of voxels is then used to better capture the structure of the shower and to extend the range of energies and calorimeter regions covered. In the preprocessing stage, an innovative ML technique is introduced to automatically learn the optimal structure of the data. The resulting structure is general enough to be compatible with any particle energy and detector region. Once the data are preprocessed and an adapted structure is defined, a Variational AutoEncoder (VAE) termed FastCaloVSim learns to reproduce the showering of particles in the ATLAS calorimeter. A new, physics-inspired loss function is proposed to accurately reproduce the shower energy, the total energy deposited per calorimeter layer, and the total energy deposited in all the layers. Furthermore, the VAE is designed as a conditional model, i.e. the learning is conditioned on the pseudorapidity region of the calorimeter as well as on the energy of the particle. The model performance is evaluated both as a standalone algorithm and as part of the ATLAS simulation framework. The last stage of this work describes the integration of FastCaloVSim into Athena, the ATLAS simulation framework, allowing further validation of the overall simulation pipeline.
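
    As a rough illustration of the kind of model the thesis describes, the sketch below shows a conditional VAE whose loss adds a term on the energy deposited per calorimeter layer. It is not the FastCaloVSim implementation: the network sizes, the voxel-to-layer mapping and the weighting of the loss terms are assumptions.

```python
# Minimal sketch (not the FastCaloVSim implementation): a conditional VAE whose
# loss adds a term on the energy deposited per calorimeter layer. Voxelisation,
# network sizes and the weighting of the terms are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalVAE(nn.Module):
    def __init__(self, n_voxels, n_cond=2, latent_dim=10, hidden=256):
        super().__init__()
        # Encoder and decoder are both conditioned on (energy, |eta|).
        self.encoder = nn.Sequential(
            nn.Linear(n_voxels + n_cond, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * latent_dim))               # mean and log-variance
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + n_cond, hidden), nn.ReLU(),
            nn.Linear(hidden, n_voxels), nn.Softmax(dim=-1))  # energy fractions per voxel

    def forward(self, x, cond):
        mu, logvar = self.encoder(torch.cat([x, cond], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparametrisation
        return self.decoder(torch.cat([z, cond], dim=-1)), mu, logvar

def loss_fn(x, x_hat, mu, logvar, layer_masks, beta=1.0, gamma=1.0):
    """ELBO plus an illustrative 'physics' term comparing the energy fraction
    deposited in each calorimeter layer between input and reconstruction."""
    recon = F.mse_loss(x_hat, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    # layer_masks: (n_layers, n_voxels) binary map from voxel to layer.
    e_layer_true = x @ layer_masks.T
    e_layer_pred = x_hat @ layer_masks.T
    energy_term = F.mse_loss(e_layer_pred, e_layer_true, reduction="mean")
    return recon + beta * kl + gamma * energy_term
```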

    MetaHEP: Meta learning for fast shower simulation of high energy physics experiments

    No full text
    For High Energy Physics (HEP) experiments, such as the Large Hadron Collider (LHC) experiments, the calorimeter is a key detector to measure the energy of particles. Incident particles interact with the material of the calorimeter, creating cascades of secondary particles, so-called showers. A detailed description of the showering process relies on simulation methods that precisely describe all particle interactions with matter. Constrained by the need for precision, the simulation of calorimeters is inherently slow and constitutes a bottleneck for HEP analysis. Furthermore, with the upcoming high-luminosity upgrade of the LHC and a much-increased data production rate, the amount of required simulated events will increase. Several recent research directions have investigated the use of Machine Learning based models to accelerate specific calorimeter response simulations. These models typically require a large amount of data and time for training, and the result is a simulation tuned to one specific configuration. Meta-learning has emerged recently as a fast learning approach using small training datasets. In this paper, we use a meta-learning approach that “learns to learn” to generate showers from multiple calorimeter geometries, using a first-order gradient-based algorithm. We present MetaHEP, the first application of the meta-learning approach to accelerate shower simulation, using very highly granular data from one of the calorimeters proposed for the Future Circular Collider (FCC), a next generation of high-performance particle colliders.
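
    The “first-order gradient-based algorithm” mentioned above is not spelled out in the abstract; the sketch below shows a Reptile-style meta-training loop over several calorimeter geometries as one plausible reading. Task sampling, step counts and learning rates are illustrative assumptions, not the MetaHEP configuration.

```python
# Minimal sketch of a Reptile-style first-order meta-training loop over several
# detector geometries. The specific update rule, task sampling and
# hyperparameters are illustrative assumptions, not the MetaHEP implementation.
import copy
import random
from itertools import cycle

import torch

def meta_train(model, geometry_loaders, loss_fn, meta_steps=1000,
               inner_steps=5, inner_lr=1e-3, meta_lr=0.1):
    for _ in range(meta_steps):
        # Sample one geometry (task) and adapt a copy of the model to it.
        loader = random.choice(geometry_loaders)
        fast_model = copy.deepcopy(model)
        opt = torch.optim.SGD(fast_model.parameters(), lr=inner_lr)
        batches = cycle(loader)
        for _ in range(inner_steps):
            showers, conditions = next(batches)
            opt.zero_grad()
            loss_fn(fast_model, showers, conditions).backward()
            opt.step()
        # Reptile update: move the meta-parameters towards the adapted ones.
        with torch.no_grad():
            for p, q in zip(model.parameters(), fast_model.parameters()):
                p.add_(q - p, alpha=meta_lr)
    return model
```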

    Meta-learning for multiple detector geometry modeling

    No full text
    The simulation of the passage of particles through the detectors of High Energy Physics (HEP) experiments is a core component of any physics analysis. A detailed and accurate simulation of the detector response using the Geant4 toolkit is a time- and CPU-consuming process. With the upcoming high-luminosity LHC upgrade, with more complex events and a much-increased trigger rate, the amount of required simulated events will increase. Several research directions have investigated the use of Machine Learning based models to accelerate specific detector response simulations. This results in a simulation tuned to a particular configuration, and these models generally require a large amount of data for training. Meta-learning has emerged recently as a fast learning algorithm using small training datasets. In this paper, we propose a meta-learning model that “learns to learn” to generate electromagnetic showers using a first-order gradient-based algorithm. This model is trained on multiple detector geometries and can rapidly adapt to a new geometry using few training samples.
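
    To complement the meta-training loop sketched for the previous entry, the snippet below illustrates the adaptation step: starting from meta-learned weights, the model is fine-tuned on a handful of showers from a geometry not seen during meta-training. Step count and learning rate are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of few-shot adaptation to a new detector geometry, starting
# from meta-learned weights. Step count and learning rate are illustrative.
import copy
import torch

def adapt_to_new_geometry(meta_model, few_shot_batches, loss_fn,
                          steps=10, lr=1e-3):
    model = copy.deepcopy(meta_model)          # keep the meta-weights intact
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        for showers, conditions in few_shot_batches:   # only a few batches
            opt.zero_grad()
            loss_fn(model, showers, conditions).backward()
            opt.step()
    return model
```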

    Évaluation de la sensibilité à la désertification en Algérie

    No full text
    Assessment of desertification sensitivity in Algeria. The map of desertification sensitivity is established with the MEDALUS approach, developed for the Mediterranean region, on the basis of a desertification sensitivity index, itself built from several relevant factors known to influence degradation processes (climate, vegetation, soil, human activities). Cartographic and alphanumeric data are captured and structured in a database, then managed and analysed with a Geographic Information System (GIS). Each of the layers feeding the GIS (climate, soil, vegetation) is obtained from the geometric mean of the weighted values of the parameters chosen to characterize and quantify the corresponding factor. The output data have enabled the production of small-scale maps carrying a quality index; these quality maps, together with the resulting map of desertification sensitivity, are discussed, illustrating both the interest and the limits of the approach. Salamani Mostefa, Kadi-Hanifi Halima, Hirche Aziz, Nedjraoui Dalila. Évaluation de la sensibilité à la désertification en Algérie. In: Revue d'Écologie (La Terre et La Vie), tome 68, n°1, 2013, pp. 71-84.

    Fast Calorimeter Simulation in ATLAS with DNNs

    No full text
    The ATLAS physics program relies on very large samples of GEANT4-simulated events, which provide a highly detailed and accurate simulation of the ATLAS detector. But this accuracy comes at a high price in CPU, predominantly caused by the calorimeter simulation. The sensitivity of many physics analyses is already limited by the available Monte Carlo statistics and will be even more so in the future. Therefore, sophisticated fast simulation tools are being developed. Prototypes using cutting-edge machine learning approaches are being developed to learn the appropriate calorimeter response, and are expected to improve the modeling of correlations within showers. Two different approaches, using Variational Auto-Encoders or Generative Adversarial Networks, are trained to model the shower simulation. These new tools are described and first results are presented.
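
    The VAE loss was sketched for the thesis entry above; for the second approach mentioned here, the snippet below sketches one adversarial training step of a shower GAN conditioned on the particle energy. The generator and discriminator interfaces and all hyperparameters are assumptions, not the ATLAS prototypes.

```python
# Minimal sketch of one adversarial training step for a shower GAN conditioned
# on the particle energy. Network definitions, conditioning scheme and
# hyperparameters are illustrative assumptions, not the ATLAS prototypes.
import torch
import torch.nn.functional as F

def gan_step(generator, discriminator, g_opt, d_opt, real_showers, energies,
             latent_dim=50):
    batch = real_showers.size(0)
    noise = torch.randn(batch, latent_dim)

    # Discriminator update: real showers vs. generated ones.
    d_opt.zero_grad()
    fake = generator(noise, energies).detach()
    d_loss = (F.binary_cross_entropy_with_logits(
                  discriminator(real_showers, energies), torch.ones(batch, 1)) +
              F.binary_cross_entropy_with_logits(
                  discriminator(fake, energies), torch.zeros(batch, 1)))
    d_loss.backward()
    d_opt.step()

    # Generator update: try to fool the discriminator.
    g_opt.zero_grad()
    fake = generator(noise, energies)
    g_loss = F.binary_cross_entropy_with_logits(
        discriminator(fake, energies), torch.ones(batch, 1))
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```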

    Deep generative models for fast shower simulation in ATLAS

    No full text
    Modeling the physics of the detector response to particle collisions is one of the most CPU-intensive and time-consuming aspects of LHC computing. With the upcoming high-luminosity upgrade and the need for ever-larger simulated datasets to support physics analysis, the development of new, faster simulation techniques is required. A promising line of investigation is to switch from generating physics samples by means of a programmed detector response in Geant4 to using a deep neural network that learns the appropriate output response, trained on a much smaller set of Geant4-simulated events. Such networks are then capable of generating new outputs, statistically independent from the training sample. Deep neural networks have been widely used in various science areas, with a recent focus on generative modeling using Variational Auto-Encoders (VAEs) and Generative Adversarial Networks (GANs). We present results from a feasibility study to model particle showers in the ATLAS calorimeter. We demonstrate which features of fully simulated showers are reproduced and discuss the speed improvements.

    Contribution of phytoecological data to spatialize soil erosion : application of the RUSLE model in the Algerian atlas

    No full text
    Among the models used to assess water erosion, the RUSLE model is one of the most commonly applied. Policy makers can act on cover (C-factor) and conservation practice (P-factor) to reduce erosion, with less costly action on soil surface characteristics. However, the widespread use of vegetation indices such as NDVI does not allow for a proper assessment of the C-factor in drylands, where stones, crusted surfaces and litter strongly influence soil protection. Two sub-factors of C, canopy cover (CC) and soil cover (SC), can be assessed from phytoecological measurements that include gravel-pebble cover, physical mulch, and annual and perennial vegetation. This paper introduces a method to calculate the C-factor from phytoecological data and, in combination with remote sensing and a geographic information system (GIS), to map it over large areas. A supervised classification, based on field phytoecological data, is applied to radiometric data from Landsat-8/OLI satellite images. Then, a C-factor value, whose SC and CC subfactors are directly derived from the phytoecological measurements, is assigned to each land cover unit. This method and RUSLE are implemented on a pilot region of 3828 km² of the Saharan Atlas, composed of rangelands and steppe formations, and intended to become an observatory. The protective effect against erosion of gravel-pebbles (50%) is more than twice that of vegetation (23%). The C-factor derived from NDVI (0.67) is higher and more evenly distributed than that combining these two contributions (0.37 on average). Finally, priorities are proposed to decision-makers by crossing the synthetic map of erosion sensitivity with a decision matrix of management priorities.
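
    As a rough illustration of how a C-factor can be assembled from the CC and SC subfactors mentioned above, the sketch below uses commonly quoted RUSLE-style subfactor forms. The functional forms and coefficients are assumptions for illustration, not the calibration used in the paper.

```python
# Sketch of a C-factor built from canopy-cover (CC) and surface-cover (SC)
# subfactors. The functional forms and coefficients below follow commonly
# used RUSLE-style relationships and are assumptions, not the paper's values.
import numpy as np

def canopy_cover_subfactor(f_canopy, fall_height_m, h_coeff=0.1):
    """CC subfactor: protection by the vegetation canopy.
    f_canopy: fraction of soil covered by canopy (0-1);
    fall_height_m: effective drop-fall height of intercepted rain (m)."""
    return 1.0 - f_canopy * np.exp(-h_coeff * fall_height_m)

def surface_cover_subfactor(ground_cover_pct, b=0.035):
    """SC subfactor: protection by material in contact with the soil
    (gravel and pebbles, litter, physical mulch), in percent cover."""
    return np.exp(-b * ground_cover_pct)

def c_factor(f_canopy, fall_height_m, gravel_pct, litter_pct):
    cc = canopy_cover_subfactor(f_canopy, fall_height_m)
    sc = surface_cover_subfactor(gravel_pct + litter_pct)
    return cc * sc

# Illustrative example: 23% perennial canopy at 0.3 m fall height,
# 50% gravel-pebble cover, 5% litter.
print(f"C = {c_factor(0.23, 0.3, 50.0, 5.0):.2f}")
```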

    Deep generative models for fast shower simulation in ATLAS

    Full text link
    The need for large-scale and high-fidelity simulated samples for the extensive physics program of the ATLAS experiment at the Large Hadron Collider motivates the development of new simulation techniques. Building on the recent success of deep learning algorithms, Variational Auto-Encoders and Generative Adversarial Networks are investigated for modeling the response of the ATLAS electromagnetic calorimeter for photons in a central calorimeter region over a range of energies. The properties of synthesized showers are compared to showers from a full detector simulation using Geant4. This feasibility study demonstrates the potential of using such algorithms for fast calorimeter simulation for the ATLAS experiment in the future and opens the possibility to complement current simulation techniques. To employ generative models for physics analyses, it is required to incorporate additional particle types and regions of the calorimeter and to enhance the quality of the synthesized showers.
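
    As an illustration of the kind of comparison mentioned above (synthesized versus Geant4 showers), the sketch below bins a single per-shower observable for both samples and computes a simple chi-square. The observable, binning and error treatment are assumptions, not the validation procedure used in the study.

```python
# Minimal sketch of one way to compare synthesized and Geant4 showers: build
# the distribution of a simple per-shower observable (e.g. total deposited
# energy) for both samples and compute a crude reduced chi-square.
import numpy as np

def chi2_between_samples(sample_a, sample_b, bins=40):
    """sample_a, sample_b: 1D arrays of the observable, one entry per shower."""
    lo = min(sample_a.min(), sample_b.min())
    hi = max(sample_a.max(), sample_b.max())
    h_a, _ = np.histogram(sample_a, bins=bins, range=(lo, hi))
    h_b, _ = np.histogram(sample_b, bins=bins, range=(lo, hi))
    # Normalise to unit area and compare bin by bin with crude Poisson errors.
    h_a = h_a / h_a.sum()
    h_b = h_b / h_b.sum()
    err2 = h_a / len(sample_a) + h_b / len(sample_b)
    mask = err2 > 0
    return np.sum((h_a[mask] - h_b[mask]) ** 2 / err2[mask]) / mask.sum()

# Hypothetical usage, assuming arrays of per-shower total energy exist:
# chi2 = chi2_between_samples(total_energy_geant4, total_energy_vae)
```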