
    Analysis and estimation of the scientific performance of the GAMMA-400 experiment

    2013/2014. For a comprehensive study that ranges from dark matter to the origin and propagation of cosmic rays, the multi-channel approach is one of the best ways to address the open questions of astroparticle physics. Thanks to its dual nature, devoted to the study of cosmic rays (electrons up to TeV energies, and protons and nuclei up to 10^{15}-10^{16} eV) and of gamma rays (from 50 MeV up to a few TeV), GAMMA-400 will address these problems. The aim of this thesis is the study of the performance of GAMMA-400 for gamma-ray observations. Two different geometry configurations have been studied: the "baseline" and the so-called "enhanced" configuration. The main differences between the two lie in the tracker and in the calorimeter. The "baseline" tracker consists of ten silicon planes, eight of which also include a ~0.1 X_0 tungsten layer. The "enhanced" tracker instead consists of 25 silicon planes interleaved with ~0.03 X_0 tungsten layers. The "baseline" calorimeter is divided into two sections: a first part made of two planes of cesium iodide and silicon (called the "pre-shower") and a second part made of 28x28x12 cesium iodide cubes. The "enhanced" calorimeter instead consists only of 20x20x20 cesium iodide cubes. To estimate the performance I developed an algorithm that reconstructs the direction of the incident gamma ray. The reconstruction can use the information from the tracker, from the "pre-shower" or from the calorimeter, either combined or individually. The directions obtained from the "pre-shower" or the calorimeter alone, although of lower resolution, can be useful to increase the number of photons detected at high energy and to provide the information needed to follow up transients with ground-based Cherenkov telescopes. The angular resolution obtained with the tracker is better for the "enhanced" configuration. At low energies this is due to the smaller amount of tungsten, and hence less multiple scattering, inside the tracker. The smaller, deeper calorimeter, while hindering the energy reconstruction of high-energy photons, also produces fewer "backsplash" particles, which degrade the track reconstruction. The total effective area of the "baseline", which can rely on a larger calorimeter and on the "pre-shower", is larger than that of the "enhanced" configuration. The angular resolution, the effective area and the observation strategy of the instrument all contribute to the point-source sensitivity. The overall sensitivity of the instrument is better for the "baseline" at energies above 5 GeV. I implemented a preliminary set of trigger conditions for gamma-ray studies based on the tracker information. The need to reject most of the charged particles comes from the large background present in orbit (~10^6 protons per gamma ray) and from the limited downlink capacity (~100 GB/day). Between the two configurations the difference in the number of surviving protons is less than 1%. Although promising, this result must be improved, and possible improvements are described in the thesis.
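
    The better low-energy angular resolution of the "enhanced" tracker is attributed above to the thinner tungsten converters. A standard way to quantify this effect (shown here as background, not taken from the thesis itself) is the Highland/PDG approximation for the RMS multiple-scattering angle of a particle with momentum p, velocity beta*c and charge number z crossing a thickness x of material with radiation length X_0:

        \theta_0 \simeq \frac{13.6\ \mathrm{MeV}}{\beta c p}\, z \sqrt{\frac{x}{X_0}} \left[ 1 + 0.038 \ln\frac{x}{X_0} \right]

    The leading sqrt(x/X_0) dependence means a ~0.03 X_0 layer deflects a given track roughly sqrt(0.1/0.03) ≈ 1.8 times less than a ~0.1 X_0 layer, consistent with the trend described above.
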
    The reconstruction and trigger algorithms are applied to an analysis of the possibility of studying gamma-ray bursts (GRBs) with the main instrumentation on board GAMMA-400. An estimate of the number of events that are not reconstructed, because they occur during the dead time between two triggers, is obtained by simulating a hypothetical GRB coupled with the photon arrival times taken from the real data of two GRBs observed by Fermi. In neither configuration is a significant fraction of pile-up observed. Even when the GRB flux is increased, the fraction of unreconstructed events never exceeds 6%. Despite this result, much will depend on the final design of the detector readout electronics, which could increase the dead time of the instrument.
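
    As a minimal sketch of the kind of dead-time estimate described above (illustrative code, not the thesis software; the dead time, burst duration and photon count are assumed values), one can count the photons that arrive while the instrument is still busy with the previous trigger, using a simple non-paralyzable dead-time model:

        # Illustrative sketch, not the GAMMA-400 analysis code.
        import numpy as np

        def lost_fraction(arrival_times_s, dead_time_s):
            """Fraction of photons falling inside the dead time opened by the previous accepted trigger."""
            busy_until = -np.inf
            lost = 0
            for t in np.sort(arrival_times_s):
                if t < busy_until:
                    lost += 1                      # arrives during dead time: not reconstructed
                else:
                    busy_until = t + dead_time_s   # accepted trigger opens a new dead-time window
            return lost / len(arrival_times_s)

        # Toy GRB: 1000 photons spread over 10 s, with an assumed 0.1 ms dead time per trigger
        rng = np.random.default_rng(0)
        print(lost_fraction(rng.uniform(0.0, 10.0, 1000), 1e-4))

    For small values of rate times dead time the lost fraction is roughly their product, so losses stay at the few-percent level as long as the instrument is busy for only a small fraction of the burst.
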

    CALOCUBE: An approach to high-granularity and homogenous calorimetry for space based detectors

    Future space experiments dedicated to the observation of high-energy gamma and cosmic rays will increasingly rely on a highly performing calorimetry apparatus, and their physics performance will be primarily determined by the geometrical dimensions and the energy resolution of the calorimeter deployed. It is therefore extremely important to optimize the geometrical acceptance, the granularity, and the absorption depth for the measurement of the particle energy with respect to the total mass of the apparatus, which is the most important constraint for a space launch. The proposed design tries to satisfy these criteria while staying within a total mass budget of about 1.6 tons. Calocube is a homogeneous calorimeter instrumented with cesium iodide (CsI) crystals, whose geometry is cubic and isotropic, so as to detect particles arriving from every direction in space, thus maximizing the acceptance; granularity is obtained by filling the cubic volume with small cubic CsI crystals. The total radiation length in any direction is more than adequate for optimal electromagnetic particle identification and energy measurement, whilst the interaction length is at least sufficient to allow a precise reconstruction of hadronic showers. Optimal values for the size of the crystals and the spacing among them have been studied. The design forms the basis of a three-year R&D activity which has been approved and financed by INFN. An overall description of the system will be given, together with results from preliminary tests on particle beams.
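
    As a rough illustration of the depth figures mentioned above (a sketch under assumed numbers, not the CALOCUBE design values), the depth of a stack of CsI cubes in radiation and interaction lengths can be estimated from the crystal size and count; the material constants below are approximate values and the geometry is purely illustrative:

        # Illustrative sketch, not the CALOCUBE optimization code.
        X0_CM = 1.86          # CsI radiation length, approximate value
        LAMBDA_I_CM = 38.0    # CsI nuclear interaction length, approximate value

        def depth_along_axis(n_crystals, crystal_side_cm, gap_cm):
            """Active CsI depth along one axis, counting only the crystals (gaps are passive)."""
            active_cm = n_crystals * crystal_side_cm
            total_cm = active_cm + (n_crystals - 1) * gap_cm
            return active_cm / X0_CM, active_cm / LAMBDA_I_CM, total_cm

        # Assumed example: 20 cubes of 3.6 cm side separated by 0.4 cm gaps
        n_x0, n_lambda, side_cm = depth_along_axis(20, 3.6, 0.4)
        print(f"~{n_x0:.0f} X0, ~{n_lambda:.1f} lambda_I over a {side_cm:.1f} cm side")

    Varying the crystal side and the gap in such a toy model makes the trade-off explicit: within a fixed mass budget, larger crystals buy depth in X_0 and lambda_I at the expense of granularity, which is the optimization the abstract refers to.
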

    Nightside condensation of iron in an ultra-hot giant exoplanet

    Ultra-hot giant exoplanets receive thousands of times Earth's insolation. Their high-temperature atmospheres (>2,000 K) are ideal laboratories for studying extreme planetary climates and chemistry. Daysides are predicted to be cloud-free, dominated by atomic species and substantially hotter than nightsides. Atoms are expected to recombine into molecules over the nightside, resulting in different day-night chemistry. While metallic elements and a large temperature contrast have been observed, no chemical gradient has been measured across the surface of such an exoplanet. Different atmospheric chemistry between the day-to-night ("evening") and night-to-day ("morning") terminators could, however, be revealed as an asymmetric absorption signature during transit. Here, we report the detection of an asymmetric atmospheric signature in the ultra-hot exoplanet WASP-76b. We spectrally and temporally resolve this signature thanks to the combination of high-dispersion spectroscopy with a large photon-collecting area. The absorption signal, attributed to neutral iron, is blueshifted by -11 +/- 0.7 km s^-1 on the trailing limb, which can be explained by a combination of planetary rotation and wind blowing from the hot dayside. In contrast, no signal arises from the nightside close to the morning terminator, showing that atomic iron is not absorbing starlight there. Iron must thus condense during its journey across the nightside. Published in Nature (accepted on 24 January 2020).
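
    How much of the measured shift rotation alone could provide can be checked with a back-of-the-envelope sketch (the radius and period below are approximate literature values for WASP-76b, assumed here and not taken from the abstract):

        # Illustrative estimate, not the analysis performed in the paper.
        import math

        R_JUP_M = 7.1492e7
        R_P_M = 1.83 * R_JUP_M       # assumed planetary radius (~1.83 R_Jup)
        P_ROT_S = 1.81 * 86400.0     # rotation period = orbital period if tidally locked (assumed ~1.81 d)

        v_rot_kms = 2.0 * math.pi * R_P_M / P_ROT_S / 1e3   # equatorial velocity at the limb
        v_obs_kms = 11.0                                     # magnitude of the observed blueshift
        print(f"rotation ~{v_rot_kms:.1f} km/s, leaving ~{v_obs_kms - v_rot_kms:.1f} km/s for a day-to-night wind")

    With these assumed values the limb rotates at roughly 5 km/s, so an additional wind of several km/s blowing from the hot dayside is needed to reach the observed ~11 km/s, which is the combination invoked in the abstract.
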

    Detection of the blazar S4 0954+65 at very-high-energy with the MAGIC telescopes during an exceptionally high optical state

    The very-high-energy (VHE ≳ 100 GeV) gamma-ray MAGIC observations of the blazar S4 0954+65 were triggered by an exceptionally high flux state of emission in the optical. This blazar has a disputed redshift of z = 0.368 or z ≳ 0.45 and an uncertain classification among blazar subclasses. The exceptional source state described here makes for an excellent opportunity to understand physical processes in the jet of S4 0954+65 and thus contribute to its classification. Methods. We investigated the multiwavelength (MWL) light curve and spectral energy distribution (SED) of S4 0954+65 during an enhanced state in February 2015 and have put it in context with possible emission scenarios. We collected photometric data in the radio, optical, X-ray, and gamma-ray bands. We studied both the optical polarization and the inner parsec-scale jet behavior with 43 GHz data. Results. Observations with the MAGIC telescopes led to the first detection of S4 0954+65 at VHE. Simultaneous data with Fermi-LAT at high-energy gamma rays (HE, 100 MeV < E < 100 GeV) also show a period of increased activity. Imaging at 43 GHz reveals the emergence of a new feature in the radio jet in coincidence with the VHE flare. Simultaneous monitoring of the optical polarization angle reveals a rotation of approximately 100°. Conclusions. The high emission state during the flare allows us to compile the simultaneous broadband SED and to characterize it in the scope of blazar jet emission models. The broadband spectrum can be modeled with an emission mechanism commonly invoked for flat-spectrum radio quasars (FSRQs), that is, inverse Compton scattering on an external soft photon field from the dust torus, also known as external Compton. The light-curve and SED phenomenology is consistent with an interpretation of a blob propagating through a helically structured magnetic field and eventually crossing a standing shock in the jet, a scenario typically applied to FSRQs and low-frequency-peaked BL Lac objects (LBLs).
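
    As a very rough, order-of-magnitude illustration of the external-Compton scenario mentioned in the conclusions (a sketch under assumed parameters; none of the numbers below are fit results from this work), the Thomson-regime scattered peak energy can be estimated from the seed photon energy, the electron Lorentz factor at the SED peak, and the jet bulk and Doppler factors:

        # Illustrative order-of-magnitude estimate, not the SED model of the paper.
        E_SEED_EV = 0.3        # assumed dust-torus (infrared) seed photon energy
        GAMMA_E = 2.0e3        # assumed electron Lorentz factor at the SED peak
        GAMMA_BULK = 15.0      # assumed jet bulk Lorentz factor
        DOPPLER = 15.0         # assumed Doppler factor
        Z = 0.368              # one of the two disputed redshifts quoted above

        # Thomson-regime external Compton: E_obs ~ (4/3) * gamma_e^2 * Gamma * delta * E_seed / (1 + z)
        e_obs_mev = (4.0 / 3.0) * GAMMA_E**2 * GAMMA_BULK * DOPPLER * E_SEED_EV / (1.0 + Z) / 1.0e6
        print(f"external-Compton peak ~ {e_obs_mev:.0f} MeV")

    With these assumed numbers the inverse-Compton component peaks at a few hundred MeV, i.e. in the Fermi-LAT band, in line with the FSRQ-like interpretation; the parameters actually used in the paper's SED modeling may of course differ.
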

    ESPRESSO at VLT. On-sky performance and first results

    Context. ESPRESSO is the new high-resolution spectrograph of ESO's Very Large Telescope (VLT). It was designed for ultra-high radial-velocity (RV) precision and extreme spectral fidelity with the aim of performing exoplanet research and fundamental astrophysical experiments with unprecedented precision and accuracy. It is able to observe with any of the four Unit Telescopes (UTs) of the VLT at a spectral resolving power of 140 000 or 190 000 over the 378.2 to 788.7 nm wavelength range; it can also observe with all four UTs together, turning the VLT into a 16 m diameter equivalent telescope in terms of collecting area while still providing a resolving power of 70 000. Aims: We provide a general description of the ESPRESSO instrument, report on its on-sky performance, and present our Guaranteed Time Observation (GTO) program along with its first results. Methods: ESPRESSO was installed at the Paranal Observatory in fall 2017. Commissioning (on-sky testing) was conducted between December 2017 and September 2018. The instrument saw its official start of operations on October 1, 2018, but improvements to the instrument and recommissioning runs were conducted until July 2019. Results: The measured overall optical throughput of ESPRESSO at 550 nm and a seeing of 0.65″ exceeds the 10% mark under nominal astroclimatic conditions. We demonstrate an RV precision better than 25 cm s^-1 during a single night and 50 cm s^-1 over several months. These values are limited by photon noise and stellar jitter, which shows that the performance is compatible with an instrumental precision of 10 cm s^-1. No difference has been measured across the UTs, neither in throughput nor in RV precision. Conclusions: The combination of the large telescope collecting area with the efficiency and the exquisite spectral fidelity of ESPRESSO opens a new parameter space in RV measurements, the study of planetary atmospheres, fundamental constants, stellar characterization, and many other fields. Based on GTOs collected at the European Southern Observatory under ESO programs 1102.C-0744, 1102.C-0958 and 1104.C-0350 by the ESPRESSO Consortium.
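
    The "16 m diameter equivalent" quoted above and the photon-noise scaling of the radial-velocity precision can be illustrated with a short sketch (assumed values only; the 8.2 m UT diameter and the 50 cm/s starting point are illustrative, not measurements from the paper):

        # Illustrative sketch of collecting-area and photon-noise scaling.
        import math

        D_UT_M = 8.2                            # assumed diameter of one VLT Unit Telescope
        N_UT = 4
        d_equiv_m = math.sqrt(N_UT) * D_UT_M    # single mirror with the same collecting area
        print(f"4-UT equivalent diameter ~ {d_equiv_m:.1f} m")

        # Photon-noise-limited RV error scales as 1/sqrt(N_photons), i.e. it halves with 4x the area
        sigma_1ut_cms = 50.0                    # assumed single-UT photon-noise error for a given target
        print(f"{sigma_1ut_cms:.0f} cm/s with one UT -> {sigma_1ut_cms / math.sqrt(N_UT):.0f} cm/s with four UTs")

    Under this scaling, the point of the 4-UT mode is collecting power: it roughly halves the photon-noise contribution on a given target, or reaches fainter targets at fixed precision, while the resolving power drops to 70 000.
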

    Baseline telescope layouts of the Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) will be the next generation of ground-based instrument for very-high-energy gamma-ray astronomy. It will improve the sensitivity of current telescopes by up to an order of magnitude and provide energy coverage from 20 GeV up to 300 TeV. This improvement will be achieved using a total of 19 and 99 telescopes of three different sizes, spread out over 0.4 and 4.5 km^2 at two sites in the Northern and Southern Hemispheres, respectively. After a concerted effort involving three different large-scale Monte Carlo productions performed over the last years, the baseline layouts for both CTA sites that should emerge after several years of construction are presented here.
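
    A small sketch (purely illustrative; the uniform-grid assumption is mine, and the real layouts use graded spacings for the three telescope sizes) turns the quoted telescope counts and footprints into an average inter-telescope distance per site:

        # Illustrative average-spacing estimate from the numbers quoted above.
        import math

        sites = {"North": (19, 0.4e6), "South": (99, 4.5e6)}   # (telescopes, footprint in m^2)
        for name, (n_tel, area_m2) in sites.items():
            cell_m2 = area_m2 / n_tel                          # average area per telescope
            print(f"{name}: ~{math.sqrt(cell_m2):.0f} m average spacing")

    This gives spacings of order 150-200 m, roughly comparable to the ~120 m radius of a Cherenkov light pool on the ground, which is why the footprint grows with the number of telescopes rather than packing them more densely.
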

    The Cherenkov Telescope Array production system for data-processing and Monte Carlo simulation

    The Cherenkov Telescope Array (CTA) is the next-generation instrument in the field of very-high-energy gamma-ray astronomy. It will be composed of two arrays of Imaging Atmospheric Cherenkov Telescopes, located at La Palma (Spain) and Paranal (Chile). The construction of CTA has just started with the installation of the first telescope on site at La Palma, and the first data are expected by the end of 2018. The scientific operations should begin in 2022 for a duration of about 30 years. The overall amount of data produced during these operations is around 27 PB per year. The associated computing power for data processing and Monte Carlo (MC) simulations is of the order of hundreds of millions of CPU HS06 hours per year. In order to cope with these high computing requirements, we have developed a production system prototype based on the DIRAC framework, which we have intensively exploited during the past 6 years to handle massive MC simulations on the grid for the CTA design and prototyping phases. CTA workflows are composed of several inter-dependent steps, which we used to handle separately within our production system. In order to fully automate the execution of whole workflows, we have partially revised the production system by further enhancing its data-driven behavior and by extending the use of meta-data to link together the different steps of a workflow. In this contribution we present the application of the production system to the MC campaigns of the last years, as well as the recent evolution of the production system, intended to obtain a fully data-driven and automated workflow execution for efficient processing of real telescope data.
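
    The meta-data-driven chaining described above can be illustrated with a small, self-contained sketch (hypothetical code: it does not use the DIRAC framework or the actual CTA data model, and all class and field names are invented for illustration):

        # Hypothetical illustration of meta-data-driven workflow chaining (not DIRAC).
        from dataclasses import dataclass

        @dataclass
        class DataProduct:
            path: str
            meta: dict                      # e.g. {"level": "DL0", "campaign": "MC"}

        @dataclass
        class Step:
            name: str
            input_query: dict               # meta-data selecting this step's inputs
            output_meta: dict               # meta-data stamped onto this step's outputs

            def matches(self, product):
                return all(product.meta.get(k) == v for k, v in self.input_query.items())

        def run_workflow(steps, catalog):
            """Each step consumes whatever the catalog offers that matches its query."""
            for step in steps:
                inputs = [p for p in catalog if step.matches(p)]
                for p in inputs:
                    # Placeholder for submitting the real processing job.
                    catalog.append(DataProduct(p.path + "." + step.name, {**p.meta, **step.output_meta}))
                print(f"{step.name}: processed {len(inputs)} product(s)")

        catalog = [DataProduct("run001.sim", {"level": "DL0", "campaign": "MC"})]
        run_workflow([Step("calibrate", {"level": "DL0"}, {"level": "DL1"}),
                      Step("reconstruct", {"level": "DL1"}, {"level": "DL2"})], catalog)

    The design point this sketch tries to capture is the one stated in the abstract: once outputs carry the meta-data that the next step queries for, the steps no longer need to be wired together by hand and new data drive the workflow automatically.
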
