
    Introduction to the Lambda-CDM cosmological model

    The Lambda-CDM model is the cosmological model currently used to describe the universe and the one that best fits the available observational data. These data are used to determine which components make up the universe and to what extent each contributes to its evolution; they suggest the existence, alongside ordinary matter and radiation, of the cosmological constant Lambda and of dark matter (CDM, cold dark matter), which give the model its name. Once the characteristic parameters of the components are fixed, the model predicts the evolution of the scale factor, a time-dependent quantity that is essential for computing distances. The aim of this work is to present the main features of the Lambda-CDM model. We therefore give an introduction to general relativity and cosmology, focusing on the principles on which they rest, their characteristic physical quantities, and the equations they employ. The cosmological constant and dark matter are then introduced: we discuss their effect on the universe and the observational evidence that led to their adoption. Finally, the simplest cosmological models are presented in order to understand how the various components of the universe contribute to its evolution and interact with one another; the discussion culminates in the presentation of the Lambda-CDM model.
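The scale-factor evolution referred to above follows from the Friedmann equation. As a minimal sketch (assuming a flat universe with illustrative parameters Ωm = 0.3, ΩΛ = 0.7 and H0 = 70 km/s/Mpc, not values taken from this thesis), one can integrate dt = da / (a H(a)) to recover the age of the universe:

```python
import math

# Density parameters today (illustrative, Planck-like assumptions)
OMEGA_M = 0.3     # matter (baryons + cold dark matter)
OMEGA_L = 0.7     # cosmological constant Lambda (flat universe assumed)
H0_GYR = 0.0716   # H0 = 70 km/s/Mpc expressed in 1/Gyr

def E(a):
    """Dimensionless Hubble rate H(a)/H0 for a flat Lambda-CDM universe."""
    return math.sqrt(OMEGA_M / a**3 + OMEGA_L)

def age_gyr(n=200_000):
    """Age of the universe: t0 = (1/H0) * integral_0^1 da / (a E(a)), midpoint rule."""
    da = 1.0 / n
    total = sum(da / (a * E(a)) for a in (da * (i + 0.5) for i in range(n)))
    return total / H0_GYR

print(round(age_gyr(), 2))  # ≈ 13.5 Gyr for these parameters
```

The same integral, evaluated between 0 and a generic scale factor a, gives the lookback time used when converting redshifts into distances.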

    Agilepy: A Python framework for AGILE data

    The Italian AGILE space mission, with its Gamma-Ray Imaging Detector (GRID) instrument sensitive in the 30 MeV–50 GeV γ-ray energy band, has been operating since 2007. Agilepy is an open-source Python package to analyse AGILE/GRID data. The package is built on top of the command-line version of the AGILE Science Tools, developed by the AGILE Team and publicly released by ASI/SSDC. Its primary purpose is to provide an easy-to-use, high-level interface for analysing AGILE/GRID data, simplifying the configuration of the tasks and ensuring straightforward access to the data. The current features are the generation and display of sky maps and light curves, access to γ-ray source catalogues, spectral-model and position fitting, and wavelet analysis. Agilepy also includes an interface tool providing the time evolution of the AGILE off-axis viewing angle for a chosen sky region. The Flare Advocate team also uses the tool to analyse the data during the daily monitoring of the γ-ray sky. Agilepy (and its dependencies) can be easily installed using Anaconda.

    The Compton Spectrometer and Imager

    The Compton Spectrometer and Imager (COSI) is a NASA Small Explorer (SMEX) satellite mission in development with a planned launch in 2027. COSI is a wide-field gamma-ray telescope designed to survey the entire sky at 0.2–5 MeV. It provides imaging, spectroscopy, and polarimetry of astrophysical sources, and its germanium detectors provide excellent energy resolution for emission line measurements. Science goals for COSI include studies of 0.511 MeV emission from antimatter annihilation in the Galaxy, mapping radioactive elements from nucleosynthesis, determining emission mechanisms and source geometries with polarization measurements, and detecting and localizing multimessenger sources. The instantaneous field of view for the germanium detectors is >25% of the sky, and they are surrounded on the sides and bottom by active shields, providing background rejection as well as allowing for detection of gamma-ray bursts and other gamma-ray flares over most of the sky. In the following, we provide an overview of the COSI mission, including the science, the technical design, and the project status.

    The cosipy library: COSI's high-level analysis software

    The Compton Spectrometer and Imager (COSI) is a selected Small Explorer (SMEX) mission launching in 2027. It consists of a large field-of-view Compton telescope that will probe with increased sensitivity the under-explored MeV gamma-ray sky (0.2–5 MeV). We present the current status of cosipy, a Python library that will perform spectral and polarization fits, image deconvolution, and all high-level analysis tasks required by COSI's broad science goals: uncovering the origin of the Galactic positrons, mapping the sites of Galactic nucleosynthesis, improving our models of the jet and emission mechanism of gamma-ray bursts (GRBs) and active galactic nuclei (AGNs), and detecting and localizing gravitational wave and neutrino sources. The cosipy library builds on the experience gained during the COSI balloon campaigns and will bring the analysis of data in the Compton regime to a modern open-source likelihood-based code, capable of performing coherent joint fits with other instruments using the Multi-Mission Maximum Likelihood framework (3ML). In this contribution, we also discuss our plans to receive feedback from the community by having yearly software releases accompanied by publicly available data challenges.

    Production of ⁴He and anti-⁴He in Pb–Pb collisions at √s_NN = 2.76 TeV at the LHC

    Results on the production of ⁴He and anti-⁴He nuclei in Pb–Pb collisions at √s_NN = 2.76 TeV in the rapidity range |y| < 1, using the ALICE detector, are presented in this paper. The rapidity densities corresponding to 0–10% central events are found to be dN/dy(⁴He) = (0.8 ± 0.4 (stat) ± 0.3 (syst)) × 10⁻⁶ and dN/dy(anti-⁴He) = (1.1 ± 0.4 (stat) ± 0.2 (syst)) × 10⁻⁶, respectively. This is in agreement with the statistical thermal model expectation assuming the same chemical freeze-out temperature (T_chem = 156 MeV) as for light hadrons. The measured ratio of anti-⁴He/⁴He is 1.4 ± 0.8 (stat) ± 0.5 (syst).
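As a quick cross-check of the quoted numbers, the ratio of the two central rapidity-density values reproduces the measured ratio; a naive statistical error propagation (assuming the two yields are uncorrelated, which may differ from the paper's treatment) also comes close to the quoted statistical uncertainty:

```python
import math

# Central values quoted above, in units of 10^-6 (statistical errors only)
dNdy_he4, stat_he4 = 0.8, 0.4    # 4He
dNdy_anti, stat_anti = 1.1, 0.4  # anti-4He

ratio = dNdy_anti / dNdy_he4
# Uncorrelated error propagation for a ratio: relative errors add in quadrature
stat_ratio = ratio * math.sqrt((stat_he4 / dNdy_he4) ** 2 + (stat_anti / dNdy_anti) ** 2)

print(round(ratio, 2))       # 1.38, quoted as 1.4
print(round(stat_ratio, 2))  # 0.85, close to the quoted 0.8 (stat)
```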

    A new implementation of an optimal filter for the detection of galaxy clusters through weak lensing

    We developed a new version of a C++ code, Get the Halo 2021, that implements the optimal linear matched filter presented in Maturi et al. (2005). Our aim is to detect dark matter haloes of galaxy clusters through their weak gravitational lensing signatures by applying the filter to a catalogue of simulated galaxy ellipticities. The dataset represents the typical data that will be available thanks to the Euclid mission, so we can forecast the filter's performance on weak lensing data obtained by Euclid. The linear matched filter is optimised to maximise the signal-to-noise ratio (S/N) of the detections and to minimise the number of spurious detections caused by the superposition of large-scale structures; this is achieved by suppressing the spatial frequencies dominated by large-scale structure contamination. We compared our detections with the true population of dark matter haloes used to produce the catalogue of ellipticities. We confirmed the expectations on the filter performance raised by Maturi et al. (2005) and Pace et al. (2007). We found that S/N = 7 can be considered a reliable threshold for detecting haloes through weak lensing, as 83% of our detections with S/N > 7 were matched to haloes; this is consistent with Pace et al. (2007). The purity of our catalogues of detections increases as a function of S/N and reaches 100% at S/N ≈ 10.5–11. We also confirmed that the filter preferentially selects haloes with redshift between 0.2 and 0.5, i.e. at intermediate distances between observer and background sources, the condition that maximises the lensing effect. The completeness of our catalogues is a steadily growing function of the mass up to 4-5Msun/h, where it reaches values of 58–68%.
    Our algorithm might be used to enhance the reliability of the detections of the AMICO code (Bellagamba et al. 2018), the optimal linear matched filter implemented in the Euclid data analysis pipeline to identify galaxy clusters in photometric data (Euclid Collaboration et al. 2019).
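The filtering step can be illustrated with a one-dimensional toy version of a linear matched filter (all shapes and numbers here are illustrative, not taken from the Get the Halo 2021 code): the filter is built in Fourier space as Ψ(k) ∝ τ(k)/P_N(k), so for white noise it reduces to a cross-correlation with the template.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "shear field": a Gaussian halo signature buried in white noise.
n = 1024
x = np.arange(n)
template = np.exp(-0.5 * ((x - n // 2) / 10.0) ** 2)  # halo profile tau (toy)
true_pos = 632
data = 1.5 * np.roll(template, true_pos - n // 2) + rng.normal(0.0, 1.0, n)

# Linear matched filter in Fourier space: Psi(k) ∝ tau(k) / P_N(k).
# Here P_N(k) is flat (white noise); the Maturi et al. (2005) filter adds
# an LSS term to P_N(k), suppressing the spatial frequencies dominated by
# large-scale structure contamination.
tau_k = np.fft.rfft(np.fft.ifftshift(template))
P_N = np.ones(tau_k.shape)        # white-noise power (assumption)
psi_k = np.conj(tau_k) / P_N

filtered = np.fft.irfft(np.fft.rfft(data) * psi_k, n)
peak = int(np.argmax(filtered))
print(peak)  # recovers the halo position near 632
```

In the real 2-D case the template is the expected tangential-shear profile of a halo and the detection threshold is set on the S/N of the filtered map, as with the S/N = 7 threshold discussed above.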

    XRF analysis searching for fingerprint elemental profile in south-eastern Sicily tomatoes

    The implementation of analytical techniques able to certify food quality and origin in a fast and non-destructive way is becoming a widespread need in the agri-food sector. Among the physical non-destructive techniques, X-ray fluorescence (XRF) spectrometry is often used to analyse the elemental composition of biological samples. In this study, XRF elemental profiles were measured on tomato samples from different geographical areas in Sicily (Italy). The purpose of this investigation was to establish a protocol for in-situ measurement and analysis able to provide quality assessment and traceability of PGI agri-food products, supporting both health safety and a self-qualifying bio-chemical signature. In detail, sampling was performed in one of the most productive tomato-growing areas of south-eastern Sicily (the Pachino district), characterised by relatively higher Organic Carbon content and Cation Exchange Capacity, and compared with samples from other growing areas of Sicily, in the Ragusa province and the Mt. Etna region. Experimental data were analysed in the framework of multivariate analysis using principal component analysis and further validated by discriminant analysis. The results show the presence of specific elemental signatures associated with several characterising elements. This methodology establishes the possibility of disentangling a clear fingerprint pattern associated with the geographical origin of an agri-food product.
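The multivariate step can be sketched as follows (the element list and concentration values are purely illustrative, not the measured Sicilian profiles): principal component analysis via SVD of the standardised samples-by-elements matrix, with the leading component separating two hypothetical growing areas.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy XRF profiles: rows = tomato samples, columns = element concentrations
# (e.g. K, Ca, Fe, Zn, Sr -- hypothetical values, not measured data).
# The two areas differ systematically in the 2nd and 5th element.
area_a = rng.normal([5.0, 1.0, 0.3, 0.1, 0.10], 0.05, size=(20, 5))
area_b = rng.normal([4.5, 1.4, 0.3, 0.1, 0.25], 0.05, size=(20, 5))
X = np.vstack([area_a, area_b])

# PCA via SVD of the column-standardised data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Z @ Vt[0]  # scores on the first principal component

# The between-area difference dominates the variance, so PC1 places the
# two areas on opposite sides of the origin.
print(pc1[:20].mean() * pc1[20:].mean() < 0)  # True
```

A discriminant analysis, as used for validation in the study, would then be trained on these component scores with the growing area as the class label.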

    Detailed Design Of the Science Alert Generation of the ACADA System

    The scope of this document is to provide a detailed design description of the SAG sub-system of the Array Control and Data Acquisition (ACADA) system that will operate the instruments at both the CTA-N and CTA-S installations. The main audience of this document is the development team that is expected to implement the sub-system. Motivations for the main design decisions are provided in connection with requirements and standards. This document describes the SW design for the SAG sub-system in the context of the ACADA system. Throughout the document, UML is the preferred notation for depicting the components to be designed; the modelling tool used is Sparx Enterprise Architect. Throughout this document, the term "component" is used to refer to any element below the level of the system to be designed; a component usually consists of several sub-components. The main components of the SAG sub-system are: 1) SAG-SUP: Supervisors; 2) SAG-RECO: Image Parameter Extractor and Low-Level Reconstruction Pipeline; 3) SAG-DQ: Data Quality; 4) SAG-SCI: High-Level Analysis.