
    Use of Laboratory Data to Model Interstellar Chemistry

    Our laboratory research program focuses on the formation of molecules on dust grain analogues under conditions mimicking interstellar medium environments. Using surface science techniques, over the last ten years we have investigated the formation of molecular hydrogen and other molecules on different types of dust grain analogues. We analyzed the results to extract quantitative information on the processes of molecule formation on, and ejection from, dust grain analogues. The value of these data lies in the fact that they have been employed by theoreticians in models of the chemical evolution of ISM environments.

    Multiwavelength study of the galactic PeVatron candidate LHAASO J2108+5157

    Context. Several new ultrahigh-energy (UHE) γ-ray sources have recently been discovered by the Large High Altitude Air Shower Observatory (LHAASO) collaboration. These represent a step forward in the search for the so-called Galactic PeVatrons, the enigmatic sources of the Galactic cosmic rays up to PeV energies. However, it has been shown that multi-TeV γ-ray emission does not necessarily prove the existence of a hadronic accelerator in the source; indeed, this emission could also be explained as inverse Compton scattering from electrons in a radiation-dominated environment. A clear distinction between the two major emission mechanisms would only be made possible by taking into account multi-wavelength data and the detailed morphology of the source. Aims. We aim to understand the nature of the unidentified source LHAASO J2108+5157, which is one of the few known UHE sources with no very high-energy (VHE) counterpart. Methods. We observed LHAASO J2108+5157 in the X-ray band with XMM-Newton in 2021 for a total of 3.8 hours and at TeV energies with the Large-Sized Telescope prototype (LST-1), yielding 49 hours of good-quality data. In addition, we analyzed 12 years of Fermi-LAT data to better constrain emission of its high-energy (HE) counterpart 4FGL J2108.0+5155. We used the naima and jetset software packages to examine the leptonic and hadronic scenarios of the multi-wavelength emission of the source. Results. We found an excess (3.7σ) in the LST-1 data at energies E > 3 TeV. Further analysis of the whole LST-1 energy range, assuming a point-like source, resulted in a hint (2.2σ) of hard emission, which can be described with a single power law with a photon index of Γ = 1.6 ± 0.2 in the range of 0.3-100 TeV. We did not find any significant extended emission that could be related to a supernova remnant (SNR) or pulsar wind nebula (PWN) in the XMM-Newton data, which puts strong constraints on possible synchrotron emission of relativistic electrons.
We revealed a new potential hard source in the Fermi-LAT data with a significance of 4σ and a photon index of Γ = 1.9 ± 0.2, which is not spatially correlated with LHAASO J2108+5157; including it in the source model nevertheless improved the spectral representation of the HE counterpart 4FGL J2108.0+5155. Conclusions. The LST-1 and LHAASO observations can be explained as inverse Compton-dominated leptonic emission of relativistic electrons with a cutoff energy of 100 (+70/−30) TeV. The low magnetic field in the source imposed by the X-ray upper limits on synchrotron emission is compatible with the hypothesis of a PWN or a TeV halo. Furthermore, the spectral properties of the HE counterpart are consistent with a Geminga-like pulsar, which would be able to power the VHE-UHE emission. Nevertheless, the lack of a pulsar in the neighborhood of the UHE source is a challenge to the PWN/TeV-halo scenario. The UHE γ rays can also be explained as π0 decay-dominated hadronic emission due to the interaction of relativistic protons with one of the two known molecular clouds in the direction of the source. Indeed, the hard spectrum in the LST-1 band is compatible with protons escaping a shock around a middle-aged SNR because of their high low-energy cut-off, but the origin of the HE γ-ray emission remains an open question.
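As an illustrative aside (not the authors' analysis code), the single power-law description reported for the LST-1 band can be sketched numerically. Only the photon index Γ = 1.6 comes from the text; the normalization N0 and reference energy E0 below are arbitrary placeholder assumptions:

```python
import math

def power_law_flux(e_tev, n0=1e-13, e0=1.0, gamma=1.6):
    """Differential photon flux dN/dE = N0 * (E/E0)**(-Gamma).

    Illustrative units: N0 in photons / (cm^2 s TeV), energies in TeV.
    Gamma = 1.6 is the photon index of the hard LST-1 hint; N0 and E0
    are placeholders, not measured quantities.
    """
    return n0 * (e_tev / e0) ** (-gamma)

def integral_flux(e_min, e_max, n0=1e-13, e0=1.0, gamma=1.6):
    """Analytic integral of the power law from e_min to e_max (TeV)."""
    a = 1.0 - gamma
    return n0 * e0 ** gamma * (e_max ** a - e_min ** a) / a

# With such a hard index (Gamma < 2) the *energy* flux is dominated by
# the top of the band, which is why the >3 TeV excess is so informative.
low = integral_flux(0.3, 3.0)     # photon flux in 0.3-3 TeV
high = integral_flux(3.0, 100.0)  # photon flux in 3-100 TeV
```

A softer index (Γ > 2) would instead concentrate the energy flux at the low end of the band, which is the usual situation for VHE sources.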

    Sensitivity of the Cherenkov Telescope Array to TeV photon emission from the Large Magellanic Cloud

    A deep survey of the Large Magellanic Cloud at ∼0.1-100 TeV photon energies with the Cherenkov Telescope Array is planned. We assess the detection prospects based on a model for the emission of the galaxy, comprising the four known TeV emitters, mock populations of sources, and interstellar emission on galactic scales. We also assess the detectability of 30 Doradus and SN 1987A, and the constraints that can be derived on the nature of dark matter. The survey will allow for fine spectral studies of N 157B, N 132D, LMC P3, and 30 Doradus C, and half a dozen other sources should be revealed, mainly pulsar-powered objects. The remnant of SN 1987A could be detected if it produces cosmic-ray nuclei with a flat power-law spectrum at high energies, or with a steeper index of 2.3-2.4 pending a flux increase by a factor of >3-4 over ∼2015-2035. Large-scale interstellar emission remains mostly out of reach of the survey if its >10 GeV spectrum has a soft photon index ∼2.7, but degree-scale 0.1-10 TeV pion-decay emission could be detected if the cosmic-ray spectrum hardens above >100 GeV. The 30 Doradus star-forming region is detectable if the acceleration efficiency is on the order of 1-10 per cent of the mechanical luminosity and diffusion is suppressed by two orders of magnitude within <100 pc. Finally, the survey could probe the canonical velocity-averaged cross-section for self-annihilation of weakly interacting massive particles for cuspy Navarro-Frenk-White profiles.

    Sensitivity of the Cherenkov Telescope Array to a dark matter signal from the Galactic centre

    We provide an updated assessment of the power of the Cherenkov Telescope Array (CTA) to search for thermally produced dark matter at the TeV scale, via the associated gamma-ray signal from pair-annihilating dark matter particles in the region around the Galactic centre. We find that CTA will open a new window of discovery potential, significantly extending the range of robustly testable models given a standard cuspy profile of the dark matter density distribution. Importantly, even for a cored profile, the projected sensitivity of CTA will be sufficient to probe various well-motivated models of thermally produced dark matter at the TeV scale. This is due to CTA's unprecedented sensitivity, angular and energy resolutions, and the planned observational strategy. The survey of the inner Galaxy will cover a much larger region than corresponding previous observational campaigns with imaging atmospheric Cherenkov telescopes. CTA will map with unprecedented precision the large-scale diffuse emission in high-energy gamma rays, constituting a background for dark matter searches for which we adopt state-of-the-art models based on current data. Throughout our analysis, we use up-to-date event reconstruction Monte Carlo tools developed by the CTA consortium, and pay special attention to quantifying the level of instrumental systematic uncertainties, as well as background template systematic errors, required to probe thermally produced dark matter at these energies.
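For orientation only, the flux scale that such searches probe follows from the standard annihilation-flux formula, dΦ/dE = ⟨σv⟩ / (8π m²) · dN/dE · J. A minimal sketch, in which the J-factor and the per-annihilation spectrum value are placeholder assumptions rather than CTA consortium inputs:

```python
import math

# Canonical thermal-relic velocity-averaged cross-section (cm^3 / s).
SIGMA_V_THERMAL = 3e-26

def annihilation_flux(dn_de, j_factor, m_dm_gev, sigma_v=SIGMA_V_THERMAL):
    """Prompt gamma-ray flux dPhi/dE for self-annihilating (Majorana) DM.

    dPhi/dE = sigma_v / (8 * pi * m^2) * dN/dE * J
    dn_de    -- photons per annihilation per GeV at the chosen energy
    j_factor -- line-of-sight integral of rho^2, GeV^2 / cm^5 (assumed)
    m_dm_gev -- dark matter particle mass in GeV
    Returns photons / (cm^2 s GeV).
    """
    return sigma_v / (8.0 * math.pi * m_dm_gev ** 2) * dn_de * j_factor

# Placeholder numbers: a 1 TeV candidate and a J-factor of the order
# sometimes quoted for cuspy Galactic-centre profiles.
flux = annihilation_flux(dn_de=1e-3, j_factor=1e21, m_dm_gev=1000.0)
```

The 1/m² scaling is why TeV-scale candidates yield faint fluxes and demand CTA-class sensitivity, and why the cuspy-versus-cored choice of density profile (which enters through J) matters so much for the projected reach.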

    Quantum Tunneling of Oxygen Atoms on Very Cold Surfaces

    Any evolving system can change state via thermal mechanisms (hopping over a barrier) or via quantum tunneling. Most of the time, efficient classical mechanisms dominate at high temperatures; this is why an increase in temperature can initiate chemistry. We present here an experimental investigation of O-atom diffusion and reactivity on water ice. We explore the 6-25 K temperature range at submonolayer surface coverages. We derive the diffusion temperature law and observe the transition from quantum to classical diffusion. Despite the high mass of O, quantum tunneling is efficient even at 6 K. As a consequence, the solid-state astrochemistry of cold regions should be reconsidered and should include the possibility of forming larger organic molecules than previously expected.
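The competition the abstract describes, a temperature-independent tunneling channel versus thermally activated hopping, can be sketched with a toy rate model. The barrier height, attempt frequency, and tunneling rate below are illustrative placeholders chosen so the crossover falls inside the 6-25 K window, not the measured values:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV / K

def hopping_rate(t_kelvin, barrier_ev=0.027, attempt_hz=1e12):
    """Classical (Arrhenius) rate for hopping over a diffusion barrier."""
    return attempt_hz * math.exp(-barrier_ev / (K_B * t_kelvin))

def diffusion_rate(t_kelvin, tunneling_hz=1e3, **kwargs):
    """Total rate: a temperature-independent tunneling floor plus the
    thermally activated hopping channel."""
    return tunneling_hz + hopping_rate(t_kelvin, **kwargs)

# With these toy numbers tunneling dominates at 6 K (hopping is utterly
# frozen out), while classical hopping takes over near 25 K.
cold = diffusion_rate(6.0)   # ~ the tunneling floor
warm = diffusion_rate(25.0)  # hopping-dominated
```

The qualitative point survives any reasonable parameter choice: the Arrhenius term collapses double-exponentially as T drops, so any nonzero tunneling rate eventually wins, which is what makes O-atom mobility at 6 K possible at all.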

    The role of laboratory experiments in the characterisation of silicon-based cosmic material

    Silicate grains in space have recently attracted wide interest among astrophysicists, owing to the increasing amount and quality of observational data, especially the results obtained by the Infrared Space Observatory. The observations have shown that silicates are ubiquitous in space and that their properties vary with environmental characteristics. Silicates, together with carbon, are the principal components of solid matter in space. From their formation onward, silicate grains cross many environments characterised by different physical and chemical conditions, which can induce changes in their nature. Moreover, the transformations experienced in the interplay between silicate grains and the medium in which they are embedded are part of a series of processes that can in turn change the nature of the space environment itself. Chemical and physical changes of silicate grains during their life therefore play a key role in the chemical evolution of the entire Galaxy. Knowledge of silicate properties under the conditions in which they are found in space is closely tied to laboratory study of the formation and transformation mechanisms they may experience. The application of production and processing methods capable of reproducing actual space conditions, together with the use of analytical techniques to investigate the nature of the material samples, forms the subject of a complex laboratory experimental approach directed at the understanding of cosmic matter. The goal of the present paper is to review the experimental methods applied in various laboratories to the simulation and characterisation of cosmic silicate analogues. The paper also describes laboratory studies of the chemical reactions undergone and induced by silicate grains. The comparison of available laboratory results with observational data shows the essential constraints imposed by astronomical observations and, at the same time, indicates the most puzzling problems that deserve particular attention in the future. The outstanding open problems are reported and discussed. The final purpose of this paper is to provide an overview of the present state of knowledge about silicates in space and to give the reader some indication of future developments in the field.

    Study of the neutron-induced reaction 17O(n,α)14C at astrophysical energies via the Trojan Horse method

    Stellar nucleosynthesis processes are of vital importance for nuclear physics: all the heavy elements are created by neutron capture reactions that take place in stars. To correctly study such reactions, the neutron abundance available in the environment must be known, which means that the so-called "neutron poisons" must also be considered. The present work focuses on the reaction 17O(n,α)14C, which removes neutrons from the stellar environment during the s-process. Even though the study of such reactions is of high interest, it still presents several technological problems regarding both the creation and characterization of the neutron beam and the radioprotection of the facility. Therefore, the Trojan Horse Method, an indirect method, has been chosen to study the 17O(n,α)14C reaction in the energy region of astrophysical interest, from 300 keV in the center-of-mass frame down to zero. In the present work, after briefly recalling the main features of the method and reporting on the state of the art for the reaction cross-section measurements, the latest THM experiment is presented.

    A case for the economics of secure software development

    Over the past 15 years the topic of information security economics has grown to become a large and diverse field, influencing security thinking on issues as diverse as bitcoin markets and cybersecurity insurance. An aspect yet to receive much attention in this respect is that of secure software development, or 'SWSec' - another area that has seen a surge of research since 2000. SWSec provides paradigms, practices and procedures that offer some promise to address current security problems, yet those solutions face financial and technical barriers that necessitate a more thorough approach to planning and execution. Meanwhile, information security economics has developed theory and practice to support a particular world-view; however, it has yet to account for the investments, constructs and benefits of SWSec. As the frequency and severity of computer misuse have increased, both areas have struggled to impart a new mindset for addressing the inherent issues that arise in a diverse, connected and functionality-driven landscape. This paper presents a call for the establishment of an economics of secure software development. We present the primary challenges facing practice, citing relevant literature from both communities to illustrate where commonalities lie - and where further work is needed. Those challenges are decomposed into a research agenda, derived from applying principles from both themes to a lack of models, representation and analysis in practice. A framework emerges that facilitates discussions of security theory and practice.