
    HARPO: a TPC as a gamma-ray telescope and polarimeter

    Full text link
    A gas Time Projection Chamber can be used for gamma-ray astronomy with excellent angular precision and sensitivity to faint sources, and for polarimetry, through the measurement of photon conversion to e^+e^- pairs. We present the expected performance in simulations and the recent development of a demonstrator for tests in a polarized photon beam.
    Comment: SPIE Astronomical Telescopes + Instrumentation, Ultraviolet to Gamma Ray, Montréal, Canada, 2014. v2: note added in proof. Copyright 2014 SPIE. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.
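    As background to the polarimetry technique mentioned above (a standard formulation, not given in the abstract; the symbols A, P and phi_0 are added here for illustration): pair-conversion polarimetry relies on the azimuthal distribution of the e^+e^- conversion plane with respect to the photon polarization direction, which for a linear polarization fraction P and an analyzing power A takes the form
        \frac{dN}{d\phi} \propto 1 + A\,P\,\cos\left(2(\phi - \phi_0)\right),
    so the modulation amplitude of the measured azimuthal distribution gives access to P.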

    The Athena X-ray Integral Field Unit: a consolidated design for the system requirement review of the preliminary definition phase

    Full text link
    The Athena X-ray Integral Field Unit (X-IFU) is the high resolution X-ray spectrometer, studied since 2015 for flying in the mid-30s on the Athena space X-ray Observatory, a versatile observatory designed to address the Hot and Energetic Universe science theme, selected in November 2013 by the Survey Science Committee. Based on a large format array of Transition Edge Sensors (TES), it aims to provide spatially resolved X-ray spectroscopy, with a spectral resolution of 2.5 eV (up to 7 keV) over a hexagonal field of view of 5 arc minutes (equivalent diameter). The X-IFU entered its System Requirement Review (SRR) in June 2022, at about the same time as ESA called for an overall X-IFU redesign (including the X-IFU cryostat and the cooling chain), due to an unanticipated cost overrun of Athena. In this paper, after illustrating the breakthrough capabilities of the X-IFU, we describe the instrument as presented at its SRR, browsing through all the subsystems and associated requirements. We then show the instrument budgets, with a particular emphasis on the anticipated budgets of some of its key performance parameters. Finally, we briefly discuss the ongoing key technology demonstration activities, the calibration and the activities foreseen in the X-IFU Instrument Science Center, and touch on communication and outreach activities, the consortium organisation, and the life cycle assessment of X-IFU aiming at minimising the environmental footprint associated with the development of the instrument. Thanks to the studies conducted so far on X-IFU, it is expected that, throughout the design-to-cost exercise requested by ESA, the X-IFU will maintain flagship capabilities in spatially resolved high resolution X-ray spectroscopy, enabling most of the original X-IFU related scientific objectives of the Athena mission to be retained. (abridged)
    Comment: 48 pages, 29 figures. Accepted for publication in Experimental Astronomy with minor editing.

    The Athena X-ray Integral Field Unit: a consolidated design for the system requirement review of the preliminary definition phase

    Get PDF
    The Athena X-ray Integral Field Unit (X-IFU) is the high resolution X-ray spectrometer studied since 2015 for flying in the mid-30s on the Athena space X-ray Observatory. Athena is a versatile observatory designed to address the Hot and Energetic Universe science theme, as selected in November 2013 by the Survey Science Committee. Based on a large format array of Transition Edge Sensors (TES), X-IFU aims to provide spatially resolved X-ray spectroscopy, with a spectral resolution of 2.5 eV (up to 7 keV) over a hexagonal field of view of 5 arc minutes (equivalent diameter). The X-IFU entered its System Requirement Review (SRR) in June 2022, at about the same time as ESA called for an overall X-IFU redesign (including the X-IFU cryostat and the cooling chain), due to an unanticipated cost overrun of Athena. In this paper, after illustrating the breakthrough capabilities of the X-IFU, we describe the instrument as presented at its SRR (i.e. in the course of its preliminary definition phase, so-called B1), browsing through all the subsystems and associated requirements. We then show the instrument budgets, with a particular emphasis on the anticipated budgets of some of its key performance parameters, such as the instrument efficiency, spectral resolution, energy scale knowledge, count rate capability, non X-ray background and target of opportunity efficiency. Finally, we briefly discuss the ongoing key technology demonstration activities, the calibration and the activities foreseen in the X-IFU Instrument Science Center, and touch on communication and outreach activities, the consortium organisation and the life cycle assessment of X-IFU aiming at minimising the environmental footprint associated with the development of the instrument. Thanks to the studies conducted so far on X-IFU, it is expected that, throughout the design-to-cost exercise requested by ESA, the X-IFU will maintain flagship capabilities in spatially resolved high resolution X-ray spectroscopy, enabling most of the original X-IFU related scientific objectives of the Athena mission to be retained. The X-IFU will be provided by an international consortium led by France, The Netherlands and Italy, with ESA member state contributions from Belgium, Czech Republic, Finland, Germany, Poland, Spain and Switzerland, and with additional contributions from the United States and Japan. The French contribution to X-IFU is funded by CNES, CNRS and CEA. This work has also been supported by ASI (Italian Space Agency) through the Contract 2019-27-HH.0, and by the ESA (European Space Agency) Core Technology Program (CTP) Contract No. 4000114932/15/NL/BW and the AREMBES - ESA CTP No. 4000116655/16/NL/BW. This publication is part of grant RTI2018-096686-B-C21 funded by MCIN/AEI/10.13039/501100011033 and by "ERDF A way of making Europe". This publication is also part of grants RTI2018-096686-B-C21 and PID2020-115325GB-C31 funded by MCIN/AEI/10.13039/501100011033.

    Development and UML modelling methodology for real-time data acquisition and processing systems in high energy physics experiments

    Get PDF
    Jury: Reza ANSARI (chair), Jean-Marc GEIB (reviewer), François TERRIER (advisor), Sylvain TISSERANT (reviewer), Bertrand VALLAGE.
    The increasing complexity of the real-time data acquisition and processing systems (TDAQ: the so-called Trigger and Data AcQuisition systems) in high energy physics calls for an appropriate evolution of development tools. This work addresses the interplay between the in-principle specification of TDAQ systems and their actual design and realization on a concrete hardware and software platform. The basis of our work is to define a methodology for the development of TDAQ systems that meets the specific demands of developing such systems. The result is the detailed specification of a "methodological framework" based on the Unified Modeling Language (UML) and designed to manage a development process. The use of this UML-based methodological framework progressively leads to the setting up of a "home-made" framework, i.e. a development tool that comprises reusable components and generic architectural elements adapted to TDAQ systems. The main parts of this dissertation are sections II to IV. Section II is devoted to the characterization and evolution of TDAQ systems. In section III, we review the main technologies relevant to our problem, namely software reuse techniques such as design patterns and frameworks, with particular attention to the real-time and embedded systems domain. Our original conceptual contribution is presented in section IV, where we give a detailed, formalized and example-driven specification of our development model. Our final conclusions are presented in section V, where we present the MORDICUS R&D project, devoted to a concrete realization of our UML methodological framework, and the deep affinities between our work and the emerging "Model Driven Architecture" (MDA) paradigm developed by the Object Management Group.
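    As an illustration of the kind of reusable component and generic architectural element such a framework comprises, here is a minimal, hypothetical C++ sketch (names and design are illustrative assumptions, not taken from the thesis): an abstract processing stage, in the spirit of the Strategy design pattern, chained into a generic event pipeline.

    // Hypothetical sketch of a pattern-based TDAQ framework element (illustrative only).
    #include <iostream>
    #include <memory>
    #include <vector>

    struct Event { std::vector<int> adcSamples; };       // toy stand-in for raw detector data

    // Generic framework element: an abstract processing stage (Strategy pattern).
    class EventProcessor {
    public:
        virtual ~EventProcessor() = default;
        virtual bool process(Event& e) = 0;               // return false to reject the event
    };

    // Reusable component: subtract a fixed pedestal from every ADC sample.
    class PedestalSubtraction : public EventProcessor {
    public:
        bool process(Event& e) override {
            for (int& s : e.adcSamples) s = (s > 100) ? s - 100 : 0;
            return true;
        }
    };

    // Reusable component: keep only events with at least one sample above threshold.
    class ThresholdTrigger : public EventProcessor {
    public:
        bool process(Event& e) override {
            for (int s : e.adcSamples) if (s > 500) return true;
            return false;
        }
    };

    // Generic architectural element: a pipeline that chains stages without knowing their types.
    class Pipeline {
        std::vector<std::unique_ptr<EventProcessor>> stages_;
    public:
        void add(std::unique_ptr<EventProcessor> stage) { stages_.push_back(std::move(stage)); }
        bool run(Event& e) {
            for (auto& stage : stages_) if (!stage->process(e)) return false;
            return true;
        }
    };

    int main() {
        Pipeline dataPath;
        dataPath.add(std::make_unique<PedestalSubtraction>());
        dataPath.add(std::make_unique<ThresholdTrigger>());
        Event e{{120, 90, 700, 110}};
        std::cout << (dataPath.run(e) ? "accepted" : "rejected") << "\n";   // prints "accepted"
    }

    New stages can be added without modifying the pipeline, which is the kind of reuse the methodological framework is meant to encourage.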

    Delegation and hierarchy

    No full text
    Three reasons are given in agency models to justify the choice of a three-level hierarchy as the organization's internal structure: communication costs, double moral hazard and contract incompleteness. In this paper, we suggest another explanation. We show, in a simple model, that a three-level hierarchy is the optimal internal structure when the different tasks of the agency strongly overlap.
    Vafaï Kouroche, Anvar Shebli. Délégation et hiérarchie. In: Revue économique, volume 49, n°5, 1998, pp. 1199-1225.

    Development and UML modelling methodology for real-time data acquisition and processing systems in high energy physics experiments

    No full text
    The increasing complexity of real-time data acquisition and processing systems (TDAQ: the so-called Trigger and Data AcQuisition systems) in high energy physics calls for an appropriate evolution of development tools. This work addresses the interplay between the in-principle specification of TDAQ systems and their actual design and realization on a concrete hardware and software platform. The basis of our work is to define a methodology for the development of TDAQ systems that meets the specific demands of developing such systems. The result is the detailed specification of a "methodological framework" based on the Unified Modeling Language (UML) and designed to manage a development process. The use of this UML-based methodological framework progressively leads to the setting up of a "home-made" framework, i.e. a development tool that comprises reusable components and generic architectural elements adapted to TDAQ systems. The main parts of this dissertation are sections II to IV. Section II is devoted to the characterization and evolution of TDAQ systems. In section III, we review the main technologies relevant to our problem, namely software reuse techniques such as design patterns and frameworks, especially concerning the real-time and embedded systems domain. Our original conceptual contribution is presented in section IV, where we give a detailed, formalized and example-driven specification of our development model. Our final conclusions are presented in section V, where we present the MORDICUS project, devoted to a concrete realization of our UML methodological framework, and the deep affinities between our work and the emerging "Model Driven Architecture" (MDA) paradigm developed by the Object Management Group.

    Challenges in reverse engineering of C++ to UML

    No full text
    Model-driven engineering potentially provides several advantages compared to the direct implementation of a system. An existing code base needs to be imported into the modeling language in a process known as reverse engineering. However, there is an abstraction gap between the programming language (C++) and the modeling language, in our case UML. This implies that the model obtained via reverse engineering directly mirrors the object-oriented implementation structures and does not use higher-level modeling mechanisms such as component-based concepts or state machines. In addition, some concepts of the implementation language cannot be expressed in UML, such as advanced templates. We show some of these aspects based on examples. Therefore, new systems are often either developed from scratch or model-driven approaches are not applied. The latter has become more attractive recently, as IDEs offer powerful refactoring mechanisms and AI-based code completion; model-driven approaches need to catch up with respect to AI support to remain competitive.
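    To make the abstraction gap concrete, here is a small, hypothetical C++ example (not taken from the paper): template template parameters and variadic templates are legal C++, yet standard UML class diagrams have no direct notation for them, so a reverse-engineered model can at best approximate such constructs.

    // Illustrative only: C++ template features with no direct UML counterpart.
    #include <cstddef>
    #include <iostream>
    #include <list>
    #include <tuple>
    #include <vector>

    // Template template parameter: the container type itself is a parameter.
    template <typename T, template <typename...> class Container>
    class Buffer {
        Container<T> data_;
    public:
        void push(const T& v) { data_.push_back(v); }
        std::size_t size() const { return data_.size(); }
    };

    // Variadic template: an open-ended parameter pack.
    template <typename... Ts>
    std::size_t arity(const std::tuple<Ts...>&) { return sizeof...(Ts); }

    int main() {
        Buffer<int, std::vector> v;  v.push(1);           // Buffer backed by std::vector
        Buffer<int, std::list>   l;  l.push(2);           // same Buffer backed by std::list
        std::cout << v.size() + l.size() + arity(std::make_tuple(1, 2.0, 'x')) << "\n";  // prints 5
    }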

    Deep-sea bioluminescence blooms after dense water formation at the ocean surface

    No full text
    The deep ocean is the largest and least known ecosystem on Earth. It hosts numerous pelagic organisms, most of which are able to emit light. Here we present a unique data set consisting of a 2.5-year-long record of light emission by deep-sea pelagic organisms, measured from December 2007 to June 2010 at the ANTARES underwater neutrino telescope in the deep NW Mediterranean Sea, jointly with synchronous hydrological records. This is the longest continuous time series of deep-sea bioluminescence ever recorded. Our record reveals several-week-long, seasonal bioluminescence blooms with light intensity up to two orders of magnitude higher than background values, which correlate with changes in the properties of deep waters. Such changes are triggered by the winter cooling and evaporation experienced by the upper ocean layer in the Gulf of Lion, which leads to the formation and subsequent sinking of dense water through a process known as "open-sea convection". This process episodically renews the deep water of the study area and conveys fresh organic matter that fuels the deep ecosystems. Luminous bacteria are most likely the main contributors to the observed deep-sea bioluminescence blooms. Our observations demonstrate a consistent and rapid connection between deep open-sea convection and bathypelagic biological activity, as expressed by bioluminescence. In a setting where dense water formation events are likely to decline under global warming scenarios that enhance ocean stratification, in situ observatories become essential as environmental sentinels for the monitoring and understanding of deep-sea ecosystem shifts.