    An experimental testbed for NEAT to demonstrate micro-pixel accuracy

    NEAT is an astrometric mission proposed to ESA with the objective of detecting Earth-like exoplanets in the habitable zone of nearby solar-type stars. One fundamental aspect of NEAT is the capability to measure stellar centroids to a precision of 5e-6 pixel. Current state-of-the-art methods for centroid estimation have reached a precision of about 4e-5 pixel at Nyquist sampling. Simulations showed that a precision of 2 micro-pixels can be reached if intra- and inter-pixel quantum efficiency variations are calibrated and corrected for by a metrology system. The European part of the NEAT consortium is designing and building a testbed in vacuum in order to achieve 5e-6 pixel precision for the centroid estimation. The goal is to provide a proof of concept for the precision requirement of the NEAT spacecraft. In this paper we give the basic relations and trade-offs that come into play in the design of a centroid testbed and its metrology system. We detail the different conditions necessary to reach the targeted precision, present the characteristics of our current design, and describe the present status of the demonstration. Comment: SPIE proceedings.
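    The sensitivity of centroiding to pixel response non-uniformity can be illustrated with a short numerical sketch. The snippet below (Python, with purely illustrative numbers: a Gaussian PSF and a 1% rms inter-pixel gain error) is not the NEAT pipeline; it only shows that an uncalibrated gain map biases a simple weighted centroid at the milli-pixel level, far above the 5e-6 pixel goal, which is why a metrology calibration of the pixel response is needed.

```python
# Illustrative sketch (not the NEAT pipeline): how an uncalibrated
# inter-pixel gain error biases a simple weighted-centroid estimate.
import numpy as np

rng = np.random.default_rng(0)

n = 32                      # pixels per side (illustrative)
sigma = 1.0                 # PSF width in pixels, roughly Nyquist sampled
x = np.arange(n) - n / 2 + 0.5
X, Y = np.meshgrid(x, x)
true_x = 0.123              # true centroid offset in pixels (arbitrary)

psf = np.exp(-((X - true_x) ** 2 + Y ** 2) / (2 * sigma ** 2))

# Assume a 1% rms inter-pixel response non-uniformity (illustrative value).
gain = 1.0 + 0.01 * rng.standard_normal((n, n))
measured = psf * gain

def centroid_x(img):
    """Flux-weighted centroid along x."""
    return float((img * X).sum() / img.sum())

bias_uncal = centroid_x(measured) - true_x
bias_cal = centroid_x(measured / gain) - true_x   # perfect gain calibration

print(f"centroid bias, uncalibrated: {bias_uncal:+.2e} pixel")
print(f"centroid bias, calibrated:   {bias_cal:+.2e} pixel")
```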

    A detector interferometric calibration experiment for high precision astrometry

    Context: Exoplanet science has made staggering progress in the last two decades, thanks to the relentless exploration of new detection methods and the refinement of existing ones. Yet astrometry offers a unique and untapped potential for the discovery of habitable-zone low-mass planets around all the solar-like stars of the solar neighborhood. To fulfill this goal, astrometry must be paired with high-precision calibration of the detector. Aims: We present a way to calibrate a detector for high-accuracy astrometry. An experimental testbed combining an astrometric simulator and an interferometric calibration system is used to validate both the hardware needed for the calibration and the signal-processing methods. The objective is an accuracy of 5e-6 pixel on the location of a Nyquist-sampled polychromatic point spread function. Methods: The interferometric calibration system produced modulated Young fringes on the detector. The Young fringes were parametrized as products of time- and space-dependent functions, based on various pixel parameters. The function parameters were minimized iteratively until convergence was obtained, revealing the pixel information needed for the calibration of astrometric measurements. Results: The calibration system yielded the pixel positions to an accuracy estimated at 4e-4 pixel. After including the pixel-position information, an astrometric accuracy of 6e-5 pixel was obtained for a PSF motion over more than five pixels. In the static mode (small jitter motion of less than 1e-3 pixel), a photon-noise-limited precision of 3e-5 pixel was reached.
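    As a rough illustration of the fringe parametrization described above, the sketch below simulates Young fringes modulated in time and recovers per-pixel phase offsets by lock-in demodulation against the known modulation. This is a hedged, simplified stand-in for the iterative minimization used in the paper: the fringe frequency, visibility, noise level and the demodulation scheme itself are assumptions chosen for illustration, not the published method.

```python
# Hedged sketch (not the published pipeline): recovering per-pixel phase
# offsets from temporally modulated Young fringes by lock-in demodulation.
# The fringe is modelled as I_k(t) = A_k * (1 + V * cos(2*pi*f*x_k + phi(t))),
# i.e. a product of a spatial term and a known time-dependent modulation.
import numpy as np

rng = np.random.default_rng(1)

n_pix = 64
f = 0.05                           # fringe spatial frequency, cycles/pixel (assumed)
true_x = np.arange(n_pix) + 0.002 * rng.standard_normal(n_pix)  # pixel centres + offsets
A = 1.0 + 0.05 * rng.standard_normal(n_pix)                     # per-pixel response
V = 0.8                                                          # fringe visibility (assumed)

phi = np.linspace(0.0, 2 * np.pi, 200, endpoint=False)           # metrology phase steps
frames = A[None, :] * (1 + V * np.cos(2 * np.pi * f * true_x[None, :] + phi[:, None]))
frames += 1e-4 * rng.standard_normal(frames.shape)               # photon/readout noise

# Lock-in demodulation against the known modulation phi(t):
C = (frames * np.cos(phi)[:, None]).mean(axis=0)
S = (frames * np.sin(phi)[:, None]).mean(axis=0)
theta = np.arctan2(-S, C)                        # estimated 2*pi*f*x_k (mod 2*pi)

# Compare the unwrapped phase to the truth, up to a global offset.
resid = np.unwrap(theta) - 2 * np.pi * f * true_x
resid -= resid.mean()
print(f"rms pixel-position error: {resid.std() / (2 * np.pi * f):.2e} pixel")
```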

    First experimental results of very high accuracy centroiding measurements for the NEAT astrometric mission

    NEAT is an astrometric mission proposed to ESA with the objective of detecting Earth-like exoplanets in the habitable zone of nearby solar-type stars. NEAT requires the capability to measure stellar centroids to a precision of 5e-6 pixel. Current state-of-the-art methods for centroid estimation have reached a precision of about 2e-5 pixel at twice the Nyquist sampling, as demonstrated at JPL by the VESTA experiment, in which a metrology system was used to calibrate intra- and inter-pixel quantum efficiency variations in order to correct pixelation errors. The European part of the NEAT consortium is building a testbed in vacuum in order to achieve 5e-6 pixel precision for the centroid estimation. The goal is to provide a proof of concept for the precision requirement of the NEAT spacecraft. In this paper we present the metrology and pseudo-stellar source sub-systems, a performance model and an error budget of the experiment, and the present status of the demonstration. Finally, we present our first results: the experiment had first light in July 2013 and a first set of data was taken in air. The analysis of this first data set showed that we can already measure the pixel positions with an accuracy of about 1e-4 pixel. Comment: SPIE conference proceedings.
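    The abstract mentions a performance model and an error budget; a common way to combine independent error terms is in quadrature. The sketch below shows only that bookkeeping, with placeholder contribution names and values chosen for illustration (the actual budget terms of the experiment are in the paper and are not reproduced here).

```python
# Minimal sketch of a quadrature error budget, with placeholder terms and
# values (the real NEAT-demo budget is in the paper, not reproduced here).
import math

# Hypothetical error contributions, in pixels (illustrative numbers only).
budget = {
    "photon noise":            2.0e-6,
    "pixel-position residual": 3.0e-6,
    "intra-pixel QE residual": 2.5e-6,
    "turbulence / thermal":    1.5e-6,
}

total = math.sqrt(sum(v ** 2 for v in budget.values()))
goal = 5.0e-6

for name, v in budget.items():
    print(f"{name:26s} {v:.1e} pixel")
print(f"{'quadrature total':26s} {total:.1e} pixel  (goal {goal:.0e} pixel)")
```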

    Stellar Content from high resolution galactic spectra via Maximum A Posteriori

    This paper describes STECMAP (STEllar Content via Maximum A Posteriori), a flexible, non-parametric inversion method for the interpretation of the integrated-light spectra of galaxies, based on synthetic spectra of single stellar populations (SSPs). We focus on the recovery of a galaxy's star formation history and stellar age-metallicity relation. We use the high-resolution SSPs produced by PEGASE-HR to quantify the informational content of the wavelength range 4000-6800 Angstroms. A detailed investigation of the properties of the corresponding simplified linear problem is performed using singular value decomposition, which turns out to be a powerful tool for explaining and predicting the behaviour of the inversion. We provide means of quantifying the fundamental limitations of the problem, considering the intrinsic properties of the SSPs in the spectral range of interest as well as the noise in these models and in the data. We performed a systematic simulation campaign and found that, when the time elapsed between two bursts of star formation is larger than 0.8 dex, the properties of each episode can be constrained with a precision of 0.04 dex in age and 0.02 dex in metallicity from high-quality data (R = 10 000, signal-to-noise ratio SNR = 100 per pixel), not taking model errors into account. The described methods and error estimates will be useful in the design and analysis of extragalactic spectroscopic surveys. Comment: 31 pages, 23 figures, accepted for publication in MNRAS.
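    The role played by singular value decomposition in this kind of inversion can be sketched on a toy version of the linearized problem: a galaxy spectrum modelled as an SSP basis matrix times a vector of age weights, solved by truncated SVD. The basis below consists of smooth random placeholder columns, not PEGASE-HR spectra, and the solver is plain truncated SVD rather than the regularized MAP inversion of STECMAP; it only illustrates how the small singular values of a highly correlated SSP basis limit what the data can constrain.

```python
# Schematic sketch of the linearized problem behind SSP-based inversion:
# a galaxy spectrum y is modelled as B @ w (SSP basis times age weights),
# and a truncated SVD shows how small singular values limit what the data
# can constrain.  The "SSP spectra" below are smooth placeholders, not
# PEGASE-HR models, and this is not the STECMAP algorithm itself.
import numpy as np

rng = np.random.default_rng(2)

n_lambda, n_ages = 2000, 30
# Placeholder SSP basis: smooth, highly correlated columns (like real SSPs).
ages = np.linspace(0, 1, n_ages)
lam = np.linspace(0, 1, n_lambda)
B = np.exp(-((lam[:, None] - ages[None, :]) ** 2) / 0.05) + 0.1

w_true = rng.random(n_ages)                               # mock star-formation history
y = B @ w_true + 0.01 * rng.standard_normal(n_lambda)     # noisy mock spectrum

U, s, Vt = np.linalg.svd(B, full_matrices=False)
print("condition number of the SSP basis:", s[0] / s[-1])

def tsvd_solve(k):
    """Truncated-SVD solution keeping the k largest singular values."""
    return Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])

for k in (5, 15, n_ages):
    w_hat = tsvd_solve(k)
    err = np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true)
    print(f"k = {k:2d} modes kept -> relative error on weights: {err:.2f}")
```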

    Non Destructive Evaluation of Containment Walls in Nuclear Power Plants

    Two functions of the containment walls are regularly tested in order to anticipate a possible accident: the first is mechanical, to resist a possible internal over-pressure, and the second is to prevent leakage. The reference accident, LLOCA (Large Loss of Coolant Accident), is the rupture of a pipe in the primary circuit of a nuclear plant; in this case, the pressure and temperature can reach 5 bar and 180°C in 20 seconds. The national project ‘Non-destructive testing of the containment structures of nuclear plants’ aims at studying the non-destructive techniques capable of evaluating the concrete properties and their degradation or the progression of cracks. This 4-year project is divided into two parts. The first consists in developing and selecting, in the laboratory, the most relevant NDE (Non-Destructive Evaluation) techniques to reach these goals. These evaluations are developed under conditions representative of the stresses generated during the ten-yearly inspections of the plants or those related to an accident. The second part consists in applying the selected techniques to two containment structures under pressure: the first is proposed by ONERA (the French National Office for Aerospace Studies and Research) and the second is a 1/3-scale mock-up of a containment wall built by EDF (Electricity of France) within the VeRCoRs program. This communication addresses the part of the project that concerns the monitoring of damage and cracking. The tests are three- and four-point bending tests intended to study the initiation of cracks, their propagation, and their opening and closing. The techniques developed, mostly ultrasonic, cover linear and non-linear acoustics: acoustic emission [1], LOCADIFF (locating with diffuse ultrasound) [2], energy diffusion, surface-wave velocity and attenuation, and DAET (Dynamic Acousto-Elasticity Testing) [3]. The data contribute to mapping the parameters of interest, either in the volume, on the surface, or globally. Image correlation is an important additional asset for validating the consistency of the data. The spatial normalization of the data makes it possible to propose algorithms for combining the experimental data. The test results are presented; they show the capabilities and limits of the evaluation of volume, surface and global data. A data-fusion procedure is associated with these results.
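    Among the techniques listed, the diffuse-ultrasound (coda) methods rely on comparing a reference record with a later one. The sketch below computes the windowed decorrelation coefficient between two synthetic coda signals, the basic observable behind LOCADIFF-type monitoring; the signals, sampling rate and window length are illustrative assumptions, and this is not the LOCADIFF inversion itself.

```python
# Hedged sketch: the windowed decorrelation coefficient between a reference
# and a perturbed diffuse-ultrasound (coda) record, the basic observable
# behind LOCADIFF-type monitoring.  Signals here are synthetic; this is not
# the LOCADIFF inversion itself.
import numpy as np

rng = np.random.default_rng(3)

fs = 1e6                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)             # 2 ms coda record
coda = np.exp(-t / 5e-4) * rng.standard_normal(t.size)   # synthetic diffuse field

# "Damage" modelled as a small added scattered contribution after 0.8 ms.
perturbation = 0.05 * np.exp(-t / 5e-4) * rng.standard_normal(t.size)
perturbation[t < 0.8e-3] = 0.0
coda_damaged = coda + perturbation

def decorrelation(u, v, i0, i1):
    """1 - normalized cross-correlation over samples [i0, i1)."""
    uu, vv = u[i0:i1], v[i0:i1]
    cc = np.dot(uu, vv) / np.sqrt(np.dot(uu, uu) * np.dot(vv, vv))
    return 1.0 - cc

win = int(0.2e-3 * fs)                     # 0.2 ms sliding window
for i0 in range(0, t.size - win, win):
    dc = decorrelation(coda, coda_damaged, i0, i0 + win)
    print(f"t = {t[i0] * 1e3:4.1f} ms  decorrelation = {dc:.3f}")
```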

    NectarCAM: a camera for the medium size telescopes of the Cherenkov Telescope Array

    NectarCAM is a camera proposed for the medium-sized telescopes of the Cherenkov Telescope Array (CTA), covering the central energy range of ~100 GeV to ~30 TeV. It has a modular design and is based on the NECTAr chip, at the heart of which are a GHz-sampling switched-capacitor array and a 12-bit analog-to-digital converter. The camera will be equipped with 265 seven-photomultiplier modules, covering a field of view of 8 degrees. Each module includes the photomultiplier bases, high-voltage supply, pre-amplifier, trigger, readout and Ethernet transceiver. The recorded events last between a few nanoseconds and tens of nanoseconds. The camera trigger will be flexible so as to minimize the read-out dead time of the NECTAr chips. NectarCAM is designed to sustain a data rate of more than 4 kHz with less than 5% dead time. The camera concept, the design and tests of the various subcomponents, and results from thermal and electrical prototypes are presented. The design includes the mechanical structure, cooling of the electronics, read-out, clock distribution, slow control, data acquisition, triggering, monitoring and services. Comment: In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589.
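    The 4 kHz / 5% dead-time requirement can be put into perspective with a back-of-the-envelope calculation. Assuming a non-paralyzable dead-time model (a modelling choice, not stated in the abstract), the per-event busy time must stay below roughly 13 microseconds, as the short sketch below shows.

```python
# Back-of-the-envelope check of the dead-time requirement, assuming a
# non-paralyzable dead-time model (a modelling assumption, not from the text):
# fraction_dead = r * tau / (1 + r * tau) for trigger rate r and per-event
# busy time tau.
rate = 4.0e3          # sustained trigger rate, Hz (from the abstract)
max_dead = 0.05       # dead-time budget (from the abstract)

# Maximum per-event busy time compatible with the budget:
tau_max = max_dead / ((1.0 - max_dead) * rate)
print(f"max per-event busy time: {tau_max * 1e6:.1f} us")   # ~13 us

def dead_fraction(rate_hz, tau_s):
    """Dead-time fraction for a non-paralyzable system."""
    return rate_hz * tau_s / (1.0 + rate_hz * tau_s)

for tau_us in (5.0, 10.0, 13.0, 20.0):
    print(f"tau = {tau_us:4.1f} us -> dead time {dead_fraction(rate, tau_us * 1e-6):.1%}")
```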