
    High-speed imaging and wavefront sensing with an infrared avalanche photodiode array

    Infrared avalanche photodiode arrays represent a panacea for many branches of astronomy by enabling extremely low-noise, high-speed and even photon-counting measurements at near-infrared wavelengths. We recently demonstrated the use of an early engineering-grade infrared avalanche photodiode array that achieves a correlated double sampling read noise of 0.73 e- in the lab, and a total noise of 2.52 e- on sky, and supports simultaneous high-speed imaging and tip-tilt wavefront sensing with the Robo-AO visible-light laser adaptive optics system at the Palomar Observatory 1.5-m telescope. We report here on the improved image quality achieved simultaneously at visible and infrared wavelengths by using the array as part of an image stabilization control loop with adaptive-optics-sharpened guide stars. We also discuss a newly enabled survey of nearby late M-dwarf multiplicity as well as future uses of this technology in other adaptive optics and high-contrast imaging applications. Comment: Accepted to the Astrophysical Journal. 8 pages, 3 figures and 1 table.
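Correlated double sampling (CDS) reaches sub-electron read noise by differencing a read taken just after reset from one taken after integration, so the shared reset (kTC) offset cancels. A minimal numpy sketch of the idea; the noise levels below are assumptions chosen only for illustration (the 0.52 e- per-read figure is invented, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_pix = 2000, 1000

# Hypothetical noise levels in electrons, chosen for illustration only.
reset_noise = 10.0   # kTC reset offset, identical in both reads of a pair
read_noise = 0.52    # uncorrelated noise of a single read

reset = rng.normal(0.0, reset_noise, (n_frames, n_pix))
read1 = reset + rng.normal(0.0, read_noise, (n_frames, n_pix))  # post-reset read
read2 = reset + rng.normal(0.0, read_noise, (n_frames, n_pix))  # post-integration read

cds = read2 - read1  # the shared reset offset cancels exactly

print(f"single-read noise: {np.std(read1):.2f} e-")  # dominated by kTC offset
print(f"CDS noise:         {np.std(cds):.2f} e-")    # ~0.52 * sqrt(2) e-
```

Differencing doubles the uncorrelated per-read variance (0.52 e- becomes ~0.74 e-) but removes the much larger correlated reset offset.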

    First cosmic shear results from the Canada-France-Hawaii Telescope Wide Synoptic Legacy Survey

    We present the first measurements of the weak gravitational lensing signal induced by the large-scale mass distribution from data obtained as part of the ongoing Canada-France-Hawaii Telescope Legacy Survey (CFHTLS). The data used in this analysis are from the Wide Synoptic Survey, which aims to image ~170 square degrees in five filters. We have analysed ~22 deg^2 (31 pointings) of i'-band data spread over two of the three survey fields. These data are of excellent quality and the results bode well for the remainder of the survey: we do not detect a significant B-mode, suggesting that residual systematics are negligible at the current level of accuracy. Assuming a Cold Dark Matter model and marginalising over the Hubble parameter h=[0.6,0.8], the source redshift distribution and systematics, we constrain sigma_8, the amplitude of the matter power spectrum. At a fiducial matter density Omega_m=0.3 we find sigma_8=0.85+-0.06. This estimate is in excellent agreement with previous studies. Combining our results with those from the Deep component of the CFHTLS enables us to place a constraint on a constant equation of state for the dark energy, based on cosmic shear data alone. We find that w_0<-0.8 at 68% confidence. Comment: Submitted to ApJ.
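Marginalising over a nuisance parameter such as h amounts to integrating the likelihood over its prior range. A toy grid version of the procedure, assuming a made-up degeneracy A = sigma_8 (h/0.7)^0.2 and a hypothetical amplitude measurement (neither the exponent nor the numbers are taken from the paper):

```python
import numpy as np

# Toy model: the shear amplitude constrains the degenerate combination
# A = sigma_8 * (h / 0.7)**0.2; the exponent and the "measurement" below
# are invented for illustration only.
sigma8 = np.linspace(0.6, 1.1, 501)
h = np.linspace(0.6, 0.8, 201)
S8, H = np.meshgrid(sigma8, h, indexing="ij")

A_obs, A_err = 0.85, 0.06
like = np.exp(-0.5 * ((S8 * (H / 0.7) ** 0.2 - A_obs) / A_err) ** 2)

# Flat prior on h in [0.6, 0.8]: marginalise by summing over the h axis.
post = like.sum(axis=1)
post /= post.sum()

mean = (sigma8 * post).sum()
std = np.sqrt(((sigma8 - mean) ** 2 * post).sum())
print(f"sigma_8 = {mean:.2f} +/- {std:.2f}")
```

The marginalised error is slightly larger than the fixed-h error because the h-dependence shifts the best-fit sigma_8 across the prior range.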

    On the effect of image denoising on galaxy shape measurements

    Weak gravitational lensing is a very sensitive way of measuring cosmological parameters, including dark energy, and of testing current theories of gravitation. In practice, this requires exquisite measurement of the shapes of billions of galaxies over large areas of the sky, as may be obtained with the EUCLID and WFIRST satellites. For a given survey depth, applying image denoising to the data both improves the accuracy of the shape measurements and increases the number density of galaxies with a measurable shape. We perform simple tests of three different denoising techniques, using synthetic data. We propose a new and simple denoising method, based on wavelet decomposition of the data and Wiener filtering of the resulting wavelet coefficients. When applied to the GREAT08 challenge dataset, this technique allows us to improve the quality factor of the measurement (Q; GREAT08 definition) by up to a factor of two. We demonstrate that the typical pixel size of the EUCLID optical channel will allow us to use image denoising. Comment: Accepted for publication in A&A. 8 pages, 5 figures.
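The proposed scheme (wavelet decomposition plus Wiener filtering of the coefficients) can be sketched with a single-level 2D Haar transform in plain numpy: each detail band is scaled by signal_var / (signal_var + noise_var). This is a simplified stand-in for the paper's multi-scale method, run here on a synthetic smooth image rather than real galaxy data:

```python
import numpy as np

def haar2(img):
    """One level of the 2D Haar transform (both image sides must be even)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # column averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # column differences
    ll = (a[0::2] + a[1::2]) / 2.0            # smooth approximation
    lh = (a[0::2] - a[1::2]) / 2.0            # horizontal detail
    hl = (d[0::2] + d[1::2]) / 2.0            # vertical detail
    hh = (d[0::2] - d[1::2]) / 2.0            # diagonal detail
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    a = np.empty((2 * ll.shape[0], ll.shape[1]))
    d = np.empty_like(a)
    a[0::2], a[1::2] = ll + lh, ll - lh
    d[0::2], d[1::2] = hl + hh, hl - hh
    img = np.empty((a.shape[0], 2 * a.shape[1]))
    img[:, 0::2], img[:, 1::2] = a + d, a - d
    return img

def wiener_shrink(band, noise_var):
    """Scale a detail band by signal_var / (signal_var + noise_var)."""
    signal_var = max(band.var() - noise_var, 0.0)
    return band * (signal_var / (signal_var + noise_var))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 3.0, 64)
clean = np.add.outer(np.sin(x), np.cos(x))   # smooth synthetic image
sigma = 0.5
noisy = clean + rng.normal(0.0, sigma, clean.shape)

# Each Haar coefficient here averages 4 pixels (with signs), so pixel noise
# of variance sigma^2 becomes coefficient noise of variance sigma^2 / 4.
ll, lh, hl, hh = haar2(noisy)
nv = sigma**2 / 4.0
den = ihaar2(ll, wiener_shrink(lh, nv), wiener_shrink(hl, nv),
             wiener_shrink(hh, nv))

rmse_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
rmse_den = np.sqrt(np.mean((den - clean) ** 2))
print(f"RMSE noisy: {rmse_noisy:.3f}, denoised: {rmse_den:.3f}")
```

The transform is perfectly invertible, so all of the improvement comes from the Wiener shrinkage of the noise-dominated detail bands; a production pipeline would use several decomposition levels and a smoother wavelet.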

    Pixelation effects in weak lensing

    Weak gravitational lensing can be used to investigate both dark matter and dark energy but requires accurate measurements of the shapes of faint, distant galaxies. Such measurements are hindered by the finite resolution and pixel scale of digital cameras. We investigate the optimum choice of pixel scale for a space-based mission, using the engineering model and survey strategy of the proposed Supernova Acceleration Probe as a baseline. We do this by simulating realistic astronomical images containing a known input shear signal and then attempting to recover the signal using the Rhodes, Refregier, & Groth algorithm. We find that the quality of shear measurement is always improved by smaller pixels. However, in practice, telescopes are usually limited to a finite number of pixels and operational life span, so the total area of a survey increases with pixel size. We therefore fix the survey lifetime and the number of pixels in the focal plane while varying the pixel scale, thereby effectively varying the survey size. In a pure trade-off of image resolution versus survey area, we find that measurements of the matter power spectrum would have minimum statistical error with a pixel scale of 0.09" for a 0.14" FWHM point-spread function (PSF). The pixel scale could be increased to ~0.16" if images dithered by exactly half-pixel offsets were always available. Some of our results do depend on our adopted shape measurement method and should be regarded as an upper limit: future pipelines may require smaller pixels to overcome systematic floors not yet accessible, and, in certain circumstances, measuring the shape of the PSF might be more difficult than measuring those of galaxies. However, the relative trends in our analysis are robust, especially those of the surface density of resolved galaxies. Our approach thus provides a snapshot of the potential of available technology, and a practical counterpart to analytic studies of pixelation, which necessarily assume an idealized shape measurement method.
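The trade-off above can be made concrete: with the pixel count and mission lifetime fixed, survey area grows as the square of the pixel scale, while the surface density of galaxies resolved well enough for shape measurement falls once pixels grow past the PSF. The density model below is a made-up placeholder, not the paper's simulation, so the optimum it returns is illustrative only:

```python
import numpy as np

psf_fwhm = 0.14                        # arcsec FWHM, as quoted in the abstract
scales = np.linspace(0.05, 0.25, 81)   # candidate pixel scales (arcsec)

# Fixed pixel count and lifetime: area scales as (pixel scale)^2,
# normalised here to an arbitrary 0.10" reference design.
area = (scales / 0.10) ** 2

# Toy density of usably-resolved galaxies, falling off past the PSF scale.
n_gal = 1.0 / (1.0 + (scales / psf_fwhm) ** 4)

# Shot-noise-limited power-spectrum error ~ 1 / sqrt(area * density).
err = 1.0 / np.sqrt(area * n_gal)
best = scales[np.argmin(err)]
print(f"toy optimum pixel scale: {best:.2f} arcsec")
```

With this particular placeholder density the optimum lands at the PSF FWHM itself; the paper's simulations, which model the actual shape-measurement pipeline, push the optimum to somewhat finer sampling (0.09").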

    The Network for Calibration and Validation in Earth Observation (NCAVEO) 2006 field campaign

    This paper describes a remote sensing field campaign undertaken by the Network for Calibration and Validation in Earth Observation (NCAVEO) in southern England in June 2006. The aims of the campaign were: (a) to gain experience in the collection and use of field data to validate radiance and reflectance products from airborne and satellite sensors; (b) to share best practice on the validation of leaf-area index (LAI) estimates derived from satellite sensor data; and (c) to assemble a quality-controlled, multi-scale, multi-sensor data set for algorithm development and testing. Data specifically supporting the campaign experiments were acquired by CHRIS/Proba, SPOT and three satellites in the DMC constellation. Three aircraft fitted with hyperspectral sensors, LiDAR and high-performance digital survey cameras were flown over the test area. Several field teams made measurements on the ground, and many data sets were acquired near-simultaneously so as to allow direct inter-comparison. The data may be accessed via the NERC EO Data Centre, and potential uses are many and varied, including research, education and training on the physical basis of remote sensing (e.g. sensor and instrument calibration); image understanding (e.g. up- and down-scaling); and remote sensing applications (e.g. land cover mapping, forest survey, river habitat survey, LAI estimation, policy-related issues).

    Photometric Redshifts with Surface Brightness Priors

    We use galaxy surface brightness as prior information to improve photometric redshift (photo-z) estimation. We apply our template-based photo-z method to imaging data from the ground-based VVDS survey and the space-based GOODS field from HST, and use spectroscopic redshifts to test our photometric redshifts for different galaxy types and redshifts. We find that the surface brightness prior eliminates a large fraction of outliers by lifting the degeneracy between the Lyman and 4000 Angstrom breaks. Bias and scatter are improved by about a factor of 2 with the prior for both the ground and space data. Ongoing and planned surveys from the ground and space will benefit, provided that care is taken in measurements of galaxy sizes and in the application of the prior. We discuss the image quality and signal-to-noise requirements that enable the surface brightness prior to be successfully applied. Comment: 15 pages, 13 figures, matches published version.
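Schematically, the prior multiplies the colour likelihood over the redshift grid, and cosmological (1+z)^4 surface-brightness dimming penalises the spurious high-z Lyman-break solution. A toy sketch with an invented two-peak likelihood and an assumed surface-brightness measurement (none of the numbers come from the paper):

```python
import numpy as np

z = np.linspace(0.0, 4.0, 801)

# Toy colour likelihood with the classic break degeneracy: a low-z 4000 A
# solution and a slightly stronger high-z Lyman-break solution.
like = (np.exp(-0.5 * ((z - 0.25) / 0.05) ** 2)
        + 1.1 * np.exp(-0.5 * ((z - 3.1) / 0.08) ** 2))

# Surface-brightness prior: observed SB dims by 2.5*log10((1+z)^4) mag, so a
# galaxy measured at high surface brightness is unlikely to sit at high z.
# The zero-point, observed value and 1-mag width are illustrative choices.
sb_obs = 22.0                          # mag/arcsec^2, hypothetical measurement
sb_model = 20.5 + 10.0 * np.log10(1 + z)
prior = np.exp(-0.5 * ((sb_obs - sb_model) / 1.0) ** 2)

post = like * prior
print("peak without prior:", z[np.argmax(like)])
print("peak with prior:   ", z[np.argmax(post)])
```

Without the prior the estimator picks the catastrophic high-z solution; with it, the low-z peak wins, which is the outlier-removal mechanism the abstract describes.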

    Snowex 2017 Community Snow Depth Measurements: A Quality-Controlled, Georeferenced Product

    Snow depth was one of the core ground measurements required to validate remotely-sensed data collected during SnowEx Year 1, which occurred in Colorado. The use of a single, common protocol was fundamental to produce a community reference dataset of high quality. Most of the nearly 100 Grand Mesa and Senator Beck Basin SnowEx ground crew participants contributed to this crucial dataset during 6-25 February 2017. Snow depths were measured along ~300 m transects, whose locations were determined according to a random-stratified approach using snowfall and tree-density gradients. Two-person teams used snowmobiles, skis, or snowshoes to travel to staked transect locations and to conduct measurements. Depths were measured with a probe graduated in 1-cm increments every 3 m along the transects. In shallow areas of Grand Mesa, depth measurements were also collected with GPS snow-depth probes (a.k.a. MagnaProbes) at ~1-m intervals. During summer 2017, all reference stake positions were surveyed with <10 cm accuracy to improve overall snow-depth location accuracy. During the campaign, 193 transects were measured over three weeks at Grand Mesa and 40 were collected over two weeks in Senator Beck Basin, representing more than 27,000 depth values. Each day of the campaign, depth measurements were written in waterproof field books and photographed by National Snow and Ice Data Center (NSIDC) participants. The data were later transcribed and prepared for extensive quality assessment and control. Common issues such as protocol errors (e.g., survey in reverse direction), notebook image issues (e.g., halo in the center of the digitized picture), and data-entry errors (sloppy writing and transcription errors) were identified and fixed on a point-by-point basis. In addition, we strove to produce a georeferenced product of high quality, so we calculated and interpolated coordinates for every depth measurement based on surveyed stakes and the number of measurements made per transect.
The product has been submitted to NSIDC in CSV format. To educate data users, we present the study design and processing steps that have improved the quality and usability of this product. We also address measurement and design uncertainties, which differ between open and forested areas.
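The georeferencing step described above can be sketched as linear interpolation between the surveyed stake positions at the ends of each straight transect (a simplifying assumption; the coordinates, names and point count below are illustrative, not from the dataset):

```python
import numpy as np

def georeference(stake_start, stake_end, n_points):
    """Place n_points measurements evenly between the surveyed start and end
    stakes of a straight transect. Coordinates are (easting, northing) in
    metres; linear interpolation is assumed between the two stakes."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    p0 = np.asarray(stake_start, dtype=float)
    p1 = np.asarray(stake_end, dtype=float)
    return p0 + t * (p1 - p0)

# A ~300 m transect probed every 3 m yields 101 depth points.
coords = georeference((750000.0, 4320000.0), (750300.0, 4320000.0), 101)
print(coords[0], coords[50], coords[-1])   # start, midpoint, end stakes
```

Real transects may bend or have uneven spacing, which is why the produced coordinates also depend on the number of measurements actually recorded per transect, as the abstract notes.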

    Noise Estimates for Measurements of Weak Lensing from the Lyman-alpha Forest

    We have proposed a method for measuring weak lensing using the Lyman-alpha forest. Here we estimate the noise expected in weak lensing maps and power spectra for different sets of observational parameters. We find that surveys of the size and quality of those being done today and those planned for the future will be able to measure the lensing power spectrum at a source redshift of z~2.5 with high precision, and even to image the distribution of foreground matter with high fidelity on degree scales. For example, we predict that Lyman-alpha forest lensing measurements from the Dark Energy Spectroscopic Instrument survey should yield the mass fluctuation amplitude with statistical errors of 1.5%. By dividing the redshift range into multiple bins, some tomographic lensing information should be accessible as well. This would allow for cosmological lensing measurements at higher redshift than are accessible with galaxy shear surveys, and correspondingly better constraints on the evolution of dark energy at relatively early times. Comment: 8 pages, 8 figures, submitted to MNRAS.

    Diagnostic and interventional radiology: a strategy to introduce reference dose level taking into account the national practice

    The purpose of this study is to present a strategy to define reference dose levels for fluoroscopic, dose-intensive examinations. This work is part of the project of the Federal Office of Public Health of Switzerland to translate the guidelines of the International Commission on Radiological Protection and the European Union into action. The study will also be used to set reference dose levels on the basis of a national survey. All the fluoroscopic units involved in the survey were equipped with a KAP (kerma-area product) meter. All KAP meters were first calibrated to ensure the comparability of the dose measurements. The doses and the dose rates, together with subjective image quality measurements, were acquired in all the centres. Eight types of examination were chosen by a panel of radiologists, and each of the five centres involved agreed to monitor 20 patients per examination type. A wide variation in dose and image quality at fixed geometry was observed. For example, the skin dose rate for abdominal examinations varied in the range 12-42 mGy min^-1 for comparable image quality. Average KAP values of 67, 178, 106, 102, 473, 205, 307 and 316 Gy cm^2 were recorded for barium meal, abdominal angiography, cerebral angiography, barium enema, hepatic embolisation, biliary drainage, cerebral embolisation and femoral stenting, respectively. The values obtained in this limited study are generally higher than those available in the literature, and strategies to optimise these examinations have to be discussed. Strict control of the denomination of the examination types involved in such a study is mandatory to obtain reliable data. This can only be achieved through close collaboration between physicians, radiographers and medical physicists.
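Reference dose levels are conventionally set at the 75th percentile (third quartile) of the per-patient dose distribution observed in such a survey, so once the KAP values are collected per examination type the computation is direct. The values below are synthetic, for illustration only:

```python
import numpy as np

# Synthetic per-patient KAP values for one examination type (Gy cm^2);
# patient doses are roughly log-normally distributed in practice.
rng = np.random.default_rng(1)
kap = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=100)

# Conventional choice: the reference level is the 75th percentile.
drl = np.percentile(kap, 75)
print(f"proposed reference level: {drl:.0f} Gy cm^2")
```

Units performing above the third quartile are then flagged for optimisation review, which is how a national survey like the one described translates into actionable reference levels.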
