New Techniques for High-Contrast Imaging with ADI: the ACORNS-ADI SEEDS Data Reduction Pipeline
We describe Algorithms for Calibration, Optimized Registration, and Nulling
the Star in Angular Differential Imaging (ACORNS-ADI), a new, parallelized
software package to reduce high-contrast imaging data, and its application to
data from the SEEDS survey. We implement several new algorithms, including a
method to register saturated images, a trimmed mean for combining an image
sequence that reduces noise by up to ~20%, and a robust and computationally
fast method to compute the sensitivity of a high-contrast observation
everywhere on the field-of-view without introducing artificial sources. We also
include a description of image processing steps to remove electronic artifacts
specific to Hawaii2-RG detectors like the one used for SEEDS, and a detailed
analysis of the Locally Optimized Combination of Images (LOCI) algorithm
commonly used to reduce high-contrast imaging data. ACORNS-ADI is written in
Python. It is efficient and open source, and includes several optional features
that may improve performance on data from other instruments. ACORNS-ADI
requires minimal modification to reduce data from instruments other than
HiCIAO. It is freely available for download at
www.github.com/t-brandt/acorns-adi under a BSD license.
Comment: 15 pages, 9 figures, accepted to ApJ. Replaced with accepted version;
mostly minor changes. Software update.
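The trimmed-mean combination mentioned in the abstract can be illustrated in miniature. This is a generic sketch of pixel-wise trimmed-mean stacking, not the ACORNS-ADI implementation itself; the function name and trim fraction are illustrative choices:

```python
import numpy as np

def trimmed_mean_combine(stack, trim_fraction=0.1):
    """Combine an image stack pixel-wise with a trimmed mean.

    At each pixel, the lowest and highest `trim_fraction` of frames
    are discarded before averaging, suppressing outliers such as
    cosmic rays or transient bad pixels at only a small noise cost
    relative to a plain mean.
    """
    stack = np.asarray(stack, dtype=float)
    n = stack.shape[0]
    k = int(n * trim_fraction)          # frames to drop at each end
    ordered = np.sort(stack, axis=0)    # sort along the stack axis only
    if k > 0:
        ordered = ordered[k:n - k]
    return ordered.mean(axis=0)

# Example: 10 noisy frames of a flat field, one with a cosmic-ray hit
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 1.0, size=(10, 8, 8))
frames[3, 4, 4] += 500.0                # transient artifact in one frame
combined = trimmed_mean_combine(frames, trim_fraction=0.1)
```

A median would also reject the outlier, but the trimmed mean retains more of the averaging gain of the plain mean, which is where the quoted ~20% noise reduction comes from.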
Data Processing Pipeline for Pointing Observations of Lunar-based Ultraviolet Telescope
We describe the data processing pipeline developed to reduce the pointing
observation data of the Lunar-based Ultraviolet Telescope (LUT), part of the
Chang'e-3 mission of the Chinese Lunar Exploration Program. The pointing
observation program of LUT is dedicated to monitoring variable objects in a
near-ultraviolet (245-345 nm) band. LUT works in lunar daytime for sufficient
power supply, so special data processing strategies have been developed for
the pipeline. Its procedures include stray-light removal, astrometry, flat
fielding using a superflat technique, source extraction and cosmic-ray
rejection, aperture and PSF photometry, aperture correction, and catalogue
archiving. The pipeline has been intensively tested and works smoothly
with observational data. The photometric accuracy is typically ~0.02 mag for
LUT 10 mag stars (30 s exposure), with errors arising from background noise,
residuals of the stray-light removal, and flat-fielding errors. The accuracy
degrades to ~0.2 mag for stars of 13.5 mag, the 5σ detection limit of LUT.
Comment: 10 pages, 7 figures, 4 tables. Minor changes and some explanatory
words added. Version accepted for publication in Astrophysics and Space
Science (Ap&SS).
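The superflat idea referenced above can be sketched generically: normalize each dithered science frame by its own median and median-combine the stack so that sources, which land on different pixels, are rejected. This is a minimal illustration, not the LUT pipeline's actual code:

```python
import numpy as np

def build_superflat(frames):
    """Construct a superflat from a set of dithered science frames.

    Each frame is normalized by its own median level, and a pixel-wise
    median across the normalized stack rejects the (dithered) sources,
    leaving the common pixel-to-pixel sensitivity pattern.
    """
    frames = np.asarray(frames, dtype=float)
    normalized = frames / np.median(frames, axis=(1, 2), keepdims=True)
    superflat = np.median(normalized, axis=0)
    return superflat / np.median(superflat)   # renormalize to unity

# Example: frames sharing a 2% pixel-to-pixel sensitivity pattern,
# each taken at a different sky level
rng = np.random.default_rng(1)
pattern = 1.0 + 0.02 * rng.standard_normal((16, 16))
frames = np.stack([pattern * rng.uniform(900, 1100) for _ in range(9)])
flat = build_superflat(frames)
corrected = frames[0] / flat   # flat-fielded frame: pattern divided out
```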
The FHD/εppsilon Epoch of Reionization Power Spectrum Pipeline
Epoch of Reionization data analysis requires unprecedented levels of accuracy
in radio interferometer pipelines. We have developed an imaging power spectrum
analysis to meet these requirements and generate robust 21 cm EoR measurements.
In this work, we build a signal path framework to mathematically describe each
step in the analysis, from data reduction in the FHD package to power spectrum
generation in the εppsilon package. In particular, we focus on the
distinguishing characteristics of FHD/εppsilon: highly accurate
spectral calibration, extensive data verification products, and end-to-end
error propagation. We present our key data analysis products in detail to
facilitate understanding of the prominent systematics in image-based power
spectrum analyses. To verify our analysis, we also highlight a full-pipeline
simulation that demonstrates signal preservation and the absence of signal
loss. This careful treatment ensures that the FHD/εppsilon power spectrum
pipeline can reduce radio interferometric data to produce credible 21 cm EoR
measurements.
Comment: 21 pages, 10 figures, accepted by PAS
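The final step of any power-spectrum pipeline, Fourier transforming the data and squaring the amplitudes, can be shown in one dimension. This is a toy sketch of that step only, not the FHD/εppsilon analysis, and the function name and normalization convention are illustrative:

```python
import numpy as np

def power_spectrum_1d(signal, dx=1.0):
    """Estimate the one-sided power spectrum of a real 1-D signal.

    Fourier transform the data, square the amplitudes, and normalize,
    returning wavenumbers k and power P(k).
    """
    n = len(signal)
    ft = np.fft.rfft(signal)
    power = (np.abs(ft) ** 2) / n
    k = np.fft.rfftfreq(n, d=dx)
    return k, power

# Example: a pure tone concentrates its power at its own frequency
n, dx = 1024, 1.0
x = np.arange(n) * dx
tone = np.sin(2.0 * np.pi * 0.125 * x)   # 0.125 cycles per sample
k, p = power_spectrum_1d(tone, dx)
peak_k = k[np.argmax(p)]
```

The careful error propagation described in the abstract would track how calibration and gridding uncertainties map through exactly this kind of transform into uncertainties on P(k).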
Image-Processing Techniques for the Creation of Presentation-Quality Astronomical Images
The quality of modern astronomical data, the power of modern computers and
the agility of current image-processing software enable the creation of
high-quality images in a purely digital form. Together, these technological
advances have created a new ability to make color astronomical images and, in
many ways, a new philosophy of how to create them. A practical guide is
presented on how to generate astronomical images
from research data with powerful image-processing programs. These programs use
a layering metaphor that allows for an unlimited number of astronomical
datasets to be combined in any desired color scheme, creating an immense
parameter space to be explored using an iterative approach. Several examples of
image creation are presented.
A philosophy is also presented on how to use color and composition to create
images that simultaneously highlight scientific detail and are aesthetically
appealing. This philosophy is necessary because most datasets do not correspond
to the wavelength range of sensitivity of the human eye. The use of visual
grammar, defined as the elements which affect the interpretation of an image,
can maximize the richness and detail in an image while maintaining scientific
accuracy. By properly using visual grammar, one can imply qualities that a
two-dimensional image intrinsically cannot show, such as depth, motion and
energy. In addition, composition can be used to engage viewers and keep them
interested for a longer period of time. The use of these techniques can result
in a striking image that will effectively convey the science within the image,
to scientists and to the public.
Comment: 104 pages, 38 figures, submitted to A
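The layering approach described above, combining several datasets into color channels with a nonlinear stretch so faint structure survives next to bright peaks, can be sketched minimally. The function, the square-root stretch, and the per-channel scaling are illustrative assumptions, not the specific workflow of the paper:

```python
import numpy as np

def layer_rgb(red, green, blue, stretch=np.sqrt):
    """Combine three grayscale datasets into a single RGB image.

    Each layer is clipped at zero, scaled to [0, 1] by its own peak,
    and passed through a nonlinear stretch (square root here) so that
    faint structure remains visible alongside bright features.
    """
    def scale(layer):
        layer = np.clip(np.asarray(layer, dtype=float), 0.0, None)
        peak = layer.max()
        return stretch(layer / peak) if peak > 0 else layer
    return np.dstack([scale(red), scale(green), scale(blue)])

# Example: three synthetic narrowband "exposures" mapped to R, G, B
rng = np.random.default_rng(2)
shape = (32, 32)
r, g, b = (rng.uniform(0.0, 100.0, shape) for _ in range(3))
rgb = layer_rgb(r, g, b)
```

In an image-editing program the same operation corresponds to stacking each dataset as a layer, assigning it a hue, and adjusting its stretch interactively, which is what opens up the large parameter space the abstract describes.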
Automated reduction of submillimetre single-dish heterodyne data from the James Clerk Maxwell Telescope using ORAC-DR
With the advent of modern multi-detector heterodyne instruments, whose
observations can generate thousands of spectra per minute, it is no longer
feasible to reduce these data as individual spectra. We describe the
automated data reduction procedure used to generate baselined data cubes from
heterodyne data obtained at the James Clerk Maxwell Telescope. The system can
automatically detect baseline regions in spectra and automatically determine
regridding parameters, all without user input. Additionally, it can
detect and remove spectra suffering from transient interference effects or
anomalous baselines. The pipeline is written as a set of recipes using the
ORAC-DR pipeline environment with the algorithmic code using Starlink software
packages and infrastructure. The algorithms presented here can be applied to
other heterodyne array instruments and have been applied to data from
historical JCMT heterodyne instrumentation.
Comment: 18 pages, 13 figures, submitted to Monthly Notices of the Royal
Astronomical Society.
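Automatic baseline detection of the kind described above is commonly done by iteratively sigma-clipping line channels out of a low-order polynomial fit. This is a generic sketch of that technique, not the ORAC-DR recipe itself; the function name and clipping parameters are illustrative:

```python
import numpy as np

def subtract_baseline(spectrum, order=1, n_sigma=3.0, iterations=5):
    """Fit and subtract a polynomial baseline from a spectrum.

    Baseline regions are found automatically: a low-order polynomial
    is fit to all channels, channels deviating by more than `n_sigma`
    (i.e. the spectral line) are masked, and the fit is repeated until
    the mask converges.
    """
    spectrum = np.asarray(spectrum, dtype=float)
    x = np.arange(spectrum.size)
    mask = np.ones(spectrum.size, dtype=bool)
    for _ in range(iterations):
        coeffs = np.polyfit(x[mask], spectrum[mask], order)
        resid = spectrum - np.polyval(coeffs, x)
        sigma = resid[mask].std()
        new_mask = np.abs(resid) < n_sigma * sigma
        if new_mask.sum() < order + 2 or np.array_equal(new_mask, mask):
            break
        mask = new_mask
    return spectrum - np.polyval(coeffs, x), mask

# Example: linear baseline plus a Gaussian emission line plus noise
rng = np.random.default_rng(3)
x = np.arange(512)
line = 5.0 * np.exp(-0.5 * ((x - 256) / 8.0) ** 2)
raw = 0.01 * x + 2.0 + line + rng.normal(0.0, 0.1, x.size)
clean, baseline_mask = subtract_baseline(raw, order=1)
```

Running such a step unattended over thousands of spectra per minute is exactly why the pipeline must determine the baseline regions itself rather than ask the observer.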