Highly Variable Extinction and Accretion in the Jet-driving Class I Type Young Star PTF 10nvg (V2492 Cyg, IRAS 20496+4354)
We report extensive new photometry and spectroscopy of the highly variable
young stellar object PTF 10nvg including optical and near-infrared time series
data as well as mid-infrared and millimeter data. Following the previously
reported 2010 rise, during 2011 and 2012 the source underwent additional
episodes of brightening and dimming, including prolonged faint states.
The observed high-amplitude variations are largely consistent with extinction
changes having a 220 day quasi-periodic signal. Spectral evolution includes not
only changes in the spectral slope but also correlated variation in the prominence
of TiO/VO/CO bands and atomic line emission, as well as anticorrelated
variation in forbidden line emission which, along with H_2, dominates optical
and infrared spectra at faint epochs. Neutral and singly-ionized atomic species
are likely formed in an accretion flow and/or impact while the origin of
zero-velocity atomic LiI 6707 in emission is unknown. Forbidden lines,
including several rare species, exhibit blueshifted emission profiles and
likely arise from an outflow/jet. Several of these lines are also seen
spatially offset from the continuum source position, presumably in a shocked
region of an extended jet. CARMA maps resolve on larger scales a spatially
extended outflow in mm-wavelength CO. We attribute the observed photometric and
spectroscopic behavior to occultation of the central star as well as of the
bright inner disk and the accretion/outflow zones, which renders shocked gas
in the inner part of the jet amenable to observation at the faint epochs. We
discuss PTF 10nvg as a source exhibiting both accretion-driven (perhaps
analogous to V1647 Ori) and extinction-driven (perhaps analogous to UX Ori or
GM Cep) high-amplitude variability phenomena.
Comment: accepted to AJ - in press (74 pages
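The 220 day quasi-periodic extinction signal mentioned above is the kind of feature a Lomb-Scargle periodogram recovers from unevenly sampled photometry. A minimal sketch on synthetic data (the light curve, sampling cadence, and amplitudes are invented for illustration, not the PTF 10nvg data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, irregularly sampled light curve with a ~220 d signal.
t = np.sort(rng.uniform(0, 900, 300))                 # observation times (days)
period_true = 220.0
mag = 13.0 + 1.5 * np.sin(2 * np.pi * t / period_true) + rng.normal(0, 0.3, t.size)

def ls_power(t, y, freqs):
    """Minimal Lomb-Scargle power (no floating mean) on uneven sampling."""
    y = y - y.mean()
    power = np.empty(freqs.size)
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

freqs = np.linspace(1 / 900, 1 / 50, 2000)            # cycles per day
power = ls_power(t, mag, freqs)
best_period = 1 / freqs[np.argmax(power)]
print(f"recovered period ~ {best_period:.0f} d")
```

For real survey photometry one would normally use a library implementation (e.g. astropy's `LombScargle`) rather than this bare-bones version.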
Parametric study of EEG sensitivity to phase noise during face processing
Background:
The present paper examines the visual processing speed of complex objects, here faces, by mapping the relationship between object physical properties and single-trial brain responses. Measuring visual processing speed is challenging because uncontrolled physical differences that co-vary with object categories might affect brain measurements, thus biasing speed estimates. Recently, we demonstrated that early event-related potential (ERP) differences between faces and objects are preserved even when images differ only in phase information and amplitude spectra are equated across image categories. Here, we use a parametric design to study how early ERPs to faces are shaped by phase information. Subjects performed a two-alternative forced-choice discrimination between two faces (Experiment 1) or textures (two control experiments). All stimuli had the same amplitude spectrum and were presented at 11 phase noise levels, varying from 0% to 100% in 10% increments, using a linear phase interpolation technique. Single-trial ERP data from each subject were analysed using a multiple linear regression model.
Results:
Our results show that sensitivity to phase noise in faces emerges progressively in a short time window between the P1 and the N170 ERP visual components. The sensitivity to phase noise starts at about 120–130 ms after stimulus onset and continues for another 25–40 ms. This result was robust both within and across subjects. A control experiment using pink noise textures, which had the same second-order statistics as the faces used in Experiment 1, demonstrated that the sensitivity to phase noise observed for faces cannot be explained by the presence of global image structure alone. A second control experiment used wavelet textures that were matched to the face stimuli in terms of second- and higher-order image statistics. Results from this experiment suggest that higher-order statistics of faces are necessary but not sufficient to obtain the sensitivity to phase noise function observed in response to faces.
Conclusion:
Our results constitute the first quantitative assessment of the time course of phase information processing by the human visual brain. We interpret our results in a framework that focuses on image statistics and single-trial analyses.
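A linear phase interpolation of the kind used to build such stimuli can be sketched as follows: blend the image's Fourier phase with the phase of a random image while holding the amplitude spectrum fixed. The image and the exact blending rule here are assumptions for illustration, not the study's stimulus code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a face image: any real 2-D array works for the demo.
img = rng.random((64, 64))

def add_phase_noise(img, noise_level, rng):
    """Linearly blend the image phase with the phase of a random image,
    keeping the amplitude spectrum fixed."""
    F = np.fft.fft2(img)
    amp, phase = np.abs(F), np.angle(F)
    # Random phase field with the conjugate symmetry of a real image:
    rand_phase = np.angle(np.fft.fft2(rng.random(img.shape)))
    blended = (1 - noise_level) * phase + noise_level * rand_phase
    return np.fft.ifft2(amp * np.exp(1j * blended)).real

levels = np.arange(0.0, 1.01, 0.1)          # 0% to 100% in 10% increments
stimuli = [add_phase_noise(img, lv, rng) for lv in levels]

# The amplitude spectrum is (numerically) identical at every noise level:
a_clean = np.abs(np.fft.fft2(stimuli[0]))
a_noise = np.abs(np.fft.fft2(stimuli[-1]))
```

Because both phase fields carry the conjugate symmetry of real images, the blended spectrum stays (approximately) Hermitian and the inverse transform stays real with its amplitude spectrum intact.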
Improving and Assessing Planet Sensitivity of the GPI Exoplanet Survey with a Forward Model Matched Filter
We present a new matched filter algorithm for direct detection of point
sources in the immediate vicinity of bright stars. The stellar Point Spread
Function (PSF) is first subtracted using a Karhunen-Lo\'eve Image Processing
(KLIP) algorithm with Angular and Spectral Differential Imaging (ADI and SDI).
The KLIP-induced distortion of the astrophysical signal is included in the
matched filter template by computing a forward model of the PSF at every
position in the image. To optimize the performance of the algorithm, we conduct
extensive planet injection-and-recovery tests and tune the exoplanet spectral
template and the KLIP reduction aggressiveness to maximize the Signal-to-Noise
Ratio (SNR) of the recovered planets. We show that only two spectral templates
are necessary to recover any young Jovian exoplanet with minimal SNR loss. We
also develop a complete pipeline for the automated detection of point-source
candidates, the calculation of Receiver Operating Characteristics (ROC) and
false-positive-based contrast curves, and completeness contours. We process in a
uniform manner more than 330 datasets from the Gemini Planet Imager Exoplanet
Survey (GPIES) and assess GPI's typical sensitivity as a function of the star
and the hypothetical companion spectral type. This work allows, for the first
time, a comparison of different detection algorithms at a survey scale,
accounting for both planet completeness and false-positive rate. We show that
the new forward model matched filter allows the detection of fainter objects
than a conventional cross-correlation technique with a Gaussian PSF template at
the same false-positive rate.
Comment: ApJ accepted
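The core matched-filter step, stripped of KLIP and the forward model, reduces to cross-correlating the image with a unit-norm PSF template, so that in unit-variance white noise each map value is directly an SNR estimate. A toy sketch with an invented Gaussian template and an injected planet (not the GPIES pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 128
sigma = 1.5                                      # assumed PSF width (pixels)
y, x = np.mgrid[:n, :n]
dx = (x + n // 2) % n - n // 2                   # signed wrap-around offsets
dy = (y + n // 2) % n - n // 2
T0 = np.exp(-(dx**2 + dy**2) / (2 * sigma**2))   # template centered at (0, 0)
T0 /= np.sqrt(np.sum(T0**2))                     # unit norm

image = rng.normal(0.0, 1.0, (n, n))             # white noise, sigma = 1
image += 6.0 * np.roll(np.roll(T0, 90, axis=0), 40, axis=1)  # planet at (90, 40)

# Matched-filter map via FFT cross-correlation with the template:
mf = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(T0))).real
row, col = np.unravel_index(np.argmax(mf), mf.shape)
print(f"peak SNR ~ {mf.max():.1f} at ({row}, {col})")
```

In the paper's setting the template is not a fixed Gaussian but a forward model of the KLIP-distorted PSF computed at every image position, and the noise must be whitened rather than assumed white.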
Motion clouds: model-based stimulus synthesis of natural-like random textures for the study of motion perception
Choosing an appropriate set of stimuli is essential to characterize the
response of a sensory system to a particular functional dimension, such as the
eye movement following the motion of a visual scene. Here, we describe a
framework to generate random texture movies with controlled information
content, i.e., Motion Clouds. These stimuli are defined using a generative
model that is based on controlled experimental parametrization. We show that
Motion Clouds correspond to a dense mixture of localized moving gratings with
random positions. Their global envelope is similar to natural-like stimulation
with an approximate full-field translation corresponding to a retinal slip. We
describe the construction of these stimuli mathematically and propose an
open-source Python-based implementation. Examples of the use of this framework
are shown. We also propose extensions to other modalities such as color vision,
touch, and audition.
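A minimal version of such a generative model can be sketched as a random-phase texture shaped by a Gaussian spectral envelope in (fx, fy, ft). The parameter names and the simplified speed-plane envelope below are assumptions; the published open-source MotionClouds implementation is more complete:

```python
import numpy as np

rng = np.random.default_rng(3)

N, T = 64, 32
fx = np.fft.fftfreq(N)[:, None, None]
fy = np.fft.fftfreq(N)[None, :, None]
ft = np.fft.fftfreq(T)[None, None, :]

f0, B_f = 0.15, 0.05     # mean spatial frequency and bandwidth (cyc/pixel)
V, B_v = 1.0, 0.3        # mean horizontal speed and speed bandwidth (px/frame)

f_r = np.sqrt(fx**2 + fy**2)
env_sf = np.exp(-(f_r - f0) ** 2 / (2 * B_f**2))                 # ring in (fx, fy)
env_speed = np.exp(-(ft + V * fx) ** 2 / (2 * (B_v * f0) ** 2))  # tilted speed plane
envelope = env_sf * env_speed
envelope[0, 0, :] = 0.0                                          # no spatial DC

F = envelope * np.exp(1j * rng.uniform(0, 2 * np.pi, envelope.shape))
movie = np.fft.ifftn(F).real
movie /= np.abs(movie).max()         # normalize contrast range
```

The speed-plane envelope concentrates energy on ft + V*fx = 0, which is the Fourier signature of a full-field translation at speed V, i.e., the retinal-slip stimulation described above.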
Laser diagnostics and minor species detection in combustion using resonant four-wave mixing
Peer reviewed. Postprint.
Visual sensitivity, blur and the sources of variability in the amplitude spectra of natural scenes
A number of researchers have suggested that in order to understand the response properties of cells in the visual pathway, we must consider the statistical structure of the natural environment. In this paper, we focus on one aspect of that structure, namely, the correlational structure described by the amplitude or power spectra of natural scenes. We propose that the principal insight one gains from considering the image spectra is in understanding the relative sensitivity of cells tuned to different spatial frequencies. This study employs a model in which peak sensitivity is constant as a function of frequency while bandwidth increases linearly with frequency (i.e., is approximately constant in octaves). In such a model, the “response magnitude” (i.e., vector length) of cells increases as a function of their optimal (or central) spatial frequency out to about 20 cyc/deg. The result is a code in which the response to natural scenes, whose amplitude spectra typically fall as 1/f, is roughly constant out to 20 cyc/deg. An important consideration in evaluating this model of sensitivity is the fact that natural scenes show considerable variability in their amplitude spectra, with individual scenes showing falloffs that are often steeper or shallower than 1/f. Using a new measure of image structure (the “rectified contrast spectrum” or “RCS”) on a set of calibrated natural images, it is shown that a large part of the variability in the spectra is due to differences in the sparseness of local structure at different scales. That is, an image which is “in focus” will have structure (e.g., edges) of roughly the same magnitude across scale, though the area covered by that structure may vary; the loss of high-frequency energy in some images is due to a reduction in the number of regions that contain structure rather than in the amplitude of that structure.
The slope of the RCS was found to provide a reasonable prediction of physical blur across a variety of scenes in spite of the variability in their amplitude spectra. It was also found to produce a good prediction of perceived blur as judged by human subjects.
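The amplitude-spectrum falloff discussed above can be estimated by rotationally averaging |FFT| in log-spaced frequency bins and fitting a line in log-log coordinates. This sketch synthesizes a 1/f random-phase image so the recovered slope should sit near -1; the RCS measure itself is not implemented here:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthesize a 1/f random-phase image.
n = 256
fx = np.fft.fftfreq(n)[:, None]
fy = np.fft.fftfreq(n)[None, :]
f = np.sqrt(fx**2 + fy**2)
f[0, 0] = 1.0                                 # avoid divide-by-zero at DC
spectrum = (1.0 / f) * np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
img = np.fft.ifft2(spectrum).real

# Rotationally average the amplitude spectrum in log-spaced bins.
amp = np.abs(np.fft.fft2(img)).ravel()
freq = f.ravel()
bins = np.logspace(np.log10(2.0 / n), np.log10(0.5), 20)
centers, means = [], []
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (freq >= lo) & (freq < hi)
    if sel.any():
        centers.append(np.sqrt(lo * hi))      # geometric bin centre
        means.append(amp[sel].mean())
slope = np.polyfit(np.log10(centers), np.log10(means), 1)[0]
print(f"fitted amplitude-spectrum slope: {slope:.2f}")
```

On calibrated natural images the same fit yields the scene-to-scene variability in falloff (steeper or shallower than 1/f) that the paper analyses.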
On the effectiveness of noise masks: Naturalistic vs. un-naturalistic image statistics
It has been argued that the human visual system is optimized for identification of broadband objects embedded in stimuli whose orientation-averaged power spectra fall off according to the 1/f^β relationship typically observed in natural scene imagery (i.e., β=2.0 on logarithmic axes). Here, we were interested in whether individual spatial channels leading to recognition are functionally optimized for narrowband targets when masked by noise possessing naturalistic image statistics (β=2.0). The current study therefore explores the impact of variable-β noise masks on the identification of narrowband target stimuli ranging in spatial complexity, while simultaneously controlling for physical or perceived differences between the masks. The results show that β=2.0 noise masks produce the largest identification thresholds regardless of target complexity, and thus do not seem to yield functionally optimized channel processing. The differential masking effects are discussed in the context of contrast gain control.
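Variable-β noise masks of the kind described can be generated by imposing a 1/f^β power spectrum on random phases and normalizing RMS contrast, so that masks differ only in spectral slope. A hedged sketch (the function name and all numbers are illustrative, not the study's stimulus code):

```python
import numpy as np

rng = np.random.default_rng(5)

def noise_mask(n, beta, rms, rng):
    """Zero-mean random-phase noise whose orientation-averaged power
    spectrum falls as 1/f**beta, scaled to a fixed RMS contrast."""
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f = np.sqrt(fx**2 + fy**2)
    f[0, 0] = 1.0                      # placeholder; DC removed below
    amp = f ** (-beta / 2.0)           # power ~ 1/f**beta
    amp[0, 0] = 0.0                    # no DC -> zero-mean mask
    phase = rng.uniform(0, 2 * np.pi, (n, n))
    mask = np.fft.ifft2(amp * np.exp(1j * phase)).real
    return mask * (rms / mask.std())

# One mask per slope, all with identical RMS contrast:
masks = {beta: noise_mask(256, beta, 0.2, rng) for beta in (0.0, 1.0, 2.0, 3.0)}
```

Fixing the RMS contrast across β is one way to control for physical differences between masks; the study additionally controls for perceived differences.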
A Data Cube Extraction Pipeline for a Coronagraphic Integral Field Spectrograph
Project 1640 is a high contrast near-infrared instrument probing the
vicinities of nearby stars through the unique combination of an integral field
spectrograph with a Lyot coronagraph and a high-order adaptive optics system.
The extraordinary data reduction demands, similar to those that several new
exoplanet imaging instruments will face in the near future, have been met by
the novel software algorithms described herein. The Project 1640 Data Cube
Extraction Pipeline (PCXP) automates the translation of 3.8*10^4 closely
packed, coarsely sampled spectra to a data cube. We implement a robust
empirical model of the spectrograph focal plane geometry to register the
detector image at sub-pixel precision, and map the cube extraction. We
demonstrate our ability to accurately retrieve source spectra based on an
observation of Saturn's moon Titan.
Comment: 35 pages, 15 figures; accepted for publication in PAS
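The cube-extraction idea, reduced to its simplest form: a geometry model maps each lenslet and wavelength step to a detector position, and extraction reads flux back along those traces. The linear trace model and all numbers below are invented for illustration; the real PCXP fits an empirical, sub-pixel geometry solution to the closely packed micro-spectra:

```python
import numpy as np

n_lens, n_wave = 16, 10
detector = np.zeros((400, 400))

def trace(i, j, k):
    """Detector (row, col) of lenslet (i, j) at wavelength step k."""
    row0, col0 = 20 + 22 * i, 20 + 22 * j     # lenslet grid pitch
    return row0 + k, col0 + k // 3            # slanted micro-spectrum

# Simulate a detector frame: every lenslet carries the same known spectrum.
true_spec = np.linspace(1.0, 2.0, n_wave)
for i in range(n_lens):
    for j in range(n_lens):
        for k in range(n_wave):
            r, c = trace(i, j, k)
            detector[r, c] += true_spec[k]

# Extract the (wavelength, y, x) data cube with the same geometry model.
cube = np.zeros((n_wave, n_lens, n_lens))
for i in range(n_lens):
    for j in range(n_lens):
        for k in range(n_wave):
            r, c = trace(i, j, k)
            cube[k, i, j] = detector[r, c]
```

In a real spectrograph the traces overlap and are coarsely sampled, so single-pixel reads are replaced by weighted extraction windows registered at sub-pixel precision.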