
    Visibility of noise in natural images

    The degree of visibility of any kind of stimulus is largely determined by the background on which it is shown, a property commonly known as masking. Many psychophysical experiments have been carried out to date to understand masking of sinusoids or Gabor targets by similar maskers and by noise, and a variety of masking models have been proposed. However, these stimuli are artificial and simplistic compared to natural scene content, and masking models based on such experiments may not be accurate for more complex cases of masking. We investigate the visibility of noise itself as a target and use natural images as the masker. Our targets are Gaussian white noise and band-pass filtered noise of varying energy. We conducted psychophysical experiments to determine the detection threshold of these noise targets on many different types of image content and present the results here. Potential applications include image watermarking and quality assessment.
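    As a rough illustration of the kind of stimuli described above, band-pass filtered noise can be generated by masking an annulus of spatial frequencies in the Fourier domain. The sketch below assumes frequencies measured in cycles per image; the function names and the RMS-contrast scaling are illustrative choices, not taken from the paper.

```python
import numpy as np

def bandpass_noise(shape, low_cpi, high_cpi, seed=None):
    """Band-pass filtered Gaussian noise: keep only Fourier components
    whose radial frequency (cycles per image) lies in [low_cpi, high_cpi]."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    spectrum = np.fft.fftshift(np.fft.fft2(noise))
    fy = np.fft.fftshift(np.fft.fftfreq(shape[0])) * shape[0]
    fx = np.fft.fftshift(np.fft.fftfreq(shape[1])) * shape[1]
    radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    spectrum[(radius < low_cpi) | (radius > high_cpi)] = 0.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

def add_noise_target(image, noise, energy):
    """Scale the noise target to a given RMS amplitude ("energy") and
    superimpose it on the masker image."""
    return image + noise / noise.std() * energy
```

    A detection-threshold experiment would then vary `energy` per trial and ask observers whether the noise target is visible on a given image patch.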

    You Don't See What I See: Individual Differences in the Perception of Meaning from Visual Stimuli

    Everyone has their own unique version of the visual world, and there has been growing interest in understanding the way that personality shapes one's perception. Here, we investigated meaningful visual experiences in relation to the personality dimension of schizotypy. In a novel approach to this issue, a non-clinical sample of subjects (total n = 197) was presented with calibrated images of scenes, cartoons and faces of varying visibility embedded in noise; the spatial properties of the images were constructed to mimic the natural statistics of the environment. In two experiments, subjects were required to indicate what they saw in a large number of unique images, both with and without actual meaningful structure. The first experiment employed an open-ended response paradigm and used a variety of different images in noise; the second experiment presented only a series of faces embedded in noise and required a forced-choice response from the subjects. The results in all conditions indicated that a high positive schizotypy score was associated with an increased tendency to perceive complex meaning in images composed purely of random visual noise. Individuals high in positive schizotypy seemed to be employing a looser criterion (response bias) to determine what constituted a 'meaningful' image, while also being significantly less sensitive at the task than those low in positive schizotypy. Our results suggest that differences in perceptual performance for individuals high in positive schizotypy are not related to increased suggestibility or susceptibility to instruction, as had previously been suggested. Instead, the observed reductions in sensitivity, along with an increased response bias toward seeing something that is not there, indirectly implicate subtle neurophysiological differences associated with the personality dimension of schizotypy that are theoretically pertinent to the continuum of schizophrenia and hallucination-proneness.
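    The sensitivity and response bias measures discussed above are standard signal detection theory quantities. A minimal sketch of how d' (sensitivity) and the criterion c (bias) could be computed from yes/no detection counts is shown below; the log-linear correction is one common choice for handling extreme rates, not necessarily the one used in this study.

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' and criterion c from yes/no detection counts.
    A log-linear correction (add 0.5 per cell) avoids infinite z-scores
    when hit or false-alarm rates reach 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)        # separation of signal/noise distributions
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # negative c = liberal ("see meaning" often)
    return d_prime, criterion
```

    On this account, observers high in positive schizotypy would show a lower d' together with a more negative c than observers low in positive schizotypy.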

    The application of compressive sampling to radio astronomy I: Deconvolution

    Compressive sampling is a new paradigm for sampling, based on sparseness of signals or signal representations. It is much less restrictive than Nyquist-Shannon sampling theory and thus explains and systematises the widespread experience that methods such as the Högbom CLEAN can violate the Nyquist-Shannon sampling requirements. In this paper, a CS-based deconvolution method for extended sources is introduced. This method can reconstruct both point sources and extended sources, using the isotropic undecimated wavelet transform as a basis for the reconstruction step. We compare this CS-based deconvolution method with two CLEAN-based deconvolution methods: the Högbom CLEAN and the multiscale CLEAN. The new method shows the best performance in deconvolving extended sources for both uniform and natural weighting of the sampled visibilities. Both visual and numerical results of the comparison are provided.
    Comment: Published by A&A; Matlab code can be found at http://code.google.com/p/csra/download
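    The Högbom CLEAN against which the CS method is compared is, at its core, a simple matching-pursuit loop: find the peak of the residual image, record a fraction of it in the model, and subtract the correspondingly scaled and shifted point spread function. A minimal sketch follows, assuming a centred PSF of the same size as the dirty image and using wrap-around shifts for brevity; real implementations handle edges and a clean beam more carefully.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=200, threshold=0.0):
    """Minimal Högbom CLEAN: iteratively subtract a scaled, shifted PSF
    at the location of the current peak of the residual image."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2
    for _ in range(niter):
        py, px = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[py, px]
        if np.abs(peak) <= threshold:
            break
        model[py, px] += gain * peak
        # shift the PSF so its centre lands on the peak, then subtract
        shifted = np.roll(np.roll(psf, py - cy, axis=0), px - cx, axis=1)
        residual -= gain * peak * shifted
    return model, residual
```

    The CS-based method replaces this greedy per-pixel loop with a sparse reconstruction over wavelet coefficients, which is what allows it to recover extended emission rather than only point sources.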

    Morphological analysis of the cm-wave continuum in the dark cloud LDN1622

    The spectral energy distribution of the dark cloud LDN 1622, as measured by Finkbeiner using WMAP data, drops above 30 GHz and is suggestive of a Boltzmann cutoff in grain rotation frequencies, characteristic of spinning dust emission. LDN 1622 is conspicuous in the 31 GHz image we obtained with the Cosmic Background Imager, which is the first cm-wave resolved image of a dark cloud. The 31 GHz emission follows the emission traced by the four IRAS bands. The normalised cross-correlation of the 31 GHz image with the IRAS images is higher by 6.6 sigma for the 12 um and 25 um bands than for the 60 um and 100 um bands: C(12+25) = 0.76 +/- 0.02 and C(60+100) = 0.64 +/- 0.01. The mid-IR to cm-wave correlation in LDN 1622 is evidence for very small grain (VSG) or continuum emission at 26-36 GHz from a hot molecular phase. In dark clouds and their photon-dominated regions (PDRs), the 12 um and 25 um emission is attributed to stochastic heating of the VSGs. The mid-IR and cm-wave dust emissions arise in a limb-brightened shell coincident with the PDR of LDN 1622, where the incident UV radiation from the Ori OB1b association heats and charges the grains, as required for spinning dust.
    Comment: accepted for publication in ApJ; the complete article with uncompressed figures may be downloaded from http://www.das.uchile.cl/~simon/ftp/l1622.pd
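    The normalised cross-correlation statistics C(12+25) and C(60+100) quoted above are, in essence, Pearson correlations between pixel maps. A minimal zero-lag sketch is shown below; the paper's exact estimator (masking, beam matching, error treatment) may differ.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-lag normalised cross-correlation of two same-shaped images:
    standardise each image to zero mean and unit variance, then average
    the pixelwise product (the Pearson correlation coefficient)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))
```

    A value near 1 means the two maps trace the same spatial structure; comparing such coefficients between IRAS bands is what distinguishes the 12/25 um correlation from the 60/100 um one.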

    Advances in Calibration and Imaging Techniques in Radio Interferometry

    This paper summarizes some of the major calibration and image reconstruction techniques used in radio interferometry and describes them in a common mathematical framework. The use of this framework has a number of benefits, including clarification of the fundamentals, access to standard numerical optimization techniques, and generalization or specialization to new algorithms.