41 research outputs found
Serial dependence in timing perception
Recent sensory history affects subsequent experience. Behavioral results have demonstrated this effect in two forms: repeated exposure to the same sensory input produces negative aftereffects, wherein sensory stimuli like those previously experienced are judged as less like the exposed stimulation, while singular exposures can produce positive aftereffects, wherein judgments are more like previously experienced stimulation. For timing perception, there is controversy regarding the influence of recent exposure—both singular and repeated exposure produce apparently negative aftereffects, often referred to as rapid temporal recalibration and temporal recalibration, respectively. While negative aftereffects have been found following repeated exposure for all timing tasks, following a single exposure they have only been demonstrated using synchrony judgments (SJs). Here, we examine the influence of a single presentation—serial dependence for timing—for standard timing tasks: SJ, temporal order judgments, and magnitude estimation judgments. We found that serial dependence produced apparently negative aftereffects in SJ, but positive aftereffects in temporal order judgment and magnitude estimation judgment. We propose that these findings, and those following repeated exposure, can be reconciled within a framework wherein negative aftereffects occur at sensory layers, consistent with classical depictions of sensory adaptation, while Bayesian-like positive aftereffects operate across different, higher, decision levels. These findings are consistent with the aftereffects known from other perceptual dimensions and provide a general framework for interpreting positive (serial dependence) and negative (sensory adaptation) aftereffects across different tasks.
Sensory adaptation for timing perception
Recent sensory experience modifies subjective timing perception. For example, when visual events repeatedly lead auditory events, such as when the sound and video tracks of a movie are out of sync, subsequent vision-leads-audio presentations are reported as more simultaneous. This phenomenon could provide insights into the fundamental problem of how timing is represented in the brain, but the underlying mechanisms are poorly understood. Here, we show that the effect of recent experience on timing perception is not just subjective; recent sensory experience also modifies relative timing discrimination. This result indicates that recent sensory history alters the encoding of relative timing in sensory areas, excluding explanations of the subjective phenomenon based only on decision-level changes. The pattern of changes in timing discrimination suggests the existence of two sensory components, similar to those previously reported for visual spatial attributes: a lateral shift in the nonlinear transducer that maps relative timing into perceptual relative timing and an increase in transducer slope around the exposed timing. The existence of these components would suggest that previous explanations of how recent experience may change the sensory encoding of timing, such as changes in sensory latencies or simple implementations of neural population codes, cannot account for the effect of sensory adaptation on timing perception.
The illusion of uniformity does not depend on the primary visual cortex: evidence from sensory adaptation
Visual experience appears richly detailed despite the poor resolution of the majority of the visual field, thanks to foveal-peripheral integration. The recently described Uniformity Illusion (UI), wherein peripheral elements of a pattern take on the appearance of foveal elements, may shed light on this integration. We examined the basis of UI by generating adaptation to a pattern of Gabors suitable for producing UI for orientation. After removing the pattern, participants reported the tilt of a single peripheral Gabor. The tilt after-effect followed the physical adapting orientation rather than the global orientation perceived under UI, even when the illusion had been reported for a long time. Conversely, a control experiment replacing illusory uniformity with a physically uniform Gabor pattern for the same durations did produce an after-effect following the global orientation. Results indicate that UI is not associated with changes in sensory encoding at V1, but likely depends on higher-level processes.
Serious problems with interpreting rubber hand “Illusion” experiments
The rubber hand “illusion” (RHI), in which participants report experiences of ownership over a fake hand, appears to demonstrate that subjective ownership over one's body can be easily disrupted. It was recently shown that existing methods of controlling for suggestion effects in RHI responding are invalid. It was also shown that propensity to agree with RHI ownership statements is correlated with trait phenomenological control (response to imaginative suggestion). There is currently disagreement regarding the extent to which this relationship may confound interpretation of RHI measures. Here we present the results of simulated experiments to demonstrate that a relationship between trait phenomenological control and RHI responding of the size reported would fundamentally change the way existing RHI results must be interpreted. Using real participant data, each simulated experiment used a sample biased in selection for trait phenomenological control. We find that using experiment samples comprised only of participants higher in trait phenomenological control almost guarantees that an experiment provides evidence consistent with RHI. By contrast, samples comprised only of participants lower in trait phenomenological control find evidence for RHI only around half the time - and of greater concern, evidence specifically for “ownership” experience just 4% of the time. These findings clearly contradict claims that the magnitude of the relationship between phenomenological control and RHI responding is a minor concern, demonstrating that the presence of participants higher in trait phenomenological control in a given RHI experiment sample is critical for finding evidence consistent with RHI. Further study and theorising regarding RHI (and related effects) must take into account the role that trait phenomenological control plays in participant experience and responses during RHI experiments.
Serial dependence in the perception of visual variance
The recent history of perceptual experience has been shown to influence subsequent perception. Classically, this dependence on perceptual history has been examined in sensory adaptation paradigms, wherein prolonged exposure to a particular stimulus (e.g. a vertically oriented grating) produces changes in perception of subsequently presented stimuli (e.g. the tilt aftereffect). More recently, several studies have investigated the influence of shorter perceptual exposure, with effects, referred to as serial dependence, being described for a variety of low- and high-level perceptual dimensions. In this study, we examined serial dependence in the processing of dispersion statistics, namely variance - a key descriptor of the environment and indicative of the precision and reliability of ensemble representations. We found two opposite serial dependencies operating at different timescales, and likely originating at different processing levels: a positive, Bayesian-like bias was driven by the most recent exposures, dependent on feature-specific decision-making and appearing only when high confidence was placed in that decision; and a longer-lasting negative bias - akin to an adaptation after-effect - became manifest as the positive bias declined. Both effects were independent of spatial presentation location and the similarity of other close traits, such as the mean direction of the visual variance stimulus. These findings suggest that visual variance processing occurs in high-level areas, but is also subject to a combination of multi-level mechanisms balancing perceptual stability and sensitivity, as with many other perceptual dimensions.
Thermal-tactile integration in object temperature perception
The brain consistently faces a challenge of whether and how to combine the available information sources to estimate the properties of an object explored by hand. While object perception is an inference process involving multisensory inputs, thermal referral (TR) is an illusion demonstrating how interaction between thermal and tactile systems can lead to deviations from physical reality: when observers touch three stimulators simultaneously with the middle three fingers of one hand but only the outer two stimulators are heated (or cooled), thermal uniformity is perceived across all three fingers. Here we used TR of warmth to examine the thermal-tactile interaction in object temperature perception. We show that TR is consistent with precision-weighted averaging of thermal sensation across tactile locations. Further, we show that prolonged contact with TR stimulation results in adaptation to the local variations of veridical temperatures instead of the thermal uniformity perceived across the three fingers. Our results illuminate the flexibility of processing that underlies thermal-tactile interactions and serve as a basis for thermal display design.
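The precision-weighted averaging account described above can be sketched numerically; this is a minimal illustration, assuming independent Gaussian thermal estimates at each finger (the temperatures and variances below are hypothetical, not values from the study):

```python
def precision_weighted_average(estimates):
    """Combine (value, variance) pairs by inverse-variance weighting:
    each location's thermal estimate contributes in proportion to its
    precision (1/variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    variance = 1.0 / total  # combined precision is the sum of precisions
    return value, variance

# Three fingers: outer two warmed (36 deg C), middle one neutral (30 deg C).
# With equal precision per finger, the model predicts a single uniform
# percept near the mean temperature rather than three distinct ones.
temp, var = precision_weighted_average([(36.0, 1.0), (30.0, 1.0), (36.0, 1.0)])
```

Unequal variances shift the percept toward the more reliable locations, which is what makes the account testable against simple unweighted averaging.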
A deep-dream virtual reality platform for studying altered perceptual phenomenology
Altered states of consciousness, such as psychotic or pharmacologically-induced hallucinations, provide a unique opportunity to examine the mechanisms underlying conscious perception. However, the phenomenological properties of these states are difficult to isolate experimentally from other, more general physiological and cognitive effects of psychoactive substances or psychopathological conditions. Thus, simulating phenomenological aspects of altered states in the absence of these other more general effects provides an important experimental tool for consciousness science and psychiatry. Here we describe such a tool, which we call the Hallucination Machine. It comprises a novel combination of two powerful technologies: deep convolutional neural networks (DCNNs) and panoramic videos of natural scenes, viewed immersively through a head-mounted display (panoramic VR). By doing this, we are able to simulate visual hallucinatory experiences in a biologically plausible and ecologically valid way. Two experiments illustrate potential applications of the Hallucination Machine. First, we show that the system induces visual phenomenology qualitatively similar to classical psychedelics. In a second experiment, we find that simulated hallucinations do not evoke the temporal distortion commonly associated with altered states. Overall, the Hallucination Machine offers a valuable new technique for simulating altered phenomenology without directly altering the underlying neurophysiology.
Audio-Visual Speech Cue Combination
Background: Different sources of sensory information can interact, often shaping what we think we have seen or heard. This can enhance the precision of perceptual decisions relative to those made on the basis of a single source of information. From a computational perspective, there are multiple reasons why this might happen, and each predicts a different degree of enhanced precision. Relatively slight improvements can arise when perceptual decisions are made on the basis of multiple independent sensory estimates, as opposed to just one. These improvements can arise as a consequence of probability summation. Greater improvements can occur if two initially independent estimates are summated to form a single integrated code, especially if the summation is weighted in accordance with the variance associated with each independent estimate. This form of combination is often described as a Bayesian maximum likelihood estimate. Still greater improvements are possible if the two sources of information are encoded via a common physiological process. Principal Findings: Here we show that the provision of simultaneous audio and visual speech cues can result in substantial sensitivity improvements, relative to decisions based on a single sensory modality. The magnitude of the improvements is greater than can be predicted on the basis of either a Bayesian maximum likelihood estimate or probability summation. Conclusion: Our data suggest that primary estimates of speech content are determined by a physiological process that takes input from both visual and auditory processing, resulting in greater sensitivity than would be possible if initially independent audio and visual estimates were formed and then subsequently combined.
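The maximum-likelihood benchmark against which such improvements are compared can be sketched as follows; a minimal illustration, assuming two independent Gaussian cues (the noise values are hypothetical):

```python
import math

def mle_sensitivity(sigma_a, sigma_v):
    """Predicted noise of a Bayesian maximum-likelihood (inverse-variance
    weighted) combination of two independent Gaussian cues: the combined
    variance is sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2), which is
    never larger than the smaller single-cue variance."""
    var = (sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2)
    return math.sqrt(var)

# Two equally reliable cues predict at most a sqrt(2) reduction in noise
# (i.e. a sqrt(2) gain in precision); sensitivity exceeding this bound is
# the signature of a common, non-independent code.
combined_noise = mle_sensitivity(1.0, 1.0)
```

This ceiling is what the abstract refers to: observed audio-visual speech sensitivity exceeding the combined-noise prediction argues against combination of initially independent estimates.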
Perceptual content, not physiological signals, determines perceived duration when viewing dynamic, natural scenes
The neural basis of time perception remains unknown. A prominent account is the pacemaker-accumulator model, wherein regular ticks of some physiological or neural pacemaker are read out as time. Putative candidates for the pacemaker have been suggested in physiological processes (heartbeat), or dopaminergic mid-brain neurons, whose activity has been associated with spontaneous blinking. However, such proposals have difficulty accounting for observations that time perception varies systematically with perceptual content. We examined physiological influences on human duration estimates for naturalistic videos between 1 and 64 seconds long using cardiac and eye recordings. Duration estimates were biased by the amount of change in scene content. Contrary to previous claims, heart rate and blinking were not related to duration estimates. Our results support a recent proposal that tracking change in perceptual classification networks provides a basis for human time perception, and suggest that previous assertions of the importance of physiological factors should be tempered.
Intentional binding without intentional action
The experience of authorship over one’s actions and their consequences—sense of agency—is a fundamental aspect of conscious experience. In recent years, it has become common to use intentional binding as an implicit measure of the sense of agency. However, it remains contentious whether reported intentional-binding effects indicate the role of intention-related information in perception or merely represent a strong case of multisensory causal binding. Here, we used a novel virtual-reality setup to demonstrate binding effects of identical magnitude in both the presence and complete absence of intentional action, when perceptual stimuli were matched for temporal and spatial information. Our results demonstrate that intentional-binding-like effects are most simply accounted for by multisensory causal binding without necessarily being related to intention or agency. Future studies that relate binding effects to agency must provide evidence for effects beyond those expected from multisensory causal binding by itself.