Spike-Triggered Covariance Analysis Reveals Phenomenological Diversity of Contrast Adaptation in the Retina
When visual contrast changes, retinal ganglion cells adapt by adjusting their sensitivity as well as their temporal filtering characteristics. The latter has classically been described by contrast-induced gain changes that depend on temporal frequency. Here, we explored a new perspective on contrast-induced changes in temporal filtering by using spike-triggered covariance analysis to extract multiple parallel temporal filters for individual ganglion cells. Based on multielectrode-array recordings from ganglion cells in the isolated salamander retina, we found that contrast adaptation of temporal filtering can largely be captured by contrast-invariant sets of filters with contrast-dependent weights. Moreover, differences among the ganglion cells in the filter sets and their contrast-dependent contributions allowed us to phenomenologically distinguish three types of filter changes. The first type is characterized by newly emerging features at higher contrast, which can be reproduced by computational models that contain response-triggered gain-control mechanisms. The second type follows from stronger adaptation in the Off pathway as compared to the On pathway in On-Off-type ganglion cells. Finally, we found that, in a subset of neurons, contrast-induced filter changes are governed by particularly strong spike-timing dynamics, notably by pronounced stimulus-dependent latency shifts that can be observed in these cells. Together, our results show that the contrast dependence of temporal filtering in retinal ganglion cells has a multifaceted phenomenology and that a multi-filter analysis can provide a useful basis for capturing the underlying signal-processing dynamics.
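To illustrate the core of the spike-triggered covariance method mentioned above, here is a minimal, self-contained sketch. It simulates a toy neuron with two hypothetical temporal filters and a quadratic nonlinearity (both assumptions for illustration, not the paper's fitted models), and recovers the filter subspace from the eigenvectors of the spike-triggered covariance matrix whose eigenvalues deviate most from the stimulus variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a Gaussian white-noise stimulus, cut into filter-length segments
T, dim = 50000, 20                      # number of time steps, filter length
stim = rng.standard_normal((T, dim))

# Two hypothetical orthogonal temporal filters (illustrative assumptions)
t = np.arange(dim)
f1 = np.sin(2 * np.pi * t / dim); f1 /= np.linalg.norm(f1)
f2 = np.cos(2 * np.pi * t / dim); f2 /= np.linalg.norm(f2)

# Quadratic nonlinearity: spiking driven by energy along either filter
drive = (stim @ f1) ** 2 + (stim @ f2) ** 2
p_spike = np.clip(0.05 * drive, 0.0, 1.0)
spikes = rng.random(T) < p_spike

# Spike-triggered average and spike-triggered covariance
sta = stim[spikes].mean(axis=0)
centered = stim[spikes] - sta
stc = centered.T @ centered / (spikes.sum() - 1)

# Relevant filters appear as eigenvectors whose eigenvalues differ from
# the raw stimulus variance (1 for white noise); eigh sorts ascending,
# so the two largest-variance directions are the last two columns.
eigvals, eigvecs = np.linalg.eigh(stc)
recovered = eigvecs[:, -2:]             # estimated two-filter subspace
```

Because the toy nonlinearity is symmetric, the STA is near zero and the filters are only visible in the covariance eigenspectrum, which is exactly the situation where a multi-filter STC analysis adds information beyond the classical single-filter description.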
Separate spatial and temporal frequency tuning to visual motion in human MT+ measured with ECoG
Describing complex cells in primary visual cortex: a comparison of context and multifilter LN models
Speed Constancy or Only Slowness: What Drives the Kappa Effect
In the Kappa effect, two visual stimuli are presented, and their spatial distance affects their perceived temporal interval. The classical model assumes constant speed, while a competing Bayesian model assumes a prior for slow speeds; the two models rest on different assumptions about the statistical structure of the environment. Here we introduce a new visual experiment to distinguish between them. When fit to the data, both models reproduced the human responses, but the slowness model made better behavioral predictions than the speed-constancy model, and the estimated constant speed was close to the absolute threshold for speed perception. Our findings suggest that the Kappa effect is driven by a slow-speed prior and is further modulated by spatial variance.
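The slow-speed-prior account described above can be sketched as a simple Bayesian observer. The following toy model (all parameter values and the Gaussian noise/prior forms are illustrative assumptions, not the paper's fitted model) computes a MAP interval estimate by combining a noisy timing measurement with a zero-mean prior on speed v = d/t; because the prior favors slow speeds, larger distances push the estimate toward longer intervals, reproducing the qualitative Kappa effect:

```python
import numpy as np

def perceived_interval(d, t_m, sigma_t=0.05, sigma_v=5.0):
    """MAP interval estimate for a slowness-prior observer (toy model).

    d       : spatial distance between the stimuli (deg)
    t_m     : noisy measured interval (s)
    sigma_t : timing-noise standard deviation (s), illustrative value
    sigma_v : width of the zero-mean speed prior (deg/s), illustrative value
    """
    t = np.linspace(0.01, 2.0, 4000)                  # candidate intervals
    log_like = -(t - t_m) ** 2 / (2 * sigma_t ** 2)   # Gaussian timing noise
    log_prior = -(d / t) ** 2 / (2 * sigma_v ** 2)    # slow prior on v = d/t
    return t[np.argmax(log_like + log_prior)]

# Same measured interval, different distances: the larger spatial
# separation yields a longer perceived interval (Kappa effect).
t_near = perceived_interval(d=2.0, t_m=0.5)
t_far = perceived_interval(d=10.0, t_m=0.5)
```

A speed-constancy observer would instead infer the interval directly from the distance and an assumed fixed speed; the grid-based posterior above makes the slowness alternative easy to probe under different noise and prior widths.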
Model Constrained by Visual Hierarchy Improves Prediction of Neural Responses to Natural Scenes
Multiplexed computations in retinal ganglion cells of a single type
Retinal ganglion cell subtypes are traditionally thought to encode a single visual feature across the visual field to form a feature map. Here the authors show that fast OFF ganglion cells in fact encode one of two visual features, either object position or speed, depending on the stimulus location.
