    Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception

    Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one: participants' performance is consistent with an optimal model in which environmental, within-category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks.
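
    A minimal sketch of the normative model referenced above, under assumptions not spelled out in the abstract (Gaussian cue likelihoods and Gaussian within-category distributions; the symbols below are illustrative, not the paper's notation). For a continuous dimension, each cue is weighted by its reliability:

        \hat{s} = w_A \hat{s}_A + w_V \hat{s}_V, \qquad
        w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \quad w_V = 1 - w_A

    For a categorical task, the within-category (environmental) variance adds to each cue's sensory variance, so the effective weights become

        w_A \propto \frac{1}{\sigma_A^2 + \sigma_{c,A}^2}, \qquad
        w_V \propto \frac{1}{\sigma_V^2 + \sigma_{c,V}^2}

    which is one way to read the claim that optimal cue weights depend on both sensory and environmental variance.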

    Computing in the face of uncertainty: from neurons to behavior

    Thesis (Ph.D.)--University of Rochester, Dept. of Brain and Cognitive Sciences, 2010.
    What are the computational mechanisms that underlie perceptual and cognitive behavior? Any answer to this question must start with the observation that the brain has to work with uncertain information at every level of analysis. The presence of uncertainty means that computation in the brain becomes a problem of probabilistic inference. Indeed, we can recast all cognitive processing as comprising sequential stages of probabilistic inference, performed over data of varying abstraction. In this framework, the goal of processing at a particular level is to infer the variable of interest given the input information, and the goal of learning at a particular level is to improve the quality of the inference that is being carried out. In this thesis we explore and computationally characterize the inference that underlies cognitive processing at multiple levels, using multiple research methodologies. At the neural level, we derive a simple analytic expression that relates network properties to the quality of the inference carried out during neural representation and transmission. This derivation provides an important tool for elucidating the mechanisms that lead to efficient inference. We then use this expression to explore the neural mechanisms that underlie the improvements in behavioral performance observed during perceptual learning. We report that perceptual learning can be neurally mediated through an improvement in the inference process in early sensory areas. Importantly, this model, in addition to accounting for the training-induced changes in behavioral performance, also captures the training-induced changes in neural properties. Finally, at the behavioral level, we show that human multi-sensory integration during categorical speech perception is well described by a normative model for optimal inference, thereby providing behavioral evidence for efficient inference in the brain. Unlike previous studies, the work described here computationally and experimentally probes cue integration in categorical tasks; this is an important extension, since most real-world perceptual tasks involve judgments over categorical dimensions.
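
    A minimal Python sketch of the multi-sensory result summarized above (not the thesis code; the function, parameters, and numbers are invented for illustration): an observer computes a posterior over phoneme categories from an auditory and a visual cue, with each cue's effective variance combining its sensory variance and the category's environmental (within-category) variance.

        # Hypothetical illustration (not the thesis code): Bayes-optimal labeling of a
        # stimulus into one of two phoneme categories from an auditory and a visual cue,
        # assuming Gaussian cue noise and Gaussian category distributions. All numbers
        # are invented for demonstration.
        import numpy as np

        def category_posterior(x_a, x_v, mu_a, mu_v, var_a, var_v, var_ca, var_cv, prior):
            """Posterior over categories given auditory cue x_a and visual cue x_v.

            mu_a, mu_v     : per-category means on the auditory / visual dimensions
            var_a, var_v   : sensory (internal) variance of each cue
            var_ca, var_cv : environmental (within-category) variance on each dimension
            prior          : prior probability of each category
            """
            # Effective variance per cue = sensory variance + within-category variance.
            s_a = var_a + var_ca
            s_v = var_v + var_cv
            # Log-likelihood of the two cues under each category (conditional independence).
            log_like = (-(x_a - mu_a) ** 2 / (2 * s_a) - 0.5 * np.log(2 * np.pi * s_a)
                        - (x_v - mu_v) ** 2 / (2 * s_v) - 0.5 * np.log(2 * np.pi * s_v))
            log_post = log_like + np.log(prior)
            log_post -= log_post.max()          # for numerical stability
            post = np.exp(log_post)
            return post / post.sum()

        # Two hypothetical categories (e.g., /ba/ vs /da/) on arbitrary cue axes.
        mu = np.array([-1.0, 1.0])
        print(category_posterior(0.3, -0.2, mu, mu,
                                 var_a=0.5, var_v=1.0,
                                 var_ca=np.array([0.2, 0.2]),
                                 var_cv=np.array([0.4, 0.4]),
                                 prior=np.array([0.5, 0.5])))

    Under these Gaussian assumptions the cue with the smaller effective variance dominates the posterior, which is the sense in which optimal cue weights depend on both sensory and environmental variability.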