
    A search asymmetry for interocular conflict

    When two different images are presented to the two eyes, the percept will alternate between the images (a phenomenon called binocular rivalry). In the present study, we investigate the degree to which such interocular conflict is conspicuous. By using a visual search task, we show that search for interocular conflict is near-efficient (15 ms/item) and can lead to a search asymmetry, depending on the contrast in the display. We reconcile our findings with those of Wolfe and Franzel (1988), who reported inefficient search for interocular conflict (26 ms/item) and found no evidence for a search asymmetry. In addition, we provide evidence for the suggestion that differences in search for interocular conflict are contingent on the degree of abnormal fusion of the dissimilar images.
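
    Search efficiency here is simply the slope of a linear fit of response time against set size. A minimal illustration of that calculation, using hypothetical response times and NumPy's polyfit (the data values are assumptions, not the study's):

```python
import numpy as np

# Hypothetical mean response times (ms) at three set sizes; the slope of a
# linear fit gives search efficiency in ms/item (roughly 15 ms/item would be
# near-efficient, roughly 26 ms/item inefficient).
set_sizes = np.array([4, 8, 12])
mean_rt_ms = np.array([620, 680, 740])  # illustrative values only

slope, intercept = np.polyfit(set_sizes, mean_rt_ms, 1)
print(f"search slope: {slope:.1f} ms/item, intercept: {intercept:.0f} ms")
```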

    Synchronized Audio-Visual Transients Drive Efficient Visual Search for Motion-in-Depth

    In natural audio-visual environments, a change in depth is usually correlated with a change in loudness. In the present study, we investigated whether correlating changes in disparity and loudness would provide a functional advantage in binding disparity and sound amplitude in a visual search paradigm. To test this hypothesis, we used a method similar to that used by van der Burg et al. to show that non-spatial transient (square-wave) modulations of loudness can drastically improve spatial visual search for a correlated luminance modulation. We used dynamic random-dot stereogram displays to produce pure disparity modulations. Target and distractors were small disparity-defined squares (either 6 or 10 in total). Each square moved back and forth in depth in front of the background plane at different phases. The target’s depth modulation was synchronized with an amplitude-modulated auditory tone. Visual and auditory modulations were always congruent (both sine-wave or square-wave). In a speeded search task, five observers were asked to identify the target as quickly as possible. Results show a significant improvement in visual search times in the square-wave condition compared to the sine-wave condition, suggesting that transient auditory information can efficiently drive visual search in the disparity domain. In a second experiment, participants performed the same task in the absence of sound and showed a clear set-size effect in both modulation conditions. In a third experiment, we correlated the sound with a distractor instead of the target. This produced longer search times, indicating that the correlation is not easily ignored.
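
    The contrast between the conditions rests on the waveform of the shared modulation. A small sketch of how congruent sine-wave and square-wave (transient) signals for the target's disparity and the tone's amplitude might be generated; the modulation rate and duration are assumptions, not the study's parameters:

```python
import numpy as np

# Assumed parameters for illustration; the target's disparity and the tone's
# amplitude follow the same waveform, either smooth (sine) or transient (square).
duration_s = 2.0
rate_hz = 1.0                                            # assumed modulation frequency
t = np.linspace(0.0, duration_s, int(duration_s * 1000), endpoint=False)

sine_mod = np.sin(2 * np.pi * rate_hz * t)               # gradual modulation
square_mod = np.sign(np.sin(2 * np.pi * rate_hz * t))    # abrupt (transient) modulation

# In the synchronized (congruent) condition the target's depth and the sound
# share one signal; distractors would follow the same waveform at other phases.
target_disparity = square_mod
sound_amplitude = square_mod
```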

    Collective Animal Behavior from Bayesian Estimation and Probability Matching

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is based on empirical fits to observations, and we lack first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probability matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform, taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability given by the Bayesian estimation that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend on only two types of reliability parameter: one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by deriving theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows better contact with other fields such as foraging, mate selection, neurobiology and psychology, and yields predictions for experiments directly testing the relationship between estimation and collective behavior.
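
    A rough sketch of the two-stage idea in code; the functional form, parameter names, and example numbers are illustrative assumptions rather than the paper's exact equations:

```python
import numpy as np

def choose_behavior(personal_evidence, n_others_choosing,
                    reliability_personal, reliability_social,
                    rng=np.random.default_rng()):
    """Two-stage choice: Bayesian-style combination of personal and social
    evidence, followed by probability matching on the resulting posterior.

    personal_evidence: per-option log evidence from non-social information.
    n_others_choosing: number of other animals observed performing each option.
    The reliability parameters weight the two sources of information."""
    # Stage 1: combine the two information sources into a posterior over options.
    log_post = (reliability_personal * np.asarray(personal_evidence, float)
                + reliability_social * np.asarray(n_others_choosing, float))
    posterior = np.exp(log_post - log_post.max())
    posterior /= posterior.sum()
    # Stage 2: probability matching -- pick each option with probability equal
    # to its estimated probability of being the best one.
    return rng.choice(len(posterior), p=posterior)

# Example: weak personal evidence for option 0, but three fish already at option 1.
choice = choose_behavior(personal_evidence=[0.5, 0.0], n_others_choosing=[0, 3],
                         reliability_personal=1.0, reliability_social=0.8)
```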

    Multisensory Oddity Detection as Bayesian Inference

    A key goal for the perceptual system is to optimally combine information from all the senses that may be available in order to develop the most accurate and unified picture possible of the outside world. The contemporary theoretical framework of ideal observer maximum likelihood integration (MLI) has been highly successful in modelling how the human brain combines information from a variety of different sensory modalities. However, in various recent experiments involving multisensory stimuli of uncertain correspondence, MLI breaks down as a successful model of sensory combination. Within the paradigm of direct stimulus estimation, perceptual models which use Bayesian inference to resolve correspondence have recently been shown to generalize successfully to these cases where MLI fails. This approach has been known variously as model inference, causal inference or structure inference. In this paper, we examine causal uncertainty in another important class of multisensory perception paradigm – oddity detection – and demonstrate how a Bayesian ideal observer also treats oddity detection as a structure inference problem. We validate this approach by showing that it provides an intuitive and quantitative explanation of an important pair of multisensory oddity detection experiments – involving cues across and within modalities – for which MLI previously failed dramatically, allowing a novel unifying treatment of within- and cross-modal multisensory perception. Our successful application of structure inference models to the new ‘oddity detection’ paradigm, and the resultant unified explanation of across- and within-modality cases, provide further evidence to suggest that structure inference may be a commonly evolved principle for combining perceptual information in the brain.
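
    For intuition, a simplified sketch contrasting forced-fusion MLI with structure (causal) inference over two cues; the Gaussian assumptions, priors, and the use of only the cue discrepancy are simplifications chosen for brevity, not the paper's model:

```python
import numpy as np

def mli_estimate(x_a, x_v, var_a, var_v):
    """Forced fusion: reliability-weighted average of the two cues."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    return w_a * x_a + (1 - w_a) * x_v

def p_common_cause(x_a, x_v, var_a, var_v, var_prior=100.0, prior_common=0.5):
    """Structure inference: posterior probability that the cues share one cause,
    judged (for brevity) only from how well each model explains the discrepancy."""
    d = x_a - x_v
    var_same = var_a + var_v                   # common cause: discrepancy is pure noise
    var_diff = var_a + var_v + 2 * var_prior   # separate causes: two independent sources
    like_same = np.exp(-d**2 / (2 * var_same)) / np.sqrt(2 * np.pi * var_same)
    like_diff = np.exp(-d**2 / (2 * var_diff)) / np.sqrt(2 * np.pi * var_diff)
    return prior_common * like_same / (prior_common * like_same
                                       + (1 - prior_common) * like_diff)

# Widely separated cues: MLI still averages them, whereas the structure-inference
# observer assigns most probability to separate causes and can keep them apart.
print(mli_estimate(0.0, 6.0, 1.0, 1.0), p_common_cause(0.0, 6.0, 1.0, 1.0))
```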

    An equal right to inherit? Women's land rights, customary law and constitutional reform in Tanzania

    This article explores contemporary contestations surrounding women’s inheritance of land in Africa. Legal activism has gained momentum, both in agendas for law reform and in test case litigation, which reached the United Nations Committee on the Elimination of Discrimination against Women in ES and SC v. United Republic of Tanzania. Comparing the approach of Tanzania to that of its neighbours, Uganda, Kenya and Rwanda, this article explores patterns of resistance and omission towards enshrining an equal right to inherit in land and succession laws. It identifies two main reasons for this: the neoliberal drivers of land law reform in the 1990s and the sociopolitical sensitivity surrounding inheritance of land. It argues that a progressive approach to constitutional and law reform on women’s land rights requires understanding of the realities of claims to family land based on kinship relations. It calls for a holistic approach to land, marriage and inheritance law reform, underpinned by constitutional rights to equality and progressive interpretations of living customary law.

    Continuous Evolution of Statistical Estimators for Optimal Decision-Making

    In many everyday situations, humans must make precise decisions in the presence of uncertain sensory information. For example, when asked to combine information from multiple sources we often assign greater weight to the more reliable information. It has been proposed that the statistical optimality often observed in human perception and decision-making requires that humans have access to the uncertainty of both their senses and their decisions. However, the mechanisms underlying the processes of uncertainty estimation remain largely unexplored. In this paper we introduce a novel visual tracking experiment that requires subjects to continuously report their evolving perception of the mean and uncertainty of noisy visual cues over time. We show that subjects accumulate sensory information over the course of a trial to form a continuous estimate of the mean, hindered only by natural kinematic constraints (e.g., sensorimotor latency). Furthermore, subjects have access to a measure of their continuous objective uncertainty, rapidly acquired from sensory information available within a trial, but limited by natural kinematic constraints and a conservative margin for error. Our results provide the first direct evidence of continuous mean- and uncertainty-estimation mechanisms in humans that may underlie optimal decision-making.
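
    An ideal-observer intuition for such a task: with independent Gaussian cues, the best running estimate of the mean is the sample average so far, and the uncertainty about that mean falls roughly as the cue noise divided by the square root of the number of samples. A minimal sketch under those assumptions (all values illustrative):

```python
import numpy as np

def continuous_estimates(samples, cue_sigma):
    """Running estimate of the mean and of the uncertainty about that mean,
    updated after every noisy sample (ideal observer, no kinematic limits)."""
    means, uncertainties = [], []
    for n in range(1, len(samples) + 1):
        means.append(np.mean(samples[:n]))            # best estimate so far
        uncertainties.append(cue_sigma / np.sqrt(n))  # standard error of that estimate
    return np.array(means), np.array(uncertainties)

rng = np.random.default_rng(0)
true_mean, cue_sigma = 0.0, 2.0
samples = rng.normal(true_mean, cue_sigma, size=50)   # one trial's stream of noisy cues
mean_track, sd_track = continuous_estimates(samples, cue_sigma)
```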

    The Spatial Origin of a Perceptual Transition in Binocular Rivalry

    When the left and the right eye are simultaneously presented with incompatible images at overlapping retinal locations, an observer typically reports perceiving only one of the two images at a time. This phenomenon is called binocular rivalry. Perception during binocular rivalry is not stable; one of the images is perceptually dominant for a certain duration (typically in the order of a few seconds), after which perception switches towards the other image. This alternation between perceptual dominance and suppression will continue for as long as the images are presented. A characteristic of binocular rivalry is that a perceptual transition from one image to the other generally occurs in a gradual manner: the image that was temporarily suppressed will regain perceptual dominance at isolated locations within the perceived image, after which its visibility spreads throughout the whole image. These gradual transitions from perceptual suppression to perceptual dominance have been labeled as traveling waves of perceptual dominance. In this study we investigated whether stimulus parameters affect the location at which a traveling wave starts. We varied the contrast, spatial frequency or motion speed in one of the rivaling images, while keeping the same parameter constant in the other image. We used a flash-suppression paradigm to force one of the rival images into perceptual suppression. Observers waited until the suppressed image became perceptually dominant again, and indicated the position at which this breakthrough from suppression occurred. Our results show that the starting point of a traveling wave during binocular rivalry is highly dependent on local stimulus parameters. More specifically, a traveling wave most likely started at the location where the contrast of the suppressed image was higher than that of the dominant one, the spatial frequency of the suppressed image was lower than that of the dominant one, and the motion speed of the suppressed image was higher than that of the dominant one. We suggest that a breakthrough from suppression to dominance occurs at the location where the salience (the degree to which a stimulus element stands out relative to neighboring elements) of the suppressed image is higher than that of the dominant one. Our results further show that stimulus parameters reported in other studies to affect the temporal dynamics during continuous viewing of rival images also affect the spatial origin of traveling waves during binocular rivalry.

    Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception

    Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one: participants' performance is consistent with an optimal model in which environmental, within-category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue-combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks.
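
    A minimal sketch of such a categorical Bayes-optimal observer, assuming Gaussian category distributions along a single cue dimension; the category means, variances, and cue values below are illustrative assumptions, not fits to the experimental data:

```python
import numpy as np

def phoneme_posterior(x_audio, x_visual, cat_means, cat_var,
                      sensory_var_a, sensory_var_v, prior=None):
    """Posterior over categories given one auditory and one visual cue.
    The effective variance per cue is sensory variance plus within-category
    (environmental) variance, so both determine the implied cue weights."""
    cat_means = np.asarray(cat_means, float)
    prior = (np.ones_like(cat_means) / cat_means.size
             if prior is None else np.asarray(prior, float))
    var_a = sensory_var_a + cat_var
    var_v = sensory_var_v + cat_var
    log_like = (-(x_audio - cat_means) ** 2 / (2 * var_a)
                - (x_visual - cat_means) ** 2 / (2 * var_v))
    post = prior * np.exp(log_like - log_like.max())
    return post / post.sum()

# Example: two categories; the visual cue is noisier, so it is weighted less.
print(phoneme_posterior(x_audio=0.3, x_visual=1.2, cat_means=[0.0, 1.0],
                        cat_var=0.1, sensory_var_a=0.2, sensory_var_v=0.8))
```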

    Efficient Coding and Statistically Optimal Weighting of Covariance among Acoustic Attributes in Novel Sounds

    To the extent that sensorineural systems are efficient, redundancy should be extracted to optimize transmission of information, but perceptual evidence for this has been limited. Stilp and colleagues recently reported efficient coding of a robust correlation (r = .97) between complex acoustic attributes (attack/decay, spectral shape) in novel sounds. Discrimination of sounds orthogonal to the correlation was initially inferior but later comparable to that of sounds obeying the correlation. These effects were attenuated for less-correlated stimuli (r = .54) for reasons that are unclear. Here, statistical properties of the correlation among acoustic attributes essential for perceptual organization are investigated. Overall, the simple strength of the principal correlation is inadequate to predict listener performance. Initial superiority of discrimination for statistically consistent sound pairs was relatively insensitive to a decreased physical acoustic/psychoacoustic range of evidence supporting the correlation, and to more frequent presentations of the same orthogonal test pairs. However, increased range supporting an orthogonal dimension has substantial effects upon perceptual organization. Connectionist simulations and eigenvalues from closed-form calculations of principal components analysis (PCA) reveal that perceptual organization is near-optimally weighted to shared versus unshared covariance in experienced sound distributions. Implications of reduced perceptual dimensionality for speech perception and plausible neural substrates are discussed.
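
    The PCA intuition can be illustrated with simulated attribute pairs at the two correlation strengths mentioned above; everything other than the correlation values (r = .97 and r = .54) is assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def principal_eigenvalues(r, n=10_000):
    """Eigenvalues of the sample covariance of two unit-variance attributes
    correlated at strength r: the first captures the shared (consistent)
    covariance, the second the variance orthogonal to it."""
    cov = np.array([[1.0, r], [r, 1.0]])
    samples = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return np.sort(np.linalg.eigvalsh(np.cov(samples.T)))[::-1]

for r in (0.97, 0.54):
    leading, orthogonal = principal_eigenvalues(r)
    print(f"r = {r:.2f}: leading {leading:.2f}, orthogonal {orthogonal:.2f}")
```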

    Bayesian Cue Integration as a Developmental Outcome of Reward Mediated Learning

    Average human behavior in cue combination tasks is well predicted by Bayesian inference models. As this capability is acquired over developmental timescales, the question arises of how it is learned. Here we investigated whether reward-dependent learning, which is well established at the computational, behavioral, and neuronal levels, could contribute to this development. It is shown that a model-free reinforcement learning algorithm can indeed learn to do cue integration, i.e., weight uncertain cues according to their respective reliabilities, and even do so if reliabilities are changing. We also consider the case of causal inference, where multimodal signals can originate from one or multiple separate objects and should not always be integrated. In this case, the learner is shown to develop a behavior that is closest to Bayesian model averaging. We conclude that reward-mediated learning could be a driving force for the development of cue integration and causal inference.
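
    A toy illustration of the idea that reward alone can drive cue weights toward their reliability-based optimum; this perturbation-based update is a stand-in chosen for brevity, not the paper's reinforcement learning algorithm, and all parameters are assumptions:

```python
import numpy as np

# Two noisy cues to the same target; the statistically optimal combination
# weight is the reliability-based one.
rng = np.random.default_rng(2)
sigma_1, sigma_2 = 1.0, 2.0
optimal_w = (1 / sigma_1**2) / (1 / sigma_1**2 + 1 / sigma_2**2)

w, lr, explore_sd, baseline = 0.5, 0.02, 0.05, 0.0
for _ in range(50_000):
    target = rng.uniform(-5, 5)
    cue_1 = target + rng.normal(0, sigma_1)
    cue_2 = target + rng.normal(0, sigma_2)
    # Try a slightly perturbed weight and observe only a scalar reward.
    w_try = float(np.clip(w + rng.normal(0, explore_sd), 0, 1))
    estimate = w_try * cue_1 + (1 - w_try) * cue_2
    reward = -(estimate - target) ** 2
    baseline = 0.99 * baseline + 0.01 * reward           # running reward baseline
    # Move the weight toward perturbations that beat the baseline (noisy, but
    # sufficient to drift toward the optimal weight over many trials).
    w = float(np.clip(w + lr * (reward - baseline) * (w_try - w), 0, 1))

print(f"learned weight {w:.2f} vs. optimal weight {optimal_w:.2f}")
```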