30 research outputs found

    Evolution and Optimality of Similar Neural Mechanisms for Perception and Action during Search

    A prevailing theory proposes that the brain's two visual pathways, the ventral and the dorsal, produce different visual processing and world representations for conscious perception than for action. Others have argued that perception and action share much of their visual processing. Which of these two neural architectures is favored by evolution? Successful visual search is life-critical, and here we investigate the evolution and optimality of the neural mechanisms mediating perception and eye-movement actions during visual search in natural images. We implement an approximation to the ideal Bayesian searcher with two separate processing streams: one controlling the eye movements and the other determining the perceptual search decisions. We virtually evolved the neural mechanisms of the searchers' two pathways, built from linear combinations of primary visual cortex (V1) receptive fields, by making each simulated individual's probability of survival depend on its perceptual accuracy in finding targets in cluttered backgrounds. For a variety of targets, backgrounds, and dependences of target detectability on retinal eccentricity, the mechanisms of the two processing streams converge to similar representations, showing that mismatches between the mechanisms for perception and eye movements lead to suboptimal search. Three exceptions resulted in partial or no convergence: an organism for which the target is equally detectable across the retina, an organism with sufficient time to foveate all possible target locations, and a strict two-pathway model with no interconnections and differential pre-filtering based on parvocellular and magnocellular lateral geniculate cell properties.
    Thus, similar neural mechanisms for perception and eye-movement actions during search are optimal, and should be expected from the effects of natural selection on an organism that has limited time to search for food that is not equally detectable across its retina and whose perception and action pathways are interconnected.
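The ideal-searcher framework described above can be sketched in simplified form. The sketch below is a minimal illustration, not the authors' implementation: it assumes a one-dimensional array of candidate locations, Gaussian template responses, a hypothetical d' falloff with eccentricity (the `dprime` function and its parameters are invented for illustration), and a maximum-a-posteriori fixation rule.

```python
import numpy as np

def dprime(ecc, d0=3.0, falloff=0.5):
    """Assumed detectability falloff with retinal eccentricity (hypothetical form)."""
    return d0 / (1.0 + falloff * np.abs(ecc))

def ideal_searcher(target_idx, locs, max_fixations=30, seed=0):
    """One simulated search trial; returns (target found?, number of fixations)."""
    rng = np.random.default_rng(seed)
    n = len(locs)
    log_post = np.full(n, -np.log(n))            # uniform prior over target locations
    fix = locs[n // 2]                           # start fixating the display centre
    for t in range(1, max_fixations + 1):
        d = dprime(locs - fix)                   # per-location d' for this fixation
        # Noisy template responses: mean +d'/2 at the target, -d'/2 elsewhere.
        resp = rng.normal(-d / 2, 1.0)
        resp[target_idx] = rng.normal(d[target_idx] / 2, 1.0)
        # Log-likelihood-ratio update: log N(x; d/2, 1) - log N(x; -d/2, 1) = d * x.
        log_post += d * resp
        log_post -= np.logaddexp.reduce(log_post)  # renormalise the posterior
        best = int(np.argmax(log_post))
        if log_post[best] > np.log(0.99):        # perceptual decision criterion
            return bool(best == target_idx), t
        fix = locs[best]                         # MAP fixation rule (an approximation
                                                 # to the ideal fixation selector)
    return bool(int(np.argmax(log_post)) == target_idx), max_fixations
```

In this sketch the same posterior drives both the eye-movement rule and the final perceptual decision, i.e. the "shared mechanisms" case; the paper's two-stream model would instead compute two such posteriors from separately evolved front-end mechanisms.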

    The neural basis of communicative gestures in left-handed individuals

    No full text

    The effect of static and dynamic gesture presentation on the recognition of two manipulation gestures

    No full text
    Gesture is an important means of nonverbal communication and was used to convey messages even before the advent of language. With the development of computer technology, gesture interaction has become a trend in natural and harmonious human-computer interaction. Accurate and efficient hand-gesture recognition is the key to gesture interaction, not only between humans and electronic devices but also among users in virtual reality systems. Efficient gesture recognition demands that users devote more attention to what gestures express than to features unrelated to gesture meaning. The present study therefore explored whether the processing of gesture orientation and left/right-hand information, two gesture features unrelated to gesture meaning, can be modulated by static versus dynamic presentation in the recognition of manipulation gestures. The results showed that gesture orientation is processed in the recognition of static, but not dynamic, gestures of function-based manipulation (for example, holding a lighter and pressing its switch with the thumb). However, gesture orientation is processed in the recognition of dynamic gestures of structure-based manipulation (for example, picking up a lighter with the thumb and forefinger), and left/right-hand information is processed in the recognition of static gestures. This indicates that static and dynamic presentation affects the recognition of manipulation gestures, and does so differently for structure- and function-based manipulation gestures. The findings suggest that dynamic function-based manipulation gestures are the better option in human-computer interaction, and that information unrelated to gesture meaning should be taken into account when presenting structure-based manipulation gestures, to ensure successful recognition. The findings provide theoretical guidance for the design of gesture-interaction methods.
    © Springer International Publishing AG, part of Springer Nature 2018.