
    Temporal perception of visual-haptic events in multimodal telepresence system

    Book synopsis: Haptic interfaces are divided into two main categories: force feedback and tactile. Force feedback interfaces are used to explore and modify remote/virtual objects in three physical dimensions, in applications including computer-aided design, computer-assisted surgery, and computer-aided assembly. Tactile interfaces deal with surface properties such as roughness, smoothness, and temperature. Haptic research is intrinsically multi-disciplinary, incorporating computer science/engineering, control, robotics, psychophysics, and human motor control. By extending the scope of research in haptics, advances can be achieved in existing applications such as computer-aided design (CAD), tele-surgery, rehabilitation, scientific visualization, robot-assisted surgery, authentication, and graphical user interfaces (GUI), to name a few. Advances in Haptics presents a number of recent contributions to the field; authors from around the world report the results of their research on various issues in haptics.
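
    Force feedback, the first category above, is usually rendered with an impedance-type servo loop: the device reads the probe's position and velocity and pushes back with a spring-damper force whenever the probe penetrates a virtual surface. The sketch below is a minimal, illustrative version of such a virtual-wall law; the stiffness, damping, and example penetration values are assumptions chosen for illustration, not parameters taken from the book.

        # Minimal sketch of an impedance-type force-feedback law (virtual wall at x = 0).
        # Assumed values: K_WALL and B_WALL are illustrative, not from any specific device.
        K_WALL = 800.0   # N/m, assumed stiffness of the virtual surface
        B_WALL = 2.0     # N*s/m, assumed damping that stabilises contact

        def wall_force(x, v):
            """Spring-damper reaction force while the probe penetrates the wall (x < 0)."""
            if x >= 0.0:                 # probe is in free space: no force
                return 0.0
            penetration = -x             # depth inside the virtual surface, in metres
            force = K_WALL * penetration - B_WALL * v
            return max(force, 0.0)       # never pull the probe into the wall

        # Example tick of the (typically ~1 kHz) haptic loop, with invented readings:
        # probe 2 mm inside the wall, still moving inward at 5 mm/s.
        print(wall_force(-0.002, -0.005))   # -> 1.61 N pushing the probe back out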

    Unimodal and crossmodal processing of visual and kinesthetic stimuli in working memory

    The processing of (object) information in working memory has been intensively investigated in the visual modality (e.g. D’Esposito, 2007; Ranganath, 2006). In comparison, research on kinesthetic/haptic or crossmodal processing in working memory is still sparse. During recognition and comparison of object information across modalities, representations built from one sensory modality have to be matched with representations obtained from other senses. The present thesis addressed how object information is represented in unimodal and crossmodal working memory, which processes enable unimodal and crossmodal comparisons, and which neuronal correlates are associated with these processes. In particular, unimodal and crossmodal processing of visually and kinesthetically perceived object features was systematically investigated in the distinct working memory phases of encoding, maintenance, and recognition. Here, the kinesthetic modality refers to the sensory perception of movement direction and spatial position, e.g. of one’s own hand, and is part of the haptic sense. Overall, the results of the present thesis suggest that modality-specific representations and modality-specific processes play a role during unimodal and crossmodal processing of object features in working memory.

    Tactual perception: a review of experimental variables and procedures

    This paper reviews the literature on tactual perception. Throughout this review we highlight some of the most relevant variables in the touch literature: the interaction between touch and other senses; the type of stimuli, from abstract stimuli such as vibrations to two- and three-dimensional stimuli, also considering concrete stimuli such as the relation between familiar and unfamiliar stimuli or the haptic perception of faces; the type of participants, separating studies with blind participants, studies with children and adults, and an analysis of sex differences in performance; and finally, the type of tactile exploration, considering conditions of active and passive touch, the relevance of movement in touch, and the relation between exploration and time. This review intends to present an organised overview of the main variables in touch experiments, attending to the main findings described in the literature, to guide the design of future work on tactual perception and memory. This work was funded by the Portuguese “Foundation for Science and Technology” through PhD scholarship SFRH/BD/35918/2007.

    Tactile information improves visual object discrimination in kea, Nestor notabilis, and capuchin monkeys, Sapajus spp.

    In comparative visual cognition research, the influence of information acquired by nonvisual senses has received little attention. Systematic studies focusing on how the integration of information from sight and touch can affect animal perception are sparse. Here, we investigated whether tactile input improves the visual discrimination ability of a bird, the kea, and of capuchin monkeys, two species with acute vision that are known for their tendency to handle objects. To this end, we assessed whether, at the attainment of a criterion, accuracy and/or learning speed in the visual modality were enhanced by haptic (i.e. active tactile) exploration of an object. Subjects were trained to select the positive stimulus between two cylinders of the same shape and size but with different surface structures. In the Sight condition, one pair of cylinders was inserted into transparent Plexiglas tubes, which prevented the animals from haptically perceiving the objects' surfaces. In the Sight and Touch condition, one pair of cylinders was not inserted into tubes, allowing the subjects to perceive the objects' surfaces both visually and haptically. We found that both kea and capuchins (1) showed comparable levels of accuracy at the attainment of the learning criterion in both conditions, but (2) required fewer trials to achieve the criterion in the Sight and Touch condition. Moreover, this study showed that both kea and capuchins can integrate information acquired by the visual and tactile modalities. To our knowledge, this represents the first evidence of visuotactile integration in a bird species. Overall, our findings demonstrate that the acquisition of tactile information while manipulating objects facilitates visual discrimination of objects in two phylogenetically distant species.
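
    In other areas of the visual-haptic literature, this kind of integration is often formalised as reliability-weighted (maximum-likelihood) cue combination, where each modality's estimate is weighted by the inverse of its variance. The snippet below is a generic sketch of that standard model, not the analysis used in this study, and the numerical estimates and variances in the example are invented purely for illustration.

        def combine_cues(mu_v, var_v, mu_h, var_h):
            """Reliability-weighted fusion of a visual and a haptic estimate of the same property."""
            w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)   # weight of the visual cue
            w_h = 1.0 - w_v                                      # weight of the haptic cue
            fused_mean = w_v * mu_v + w_h * mu_h
            fused_var = 1.0 / (1.0 / var_v + 1.0 / var_h)        # never larger than either cue's variance
            return fused_mean, fused_var

        # Invented example: vision estimates a surface ridge spacing of 2.0 mm (variance 0.04),
        # touch estimates 2.4 mm (variance 0.16); the fused estimate sits closer to the more
        # reliable visual cue and is more precise than either cue alone.
        print(combine_cues(2.0, 0.04, 2.4, 0.16))   # -> (2.08, 0.032)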

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness—understood here as encompassing both proprioception and kinaesthesis—and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances could not be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space.

    Size-sensitive perceptual representations underlie visual and haptic object recognition.

    A variety of similarities between visual and haptic object recognition suggests that the two modalities may share common representations. However, it is unclear whether such common representations preserve low-level perceptual features or whether transfer between vision and haptics is mediated by high-level, abstract representations. Two experiments used a sequential shape-matching task to examine the effects of size changes on unimodal and crossmodal visual and haptic object recognition. Participants felt or saw 3D plastic models of familiar objects. The two objects presented on a trial were either the same size or different sizes, and were either the same shape or different but similar shapes. Participants were told to ignore size changes and to match on shape alone. In Experiment 1, size changes on same-shape trials impaired performance similarly for both visual-to-visual and haptic-to-haptic shape matching. In Experiment 2, size changes impaired performance on both visual-to-haptic and haptic-to-visual shape matching, and there was no interaction between the cost of size changes and the direction of transfer. Together, the unimodal and crossmodal matching results suggest that the same size-specific perceptual representations underlie both visual and haptic object recognition, and indicate that crossmodal memory for objects must be at least partly based on common perceptual representations.

    Somatosensory processing in deaf and deafblind individuals: How does the brain adapt as a function of sensory and linguistic experience? A critical review

    How do deaf and deafblind individuals process touch? This question offers a unique model for understanding the prospects and constraints of neural plasticity. Our brain constantly receives and processes signals from the environment and combines them into the most reliable information content. The nervous system adapts its functional and structural organization according to the input, and perceptual processing develops as a function of individual experience. However, there are still many unresolved questions regarding the deciding factors for these changes in deaf and deafblind individuals, and findings so far are not consistent. To date, most studies have not taken the sensory and linguistic experience of the included participants into account. As a result, the impact of sensory deprivation vs. language experience on somatosensory processing remains inconclusive. Even less is known about the impact of deafblindness on brain development. The resulting neural adaptations could be even more substantial, but no clear patterns have yet been identified. How do deafblind individuals process sensory input? Studies on deafblindness have mostly focused on single cases or on groups of late-blind individuals. Importantly, the language backgrounds of deafblind communities are highly variable and include the use of tactile languages. So far, this kind of linguistic experience and its consequences have not been considered in studies of basic perceptual functions. Here, we provide a critical review of the literature, aiming to identify determinants of neuroplasticity and gaps in our current knowledge of somatosensory processing in deaf and deafblind individuals.