38 research outputs found

    Sensory mechanisms involved in obtaining frictional information for perception and grip force adjustment during object manipulation

    Sensory signals about the frictional properties of a surface serve two purposes: perception, to experience material properties, and motor control, to handle objects with adequate manipulative forces. There are fundamental differences between these two purposes and in the scenarios in which the sensory information is typically obtained. This thesis explores the mechanisms involved in perceiving the frictional properties of touched surfaces under conditions relevant to object manipulation. First, I show that in a passive-touch condition, when a surface is brought into contact with an immobilised finger, humans are unable to use the available friction-related mechanical cues and to associate them perceptually with frictional properties. However, a lateral movement in the submillimeter range significantly improved subjects' ability to evaluate the frictional properties of two otherwise identical surfaces. I then demonstrate that partial slips within the contact area and fingertip tissue deformation create potent sensory stimuli, enabling tactile afferents to signal friction-dependent mechanical effects that translate into slipperiness (friction) perception. Further, I show that natural movement kinematics promote the development of such small skin displacements within the contact area and may play a central role in enabling the perception of surface slipperiness and the adjustment of grip force to friction when manipulating objects, demonstrating the intimate interdependence of the motor and sensory systems. This work extends our understanding of the fundamental tactile sensory processes involved in friction signaling in the context of motor control and dexterous object manipulation. The discovered friction-sensing principles may assist in designing haptic rendering devices and artificial tactile sensors, as well as associated control algorithms, for robotic grippers and hand prostheses.
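    The grip-force adjustment described above can be illustrated with the classic slip condition from the object-manipulation literature: the tangential load must stay below the friction coefficient times the normal (grip) force. The following minimal sketch assumes a simple Coulomb friction model and an arbitrary illustrative safety margin; it is not taken from the thesis itself.

```python
def minimum_grip_force(load_force_n, friction_coefficient):
    """Smallest normal (grip) force that prevents slip under Coulomb friction.

    Slip is avoided while load_force <= friction_coefficient * grip_force,
    so the minimum grip force is load / mu.
    """
    if friction_coefficient <= 0:
        raise ValueError("friction coefficient must be positive")
    return load_force_n / friction_coefficient


def adjusted_grip_force(load_force_n, friction_coefficient, safety_margin=0.3):
    """Grip force with a proportional safety margin above the slip limit.

    The 30% default margin is illustrative only; human safety margins vary
    with task, surface, and individual.
    """
    return (1.0 + safety_margin) * minimum_grip_force(load_force_n, friction_coefficient)


# Example: lifting a 2 N load on a slippery (mu = 0.4) vs. a grippy (mu = 1.2) surface.
for mu in (0.4, 1.2):
    print(f"mu={mu}: grip force >= {adjusted_grip_force(2.0, mu):.2f} N")
```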

    How touch and hearing influence visual processing in sensory substitution, synaesthesia and cross-modal correspondences

    Sensory substitution devices (SSDs) systematically convert visual dimensions into patterns of tactile or auditory stimulation. After training, a user of these devices learns to translate these auditory or tactile sensations back into a mental visual picture. Most previous SSDs translate greyscale images using intuitive cross-sensory mappings to help users learn the devices; more recent SSDs, however, have started to incorporate additional colour dimensions such as saturation and hue. Chapter two examines how previous SSDs have translated the complexities of colour into hearing or touch, exploring whether colour is useful for SSD users, how SSD and veridical colour perception differ, and how optimal cross-sensory mappings might be chosen. After long-term training, some blind users of SSDs report visual sensations arising from tactile or auditory stimulation. A related phenomenon is synaesthesia, a condition in which stimulation of one modality (e.g. touch) produces an automatic, consistent and vivid sensation in another modality (e.g. vision). Tactile-visual synaesthesia is an extremely rare variant that can shed light on how the tactile-visual system is altered when touch can elicit visual sensations. Chapter three reports a series of investigations of the tactile discrimination abilities and phenomenology of tactile-vision synaesthetes, alongside questionnaire data from synaesthetes unavailable for testing. Chapter four introduces a new SSD to test whether presenting colour information in sensory substitution affects object and colour discrimination. Chapter five presents experiments on intuitive auditory-colour mappings across a wide variety of sounds; these findings are used to predict the colour hallucinations reported after LSD use while listening to the same sounds. Chapter six uses a new sensory substitution device designed to test the utility of these intuitive sound-colour links for visual processing. The findings are discussed with reference to how cross-sensory links, LSD and synaesthesia can inform optimal SSD design for visual processing.
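    To make the idea of a cross-sensory mapping concrete, the sketch below sonifies one column of a greyscale image, mapping vertical position to pitch and brightness to loudness. This is a schematic analogue of common SSD schemes, not the mapping of any particular device described in the thesis; the frequency range and column size are illustrative assumptions.

```python
import numpy as np


def column_to_tone(column, duration_s=0.1, sample_rate=44100,
                   f_low=200.0, f_high=4000.0):
    """Sonify one greyscale image column (values in 0..1).

    Higher rows map to higher frequencies (log-spaced) and brighter pixels
    to louder partials -- a schematic version of common SSD mappings.
    """
    n_rows = len(column)
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    freqs = np.geomspace(f_low, f_high, n_rows)[::-1]  # row 0 (top) = highest pitch
    signal = np.zeros_like(t)
    for brightness, freq in zip(column, freqs):
        signal += brightness * np.sin(2 * np.pi * freq * t)
    peak = np.max(np.abs(signal))
    return signal / peak if peak > 0 else signal


# Example: a column that is bright at the top and dark at the bottom.
tone = column_to_tone(np.linspace(1.0, 0.0, 16))
```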

    Sensory and cognitive factors in multi-digit touch, and its integration with vision

    Every tactile sensation – an itch, a kiss, a hug, a pen gripped between fingers, a soft fabric brushing against the skin – is experienced in relation to the body. Normally, such sensations occur somewhere on the body's surface: they have spatiality. This spatiality is what allows us to perceive a partner's caress in terms of its changing location on the skin, its movement direction, speed, and extent. How this spatiality arises and how it is experienced is a thriving research topic, driven by growing interest in the nature of tactile experience, from product design to brain-machine interfaces. The present thesis adds to this flourishing area of research by examining the unified spatial quality of touch: how does distinct spatial information from separate areas of the body surface converge to give rise to our normally unified experience of touch? After explaining the importance of this question in Chapter 1, a novel paradigm to tackle it will be presented, in which participants estimate the average direction of two stimuli moved simultaneously across two different fingerpads. This paradigm is a laboratory analogue of the more ecological task of representing the overall movement of an object held between multiple fingers. An EEG study in Chapter 2 will reveal a brain mechanism that could support such aggregated perception. Next, by characterising participants' performance not just in terms of error rates but also in terms of perceptual sensitivity, bias, precision, and signal weighting, a series of psychophysical experiments will show that this aggregation ability differs for within- and between-hand perception (Chapter 3), is independent of somatotopically defined circuitry (Chapter 4), and arises after proprioceptive input about hand posture is taken into account (Chapter 5). Finally, inspired by the demand for integrated tactile and visual experience in virtual reality and by the potential of tactile interfaces to aid navigation, Chapter 6 will examine the contribution of tactile spatiality to visual spatial experience. Ultimately, the present thesis will reveal sensory factors that limit the precise representation of concurrent dynamic tactile events and will point to cognitive strategies the brain may employ to overcome those limitations and perceive coherent objects through touch. As such, this thesis advances somatosensory research beyond merely examining selectivity to, and discrimination between, experienced tactile inputs, towards the unified experience of touch despite distinct stimulus elements. The findings also have practical implications for the design of functional tactile interfaces.
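    The averaging paradigm can be sketched as a cue-combination problem: each finger supplies a noisy estimate of motion direction, and a combined estimate can be formed as a reliability-weighted circular mean. The inverse-variance weighting below is the standard scheme from the cue-integration literature, used here purely as an illustration rather than as the model tested in the thesis; the noise values in the example are made up.

```python
import numpy as np


def combine_directions(theta1, theta2, sigma1, sigma2):
    """Reliability-weighted average of two motion directions (radians).

    Weights are inverse variances, as in standard optimal cue combination;
    directions are averaged on the circle via unit vectors to avoid
    wrap-around artefacts near +/- 180 degrees.
    """
    w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
    mean_vector = (w1 * np.exp(1j * theta1) + w2 * np.exp(1j * theta2)) / (w1 + w2)
    return np.angle(mean_vector)


# Example: one finger senses 30 deg (precise), the other 60 deg (noisier);
# the combined estimate is pulled toward the more reliable finger.
avg = combine_directions(np.deg2rad(30), np.deg2rad(60), sigma1=5.0, sigma2=15.0)
print(f"combined direction: {np.rad2deg(avg):.1f} deg")
```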

    Microscope Embedded Neurosurgical Training and Intraoperative System

    In recent years, neurosurgery has been strongly influenced by new technologies. Computer Aided Surgery (CAS) offers several benefits for patient safety, but fine techniques aimed at minimally invasive and minimally traumatic treatment are required, since false intra-operative movements can be devastating and result in patient deaths. The precision of the surgical gesture depends both on the accuracy of the available technological instruments and on the surgeon's experience. In this frame, medical training is particularly important. From a technological point of view, the use of Virtual Reality (VR) for surgeon training and Augmented Reality (AR) for intra-operative treatment offers the best results. Traditional techniques for surgical training also include the use of animals, phantoms and cadavers; their main limitation is that live tissue has different properties from dead tissue and that animal anatomy differs significantly from human anatomy. From the medical point of view, Low-Grade Gliomas (LGGs) are intrinsic brain tumours that typically occur in younger adults. The objective of treatment is to remove as much of the tumour as possible while minimizing damage to the healthy brain. Pathological tissue may closely resemble normal brain parenchyma when viewed through the neurosurgical microscope, and the tactile appreciation of the different consistency of tumour compared with normal brain requires considerable experience on the part of the neurosurgeon; it is a vital skill. The first part of this PhD thesis presents a system for realistic simulation (visual and haptic) of spatula palpation of an LGG. This is the first prototype of a training system for neurosurgery that combines VR, haptics and a real microscope. The architecture can also be adapted for intra-operative purposes, in which case the surgeon needs the basic setup for Image Guided Therapy (IGT) interventions: microscope, monitors and navigated surgical instruments. The same virtual environment can be AR-rendered onto the microscope optics, with the objective of enhancing the surgeon's intra-operative orientation by providing a three-dimensional view and other information necessary for safe navigation inside the patient. These considerations motivated the second part of this work, devoted to improving a prototype of an AR stereoscopic microscope for neurosurgical interventions developed at our institute in previous work. Completely new software was developed to reuse the microscope hardware while enhancing both rendering performance and usability. Since AR and VR share the same platform, the system can be described as a Mixed Reality System for neurosurgery. All components are open source or at least based on a GPL license.
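    At the core of a visuo-haptic palpation simulator of this kind is a force-rendering loop that converts the penetration of the virtual instrument into a reaction force at each update. The sketch below uses a generic spring-damper contact model with hypothetical stiffness and damping values for "normal" versus "tumour-like" tissue; the actual tissue model and parameters of the prototype are not reproduced here.

```python
def contact_force(penetration_m, velocity_m_s, stiffness_n_m, damping_n_s_m):
    """Spring-damper contact force for a 1-D haptic rendering loop.

    velocity_m_s is the tool velocity along the outward surface normal
    (negative while pressing in). Returns the force (N) pushing the tool
    out of the tissue; clamped at zero so damping never pulls it inward.
    """
    if penetration_m <= 0.0:
        return 0.0
    force = stiffness_n_m * penetration_m - damping_n_s_m * velocity_m_s
    return max(force, 0.0)


# Illustrative (hypothetical) parameters: a stiffer response for tumour tissue.
NORMAL_TISSUE = dict(stiffness_n_m=300.0, damping_n_s_m=2.0)
TUMOUR_TISSUE = dict(stiffness_n_m=900.0, damping_n_s_m=4.0)

# Example: 2 mm penetration while pressing inward at 1 cm/s.
print(contact_force(0.002, -0.01, **NORMAL_TISSUE))  # softer reaction
print(contact_force(0.002, -0.01, **TUMOUR_TISSUE))  # stiffer reaction
```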

    Haptic Media Scenes

    The aim of this thesis is to apply new media phenomenology and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in shaping cognitive processes and does not capture the richness of haptic communication in interaction design: haptic interactivity in HCI has historically been designed and analyzed from a view of communication as transmission, the sending and receiving of haptic signals. Yet the haptic sense can mediate not only contact confirmation and affirmation but also rich semiotic and affective messages, and there is a strong contrast between this inherent capacity of haptic perception and current support for haptic communication interfaces. I therefore ask: how do the haptic senses (touch and proprioception) affect our cognitive faculties when mediated through digital and sensor technologies, and how may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings spanning two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. From these insights I develop the concept of the haptic figura, an analytical tool for framing the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems whose material solution supports active haptic perception and the mediation of semiotic and affective messages that are both understood and felt. As such, the concept can serve as a design tool for developers and as a lens for media critics evaluating haptic media. It is used here to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and for the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element of personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages beyond notification and confirmation interactivity.

    Activity in area V3A predicts positions of moving objects

    No description supplied

    Haptics Rendering and Applications

    There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of society's activities, including communication, education, art, entertainment, commerce and science, would change forever if we learned how to capture, manipulate and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we can communicate information in a physically based language that has never been explored before. Given the constant improvement in haptic technology and the increasing research into and development of haptics-related algorithms, protocols and devices, there is good reason to believe that haptics technology has a promising future.

    Swayed by sound: sonic guidance as a neurorehabilitation strategy in the cerebellar ataxias

    Cerebellar disease leads to problems in controlling movement; the most common difficulties are dysmetria and instability when standing. Recent understanding of cerebellar function has expanded to include non-motor aspects such as emotional, cognitive and sensory processing. Deficits in the acquisition and processing of sensory information are one explanation for the movement problems observed in cerebellar ataxia: such deficits impair the ability to make predictions about future events, a primary function of the cerebellum. A question, therefore, is whether augmenting or replacing sensory information can improve motor performance in cerebellar disease. This thesis tests that question by augmenting sensory information through the provision of an auditory movement guide. A variable described in motor control theory (tau) was used to develop auditory guides that were continuous and dynamic. A reaching experiment with healthy individuals showed that the timing of peak velocity, audiomotor coordination accuracy, and velocity of approach could be altered in line with the movement parameters embedded in the auditory guides. The thesis then investigated the use of these sonic guides in a clinical population with cerebellar disease. Performance on neurorehabilitation exercises for balance control was tested in twenty people with cerebellar atrophy, with and without auditory guides. The results suggested that continuous, predictive, dynamic auditory guidance is an effective way of improving movement smoothness (as measured by jerk) in ataxia. In addition, generating and swaying with imagined auditory guides was also found to increase movement smoothness in cerebellar disease. Following these tests of instantaneous effects, the thesis investigated the long-term consequences for motor behaviour of a two-month programme of exercise with auditory guides. Seven people with cerebellar atrophy were assessed pre- and post-intervention on two measures, weight-shifting and walking. The weight-shifting results indicated that the sonic-guide exercise programme does not produce long-term changes in motor behaviour. Whilst there were minor improvements in walking, given the weight-shifting results these could not be attributed to the sonic guides. This finding confirms the difficulty of motor rehabilitation in people with cerebellar disease. The thesis contributes original findings to the field of neurorehabilitation by first showing that ongoing, predictive stimuli are an appropriate tool for improving motor behaviour. In addition, it is the first of its kind to apply externally presented guides that convey continuous, meaningful information in a clinical population. Finally, the findings show that sensory augmentation through the auditory domain is an effective way of improving motor coordination in some forms of cerebellar disease.
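    For readers unfamiliar with tau theory, the tau of a gap is the gap divided by its rate of closure, and a movement can be guided by coupling its tau onto an intrinsic tau-guide. The sketch below generates a gap-closure profile from the commonly cited intrinsic tau-guide under constant-ratio coupling and maps it onto pitch; the constants, coupling value and sonification step are illustrative assumptions, not the specific guides used in the thesis.

```python
import numpy as np


def tau_guided_gap(duration_s, k=0.5, initial_gap=1.0, n_samples=200):
    """Gap closure under tau coupling to the intrinsic tau-guide.

    Assuming tau_x(t) = k * tau_G(t) with tau_G(t) = 0.5 * (t - T**2 / t),
    the gap follows x(t) = x0 * ((T**2 - t**2) / T**2) ** (1 / k),
    reaching zero exactly at t = T. Smaller k gives a gentler final approach.
    """
    t = np.linspace(0.0, duration_s, n_samples)
    T = duration_s
    gap = initial_gap * ((T**2 - t**2) / T**2) ** (1.0 / k)
    return t, gap


# Example: map the closing gap onto pitch so the guide "arrives" as the gap closes.
t, gap = tau_guided_gap(duration_s=2.0, k=0.5)
pitch_hz = 220.0 + 440.0 * (1.0 - gap)  # illustrative sonification: rising pitch
```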