    Haptic curvature contrast in raised lines and solid shapes

    It is known that our senses are influenced by contrast effects and aftereffects. For haptic perception, the curvature aftereffect has been studied in depth, but little is known about curvature contrast. In this study we let observers explore two shapes simultaneously. The shape felt by the index finger could be either flat or convexly curved. The curvature at the thumb was varied to quantify the curvature of a subjectively flat shape. We found that when the index finger was presented with a convex shape, a flat shape at the thumb was also perceived to be convex. The effect is rather strong: on average, 20% of the contrasting curvature. The contrast effect was present for both raised-line stimuli and solid shapes. Movement measurements revealed that the curvature of the path taken by the metacarpus (the part of the hand that connects the fingers) was approximately the average of the path curvatures taken by the thumb and index finger. A failure to correct for the movement of the hand could explain the contrast effect.
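    A rough numerical sketch of this account follows (Python; the curvature values are assumed for illustration, and this is not the authors' model). It simply restates the two quantitative observations above: the metacarpus path curvature is about the mean of the two finger paths, and the contrast effect is about 20% of the contrasting curvature.

```python
# Illustrative numbers only (assumed); not the authors' model.
index_curvature = 4.0   # 1/m, convex shape under the index finger (assumed value)
thumb_curvature = 0.0   # 1/m, physically flat shape under the thumb

# Reported observation: the metacarpus path curvature is roughly the
# average of the thumb and index finger path curvatures.
metacarpus_curvature = (index_curvature + thumb_curvature) / 2.0

# Reported effect size: ~20% of the contrasting curvature is attributed
# to the subjectively flat shape.
contrast_fraction = 0.20
perceived_thumb_curvature = contrast_fraction * index_curvature

print(f"metacarpus path curvature:  {metacarpus_curvature:.2f} 1/m")
print(f"perceived 'flat' curvature: {perceived_thumb_curvature:.2f} 1/m")
```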

    Wearable haptic systems for the fingertip and the hand: taxonomy, review and perspectives

    In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy, grounded machines to lightweight devices that naturally fit our bodies. Only recently, however, have haptic systems started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces are capable of communicating with their human wearers during interaction with the environment they share, in a natural and yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on systems that directly address wearability challenges. The paper also discusses the main technological and design challenges for the development of wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.

    How do humans mediate with the external physical world? From perception to control of articulated objects

    Many actions in our daily life involve the operation of articulated tools. Despite the ubiquity of articulated objects in daily life, the human ability to perceive their properties and to control them has been scarcely studied. Articulated objects are composed of links connected by revolute or prismatic joints; moving one part of the linkage results in the movement of the other parts. Reaching a position with the tip of a tool requires adapting the motor commands to the position of the end-effector, which differs from reaching the same position with the hand. The dynamic properties of articulated bodies are complex and vary during movement. For instance, apparent mass, a quantity that characterizes the dynamic interaction with the articulated object, varies as a function of its configuration. An actuated articulated system can generate a static but position-dependent force field with constant torques about its joints. There is evidence that internal models are involved in the perception and control of tools. In the present work, we investigate several aspects of the perception and control of articulated objects and address two questions. First, how do people perceive the kinematic and dynamic properties of articulated objects during haptic interaction? Second, what effect does seeing the tool have on the planning and execution of reaching movements with a complex tool? Does a visual representation of the mechanism's structure help the reaching movement, and how? To address these questions, 3D-printed physical articulated objects and robotic systems were designed and developed for the psychophysical studies. The present work comprises three studies on different aspects of the perception and control of articulated objects. We first conducted haptic size-discrimination tasks using three different types of objects, namely wooden boxes, an actuated apparatus with two movable flat surfaces, and large pliers, in unimanual, bimanual-grounded, and bimanual-free conditions. We found that bimanual integration occurred in particular in the free manipulation of objects. The second study examined visuomotor reaching with complex tools. We found that seeing the mechanism of the tool, even briefly at the beginning of the trial, improved reaching performance. The last study concerned force perception; the evidence showed that people could use the force field at the end-effector to infer the torques about the joints generated by the articulated system.
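    The claim that an actuated articulated system generates a static but position-dependent force field from constant joint torques has a compact worked form. The sketch below (a minimal Python illustration with assumed link lengths and torques, not the apparatus used in the thesis) computes the endpoint force F = J(q)^{-T} tau for a planar two-link arm and shows that the same constant torques produce different endpoint forces at different configurations.

```python
# Minimal sketch: constant joint torques -> position-dependent endpoint force.
# Link lengths and torques are assumed values, not those of the thesis setup.
import numpy as np

L1, L2 = 0.3, 0.3           # link lengths in metres (assumed)
tau = np.array([1.0, 0.5])  # constant joint torques in N*m (assumed)

def jacobian(q1, q2):
    """Geometric Jacobian of a planar 2R arm (endpoint x, y vs. joint angles)."""
    return np.array([
        [-L1 * np.sin(q1) - L2 * np.sin(q1 + q2), -L2 * np.sin(q1 + q2)],
        [ L1 * np.cos(q1) + L2 * np.cos(q1 + q2),  L2 * np.cos(q1 + q2)],
    ])

def endpoint_force(q1, q2):
    """Static endpoint force for constant joint torques: solve J^T F = tau."""
    return np.linalg.solve(jacobian(q1, q2).T, tau)

# Same torques, different configurations -> different forces (a force field).
for q1, q2 in [(0.2, 0.8), (0.6, 1.2), (1.0, 0.5)]:
    fx, fy = endpoint_force(q1, q2)
    print(f"q = ({q1:.1f}, {q2:.1f}) rad -> F = ({fx:+.2f}, {fy:+.2f}) N")
```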

    Haptics disambiguates vision in the perception of pictorial relief

    Haptic perception

    Fueled by novel applications, interest in haptic perception is growing. This paper provides an overview of the state of the art of a number of important aspects of haptic perception. By means of touch we can perceive not only quite different material properties, such as roughness, compliance, friction, coldness and slipperiness, but also spatial properties, such as shape, curvature, size and orientation. Moreover, the number of objects we have in our hand can be determined, either by counting or by subitizing. All these aspects are presented and discussed in this paper. Although our intuition tells us that touch provides us with veridical information about our environment, the existence of prominent haptic illusions shows otherwise. Knowledge about haptic perception is interesting from a fundamental viewpoint, but it is also of great importance for the technological development of haptic devices. At the end of this paper, a few recent applications are presented.

    A mechatronic shape display based on auxetic materials

    Shape displays enable people to touch simulated surfaces. A common architecture for such devices uses a mechatronic pin matrix. Besides their complexity and high cost, these matrix displays suffer from sharp edges due to their discrete representation, which reduces their ability to render a large continuous surface when the hand slides across them. We propose using an engineered auxetic material actuated by a smaller number of motors. The material bends in multiple directions, feeling smooth and rigid to the touch. A prototype implementation uses nine actuators on a 220 mm square section of material. It can display a range of surface curvatures under the palm of a user without aliased edges. In this work we use an auxetic skeleton to provide rigidity to a soft material and demonstrate the potential of this class of surface through user experiments.
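    To make the rendering idea concrete, here is a minimal sketch (Python with assumed geometry and actuator values; it is not the paper's control software). Nine actuator heights on a 220 mm square are interpolated by a smooth spline, which stands in for the continuous bending of the auxetic material between actuation points.

```python
# Minimal sketch: nine actuator heights -> a smooth continuous surface.
# Geometry and displacement values are assumed for illustration.
import numpy as np
from scipy.interpolate import RectBivariateSpline

side = 0.220                         # 220 mm square section of material
xa = ya = np.linspace(0.0, side, 3)  # 3 x 3 = nine actuator positions (assumed layout)
heights = np.array([                 # actuator displacements in metres (assumed)
    [0.000, 0.004, 0.000],
    [0.004, 0.010, 0.004],
    [0.000, 0.004, 0.000],
])

# A smooth quadratic spline stands in for the bending auxetic sheet,
# avoiding the aliased edges of a discrete pin matrix.
surface = RectBivariateSpline(xa, ya, heights, kx=2, ky=2)

# The rendered height can be queried anywhere under the palm.
print(f"height at centre: {surface(side / 2, side / 2)[0, 0] * 1000:.1f} mm")
```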

    Global surface features contribute to human haptic roughness estimations

    Previous studies have paid special attention to the relationship between local features (e.g., raised dots) and human roughness perception. However, the relationship between global features (e.g., curved surfaces) and haptic roughness perception is still unclear. In the present study, a series of roughness-estimation experiments was performed to investigate how global features affect human roughness perception. In each experiment, participants were asked to estimate the roughness of a series of haptic stimuli that combined local features (raised dots) and global features (sinusoidal-like curves). The experiments were designed to reveal whether global features changed participants’ haptic roughness estimations. Furthermore, the present study tested whether the exploration method (direct, indirect, or static) changed haptic roughness estimations and examined the contribution of global features to roughness estimations. The results showed that sinusoidal-like curved surfaces with small periods were perceived to be rougher than those with large periods, and neither the direction of finger movement nor indirect exploration changed this phenomenon. Furthermore, the influence of global features on roughness was modulated by local features, regardless of whether raised-dot or smooth surfaces were used. Taken together, these findings suggest that an object’s global features contribute to haptic roughness perception, while local features change the weight of the contribution that global features make.
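    The stimulus construction lends itself to a compact illustration. The sketch below (Python; all dimensions are assumed, not those of the study) builds a one-dimensional height profile that superimposes local raised dots on a sinusoidal-like global carrier, the two feature scales whose interaction the experiments probe.

```python
# Minimal sketch of a combined local + global haptic stimulus profile.
# All dimensions are assumed for illustration.
import numpy as np

x = np.linspace(0.0, 0.10, 1000)  # 100 mm of surface (assumed extent)

# Global feature: sinusoidal-like curve (smaller period -> felt as rougher).
global_period = 0.02              # 20 mm period (assumed)
global_amplitude = 0.002          # 2 mm amplitude (assumed)
carrier = global_amplitude * np.sin(2 * np.pi * x / global_period)

# Local feature: periodic raised dots.
dot_spacing = 0.003               # 3 mm between dots (assumed)
dot_height = 0.0005               # 0.5 mm dot height (assumed)
dots = dot_height * (np.cos(2 * np.pi * x / dot_spacing) > 0.9)

profile = carrier + dots          # the combined stimulus surface
print(f"peak-to-peak height: {(profile.max() - profile.min()) * 1000:.2f} mm")
```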

    Modelling the human perception of shape-from-shading

    Shading conveys information about 3-D shape, and the process of recovering this information is called shape-from-shading (SFS). This thesis divides the process of human SFS into two functional sub-units (luminance disambiguation and shape computation) and studies them individually. Based on the results of a series of psychophysical experiments, it is proposed that the interaction between first- and second-order channels plays an important role in disambiguating luminance. Based on this idea, two versions of a biologically plausible model are developed to explain the human performance observed here and elsewhere. An algorithm sharing the same idea is also developed as a solution to the problem of intrinsic image decomposition in the field of image processing. With regard to the shape-computation unit, a link between luminance variations and estimated surface normals is identified by testing participants on simple gratings with several different luminance profiles. This methodology is unconventional but can be justified in the light of past studies of human SFS. Finally, a computational algorithm for SFS containing two distinct operating modes is proposed. This algorithm is broadly consistent with the known psychophysics of human SFS.
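    For readers unfamiliar with the forward problem that SFS inverts, the sketch below implements the standard Lambertian image-formation model, L = max(0, n · s) (a textbook model, not the algorithm proposed in the thesis; the surface and light direction are assumed values).

```python
# Minimal sketch of Lambertian shading: the forward model that SFS inverts.
# Surface and light direction are assumed; this is not the thesis's algorithm.
import numpy as np

def lambertian_shading(z, light=(0.3, 0.3, 0.9)):
    """Render luminance L = max(0, n . s) from a height map z."""
    s = np.asarray(light, dtype=float)
    s /= np.linalg.norm(s)                       # unit light direction
    zy, zx = np.gradient(z)                      # surface slopes
    n = np.dstack([-zx, -zy, np.ones_like(z)])   # surface normals (unnormalised)
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return np.clip(n @ s, 0.0, None)             # one luminance value per pixel

# Example: the shaded image of a smooth bump, the input SFS would try to invert.
yy, xx = np.mgrid[-1:1:64j, -1:1:64j]
image = lambertian_shading(np.exp(-4 * (xx**2 + yy**2)))
print(image.shape, float(image.max()))
```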

    Spatial representation and visual impairment - Developmental trends and new technological tools for assessment and rehabilitation

    It is well known that perception is mediated by the five sensory modalities (sight, hearing, touch, smell and taste), which allow us to explore the world and build a coherent spatio-temporal representation of the surrounding environment. Typically, our brain collects and integrates coherent information from all the senses to build a reliable spatial representation of the world. In this sense, perception does not emerge from the individual activity of distinct sensory modalities operating as separate modules, but rather from multisensory integration processes. The interaction occurs whenever inputs from the senses are coherent in time and space (Eimer, 2004). Therefore, spatial perception emerges from the contribution of unisensory and multisensory information, with a predominant role of visual information for space processing during the first years of life. Although a growing body of research indicates that visual experience is essential for developing spatial abilities, to date very little is known about the mechanisms underpinning spatial development when the visual input is impoverished (low vision) or missing (blindness). The main aim of this thesis is to increase knowledge about the impact of visual deprivation on spatial development and consolidation, and to evaluate the effects of novel technological systems designed to quantitatively improve perceptual and cognitive spatial abilities in cases of visual impairment. Chapter 1 summarizes the main research findings on the role of vision and multisensory experience in spatial development. Overall, these findings indicate that visual experience facilitates the acquisition of allocentric spatial capabilities, namely perceiving space according to a perspective different from that of our body. It might therefore be stated that the sense of sight allows a more comprehensive representation of spatial information, since it is based on environmental landmarks that are independent of body perspective. Chapter 2 presents original studies that I carried out as a Ph.D. student to investigate the mechanisms underpinning spatial development and to compare the spatial performance of individuals with affected and typical visual experience, i.e., visually impaired and sighted participants. Overall, these studies suggest that vision facilitates the spatial representation of the environment by conveying the most reliable spatial reference, i.e., allocentric coordinates. However, when visual feedback is permanently or temporarily absent, as in congenitally blind or blindfolded individuals respectively, compensatory mechanisms may support the refinement of haptic and auditory spatial coding abilities. The studies presented in this chapter validate novel experimental paradigms to assess the role of haptic and auditory experience in spatial representation based on external (i.e., allocentric) frames of reference. Chapter 3 describes the validation of new technological systems based on unisensory and multisensory stimulation, designed to rehabilitate spatial capabilities in cases of visual impairment. Overall, the technological validation of these new devices provides the opportunity to develop an interactive platform for rehabilitating spatial impairments following visual deprivation.
    Finally, Chapter 4 summarizes the findings reported in the previous chapters, focusing on the consequences of visual impairment for the development of unisensory and multisensory spatial experience in visually impaired children and adults compared with sighted peers. It also highlights the potential of the novel experimental tools both to assess spatial competencies in response to unisensory and multisensory events and to train the residual sensory modalities within a multisensory rehabilitation framework.
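    The egocentric/allocentric distinction that runs through this thesis can be made concrete with a small coordinate transform. In the illustrative sketch below (Python; poses and landmark values are assumed), two observers with different positions and headings both sense the same landmark "one metre straight ahead", yet the transform recovers the same allocentric (world) position, which is exactly the body-independence that allocentric coding provides.

```python
# Minimal sketch: egocentric (body-centred) vs. allocentric (world) coordinates.
# Poses and landmark values are assumed for illustration.
import numpy as np

def egocentric_to_allocentric(p_ego, body_pos, body_heading):
    """Rotate by the body heading, then translate by the body position."""
    c, s = np.cos(body_heading), np.sin(body_heading)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ np.asarray(p_ego) + np.asarray(body_pos)

landmark_ego = (1.0, 0.0)  # one metre straight ahead of the observer (assumed)

# Two different body poses, one allocentric landmark position:
print(egocentric_to_allocentric(landmark_ego, (2.0, 3.0), 0.0))        # -> [3. 3.]
print(egocentric_to_allocentric(landmark_ego, (3.0, 2.0), np.pi / 2))  # -> [3. 3.]
```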