316 research outputs found

    Modeling of frictional forces during bare-finger interactions with solid surfaces

    Touching an object with our fingers yields frictional forces that allow us to perceive and explore its texture, shape, and other features, facilitating grasping and manipulation. While the relevance of dynamic frictional forces to sensory and motor function in the hand is well established, the way that they reflect the shape, features, and composition of touched objects is poorly understood. Haptic displays (electronic interfaces for stimulating the sense of touch) often aim to elicit the perceptual experience of touching real surfaces by delivering forces to the fingers that mimic those felt during real touch. However, the design and applications of such displays have been limited by the lack of knowledge about what forces are felt during real touch interactions. This represents a major gap in current knowledge about tactile function and haptic engineering, and this dissertation addresses several aspects of it. The goal of this research was to measure, characterize, and model the frictional forces produced by a bare finger sliding over surfaces of multiple shapes. The major contributions of this work are (1) the design and development of a sensing system for capturing fingertip motion and forces during tactile exploration of real surfaces; (2) the measurement and characterization of contact forces and the deformation of finger tissues during sliding over relief surfaces; (3) the development of a low-order model of frictional force production based on surface specifications; and (4) the analysis and modeling of contact geometry, interfacial mechanics, and their effects on frictional force production during tactile exploration of relief surfaces. This research aims to guide the design of algorithms for the haptic rendering of surface textures and shapes.
Such algorithms can be used to enhance human-machine interfaces, such as touch-screen displays, by (1) enabling users to feel surface characteristics that are also presented visually; (2) facilitating interaction with these devices; and (3) reducing the need for visual input when interacting with them. Ph.D., Electrical Engineering -- Drexel University, 201
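The abstract does not specify the form of the low-order friction model. As an illustrative sketch only, skin-friction models in the tribology literature often combine a Coulomb term with a weakly load- and speed-dependent friction coefficient; every parameter value below is hypothetical, not taken from the dissertation:

```python
# Hypothetical low-order sliding-friction model for a bare fingertip.
# Parameter values are illustrative only, not from the dissertation.

def friction_force(normal_force, speed, mu0=0.7, k_load=-0.1, k_speed=0.05):
    """Tangential friction force (N) from normal load (N) and sliding speed (m/s).

    Soft-tissue friction is often modeled with a load-dependent coefficient
    (mu ~ N**k with k < 0) and a weak linear speed dependence.
    """
    mu = mu0 * max(normal_force, 1e-6) ** k_load * (1.0 + k_speed * speed)
    return mu * normal_force

# Example: 0.5 N normal load at a 20 mm/s sliding speed.
f = friction_force(0.5, 0.02)
```

A model of this low order is attractive for haptic rendering because it can be evaluated per time step from only the measured normal force and slider velocity.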

    Communication of Digital Material Appearance Based on Human Perception

    In daily life, we encounter digital materials and interact with them in numerous situations, for instance when we play computer games, watch a movie, see a billboard in the metro station, or buy new clothes online. While some of these virtual materials are given by computational models that describe the appearance of a particular surface based on its material and the illumination conditions, others are presented as simple digital photographs of real materials, as is usually the case for material samples from online retailing stores. The use of computer-generated materials entails significant advantages over plain images, as they allow realistic experiences in virtual scenarios, cooperative product design, advertising in the prototype phase, or the exhibition of furniture and wearables in specific environments. However, even though exceptional material reproduction quality has been achieved in the domain of computer graphics, current technology is still far from highly accurate photo-realistic virtual material reproductions for the wide range of existing categories; for this reason, many material catalogs still use pictures or even physical material samples to illustrate their collections. An important reason for this gap between digital and real material appearance is that the connections between physical material characteristics and the visual quality perceived by humans are far from well understood. Our investigations intend to shed some light in this direction.
    Concretely, we explore the ability of state-of-the-art digital material models to communicate physical and subjective material qualities, observing that part of the tactile/haptic information (e.g., thickness, hardness) is missing due to the geometric abstractions intrinsic to the model. Consequently, in order to account for the information lost during the digitization process, we investigate the interplay between different sensing modalities (vision and hearing) and discover that particular sound cues, in combination with visual information, facilitate the estimation of such tactile material qualities. One of the shortcomings when studying material appearance is the lack of perceptually derived metrics able to answer questions like "are materials A and B more similar than C and D?", which arise in many computer graphics applications. In the absence of such metrics, our studies compare different appearance models in terms of how capable they are of depicting a collection of meaningful perceptual qualities. To address this problem, we introduce a methodology to compute the perceived pairwise similarity between textures from material samples that makes use of patch-based texture synthesis algorithms and is inspired by the notion of Just-Noticeable Differences. Our technique is able to overcome some of the issues posed by previous texture similarity collection methods and produces meaningful distances between samples. In summary, with the contents presented in this thesis we delve deeply into how humans perceive digital and real materials through different senses, acquire a better understanding of texture similarity by developing a perceptually based metric, and provide a groundwork for further investigations into the perception of digital materials.
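The abstract names a JND-inspired pairwise similarity procedure without specifying it. One way such a measure could be operationalized, sketched here under assumptions (the interpolation levels, detection rates, and 75% threshold are all hypothetical), is to read off the synthesis-interpolation level at which observers reliably detect a patch as different from the reference texture:

```python
# Sketch of a JND-style pairwise texture distance (hypothetical data).
# Each trial: an observer judges whether a synthesized patch interpolated
# between textures A and B is distinguishable from A. The interpolation
# level at which detection first reaches threshold serves as the distance.

def jnd_distance(levels, detected, threshold=0.75):
    """Smallest interpolation level whose detection rate reaches threshold."""
    for level, rate in sorted(zip(levels, detected)):
        if rate >= threshold:
            return level
    return max(levels)  # never reliably detected: textures maximally similar

levels   = [0.1, 0.2, 0.3, 0.4, 0.5]
detected = [0.40, 0.55, 0.80, 0.95, 1.00]  # hypothetical detection rates
d = jnd_distance(levels, detected)  # -> 0.3
```

Anchoring the distance in a detection threshold, rather than in raw rating scales, is what ties the metric to "just-noticeable" perceptual differences.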

    Abstracts from CIP 2007: Segundo Congreso Ibérico de Percepción

    No abstract available.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in the following topical sections: haptic science; haptic technology; and haptic applications.

    Adaptation to moving tactile stimuli and its effects on perceived speed and direction

    Like other senses, tactile perception is subject to adaptation effects in which systematic changes in the pattern of sensory input lead to predictable changes in perception. In this thesis, aftereffects of adaptation to tactile motion are used to reveal the processes that give rise to tactile motion perception from the relevant sensory inputs. The first aftereffect is the tactile speed aftereffect (tSAE), in which the speed of motion appears slower following exposure to a moving surface. The perceived speed of a test surface was reduced by about 30% regardless of the direction of the adapting stimulus, indicating that the tSAE is not direction sensitive. Additionally, higher adapting speeds produced a stronger tSAE, and this dependence on adapting speed could not be attributed to differences in temporal frequency or spatial period that accompanied the different adapting speeds. The second motion aftereffect investigated is the dynamic tactile motion aftereffect (tMAE), in which a direction-neutral test stimulus appears to move in the opposite direction to previously felt adapting motion. The strength of the tMAE depended on the speed of the adapting motion, with higher speeds producing a stronger aftereffect. Both the tSAE and the tMAE showed evidence of an intensive speed code in their underlying neural populations, with faster adapting speeds resulting in stronger aftereffects. In neither case was any evidence of speed tuning found; that is, neither aftereffect was strongest when the speeds of the adapting and test stimuli matched. This is compatible with the response properties of motion-sensitive neurons in the primary somatosensory cortex. Despite these shared features, speed and direction are unlikely to be jointly coded in the same neurons, because the lack of direction sensitivity of the tSAE requires neural adaptation effects to be uniform across neurons preferring all directions, whereas the tMAE requires direction-selective adaptation.
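The roughly 30% slowing can be expressed as a standard aftereffect-strength ratio, the fractional change in perceived speed after adaptation. The speeds in this sketch are illustrative values, not data from the thesis:

```python
# Aftereffect strength as commonly quantified in adaptation studies:
# the proportional reduction in perceived speed after adaptation.
# The example speeds are illustrative, not the thesis data.

def aftereffect_strength(perceived_baseline, perceived_adapted):
    """Fractional reduction in perceived speed following adaptation."""
    return (perceived_baseline - perceived_adapted) / perceived_baseline

# A test surface perceived at 40 mm/s before adaptation but 28 mm/s after
# corresponds to the ~30% slowing reported for the tSAE.
s = aftereffect_strength(40.0, 28.0)  # -> 0.3
```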

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research, 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it gives further weight to the argument that objects may be stored in and retrieved from a pre-attentional store during this task.

    Haptic perception in virtual reality in sighted and blind individuals

    The incorporation of the sense of touch into virtual reality is an exciting development. However, research into this topic is in its infancy. This experimental programme investigated both the perception of virtual object attributes by touch and the parameters that influence touch perception in virtual reality with a force feedback device called the PHANTOM (TM) (www.sensable.com). The thesis had three main foci. Firstly, it aimed to provide an experimental account of the perception of the attributes of roughness, size, and angular extent by touch via the PHANTOM (TM) device. Secondly, it aimed to contribute to the resolution of a number of other issues important in developing an understanding of the parameters that exert an influence on touch in virtual reality. Finally, it aimed to compare touch in virtual reality between sighted and blind individuals. This thesis comprises six experiments. Experiment one examined the perception of the roughness of virtual textures with the PHANTOM (TM) device. The effect of the following factors was addressed: the groove width of the textured stimuli; the endpoint used (stylus or thimble) with the PHANTOM (TM); the specific device used (PHANTOM (TM) vs. IE3000); and the visual status (sighted or blind) of the participants. Experiment two extended the findings of experiment one by addressing the impact of an exploration-related factor on perceived roughness: the contact force an individual applies to a virtual texture. The interaction between this variable and the factors of groove width, endpoint, and visual status was also addressed. Experiment three examined the perception of the size and angular extent of virtual 3-D objects via the PHANTOM (TM). With respect to the perception of virtual object size, the effect of the following factors was addressed: the size of the object (2.7, 3.6, 4.5 cm); the type of virtual object (cube vs. sphere); the mode in which the virtual objects were presented; the endpoint used with the PHANTOM (TM); and the visual status of the participants. With respect to the perception of virtual object angular extent, the effect of the following factors was addressed: the angular extent of the object (18, 41, and 64°); the endpoint used with the PHANTOM (TM); and the visual status of the participants. Experiment four examined the perception of the size and angular extent of real counterparts to the virtual 3-D objects used in experiment three, manipulating the conditions under which participants examined the real objects. Participants were asked to give judgements of object size and angular extent via the deactivated PHANTOM (TM), a stylus probe, a bare index finger, and without any constraints on their exploration. In addition to this exploration-type factor, experiment four examined the impact of the same factors on perceived size and angular extent in the real world as had been examined in virtual reality. Experiments five and six examined the consistency of the perception of linear extent across the 3-D axes in virtual space. Both experiments manipulated the following factors: line extent (2.7, 3.6, and 4.5 cm); line dimension (x, y, and z axis); movement type (active vs. passive movement); and visual status. Experiment six additionally manipulated the direction of movement within the 3-D axes. Perceived roughness was assessed by the method of magnitude estimation. The perceived size and angular extent of the various virtual stimuli and their real counterparts were assessed by the method of magnitude reproduction. This technique was also used to assess perceived extent across the 3-D axes. Touch perception via the PHANTOM (TM) was found to be broadly similar for sighted and blind participants. Touch perception in virtual reality was also found to be broadly similar between two different 3-D force feedback devices (the PHANTOM (TM) and the IE3000). However, the endpoint used with the PHANTOM (TM) device was found to exert significant but inconsistent effects on the perception of virtual object attributes. Touch perception with the PHANTOM (TM) across the 3-D axes was found to be anisotropic in a similar way to the real world, with the illusion that radial extents were perceived as longer than equivalent tangential extents. The perception of 3-D object size and angular extent was found to be comparable between virtual reality and the real world, particularly under conditions where the participants' exploration of the real objects was constrained to a single point of contact. An intriguing touch illusion, whereby virtual objects explored from the inside were perceived to be larger than the same objects perceived from the outside, was found to occur widely in virtual reality, in addition to the real world. This thesis contributes to knowledge of touch perception in virtual reality. The findings have interesting implications for theories of touch perception, both virtual and real.
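Magnitude-estimation data of the kind described above are conventionally summarized with Stevens' power law, R = k·S^n, fitted as a straight line in log-log coordinates. The groove widths and roughness estimates below are invented for illustration; only the fitting procedure is standard:

```python
import math

# Stevens' power-law fit, R = k * S**n, via least squares on log(R) vs log(S).
# The groove widths and roughness estimates are hypothetical, for illustration.

def fit_power_law(stimuli, responses):
    """Fit log(R) = log(k) + n*log(S); return (k, n)."""
    xs = [math.log(s) for s in stimuli]
    ys = [math.log(r) for r in responses]
    count = len(xs)
    mx, my = sum(xs) / count, sum(ys) / count
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope = power-law exponent
    k = math.exp(my - n * mx)                # intercept back-transformed
    return k, n

groove_width = [0.25, 0.5, 1.0, 2.0]   # mm (hypothetical)
roughness    = [2.0, 3.1, 4.9, 7.8]    # median magnitude estimates (hypothetical)
k, n = fit_power_law(groove_width, roughness)
```

The exponent n compactly expresses how steeply perceived roughness grows with groove width, which makes it convenient for comparing conditions such as stylus versus thimble endpoints.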