
    Determining the Haptic Feedback Position for Optimizing the Targeting Performance on Ultrasonic Tactile Displays

    Alongside questions of how to create haptic effects on displays via alternative hardware, recent work has explored rendering options with respect to haptic effects, i.e. when and where to provide haptic feedback. In particular, recent work by Zhang and Harrison on electrostatic haptic feedback found that the optimal technique for haptic feedback during interaction is the Fill technique, where haptic effects are rendered at all times when a user's finger is within the bounds of the target. In this paper, we explore whether this result generalizes to an alternative haptic rendering technology that uses ultrasonic vibrations to create haptic sensations, a technique called the "Squeeze Film Effect". In contrast to prior work, our results indicate that positioning the haptic feedback as a discrete linear stimulus centred on the target provides an optimal trade-off between speed, accuracy, and user preference. We highlight the implications of this work for the generalizability of haptic feedback: haptic feedback can improve time, errors, and user satisfaction during interaction, but only if the correct form of feedback is used for the specific haptic effect generated by the hardware.
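The two rendering strategies compared above can be sketched as simple position tests. This is an illustrative sketch only: the coordinates, target bounds, and band width are hypothetical values, not parameters from the paper.

```python
def fill_feedback(finger_x, target_left, target_right):
    """'Fill' rendering: haptics are on whenever the finger is anywhere
    inside the target bounds."""
    return target_left <= finger_x <= target_right

def centred_stimulus_feedback(finger_x, target_left, target_right, band=2.0):
    """Discrete linear stimulus: haptics are on only within a narrow band
    centred on the target (the band width here is an illustrative choice)."""
    centre = (target_left + target_right) / 2
    return abs(finger_x - centre) <= band / 2

# A finger at the target's edge triggers Fill but not the centred stimulus:
at_edge = fill_feedback(10.0, 10.0, 30.0)               # True
at_edge_centred = centred_stimulus_feedback(10.0, 10.0, 30.0)  # False
```

The behavioural difference is exactly what the study compares: Fill keeps the actuator active across the whole target, while the centred stimulus delivers one discrete cue as the finger crosses the target's middle.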

    Exploring Fingers' Limitation of Texture Density Perception on Ultrasonic Haptic Displays

    Recent research in haptic feedback is motivated by the crucial role that tactile perception plays in everyday touch interactions. In this paper, we describe psychophysical experiments investigating the perceptual thresholds of individual fingers on both the right and left hands of right-handed participants, using active dynamic touch for spatial period discrimination of both sinusoidal and square-wave gratings on ultrasonic haptic touchscreens. Both one-finger and multi-finger touch were studied and compared. Our results indicate that finger identity (index finger, middle finger, etc.) significantly affects the perception of both gratings in the case of one-finger exploration. We show that the index finger and thumb are the most sensitive in all conditions, whereas the little finger, followed by the ring finger, are the least sensitive for haptic perception. For multi-finger exploration, the right hand was found to be more sensitive than the left hand for both gratings. Our findings also demonstrate similar perception sensitivity between multi-finger exploration and the index finger of users' right hands (i.e. the dominant hand in our study), while a significant difference was found between single- and multi-finger perception sensitivity for the left hand.
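The two grating types used in such experiments can be sketched as friction-modulation profiles over finger position: a sinusoid varies smoothly over one spatial period, while a square wave alternates abruptly between two levels. The amplitudes and periods below are illustrative, not the stimulus values from the study.

```python
import math

def sinusoidal_grating(x, spatial_period, amplitude=1.0):
    """Modulation level at position x (same units as spatial_period)
    for a sinusoidal grating."""
    return amplitude * math.sin(2 * math.pi * x / spatial_period)

def square_grating(x, spatial_period, amplitude=1.0):
    """Square-wave grating: full amplitude for the first half of each
    period, reversed amplitude for the second half."""
    return amplitude if (x % spatial_period) < spatial_period / 2 else -amplitude
```

Spatial period discrimination then amounts to asking whether a participant can tell two such profiles apart when only `spatial_period` differs between them.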

    Modern Applications of Electrostatics and Dielectrics

    Electrostatics and dielectric materials have important applications in modern society and, as such, require continually improved characteristics: more and more equipment needs to operate at high frequency, high voltage, high temperature, and under other harsh conditions. This book presents an overview of modern applications of electrostatics and dielectrics, as well as research progress in the field.

    Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback

    The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become more and more commonplace, showing up on mobile devices, on vending machines, on ATMs and in the control panels of machines in industry, where conventional input devices cannot provide intuitive, rapid and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored interfaces on touchscreens, and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard and static surfaces of conventional touchscreens challenges interface design and touchscreen usability, in particular in distracting, low-visibility environments. Current technology allows the human tactile modality to be used in touchscreens. While the visual channel converts graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user, as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selecting and controlling more intuitive. Tactile features can compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden. In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded.
As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, in order to give users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then further investigated with the aim of mapping haptic signals to different usage scenarios to perform primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces and touchscreen applications.
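The idea of mapping haptic signals to UI controls can be illustrated as a lookup from control events to short tactile patterns. Everything below is hypothetical: the event names, waveforms, frequencies, and durations are illustrative placeholders, not the designs developed in the thesis.

```python
# Hypothetical tactile-pattern table: each UI event gets a short burst
# described by (waveform, frequency_hz, duration_ms).
TACTILE_PATTERNS = {
    "button_press": ("pulse", 150, 20),   # single crisp click
    "slider_detent": ("pulse", 100, 10),  # light tick at each detent
    "edge_of_list": ("buzz", 250, 40),    # stronger cue at a boundary
}

def pattern_for(control_event):
    """Look up the tactile pattern for a UI event; None means no haptics."""
    return TACTILE_PATTERNS.get(control_event)
```

Such a table makes the mapping between usage scenarios and haptic signals explicit, which is the design question the study investigates for primary and secondary touchscreen tasks.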

    Ambient Intelligence for Next-Generation AR

    Next-generation augmented reality (AR) promises a high degree of context-awareness: a detailed knowledge of the environmental, user, social and system conditions in which an AR experience takes place. This will facilitate both the closer integration of the real and virtual worlds, and the provision of context-specific content or adaptations. However, environmental awareness in particular is challenging to achieve using AR devices alone; not only is these mobile devices' view of an environment spatially and temporally limited, but the data obtained by onboard sensors is frequently inaccurate and incomplete. This, combined with the fact that many aspects of core AR functionality and user experience are impacted by properties of the real environment, motivates the use of ambient IoT devices (wireless sensors and actuators placed in the surrounding environment) for the measurement and optimization of environment properties. In this book chapter, we categorize and examine the wide variety of ways in which these IoT sensors and actuators can support or enhance AR experiences, including quantitative insights and proof-of-concept systems that will inform the development of future solutions. We outline the challenges and opportunities associated with several important research directions which must be addressed to realize the full potential of next-generation AR. (This is a preprint of a book chapter that will appear in the Springer Handbook of the Metaverse.)

    Enhancing interaction in mixed reality

    With continuous technological innovation, we observe mixed reality emerging from research labs into the mainstream. The arrival of capable mixed reality devices transforms how we are entertained, consume information, and interact with computing systems, with the most recent being able to present synthesized stimuli to any of the human senses and substantially blur the boundaries between the real and virtual worlds. In order to build expressive and practical mixed reality experiences, designers, developers, and stakeholders need to understand and meet its upcoming challenges. This research contributes a novel taxonomy for categorizing mixed reality experiences and guidelines for designing mixed reality experiences. We present the results of seven studies examining the challenges and opportunities of mixed reality experiences, the impact of modalities and interaction techniques on the user experience, and how to enhance the experiences. We begin with a study determining user attitudes towards mixed reality in domestic and educational environments, followed by six research probes that each investigate an aspect of reality or virtuality. In the first, a levitating steerable projector enables us to investigate how the real world can be enhanced without instrumenting the user. We show that the presentation of in-situ instructions for navigational tasks leads to a significantly higher ability to observe and recall real-world landmarks. With the second probe, we enhance the perception of reality by superimposing information usually not visible to the human eye. In amplifying the human vision, we enable users to perceive thermal radiation visually. Further, we examine the effect of substituting physical components with non-functional tangible proxies or entirely virtual representations. With the third research probe, we explore how to enhance virtuality to enable a user to input text on a physical keyboard while being immersed in the virtual world. 
Our prototype tracked the user’s hands and keyboard to enable generic text input. Our analysis of text entry performance showed the importance and effect of different hand representations. We then investigate how to touch virtuality by simulating generic haptic feedback for virtual reality, and show how tactile feedback through quadcopters can significantly increase the sense of presence. Our final research probe investigates the usability and input space of smartphones within mixed reality environments, pairing the user’s smartphone as an input device with a secondary physical screen. Based on our learnings from these individual research probes, we developed a novel taxonomy for categorizing mixed reality experiences and guidelines for designing them. The taxonomy is based on the human sensory system and human capabilities of articulation. We showcase its versatility, and set our research probes into perspective, by organizing them inside the taxonomic space. The design guidelines are divided into user-centered and technology-centered guidelines. It is our hope that these will contribute to the bright future of mixed reality systems while emphasizing the new underlying interaction paradigm.

    Investigating the Feasibility of Using Focussed Airborne Ultrasound as Tactile Feedback in Medical Simulators

    Novice medical practitioners commonly practice on live patients in real medical procedures. However, due to the inexperience of the practitioner, mistakes are likely, exposing the patient to undue risk. To improve the training of novices, medical simulators create a virtual patient, providing a safe environment for the user to practice within. An important clinical skill is palpation, a physical examination technique in which practitioners use their hands to feel the body of the patient to make a diagnosis. A virtual patient has a visual representation, but as it is virtual, the patient is not physically present. Haptics technology provides additional benefit to the training session by stimulating the physical sense of touch. A novel technique for stimulating tactile sensation has recently emerged: acoustic radiation pressure from focussed airborne ultrasound. Acoustic radiation pressure creates a focal point of concentrated acoustic pressure in a three-dimensional field, producing a force in mid-air. Airborne ultrasound has several advantages over conventional technologies. It was also initially theorised that, compared to a previous system called PalpSim (which consists of a water-filled rubber tube permanently embedded in a block of silicone), airborne ultrasound would offer better controllability over the displayed sensation, allowing various tactile sensations to be simulated. This thesis has investigated the feasibility of using focussed airborne ultrasound as tactile feedback in medical simulators. A completely custom-built tactile device called UltraSendo was developed to simulate an arterial pulse and a thrill sensation. UltraSendo was integrated with an augmented reality simulator displaying a virtual patient for user interaction. The simulator was brought to Ysbyty Glan Clwyd hospital for user feedback, and a wide range of user responses were gathered.
The majority of respondents felt the arterial pulse was not sufficiently realistic, whereas the thrill sensation received higher ratings and was judged acceptably realistic. This positive feedback suggests that airborne ultrasound can indeed provide tactile feedback in a medical context and is better at simulating a thrill sensation than a pulse sensation.
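Focussed airborne ultrasound arrays steer their focal point by delaying each transducer so that all wavefronts arrive at the focus in phase. A minimal sketch of that delay computation follows; the array layout, speed of sound, and focal position are illustrative assumptions, and this is not the UltraSendo implementation.

```python
import math

SPEED_OF_SOUND = 346.0  # m/s in air at roughly room temperature (assumption)

def focus_delays(transducer_positions, focal_point):
    """Per-transducer firing delays (seconds) so that wavefronts from all
    transducers coincide at focal_point.

    The transducer farthest from the focus fires first (zero delay);
    nearer transducers wait for the difference in travel time.
    """
    dists = [math.dist(p, focal_point) for p in transducer_positions]
    t_max = max(dists) / SPEED_OF_SOUND
    return [t_max - d / SPEED_OF_SOUND for d in dists]

# Illustrative 2x2 array on the z=0 plane, focus 0.2 m above its centre:
array = [(0.0, 0.0, 0.0), (0.01, 0.0, 0.0), (0.0, 0.01, 0.0), (0.01, 0.01, 0.0)]
delays = focus_delays(array, (0.005, 0.005, 0.2))
# By symmetry, all four transducers are equidistant from the focus,
# so all delays come out to zero.
```

Driving each element with its computed delay concentrates acoustic pressure at the focal point, producing the mid-air force the abstract describes.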

    Modeling of frictional forces during bare-finger interactions with solid surfaces

    Touching an object with our fingers yields frictional forces that allow us to perceive and explore its texture, shape, and other features, facilitating grasping and manipulation. While the relevance of dynamic frictional forces to sensory and motor function in the hand is well established, the way that they reflect the shape, features, and composition of touched objects is poorly understood. Haptic displays (electronic interfaces for stimulating the sense of touch) often aim to elicit the perceptual experience of touching real surfaces by delivering forces to the fingers that mimic those felt when touching real surfaces. However, the design and applications of such displays have been limited by the lack of knowledge about what forces are felt during real touch interactions. This represents a major gap in current knowledge about tactile function and haptic engineering, and this dissertation addresses some aspects that would assist in closing it. The goal of this research was to measure, characterize, and model frictional forces produced by a bare finger sliding over surfaces of multiple shapes. The major contributions of this work are (1) the design and development of a sensing system for capturing fingertip motion and forces during tactile exploration of real surfaces; (2) measurement and characterization of contact forces and the deformation of finger tissues during sliding over relief surfaces; (3) the development of a low-order model of frictional force production based on surface specifications; and (4) the analysis and modeling of contact geometry, interfacial mechanics, and their effects on frictional force production during tactile exploration of relief surfaces. This research aims to guide the design of algorithms for the haptic rendering of surface textures and shape.
Such algorithms can be used to enhance human-machine interfaces, such as touch-screen displays, by (1) enabling users to feel surface characteristics also presented visually; (2) facilitating interaction with these devices; and (3) reducing the need for visual input to interact with them.
Ph.D., Electrical Engineering -- Drexel University, 201
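A "low-order model of frictional force production" can be illustrated with a one-parameter-pair sliding-friction law. Fingertip friction is often reported to grow sub-linearly with normal load because the soft, Hertz-like contact area grows slower than the load; the coefficient and exponent below are illustrative placeholders, not the dissertation's fitted values.

```python
def fingertip_friction(normal_force, mu=1.5, exponent=0.8):
    """Low-order sliding-friction sketch: F_f = mu * F_n ** exponent.

    A sub-linear exponent (< 1) reflects the soft fingertip contact,
    where real contact area grows slower than the applied load.
    mu and exponent are hypothetical example values.
    """
    return mu * normal_force ** exponent
```

With `exponent < 1`, quadrupling the pressing force less than quadruples the friction force, which is the qualitative behaviour such low-order fingertip models aim to capture.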

    A reconfigurable tactile display based on polymer MEMS technology

    This research focuses on the development of polymer microfabrication technologies for the realization of two major components of a pneumatic tactile display: a microactuator array and a complementary microvalve (control) array. The concept, fabrication, and characterization of a kinematically-stabilized polymeric microbubble actuator (an "endoskeletal microbubble actuator") are presented. A systematic design and modeling procedure was carried out to generate an optimized geometry of the corrugated diaphragm to satisfy the membrane deflection, force, and stability requirements set forth by the tactile display goals. A refreshable Braille cell has been developed as a tactile display prototype, based on a 2x3 endoskeletal microbubble array and an array of commercial valves. The prototype can provide both a static display (which meets the displacement and force requirements of a Braille display) and vibratory tactile sensations. Along with the above capabilities, the device was designed to meet the criteria of lightness and compactness to permit portable operation. The design is scalable with respect to the number of tactile actuators while still being simple to fabricate. In order to further reduce the size and cost of the tactile display, a microvalve array can be integrated into the tactile display system to control the pneumatic fluid that actuates the microbubble actuator. A piezoelectrically-driven and hydraulically-amplified polymer microvalve has been designed, fabricated, and tested. An incompressible elastomer was used as a solid hydraulic medium to convert the small axial displacement of a piezoelectric actuator into a large valve head stroke while maintaining a large blocking force. The function of the microvalve as an on-off switch for a pneumatic microbubble tactile actuator was demonstrated.
To further reduce the cost of the microvalve, a laterally-stacked multilayer PZT actuator has been fabricated using diced PZT multilayers, high-aspect-ratio SU-8 photolithography, and molding of electrically conductive polymer composite electrodes.
Ph.D. Committee Chair: Allen, Mark; Committee Member: Bucknall, David; Committee Member: Book, Wayne; Committee Member: Griffin, Anselm; Committee Member: Yao, Donggan
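The 2x3 refreshable Braille cell described above maps each of the six Braille dots to one pneumatic actuator switched by a valve. The control mapping can be sketched as below; the dot patterns follow standard 6-dot Braille numbering (dots 1-3 down the left column, 4-6 down the right), but the function names and the idea of a per-character table are illustrative, not the authors' driver code.

```python
# Standard 6-dot Braille patterns for a few letters (raised dot numbers).
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
}

def valve_states(char):
    """On/off states for the six valves driving a 2x3 cell, in dot order 1-6.

    True opens the valve, inflating the corresponding microbubble actuator.
    """
    raised = BRAILLE_DOTS[char.lower()]
    return [dot in raised for dot in range(1, 7)]
```

Refreshing the cell for a new character then reduces to recomputing the six boolean states and switching the valves, which is exactly the on-off role demonstrated for the microvalve.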