37 research outputs found

    RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback

    We present RealPen, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen. The aim is to create a more engaging experience when writing on touch surfaces, such as the screens of tablet computers. This is achieved by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper. To generate realistic tactile feedback, our algorithm analyzes the frequency spectrum of the friction oscillation generated when writing with traditional tools, extracts the principal frequencies, and uses the actuator's frequency response profile as an adjustment weighting function. We enhance the realism by aligning the sound feedback with the writing pressure and speed. Furthermore, we investigated the effects of superposition and fluctuation of several frequencies on human tactile perception, evaluated the performance of RealPen, and characterized users' perception and preference for each feedback type.
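RealPen's actual signal-processing code is not given in the abstract; the following is a minimal sketch of the general pipeline it describes: take the frequency spectrum of a recorded friction oscillation, extract the principal frequencies, and weight each component by the actuator's frequency response so weak bands are driven harder. All names, parameter values, and the `actuator_response` callable are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def principal_frequencies(signal, sample_rate, n_peaks=5):
    """Extract the strongest frequency components of a recorded
    friction-oscillation signal (names and defaults are illustrative)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Indices of the n_peaks largest spectral magnitudes, strongest first
    top = np.argsort(spectrum)[-n_peaks:][::-1]
    return freqs[top], spectrum[top]

def weight_for_actuator(freqs, amps, actuator_response):
    """Compensate each component by the actuator's measured frequency
    response, so bands the actuator reproduces weakly are boosted."""
    gains = np.array([1.0 / max(actuator_response(f), 1e-3) for f in freqs])
    return amps * gains
```

In practice the extracted components would then be summed into a drive waveform and modulated by writing pressure and speed, as the abstract describes.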

    An Overview of Wearable Haptic Technologies and Their Performance in Virtual Object Exploration.

    We often interact with our environment through manual handling of objects and exploration of their properties. Object properties (OP) such as texture, stiffness, size, shape, temperature, weight, and orientation provide the information necessary to perform interactions successfully, and the human haptic perception system plays a key role in this. As virtual reality (VR) has become a growing field of interest with many applications, adding haptic feedback to virtual experiences is another step towards more realistic virtual interactions. However, integrating haptics in a realistic manner requires complex technological solutions and actual user testing in virtual environments (VEs) for verification. This review provides a comprehensive overview of recent wearable haptic devices (HDs), categorized by the OP exploration for which they have been verified in a VE. We found 13 studies that specifically addressed user testing of wearable HDs in healthy subjects. We map and discuss the different technological solutions for different OP explorations that are useful for the design of future haptic object interactions in VR, and provide recommendations for future work.

    Touchmover: Actuated 3d touchscreen with haptic feedback

    This paper presents the design and development of a novel visual+haptic device that co-locates 3D stereo visualization, direct touch and touch force sensing with a robotically actuated display. Our actuated immersive 3D display, called TouchMover, is capable of providing 1D movement (up to 36 cm) and force feedback (up to 230 N) in a single dimension, perpendicular to the screen plane. In addition to describing the details of our design, we showcase how TouchMover allows the user to: 1) interact with 3D objects by pushing them on the screen with realistic force feedback, 2) touch and feel the contour of a 3D object, 3) explore and annotate volumetric medical images (e.g., MRI brain scans) and 4) experience different activation forces and stiffness when interacting with common 2D on-screen elements (e.g., buttons). We also contribute the results of an experiment demonstrating the effectiveness of the device's haptic output: people can disambiguate between 10 different 3D shapes with the same 2D footprint by touch alone, without any visual feedback (85% recognition rate, 12 participants).
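The abstract does not specify how TouchMover computes its opposing force, but pushing a virtual object "with realistic force feedback" on a 1-DoF actuated screen is commonly rendered with a penalty-based spring model. The sketch below shows that general idea; the spring model itself and the stiffness value are assumptions, and only the 230 N limit comes from the paper.

```python
def contact_force(screen_z, surface_z, stiffness=2000.0, max_force=230.0):
    """Penalty-based force rendering for a 1-DoF actuated display:
    the deeper the screen is pushed past the virtual surface, the
    harder the device pushes back. Positions in metres, force in
    newtons; stiffness is an illustrative value, max_force is the
    230 N actuator limit reported for TouchMover."""
    penetration = surface_z - screen_z  # depth past the virtual surface
    if penetration <= 0:
        return 0.0  # no contact yet
    return min(stiffness * penetration, max_force)  # clip to actuator limit
```

Varying `stiffness` per on-screen element is one plausible way to realize the different activation forces for 2D controls mentioned in the abstract.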

    Development of a telepresence manipulation system

    Master's thesis (Master of Engineering)

    Design and Effect of Continuous Wearable Tactile Displays

    Our sense of touch is one of our core senses and, while not as information-rich as sight and hearing, it tethers us to reality. Our skin is the largest sensory organ in our body, and we rely on it so much that we don't think about it most of the time. Tactile displays are currently understudied and underused: with the exception of actuators for notifications on smartphones and smartwatches, tactile cues are mostly used to notify the user of an incoming call or text message. Continuous displays in particular - displays that do not just send one notification but stay active for an extended period of time and continuously communicate information - are rarely studied. This thesis explores the use of our vibration perception to create continuous tactile displays. Transmitting a continuous stream of tactile information to a user in a wearable format can help elevate tactile displays from notification channels to something more like additional senses, enabling us to perceive our environment in new ways. This work provides a significant step forward in the design, effect and use of continuous tactile displays in human-computer interaction. The main contributions include:

    Exploration of Continuous Wearable Tactile Interfaces. This thesis explores continuous tactile displays in different contexts and with different types of tactile information systems. Use cases were explored in several domains for tactile displays - sports, gaming and business applications. The continuous tactile displays feature one- or multidimensional tactile patterns, temporal patterns and discrete tactile patterns.

    Automatic Generation of Personalized Vibration Patterns. A novel approach to designing vibrotactile patterns without expert knowledge, leveraging evolutionary algorithms to create personalized vibration patterns, is described. The thesis presents the design of a human-centered evolutionary algorithm that generates abstract vibration patterns. The algorithm was tested in a user study, which offered evidence that interactive generation of abstract vibration patterns is possible and yields diverse sets of vibration patterns that can be recognized with high accuracy.

    Passive Haptic Learning for Vibration Patterns. Previous studies in passive haptic learning have shown surprisingly strong results for learning Morse code. If these findings could be confirmed and generalized, learning a new tactile alphabet could be made easier and accomplished in passing. This claim was therefore investigated in this thesis and needed to be corrected and contextualized. A user study examined the effects of interaction design and distraction tasks on the ability to learn stimulus-stimulus associations through passive haptic learning. The thesis presents evidence that passive haptic learning of vibration patterns induces only a marginal learning effect and is not a feasible or efficient way to learn vibration patterns that include more than two vibrations.

    Influence of Reference Frames for Spatial Tactile Stimuli. Designing wearable tactile stimuli that contain spatial information can be challenging due to the natural body movement of the wearer, so an important consideration is which reference frame to use for spatial cues. This thesis compared allocentric and egocentric reference frames on the wrist for induced cognitive load, reaction time and accuracy in a user study. It presents evidence that an allocentric reference frame drastically lowers cognitive load and slightly lowers reaction time while maintaining the same accuracy as an egocentric reference frame, making a strong case for allocentric reference frames in tactile bracelets with several tactile actuators.
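The thesis's evolutionary algorithm is not reproduced in the abstract; the following is a generic interactive evolutionary loop of the kind it describes, where user ratings stand in for a fitness function. The genome encoding (amplitude/duration pairs), the mutation and crossover operators, and the elitism settings are all illustrative assumptions.

```python
import random

def random_pattern(length=4):
    """A pattern genome: (amplitude 0-1, duration in ms) pairs."""
    return [(random.random(), random.randint(50, 400)) for _ in range(length)]

def mutate(pattern, rate=0.3):
    """Randomly perturb some segments of a pattern."""
    out = []
    for amp, dur in pattern:
        if random.random() < rate:
            amp = min(1.0, max(0.0, amp + random.gauss(0, 0.2)))
            dur = max(50, dur + random.randint(-100, 100))
        out.append((amp, dur))
    return out

def crossover(a, b):
    """Single-point crossover of two pattern genomes."""
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def evolve(population, user_ratings, elite=2):
    """One interactive generation: users rate each pattern, the
    highest-rated patterns survive unchanged, and the rest of the
    next population is bred from the top-rated parents."""
    ranked = [p for _, p in sorted(zip(user_ratings, population),
                                   key=lambda x: x[0], reverse=True)]
    next_pop = ranked[:elite]  # elitism: keep the best as-is
    while len(next_pop) < len(population):
        parents = random.sample(ranked[:max(elite, 3)], 2)
        next_pop.append(mutate(crossover(*parents)))
    return next_pop
```

Repeating `evolve` over several rating rounds personalizes the pattern set to the individual user, which is the core idea behind interactive evolutionary pattern generation.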

    Interaction Methods for Smart Glasses : A Survey

    Since the launch of Google Glass in 2014, smart glasses have mainly been designed to support micro-interactions. The ultimate goal of their becoming an augmented reality interface has not yet been attained due to an encumbrance of controls. Augmented reality involves superimposing interactive computer graphics onto physical objects in the real world. This survey reviews current research issues in the area of human-computer interaction for smart glasses. It first studies the smart glasses available on the market and then investigates the interaction methods proposed in the wide body of literature. The interaction methods can be classified into hand-held, touch, and touchless input; this paper focuses mainly on touch and touchless input. Touch input can be further divided into on-device and on-body, while touchless input can be classified into hands-free and freehand. Next, we summarize the existing research efforts and trends, in which touch and touchless input are evaluated against a total of eight interaction goals. Finally, we discuss several key design challenges and the possibility of multi-modal input for smart glasses.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback

    The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become more and more commonplace, showing up on mobile devices, on vending machines, on ATMs and in the control panels of machines in industry, where conventional input devices cannot provide intuitive, rapid and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored interfaces on touchscreens, and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard, static surfaces of conventional touchscreens challenges interface design and touchscreen usability, particularly in distracting, low-visibility environments. Current technology allows the human tactile modality to be used in touchscreens. While the visual channel converts graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selecting and controlling a more intuitive process. Tactile features can also compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden. In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded.
As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, in order to give users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then further investigated with the aim of mapping haptic signals to different usage scenarios to perform primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces and touchscreen applications.
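One common way to realize the detents mentioned in this abstract is to fire a short tactile pulse each time a slider or knob value crosses a virtual detent position, mimicking the click stops of a physical control. The thesis does not give an implementation; this is a minimal sketch in which the normalized 0-1 positions and the detent spacing are assumptions.

```python
import math

def detent_crossings(prev, curr, spacing=0.1):
    """Count virtual detent positions crossed as a slider moves from
    prev to curr (positions normalized to 0-1). Each crossing would
    trigger one tactile pulse; spacing is an illustrative value."""
    lo, hi = sorted((prev, curr))
    return math.floor(hi / spacing) - math.floor(lo / spacing)
```

Counting crossings rather than checking exact positions keeps the cue robust when the finger skips several detents between two input samples.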