
    Collection of Design Directions for the Realization of a Visual Interface with Haptic Feedback to Convey the Notion of Sonic Grain to DHH Students

    This paper presents the results of a survey campaign aimed at distilling design directions for the realization of a visual interface with haptic feedback. The interface is intended to ease conveying the concept of "sonic grain" to deaf and hard-of-hearing music students. Results from the questionnaire informed a prototype that exploits cross-modal associations among images, colors, sounds, and textures to render different types of sonic grains and offer a multisensory perceptual experience to users. This prototype is a promising starting point for further investigation into how visual, auditory, and haptic feedback can be jointly exploited to support more inclusive pedagogical approaches to music teaching.

    HapBead: on-skin microfluidic haptic interface using tunable bead

    On-skin haptic interfaces made of thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach creates mechanical forces via piezoelectric devices and other methods for non-vibratory haptic sensations such as stretching and twisting; these are often bulky, and their electronic components and associated drivers are complicated, with limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, generating various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement the thin, flexible, and easily affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, mixed-reality haptics, and visual-haptic displays.

    TACTILE TEXTURES FOR BACK OF SCREEN GESTURE DETECTION USING MOTION SENSOR DATA AND MACHINE LEARNING

    A computing device is described that uses motion data from motion sensors to detect gestures or user inputs, such as out-of-screen user inputs for mobile devices. In other words, the computing device detects gestures or touch inputs at locations of the device that do not include a touch screen, such as anywhere on the surface of the housing or the case of the device. A tactile texture is applied to the housing of the computing device or to a case coupled to the housing. The tactile texture causes the computing device to move in response to a user input applied to it, such as when a user's finger slides over the texture. A motion sensor (e.g., an inertial measurement unit (IMU), accelerometer, or gyroscope) generates motion data in response to detecting this motion. The motion data is then processed by an artificial neural network to infer attributes of the user input: the computing device applies a machine-learned model to the motion data (also referred to as sensor data or motion sensor data) to classify or label the attributes, characteristics, or qualities of the input. In this way, the computing device uses machine learning and motion data to classify attributes of the user input or gesture without the need for additional hardware, such as touch-sensitive devices and sensors.
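    The pipeline described above (motion data, feature extraction, learned classifier) can be sketched minimally. Note the description uses an artificial neural network on IMU data; the nearest-centroid classifier, feature choices, and toy gesture windows below are stand-in assumptions chosen only to keep the illustration dependency-free:

```python
import math

# Hypothetical sketch: telling a "tap" from a "swipe" on the device housing
# using only accelerometer windows. A nearest-centroid classifier stands in
# for the neural network of the original description.

def features(window):
    """Two simple per-window features: mean and peak acceleration magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in window]
    return (sum(mags) / len(mags), max(mags))

def train(labelled_windows):
    """Average the feature vectors per gesture label (the 'centroids')."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]; s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def classify(model, window):
    """Label a new window by its nearest centroid in feature space."""
    f = features(window)
    return min(model, key=lambda lab: (model[lab][0] - f[0]) ** 2
                                      + (model[lab][1] - f[1]) ** 2)

# Toy training data: taps are brief sharp spikes, swipes are steady motion.
tap = [(0, 0, 9.8)] * 8 + [(0, 0, 25.0)] * 2
swipe = [(1.5, 0, 9.8)] * 10
model = train([("tap", tap), ("swipe", swipe)])
print(classify(model, [(0, 0, 9.8)] * 7 + [(0, 0, 24.0)] * 3))  # → tap
```

    A real implementation would replace the two hand-picked features with a learned model over raw sensor windows, but the structure (window, featurize, classify) is the same.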

    Multichannel electrotactile feedback with spatial and mixed coding for closed-loop control of grasping force in hand prostheses

    Providing somatosensory feedback to the user of a myoelectric prosthesis is an important goal, since it can improve the utility of the assistive system as well as facilitate its embodiment. Most often, grasping force is selected as the feedback variable and communicated through one or more individual single-channel stimulation units (e.g., electrodes or vibration motors). In the present study, an integrated, compact, multichannel solution comprising an array electrode and a programmable stimulator is presented. Two coding schemes (15 levels), spatial and mixed (spatial and frequency) modulation, were tested in able-bodied subjects, psychometrically and in force control with routine grasping and force tracking using real and simulated prostheses. The results demonstrated that mixed and spatial coding, although substantially different in psychometric tests, resulted in similar performance during both force-control tasks. Furthermore, ideal visual feedback was not better than tactile feedback in routine grasping. To explain the observed results, a conceptual model is proposed emphasizing that performance depends on multiple factors, including feedback uncertainty, the nature of the task, and the reliability of feedforward control. The study outcomes, specific conclusions, and the general model are relevant for the design of closed-loop myoelectric prostheses utilizing tactile feedback.
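    The two 15-level coding schemes can be sketched as follows. The site layout, the three frequencies, and the level-to-site assignment are illustrative assumptions, not the paper's actual stimulation parameters:

```python
# Hedged sketch of 15-level spatial vs. mixed (spatial + frequency) coding
# of grasp force for a multichannel array electrode. All constants below
# are assumed for illustration.

N_LEVELS = 15
FREQS_HZ = (20, 50, 100)  # assumed low/mid/high stimulation frequencies

def quantize(force, max_force=1.0):
    """Map a measured grasp force onto one of 15 discrete levels (0..14)."""
    level = int(force / max_force * N_LEVELS)
    return min(max(level, 0), N_LEVELS - 1)

def spatial_code(level):
    """Spatial coding: the active stimulation site moves along the array
    with force; all 15 levels share one fixed frequency (15 sites assumed)."""
    return {"site": level, "freq_hz": 50}

def mixed_code(level):
    """Mixed coding: 5 sites x 3 frequencies also yield 15 distinct states,
    so fewer physical sites are needed."""
    return {"site": level // len(FREQS_HZ),
            "freq_hz": FREQS_HZ[level % len(FREQS_HZ)]}

print(spatial_code(quantize(0.5)))  # → {'site': 7, 'freq_hz': 50}
print(mixed_code(quantize(0.5)))    # → {'site': 2, 'freq_hz': 50}
```

    The design trade-off the study probes is visible even in this toy form: spatial coding needs one resolvable site per level, while mixed coding trades sites for a second perceptual dimension (frequency).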

    On the critical role of the sensorimotor loop on the design of interaction techniques and interactive devices

    People interact with their environment thanks to their perceptual and motor skills; this is how they both use the objects around them and perceive the world. Interactive systems are examples of such objects, so to design them we must understand how people perceive and manipulate them. For example, haptics relates both to the human sense of touch and to what I call the motor ability. I address a number of research questions related to the design and implementation of haptic, gestural, and touch interfaces, and present examples of contributions on these topics. More interestingly, perception, cognition, and action are not separate processes, but an integrated combination of them called the sensorimotor loop. Interactive systems follow the same overall scheme, with differences that make up the complementarity of humans and machines. The interaction phenomenon is a set of connections between human sensorimotor loops and interactive systems' execution loops. It connects inputs with outputs, users with systems, and the physical world with cognition and computing, in what I call the Human-System loop. This model provides a complete overview of the interaction phenomenon and helps to identify the limiting factors of interaction that we can address to improve the design of interaction techniques and interactive devices.

    Electroencephalographic Responses to Frictional Stimuli: Measurement Setup and Processing Pipeline

    Tactility is a key sense in the human interaction with the environment. The understanding of tactile perception has become an exciting area in industrial, medical, and scientific research, with an emphasis on the development of new haptic technologies. Surprisingly, the quantification of tactile perception has, compared to other senses, only recently become a field of scientific investigation. The overall goal of this emerging scientific discipline is an understanding of the causal chain from the contact of the skin with materials to the brain dynamics representing recognition of, and emotional reaction to, the materials. Each link in this chain depends on individual and environmental factors, ranging from the influence of humidity on contact formation to the role of attention in the perception of touch. This thesis reports on research into neural correlates of frictional stimulation of the human fingertip. Event-related electroencephalographic potentials (ERPs) upon a change in fingertip friction are measured and studied when pins of a programmable Braille display are brought into skin contact. To contribute to the understanding of the causal chain mentioned above, this work combines two research areas that are usually not connected to each other, namely tribology and neuroscience. The goal of the study is to evaluate the contribution of friction to the process of haptic perception. Key contributions of this thesis are: 1) development of a setup to simultaneously record physical forces and ERPs upon tactile stimulation; 2) implementation of a dedicated signal-processing pipeline for the statistical analysis of ERP amplitudes, latencies, and instantaneous phases; 3) interpretation of skin-friction data and extraction of neural correlates with respect to varying friction intensities.
    The tactile stimulation of the fingertip upon raising and lowering of different lines of Braille pins (one, three, and five) caused pronounced N50 and P100 components in the event-related ERP sequences, in line with the current literature. Friction between the fingertip and the Braille system exhibited a characteristic temporal development, which is attributed to viscoelastic skin relaxation. Although the force stimuli varied by a factor of two between the different Braille patterns, no significant differences were observed between the amplitudes and latencies of the ERPs after standard across-trial averaging. Thus, for the first time, a phase measure for estimating single-trial interactions of somatosensory potentials is proposed. Results show that instantaneous phase coherency is evoked by friction, and that higher friction induces stronger and more time-localized phase coherency.
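    A phase-coherence measure of the kind proposed above can be sketched as inter-trial phase coherence (ITC) at a single frequency. The single-DFT-bin phase estimate and the toy signals below are simplifying assumptions; the thesis' actual pipeline (filtering, instantaneous-phase extraction, statistics) is richer, but the core idea is the same: phases aligned across trials yield ITC near 1, scattered phases yield ITC near 0.

```python
import cmath
import math

def trial_phase(signal, freq, fs):
    """Phase of one trial at `freq` Hz, read from the matching DFT bin."""
    n = len(signal)
    k = round(freq * n / fs)
    bin_val = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                  for t, x in enumerate(signal))
    return cmath.phase(bin_val)

def itc(trials, freq, fs):
    """|mean unit phasor| across trials: 1 = perfectly phase-locked."""
    phasors = [cmath.exp(1j * trial_phase(s, freq, fs)) for s in trials]
    return abs(sum(phasors) / len(phasors))

fs, f = 1000, 10  # 1 kHz sampling, a 10 Hz component
# Phase-locked trials: identical phase in every trial.
locked = [[math.sin(2 * math.pi * f * t / fs) for t in range(1000)]
          for _ in range(20)]
# Jittered trials: phase drifts from trial to trial.
jittered = [[math.sin(2 * math.pi * f * t / fs + 0.7 * i) for t in range(1000)]
            for i in range(20)]
print(round(itc(locked, f, fs), 2))  # → 1.0
print(itc(jittered, f, fs) < 0.5)    # phases spread, so coherence is low
```

    This is why a phase measure can separate conditions whose averaged amplitudes look identical: averaging cancels trials with scattered phase, while ITC quantifies the scatter itself.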

    Tac-tiles: multimodal pie charts for visually impaired users

    Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet augmented with a tangible overlay tile to guide user exploration. Dynamic feedback is provided by a tactile pin array at the fingertips and through speech and non-speech audio cues. In designing the system, we seek to preserve the affordances and metaphors of traditional, low-tech teaching media for the blind, and to combine these with the benefits of a digital representation. Traditional tangible media allow rapid, non-sequential access to data, promote easy and unambiguous access to resources such as axes and gridlines, allow the use of external memory, and preserve visual conventions, thus promoting collaboration with sighted colleagues. A prototype system was evaluated with visually impaired users, and recommendations for multimodal design were derived.

    Understanding Users' Perception of Simultaneous Tactile Textures

    We study users' perception of simultaneous tactile textures on ultrasonic devices, investigating how relevant it is to provide the user with different complementary and simultaneous textures across the different fingers that can touch the surface. We show through a controlled experiment that users are able to distinguish the number of different textures regardless of whether they use fingers from one hand or two. However, our findings indicate that users are not able to differentiate between two different textures, that is, to correctly identify each of them, when using fingers from the same hand. Based on these findings, we outline three guidelines to assist the ergonomic design of multi-finger tactile feedback devices.

    Printgets: an Open-Source Toolbox for Designing Vibrotactile Widgets with Industrial-Grade Printed Actuators and Sensors

    New technologies for printing sensors and actuators combine the flexible interface layouts of touchscreens with localized vibrotactile feedback, but their fabrication still requires industrial-grade facilities. Until these technologies become easily replicable, interaction designers need material for ideation. We propose an open-source hardware and software toolbox providing maker-grade tools for the iterative design of vibrotactile widgets with industrial-grade printed sensors and actuators. Our hardware toolbox provides a mechanical structure to clamp and stretch printed sheets, and electronic boards to drive the sensors and actuators. Our software toolbox expands the design space of haptic interaction techniques by reusing the wide palette of available audio-processing algorithms to generate real-time vibrotactile signals. We validate the toolbox with the implementation of three exemplar interface elements with tactile feedback: buttons, sliders, and touchpads.
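    Reusing audio-style signal generation for vibrotactile feedback can be sketched as follows: a "button click" rendered as a short, exponentially decaying sine burst at 250 Hz (near peak skin sensitivity to vibration). The carrier frequency, envelope constants, and sample rate are illustrative assumptions, not parameters of the Printgets toolbox:

```python
import math

# Hedged sketch: synthesizing a vibrotactile "click" the way an audio
# engine would synthesize a percussive note. Any audio DSP block (filters,
# envelopes, oscillators) could be reused the same way to drive an actuator.

def click_burst(freq_hz=250.0, n_samples=240, fs=8000, decay=120.0):
    """One decaying sine burst (30 ms at 8 kHz by default), amplitude in [-1, 1]."""
    return [math.exp(-decay * t / fs) * math.sin(2 * math.pi * freq_hz * t / fs)
            for t in range(n_samples)]

samples = click_burst()
print(len(samples))                              # 240 samples of drive signal
print(all(-1.0 <= s <= 1.0 for s in samples))    # envelope keeps it in range
```

    In a real setup the sample list would be streamed to the actuator driver at the chosen rate; varying the decay and carrier per widget is what differentiates, say, a button click from a slider detent.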