
    Auditory, graphical and haptic contact cues for a reach, grasp, and place task in an augmented environment

    An experiment was conducted to investigate how performance of a reach, grasp, and place task was influenced by added auditory and graphical cues. The cues were presented at specific points in the task, namely at the moments of contact when grasping or placing the object, and were delivered in single or combined modalities. Haptic feedback was always present during physical interaction with the object. The auditory and graphical cues provided enhanced feedback about contact between hand and object and between object and table. The task was also performed with or without vision of the hand; movements were slower without vision of the hand. Providing auditory cues clearly facilitated performance, while graphical contact cues had no additional effect. Implications are discussed for various uses of auditory displays in virtual environments.
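
    As a purely illustrative sketch (not code from the study), the contact-cue logic described above can be thought of as a dispatcher that fires the enabled modalities whenever a hand-object or object-table contact event occurs; all names below (ContactEvent, play_sound, flash_highlight) are hypothetical placeholders.

        # Hypothetical sketch of contact-cue dispatch in single or combined modalities.
        from dataclasses import dataclass
        from typing import Callable, Set

        @dataclass
        class ContactEvent:
            kind: str          # "hand-object" (grasp) or "object-table" (place)
            timestamp: float   # seconds since trial start

        def play_sound(event: ContactEvent) -> None:
            print(f"[audio] contact cue for {event.kind} at {event.timestamp:.3f} s")

        def flash_highlight(event: ContactEvent) -> None:
            print(f"[graphics] contact highlight for {event.kind} at {event.timestamp:.3f} s")

        def make_cue_dispatcher(modalities: Set[str]) -> Callable[[ContactEvent], None]:
            # Return a callback that renders only the enabled cue modalities.
            handlers = {"auditory": play_sound, "graphical": flash_highlight}
            def dispatch(event: ContactEvent) -> None:
                for name in sorted(modalities):
                    handlers[name](event)
            return dispatch

        # Example: combined auditory + graphical condition
        dispatch = make_cue_dispatcher({"auditory", "graphical"})
        dispatch(ContactEvent(kind="hand-object", timestamp=1.234))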

    When Ears Drive Hands: The Influence of Contact Sound on Reaching to Grasp

    Background: Most research on the roles of auditory information and its interaction with vision has focused on perceptual performance. Little is known about the effects of sound cues on visually guided hand movements. Methodology/Principal Findings: We recorded the sound produced by the fingers upon contact as participants grasped stimulus objects covered with different materials. In a further session, the pre-recorded contact sounds were delivered to participants via headphones before or after the initiation of reach-to-grasp movements towards the stimulus objects. Reach-to-grasp movement kinematics were measured under the following conditions: (i) congruent, in which the presented contact sound and the contact sound elicited by the to-be-grasped stimulus corresponded; (ii) incongruent, in which the presented contact sound differed from that generated by the stimulus upon contact; (iii) control, in which a synthetic sound, not associated with a real event, was presented. Facilitation effects were found for congruent trials; interference effects were found for incongruent trials. In a second experiment, the upper and lower parts of the stimulus were covered with different materials. The presented sound was always congruent with the material covering either the upper or the lower half of the stimulus. Participants consistently placed their fingers on the half of the stimulus that corresponded to the presented contact sound. Conclusions/Significance: Altogether, these findings offer a substantial contribution to the current debate about the type of object representations elicited by auditory stimuli and about the multisensory nature of the sensorimotor transformations underlying action.
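
    For illustration only (not the authors' materials), the three sound conditions can be sketched as a small lookup that selects which pre-recorded contact sound to deliver on a given trial; the material names and file names below are assumptions.

        # Illustrative sketch of condition-to-sound assignment for a reach-to-grasp trial.
        import random

        MATERIAL_SOUNDS = {"velvet": "velvet_contact.wav", "paper": "paper_contact.wav"}
        CONTROL_SOUND = "synthetic_tone.wav"   # synthetic sound, not tied to a real contact event

        def sound_for_trial(target_material: str, condition: str) -> str:
            if condition == "congruent":
                return MATERIAL_SOUNDS[target_material]            # matches the grasped material
            if condition == "incongruent":
                others = [m for m in MATERIAL_SOUNDS if m != target_material]
                return MATERIAL_SOUNDS[random.choice(others)]      # mismatching material sound
            return CONTROL_SOUND                                    # control condition

        print(sound_for_trial("velvet", "incongruent"))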

    Haptic Media Scenes

    The aim of this thesis is to apply new media phenomenological and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in influencing cognitive processes and does not frame the richness of haptic communication in interaction design, as haptic interactivity in HCI has historically tended to be designed and analyzed from a view of communication as transmission, the sending and receiving of haptic signals. The haptic sense may mediate not only contact confirmation and affirmation but also rich semiotic and affective messages; yet there is a strong contrast between this inherent capacity of haptic perception and current support for such haptic communication in interfaces. I therefore ask: How do the haptic senses (touch and proprioception) impact our cognitive faculties when mediated through digital and sensor technologies? How may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings that draw on two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. I use these insights to develop the concept of the haptic figura, an analytical tool for framing the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems whose material solution supports active haptic perception and the mediation of semiotic and affective messages that are understood and felt. As such, the concept may function as a design tool for developers, but also for media critics evaluating haptic media. The tool is used to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and for the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element in personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages beyond notification and confirmation interactivity.

    Tilt simulation: virtual reality based upper extremity stroke rehabilitation

    The primary objective of this research is to design a recreational rehabilitation video game that interactively encourages purposeful upper extremity gross motor movements. The simulation is also capable of continuous game modification to fit changing therapy goals, to match the needs of the players, and to provide continued motivation while capturing the interactive repetition. This thesis explains the design and features of this latest simulation, Tilt. Tilt uses physics to create an engaging training experience and provides a realistic approach to virtual reality simulation, including friction, elasticity, and collisions between objects. It is designed to train upper extremity function as a unit involving multiple modalities simultaneously, either unilaterally or bilaterally. It is the latest addition to the NJIT Robot Assisted Virtual Rehabilitation (RAVR) system and employs the CyberGlove and Flock of Birds systems to interface with the real world. This allows training of the motor functions that patients use in day-to-day life, such as using the hands, fingers, and shoulders to pick up small objects from a table, move them, and place them elsewhere.
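
    As a minimal, hypothetical illustration of the kind of physics the abstract mentions (friction, elasticity, collisions), and not the RAVR implementation, a tilt-driven object can be integrated with a simple time step; all constants and names are illustrative.

        # Hypothetical physics step for a tilt-style task: an object sliding on a board
        # tilted by the player's arm, with kinetic friction and an elastic edge bounce.
        import math

        GRAVITY = 9.81       # m/s^2
        FRICTION = 0.1       # kinetic friction coefficient (illustrative)
        RESTITUTION = 0.4    # elasticity of edge collisions (illustrative)
        BOARD_HALF = 0.25    # half-width of the board in meters

        def step(pos: float, vel: float, tilt_deg: float, dt: float = 0.01) -> tuple:
            tilt = math.radians(tilt_deg)
            accel = GRAVITY * math.sin(tilt)                 # gravity component along the board
            if vel != 0.0:
                accel -= math.copysign(FRICTION * GRAVITY * math.cos(tilt), vel)  # friction opposes motion
            vel += accel * dt
            pos += vel * dt
            if abs(pos) > BOARD_HALF:                        # collision with the board edge
                pos = math.copysign(BOARD_HALF, pos)
                vel = -RESTITUTION * vel
            return pos, vel

        pos, vel = 0.0, 0.0
        for _ in range(100):                                 # one second of simulated 10-degree tilt
            pos, vel = step(pos, vel, tilt_deg=10.0)
        print(f"position after 1 s: {pos:.3f} m")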

    The cockpit for the 21st century

    Interactive surfaces are a growing trend in many domains. As one possible manifestation of Mark Weiser's vision of ubiquitous and disappearing computers embedded in everyday objects, we see touch-sensitive screens in many kinds of devices, such as smartphones, tablet computers, and interactive tabletops. More advanced concepts of these have been an active research topic for many years. This has also influenced automotive cockpit development: concept cars and recent market releases show integrated touchscreens of growing size. To meet increasing information and interaction needs, interactive surfaces offer context-dependent functionality combined with a direct input paradigm. However, interfaces in the car need to be operable while driving. Distraction, especially visual distraction from the driving task, can lead to critical situations if the combined attentional demand of the primary and secondary tasks exceeds the available resources. So far, touchscreens require a lot of visual attention because their flat surfaces provide no haptic feedback. There have been approaches to make direct touch interaction accessible for simple tasks while driving. Outside the automotive domain, for example in office environments, concepts for sophisticated handling of large displays have already been introduced. Moreover, technological advances are giving interactive surfaces new characteristics by enabling arbitrary surface shapes. In cars, the two main characteristics of upcoming interactive surfaces are largeness and shape. On the one hand, spatial extension is increasing not only through larger displays but also by incorporating surrounding objects into the interaction. On the other hand, the flatness inherent in current screens can be overcome by upcoming technologies, so interactive surfaces can provide haptically distinguishable surfaces. This thesis describes the systematic exploration of large and shaped interactive surfaces and analyzes their potential for interaction while driving. To this end, different prototypes for each characteristic were developed and evaluated in test settings suited to their maturity level. These prototypes were used to obtain subjective user feedback and objective data and to investigate effects on driving and glance behavior as well as on usability and user experience. As a contribution, this thesis provides an analysis of the development of interactive surfaces in the car. Two characteristics, largeness and shape, are identified that can improve interaction compared to conventional touchscreens. The presented studies show that large interactive surfaces can provide new and improved ways of interaction in both driver-only and driver-passenger situations. Furthermore, the studies indicate a positive effect on visual distraction when additional static haptic feedback is provided by shaped interactive surfaces. Overall, various interaction concepts, which are not mutually exclusive, demonstrate the potential of interactive surfaces for use in automotive cockpits, and this is expected to be beneficial also in further environments where visual attention must remain focused on additional tasks.

    Touch- and Walkable Virtual Reality to Support Blind and Visually Impaired Peoples' Building Exploration in the Context of Orientation and Mobility

    Access to digital content and information is becoming increasingly important for successful participation in today's increasingly digitized civil society.
Such information is mostly presented visually, which restricts access for blind and visually impaired people. The most fundamental barrier is often basic orientation and mobility (and, consequently, social mobility), including gaining knowledge about unknown buildings before visiting them. To bridge such barriers, technological aids should be developed and deployed. A trade-off is needed between technologically low-threshold, easily disseminated aids and interactive-adaptive but complex systems. The adaptation of virtual reality (VR) technology spans a wide range of development and decision options. The main benefits of VR technology are increased interactivity, updatability, and the possibility of exploring virtual spaces as proxies of real ones without real-world hazards or dependence on the limited availability of sighted assistants. However, virtual objects and environments have no physicality. This thesis therefore investigates which VR interaction forms are reasonable (i.e., offer adequate dissemination potential) for making virtual representations of real buildings touchable or walkable in the context of orientation and mobility. Although developments and evaluations of VR technology exist that are disjoint in content and technology, empirical evidence is lacking. Additionally, this thesis provides a survey of the different interactions. After a consideration of human physiology, assistive media (e.g., tactile maps), and technological characteristics, the current state of the art of VR is introduced, and its application for blind and visually impaired users, and the way to get there, is discussed through the introduction of a novel taxonomy. In addition to the interaction itself, characteristics of the user and the device, the application context, and user-centered development and evaluation are used as classifiers. The following chapters are motivated by explorative approaches at 'small scale' (using so-called data gloves) and at 'large scale' (using avatar-controlled VR locomotion). These chapters report empirical studies with blind and visually impaired users and give formative insight into how virtual objects within reach of the hands can be grasped using haptic feedback and how different kinds of VR locomotion can be applied to explore virtual environments. From this, device-independent technological possibilities as well as challenges for further improvement are derived. On the basis of this knowledge, subsequent research can focus on aspects such as the specific design of interactive elements, temporally and spatially collaborative application scenarios, and the evaluation of an entire application workflow (i.e., scanning the real environment and exploring it virtually for training purposes, as well as designing the entire application in a long-term accessible manner).

    Multisensory texture exploration at the tip of the pen

    A tool for the multisensory stylus-based exploration of virtual textures was used to investigate how different feedback modalities (static or dynamically deformed images, vibration, sound) affect exploratory gestures. To this end, we ran an experiment in which participants had to steer a path with the stylus through a curved corridor on the surface of a graphic tablet/display, and we measured steering time, dispersion of trajectories, and applied force. Despite the variety of subjective impressions elicited by the different feedback conditions, we found that only nonvisual feedback induced significant variations in trajectories and an increase in movement time. In a follow-up experiment using a paper-and-wood physical realization of the same texture, we recorded a variety of gestural behaviors markedly different from those found with the virtual texture. With the physical setup, movement time was shorter and texture-dependent lateral accelerations could be observed. This work highlights the limits of multisensory pseudo-haptic techniques in the exploration of surface textures.
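
    The movement measures named above can be illustrated with a small analysis sketch; the sample format and field units below are assumptions, not the study's actual logging format.

        # Illustrative computation of steering time, trajectory dispersion, and mean applied
        # force from logged stylus samples (time, lateral offset from corridor center, force).
        from statistics import mean, pstdev

        samples = [
            # (t in s, lateral offset in mm, normal force in N) -- made-up values
            (0.00, 0.0, 0.8), (0.05, 1.2, 0.9), (0.10, -0.6, 1.1), (0.15, 0.4, 1.0),
        ]

        steering_time = samples[-1][0] - samples[0][0]        # duration of the steering pass
        dispersion = pstdev(y for _, y, _ in samples)         # spread of the trajectory around the center
        mean_force = mean(f for _, _, f in samples)           # average force applied to the surface

        print(f"steering time: {steering_time:.2f} s, dispersion: {dispersion:.2f} mm, "
              f"mean force: {mean_force:.2f} N")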

    Sensory Augmentation for Balance Rehabilitation Using Skin Stretch Feedback

    This dissertation focuses on the development and evaluation of portable sensory augmentation systems that render skin-stretch feedback of posture for standing balance training and for improving postural control. Falling is one of the main causes of fatal injuries across the population, and the high incidence of fall-related injuries leads to high medical expenses, approximately $34 billion annually in the United States. People with neurological diseases, e.g., stroke, multiple sclerosis, and spinal cord injuries, as well as the elderly, are more prone to falling than healthy individuals, and falls among these populations can lead to hip fracture or even death. Thus, several balance and gait rehabilitation approaches have been developed to reduce the risk of falling. Traditionally, a balance-retraining program includes a series of exercises for trainees to strengthen their sensorimotor and musculoskeletal systems. Recent advances in technology have incorporated biofeedback, such as visual, auditory, or haptic feedback, to provide users with extra cues about their postural sway, and studies have demonstrated the positive effects of biofeedback on balance control. However, current applications of biofeedback for interventions in people with impaired balance still lack important characteristics such as portability (for in-home care), small size, and long-term viability. Inspired by the concept of light touch, a light, small, and wearable sensory augmentation system that detects body sway and applies skin stretch to the fingertip pad was first developed. The addition of a shear tactile display can significantly enhance the sensation of body movement. Preliminary results showed that passive skin stretch feedback applied at the fingertip enhanced the standing balance of healthy young adults. Based on these findings, two research directions were initiated to investigate i) which dynamic information about postural sway can be conveyed more effectively by skin stretch feedback, and ii) how such a feedback device can be easily used in clinical settings or on a daily basis. The major parts of this research focus on understanding how skin stretch feedback affects standing balance and on quantifying the ability of humans to interpret the cutaneous feedback as cues about their physiological state. Experimental results from both static and dynamic balancing tasks revealed that healthy subjects were able to respond to the cues and subsequently correct their posture. However, postural sway did not generally improve in healthy subjects with skin stretch feedback; a possible reason is that healthy subjects already had sensory information of sufficient quality, such that the additional artificial biofeedback may have interfered with other sensory cues. Experiments incorporating simulated sensory deficits were further conducted, and it was found that subjects with perturbed sensory systems (e.g., standing on an unstable surface) showed improved balance with skin stretch feedback compared to neutral standing conditions. Positive impacts on balance performance were also demonstrated among multiple sclerosis patients receiving skin stretch feedback from a sensory augmentation walker.
    The findings of this research indicate that the skin stretch feedback rendered by the developed devices affects human balance and can potentially compensate for underlying neurological or musculoskeletal disorders, thereby enhancing postural control during quiet standing.
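
    As a minimal sketch of the sway-to-stretch mapping such a device might use (an assumption for illustration, not the dissertation's controller), measured sway can be scaled to a tactor displacement and clamped to the actuator's range.

        # Hypothetical mapping from measured anterior-posterior sway angle to a fingertip
        # skin-stretch displacement command; gain and range values are illustrative.
        def sway_to_stretch(sway_deg: float, gain_mm_per_deg: float = 0.5,
                            max_stretch_mm: float = 2.0) -> float:
            # Scale the sway angle and clamp the command to the actuator's travel.
            command = gain_mm_per_deg * sway_deg
            return max(-max_stretch_mm, min(max_stretch_mm, command))

        for sway in (-6.0, -1.0, 0.0, 2.5, 8.0):              # degrees of body tilt
            print(f"sway {sway:+.1f} deg -> stretch {sway_to_stretch(sway):+.2f} mm")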