
    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense, which may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the similarities between the audio and tactile modalities. By making information available to both senses, users can receive it in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work, followed by a definition of crossmodal icons: two icons may be considered crossmodal if and only if they provide a common representation of data that is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater text entry speeds than standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently under varying levels of background noise or vibration, and established the exact levels at which these performance decreases occur.
The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback, to enable application and interface designers to employ such feedback in all systems.
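    The crossmodal-icon idea above can be sketched in code: one abstract description of the data, rendered interchangeably to audio or tactile output, sharing the three effective parameters (rhythm, texture, spatial location). This is a hypothetical illustration; the rendering targets and mappings are assumptions, not the thesis implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CrossmodalIcon:
    rhythm: tuple    # pulse durations in ms, shared by both senses
    texture: str     # "smooth" or "rough" (timbre vs. vibration roughness)
    location: str    # spatial position: "left", "middle", "right"

    def to_audio(self):
        # Map texture to a timbre; pan to the spatial location.
        timbre = {"smooth": "sine", "rough": "sawtooth"}[self.texture]
        return [(dur, timbre, self.location) for dur in self.rhythm]

    def to_tactile(self):
        # Same rhythm and location; texture mapped to vibration roughness.
        wave = {"smooth": "steady", "rough": "amplitude-modulated"}[self.texture]
        return [(dur, wave, self.location) for dur in self.rhythm]

icon = CrossmodalIcon(rhythm=(100, 50, 100), texture="rough", location="left")
# Both renderings share rhythm and location; only the carrier differs,
# which is what makes training in one modality transfer to the other.
assert [a[0] for a in icon.to_audio()] == [t[0] for t in icon.to_tactile()]
```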

    The Graphical Access Challenge for People with Visual Impairments: Positions and Pathways Forward

    Graphical access is one of the most pressing challenges for individuals who are blind or visually impaired. This chapter discusses some of the factors underlying the graphics access challenge, reviews prior approaches to addressing this long-standing information access barrier, and describes some promising new solutions. We specifically focus on touchscreen-based smart devices, a relatively new class of information access technologies, which our group believes represent an exemplary model of user-centered, needs-based design. We highlight both the challenges and the vast potential of these technologies for alleviating the graphics accessibility gap and share the latest results in this line of research. We close with recommendations on shifts in mindset about how we approach solving this vexing access problem, which will complement the technological and perceptual advancements rapidly being uncovered by a growing research community in this domain.

    Tactile discrimination of material properties: application to virtual buttons for professional appliances

    An experiment is described that tested the possibility of classifying wooden, plastic, and metallic objects based on reproduced auditory and vibrotactile stimuli. The results show that recognition rates are considerably above chance level with either unimodal auditory or vibrotactile feedback. Supported by those findings, the possibility of rendering virtual buttons for professional appliances with different tactile properties was tested. To this end, a touchscreen device was provided with various types of vibrotactile feedback in response to the sensed pressing force and location of a finger. Different virtual button designs were tested by user panels, who performed a subjective evaluation of perceived tactile properties and materials. In a first implementation, virtual buttons were designed by reproducing the vibration recordings of real materials used in the classification experiment: mainly due to hardware limitations of our prototype and the consequent impossibility of rendering complex vibratory signals, this approach did not prove successful. A second implementation was then optimized for the device capabilities, additionally introducing surface compliance effects and button release cues: the new design led to generally high quality ratings, clear discrimination of different buttons and unambiguous material classification. The lesson learned was that various material and physical properties of virtual buttons can be successfully rendered by characteristic frequency and decay cues if correctly reproduced by the device.
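    The "characteristic frequency and decay cues" conclusion can be illustrated as an exponentially decaying sinusoid per material. The frequency and decay values below are invented placeholders for the sketch, not measured data from the study; the qualitative point is that metal rings longer (slower decay) than wood.

```python
import math

# Hypothetical (frequency in Hz, decay time constant in s) per material.
MATERIALS = {
    "wood":    (100.0, 0.04),
    "plastic": (250.0, 0.02),
    "metal":   (900.0, 0.30),
}

def button_cue(material, sample_rate=8000, duration=0.1):
    """Render a press cue as a decaying sine, sampled at sample_rate."""
    freq, tau = MATERIALS[material]
    n = int(sample_rate * duration)
    return [math.exp(-t / tau) * math.sin(2 * math.pi * freq * t)
            for t in (i / sample_rate for i in range(n))]

# 50 ms into the cue, metal's envelope is still high while wood's has died:
metal_env = math.exp(-0.05 / 0.30)
wood_env = math.exp(-0.05 / 0.04)
assert metal_env > wood_env
```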

    Remote tactile feedback on interactive surfaces

    Direct touch input on interactive surfaces has become a predominant standard for the manipulation of digital information in our everyday lives. However, compared to our rich interchange with the physical world, the interaction with touch-based systems is limited in terms of flexibility of input and expressiveness of output. In particular, the lack of tactile feedback greatly reduces the general usability of a touch-based system and hinders a productive coupling of the virtual information with the physical world. This thesis proposes remote tactile feedback as a novel method to provide programmed tactile stimuli supporting direct touch interactions. The overall principle is to spatially decouple the location of touch input (e.g. fingertip or hand) and the location of the tactile sensation on the user's body (e.g. forearm or back). Remote tactile feedback is an alternative concept which avoids specific challenges of existing approaches. Moreover, the principle provides inherent characteristics which can accommodate the requirements of current and future touch interfaces. To define the design space, the thesis provides a structured overview of current forms of touch surfaces and identifies trends towards non-planar and non-rigid forms with more versatile input mechanisms. Furthermore, a classification highlights limitations of the current methods to generate tactile feedback on touch-based systems. The proposed notion of tactile sensory relocation is a form of sensory substitution, and underlying neurological and psychological principles corroborate the approach. Thus, characteristics of the human sense of touch and principles from sensory substitution help to create a technical and conceptual framework for remote tactile feedback. Three consecutive user studies measure and compare the effects of both direct and remote tactile feedback on the performance and the subjective ratings of the user.
Furthermore, the experiments investigate different body locations for the application of tactile stimuli. The results show high subjective preferences for tactile feedback, regardless of its type of application. Additionally, the data reveals no significant differences between the effects of direct and remote stimuli. The results support the feasibility of the approach and provide parameters for the design of stimuli and the effective use of the concept. The main part of the thesis describes the systematic exploration and analysis of the inherent characteristics of remote tactile feedback. Four specific features of the principle are identified: (1) the simplification of the integration of cutaneous stimuli, (2) the transmission of proactive, reactive and detached feedback, (3) the increased expressiveness of tactile sensations and (4) the provision of tactile feedback during multi-touch. In each class, several prototypical remote tactile interfaces are used in evaluations to analyze the concept. For example, the PhantomStation utilizes psychophysical phenomena to reduce the number of single tactile actuators. An evaluation with the prototype compares standard actuator technologies with each other in order to enable simple and scalable implementations. The ThermalTouch prototype creates remote thermal stimuli to reproduce material characteristics on standard touchscreens. The results show a stable rate of virtual object discrimination based on remotely applied temperature profiles. The AutomotiveRTF system is implemented in a vehicle and supports the driver's input on the in-vehicle infotainment system. A field study with the system focuses on evaluating the effects of proactive and reactive feedback on the user's performance. The main contributions of the dissertation are: First, the thesis introduces the principle of remote tactile feedback and defines a design space for this approach as an alternative method to provide non-visual cues on interactive surfaces.
Second, the thesis describes technical examples to rapidly prototype remote tactile feedback systems. Third, these prototypes are deployed in several evaluations which highlight the beneficial subjective and objective effects of the approach. Finally, the thesis presents features and inherent characteristics of remote tactile feedback as a means to support the interaction on today's touchscreens and future interactive surfaces.
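    The core decoupling idea can be sketched as a routing rule from an on-screen touch location to a body-worn actuator. The actuator names, the left/middle/right rule and the pulse lengths are illustrative assumptions for this sketch, not the thesis implementation; "proactive" cues here precede contact, "reactive" cues confirm it, following the terminology above.

```python
# Body-worn actuators, spatially decoupled from the touch surface (assumed).
ACTUATORS = ["forearm_left", "back", "forearm_right"]

def route_touch(x, width=1080, feedback="reactive"):
    """Map a touch x-coordinate to a remote actuator and a pulse length."""
    # Left third -> left forearm, middle third -> back, right third -> right forearm.
    index = min(int(3 * x / width), 2)
    # Proactive cues are short warnings; reactive cues are longer confirmations.
    pulse_ms = 30 if feedback == "proactive" else 80
    return ACTUATORS[index], pulse_ms

# The touch happens on the screen; the sensation arrives elsewhere on the body.
assert route_touch(100) == ("forearm_left", 80)
```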

    Supporting Eyes-Free Human–Computer Interaction with Vibrotactile Haptification

    The sense of touch is a crucial sense when using our hands in complex tasks. Some tasks we learn to do even without sight, just by using the sense of touch in our fingers and hands. Modern touchscreen devices, however, have lost some of that tactile feeling by removing physical controls from the interaction. Touch is also a sense that is underutilized in interactions with technology and could provide new ways of interaction to support users. In certain situations, users of information technology cannot focus visually and mentally on the interaction. Humans could utilize their sense of touch more comprehensively and learn to understand tactile information while interacting with information technology. This thesis introduces a set of experiments that evaluate human capabilities to notice and understand tactile information provided by current actuator technology, and further presents examples of haptic user interfaces (HUIs) for eyes-free use scenarios. The experiments evaluate the benefits of such interfaces for users, and the thesis concludes with guidelines and methods for creating this kind of user interface. The experiments can be divided into three groups. The first group of two experiments evaluated the detection of vibrotactile stimuli and the interpretation of the abstract meaning of vibrotactile feedback. Experiments in the second group evaluated how to design rhythmic vibrotactile tactons to serve as basic vibrotactile primitives for HUIs. The last group of two experiments evaluated how these HUIs benefit users in distracted, eyes-free interaction scenarios. The primary aim of this series of experiments was to evaluate whether the current level of actuation technology could be used more comprehensively than in current-day solutions with simple haptic alerts and notifications.
The aim was thus to find out whether comprehensive use of vibrotactile feedback in interactions would provide additional benefits for users, compared with current haptic and non-haptic interaction methods. The main finding of this research is that with more comprehensive HUIs in eyes-free, distracted-use scenarios, such as driving a car, the user's main task, driving, is performed better. Furthermore, users liked the comprehensively haptified user interfaces.
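    Rhythmic vibrotactile tactons, the basic primitives mentioned above, can be sketched as on/off vibration patterns that each encode an abstract message. The specific patterns and message names below are invented for illustration, not the ones studied in the thesis; the design requirement shown is that the rhythms remain mutually distinguishable.

```python
# Hypothetical tacton vocabulary: each entry is a list of (on_ms, off_ms) pairs.
TACTONS = {
    "message":  [(120, 80), (120, 80)],
    "reminder": [(60, 60), (60, 60), (60, 60)],
    "alarm":    [(300, 100), (300, 100)],
}

def total_duration(name):
    """Length of a tacton in milliseconds, including trailing gaps."""
    return sum(on + off for on, off in TACTONS[name])

def distinct_rhythms():
    # Tactons only work as primitives if no two share the same rhythm.
    patterns = [tuple(p) for p in TACTONS.values()]
    return len(patterns) == len(set(patterns))

assert distinct_rhythms()
```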

    Designing Haptic Clues for Touchscreen Kiosks

    Most interactive touchscreen kiosks are a challenge to accessibility: if graphics and sound fail in communication, the interaction process halts. In such a case, turning to the only remaining environmentally suited sense, touch, is an intuitive option. To reinforce the interaction with interactive touchscreen kiosks, it is possible to add haptic (touchable) feedback to the features of the device. The range of touchscreen-suited haptic technologies already enables some touch feedback from touchscreen surfaces, and significant leaps forward continue to be made at a constant rate. Due to this development, it is relevant to review the human-centred factors affecting the design of haptic touchscreens in public kiosks. This thesis offers an overview of designing haptic clues for touchscreen kiosks. It emphasizes context sensitivity and the meaningfulness and communicability of different haptic design variants. As its main contribution, this thesis collects together the important considerations for the conscious design of haptic features in interactive kiosks and offers points of multimodal design consideration for designers intending to enrich their touchscreen interaction with haptic features.

    Multimodaalinen joustavuus mobiilissa tekstinsyöttötehtävässä (Multimodal flexibility in a mobile text entry task)

    The mobile usability of an interface depends on the amount of information a user is able to retrieve or transmit while on the move. Furthermore, the information transmission capacity and successful transmissions depend on how flexibly usable the interface is across varying real world contexts. The major focus in research on multimodal flexibility has been on the facilitation of modalities in the interface. Most evaluative studies have measured the effects that the interactions cause to each other. However, assessing these effects under a limited number of conditions does not generalize to other possible conditions in the real world. Moreover, studies have often compared single-task conditions to dual-tasking, measuring the trade-off between the tasks, not the actual effects the interactions cause. To contribute to the paradigm of measuring multimodal flexibility, this thesis isolates the effect of modality utilization in the interaction with the interface; instead of using a secondary task, modalities are withdrawn from the interaction. The multimodal flexibility method [1] was applied in this study to assess the utilization of three sensory modalities (vision, audition and tactition) in a text input task with three mobile interfaces: a 12-digit keypad (ITU-12), a physical Qwerty keyboard and a touchscreen virtual Qwerty keyboard.
The goal of the study was to compare the multimodal flexibility of these interfaces, assess the value of each utilized sensory modality to the interaction, and examine the cooperation of modalities in a text input task. The results imply that the alphabetical 12-digit keypad is the most multimodally flexible of the three compared interfaces. Although the 12-digit keypad is relatively inefficient for typing when all modalities are free to be allocated to the interaction, it is the most flexible in performing under the constraints that the real world might set on sensory modalities. In addition, all the interfaces were shown to be highly dependent on vision. The performance of both Qwerty keyboards dropped by approximately 80% when vision was withdrawn from the interaction, while the performance of the ITU-12 dropped by approximately 50%. Examining the cooperation of the modalities in the text input task, vision was shown to work in synergy with tactition, but audition did not provide any extra value for the interaction.
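    The modality-withdrawal measure described above amounts to comparing performance with all senses available against performance with one modality removed. The throughput numbers below are invented for the example (the abstract reports only the relative drops, roughly 80% for the Qwerty keyboards and 50% for the ITU-12 without vision); the computation is the point.

```python
# Invented words-per-minute figures, chosen to reproduce the reported drops.
BASELINE_WPM = {"itu12": 10.0, "qwerty_physical": 30.0, "qwerty_touch": 25.0}
NO_VISION_WPM = {"itu12": 5.0, "qwerty_physical": 6.0, "qwerty_touch": 5.0}

def performance_drop(interface):
    """Relative performance loss when vision is withdrawn, as a fraction."""
    base = BASELINE_WPM[interface]
    return (base - NO_VISION_WPM[interface]) / base

# The less an interface depends on vision, the smaller this drop --
# the measure behind calling the ITU-12 the most multimodally flexible.
assert performance_drop("itu12") < performance_drop("qwerty_physical")
```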