
    Making Spatial Information Accessible on Touchscreens for Users who are Blind and Visually Impaired

    Touchscreens have become a de facto standard of input for mobile devices, as they make optimal use of the limited input and output space imposed by their form factor. In recent years, people who are blind and visually impaired have been increasing their usage of smartphones and touchscreens. Although basic access is available, many accessibility issues remain to be addressed before full inclusion is possible for this population. One of the important challenges lies in accessing and creating spatial information on touchscreens. The work presented here provides three new techniques, using three different modalities, for accessing spatial information on touchscreens. The first system makes geometry and diagram creation accessible on a touchscreen through the use of text-to-speech and gestural input; this system is informed by a qualitative study of how people who are blind and visually impaired currently access and create graphs and diagrams. The second system makes directions through maps accessible using multiple vibration sensors, without any sound or visual output. The third system investigates the use of binaural sound on a touchscreen to make various types of applications accessible, such as physics simulations, astronomy, and video games.

    SpeciFingers

    The inadequate use of finger properties has limited the input space of touch interaction. By leveraging the category of the contacting finger, finger-specific interaction can expand the input vocabulary. However, accurate finger identification remains challenging: previous works required either additional sensors or a restricted set of identifiable fingers to achieve acceptable accuracy. We introduce SpeciFingers, a novel approach to identifying fingers from raw capacitive data on touchscreens. We apply a neural network with an encoder-decoder architecture, which captures the spatio-temporal features in capacitive image sequences. To assist users in recovering from misidentification, we propose a correction mechanism that replaces the existing undo-redo process. We also present a design space of finger-specific interaction with example interaction techniques. In particular, we designed and implemented a use case of optimizing pointing performance on small targets, and we evaluated our identification model and error-correction mechanism in this use case.
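The abstract above does not reproduce the encoder-decoder model itself, but the shape of the problem (a capacitive frame in, a finger label out) can be illustrated with a toy stand-in. The sketch below uses a nearest-template classifier over invented 2x2 capacitive patches; the template values, labels, and matching rule are all hypothetical and only illustrate the input/output contract, not the thesis's actual neural network.

```python
# Illustrative sketch only: SpeciFingers uses an encoder-decoder neural
# network over capacitive image sequences; this toy nearest-template
# classifier stands in to show the problem's input/output shape.
# All template values below are invented.

def identify_finger(frame, templates):
    """Return the finger label whose template is closest (squared L2) to the frame.

    frame: flattened capacitive image, a list of floats.
    templates: dict mapping finger label -> reference frame of the same length.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: dist(frame, templates[label]))

# Hypothetical 2x2 capacitive patches: in this toy data the index finger
# produces a smaller, sharper blob than the thumb.
templates = {
    "index": [0.9, 0.1, 0.1, 0.0],
    "thumb": [0.6, 0.6, 0.5, 0.5],
}
print(identify_finger([0.8, 0.2, 0.1, 0.1], templates))  # → index
```

A real identifier must also handle the temporal dimension (sequences of frames) and ambiguous contacts, which is precisely what motivates the learned spatio-temporal model and the correction mechanism described above.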

    The cockpit for the 21st century

    Interactive surfaces are a growing trend in many domains. As one possible manifestation of Mark Weiser’s vision of ubiquitous and disappearing computers embedded in everyday objects, we see touch-sensitive screens in many kinds of devices, such as smartphones, tablet computers and interactive tabletops. More advanced concepts of these have been an active research topic for many years. This has also influenced automotive cockpit development: concept cars and recent market releases show integrated touchscreens of growing size. To meet increasing information and interaction needs, interactive surfaces offer context-dependent functionality in combination with a direct input paradigm. However, interfaces in the car need to be operable while driving. Distraction, especially visual distraction from the driving task, can lead to critical situations if the attentional demand of the primary and secondary tasks together exceeds the available resources. So far, a touchscreen requires a lot of visual attention, since its flat surface does not provide any haptic feedback. There have been approaches to make direct touch interaction accessible while driving for simple tasks. Outside the automotive domain, for example in office environments, concepts for sophisticated handling of large displays have already been introduced. Moreover, technological advances enable arbitrary surface shapes and thereby new characteristics for interactive surfaces. In cars, the two main characteristics of upcoming interactive surfaces are largeness and shape. On the one hand, spatial extension is increasing not only through larger displays but also by taking surrounding objects into account for interaction. On the other hand, the flatness inherent in current screens can be overcome by upcoming technologies, so interactive surfaces can provide haptically distinguishable surfaces.
    This thesis describes the systematic exploration of large and shaped interactive surfaces and analyzes their potential for interaction while driving. To this end, different prototypes for each characteristic were developed and evaluated in test settings suitable for their maturity level. Those prototypes were used to obtain subjective user feedback and objective data, and to investigate effects on driving and glance behavior as well as usability and user experience. As a contribution, this thesis provides an analysis of the development of interactive surfaces in the car. Two characteristics, largeness and shape, are identified that can improve interaction compared to conventional touchscreens. The presented studies show that large interactive surfaces can provide new and improved ways of interaction in both driver-only and driver-passenger situations. Furthermore, the studies indicate a positive effect on visual distraction when additional static haptic feedback is provided by shaped interactive surfaces. Overall, various non-exclusively applicable interaction concepts demonstrate the potential of interactive surfaces for use in automotive cockpits, which is expected to be beneficial also in further environments where visual attention needs to be focused on additional tasks.

    Extending mobile touchscreen interaction

    Touchscreens have become a de facto interface for mobile devices, and are penetrating further beyond their core application domain of smartphones. This work presents a design space for extending touchscreen interaction, to which new solutions may be mapped. Specific touchscreen enhancements in the domains of manual input, visual output and haptic feedback are explored, and quantitative and experiential findings are reported. Particular areas covered are unintentional interaction, screen locking, stereoscopic displays and pico-projection. In addition, the novel interaction approaches of finger identification and on-screen physical guides are explored. The use of touchscreens in the domains of car dashboards and smart handbags is evaluated in domain-specific use cases. This work draws together solutions from the broad area of mobile touchscreen interaction. Fruitful directions for future research are identified, and information is provided for future researchers addressing those topics.


    UsaGame – A new methodology to support user- centered design of touchscreen game applications

    Dissertation for the degree of Master in Industrial Engineering and Management. The growth of touchscreen mobile devices has resulted in an explosion of mobile applications. Focusing on touch-based mobile game applications, this study aims to fill a research gap by creating appropriate usability guidelines for these applications. Concerns about usability, touch technologies, mobile devices and game testing provided the background for this study. Initial game-application tests allowed the proposed usability guidelines to be created and implemented in a support checklist (UsaGame) designed to help application developers. An evaluation test was performed with 20 users in order to assess the validity of the proposed guidelines. Testing two builds of the same game application allowed comparisons that led to an assessment of the importance of some of the guidelines implemented in the application. The results suggested a usability improvement in the game application implemented with the guidelines, and furthermore allowed comment on all of the proposed usability guidelines.

    Effects of age on smartphone and tablet usability, based on eye-movement tracking and touch-gesture interactions

    The aim of this thesis is to provide an insight into the effects of user age on interactions with smartphone and tablet applications. The study considered two interaction methods to investigate the effects of user age on the usability of smartphones and tablets of different sizes: 1) eye movements while browsing and 2) touch-gesture interactions. In the eye-movement studies, an eye tracker was used to trace and record users’ eye movements, which were later analysed to understand the effects of age and screen size on browsing effectiveness. In the gesture-interaction studies, an application developed for smartphones traced and recorded users’ touch-gesture data, which were later analysed to investigate the effects of age and screen size on touch-gesture performance. The motivation for these studies is summarised as follows: 1) the increasing number of elderly people in our society, 2) the widespread use of smartphone technology across the world, 3) the need to understand the difficulties elderly people face when interacting with smartphone technology, and 4) the goal of providing the existing body of literature with new understanding of the effects of ageing on smartphone usability. The work of this thesis includes five research projects conducted in two stages. Stage One included two studies that used eye-movement analysis to investigate the effects of user age and the influence of screen size on browsing smartphone interfaces. The first study examined the scan-path dissimilarity of browsing smartphone applications for elderly users (60+) and younger users (20-39). The results revealed that scan-path dissimilarity in browsing smartphone applications was higher for elderly users (i.e., age-driven) than for younger users. The results also revealed that browsing smartphone applications was stimulus-driven rather than screen-size-driven.
    The second study was conducted to understand the difficulties of information processing when browsing smartphone applications for elderly (60+), middle-aged (40-59) and younger (20-39) users. The evaluation was performed using three different screen sizes of smartphone and tablet devices. The results revealed that processing both local and global information on smartphone/tablet interfaces was more difficult for elderly users than for the other age groups. Across all age groups, browsing on the smaller smartphone proved more difficult than on the larger screen sizes. Stage Two included three studies that investigated the difficulties elderly users face with gesture-based applications compared to younger users, and evaluated the possibility of classifying a user’s age group based on on-screen gestures. The first study investigated the effects of user age and screen size on performing intuitive swipe gestures in four directions: down, left, right, and up. The results revealed that swiping performance was influenced by user age and screen size, as well as by the swiping orientation. The purpose of the second study was to investigate the effects of user age, screen size, and gesture complexity on performing accurate gestures on smartphones and tablets using gesture-based features. The results revealed that elderly users were less accurate, less efficient, slower, and exerted more pressure on the touchscreen when performing gestures than younger users. On a small smartphone, all users were less accurate in gesture performance, more so the elderly, compared to mini-sized tablets. Also, users, especially the elderly, were less efficient and less accurate when performing complex gestures on the small smartphone compared to the mini-tablet.
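The four swipe directions studied above (down, left, right, up) can be recovered from logged touch points with a simple rule over the dominant displacement axis. The sketch below is an illustrative stand-in, not the thesis's logging application; it assumes the usual touchscreen convention that x grows rightward and y grows downward.

```python
# Toy sketch: classify a swipe's direction from its start and end touch
# points, as a gesture-study app might log them. Assumes screen coordinates
# with x increasing rightward and y increasing downward.

def swipe_direction(start, end):
    """Return 'down', 'left', 'right', or 'up' from two (x, y) touch points."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # The axis with the larger absolute displacement decides the direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(swipe_direction((100, 300), (110, 80)))  # → up
```

A study application would additionally log timing and pressure per touch sample, which feed the accuracy and efficiency measures reported above.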
    The third study investigated the possibility of classifying a user’s age group using touch-gesture features (i.e., gesture speed, gesture accuracy, movement time, and finger pressure) on smartphones. It provides evidence that a user’s age group can be classified using gesture-based applications on smartphones in both user-dependent and user-independent scenarios. The accuracy of age-group classification on smaller screens was higher than on devices with larger screens, because larger screens were much easier to use for all users across both age groups. In addition, classification accuracy was higher for younger users than for elderly users. This was because some elderly users performed the gestures in the same way as younger users do, possibly owing to longer experience with smartphones than the typical elderly user. Overall, our results provide evidence that elderly users encounter difficulties when interacting with smartphones and tablet devices compared to younger users, and that a user’s age group can be classified from their ability to perform touch gestures on smartphones and tablets. Designers of smartphone interfaces should remove barriers that make browsing and processing local and global information in smartphone applications difficult. Furthermore, larger screen sizes should be considered for elderly users, and smartphones could include automatically customisable user interfaces suited to elderly users’ abilities, accommodating their needs so that they can be as efficient as younger users. The outcomes of this research could enhance the design of smartphones and tablets, as well as the applications that run on such devices, especially those aimed at elderly users. Such devices and applications could play an effective role in enhancing elderly people’s activities of daily living.
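The age-group classification described above maps a vector of gesture features to a group label. The thesis does not publish its classifier or data here, so the sketch below uses a toy nearest-centroid model over invented feature vectors purely to illustrate the idea; the feature values, group labels, and model choice are all assumptions.

```python
# Hedged sketch: classify a user's age group from touch-gesture features
# (speed, accuracy, movement time, finger pressure) with a toy
# nearest-centroid model. All training data below is invented.

def centroid(rows):
    """Mean feature vector of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def classify(sample, training):
    """training: dict mapping age-group label -> list of feature vectors."""
    cents = {g: centroid(rows) for g, rows in training.items()}
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(cents, key=lambda g: dist(sample, cents[g]))

# Invented features per gesture: [speed, accuracy, movement time, pressure].
# The pattern mirrors the reported trend: elderly users were slower, less
# accurate, and pressed harder.
training = {
    "younger": [[1.2, 0.95, 0.4, 0.30], [1.1, 0.92, 0.5, 0.28]],
    "elderly": [[0.6, 0.80, 0.9, 0.45], [0.5, 0.78, 1.0, 0.50]],
}
print(classify([0.55, 0.79, 0.95, 0.48], training))  # → elderly
```

The user-dependent versus user-independent distinction in the study corresponds to whether the training vectors come from the same user being classified or from other users only.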

    Electrostatic Friction Displays to Enhance Touchscreen Experience

    Touchscreens are versatile devices that can display visual content and receive touch input, but they lack the ability to provide programmable tactile feedback. This limitation has been addressed by a few approaches generally called surface haptics technology. This technology modulates the friction between a user’s fingertip and a touchscreen surface to create different tactile sensations as the finger explores the touchscreen. This functionality enables the user to see and feel digital content simultaneously, leading to improved usability and user experiences. One major approach in surface haptics relies on the electrostatic force induced between the finger and an insulating surface on the touchscreen by supplying a high AC voltage. The use of AC also induces in the user a vibrational sensation called electrovibration. Electrostatic friction displays require only electrical components and provide uniform friction over the screen. This tactile feedback technology not only allows easy and lightweight integration into touchscreen devices but also provides dynamic, rich, and satisfying user interfaces. In this chapter, we review the fundamental operation of electrovibration technology as well as the applications that have been built upon it.
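The electrostatic attraction that the chapter describes is commonly approximated with a textbook parallel-plate capacitor model, F ≈ ε0·εr·A·V² / (2d²). The sketch below computes that estimate; it is a generic first-order approximation, not the chapter's exact formulation, and the parameter values are purely illustrative.

```python
# Parallel-plate estimate of the electrostatic attraction underlying
# electrovibration. This is a standard first-order physics approximation;
# real finger-surface models include skin impedance and are more involved.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_force(voltage, area, gap, eps_r=1.0):
    """Estimated attraction force in newtons.

    voltage: applied voltage amplitude (V)
    area: effective finger-surface contact area (m^2)
    gap: insulator/air-gap thickness (m)
    eps_r: relative permittivity of the gap material
    """
    return EPS0 * eps_r * area * voltage ** 2 / (2 * gap ** 2)

# Illustrative numbers: 100 V across a 1 mm^2 contact with a 10 um gap.
f = electrostatic_force(100.0, 1e-6, 10e-6)
print(f"{f * 1000:.2f} mN")  # → 0.44 mN
```

Because the force scales with V², driving the screen with an AC voltage at frequency f modulates friction at 2f, which is one reason the resulting sensation is perceived as a vibration.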

    Pseudo-haptics survey: Human-computer interaction in extended reality & teleoperation

    Pseudo-haptic techniques are becoming increasingly popular in human-computer interaction. They replicate haptic sensations by leveraging primarily visual feedback rather than mechanical actuators, bridging the gap between the real and virtual worlds by exploiting the brain’s ability to integrate visual and haptic information. Among their many advantages, pseudo-haptic techniques are cost-effective, portable, and flexible: they eliminate the need to attach haptic devices to the body, which can be heavy and large and require a lot of power and maintenance. Recent research has focused on applying these techniques to extended reality and mid-air interactions. To better understand the potential of pseudo-haptic techniques, the authors developed a novel taxonomy encompassing tactile feedback, kinesthetic feedback, and combined categories in multimodal approaches, ground not covered by previous surveys. This survey highlights multimodal strategies and potential avenues for future studies, particularly regarding the integration of these techniques into extended reality and collaborative virtual environments.
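A classic example of the visual-feedback trick described above is manipulating the control-display (C/D) gain: when the on-screen cursor moves less than the hand does, users tend to perceive resistance or weight. The sketch below is a minimal illustration of that idea; the gain values and the "heavy region" scenario are invented for the example.

```python
# Minimal sketch of a common pseudo-haptic technique: scaling the
# control-display (C/D) gain so the displayed cursor lags the physical
# hand motion, which is typically perceived as resistance or weight.
# Gain values below are illustrative only.

def displayed_motion(physical_dx, cd_gain):
    """Map a physical hand displacement to a displayed cursor displacement."""
    return physical_dx * cd_gain

# Three equal hand movements; the middle one crosses a simulated "heavy"
# region where the gain drops below 1, so the cursor travels less.
hand_moves = [10.0, 10.0, 10.0]
gains = [1.0, 0.5, 1.0]
print([displayed_motion(dx, g) for dx, g in zip(hand_moves, gains)])  # → [10.0, 5.0, 10.0]
```

The same gain-manipulation principle generalizes from 2D cursors to extended-reality hand redirection, which is one of the survey's focal application areas.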