
    Finger orientation as an additional input dimension for touchscreens

    Since the first digital computer in 1941 and the first personal computer in 1975, the way we interact with computers has changed radically. The keyboard is still one of the two main input devices for desktop computers, accompanied most of the time by a mouse or trackpad. However, interaction with desktop and laptop computers today makes up only a small percentage of our interaction with computing devices. Today, we mostly interact with ubiquitous computing devices, and while the first ubiquitous devices were controlled via buttons, this changed with the invention of touchscreens. Moreover, the phone, as the most prominent ubiquitous computing device, relies heavily on touch as its dominant input mode. Through direct touch, users interact with graphical user interfaces (GUIs): controls can be manipulated simply by touching them. However, current touch devices reduce the richness of touch input to two-dimensional positions on the screen. In this thesis, we investigate the potential of enriching a simple touch with additional information about the finger touching the screen. We propose to use the user’s finger orientation as two additional input dimensions. We investigate four key areas that form the foundation for fully understanding finger orientation as an additional input technique. With these insights, we provide designers with the foundation to design new gesture sets and use cases that take finger orientation into account. We first investigate approaches to recognizing finger orientation input and provide ready-to-deploy models for recognizing the orientation. Second, we present design guidelines for comfortable use of finger orientation. Third, we present a method for analyzing applications in social settings, so that use cases can be designed with possible conversation disruption in mind. Lastly, we present three ways in which new interaction techniques like finger orientation input can be communicated to the user.
This thesis contributes these four key insights for a full understanding of finger orientation as an additional input technique. Moreover, we combine the key insights to lay the foundation for evaluating every new interaction technique with the same in-depth methodology.
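The thesis models finger orientation as two angles added to the familiar 2D contact point. As a minimal sketch of what such an augmented touch event could look like (the type name and the pitch/yaw convention here are assumptions for illustration, not the thesis's actual data model):

```python
from dataclasses import dataclass
import math


@dataclass
class OrientedTouch:
    """A touch event enriched with finger orientation (hypothetical model).

    x, y: 2D contact position in screen pixels (what touchscreens report today).
    pitch_deg: angle between finger and screen plane (0 = flat, 90 = upright).
    yaw_deg: direction the finger points within the screen plane.
    """
    x: float
    y: float
    pitch_deg: float
    yaw_deg: float


def orientation_vector(t: OrientedTouch) -> tuple:
    """Convert pitch/yaw into a 3D unit vector pointing along the finger."""
    p = math.radians(t.pitch_deg)
    y = math.radians(t.yaw_deg)
    return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), math.sin(p))
```

A gesture recognizer could then dispatch on the two extra dimensions, e.g. mapping an upright versus a flat finger at the same screen position to different commands.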

    The cockpit for the 21st century

    Interactive surfaces are a growing trend in many domains. As one possible manifestation of Mark Weiser’s vision of ubiquitous and disappearing computers in everyday objects, we see touch-sensitive screens in many kinds of devices, such as smartphones, tablet computers, and interactive tabletops. More advanced concepts of these have been an active research topic for many years. This has also influenced automotive cockpit development: concept cars and recent market releases show integrated touchscreens, growing in size. To meet increasing information and interaction needs, interactive surfaces offer context-dependent functionality in combination with a direct input paradigm. However, interfaces in the car need to be operable while driving. Distraction, especially visual distraction from the driving task, can lead to critical situations if the combined attentional demand of the primary and secondary tasks overextends the available resources. So far, a touchscreen requires a lot of visual attention since its flat surface does not provide any haptic feedback. There have been approaches to making direct touch interaction accessible for simple tasks while driving. Outside the automotive domain, for example in office environments, concepts for sophisticated handling of large displays have already been introduced. Moreover, technological advances lead to new characteristics for interactive surfaces by enabling arbitrary surface shapes. In cars, the two main characteristics of upcoming interactive surfaces are largeness and shape. On the one hand, spatial extension is increasing not only through larger displays but also by taking objects in the surroundings into account for interaction. On the other hand, the flatness inherent in current screens can be overcome by upcoming technologies, so interactive surfaces can provide haptically distinguishable surfaces.
This thesis describes the systematic exploration of large and shaped interactive surfaces and analyzes their potential for interaction while driving. To this end, different prototypes for each characteristic were developed and evaluated in test settings suitable for their maturity level. These prototypes were used to obtain subjective user feedback and objective data, and to investigate effects on driving and glance behavior as well as on usability and user experience. As a contribution, this thesis provides an analysis of the development of interactive surfaces in the car. Two characteristics, largeness and shape, are identified that can improve interaction compared to conventional touchscreens. The presented studies show that large interactive surfaces can provide new and improved ways of interaction in both driver-only and driver-passenger situations. Furthermore, the studies indicate a positive effect on visual distraction when additional static haptic feedback is provided by shaped interactive surfaces. Overall, various interaction concepts, which can be applied in combination, demonstrate the potential of interactive surfaces for use in automotive cockpits, which is expected to be beneficial in other environments as well, where visual attention needs to stay focused on additional tasks.

    Augmented Touch Interactions with Finger Contact Shape and Orientation

    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of - even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom - the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions - but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen, and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures - a result that was confirmed in another study that used the augmented touches for a screen lock application.
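A screen lock built on such augmented touches could treat each touch as a (shape, orientation) symbol; with two reliably producible shapes and three orientations, that yields six distinct symbols per touch instead of one. The token names below are hypothetical, not taken from the study:

```python
# Hypothetical sketch: each augmented touch is a (shape, orientation) symbol,
# giving 2 x 3 = 6 distinct symbols per touch for a lock code.
SHAPES = {"pad", "side"}                 # two reliably producible contact shapes
ORIENTATIONS = {"left", "up", "right"}   # three reliably producible orientations


def verify_code(stored, entered):
    """Compare an entered sequence of (shape, orientation) touches to the stored code."""
    for shape, orientation in entered:
        if shape not in SHAPES or orientation not in ORIENTATIONS:
            raise ValueError("unrecognized touch symbol")
    return stored == entered
```

A four-touch code drawn from six symbols gives 6^4 = 1296 combinations, versus far fewer for the same number of position-only taps on a small grid.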

    An infotainment table for interactive learning based on Infra-red touch overlays using TUIO protocol

    The development of multimedia technologies, coupled with the need to make the learning process engaging, has led to the emergence of many interactive applications in recent times. By exploiting the ease of use of touch-based applications, it has become possible to make the teaching-learning process more interesting for students, by incorporating these technologies into the process of imparting education. In this paper, we propose to develop an interactive game that runs on an infrared-touch-based multi-touch surface and enables elementary students to learn basic geometry. The software would use the Tangible User Interface Object (TUIO) protocol for communication.
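TUIO 1.1 transmits touch state as OSC messages, conventionally over UDP port 3333: for finger input, /tuio/2Dcur "set" messages carry a cursor's normalized position and "alive" messages list the session IDs still on the surface. A minimal sketch of the tracking logic the table's software would need, assuming an OSC layer has already decoded the message arguments (the class name is hypothetical):

```python
class TuioCursorTracker:
    """Minimal tracker for TUIO /tuio/2Dcur state (sketch, not a full OSC decoder).

    Real deployments receive OSC bundles over UDP port 3333; here the
    already-decoded argument lists are fed in directly.
    """

    def __init__(self):
        self.cursors = {}  # session_id -> (x, y), both normalized to [0, 1]

    def on_set(self, session_id, x, y):
        # A "set" message carries the current position of one cursor (finger).
        self.cursors[session_id] = (x, y)

    def on_alive(self, session_ids):
        # An "alive" message lists all live cursors; anything absent was lifted.
        alive = set(session_ids)
        self.cursors = {s: p for s, p in self.cursors.items() if s in alive}
```

The game loop would poll `cursors` each frame to hit-test fingers against the on-screen geometry shapes.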

    Breaking the Screen: Interaction Across Touchscreen Boundaries in Virtual Reality for Mobile Knowledge Workers.

    Virtual Reality (VR) has the potential to transform knowledge work. One advantage of VR knowledge work is that it allows extending 2D displays into the third dimension, enabling new operations such as selecting overlapping objects or displaying additional layers of information. On the other hand, mobile knowledge workers often work on established mobile devices, such as tablets, which limits interaction to a small input space. This challenge of a constrained input space is intensified when VR knowledge work is situated in cramped environments, such as airplanes and touchdown spaces. In this paper, we investigate the feasibility of interacting jointly between an immersive VR head-mounted display and a tablet within the context of knowledge work. Specifically, we 1) design, implement, and study how to interact with information that reaches beyond a single physical touchscreen in VR; 2) design and evaluate a set of interaction concepts; and 3) build example applications and gather user feedback on those applications.
Comment: 10 pages, 8 figures, ISMAR 202
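One plausible reading of "beyond a single physical touchscreen" is to treat the tablet as a movable window onto a larger virtual surface tiled into screen-sized panels rendered in VR. The mapping below is a hypothetical sketch under that assumption, not the paper's actual technique:

```python
def touch_to_virtual(u, v, panel_col, panel_row):
    """Map a normalized tablet touch (u, v in [0, 1]) onto a larger virtual
    surface tiled into unit panels, where the physical screen currently shows
    panel (panel_col, panel_row). Hypothetical 'beyond the screen' mapping."""
    return (panel_col + u, panel_row + v)
```

Moving the focus to a neighboring panel (e.g. via a VR gaze or swipe gesture) changes only the panel indices, so the same physical input space addresses arbitrarily many virtual screens.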

    Extending mobile touchscreen interaction

    Touchscreens have become a de facto interface for mobile devices and are penetrating further beyond their core application domain of smartphones. This work presents a design space for extending touchscreen interaction, to which new solutions may be mapped. Specific touchscreen enhancements in the domains of manual input, visual output, and haptic feedback are explored, and quantitative and experiential findings are reported. Particular areas covered are unintentional interaction, screen locking, stereoscopic displays, and picoprojection. In addition, the novel interaction approaches of finger identification and on-screen physical guides are explored. The use of touchscreens in car dashboards and smart handbags is evaluated in domain-specific use cases. This work draws together solutions from the broad area of mobile touchscreen interaction. Fruitful directions for future research are identified, and information is provided for future researchers addressing those topics.

    Performance, Characteristics, and Error Rates of Cursor Control Devices for Aircraft Cockpit Interaction

    This document is the Accepted Manuscript version of the following article: Peter R. Thomas, 'Performance, Characteristics, and Error Rates of Cursor Control Devices for Aircraft Cockpit Interaction', International Journal of Human-Computer Studies, Vol. 109: 41-53, available online 31 August 2017. Under embargo. Embargo end date: 31 August 2018. Published by Elsevier. © 2017 Elsevier Ltd. All rights reserved.
This paper provides a comparative performance analysis of a hands-on-throttle-and-stick (HOTAS) cursor control device (CCD) against other CCDs suitable for an aircraft cockpit: an isotonic thumbstick, a trackpad, a trackball, and touchscreen input. The performance and characteristics of these five CCDs were investigated in terms of throughput, movement accuracy, and error rate using the ISO 9241-9 standard task. Results show statistically significant differences (p < 0.001) between three groupings of the devices, with the HOTAS having the lowest throughput (0.7 bits/s) and the touchscreen the highest (3.7 bits/s). Errors for all devices were shown to increase with decreasing target size (p < 0.001) and, to a lesser extent, with increasing target distance (p < 0.01). The trackpad was found to be the most accurate of the five devices, being significantly better than the HOTAS fingerstick and touchscreen (p < 0.05), with the touchscreen performing poorly on selecting smaller targets (p < 0.05). These results would be useful to cockpit human-machine interface designers and provide evidence of the need to move away from, or significantly augment the capabilities of, this type of HOTAS CCD in order to improve pilot task throughput in increasingly data-rich cockpits.
Peer reviewed. Final Accepted Version.
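The throughput figures quoted above follow the Shannon formulation of Fitts' law used in ISO 9241-9 evaluations: an index of difficulty ID = log2(D/W + 1) in bits, divided by movement time. (The standard computes effective width and distance from observed selection endpoints; the basic form is shown here.)

```python
import math


def fitts_throughput(distance, width, movement_time_s):
    """Throughput in bits/s per the Shannon formulation of Fitts' law:
    ID = log2(D / W + 1), TP = ID / MT.

    distance, width: target distance and width in the same units;
    movement_time_s: mean time to acquire the target, in seconds.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # bits
    return index_of_difficulty / movement_time_s           # bits per second
```

For example, a 10 mm target at 150 mm distance has ID = log2(16) = 4 bits; acquiring it in about 5.7 s corresponds to the HOTAS's 0.7 bits/s, while the touchscreen's 3.7 bits/s implies roughly 1.1 s for the same target.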