    Making Gestural Interaction Accessible to Visually Impaired People

    As touch screens become widespread, making them more accessible to visually impaired people is an important task: without sight, touch displays offer poor accessibility. One way to improve this is gestural interaction, yet there are still few studies on gestural interaction for visually impaired people. In this paper we present a comprehensive summary of existing projects investigating accessible gestural interaction, highlight the limits of current approaches, and propose future working directions. We then present the design of an interactive map prototype that combines a raised-line map overlay with gestural interaction for accessing different types of information (e.g., opening hours, distances). Preliminary results of our project show that basic gestural interaction techniques can be successfully used in interactive maps for visually impaired people.
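    The basic gestural techniques the abstract refers to can be illustrated with a minimal tap classifier that distinguishes single from double taps on a touch surface. The `Tap` type, thresholds, and function names below are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Tap:
    x: float  # touch position in pixels
    y: float
    t: float  # timestamp in seconds

def classify_taps(taps, max_interval=0.35, max_dist=30.0):
    """Group successive taps into single/double-tap events.

    A second tap within `max_interval` seconds and within
    `max_dist` pixels of the first is treated as a double tap.
    """
    events = []
    i = 0
    while i < len(taps):
        a = taps[i]
        if (i + 1 < len(taps)
                and taps[i + 1].t - a.t <= max_interval
                and abs(taps[i + 1].x - a.x) <= max_dist
                and abs(taps[i + 1].y - a.y) <= max_dist):
            events.append(("double", a.x, a.y))
            i += 2
        else:
            events.append(("single", a.x, a.y))
            i += 1
    return events
```

    In an accessible map, a double tap on a map element could, for example, trigger speech output of the associated information.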

    Investigating Human-Rare Historic Book Interaction among Young Adults

    This paper reports on research conducted to improve understanding of human-rare historic book interaction, a necessary first step toward designing and developing physical-virtual renderings of rare books that provide integrated haptic, audio, olfactory, visual and cognitive human-rare book interaction for the public. Our synthesis of relevant literature proposes that current research and technology can be categorized according to five characteristics: expected users, content and content management, navigation, presentation, and interaction control. Our research investigates how young adults (novices) in northern Europe interact with a rare historic book, and their reflections on that interaction. Results indicate that interaction engendered appreciation and curiosity among novices regarding individual human behaviour and social practices, and regarding design and technology. Interaction also had an affective impact, eliciting personal memories and emotions. Participants reported that interacting only visually with books or their representations would not have afforded the same results. The results suggest several design recommendations for future physical-virtual renderings of rare historic books.

    Multi-touch For General-purpose Computing An Examination Of Text Entry

    In recent years, multi-touch has been heralded as a revolution in human-computer interaction. Multi-touch provides features such as gestural interaction, tangible interfaces, pen-based computing, and interface customization – features embraced by an increasingly tech-savvy public. However, multi-touch platforms have not been adopted as everyday computer interaction devices; that is, multi-touch has not been applied to general-purpose computing. The questions this thesis seeks to address are: Will the general public adopt these systems as their chief interaction paradigm? Can multi-touch provide such a compelling platform that it displaces the desktop mouse and keyboard? Is multi-touch truly the next revolution in human-computer interaction? As a first step toward answering these questions, we observe that general-purpose computing relies on text input, and ask: Can multi-touch, without a text entry peripheral, provide a platform for efficient text entry? And, by extension, is such a platform viable for general-purpose computing? We investigate these questions through four user studies that collected objective and subjective data for text entry and word processing tasks. The first of these studies establishes a benchmark for text entry performance on a multi-touch platform, across a variety of input modes. The second study attempts to improve this performance by examining an alternate input technique. The third and fourth studies include mouse-style interaction for formatting rich text on a multi-touch platform, in the context of a word processing task. These studies establish a foundation for future efforts in general-purpose computing on a multi-touch platform. Furthermore, this work details deficiencies in tactile feedback with modern multi-touch platforms, and describes an exploration of audible feedback. Finally, the thesis conveys a vision for a general-purpose multi-touch platform, its design and rationale.
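    Benchmark studies of this kind typically rely on standard text-entry metrics. The sketch below implements the conventional words-per-minute measure (one "word" is five characters, with the first character discounted because timing usually starts at the first keystroke) and a Levenshtein-based error rate; these are the formulas commonly used in text-entry research, not necessarily the exact metrics of this thesis:

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Standard text-entry WPM: (|T| - 1) / time * 60 / 5."""
    if seconds <= 0:
        raise ValueError("duration must be positive")
    return (len(transcribed) - 1) / seconds * 60.0 / 5.0

def error_rate(presented: str, transcribed: str) -> float:
    """Minimum string distance error rate:
    Levenshtein distance divided by the longer string length."""
    m, n = len(presented), len(transcribed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n] / max(m, n)
```

    Typing "hello world" (11 characters) in 12 seconds then scores 10 WPM, which is how input modes can be compared across studies.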

    Personalized Interaction with High-Resolution Wall Displays

    An increasing openness for more diverse interaction modalities as well as falling hardware prices have made very large interactive vertical displays more feasible, and consequently, applications in settings such as visualization, education, and meeting support have been demonstrated successfully. Their size makes wall displays inherently usable for multi-user interaction. At the same time, we can assume that access to personal data and settings, and thus personalized interaction, will still be essential in most use cases. In most current desktop and mobile user interfaces, access is regulated via an initial login and the complete user interface is then personalized to this user: access to personal data, configurations and communications all assumes a single user per screen. When multiple people use one screen, this is not a feasible solution and we must find alternatives. Therefore, this thesis addresses the research question: How can we provide personalized interfaces in the context of multi-user interaction with wall displays? The scope spans personalized interaction both close to the wall (using touch as input modality) and further away (using mobile devices). Technical solutions that identify users at each interaction can replace logins and enable personalized interaction for multiple users at once. This thesis explores two alternative means of user identification: tracking users with RGB+depth cameras and leveraging ultrasound positioning of the users' mobile devices. Building on this, techniques that support personalized interaction using personal mobile devices are proposed. In the first contribution on interaction, HyDAP, we examine pointing from the perspective of moving users, and in the second, SleeD, we propose using an arm-worn device to facilitate access to private data and personalized interface elements. Additionally, the work contributes insights on the practical implications of personalized interaction at wall displays: we present a qualitative study that analyses interaction using the multi-user cooperative game Miners as application case, finding awareness and occlusion issues. The final contribution is a corresponding analysis toolkit, GIAnT, that visualizes users' movements, touch interactions and gaze points when interacting with wall displays, and thus allows fine-grained investigation of the interactions.
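    The ultrasound positioning mentioned above can be sketched as time-of-flight ranging followed by multilateration: each anchor's chirp travel time gives a distance, and three distances pin down a 2D position. The anchor layout, constants, and function names below are illustrative assumptions, not the thesis's implementation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def tof_to_distance(seconds: float) -> float:
    """Convert an ultrasound time-of-flight into a distance in metres."""
    return seconds * SPEED_OF_SOUND

def locate(anchors, dists):
    """2D multilateration from three anchors.

    Linearizes the three circle equations against the first anchor
    and solves the resulting 2x2 linear system with Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = dists
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear")
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)
```

    With anchors at (0,0), (4,0) and (0,4) and exact distances to the point (1,2), the solver recovers that position; in practice noisy measurements would call for more anchors and a least-squares fit.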

    ICMI'12: Proceedings of the ACM SIGCHI 14th International Conference on Multimodal Interaction


    Multi-Sensory Interaction for Blind and Visually Impaired People

    This book conveys the visual elements of artwork to visually impaired people through various sensory channels, opening a new perspective on appreciating visual artwork. It explores techniques for expressing a color code by integrating patterns, temperatures, scents, music, and vibrations, and presents future research topics. A holistic experience using multi-sensory interaction conveys the meaning and contents of a work through rich multi-sensory appreciation. A method that allows people with visual impairments to engage with artwork through a variety of senses, including touch, temperature, tactile pattern, and sound, helps them appreciate artwork at a deeper level than can be achieved with hearing or touch alone. Such art appreciation aids for the visually impaired will ultimately improve their cultural enjoyment and strengthen their access to culture and the arts. Through continuous efforts to enhance accessibility, these aids also expand opportunities for non-visually impaired as well as visually impaired people to enjoy works of art, breaking down the boundaries between disabled and non-disabled people in the field of culture and the arts. In addition, the multi-sensory expression and delivery tools developed here can serve as educational tools to increase the accessibility and usability of products and artwork through multi-modal interaction. Training the multi-sensory experiences introduced in this book may lead to more vivid visual imagery, or seeing with the mind's eye.

    Sheet Music Unbound: A fluid approach to sheet music display and annotation on a multi-touch screen

    In this thesis we present the design and prototype implementation of a Digital Music Stand that focuses on fluid music layout management and free-form digital ink annotation. An analysis of user constraints and available technology led us to select a 21.5” multi-touch monitor as the preferred input and display device. This comfortably displays two A4 pages of music side by side with space for a control panel. The analysis also identified single-handed input as a viable choice for musicians. Finger input was chosen to avoid the need for any additional input equipment. To support layout reflow and zooming we developed a vector-based music representation built around the bar structure. This representation supports animation of transitions, in such a way as to give responsive dynamic interaction with multi-touch gesture input. In developing the prototype, particular attention was paid to the problem of drawing small, intricate annotations accurately located on the music using a fingertip. The zoomable nature of the music structure was leveraged to accomplish this, and an evaluation was carried out to establish the best level of magnification. The thesis demonstrates, in the context of music, that annotation and layout management (typically treated as two distinct tasks) can be integrated into a single task, yielding fluid and natural interaction.
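    A bar-structured representation that supports reflow can be sketched as greedy line-packing over whole bars, so that annotations can stay anchored to their bar when the layout changes. The `Bar` type and widths below are illustrative assumptions, not the thesis's actual representation:

```python
from dataclasses import dataclass

@dataclass
class Bar:
    number: int   # bar number in the score
    width: float  # natural width in display units

def reflow(bars, line_width):
    """Greedy layout: pack whole bars onto lines, never splitting
    a bar across a line break."""
    lines, current, used = [], [], 0.0
    for bar in bars:
        if current and used + bar.width > line_width:
            lines.append(current)
            current, used = [], 0.0
        current.append(bar)
        used += bar.width
    if current:
        lines.append(current)
    return lines
```

    Because each line is a list of intact bars, zooming or resizing only re-runs the packing; ink strokes attached to a bar move with it rather than with fixed page coordinates.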

    All the Noises: Hijacking Listening Machines for Performative Research

    Research into machine listening has intensified in recent years, creating a variety of techniques for recognising musical features, suitable, for example, for musicological analysis or commercial application in song recognition. Within NIME, several projects seek to make these techniques useful in real-time music making. However, we debate whether the functionally-oriented approaches inherited from engineering domains that much machine listening research manifests are fully suited to the exploratory, divergent, boundary-stretching, uncertainty-seeking, playful and irreverent orientations of many artists. To explore this, we engaged in a concerted collaborative design exercise in which many different listening algorithms were implemented and presented with input that challenged their customary range of application and the implicit norms of musicality which research can take for granted. An immersive 3D spatialised multichannel environment was created in which the algorithms could be explored in a hybrid installation/performance/lecture form of research presentation. The paper closes with reflections on the creative value of 'hijacking' formal approaches into deviant contexts, the typically undocumented practical know-how required to make algorithms work, the productivity of a playfully irreverent relationship between engineering and artistic approaches to NIME, and a sketch of a sonocybernetic aesthetics for our work.

    CREATING TOUCHPANEL GRAPHICS FOR CONTROL SYSTEMS

    More often than system designers would like to admit, a discrepancy lies between the implementation of audiovisual control systems and their apparent ease of use for a novice or casual user. System designers and programmers are often hampered by the software tools provided by industry manufacturers and cannot reliably create desirable graphical interfaces that match the level of the systems they are asked to program and install. Popular consumer trends in portable touchscreen devices, pioneered on devices such as the Apple iPhone, light a way forward into a new era of elegantly solving the audiovisual control system graphical user interface problem. Since expensive specialized hardware can be replaced by readily available consumer devices, and a wide variety of tools exists with which to create content, possible alternatives to the current methods of designing the graphical user interface for the audiovisual system are ripe for discovery. Using the latest release of Autodesk Maya 2011, with features such as Python and PyMEL, we have developed scripts to generate graphical user interface content for use with audiovisual control systems hardware. Also explored is the potential for a standalone development environment, such that audiovisual designers and programmers are not required to operate Maya or adjust scripts to generate content. Given this new level of control over the graphical user interface, coupled with the flexibility of the control system central processor programming, a truly powerful, intuitive, and groundbreaking control interface can finally be realized.