
    Tangible user interfaces : past, present and future directions

    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from the cognitive sciences, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.

    Intimate interfaces in action: assessing the usability and subtlety of emg-based motionless gestures

    Mobile communication devices, such as mobile phones and networked personal digital assistants (PDAs), allow users to be constantly connected and communicate anywhere and at any time, often resulting in personal and private communication taking place in public spaces. This private–public contrast can be problematic. As a remedy, we promote intimate interfaces: interfaces that allow subtle and minimal mobile interaction, without disruption of the surrounding environment. In particular, motionless gestures sensed through the electromyographic (EMG) signal have been proposed as a solution to allow subtle input in a mobile context. In this paper we present an expansion of the work on EMG-based motionless gestures including (1) a novel study of their usability in a mobile context for controlling a realistic, multimodal interface and (2) a formal assessment of how noticeable they are to informed observers. Experimental results confirm that subtle gestures can be profitably used within a multimodal interface and that it is difficult for observers to guess when someone is performing a gesture, confirming the hypothesis of subtlety.
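    As a rough illustration of how a motionless gesture might be detected from an EMG signal, the sketch below thresholds a smoothed signal envelope. The sampling rate, window length, and threshold are assumptions chosen for the example, not parameters from the paper.

```python
import numpy as np

def detect_gesture(emg, fs=1000, win_ms=200, threshold=0.15):
    """Return True if a sustained muscle contraction (a 'motionless gesture')
    appears in a raw EMG window.

    Assumed pipeline (not the authors' exact method): remove the DC offset,
    rectify the signal, compute a moving RMS envelope, and compare it
    against a fixed threshold.
    """
    win = int(fs * win_ms / 1000)                       # samples per RMS window
    rectified = np.abs(emg - np.mean(emg))              # remove offset, rectify
    # Moving RMS envelope via convolution with a uniform window.
    envelope = np.sqrt(np.convolve(rectified**2, np.ones(win) / win, mode="valid"))
    # Flag a gesture when the envelope exceeds the threshold for at least
    # half of the analysis window.
    return np.mean(envelope > threshold) > 0.5

# Synthetic example: low-amplitude rest noise vs. a stronger contraction.
rng = np.random.default_rng(0)
rest = rng.normal(0, 0.05, 2000)
contraction = rng.normal(0, 0.3, 2000)
print(detect_gesture(rest), detect_gesture(contraction))  # False True
```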

    A mobile fitness companion

    The paper introduces a Mobile Companion prototype, which helps users to plan and keep track of their exercise activities via an interface based mainly on speech input and output. The Mobile Companion runs on a PDA and is based on a stand-alone, speaker-independent solution, making it fairly unique among mobile spoken dialogue systems, where the common solution is to run the ASR on a separate server or to restrict the speech input to some specific set of users. The prototype uses a GPS receiver to collect position, distance and speed data while the user is exercising, and allows the data to be compared to previous exercises. It communicates over the mobile network with a stationary system, placed in the user’s home. This allows plans for exercise activities to be downloaded from the stationary to the mobile system, and exercise result data to be uploaded once an exercise has been completed.
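    To illustrate the kind of processing such a prototype needs, the sketch below accumulates distance and average speed from successive GPS fixes using the haversine formula. The function names and the (timestamp, latitude, longitude) data format are assumptions for this example, not details from the paper.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def summarise_exercise(fixes):
    """Total distance (m) and average speed (m/s) from a list of
    (timestamp_s, lat, lon) fixes, as an exercise log might store them."""
    distance = 0.0
    for (_, la0, lo0), (_, la1, lo1) in zip(fixes, fixes[1:]):
        distance += haversine_m(la0, lo0, la1, lo1)
    duration = fixes[-1][0] - fixes[0][0]
    return distance, distance / duration if duration else 0.0

fixes = [(0, 59.3293, 18.0686), (60, 59.3300, 18.0700), (120, 59.3310, 18.0715)]
print(summarise_exercise(fixes))  # (total metres, average m/s)
```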

    An Evaluation of Input Controls for In-Car Interactions

    The way drivers operate in-car systems is rapidly changing as traditional physical controls, such as buttons and dials, are being replaced by touchscreens and touch-sensing surfaces. This has the potential to increase driver distraction and error as controls may be harder to find and use. This paper presents an in-car, on-the-road driving study which examined three key types of input control to investigate their effects: a physical dial, pressure-based input on a touch surface, and touch input on a touchscreen. The physical dial and pressure-based input were also evaluated with and without haptic feedback. The study was conducted with users performing a list-based targeting task using the different controls while driving on public roads. Eye-gaze was recorded to measure distraction from the primary task of driving. The results showed that target accuracy was high across all input methods (greater than 94%). Pressure-based targeting was the slowest, while directly tapping on the targets was the fastest selection method. Pressure-based input also caused the largest number of glances towards the touchscreen, but the duration of each glance was shorter than when directly touching the screen. Our study will enable designers to make more appropriate design choices for future in-car interactions.
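    The distraction measure described here (number and duration of glances toward the touchscreen) can be derived from a gaze log roughly as sketched below. The sample format and the region-of-interest test are hypothetical, not the study's actual analysis pipeline.

```python
def glance_stats(samples, dt=1 / 60):
    """Count glances toward the screen and report their mean duration.

    `samples` is a sequence of booleans, one per gaze sample taken every
    `dt` seconds, True when gaze falls on the touchscreen region of interest.
    """
    glances, current = [], 0
    for on_screen in samples:
        if on_screen:
            current += 1
        elif current:
            glances.append(current * dt)   # glance ended, record its duration
            current = 0
    if current:
        glances.append(current * dt)       # close a glance that ran to the end
    mean = sum(glances) / len(glances) if glances else 0.0
    return len(glances), mean

# Example: two glances of 3 and 5 samples, sampled at 60 Hz.
samples = [False] * 10 + [True] * 3 + [False] * 20 + [True] * 5 + [False] * 5
print(glance_stats(samples))  # (2, mean glance duration in seconds)
```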

    The sound motion controller: a distributed system for interactive music performance

    We developed an interactive system for music performance, able to control sound parameters responsively with respect to the user’s movements. The system is conceived as a mobile application, provided with beat tracking and expressive parameter modulation, that interacts with motion sensors and effector units connected to a music output such as synthesizers or sound effects. We describe the various ways the system can be used and what it achieves, with the aim of increasing the expressiveness of music performance and aiding music interaction. The results obtained outline a first level of integration and point toward related future cognitive and technological research.
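    One simple way such a system can map motion to a sound parameter is to scale an accelerometer-derived intensity onto a controller range, as sketched below. The value ranges and the MIDI-style 0-127 output are assumptions for illustration, not details from the paper.

```python
def motion_to_cc(ax, ay, az, lo=9.0, hi=25.0):
    """Map accelerometer magnitude (m/s^2) to a 0-127 controller value.

    `lo` roughly corresponds to a device at rest (gravity only), `hi` to
    vigorous movement; both bounds are arbitrary calibration choices
    made for this sketch.
    """
    magnitude = (ax * ax + ay * ay + az * az) ** 0.5
    normalised = (magnitude - lo) / (hi - lo)
    normalised = min(max(normalised, 0.0), 1.0)   # clamp into [0, 1]
    return round(normalised * 127)                # e.g. drives modulation depth

print(motion_to_cc(0.0, 0.0, 9.81))   # near 0: device at rest
print(motion_to_cc(12.0, 8.0, 15.0))  # larger value: energetic gesture
```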

    Ambient Gestures

    We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous ‘in the environment’ interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
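    The scripting layer described here essentially maps recognised gestures to application commands. A minimal sketch of such a dispatch table is shown below; the gesture names and actions are invented for illustration and are not the system's actual vocabulary.

```python
from typing import Callable, Dict

def next_track() -> None:
    print("skipping to next track")

def pause_playback() -> None:
    print("pausing playback")

# Hypothetical mapping from recognised gesture names to actions.
GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "swipe_right": next_track,
    "palm_hold": pause_playback,
}

def handle_gesture(name: str) -> None:
    """Dispatch a recognised gesture to its action, giving feedback
    (here just printed, in place of audio) so no graphical interface
    is needed."""
    action = GESTURE_ACTIONS.get(name)
    if action is None:
        print(f"unrecognised gesture: {name}")
        return
    action()

handle_gesture("swipe_right")
handle_gesture("wave")  # falls through to the unrecognised branch
```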

    Tactor devices : using tactile interface designs for mobile digital appliances : a practice-based research thesis for the fulfilment of a Master of Design degree, College of Design, Fine Arts, and Music, Massey University, Wellington

    This thesis focuses on the potential of communication interfaces that use tactors (tactile actuators) to improve user interactions with mobile digital devices, which are currently based on audio and visual technologies. It presents two product concepts that use tactile signals to enable new forms of tele-operation, such as tactile telecommunication and tactile navigation. Tactor interfaces, although still in their infancy as elements of modern digital communication and technology, have considerable potential for the future as designers attempt to maximise the use of all human senses in people's interaction with technology. Only the military and a few entertainment companies have introduced tactile signals into Human-Computer Interaction (HCI). Human touch perception uses the hands as the main sensing organs. They perceive tactile signals while handling, typing or navigating with digital devices and receive direct confirmation of physical actions. In contrast to the other senses, touch perception is based on interaction with the sensed objects. The study analyses, experiments with, and evaluates whether these interactions are useful in interface designs, and recommends how tactile stimulation can be introduced into interface designs alongside the images and sounds that dominate the control of current digital appliances. Tactile actuators and sensors enable devices to use tactile signals, such as impulses and vibrations, to communicate with their users. Users and tactor devices will be able to communicate in a physical and direct way. Touch-reflective interfaces could react like living creatures that respond to touch, for example a cat that starts purring when touched. Digital product design is always challenged to create human-computer interactions that meet people's needs. Designing digital devices is difficult because they are not necessarily three-dimensional objects. They are stimulators of the human senses and can be as small as the sensing nerve endings that detect sensations. Through miniaturisation, form and function become invisible, and Product Design increasingly incorporates Process Design, which explores and enables new interactions between users and products so that they work interactively and efficiently. The study is divided into four chapters: Chapter 1 gives an introduction to the thesis. Chapter 2 presents a survey of current literature which examines the five human senses to define the limits and possibilities of interface design. It reviews current research on materials and technologies as well as the psychology and physiology of touch as a potential sense in human-computer interaction. It evaluates the technical feasibility of tactile signal performances and how they could be used as tele-touch codes in navigation and telecommunication. Chapter 3 focuses on primary research undertaken to extend knowledge of tactile sensing. It includes experiments, questionnaires, and concepts that give examples of how tactor interfaces can be used in tele-operations. This section focuses on specific user groups that may primarily benefit from tactile signal transmission, such as sight- and hearing-impaired people or professionals who have to deal with limited perception, like fire fighters, for example. These case studies are aimed at exploring and expanding a wider range of possibilities for tactile device innovations in the networked society. Chapter 4 gives a conclusion of the research.
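    As an illustration of the tele-touch code idea, the sketch below renders a small set of signals as timed vibration pulse patterns for a single tactor. The code book, pulse lengths, and the `vibrate` stub are assumptions made for this example, not content from the thesis.

```python
import time

# Hypothetical code book: each signal becomes a list of (on_ms, off_ms) pulses.
TOUCH_CODES = {
    "turn_left":     [(100, 100), (100, 400)],  # two short pulses
    "turn_right":    [(400, 400)],              # one long pulse
    "incoming_call": [(100, 100)] * 3,          # three quick pulses
}

def vibrate(duration_ms: int) -> None:
    """Stand-in for driving a tactile actuator; a real device would pulse
    a vibration motor here instead of merely sleeping."""
    time.sleep(duration_ms / 1000)

def play_code(signal: str) -> None:
    """Play the pulse pattern associated with a signal on the tactor."""
    for on_ms, off_ms in TOUCH_CODES[signal]:
        vibrate(on_ms)               # actuator on
        time.sleep(off_ms / 1000)    # gap between pulses

play_code("turn_left")
```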