
    Tangible user interfaces: past, present and future directions

    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.

    Proceedings of the international conference on cooperative multimodal communication CMC/95, Eindhoven, May 24-26, 1995: proceedings


    Collaborating through sounds: audio-only interaction with diagrams

    PhD thesis. The widening spectrum of interaction contexts and users' needs continues to expose the limitations of the Graphical User Interface. But despite the benefits of sound in everyday activities and considerable progress in Auditory Display research, audio remains under-explored in Human-Computer Interaction (HCI). This thesis seeks to contribute to unveiling the potential of using audio in HCI by building on and extending current research on how we interact with and through the auditory modality. Its central premise is that audio, by itself, can effectively support collaborative interaction with diagrammatically represented information. Before exploring audio-only collaborative interaction, two preliminary questions are raised: first, how to translate a given diagram into an alternative form that can be accessed in audio; and second, how to support audio-only interaction with diagrams through the resulting form. An analysis of diagrams that emphasises their properties as external representations is used to address the first question. This analysis informs the design of a multiple-perspective, hierarchy-based model that captures modality-independent features of a diagram when translating it into an audio-accessible form. Two user studies then address the second question by examining the feasibility of the developed model to support the activities of inspecting, constructing and editing diagrams in audio. The developed model is then deployed in a collaborative lab-based context. A third study explores audio-only collaboration by examining pairs of participants who use audio as the sole means to communicate, access and edit shared diagrams. The channels through which audio is delivered to the workspace are controlled, and the effect on the dynamics of the collaborations is investigated. Results show that pairs of participants are able to collaboratively construct diagrams through sounds.
Additionally, the presence or absence of audio in the workspace, and the way in which collaborators chose to work with audio, were found to impact patterns of collaborative organisation, awareness of contribution to shared tasks, and exchange of workspace awareness information. This work contributes to the areas of Auditory Display and HCI by providing empirically grounded evidence of how the auditory modality can be used to support individual and collaborative interaction with diagrams. Funded by the Algerian Ministry of Higher Education and Scientific Research (MERS).

    Multimodal information presentation for high-load human computer interaction

    This dissertation addresses the question: given an application and an interaction context, how can interfaces present information to users in a way that improves the quality of interaction (e.g. better user performance, lower cognitive demand and greater user satisfaction)? Information presentation is critical to the quality of interaction because it guides, constrains and even determines cognitive behavior. A good presentation is particularly desired in high-load human computer interactions, such as when users are under time pressure, under stress, or multi-tasking. Under a high mental workload, users may not have the spare cognitive capacity to cope with the unnecessary workload induced by a bad presentation. In this dissertation work, the major presentation factor of interest is modality. We have conducted theoretical studies in the cognitive psychology domain in order to understand the role of presentation modality in different stages of human information processing. Based on this theoretical guidance, we have conducted a series of user studies investigating the effect of information presentation (modality and other factors) in several high-load task settings. The two task domains are crisis management and driving. Using crisis scenarios, we investigated how to present information to facilitate time-limited visual search and time-limited decision making. In the driving domain, we investigated how to present highly urgent danger warnings and how to present informative cues that help drivers manage their attention between multiple tasks. The outcomes of this dissertation work have useful implications for the design of cognitively compatible user interfaces, and are not limited to high-load applications.

    Proceedings of the 1st EICS Workshop on Engineering Interactive Computer Systems with SCXML
