Tangible user interfaces : past, present and future directions
In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from the cognitive sciences, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.
Visualization and interaction in a simulation system for flood emergencies
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Mestre em Engenharia Informática. This thesis presents an interaction and visualization system for a river flood emergency simulation. It also presents a detailed study of forms of visual representation of critical elements in emergencies. All these elements are currently assembled in an application based on geographic information systems and agent simulation. Many of the goals of this thesis are interconnected with project Life-Saver, which aims to develop an emergency response simulator that needs a visualization and interaction system.
The main goals of this thesis are to create a visualization system for an emergency, to design an intuitive multimedia interface, and to implement new forms of human-computer interaction.
At the application level there is a representation of the simulation scenario with the multiple agents and their actions. Several studies were made to create an intuitive interface. New forms of multimedia interaction are studied and used, such as interactive touch-sensitive boards and multi-touch panels. It is possible to load and retrieve geographic information on the scenario. The resulting architecture is used to visualize a simulation of an emergency flooding situation in a scenario where the Alqueva dam on the Guadiana river fails.
An electronic architecture for mediating digital information in a hallway façade
Ubiquitous computing requires integration of physical space with digital information. This presents the challenges of integrating electronics, physical space, software, and interaction tools that can effectively communicate with the audience. Many research groups have embraced different techniques, depending on location, context, space, and the availability of necessary skills, to make the world around us an interface to the digital world. Encouraged by early successes and fostered by projects undertaken by the tangible visualization group, we introduce an architecture of Blades and Tiles for the development and realization of interactive wall surfaces. It provides an inexpensive, open-ended platform for constructing large-scale tangible and embedded interfaces. In this paper, we propose tiles built using inexpensive pegboards and a gateway for each of these tiles to provide access to digital information. The paper describes the architecture using a corridor façade application. The corridor façade uses full-spectrum LEDs, physical labels and stencils, and capacitive touch sensors to provide mediated representation, monitoring, and querying of physical and digital content. Example contents include the physical and online status of people and the activity and dynamics of online research content repositories. Several complementary devices, such as Microsoft PixelSense and smart devices, can support additional user interaction with the system. This enables interested people in synergistic physical environments to observe, explore, understand, and engage in ongoing activities and relationships. This paper describes the hardware architecture and software libraries employed and how they are used in our research center hallway and academic semester projects.
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel virtual objects by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR
floating in front of the camera as seen from a first person perspective. We
believe that representing each user as a full-body avatar that is controlled by
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
Touching artefacts in an ancient world on a browser-based platform
Published version. Original citation: Arnab, S., Petridis, P., Dunwell, I. and de Freitas, S. (2010). Touching artefacts in an ancient world on a browser-based platform. In Y. Xiao, T. Amon & R. Muffolett
Light on horizontal interactive surfaces: Input space for tabletop computing
In the last 25 years we have witnessed the rise and growth of interactive tabletop research, in both academic and industrial settings. The rising demand for digital support of human activities motivated the need to bring computational power to table surfaces. In this article, we review the state of the art of tabletop computing, highlighting core aspects that frame the input space of interactive tabletops: (a) developments in hardware technologies that have caused the proliferation of interactive horizontal surfaces, and (b) issues related to new classes of interaction modalities (multitouch, tangible, and touchless). A classification is presented that aims to give a detailed view of the current development of this research area and to define opportunities and challenges for novel touch- and gesture-based interactions between the human and the surrounding computational environment. © 2014 ACM. This work has been funded by the Integra (Amper Sistemas and CDTI, Spanish Ministry of Science and Innovation) and TIPEx (TIN2010-19859-C03-01) projects and the Programa de Becas y Ayudas para la Realización de Estudios Oficiales de Máster y Doctorado en la Universidad Carlos III de Madrid, 2010.