
    Evaluation of Physical Finger Input Properties for Precise Target Selection

    The multitouch tabletop display provides a collaborative workspace for multiple users around a table. Users can perform direct and natural multitouch interaction to select target elements using their bare fingers. However, the physical size of the fingertip varies from one person to another, which introduces the fat finger problem: imprecise selection of small target elements during direct multitouch input. In this respect, an attempt is made to evaluate the physical finger input properties, i.e. contact area and shape, in the context of imprecise selection
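The contact-area property discussed above can be illustrated with a minimal sketch: one common way a touch system reduces a broad fingertip contact region to a single selection point is to take the centroid of the sensed contact blob. This is an illustrative assumption, not the paper's own evaluation method; `contact_centroid` is a hypothetical helper.

```python
import numpy as np

def contact_centroid(mask):
    """Centroid (row, col) of a binary fingertip-contact mask.

    Illustrative only: a simple way to collapse a fat-finger contact
    area to one selection point; the paper's evaluation of contact
    area and shape is more involved than this.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no contact sensed
    return float(ys.mean()), float(xs.mean())

# A square contact blob over rows/cols 1..3 yields its centre as the point.
mask = np.zeros((8, 8), dtype=bool)
mask[1:4, 1:4] = True
print(contact_centroid(mask))  # → (2.0, 2.0)
```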

    PianoTable: Musical Interaction on a Tabletop Display


    Freeform 3D interactions in everyday environments

    PhD Thesis. Personal computing is continuously moving away from traditional input using mouse and keyboard, as new input technologies emerge. Recently, natural user interfaces (NUI) have led to interactive systems that are inspired by our physical interactions in the real world, and focus on enabling dexterous freehand input in 2D or 3D. Another recent trend is Augmented Reality (AR), which follows a similar goal of further reducing the gap between the real and the virtual, but predominantly focuses on output, by overlaying virtual information onto a tracked real-world 3D scene. Whilst AR and NUI technologies have been developed for both immersive 3D output as well as seamless 3D input, these have mostly been looked at separately. NUI focuses on sensing the user and enabling new forms of input; AR traditionally focuses on capturing the environment around us and enabling new forms of output that are registered to the real world. The output of NUI systems is mainly presented on a 2D display, while the input technologies for AR experiences, such as data gloves and body-worn motion trackers, are often uncomfortable and restricting when interacting in the real world. NUI and AR can be seen as very complementary, and bringing these two fields together can lead to new user experiences that radically change the way we interact with our everyday environments. The aim of this thesis is to enable real-time, low-latency, dexterous input and immersive output without heavily instrumenting the user. The main challenge is to retain and to meaningfully combine the positive qualities that are attributed to both NUI and AR systems. I review work in the intersecting research fields of AR and NUI, and explore freehand 3D interactions with varying degrees of expressiveness, directness and mobility in various physical settings. There are a number of technical challenges that arise when designing a mixed NUI/AR system, which I address in this work: What can we capture, and how? How do we represent the real in the virtual? And how do we physically couple input and output? This is achieved by designing new systems, algorithms, and user experiences that explore the combination of AR and NUI

    Bringing the Physical to the Digital

    This dissertation describes an exploration of digital tabletop interaction styles, with the ultimate goal of informing the design of a new model for tabletop interaction. In the context of this thesis the term digital tabletop refers to an emerging class of devices that afford many novel ways of interacting with the digital, allowing users to directly touch information presented on large, horizontal displays. Being a relatively young field, many developments are in flux; hardware and software change at a fast pace and many interesting alternative approaches are available at the same time. In our research we are especially interested in systems that are capable of sensing multiple contacts (e.g., fingers) and richer information such as the outline of whole hands or other physical objects. New sensor hardware enables new ways to interact with the digital. When embarking on the research for this thesis, which interaction styles could be appropriate for this new class of devices was an open question, with many equally promising answers. Many everyday activities rely on our hands' ability to skillfully control and manipulate physical objects. We seek to open up different possibilities to exploit our manual dexterity and provide users with richer interaction possibilities. This could be achieved through the use of physical objects as input mediators or through virtual interfaces that behave in a more realistic fashion. In order to gain a better understanding of the underlying design space we chose an approach organized into two phases. First, two different prototypes, each representing a specific interaction style – namely gesture-based interaction and tangible interaction – were implemented. The flexibility of use afforded by the interface and the level of physicality afforded by the interface elements are introduced as criteria for evaluation. Each approach's suitability to support the highly dynamic and often unstructured interactions typical for digital tabletops is analyzed based on these criteria. In a second stage the lessons learned from these initial explorations are applied to inform the design of a novel model for digital tabletop interaction. This model is based on the combination of rich multi-touch sensing and a three-dimensional environment enriched by a gaming physics simulation. The proposed approach enables users to interact with the virtual through richer quantities such as collision and friction, enabling a variety of fine-grained interactions using multiple fingers, whole hands and physical objects. Our model makes digital tabletop interaction even more “natural”. However, because the interaction – the sensed input and the displayed output – is still bound to the surface, there is a fundamental limitation in manipulating objects using the third dimension. To address this issue, we present a technique that allows users to – conceptually – pick objects off the surface and control their position in 3D. Our goal has been to define a technique that completes our model for on-surface interaction and allows for interactions that are “as direct as possible”. We also present two hardware prototypes capable of sensing the users' interactions beyond the table's surface. Finally, we present visual feedback mechanisms to give users the sense that they are actually lifting the objects off the surface. This thesis contributes on various levels. We present several novel prototypes that we built and evaluated. We use these prototypes to systematically explore the design space of digital tabletop interaction. The flexibility of use afforded by the interaction style is introduced as a criterion alongside the physicality of the user interface elements. Each approach's suitability to support the highly dynamic and often unstructured interactions typical for digital tabletops is analyzed. We present a new model for tabletop interaction that increases the fidelity of interaction possible in such settings. Finally, we extend this model so as to enable as-direct-as-possible interactions with 3D data, interacting from above the table's surface

    A Tabletop Board Game Interface for Multi-User Interaction with a Storytelling System

    The Interactive Storyteller is an interactive storytelling system with a multi-user tabletop interface. Our goal was to design a generic framework combining emergent narrative, where stories emerge from the actions of autonomous intelligent agents, with the social aspects of traditional board games. As a visual representation of the story world, a map is displayed on a multi-touch table. Users can interact with the story by touching an interface on the table surface with their fingers and by moving tangible objects that represent the characters. This type of interface, where multiple users are gathered around a table with equal access to the characters and the story world, offers a more social setting for interaction than most existing interfaces for AI-based interactive storytelling

    Supporting Reflection and Classroom Orchestration with Tangible Tabletops

    Tangible tabletop systems have repeatedly been shown to enhance participation and engagement and to enable many exciting activities, particularly in the education domain. However, it remains unclear whether students really benefit from using them for tasks that require a high level of reflection. Moreover, most existing tangible tabletops are designed as stand-alone systems or devices. Increasingly, this design assumption is no longer sufficient, especially in realistic learning settings. Due to the technological evolution in schools, multiple activities, resources, and constraints in the classroom ecosystem are now involved in the learning process. The way teachers manage technology-enhanced classrooms and the involved activities and constraints in real time, also known as classroom orchestration, is a crucial aspect for the materialization of reflection and learning. This thesis aims to explore how educational tangible tabletop systems affect reflection, how reflection and orchestration are related, and how we can support reflection and orchestration to improve learning. It presents the design, implementation, and evaluation of three tangible tabletop systems – the DockLamp, the TinkerLamp, and the TinkerLamp 2.0 – in different learning contexts. Our experience with these systems, both inside and outside the laboratory, results in an insightful understanding of the impact of tangible tabletops on learning and the conditions for their effective use and deployment. These findings can benefit researchers and designers of learning environments using tangible tabletops and similar interfaces

    deForm: An interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch

    We introduce a novel input device, deForm, that supports 2.5D touch gestures, tangible tools, and arbitrary objects through real-time structured light scanning of a malleable surface of interaction. DeForm captures high-resolution surface deformations and 2D grey-scale textures of a gel surface through a three-phase structured light 3D scanner. This technique can be combined with IR projection to allow for invisible capture, providing the opportunity for co-located visual feedback on the deformable surface. We describe methods for tracking fingers, whole hand gestures, and arbitrary tangible tools. We outline a method for physically encoding fiducial marker information in the height map of tangible tools. In addition, we describe a novel method for distinguishing between human touch and tangible tools, through capacitive sensing on top of the input surface. Finally, we motivate our device through a number of sample applications
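The three-phase structured light scanning mentioned in the abstract builds on a standard three-step phase-shifting technique: from three images of the surface lit by sinusoidal patterns shifted by -120°, 0°, and +120°, the wrapped phase can be recovered per pixel, and from it (after unwrapping and calibration) the surface height map. A minimal sketch of just that phase-recovery step, not deForm's full pipeline:

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three intensity images shifted by -120/0/+120 degrees.

    Standard three-step phase-shifting formula; deForm's actual pipeline
    adds phase unwrapping, calibration, and gel-specific processing that
    are not reproduced here.
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic check: a known phase ramp is recovered exactly (within wrapping).
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 50)
a, b = 0.5, 0.4  # ambient light and fringe modulation terms
i1 = a + b * np.cos(phi - 2.0 * np.pi / 3.0)
i2 = a + b * np.cos(phi)
i3 = a + b * np.cos(phi + 2.0 * np.pi / 3.0)
assert np.allclose(wrapped_phase(i1, i2, i3), phi)
```

The `arctan2` form is insensitive to the ambient term `a` and the modulation amplitude `b`, which is why three shifted patterns suffice per pixel.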