
    The Tabletop is Dead? - Long Live the Table's Top!

    Research on interactive tabletop displays has shown much promise for collaborative scenarios. However, tabletops never became a commercial success and rarely exist outside the research community. Relatively expensive, heavy, and immobile hardware, together with the limited availability of commercial applications, are some of the reasons these systems never made it into our offices or living rooms. The introduction of multi-touch smartphones and tablets, with their smaller form factor, better mobility, support for multi-touch interaction, and an app ecosystem, made large interactive surfaces look bulky and outdated. There is, however, a shift towards an increasing number of mobile and ad-hoc scenarios in which mobile devices are used on a table's top.

    3DTouch: A wearable 3D input device with an optical sensor and a 9-DOF inertial measurement unit

    We present 3DTouch, a novel wearable 3D input device worn on the fingertip for 3D manipulation tasks. 3DTouch is designed to fill the gap for a 3D input device that is self-contained, mobile, and works universally across various 3D platforms. This paper presents a low-cost approach to designing and implementing such a device, relying on a relative positioning technique that combines an optical laser sensor with a 9-DOF inertial measurement unit. The device employs touch input for the benefits of passive haptic feedback and movement stability, and with touch interaction 3DTouch is conceptually less fatiguing to use over many hours than 3D spatial input devices. We propose a set of 3D interaction techniques, including selection, translation, and rotation, using 3DTouch. An evaluation demonstrates the device's tracking accuracy of 1.10 mm and 2.33 degrees for subtle touch interaction in 3D space. Modular solutions like 3DTouch open up a whole new design space for interaction techniques to build on. Comment: 8 pages, 7 figures.
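
    The abstract describes fusing the optical sensor's relative 2D displacement with the 9-DOF IMU's orientation estimate. As a rough illustration of that idea (not the paper's actual algorithm; the fuse_displacement helper and axis conventions are assumptions), one might rotate the sensor-plane displacement into world coordinates like this:

import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z), e.g. from the IMU, to a rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def fuse_displacement(dx_mm, dy_mm, orientation_quat):
    """Lift a 2D optical-sensor displacement (measured in the fingertip's
    contact plane) into a 3D displacement using the IMU orientation."""
    local = np.array([dx_mm, dy_mm, 0.0])      # motion within the sensor plane
    return quat_to_matrix(orientation_quat) @ local

# Example: a 1 mm 'forward' swipe while the fingertip plane is pitched 90 degrees
# about the x axis; the swipe comes out as upward motion in world coordinates.
q_90x = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])
print(fuse_displacement(0.0, 1.0, q_90x))      # -> approximately [0, 0, 1]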

    Augmented touch interactions with finger contact shape and orientation

    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of - even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom - the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions - but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen, and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures - a result that was confirmed in another study that used the augmented touches for a screen-lock application.
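
    To make the "two shapes, three orientations" vocabulary concrete, here is a minimal sketch of how a contact-ellipse reading might be quantised into such categories; the thresholds, bin edges, and category names are illustrative assumptions, not values from the study:

def classify_touch(major_mm, minor_mm, angle_deg):
    """Quantise a contact ellipse into two shapes and three orientation bins.
    The 1.5 aspect-ratio cut-off and the bin edges are illustrative."""
    # Shape: a roughly circular contact reads as a fingertip ("tip"),
    # an elongated contact as a flattened finger ("pad").
    shape = "tip" if major_mm / max(minor_mm, 1e-6) < 1.5 else "pad"

    # Orientation: fold the ellipse angle into [0, 180) degrees and bin it.
    angle = angle_deg % 180.0
    if angle < 30 or angle >= 150:
        orientation = "horizontal"
    elif 60 <= angle < 120:
        orientation = "vertical"
    else:
        orientation = "diagonal"
    return shape, orientation

print(classify_touch(major_mm=14.0, minor_mm=8.0, angle_deg=85.0))  # ('pad', 'vertical')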

    The aptness of tangible user interfaces for explaining abstract computer network principles

    The technological deployment of Tangible User Interfaces (TUIs), with their intrinsic ability to interlink the physical and digital domains, has steadily gained interest within the educational sector. As a concrete example of Reality-Based Interaction, such digital manipulatives have been successfully used in recent years to introduce scientific and engineering concepts at earlier stages of the educational cycle. In contrast to previous literature, this research investigates the suitability and effectiveness of implementing a TUI system to enhance the learning experience in a higher-education environment. The proposal targets the understanding of advanced computer networking principles through the deployment of an interactive tabletop system. Beyond the mere simulation and modelling of networking topologies, the design gives students the ability to directly interact with and visualise protocol execution, thereby augmenting their ability to understand the abstract nature of such algorithms. Following deployment of the proposed innovative prototype within the delivery of a university undergraduate programme, the quantitative effectiveness of this novel methodology will be assessed from both a teaching and a learning perspective in terms of its ability to convey the abstract notions of computer network principles.

    Interaction With Tilting Gestures In Ubiquitous Environments

    In this paper, we introduce a tilting interface that controls direction-based applications in ubiquitous environments. A tilt interface is useful for situations that require remote and quick interactions or that take place in public spaces. We explored the proposed tilting interface with different application types and classified the tilting interaction techniques. Augmenting objects with sensors can potentially address the lack of intuitive and natural input devices in ubiquitous environments. We conducted an experiment to test the usability of the proposed tilting interface and to compare it with conventional input devices and hand gestures. The results showed that tilt gestures outperformed hand gestures in terms of speed, accuracy, and user satisfaction. Comment: 13 pages, 10 figures.
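
    As a rough sketch of how a tilt reading could drive a direction-based application, the following maps a single accelerometer sample to a discrete command; the axis conventions and the 20-degree threshold are assumptions for illustration, not taken from the paper:

import math

def tilt_direction(ax, ay, az, threshold_deg=20.0):
    """Map a 3-axis accelerometer reading (gravity in g) to a discrete tilt command."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))

    if pitch > threshold_deg:
        return "forward"
    if pitch < -threshold_deg:
        return "backward"
    if roll > threshold_deg:
        return "right"
    if roll < -threshold_deg:
        return "left"
    return "neutral"

# Device tilted roughly 30 degrees to the right while otherwise level.
print(tilt_direction(ax=0.0, ay=0.5, az=0.87))  # -> 'right'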

    Supporting collaborative work using interactive tabletop

    PhD Thesis. Collaborative working is a key to success for organisations. People work together around tables at work, at home, at school, and in coffee shops. With the explosion of the internet and computer systems, a variety of tools exist to support collaboration in groups, such as groupware and tools for online meetings. In co-located, face-to-face meetings, however, facial expressions, body language, and verbal communication have a significant influence on the group decision-making process, and people often have a natural preference for traditional pen-and-paper decision-support solutions in such situations. It is therefore a challenge to implement tools that rely on advanced technological interfaces, such as interactive multi-touch tabletops, to support collaborative work. This thesis proposes a novel tabletop application to support group work and investigates the effectiveness and usability of the proposed system. The requirements for the developed system are based on a review of previous literature and on requirements elicited from potential users. The innovative aspect of our system is that it allows the use of personal devices, which give participants some level of privacy during group work. We expect that the personal devices may contribute to the effectiveness of using tabletops to support collaborative work. For the evaluation experiment, we chose the collaborative development of mind maps by groups, which has been investigated earlier as a representative form of collaborative work. Two controlled laboratory experiments were designed to examine the usability features and associated emotional attitudes for the tabletop mind-map application in comparison with the conventional pen-and-paper approach in the context of collaborative work. The evaluation clearly indicates that the combination of the tabletop and personal devices supports and encourages multiple people working collaboratively. The comparison of the associated emotional attitudes indicates that the interactive tabletop facilitates the active involvement of participants in group decision making significantly more than the pen-and-paper condition. The work reported here contributes significantly to our understanding of the usability and effectiveness of interactive tabletop applications in the context of supporting collaborative work. The Royal Thai Government.

    Emerging ExG-based NUI Inputs in Extended Realities: A Bottom-up Survey

    Incremental, quantitative improvements in two-way interaction with extended realities (XR) are contributing to a qualitative leap toward XR ecosystems that are efficient, user-friendly, and widely adopted. However, there are multiple barriers on the way to the omnipresence of XR, among them the computational and power limitations of portable hardware, the social acceptance of novel interaction protocols, and the usability and efficiency of interfaces. In this article, we overview and analyse novel natural user interfaces based on sensing electrical bio-signals that can be leveraged to tackle the challenges of XR input interaction. Electroencephalography-based brain-machine interfaces that enable thought-only, hands-free interaction; myoelectric input methods that track body gestures using electromyography; and gaze-tracking electrooculography interfaces are examples of electrical bio-signal sensing technologies united under the collective concept of ExG. ExG signal acquisition modalities provide a way to interact with computing systems using natural, intuitive actions, enriching interaction with XR. This survey provides a bottom-up overview starting from (i) the underlying biological aspects and signal acquisition techniques, (ii) ExG hardware solutions, (iii) ExG-enabled applications, (iv) a discussion of the social acceptance of such applications and technologies, and (v) research challenges, application directions, and open problems, evidencing the benefits that ExG-based natural user interface inputs can introduce to the area of XR. Peer reviewed.
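
    For a concrete, if simplified, picture of one ExG input stage, the sketch below band-pass filters a surface-EMG trace and thresholds its RMS envelope to detect muscle activation; the filter band, window length, and threshold are illustrative choices, and the survey itself covers far richer pipelines:

import numpy as np
from scipy.signal import butter, filtfilt

def emg_activation(samples, fs=1000.0, band=(20.0, 450.0), rms_window=0.2, threshold=0.1):
    """Band-pass filter a raw EMG trace, compute a sliding RMS envelope,
    and report whether the envelope crosses an activation threshold."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, samples)
    win = int(rms_window * fs)
    envelope = np.sqrt(np.convolve(filtered ** 2, np.ones(win) / win, mode="same"))
    return bool(envelope.max() > threshold)

# Synthetic trace: low-amplitude noise with a burst of 'muscle activity' in the middle.
rng = np.random.default_rng(0)
trace = 0.01 * rng.standard_normal(1000)
trace[400:600] += 0.5 * np.sin(2 * np.pi * 100 * np.arange(200) / 1000.0)
print(emg_activation(trace))  # -> True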