79 research outputs found

    3DTouch: A wearable 3D input device with an optical sensor and a 9-DOF inertial measurement unit

    We present 3DTouch, a novel wearable 3D input device worn on the fingertip for 3D manipulation tasks. 3DTouch is designed to fill the gap for a 3D input device that is self-contained, mobile, and works universally across various 3D platforms. This paper presents a low-cost approach to designing and implementing such a device, relying on a relative positioning technique that combines an optical laser sensor with a 9-DOF inertial measurement unit. The device employs touch input for the benefits of passive haptic feedback and movement stability; compared with free-hand 3D spatial input devices, touch interaction also makes 3DTouch conceptually less fatiguing to use over many hours. We propose a set of 3D interaction techniques, including selection, translation, and rotation, using 3DTouch. An evaluation demonstrates the device's tracking accuracy of 1.10 mm and 2.33 degrees for subtle touch interaction in 3D space. Modular solutions like 3DTouch open up a whole new design space of interaction techniques to build on.
    Comment: 8 pages, 7 figures
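    The relative positioning idea in this abstract can be sketched as follows: a 2D displacement reported by the optical sensor, measured in the fingertip's local frame, is rotated into world coordinates using the orientation estimate from the 9-DOF IMU and accumulated into a 3D position. This is a minimal illustration, not the authors' implementation; the yaw-only rotation and all function names are assumptions (a real device would fuse full 3D orientation).

    ```python
    # Hypothetical sketch: rotate a sensor-frame optical (dx, dy) step into
    # the world frame using IMU orientation, then accumulate it. For
    # simplicity we assume orientation is a pure yaw angle; a real 9-DOF
    # fusion would supply a full 3D rotation.
    import numpy as np

    def rotation_from_yaw(yaw_rad):
        """Rotation about the world z-axis (stand-in for full IMU fusion)."""
        c, s = np.cos(yaw_rad), np.sin(yaw_rad)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    def integrate_position(pos, optical_dxdy, imu_rotation):
        """Accumulate a sensor-frame (dx, dy) step into a world-frame position."""
        step_local = np.array([optical_dxdy[0], optical_dxdy[1], 0.0])
        return pos + imu_rotation @ step_local

    # Drag 1 mm along the sensor's x-axis while the fingertip is yawed 90
    # degrees: the step should land on the world y-axis.
    pos = np.zeros(3)
    pos = integrate_position(pos, (1.0, 0.0), rotation_from_yaw(np.pi / 2))
    ```

    Because positioning is relative, drift accumulates; lifting the finger (clutching) resets the integration, much like a desktop mouse.
    
    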

    Multi Stage based Time Series Analysis of User Activity on Touch Sensitive Surfaces in Highly Noise Susceptible Environments

    This article proposes a multistage framework for time-series analysis of user activity on touch-sensitive surfaces in noisy environments. Multiple methods are combined in the framework, including the moving average, moving median, linear regression, kernel density estimation, partial differential equations, and the Kalman filter. The proposed three-stage filter, consisting of partial-differential-equation-based denoising, a Kalman filter, and a moving average, provides ~25% better noise reduction than the other methods by the Mean Squared Error (MSE) criterion in highly noise-susceptible environments. Apart from synthetic data, we also validated our algorithms on real-world data, such as handwriting and finger/stylus drags on touch screens, in the presence of strong noise sources such as unauthorized-charger noise or display noise. Furthermore, the proposed algorithm performs qualitatively better than existing solutions for the touch panels of high-end handheld devices on the consumer electronics market.
    Comment: 9 pages (including 9 figures and 3 tables); published in the International Journal of Computer Applications
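    The three-stage pipeline described above can be sketched in a few lines. This is an illustrative sketch only: the explicit heat-equation smoother, the scalar constant-position Kalman model, and all parameter values below are assumptions, not the authors' implementation.

    ```python
    # Sketch of a three-stage filter: PDE-based denoising, then a Kalman
    # filter, then a moving average. Models and parameters are illustrative.
    def heat_denoise(x, alpha=0.25, iters=10):
        """Explicit heat-equation smoothing: x_i += alpha*(x_{i-1} - 2x_i + x_{i+1})."""
        x = list(x)
        for _ in range(iters):
            x = [x[0]] + [x[i] + alpha * (x[i-1] - 2*x[i] + x[i+1])
                          for i in range(1, len(x) - 1)] + [x[-1]]
        return x

    def kalman_1d(z, q=1e-3, r=0.1):
        """Scalar constant-position Kalman filter over measurements z."""
        est, p, out = z[0], 1.0, []
        for m in z:
            p += q                   # predict: grow uncertainty
            k = p / (p + r)          # Kalman gain
            est += k * (m - est)     # update toward measurement
            p *= (1 - k)
            out.append(est)
        return out

    def moving_average(x, w=3):
        """Trailing moving average with window w."""
        return [sum(x[max(0, i - w + 1):i + 1]) / len(x[max(0, i - w + 1):i + 1])
                for i in range(len(x))]

    noisy = [0.0, 1.1, 0.9, 2.2, 1.8, 3.1, 2.9, 4.2]
    smooth = moving_average(kalman_1d(heat_denoise(noisy)))
    ```

    In practice each stage attacks a different noise character: the PDE stage removes high-frequency jitter, the Kalman stage tracks the underlying motion, and the moving average suppresses residual ripple.
    
    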

    Integrating 2D Mouse Emulation with 3D Manipulation for Visualizations on a Multi-Touch Table

    We present the Rizzo, a multi-touch virtual mouse designed to provide fine-grained interaction for information visualization on a multi-touch table. Our solution enables touch interaction for existing mouse-based visualizations. Previously, this transition to a multi-touch environment was difficult because the mouse emulation of touch surfaces is often insufficient to provide full information-visualization functionality. We present a unified design, combining many Rizzos, that not only provides mouse capabilities but also acts as a set of zoomable lenses that make precise information access feasible. The Rizzos and the information visualizations all exist within a touch-enabled 3D window management system. Our approach permits touch interaction both with the 3D windowing environment and with the contents of the individual windows contained therein. We describe an implementation of our technique that augments the VisLink 3D visualization environment to demonstrate how to enable multi-touch capabilities on all visualizations written with the popular prefuse visualization toolkit.

    A Text Selection Technique using Word Snapping

    The conventional copy-and-paste technique on touch-screen devices uses region handles to specify a text snippet. The region handles appear so as to select the initially tapped word, and the user then adjusts them. Although most text-selection work is performed at word boundaries, the minimum movement unit of a region handle is still a single character. We propose a context-sensitive text-selection method for tablet OSs. As an initial investigation, we studied a word-snapping method that uses a word as the minimum movement unit. Our experiment confirmed that the word-snapping method can significantly reduce text-selection time when the target text consists of one or two words and contains no line breaks.
    KES-2014: 18th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems, September 15-17, 2014, Gdynia, Poland
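    The core of word snapping is simple: instead of placing the selection handle at the nearest character, place it at the nearest word boundary. A minimal sketch, assuming whitespace-delimited words (the boundary-finding rule is an illustration, not the paper's exact method):

    ```python
    # Snap a raw selection-handle index to the nearest word boundary
    # instead of the nearest character. Whitespace tokenization is an
    # assumed, simplified boundary rule.
    import re

    def word_boundaries(text):
        """Indices where any word starts or ends (candidate snap targets)."""
        bounds = set()
        for m in re.finditer(r"\S+", text):
            bounds.add(m.start())
            bounds.add(m.end())
        return sorted(bounds)

    def snap(text, index):
        """Move a raw character index to the closest word boundary."""
        return min(word_boundaries(text), key=lambda b: abs(b - index))

    text = "select whole words quickly"
    # A drag ending inside "whole" (index 9) snaps to that word's start (7);
    # a drag ending near its tail (index 11) snaps to its end (12).
    start = snap(text, 9)
    end = snap(text, 11)
    ```

    The experiment's caveat follows naturally from this rule: when the target is only part of a word, or spans a line break, the snapped boundary no longer matches the user's intent, so character-level adjustment is still needed as a fallback.
    
    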

    Multi-touch 3D Exploratory Analysis of Ocean Flow Models

    Modern ocean flow simulations are generating increasingly complex, multi-layer 3D ocean flow models. However, most researchers still use traditional 2D visualizations to examine these models one slice at a time. Properly designed 3D visualization tools can be highly effective for revealing the complex, dynamic flow patterns and structures present in these models. However, the transition from visualizing ocean flow patterns in 2D to 3D presents many challenges, including occlusion and depth ambiguity. Further complications arise from the interaction methods required to navigate, explore, and interact with these 3D datasets. We present a system that combines stereoscopic rendering, to best reveal and illustrate 3D structures and patterns, with multi-touch interaction, to allow natural and efficient navigation and manipulation within the 3D environment. Exploratory visual analysis is facilitated through a highly interactive toolset that leverages a smart particle system. Multi-touch gestures allow users to quickly position dye-emitting tools within the 3D model. Finally, we illustrate the potential applications of our system through examples of real-world significance.

    Towards Gesture-based Process Modeling on Multi-Touch Devices

    Contemporary tools for business process modeling use menu-based interfaces for visualizing process models and interacting with them. However, purely menu-based interactions have been optimized for applications running on desktop computers and are limited in their use on multi-touch devices. At the same time, the growing presence of mobile devices in business life, together with their multi-touch capabilities, offers promising perspectives for intuitively defining and adapting business process models. Additionally, multi-touch tables could improve collaborative business process modeling based on natural gestures and interactions. In this paper, we present the results of an experiment investigating how users model business processes with multi-touch devices. Furthermore, we suggest a core gesture set enabling the easy definition and adaptation of business process models on these devices. Overall, gesture-based process modeling and multi-touch devices allow for new ways of (collaborative) business process modeling.