
    Multi Stage based Time Series Analysis of User Activity on Touch Sensitive Surfaces in Highly Noise Susceptible Environments

    This article proposes a multistage framework for time series analysis of user activity on touch-sensitive surfaces in noisy environments. Multiple methods are combined in the framework, including moving average, moving median, linear regression, kernel density estimation, partial differential equations, and the Kalman filter. The proposed three-stage filter, consisting of partial-differential-equation-based denoising, a Kalman filter, and a moving-average method, provides ~25% better noise reduction than the other methods according to the Mean Squared Error (MSE) criterion in highly noise-susceptible environments. Beyond synthetic data, we also validated our algorithms on real-world data, such as handwriting and finger/stylus drags on touch screens, in the presence of high noise such as unauthorized-charger noise or display noise. Furthermore, the proposed algorithm performs qualitatively better than the existing solutions in the touch panels of high-end handheld devices available in the consumer electronics market.
    Comment: 9 pages (including 9 figures and 3 tables); International Journal of Computer Applications (published
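    The abstract names the stages but gives no implementation details, so the cascade idea can only be sketched: a minimal 1D Kalman filter followed by a centred moving average, scored against the raw signal by MSE. All parameters (process noise q, measurement noise r, window width, noise level) are illustrative assumptions, and the PDE-denoising stage is omitted.

```python
import numpy as np

def moving_average(x, w=5):
    """Centred moving-average smoother (one stage of the cascade)."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def kalman_1d(z, q=0.01, r=0.04):
    """Minimal 1D Kalman filter: the touch coordinate is treated as a
    slowly varying state observed with measurement-noise variance r."""
    xhat = np.empty_like(z)
    x, p = z[0], 1.0
    for k, zk in enumerate(z):
        p += q                     # predict: process noise grows uncertainty
        g = p / (p + r)            # Kalman gain
        x += g * (zk - x)          # update with the new measurement
        p *= (1.0 - g)
        xhat[k] = x
    return xhat

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Synthetic drag trajectory with additive sensor noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * t)
noisy = clean + rng.normal(0.0, 0.2, t.size)

smoothed = moving_average(kalman_1d(noisy))
print(mse(noisy, clean), mse(smoothed, clean))  # cascade lowers the MSE
```

    The two stages complement each other: the Kalman filter tracks the trajectory causally, and the centred moving average removes residual jitter without adding lag.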

    Sensing and visualizing spatial relations of mobile devices

    Location information can be used to enhance interaction with mobile devices. While many location systems require instrumentation of the environment, we present a system that allows devices to measure their spatial relations in a true peer-to-peer fashion. The system is based on custom sensor hardware implemented as a USB dongle, and computes spatial relations in real time. As an extension of this system, we propose a set of spatialized widgets for incorporating spatial relations into the user interface. The use of these widgets is illustrated in a number of applications, showing how spatial relations can be employed to support and streamline interaction with mobile devices.
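    The abstract does not specify the sensing principle behind the dongles, so the geometry can only be sketched under an assumption; the version below assumes ultrasonic time-of-flight as one plausible realization of peer-to-peer ranging, with bearing from the arrival-time difference at two receivers.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degC

def peer_distance(t_roundtrip, t_turnaround):
    """Round-trip ranging between two peers: device A pings, device B
    replies after a known turnaround delay. No clock synchronization
    between the devices is required."""
    return SPEED_OF_SOUND * (t_roundtrip - t_turnaround) / 2.0

def peer_bearing(delta_t, baseline):
    """Bearing of a peer from the difference in arrival time at two
    receivers spaced `baseline` metres apart (far-field assumption)."""
    s = max(-1.0, min(1.0, SPEED_OF_SOUND * delta_t / baseline))
    return math.degrees(math.asin(s))

print(peer_distance(0.010, 0.004))  # -> 1.029 (metres)
print(peer_bearing(0.0, 0.10))      # -> 0.0 (degrees, dead ahead)
```

    Distance plus bearing is exactly the pairwise "spatial relation" a spatialized widget would consume, e.g. to point an on-screen arrow at a nearby peer.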

    Improving expressivity in desktop interactions with a pressure-augmented mouse

    Desktop-based Windows, Icons, Menus and Pointers (WIMP) interfaces have changed very little in the last 30 years, and are still limited by a lack of powerful and expressive input devices and interactions. In order to make desktop interactions more expressive and controllable, expressive input mechanisms like pressure input must be made available to desktop users. One way to provide pressure input to these users is through a pressure-augmented computer mouse; however, before pressure-augmented mice can be developed, design information must be provided to mouse developers. The problem we address in this thesis is the lack of ergonomics and performance information for the design of pressure-augmented mice. Our solution was to provide empirical performance and ergonomics information for pressure-augmented mice through five experiments. With the results of our experiments we were able to identify optimal design parameters for pressure-augmented mice and provide a set of recommendations for future pressure-augmented mouse designs.
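    The thesis's empirical parameters are not in the abstract, but one recurring design question for pressure input can be sketched: discretizing a continuous pressure reading into selectable levels, with hysteresis so sensor jitter near a level boundary does not make the selection flicker. The level count and thresholds below are illustrative assumptions, not the thesis's recommendations.

```python
class PressureQuantizer:
    """Discretize a continuous pressure reading in [0, 1] into n levels,
    with hysteresis so jitter near a boundary does not flicker the
    selected level. All parameters are illustrative assumptions."""

    def __init__(self, n_levels=4, hysteresis=0.05):
        self.n = n_levels
        self.h = hysteresis
        self.level = 0

    def update(self, p):
        p = min(max(p, 0.0), 1.0)
        width = 1.0 / self.n
        raw = min(int(p / width), self.n - 1)
        if raw != self.level:
            # the boundary the reading must clear, by at least h,
            # before we commit to the new level
            boundary = raw * width if raw > self.level else (raw + 1) * width
            if abs(p - boundary) > self.h:
                self.level = raw
        return self.level

q = PressureQuantizer()
print(q.update(0.26), q.update(0.31), q.update(0.24))  # -> 0 1 1
```

    The middle reading commits to level 1 only once it clears the 0.25 boundary by the hysteresis margin; the slight dip back to 0.24 stays inside the margin and does not revert the level.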

    The Mole: a pressure-sensitive mouse

    The traditional mouse enables the positioning of a cursor in a 2D plane, as well as interaction with binary elements within that plane (e.g., buttons, links, icons). While this basic functionality is sufficient for interacting with every modern computing environment, it makes little use of the human hand's ability to perform complex multi-directional movements. Devices developed to capture these multi-directional capabilities typically lack the familiar form and function of the mouse. This thesis details the design and development of a pressure-sensitive device called The Mole. The Mole retains the familiar form and function of the mouse while passively measuring the magnitude of normal hand force (i.e., downward force normal to the 2D operating surface). The measurement of this force lends itself to the development of novel interactions, far beyond what is possible with a typical mouse. This thesis demonstrates two such interactions: the positioning of a cursor in 3D space, and the simultaneous manipulation of cursor position and graphic tool parameters.
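    As a sketch of the first interaction, the normal force can be mapped to the cursor's depth coordinate. The force range, dead zone, and linear mapping below are assumptions for illustration, not The Mole's actual calibration.

```python
def mole_to_3d(x, y, force, f_min=0.1, f_max=4.0, z_range=1.0):
    """Map a Mole sample (planar position plus normal force, in newtons)
    to a 3D cursor position. f_min acts as a dead zone so the resting
    weight of the hand does not move the cursor in z; all constants are
    hypothetical, not The Mole's calibration."""
    f = min(max(force, f_min), f_max)
    z = (f - f_min) / (f_max - f_min) * z_range
    return (x, y, z)

print(mole_to_3d(5.0, 2.0, 0.05))  # light rest -> (5.0, 2.0, 0.0)
```

    The same normalized force value could instead drive a tool parameter, such as brush size, which is the second interaction the thesis demonstrates.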

    Information processing

    Information processing, for example by means of a computer, is served by a tablet in combination with an inputting implement such as a stylus or a human finger. Such a tablet-implement combination may serve in lieu of, or ancillary to, other inputting means such as a keyboard. Implement positional information, e.g., on contacting the tablet, depends upon interpolation between segmenting lines in the tablet. A cost advantage commensurate with resolution/noise desiderata is ascribable to the use of analog information (without digitization) in the interpolation.
    Published version
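    The interpolation idea can be sketched numerically (the patent performs it in analog circuitry, without digitizing the ratio). The linear signal-falloff model and the names below are hypothetical, chosen only to show how a position between two segmenting lines follows from their relative signal amplitudes.

```python
def interpolate_position(s_left, s_right, line_index, pitch=1.0):
    """Estimate implement position between two adjacent segmenting lines
    from their signal amplitudes: the position is weighted toward the
    line carrying the stronger signal. Assumes amplitude falls off
    linearly with distance from the contact point (hypothetical model)."""
    total = s_left + s_right
    if total == 0:
        raise ValueError("no signal on either line")
    frac = s_right / total          # 0 at the left line, 1 at the right
    return (line_index + frac) * pitch

print(interpolate_position(1.0, 1.0, 3, pitch=2.0))  # -> 7.0 (midway)
```

    Because only the ratio of the two amplitudes matters, the estimate is insensitive to overall signal strength, which is what makes an analog ratio circuit attractive at low cost.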

    Augmented Touch Interactions with Finger Contact Shape and Orientation

    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of - even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom - the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions - but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen, and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures - a result that was confirmed in a second study that used the augmented touches for a screen-lock application.
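    The abstract does not say how shape and orientation are computed from the contact area; one common way to derive such degrees of freedom from a contact patch is the principal axis of the contact pixels, sketched here as an assumption rather than the authors' implementation.

```python
import numpy as np

def contact_orientation(pixels):
    """Estimate finger-contact orientation and elongation from the set
    of (x, y) contact pixels, via the principal axis of their
    covariance. Returns (angle in degrees, in [0, 180); elongation
    ratio >= 1, where ~1 means a round contact)."""
    pts = np.asarray(pixels, dtype=float)
    pts -= pts.mean(axis=0)               # centre the patch
    cov = np.cov(pts.T)
    evals, evecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    major = evecs[:, 1]                   # principal (major) axis
    angle = np.degrees(np.arctan2(major[1], major[0])) % 180.0
    elongation = float(np.sqrt(evals[1] / max(evals[0], 1e-9)))
    return angle, elongation
```

    A thresholded elongation ratio would distinguish the "flat finger" versus "fingertip" shapes the study asks participants to produce, and the angle gives the orientation degree of freedom.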

    Barehand Mode Switching in Touch and Mid-Air Interfaces

    Raskin defines a mode as a distinct setting within an interface where the same user input will produce results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode-switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand-gesture-based 3D modelling application may have different modes for object creation, selection, and transformation; depending on the mode, the movement of the hand is interpreted differently. However, one of the crucial factors determining the effectiveness of an interface is user productivity, and the mode-switching time of different input techniques, whether in a touch interface or in a mid-air interface, affects that productivity. Moreover, when touch and mid-air interfaces like VR are combined, making informed decisions about mode assignment gets even more complicated. This thesis provides an empirical investigation to characterize the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step toward defining and evaluating the multi-faceted mode concept, its characteristics, and its utility when designing user interfaces more generally.
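    Raskin's definition can be made concrete with a minimal sketch: the same event maps to a different action depending on the current mode, and a dedicated input performs the switch - the transition whose cost the thesis measures. The modes and actions below are illustrative, echoing the VR modelling example, not a system from the thesis.

```python
# Hypothetical mode table: the same "drag" event means something
# different in each mode, per Raskin's definition of a mode.
MODES = {
    "create":    {"drag": "extrude new object"},
    "select":    {"drag": "rubber-band selection"},
    "transform": {"drag": "rotate selection"},
}
ORDER = ["create", "select", "transform"]

class ModeMachine:
    """Tiny state machine: 'switch' cycles modes; other events are
    dispatched through the current mode's mapping."""

    def __init__(self):
        self.mode = "create"

    def handle(self, event):
        if event == "switch":
            self.mode = ORDER[(ORDER.index(self.mode) + 1) % len(ORDER)]
            return f"mode -> {self.mode}"
        return MODES[self.mode].get(event, "ignored")

m = ModeMachine()
print(m.handle("drag"))    # -> extrude new object
print(m.handle("switch"))  # -> mode -> select
print(m.handle("drag"))    # -> rubber-band selection
```

    Each call to `handle("switch")` is one mode transition; measuring the wall-clock cost of issuing that input with different techniques (a touch tap versus a mid-air gesture, say) is the kind of comparison the thesis makes.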

    A literature review of User Interface interaction devices
