
    A Haptic Modeling System

    Haptics has been studied as a means of providing users with natural and immersive haptic sensations in various real, augmented, and virtual environments, but it is still relatively unfamiliar to the general public. One reason is the lack of abundant haptic content in areas familiar to the general public. Although some modeling tools do exist for creating haptic content, adding haptic data to graphic models is still relatively primitive, time-consuming, and unintuitive. In order to establish a comprehensive and efficient haptic modeling system, this chapter first defines the haptic modeling process and its scope. It then proposes a haptic modeling system that, based on depth images and an image data structure, can create and edit haptic content easily and intuitively for virtual objects. This system can also efficiently handle non-uniform haptic properties per pixel, and can effectively represent diverse haptic properties (stiffness, friction, etc.).
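    The idea of storing non-uniform, per-pixel haptic properties alongside a depth image can be illustrated with a minimal sketch. All names here (`HapticDepthImage`, `paint`, the default stiffness/friction values) are hypothetical, not taken from the chapter; the point is only that each pixel of the depth map carries its own editable property channels.

    ```python
    import numpy as np

    # Illustrative sketch: per-pixel haptic properties stored alongside a depth
    # image, so each surface point of a virtual object can feel different.
    class HapticDepthImage:
        def __init__(self, depth):
            self.depth = np.asarray(depth, dtype=float)   # depth map of the object
            h, w = self.depth.shape
            # one channel per haptic property; values may vary per pixel
            self.stiffness = np.full((h, w), 500.0)       # N/m, assumed default
            self.friction = np.full((h, w), 0.3)          # dimensionless, assumed

        def paint(self, prop, row, col, radius, value):
            """Edit one haptic property inside a square brush region."""
            channel = getattr(self, prop)
            r0, c0 = max(0, row - radius), max(0, col - radius)
            channel[r0:row + radius + 1, c0:col + radius + 1] = value

        def properties_at(self, row, col):
            """All haptic data for one pixel, as a rendering loop might query it."""
            return {"depth": self.depth[row, col],
                    "stiffness": self.stiffness[row, col],
                    "friction": self.friction[row, col]}

    img = HapticDepthImage(np.zeros((64, 64)))
    img.paint("stiffness", 32, 32, 4, 2000.0)   # make a small patch feel harder
    print(img.properties_at(32, 32)["stiffness"])  # 2000.0
    ```

    Keeping properties as parallel image channels is what makes per-pixel editing cheap: a brush stroke is just an array slice assignment.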

    Haptic Experience and the Design of Drawing Interfaces

    Haptic feedback has the potential to enhance users’ sense of being engaged and creative in their artwork. Current work on providing haptic feedback in computer-based drawing applications has focused mainly on the realism of the haptic sensation rather than on users’ experience of that sensation in the context of their creative work. We present a study that focuses on the user experience of three haptic drawing interfaces. These interfaces were based on two different haptic metaphors, one of which mimicked familiar drawing tools (such as pen, pencil, or crayon on smooth or rough paper) and the other of which drew on abstract descriptors of haptic experience (roughness, stickiness, scratchiness, and smoothness). We found that users valued having control over the haptic sensation; that each metaphor was preferred by approximately half of the participants; and that the real-world metaphor interface was considered more helpful than the abstract one, whereas the abstract interface was considered to better support creativity. This suggests that future interfaces for artistic work should offer user-modifiable interaction styles for controlling the haptic sensation.

    Painterly rendering techniques: A state-of-the-art review of current approaches

    In this publication we review the methods presented over the past few decades that attempt to recreate digital paintings. While previous surveys concentrate on the broader subject of non-photorealistic rendering, the focus of this paper is firmly placed on painterly rendering techniques. We compare methods used to produce different output painting styles, such as abstract, colour pencil, watercolour, oriental, oil, and pastel. Whereas some methods demand a high level of interaction from a skilled artist, others require only simple parameters provided by a user with little or no artistic experience. Many methods attempt to provide more automation through varying forms of reference data, which can range from still photographs and video to 3D polygonal meshes or even 3D point clouds. The techniques presented here endeavour to provide tools and styles that are not traditionally available to an artist. Copyright © 2012 John Wiley & Sons, Ltd.

    Combining 3-D geovisualization with force feedback driven user interaction

    We describe a prototype software system for investigating novel human-computer interaction techniques for 3-D geospatial data. This system, M4-Geo (Multi-Modal Mesh Manipulation of Geospatial data), aims to provide a more intuitive interface for directly manipulating 3-D surface data, such as digital terrain models (DTMs). The M4-Geo system operates within a 3-D environment and uses a Phantom haptic force feedback device to enhance 3-D computer graphics with touch-based interactions. The Phantom uses a 3-D force feedback stylus, which acts as a virtual “fingertip” that allows the user to feel the shape (morphology) of the terrain’s surface in great detail. In addition, it acts as a touch-sensitive tool for different GIS tasks, such as digitizing (draping) lines and polygons directly onto a 3-D surface and directly deforming surfaces (by pushing or pulling the stylus in or out). The user may adjust the properties of the surface deformation (e.g., soft or hard) locally by painting it with a special “material color.” The overlap of visual and force representations of 3-D data aids hand-eye coordination for these tasks and helps the user perceive the 3-D spatial data in a more holistic, multi-sensory way. The use of such a 3-D force feedback device for direct interaction may thus provide more intuitive and efficient alternatives to the mouse- and keyboard-driven interactions common today, particularly in areas related to digital landscape design, surface hydrology, and geotechnical engineering.
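    The abstract's idea of locally painted deformation properties can be sketched in a few lines. This is not the M4-Geo implementation; it is a hypothetical heightfield model in which a per-cell "softness" map (the painted material) scales how far a Gaussian-shaped stylus push or pull moves the terrain.

    ```python
    import numpy as np

    # Illustrative sketch: deform a digital terrain model (heightfield) with a
    # stylus-like brush; a painted per-cell softness map controls local response.
    def deform_dtm(heights, softness, row, col, force, radius=3):
        """Apply a Gaussian push/pull centred on (row, col).
        Positive force pulls the surface up; softer cells deform more."""
        heights = heights.copy()
        h, w = heights.shape
        ys, xs = np.mgrid[0:h, 0:w]
        brush = np.exp(-((ys - row) ** 2 + (xs - col) ** 2) / (2 * radius ** 2))
        return heights + force * brush * softness   # softness in [0, 1]

    dtm = np.zeros((32, 32))
    soft = np.full((32, 32), 0.2)   # mostly hard terrain
    soft[10:20, 10:20] = 1.0        # a patch painted with a "soft" material color
    out = deform_dtm(dtm, soft, 15, 15, force=5.0)
    ```

    Inside the soft patch the full force is applied (`out[15, 15]` is 5.0), while hard cells barely move, which mirrors the locally adjustable soft/hard behaviour the abstract describes.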

    deForm: An interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch

    We introduce a novel input device, deForm, that supports 2.5D touch gestures, tangible tools, and arbitrary objects through real-time structured light scanning of a malleable interaction surface. DeForm captures high-resolution surface deformations and 2D grey-scale textures of a gel surface through a three-phase structured light 3D scanner. This technique can be combined with IR projection to allow for invisible capture, providing the opportunity for co-located visual feedback on the deformable surface. We describe methods for tracking fingers, whole-hand gestures, and arbitrary tangible tools. We outline a method for physically encoding fiducial marker information in the height map of tangible tools. In addition, we describe a novel method for distinguishing between human touch and tangible tools through capacitive sensing on top of the input surface. Finally, we motivate our device with a number of sample applications.
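    The three-phase structured-light principle behind this kind of scanner can be shown with the standard three-step phase-shifting formula (this sketch illustrates the general technique, not deForm's own code): three sinusoidal patterns shifted by 120° are projected, and the wrapped phase recovered at each pixel encodes the surface deformation.

    ```python
    import numpy as np

    # Standard three-step phase-shifting: patterns at -120, 0, +120 degrees.
    # The per-pixel wrapped phase is recovered independently of albedo and
    # ambient light, which is what makes per-frame 3D capture feasible.
    def wrapped_phase(i1, i2, i3):
        """Wrapped phase from three 120-degree phase-shifted intensity images."""
        return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

    # Synthetic check: generate the three captures for a known phase field.
    phi = np.linspace(-1.0, 1.0, 5)                      # true phase (radians)
    shifts = [-2 * np.pi / 3, 0.0, 2 * np.pi / 3]
    i1, i2, i3 = (0.5 + 0.5 * np.cos(phi + s) for s in shifts)
    print(np.allclose(wrapped_phase(i1, i2, i3), phi))   # True
    ```

    In a real scanner the wrapped phase is then unwrapped and converted to height via calibration; the arctangent step above is the core of the reconstruction.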