27 research outputs found

    Real-time finger registration for enriching multi-touch interfaces: with virtual mouse application

    We present a simple finger registration technique that can distinguish in real time which hand and which fingers of the user are touching the touchscreen. The finger registration process is activated whenever the user places a hand in any orientation anywhere on the touchscreen. Such a finger registration technique enables the design of multi-touch interfaces that directly map combinations of the user's fingers to the interface's operations. As an initial study, we demonstrate the usability of our finger registration method in a simple application. Specifically, we design a virtual mouse interface that enables the user to perform mouse operations in a relaxed manner, with fingers naturally curved and the idling fingers allowed to rest comfortably, as when using a physical mouse or trackball. The ability to distinguish individual contact fingers will open a new avenue for designing richer multi-touch interfaces.
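    To illustrate the kind of geometric reasoning such registration involves, the sketch below labels the five contacts of one hand using only their relative angles: the thumb sits across the widest angular gap from the pinky. This is a minimal heuristic under stated assumptions, not the paper's actual algorithm; the function name register_fingers and the right-hand, counter-clockwise ordering are ours.

```python
import math

def register_fingers(points):
    """Label five touch contacts of one hand (a heuristic sketch, not
    the paper's algorithm). Assumes a right hand whose fingers appear
    in counter-clockwise order; `points` holds five (x, y) positions."""
    cx = sum(x for x, _ in points) / 5.0
    cy = sum(y for _, y in points) / 5.0
    # Sort contacts by angle around their common centroid.
    by_angle = sorted(range(5),
                      key=lambda i: math.atan2(points[i][1] - cy,
                                               points[i][0] - cx))
    angles = [math.atan2(points[i][1] - cy, points[i][0] - cx)
              for i in by_angle]
    # The widest angular gap separates the thumb from the pinky.
    gaps = [(angles[(k + 1) % 5] - angles[k]) % (2 * math.pi)
            for k in range(5)]
    widest = max(range(5), key=lambda k: gaps[k])
    # Walk counter-clockwise starting just past the widest gap.
    order = [by_angle[(widest + 1 + m) % 5] for m in range(5)]
    names = ["thumb", "index", "middle", "ring", "pinky"]
    return {name: points[i] for name, i in zip(names, order)}
```

    A full implementation would also have to decide handedness and cope with fewer than five contacts, which this sketch deliberately ignores.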

    Dot Scissor: A Single-Click Interface for Mesh Segmentation

    This paper presents an easy-to-use interactive tool, which we call dot scissor, for mesh segmentation. The user's effort is reduced to placing only a single click where a cut is desired. Such a simple interface is made possible by a directional search strategy supported by a concavity-aware harmonic field and a robust voting scheme that selects the best isoline as the cut. With a concavity-aware weighting scheme, the harmonic fields gather dense isolines along concave regions, which are natural boundaries of semantic components. The voting scheme relies on an isoline-face scoring mechanism that considers both shape geometry and user intent. We show by extensive experiments and quantitative analysis that our tool advances state-of-the-art segmentation methods in both simplicity of use and segmentation quality.
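    The core numerical step, a harmonic field whose isolines cluster at concavities, can be sketched as follows. The exponential edge weighting, the parameter beta, and the helper name harmonic_field are illustrative assumptions; the paper's exact concavity-aware weights and its voting scheme are not reproduced here.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def harmonic_field(n, edges, concavity, click_verts, far_verts, beta=5.0):
    """Solve L f = 0 with f = 0 at the click and f = 1 far away.
    `concavity[e]` >= 0 measures how concave edge e is (0 = flat/convex);
    the exponential weighting is an assumed stand-in for the paper's
    concavity-aware weights."""
    W = sp.lil_matrix((n, n))
    for (i, j), c in zip(edges, concavity):
        w = np.exp(-beta * c)       # low conductance across concave edges
        W[i, j] = W[j, i] = w
    W = W.tocsr()
    L = (sp.diags(np.asarray(W.sum(axis=1)).ravel()) - W).tocsr()
    f = np.zeros(n)
    f[far_verts] = 1.0
    fixed = np.zeros(n, dtype=bool)
    fixed[click_verts] = fixed[far_verts] = True
    free = ~fixed
    # Eliminate the Dirichlet-constrained vertices and solve for the rest.
    A = L[free][:, free]
    b = -L[free][:, fixed] @ f[fixed]
    f[free] = spla.spsolve(A.tocsc(), b)
    return f                        # isolines of f are candidate cuts
```

    Because concave edges receive low weight (low conductance), the solved field changes steeply across them, which is exactly where the dense isolines, and hence the candidate cuts, accumulate.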

    Effective derivation of similarity transformations for implicit Laplacian mesh editing

    Laplacian coordinates have been employed as a local shape descriptor in mesh editing. Because they are encoded in the global coordinate system, they must be transformed locally to reflect the changed local features of the deformed surface. We present a novel implicit Laplacian editing framework that is linear and effectively captures local rotation information during editing. Directly representing rotation with respect to vertex positions in 3D space leads to a nonlinear system. Instead, we first compute the affine transformations implicitly defined for all the Laplacian coordinates by solving a large sparse linear system, and then extract the rotation and uniform-scaling information from each solved affine transformation. Unlike existing differential-based mesh editing techniques, our method produces visually pleasing deformation results under large-angle rotations or large-scale translations of handles. Additionally, to demonstrate the advantage of our editing framework, we introduce a new intuitive editing technique, called configuration-independent merging, which produces the same merging result independent of the relative position, orientation, and scale of the input meshes.
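    One standard way to pull a rotation and a uniform scale out of a solved affine transformation is a polar-style decomposition via the SVD; the snippet below sketches that general idea and is not necessarily the paper's exact extraction procedure.

```python
import numpy as np

def rotation_and_uniform_scale(A):
    """Split a 3x3 affine matrix A into a rotation R and an isotropic
    scale s, discarding shear and anisotropy (a common approximation
    for near-rigid deformations)."""
    U, S, Vt = np.linalg.svd(A)
    R = U @ Vt
    if np.linalg.det(R) < 0:      # guard against reflections
        U[:, -1] = -U[:, -1]
        R = U @ Vt
    s = S.mean()                  # average singular value as uniform scale
    return R, s
```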

    Multitouch Gestures for Constrained Transformation of 3D Objects

    3D transformation widgets allow constrained manipulation of 3D objects and are commonly used in many 3D applications for fine-grained manipulation. Since traditional transformation widgets have been designed mainly for mouse-based systems, they are not user friendly on multitouch screens. There is little research on how to use the extra input bandwidth of multitouch screens to ease constrained transformation of 3D objects. This paper presents a small set of multitouch gestures that offers seamless control of manipulation constraints (i.e., axis or plane) and modes (i.e., translation, rotation, or scaling). Our technique requires no complex manipulation widgets, only candidate axes, which serve for visualization rather than direct manipulation. This design not only minimizes visual clutter but also tolerates imprecise touch-based input. To further expand our axis-based interaction vocabulary, we introduce intuitive touch gestures for relative manipulations, including snapping and borrowing the axes of another object. A preliminary evaluation shows that our technique is more effective than a direct adaptation of standard transformation widgets to the tactile paradigm.
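    To make axis-based control concrete, here is a generic sketch of one constrained mode: translating an object along a chosen axis by projecting the 2D touch drag onto the axis's screen-space direction. The column-vector matrix convention, the helper names, and the pixel mapping are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def screen_point(p_world, view_proj, viewport):
    """Project a world point to pixel coordinates (assumed 4x4
    view-projection matrix, column-vector convention)."""
    p = view_proj @ np.append(p_world, 1.0)
    ndc = p[:2] / p[3]
    w, h = viewport
    return np.array([(ndc[0] * 0.5 + 0.5) * w, (ndc[1] * 0.5 + 0.5) * h])

def constrained_translate(pos, axis, drag, view_proj, viewport):
    """Move `pos` along the unit world-space `axis` so the motion
    follows the 2D touch displacement `drag` (in pixels)."""
    a0 = screen_point(pos, view_proj, viewport)
    a1 = screen_point(pos + axis, view_proj, viewport)
    dir2d = a1 - a0
    px_per_unit = np.linalg.norm(dir2d)
    if px_per_unit < 1e-6:        # axis points at the camera; ignore drag
        return pos
    # Project the drag onto the axis direction, converted to world units.
    t = np.dot(drag, dir2d / px_per_unit) / px_per_unit
    return pos + t * axis
```

    The same projection trick supports the snapping and axis-borrowing gestures described above: a borrowed axis simply substitutes another object's axis vector for `axis`.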