
    Real-time vision-based microassembly of 3D MEMS.

    Robotic microassembly is a promising way to fabricate three-dimensional (3D) compound products from micrometric components in cases where the materials or the technologies are incompatible: structures, devices, Micro Electro Mechanical Systems (MEMS), Micro Opto Electro Mechanical Systems (MOEMS), etc. To date, the solutions proposed in the literature rely on 2D visual control because accurate and robust 3D measurements of the work scene are lacking. In this paper, the relevance of real-time 3D visual tracking and control is demonstrated. The 3D pose of the MEMS is supplied in real time by a model-based tracking algorithm that is accurate and robust enough to enable precise regulation of a 3D error toward zero using a visual servoing approach. The assembly of 400 µm × 400 µm × 100 µm parts by their 100 µm × 100 µm × 100 µm notches, with a mechanical play of 3 µm, is achieved at a rate of 41 seconds per assembly. The control accuracy reaches 0.3 µm in position and 0.2° in orientation.
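
    Although the abstract does not give the control law, driving a 3D pose error to zero with visual servoing is classically done with a pose-based visual servoing (PBVS) scheme. The following Python/NumPy sketch shows one step of such a controller under the usual exponential-decrease law v = -λe; it illustrates the general technique, not the paper's implementation, and the function name and gain value are hypothetical.

        import numpy as np

        def pbvs_velocity(t, R, t_star, R_star, lam=0.5):
            """One step of pose-based visual servoing (illustrative sketch).

            t, R           -- current pose from the model-based tracker
            t_star, R_star -- desired pose at the assembly configuration
            lam            -- proportional gain (hypothetical value)

            Returns a 6-vector [v; w] of translational and rotational
            velocity commands for the manipulator.
            """
            # Translation error expressed in the desired frame.
            e_t = R_star.T @ (t - t_star)

            # Rotation error as an axis-angle vector theta*u of R_star^T R.
            dR = R_star.T @ R
            cos_theta = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
            theta = np.arccos(cos_theta)
            if theta < 1e-9:
                e_r = np.zeros(3)
            else:
                e_r = theta / (2.0 * np.sin(theta)) * np.array([
                    dR[2, 1] - dR[1, 2],
                    dR[0, 2] - dR[2, 0],
                    dR[1, 0] - dR[0, 1],
                ])

            # Exponential decrease of the error: v = -lambda * e.
            return -lam * np.concatenate([e_t, e_r])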

    Real-time 3D model-based tracking: Combining edge and texture information

    This paper proposes a real-time, robust and efficient 3D model-based tracking algorithm. A nonlinear minimization approach is used to register 2D and 3D cues for monocular 3D tracking. Integrating texture information into the classical nonlinear edge-based pose computation significantly increases the reliability of the conventional edge-based 3D tracker. Robustness is enforced by integrating an M-estimator into the minimization process via an iteratively re-weighted least squares implementation. The method has been validated on several video sequences as well as in visual servoing experiments with various objects. Results show the method to be robust to large motions and textured environments.
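
    The M-estimator integrated via iteratively re-weighted least squares (IRLS) is a standard robustification and is easy to sketch. The snippet below solves a robust linear system with the common Tukey biweight and a MAD scale estimate; the tracker applies the same weighting pattern inside its nonlinear pose minimization, so this is an illustration of the technique rather than the authors' code.

        import numpy as np

        def tukey_weights(r, c=4.685):
            """Tukey biweight: down-weights large residuals, zeroing outliers."""
            sigma = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # MAD scale
            u = r / (c * sigma)
            w = (1.0 - u ** 2) ** 2
            w[np.abs(u) >= 1.0] = 0.0  # residuals beyond c*sigma get zero weight
            return w

        def irls(A, b, iters=10):
            """Robustly solve A x ~ b by iteratively re-weighted least squares."""
            x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary LS initialization
            for _ in range(iters):
                r = A @ x - b
                sw = np.sqrt(tukey_weights(r))
                # Re-solve the weighted system; weights shrink outlier influence.
                x = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
            return x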

    Model-based Real-time Visualization of Realistic Three-Dimensional Heat Maps for Mobile Eye Tracking and Eye Tracking in Virtual Reality

    Pfeiffer T, Memili C. Model-based Real-time Visualization of Realistic Three-Dimensional Heat Maps for Mobile Eye Tracking and Eye Tracking in Virtual Reality. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. New York, NY, USA: ACM Press; 2016: 95-102.

    Heat maps, or more generally attention maps or saliency maps, are a frequently used technique for visualizing eye-tracking data. With heat maps, qualitative information about visual processing can be easily visualized and communicated between experts and laymen. They are thus a versatile tool for many disciplines, in particular for usability engineering, and are often used to get a first overview of recorded eye-tracking data. Today, heat maps are typically generated for 2D stimuli presented on a computer display. In such cases, the mapping of overt visual attention onto the stimulus is rather straightforward and the process is well understood. However, when turning toward mobile eye tracking and eye tracking in 3D virtual environments, the case is much more complicated. In the first part of the paper, we discuss several challenges that have to be considered in 3D environments, such as changing perspectives, multiple viewers, object occlusions, depth of fixations, and dynamically moving objects. In the second part, we present an approach for generating 3D heat maps that addresses the above-mentioned issues while working in real time. Our visualizations provide high-quality output for multi-perspective eye-tracking recordings of visual attention in 3D environments.
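
    One core ingredient of such 3D heat maps, accumulating attention on scene geometry from 3D fixations, can be sketched compactly. The snippet below assumes occlusion testing and fixation-depth estimation have already produced 3D fixation points, and accumulates duration-weighted Gaussian attention per mesh vertex; all names and parameters are assumptions for the illustration, not the authors' real-time implementation.

        import numpy as np

        def accumulate_attention(vertices, fixations, durations, sigma=0.01):
            """Naive per-vertex attention accumulation (illustrative sketch).

            vertices  -- (V, 3) mesh vertex positions
            fixations -- (F, 3) 3D fixation points on the scene geometry
            durations -- (F,) fixation durations used as weights
            sigma     -- spatial spread of a fixation, in scene units
            """
            heat = np.zeros(len(vertices))
            for f, d in zip(fixations, durations):
                d2 = np.sum((vertices - f) ** 2, axis=1)      # squared distances
                heat += d * np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian falloff
            # Normalize to [0, 1] so the values can drive a color ramp.
            return heat / (heat.max() + 1e-12)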

    Online Structured Learning for Real-Time Computer Vision Gaming Applications

    In recent years computer vision has played an increasingly important role in the development of computer games, and it now features as one of the core technologies for many gaming platforms. The work in this thesis addresses three problems in real-time computer vision, all of which are motivated by their potential application to computer games. We first present an approach for real-time 2D tracking of arbitrary objects. In common with recent research in this area we incorporate online learning to provide an appearance model which is able to adapt to the target object and its surrounding background during tracking. However, our approach moves beyond the standard framework of tracking using binary classification and instead integrates tracking and learning in a more principled way through the use of structured learning. As well as providing a more powerful framework for adaptive visual object tracking, our approach also outperforms state-of-the-art tracking algorithms on standard datasets. Next we consider the task of keypoint-based object tracking. We take the traditional pipeline of matching keypoints followed by geometric verification and show how this can be embedded into a structured learning framework in order to provide principled adaptivity to a given environment. We also propose an approximation method allowing us to take advantage of recently developed binary image descriptors, meaning our approach is suitable for real-time application even on low-powered portable devices. Experimentally, we clearly see the benefit that online adaptation using structured learning can bring to this problem. Finally, we present an approach for approximately recovering the dense 3D structure of a scene which has been mapped by a simultaneous localisation and mapping system. Our approach is guided by the constraints of the low-powered portable hardware we are targeting, and we develop a system which coarsely models the scene using a small number of planes. To achieve this, we frame the task as a structured prediction problem and introduce online learning into our approach to provide adaptivity to a given scene. This allows us to use relatively simple multi-view information coupled with online learning of appearance to efficiently produce coarse reconstructions of a scene.
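
    To make the structured-tracking idea concrete: instead of classifying image patches as object or background, the tracker scores candidate transformations y of the previous box with a learned function w·φ(x, y) and predicts the argmax. The sketch below shows only this prediction step with a linear scorer over a translation grid; the feature function and the online structured update, which are the substance of the approach, are left abstract, and all names are hypothetical.

        import numpy as np

        def predict_box(frame, prev_box, w, feat, radius=30, step=5):
            """Prediction step of a structured tracker (illustrative sketch).

            frame    -- current image
            prev_box -- (x, y, w, h) bounding box from the previous frame
            w        -- weight vector learned online by the structured learner
            feat     -- feature function: (frame, box) -> feature vector
            """
            x0, y0, bw, bh = prev_box
            best_score, best_box = -np.inf, prev_box
            # The structured output space: translations within a search radius.
            for dx in range(-radius, radius + 1, step):
                for dy in range(-radius, radius + 1, step):
                    box = (x0 + dx, y0 + dy, bw, bh)
                    score = w @ feat(frame, box)  # linear structured score
                    if score > best_score:
                        best_score, best_box = score, box
            # The online learner would then be updated around best_box.
            return best_box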

    Articulation estimation and real-time tracking of human hand motions

    Schröder M. Articulation estimation and real-time tracking of human hand motions. Bielefeld: Universität Bielefeld; 2015.

    This thesis deals with the problem of estimating and tracking the full articulation of human hands. Algorithmically recovering hand articulations is a challenging problem due to the hand's high number of degrees of freedom and the complexity of its motions. Besides the accuracy and efficiency of the hand posture estimation, hand tracking methods are faced with issues such as invasiveness, ease of deployment and sensor artifacts. In this thesis several different hand tracking approaches are examined, including marker-based optical motion capture, data-driven discriminative visual tracking and generative tracking based on articulated registration, and various contributions to these areas are presented. The problem of optimally placing reduced marker sets on a performer's hand for optical hand motion capture is explored. A method is proposed that automatically generates functional reduced marker layouts by optimizing for their numerical stability and geometric feasibility. A data-driven discriminative tracking approach based on matching the hand's appearance in the sensor data with an image database is investigated. In addition to an efficient nearest neighbor search for images, a combination of discriminative initialization and generative refinement is employed. The method's applicability is demonstrated in interactive robot teleoperation. Various real human hand motions are captured and statistically analyzed to derive low-dimensional representations of hand articulations. An adaptive hand posture subspace concept is developed and integrated into a generative real-time hand tracking approach that aligns a virtual hand model with sensor point clouds based on constrained inverse kinematics. Generative hand tracking is formulated as a regularized articulated registration process, in which geometrical model fitting is combined with statistical, kinematic and temporal regularization priors. A registration concept that combines 2D and 3D alignment and explicitly accounts for occlusions and visibility constraints is devised. High-quality, non-invasive, real-time hand tracking is achieved based on this regularized articulated registration formulation.
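
    The regularized articulated registration can be summarized as minimizing, once per frame, a geometric data term plus prior terms. The sketch below writes out such an objective with a brute-force closest-point data term and quadratic temporal and statistical priors; the pose parameterization, the weights and the model_points function are assumptions for illustration, and a real-time tracker would minimize this with Gauss-Newton or a similar solver rather than evaluate it naively.

        import numpy as np

        def registration_energy(theta, model_points, cloud, theta_prev, mu,
                                lam_temp=0.1, lam_stat=0.01):
            """Regularized registration objective (illustrative sketch).

            theta        -- hand pose parameters (global pose + joint angles)
            model_points -- function mapping theta to (N, 3) hand surface points
            cloud        -- (M, 3) sensor point cloud
            theta_prev   -- pose from the previous frame (temporal prior)
            mu           -- mean pose of the posture subspace (statistical prior)
            """
            pts = model_points(theta)
            # Data term: distance of each model point to its nearest cloud point.
            dists = np.min(np.linalg.norm(cloud[None, :, :] - pts[:, None, :],
                                          axis=2), axis=1)
            e_data = np.sum(dists ** 2)
            # Temporal smoothness and statistical pose priors keep the fit
            # close to the previous frame and to plausible hand postures.
            e_temp = lam_temp * np.sum((theta - theta_prev) ** 2)
            e_stat = lam_stat * np.sum((theta - mu) ** 2)
            return e_data + e_temp + e_stat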