47 research outputs found

    Multi-Scale 3D Scene Flow from Binocular Stereo Sequences

    Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras, by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking into account the uncertainty in these intermediate stages allows more reliable estimation of the 3D scene flow than previous methods. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization, two problems commonly associated with basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach. National Science Foundation (CNS-0202067, IIS-0208876); Office of Naval Research (N00014-03-1-0108).
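
    As a rough illustration of the geometry involved (not the paper's probabilistic, multi-scale algorithm), the sketch below combines a left-camera optical flow field with disparity maps at two time instants to produce per-pixel 3D motion vectors. It assumes a rectified stereo pair with known focal lengths, principal point, and baseline; the function and parameter names (backproject, scene_flow_from_stereo, fx, fy, cx, cy, baseline) are illustrative, and flow is assumed to be an (h, w, 2) array.

        import numpy as np

        def backproject(u, v, d, fx, fy, cx, cy, baseline):
            """Back-project pixels (u, v) with disparity d into 3D using the
            rectified-stereo relation Z = fx * baseline / d."""
            Z = fx * baseline / np.maximum(d, 1e-6)   # guard against zero disparity
            X = (u - cx) * Z / fx
            Y = (v - cy) * Z / fy
            return np.stack([X, Y, Z], axis=-1)

        def scene_flow_from_stereo(disp_t, disp_t1, flow, fx, fy, cx, cy, baseline):
            """Per-pixel 3D motion: reconstruct each point at time t and at t+1
            (correspondence given by the left-camera optical flow) and subtract."""
            h, w = disp_t.shape
            u, v = np.meshgrid(np.arange(w, dtype=float), np.arange(h, dtype=float))
            P_t = backproject(u, v, disp_t, fx, fy, cx, cy, baseline)
            u1, v1 = u + flow[..., 0], v + flow[..., 1]        # flowed position at t+1
            ui = np.clip(np.round(u1), 0, w - 1).astype(int)   # nearest-neighbour disparity lookup
            vi = np.clip(np.round(v1), 0, h - 1).astype(int)
            P_t1 = backproject(u1, v1, disp_t1[vi, ui], fx, fy, cx, cy, baseline)
            return P_t1 - P_t                                  # (h, w, 3) scene flow field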

    Reducing Drift in Parametric Motion Tracking

    We develop a class of differential motion trackers that automatically stabilize when in finite domains. Most differential trackers compute motion only relative to one previous frame, accumulating errors indefinitely. We estimate pose changes between a set of past frames, and develop a probabilistic framework for integrating those estimates. We use an approximation to the posterior distribution of pose changes as an uncertainty model for parametric motion in order to help arbitrate the use of multiple base frames. We demonstrate this framework on a simple 2D translational tracker and a 3D, 6-degree-of-freedom tracker.
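
    The arbitration idea can be illustrated with a generic inverse-covariance (precision-weighted) fusion of the pose predictions coming from several base frames. This is only a sketch of the general principle under a Gaussian-uncertainty assumption, not the paper's posterior approximation, and every name in it is hypothetical.

        import numpy as np

        def fuse_pose_estimates(base_poses, deltas, covariances):
            """Fuse several predictions of the current pose, one per base frame:
            prediction_k = base_poses[k] + deltas[k], weighted by inverse covariance."""
            dim = len(base_poses[0])
            information = np.zeros((dim, dim))       # accumulated information matrix
            information_mean = np.zeros(dim)
            for pose_k, delta_k, cov_k in zip(base_poses, deltas, covariances):
                prediction = pose_k + delta_k
                precision = np.linalg.inv(cov_k)     # confident estimates get more weight
                information += precision
                information_mean += precision @ prediction
            fused_cov = np.linalg.inv(information)
            return fused_cov @ information_mean, fused_cov

        # two base frames predict the current 2D translation with different confidence
        poses  = [np.array([10.0, 5.0]), np.array([12.0, 4.0])]
        deltas = [np.array([ 2.1, 0.9]), np.array([ 0.2, 2.2])]
        covs   = [np.eye(2) * 0.5,       np.eye(2) * 2.0]
        fused_pose, fused_cov = fuse_pose_estimates(poses, deltas, covs)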

    Automatic enhancement of noisy image sequences through local spatio-temporal spectrum analysis

    Contains 13 illustrations, 2 tables, and formulas. A fully automatic method is proposed to produce an enhanced image from a very noisy sequence consisting of a translating object over a background with a different translational motion. The method is based on averaging registered versions of the frames in which the object has been motion compensated. Conventional techniques for displacement estimation are not adequate for these very noisy sequences, so a new strategy has been used that takes advantage of the simple model of the sequences. First, the local spatio-temporal spectrum is estimated through a bank of multidirectional/multiscale third-order Gaussian derivative filters, yielding a representation of the sequence that facilitates further processing and analysis tasks. Then, energy-related measurements describing the local texture and motion are easily extracted from this representation. These descriptors are used to segment the sequence based on a local joint measure of motion and texture. Once the object of interest has been segmented, its velocity is estimated by applying the gradient constraint to the output of a directional band-pass filter for all pixels belonging to the object. Velocity estimates are then used to compensate the motion prior to averaging. The results obtained with real sequences of moving ships taken under very noisy conditions are highly satisfactory, demonstrating the robustness and usefulness of the proposed method. Supported by the Comisión Interministerial de Ciencia y Tecnología of Spain, grant TIC98-0925-C02-01. Peer reviewed.
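
    A stripped-down version of the final two stages (single-translation velocity estimation from the gradient constraint, followed by motion-compensated averaging) might look like the sketch below. It assumes the segmentation stage has already produced an object mask, estimates derivatives directly on the images rather than on the paper's directional band-pass filter output, and presumes small frame-to-frame motion so that a fixed mask stays approximately valid; all names are illustrative.

        import numpy as np
        from scipy.ndimage import shift as subpixel_shift

        def estimate_velocity(prev, cur, mask):
            """Least-squares solution of the gradient constraint
            Ix*u + Iy*v + It = 0 over the masked (object) pixels."""
            Iy, Ix = np.gradient(prev)               # spatial derivatives
            It = cur - prev                          # temporal derivative
            A = np.stack([Ix[mask], Iy[mask]], axis=1)
            b = -It[mask]
            (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
            return u, v

        def motion_compensated_average(frames, mask):
            """Align every frame to the first one by accumulating the estimated
            object translation, then average the aligned frames to reduce noise."""
            ref = frames[0].astype(float)
            acc = ref.copy()
            prev, total_u, total_v = ref, 0.0, 0.0
            for frame in frames[1:]:
                cur = frame.astype(float)
                u, v = estimate_velocity(prev, cur, mask)
                total_u, total_v = total_u + u, total_v + v
                acc += subpixel_shift(cur, (-total_v, -total_u))  # shift back, (row, col) order
                prev = cur
            return acc / len(frames)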

    Statistics of natural image sequences: temporal motion smoothness by local phase correlations


    Dynamic Visual Motion Estimation
