Integration of the 3D Environment for UAV Onboard Visual Object Tracking
Single visual object tracking from an unmanned aerial vehicle (UAV) poses
fundamental challenges such as object occlusion, small-scale objects,
background clutter, and abrupt camera motion. To tackle these difficulties, we
propose to integrate the 3D structure of the observed scene into a
detection-by-tracking algorithm. We introduce a pipeline that combines a
model-free visual object tracker, a sparse 3D reconstruction, and a state
estimator. The 3D reconstruction of the scene is computed with an image-based
Structure-from-Motion (SfM) component that enables us to leverage a state
estimator in the corresponding 3D scene during tracking. By representing the
position of the target in 3D space rather than in image space, we stabilize the
tracking during ego-motion and improve the handling of occlusions, background
clutter, and small-scale objects. We evaluated our approach on prototypical
image sequences, captured from a UAV with low-altitude oblique views. For this
purpose, we adapted an existing dataset for visual object tracking and
reconstructed the observed scene in 3D. The experimental results demonstrate
that the proposed approach outperforms methods using plain visual cues as well
as approaches leveraging image-space-based state estimations. We believe that
our approach can be beneficial for traffic monitoring, video surveillance, and
navigation.

Comment: Accepted in MDPI Journal of Applied Sciences
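The abstract's core idea is to run the state estimator over the target's 3D position (recovered via SfM) rather than its image-space position, so that ego-motion and occlusions do not destabilize the track. The paper does not specify the estimator; the sketch below assumes a constant-velocity Kalman filter over a 3D state, with the 2D-to-3D lifting step (intersecting the tracker's output with the sparse reconstruction) left out. All class and parameter names are illustrative, not the authors' API.

```python
import numpy as np

class KalmanTracker3D:
    """Illustrative constant-velocity Kalman filter over a 3D target position.

    State: [x, y, z, vx, vy, vz]. Measurements are 3D points, assumed to be
    obtained by lifting 2D tracker output into the SfM reconstruction
    (that lifting step is not shown here).
    """

    def __init__(self, dt=1.0, process_var=1e-2, meas_var=1e-1):
        self.x = np.zeros(6)                       # state estimate
        self.P = np.eye(6)                         # state covariance
        self.F = np.eye(6)                         # transition model
        self.F[:3, 3:] = dt * np.eye(3)            # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.Q = process_var * np.eye(6)           # process noise
        self.R = meas_var * np.eye(3)              # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]                          # predicted 3D position

    def update(self, z):
        """z: 3D measurement; skip this call while the target is occluded."""
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]
```

During an occlusion one would call `predict()` alone, letting the 3D motion model coast through the gap; this is one plausible mechanism behind the improved occlusion handling the abstract reports.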
Multi-Scale 3D Scene Flow from Binocular Stereo Sequences
Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras, by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking the uncertainty of these intermediate stages into account allows more reliable estimation of the 3D scene flow than previous methods achieve. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization – two problems commonly associated with basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach.

Funding: National Science Foundation (CNS-0202067, IIS-0208876); Office of Naval Research (N00014-03-1-0108)
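A key point in this abstract is carrying uncertainty from the intermediate stages (flow and disparity distributions) into the final 3D estimate. As a minimal sketch of that idea, the snippet below propagates disparity variance into depth and into the out-of-plane scene-flow component via first-order (delta-method) error propagation. The focal length, baseline, and function names are illustrative assumptions, not the paper's actual formulation, which operates on full per-pixel distributions within a regularized framework.

```python
import numpy as np

def depth_with_var(d, var_d, f=700.0, b=0.12):
    """Depth from stereo disparity, Z = f*b/d, with first-order variance
    propagation. f (focal length, px) and b (baseline, m) are illustrative."""
    Z = f * b / d
    var_Z = (f * b / d**2) ** 2 * var_d   # |dZ/dd|^2 * var(d)
    return Z, var_Z

def scene_flow_z(d1, var_d1, d2, var_d2, f=700.0, b=0.12):
    """Out-of-plane scene-flow component dZ = Z2 - Z1 with its variance.

    Keeping a variance alongside every estimate mirrors the abstract's idea
    of accounting for uncertainty in intermediate stages instead of
    committing to point estimates of flow and disparity."""
    Z1, v1 = depth_with_var(d1, var_d1, f, b)
    Z2, v2 = depth_with_var(d2, var_d2, f, b)
    return Z2 - Z1, v1 + v2               # independent errors: variances add
```

Note how the depth variance grows as disparity shrinks (distant points), which is exactly the kind of per-point reliability information a probabilistic fusion can exploit and a deterministic pipeline discards.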