    Dynamic Switching State Systems for Visual Tracking

    This work addresses the problem of how to capture the dynamics of maneuvering objects for visual tracking. Towards this end, the perspectives of recursive Bayesian filters and of deep learning approaches for state estimation are considered, and their functional viewpoints are brought together.
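
    The abstract does not detail the formulation, but the recursive Bayesian building block it refers to can be illustrated with a toy switching-dynamics filter: two linear Kalman motion models run in parallel and their probabilities are re-weighted by the measurement likelihood. All model choices and noise values below are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch (not the authors' method): a recursive Bayesian filter that
# soft-switches between two linear motion models ("steady" vs. "maneuvering",
# the latter via inflated process noise). Models and parameters are illustrative.
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter; returns the
    posterior state, covariance, and the Gaussian measurement likelihood."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    nu = z - H @ x_pred                       # innovation
    x_post = x_pred + K @ nu
    P_post = (np.eye(len(x)) - K @ H) @ P_pred
    lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) / np.sqrt(np.linalg.det(2 * np.pi * S))
    return x_post, P_post, lik

# 1D position/velocity state, position-only measurement.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
Q_smooth = 0.01 * np.eye(2)   # "steady" motion model
Q_agile = 1.0 * np.eye(2)     # "maneuvering" model (inflated process noise)

x = np.array([0.0, 1.0]); P = np.eye(2)
weights = np.array([0.5, 0.5])                # prior model probabilities
for z in [np.array([1.1]), np.array([2.0]), np.array([4.5])]:   # a sudden jump
    posts = [kalman_step(x, P, z, F, Q, H, R) for Q in (Q_smooth, Q_agile)]
    liks = np.array([p[2] for p in posts])
    weights = weights * liks
    weights /= weights.sum()                  # posterior model probabilities
    best = int(np.argmax(weights))            # hard selection for brevity (an IMM would mix)
    x, P = posts[best][0], posts[best][1]
    print(f"z={z[0]:.1f}  model weights={np.round(weights, 2)}  state={np.round(x, 2)}")
```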

    Advanced machine learning approaches for target detection, tracking and recognition

    This dissertation addresses the key technical components of an Automatic Target Recognition (ATR) system, namely target detection, tracking, learning, and recognition. Novel solutions are proposed for each component of the ATR system based on several new advances in the fields of computer vision and machine learning. Firstly, we introduce a simple and elegant feature, RelCom, and a boosted feature selection method to achieve a target detector with very low computational complexity. Secondly, we present a particle filter based target tracking algorithm that uses a quad histogram based appearance model along with online feature selection. Further, we improve the tracking performance by means of online appearance learning, where appearance learning is cast as an adaptive Kalman filtering (AKF) problem which we formulate using both covariance matching and, for the first time in a visual tracking application, the recent autocovariance least-squares (ALS) method. Then, we introduce an integrated tracking and recognition system that uses two generative models to accommodate the pose variations and maneuverability of different ground targets. Specifically, a tensor-based generative model is used for multi-view target representation that can synthesize unseen poses and can be trained from a small set of signatures. In addition, a target-dependent kinematic model is invoked to characterize the target dynamics. Both generative models are integrated in a graphical framework for joint estimation of the target's kinematics, pose, and discrete-valued identity. Finally, for target recognition we advocate the concept of a continuous identity manifold that captures both inter-class and intra-class shape variability among training targets. A hemispherical view manifold is used for modeling the view-dependent appearance. In addition to being able to deal with arbitrary view variations, this model can determine the target identity at both class and sub-class levels, for targets not present in the training data. The proposed components of the ATR system enable us to perform low computational complexity target detection with low false alarm rates, robust tracking of targets under challenging circumstances, and recognition of target identities at both class and sub-class levels. Experiments on real and simulated data confirm the performance of the proposed components with promising results.
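
    As a rough illustration of the covariance-matching idea mentioned for the adaptive Kalman filtering (AKF) step, the sketch below re-estimates the measurement-noise covariance from a sliding window of filter innovations. The state model, window length, and noise values are assumptions for demonstration only; the ALS variant is not shown.

```python
# Minimal sketch (assumptions, not the dissertation's implementation): adapting the
# measurement-noise covariance R of a Kalman filter by covariance matching, i.e.
# matching R to the sample covariance of recent innovations minus the predicted part.
from collections import deque
import numpy as np

def covariance_matching_R(innovations, H, P_pred):
    """R_hat = sample covariance of the innovation window - H P_pred H^T."""
    nu = np.array(innovations)                      # shape (N, m)
    C_nu = (nu.T @ nu) / len(nu)                    # sample innovation covariance
    R_hat = C_nu - H @ P_pred @ H.T
    # Keep R_hat positive definite in this toy version.
    eigval, eigvec = np.linalg.eigh(R_hat)
    return eigvec @ np.diag(np.clip(eigval, 1e-6, None)) @ eigvec.T

# Scalar random-walk state observed directly, with an unknown measurement noise level.
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[1.0]])       # deliberately wrong initial R
x = np.array([0.0]); P = np.array([[1.0]])
window = deque(maxlen=30)

rng = np.random.default_rng(0)
true_x, true_R = 0.0, 0.09
for k in range(300):
    true_x += rng.normal(scale=0.1)
    z = np.array([true_x + rng.normal(scale=np.sqrt(true_R))])
    # Standard predict/update.
    x_pred = F @ x; P_pred = F @ P @ F.T + Q
    nu = z - H @ x_pred
    window.append(nu)
    if len(window) == window.maxlen:
        R = covariance_matching_R(window, H, P_pred)  # adapt R online
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ nu
    P = (np.eye(1) - K @ H) @ P_pred

print("adapted R:", np.round(R, 3), "(true measurement variance: 0.09)")
```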

    Particle filter based target tracking from X-band nautical radar images

    In this thesis, two particle filter (PF) based visual tracking approaches are designed for maneuvering target tracking from X-band nautical radar images: a PF-only based approach and a combined particle-Kalman filter (PF-KF) based approach. Unlike existing Kalman filter (KF) based target tracking algorithms used by nautical radar, these two proposed tracking methods both employ a kernel-based histogram model to represent the target in the radar image, and a Bhattacharyya coefficient based similarity distance between reference and candidate target models to provide the likelihood function for the particle filtering. However, the PF-KF method applies a sampling importance resampling (SIR) particle filter to obtain preliminary target positions, and then a Kalman filter to derive refined target positions and velocities. Moreover, several strategies are proposed to improve the tracking accuracy and stability. These strategies include an enhanced reference target model construction method, reference target model updating, and an adaptive KF for handling maneuvers. Comparison of the target information obtained by the proposed PF-KF method from various field X-band nautical radar image sequences with that measured by GPS shows that the proposed approach can provide reliable and flexible online target tracking for nautical radar applications. It is also shown that, in the scenario of strong sea clutter, the proposed approach outperforms the PF-only based approach and the classical tracking approach which combines order-statistics (OS) CFAR processing and the Kalman filter.
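
    The kernel-histogram likelihood that drives the particle weights in such trackers can be sketched as follows: each particle hypothesizes a target position, the intensity histogram of the patch under it is compared to the reference model with the Bhattacharyya coefficient, and the coefficient is mapped to a likelihood. The patch size, bandwidth parameter, and toy radar frame below are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch (illustrative assumptions, not the thesis code): weighting SIR
# particles by a Bhattacharyya-coefficient likelihood between a reference intensity
# histogram and the histogram of the image patch under each particle.
import numpy as np

def patch_histogram(image, cx, cy, half, bins=16):
    """Normalized intensity histogram of a (2*half+1)^2 patch centred at (cx, cy)."""
    x0, x1 = int(cx) - half, int(cx) + half + 1
    y0, y1 = int(cy) - half, int(cy) + half + 1
    patch = image[max(y0, 0):y1, max(x0, 0):x1]
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def bhattacharyya(p, q):
    return float(np.sum(np.sqrt(p * q)))

def particle_weights(image, particles, ref_hist, half=5, lam=20.0):
    """Likelihood ~ exp(-lam * (1 - rho)), rho = Bhattacharyya coefficient."""
    w = np.array([np.exp(-lam * (1.0 - bhattacharyya(
            patch_histogram(image, px, py, half), ref_hist)))
         for px, py in particles])
    return w / w.sum()

# Toy radar-like frame: a bright blob (target) on a speckle background.
rng = np.random.default_rng(1)
frame = rng.random((128, 128)) * 0.3
frame[60:70, 80:90] += 0.7
ref_hist = patch_histogram(frame, 85, 65, half=5)      # reference model on the target
particles = rng.uniform(10, 118, size=(200, 2))         # (x, y) position hypotheses
w = particle_weights(frame, particles, ref_hist)
estimate = (w[:, None] * particles).sum(axis=0)         # weighted-mean position
print("estimated target position (x, y):", np.round(estimate, 1))
```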

    A photogrammetric approach for real-time 3D localization and tracking of pedestrians in monocular infrared imagery

    Target tracking within conventional video imagery poses a significant challenge that is increasingly being addressed via complex algorithmic solutions. The complexity of this problem can be fundamentally attributed to the ambiguity associated with the actual 3D scene position of a given tracked object in relation to its observed position in 2D image space. We propose an approach that challenges the current trend in complex tracking solutions by addressing this fundamental ambiguity head-on. In contrast to prior work in the field, we leverage the key advantages of thermal-band infrared (IR) imagery for pedestrian localization to show that the robust localization and foreground target separation afforded by such imagery facilitate accurate 3D position estimation to within the error bounds of conventional Global Positioning System (GPS) positioning. This work investigates the accuracy of classical photogrammetry, within the context of current target detection and classification techniques, as a means of recovering the true 3D position of pedestrian targets within the scene. Based on photogrammetric estimation of target position, we then illustrate the efficiency of regular Kalman filter based tracking operating on actual 3D pedestrian scene trajectories. We present both a statistical and experimental analysis of the associated errors of this approach, in addition to real-time 3D pedestrian tracking using monocular infrared (IR) imagery from a thermal-band camera.
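
    A minimal sketch of the flat-ground photogrammetric back-projection underlying this kind of localization is given below: the image coordinates of a pedestrian's feet are projected through a pinhole camera of known height, tilt, and focal length onto the ground plane. The camera parameters and frame conventions are illustrative assumptions rather than the paper's calibration.

```python
# Minimal sketch (flat-ground pinhole assumptions, not the paper's calibration): recover
# the ground-plane position of a pedestrian from the image coordinates of their feet,
# given camera height, downward tilt, and focal length. All values are illustrative.
import numpy as np

def foot_pixel_to_ground(u, v, f, cx, cy, cam_height, tilt):
    """Back-project pixel (u, v) onto the ground plane Z = 0.

    World frame: X right, Y forward (ground range), Z up; camera at (0, 0, cam_height),
    pitched down by `tilt` radians. Returns (X, Y) in the same units as cam_height."""
    # Ray direction in camera coordinates (x right, y down, z along the optical axis).
    d_cam = np.array([(u - cx) / f, (v - cy) / f, 1.0])
    # Camera-to-world rotation for a pure downward tilt about the camera x axis.
    s, c = np.sin(tilt), np.cos(tilt)
    R_wc = np.array([[1.0, 0.0, 0.0],
                     [0.0,  -s,   c],
                     [0.0,  -c,  -s]])
    d_world = R_wc @ d_cam
    if d_world[2] >= 0:
        raise ValueError("ray does not hit the ground (pixel above the horizon)")
    scale = -cam_height / d_world[2]            # stretch the ray until it reaches Z = 0
    ground = scale * d_world
    return ground[0], ground[1]                  # lateral offset X, ground range Y

# Example: 640x480 thermal frame, ~500 px focal length, camera 6 m up, tilted 10 deg down.
X, Y = foot_pixel_to_ground(u=380, v=300, f=500.0, cx=320.0, cy=240.0,
                            cam_height=6.0, tilt=np.deg2rad(10.0))
print(f"pedestrian ground position: {X:.1f} m right, {Y:.1f} m ahead of the camera")
```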

    Joint Target Tracking and Recognition Using Shape-based Generative Model

    Recently, a generative model that combines both identity and view manifolds was proposed for multi-view shape modeling and was originally used for pose estimation and recognition of civilian vehicles from image sequences. In this thesis, we extend this model to both civilian and military vehicles, and examine its effectiveness for real-world automated target tracking and recognition (ATR) applications in both infrared and visible image sequences. A particle filter-based ATR algorithm is introduced where the generative model is used for shape interpolation along both the view and identity manifolds. The ATR algorithm is tested on the newly released SENSIAC (Military Sensing Information Analysis Center) infrared database along with some visible-band image sequences. Overall tracking and recognition performance is evaluated in terms of the accuracy of 3D position/pose estimation and target classification.
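
    The joint estimation described here can be pictured as a particle filter whose state augments 2D kinematics with a view angle and a continuous identity coordinate. The sketch below follows that structure but substitutes a simple placeholder likelihood for the shape-based generative model; all state dimensions, noise levels, and the observation model are assumptions for illustration.

```python
# Minimal sketch (illustrative assumptions, not the thesis implementation): a particle
# filter whose state couples 2D kinematics with an aspect (view) angle and a continuous
# identity coordinate on [0, 1]; a placeholder likelihood stands in for the shape-based
# generative model that would render a silhouette for each (identity, view) hypothesis.
import numpy as np

rng = np.random.default_rng(0)
N = 500
# State per particle: [x, y, vx, vy, view_angle, identity_coord]
particles = np.column_stack([
    rng.uniform(0, 100, N), rng.uniform(0, 100, N),     # position
    rng.normal(0, 1, (N, 2)),                            # velocity
    rng.uniform(0, 2 * np.pi, N),                        # angle on the view manifold
    rng.uniform(0, 1, N),                                 # coordinate on the identity manifold
])
weights = np.full(N, 1.0 / N)

def likelihood(state, observation):
    """Placeholder for the generative-model match score: a Gaussian score on position
    combined with a bonus when the identity coordinate is near a (synthetic) hint."""
    pos_err = np.hypot(state[0] - observation["x"], state[1] - observation["y"])
    id_err = abs(state[5] - observation["identity_hint"])
    return np.exp(-0.5 * (pos_err / 3.0) ** 2) * np.exp(-0.5 * (id_err / 0.2) ** 2)

def step(particles, weights, observation, dt=1.0):
    # Propagate: constant-velocity kinematics, slow random walks on view and identity.
    particles[:, 0:2] += particles[:, 2:4] * dt + rng.normal(0, 0.5, (len(particles), 2))
    particles[:, 2:4] += rng.normal(0, 0.2, (len(particles), 2))
    particles[:, 4] = (particles[:, 4] + rng.normal(0, 0.05, len(particles))) % (2 * np.pi)
    particles[:, 5] = np.clip(particles[:, 5] + rng.normal(0, 0.02, len(particles)), 0, 1)
    # Weight by the observation likelihood and resample.
    weights = np.array([likelihood(p, observation) for p in particles])
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

obs = {"x": 50.0, "y": 40.0, "identity_hint": 0.7}
for _ in range(5):
    particles, weights = step(particles, weights, obs)
mean_state = particles.mean(axis=0)
print("position estimate:", np.round(mean_state[:2], 1),
      " identity estimate:", round(mean_state[5], 2))
```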

    Vision Science and Technology at NASA: Results of a Workshop

    A broad review is given of vision science and technology within NASA. The subject is defined and its applications in both NASA and the nation at large are noted. A survey of current NASA efforts is given, noting strengths and weaknesses of the NASA program.

    Correlation Filters for Unmanned Aerial Vehicle-Based Aerial Tracking: A Review and Experimental Evaluation

    Aerial tracking, which has shown broad applicability and impressive performance, is one of the most active applications in the remote sensing field. In particular, unmanned aerial vehicle (UAV)-based remote sensing systems equipped with visual tracking approaches have been widely used in aviation, navigation, agriculture, transportation, and public security. The UAV-based aerial tracking platform has thus gradually moved from the research stage to practical application, and is set to become one of the main aerial remote sensing technologies. However, because of onerous real-world conditions, e.g., harsh external challenges, vibration of the UAV mechanical structure (especially under strong wind), maneuvering flight in complex environments, and limited onboard computational resources, accuracy, robustness, and high efficiency are all crucial for onboard tracking methods. Recently, discriminative correlation filter (DCF)-based trackers have stood out for their high computational efficiency and appealing robustness on a single CPU, and have flourished in the UAV visual tracking community. In this work, the basic framework of DCF-based trackers is first generalized, and 23 state-of-the-art DCF-based trackers are then summarized according to the innovations they introduce to solve various issues. In addition, exhaustive quantitative experiments are conducted on the prevailing UAV tracking benchmarks, i.e., UAV123, UAV123@10fps, UAV20L, UAVDT, DTB70, and VisDrone2019-SOT, which contain 371,903 frames in total. The experiments evaluate the performance, verify the feasibility, and expose the current challenges of DCF-based trackers for onboard UAV tracking.
    Comment: 28 pages, 10 figures, submitted to GRS
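
    The basic DCF framework that the review generalizes can be reduced to a single-channel ridge regression solved element-wise in the Fourier domain, as in MOSSE-style filters: a filter is trained so that correlation with the target patch produces a Gaussian peak, and the peak of the response on the next frame gives the translation. The sketch below omits cosine windowing, multi-channel features, scale estimation, and online updates; patch sizes and the regularizer are illustrative.

```python
# Minimal sketch (illustrative, single-channel, no cosine window or scale handling): the
# closed-form Fourier-domain filter underlying MOSSE/DCF trackers, trained on one patch
# with a Gaussian-shaped desired response, then applied to a shifted patch.
import numpy as np

def gaussian_response(h, w, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred on the patch."""
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((xs - w // 2) ** 2 + (ys - h // 2) ** 2) / (2 * sigma ** 2))
    return np.fft.fftshift(g)                      # peak at (0, 0) in FFT convention

def train_filter(patch, lam=1e-2):
    """Ridge-regression solution in the Fourier domain: H* = G F* / (F F* + lambda)."""
    F = np.fft.fft2(patch)
    G = np.fft.fft2(gaussian_response(*patch.shape))
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(Hconj, patch):
    """Correlation response map; its argmax gives the translation of the target."""
    return np.real(np.fft.ifft2(np.fft.fft2(patch) * Hconj))

rng = np.random.default_rng(0)
patch = rng.random((64, 64))
patch[28:36, 28:36] += 2.0                          # a bright "target" in the centre
Hconj = train_filter(patch)

shifted = np.roll(patch, shift=(5, -3), axis=(0, 1))   # target moves by (+5, -3) pixels
response = detect(Hconj, shifted)
dy, dx = np.unravel_index(np.argmax(response), response.shape)
# Map the argmax back to a signed displacement (responses wrap circularly).
dy = dy - response.shape[0] if dy > response.shape[0] // 2 else dy
dx = dx - response.shape[1] if dx > response.shape[1] // 2 else dx
print(f"estimated displacement: dy={dy}, dx={dx} (true: +5, -3)")
```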