3 research outputs found

    FISST Based Method for Multi-Target Tracking in the Image Plane of Optical Sensors

    A finite set statistics (FISST)-based method is proposed for multi-target tracking in the image plane of optical sensors. The method uses signal amplitude information in the probability hypothesis density (PHD) filter, which is derived from FISST, to improve multi-target tracking performance. The amplitude of the signals generated by the optical sensor is modeled first, and the amplitude likelihood ratio between target and clutter is derived from this model. An alternative approach is adopted for situations where the signal-to-noise ratio (SNR) of the target is unknown. The PHD recursion equations incorporating the signal information are then derived, and a Gaussian mixture (GM) implementation of the filter is given. Simulation results demonstrate that the proposed method achieves significantly better performance than the generic PHD filter. Moreover, the method has much lower computational complexity in scenarios with high SNR and dense clutter.
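    To make the role of the amplitude information concrete, the snippet below is a minimal sketch assuming Rayleigh-distributed clutter amplitude and a Swerling-1 fluctuating target, a common choice in amplitude-informed PHD filtering; the paper's exact amplitude model and recursion equations are not reproduced here.

```python
# A minimal sketch (not the authors' code) of an amplitude likelihood ratio
# that can reweight measurements in a GM-PHD update. Assumes unit-variance
# Rayleigh clutter and a Swerling-1 target with expected linear SNR `snr`.
import numpy as np

def amplitude_likelihood_ratio(a, snr):
    """Ratio p(a | target, snr) / p(a | clutter).

    Clutter amplitude: p0(a) = a * exp(-a^2 / 2)
    Target amplitude:  p1(a) = a / (1 + snr) * exp(-a^2 / (2 * (1 + snr)))
    The ratio simplifies to exp(a^2 * snr / (2 * (1 + snr))) / (1 + snr).
    """
    return np.exp(a**2 * snr / (2.0 * (1.0 + snr))) / (1.0 + snr)

# Example: a strong return (a = 3) at 10 dB SNR is far more likely to be
# target-originated than clutter, so its update weight is boosted.
snr = 10.0 ** (10.0 / 10.0)  # 10 dB expressed as a linear ratio
print(amplitude_likelihood_ratio(3.0, snr))
```

    Measurements with large amplitude likelihood ratios receive proportionally larger weights in the GM-PHD update, which is what suppresses clutter-originated components when the SNR is high.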

    Appearance modeling for persistent object tracking in wide-area and full motion video

    Object tracking is a core element of computer vision and autonomous systems. As such, single- and multiple-object tracking have been widely investigated, especially for full motion video sequences. The acquisition of wide-area motion imagery (WAMI) from moving airborne platforms is a much more recent sensor innovation with an array of defense and civilian applications, offering a unique combination of dense spatial and temporal coverage unmatched by other sensor systems. Airborne WAMI presents a host of challenges for object tracking, including large data volumes, multi-camera arrays, image stabilization, low-resolution targets, target appearance variability, and high background clutter, especially in urban environments. Time-varying, low-frame-rate, large-format imagery poses a range of difficulties for reliable long-term multi-target tracking.

    The focus of this thesis is the Likelihood of Features Tracking (LOFT) testbed system, an appearance-based (single-instance) object tracker designed specifically for WAMI that follows the track-before-detect paradigm. The motivation for tracking using dynamics before detecting is that large-scale data can be handled while computational cost is kept to a bare minimum. Searching for an object everywhere in a large frame is not practical: urban scenes contain many similar objects, clutter, and high-rise structures, and an exhaustive search carries a greatly increased computational cost. LOFT bypasses this difficulty by using filtering and dynamics to constrain the search to a more realistic region within the large frame, and it uses multiple features to discern objects of interest. The objects of interest are provided to the algorithm as input in the form of bounding boxes.

    The main goal of this work is to present an appearance update modeling strategy that fits LOFT's track-before-detect paradigm and to evaluate the accuracy of the overall system against other state-of-the-art tracking algorithms, both with and without this strategy. The update strategy, which uses various information cues from the Radon transform, was designed with specific performance parameters in mind: minimal increase in computational cost and a considerable increase in the precision and recall rates of the overall system. This is demonstrated with supporting performance numbers using standard evaluation techniques from the literature. The extension of the LOFT WAMI tracker to include a more detailed appearance model, with an update strategy well suited for persistent target tracking, is novel in the opinion of the author.

    Key engineering contributions have been made through this work: the core LOFT system has been evaluated as part of several government research and development programs, including the Air Force Research Lab's Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) Enterprise to the Edge (CETE), the Army Research Lab's Advanced Video Activity Analytics (AVAA), and a proposed fine-grained distributed computing architecture on the cloud for processing at the edge. A simplified version of LOFT was developed for tracking objects in standard videos and entered in the Visual Object Tracking (VOT) Challenge competition held in conjunction with the leading computer vision conferences. LOFT incorporating the proposed appearance adaptation module produces significantly better tracking results in aerial WAMI of urban scenes.
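    The search-constraint idea described above can be illustrated with a short sketch: a generic constant-velocity prediction gate with hypothetical numbers, not the LOFT implementation itself.

```python
# A minimal sketch (assumptions, not LOFT code) of constraining appearance
# search with dynamics: predict the target under a constant-velocity model
# and evaluate features only in a small window around the prediction,
# instead of searching the full WAMI frame.
import numpy as np

def predict_search_window(x, P, F, Q, gate=3.0):
    """One Kalman prediction step; returns the predicted state, covariance,
    and a +/- gate*sigma search window in pixel coordinates."""
    x = F @ x                    # predicted state [px, py, vx, vy]
    P = F @ P @ F.T + Q          # predicted covariance
    sx, sy = np.sqrt(P[0, 0]), np.sqrt(P[1, 1])
    return x, P, (x[0] - gate * sx, x[0] + gate * sx,
                  x[1] - gate * sy, x[1] + gate * sy)

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
Q = 0.1 * np.eye(4)
x = np.array([512.0, 384.0, 4.0, -2.0])  # hypothetical pixel-space state
P = 5.0 * np.eye(4)
x, P, window = predict_search_window(x, P, F, Q)
print(window)  # appearance features are evaluated only inside this box
```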

    Distributed Random Set Theoretic Soft/Hard Data Fusion

    Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources into a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-generated data expressed preferably in unconstrained natural language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still in a fledgling stage, with even its challenging problems yet to be adequately defined and explored.

    This dissertation develops a framework to enable fusion of both soft and hard data with Random Set (RS) theory as the underlying mathematical foundation. Random set theory is an emerging theory within the data fusion community that, due to its powerful representational and computational capabilities, is gaining more and more attention among data fusion researchers. Motivated by the unique characteristics of random set theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying framework capable of processing both unconventional soft data and conventional hard data, this dissertation argues in favor of a random set theoretic approach as the first step towards realizing a soft/hard data fusion framework.

    Several challenging problems related to soft/hard fusion systems are addressed in the proposed framework. First, an extension of the well-known Kalman filter within random set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing framework for both soft and hard data. Second, a novel ontology (syntax + semantics) is developed to allow for modeling soft (human-generated) data, assuming target tracking as the application. Third, as soft/hard data fusion is mostly aimed at large networks of information processing, a new approach is proposed to enable distributed estimation of soft as well as hard data, addressing the scalability requirement of such fusion systems. Fourth, a method for modeling trust in human agents is developed, which enables the fusion system to protect itself from erroneous or misleading soft data by discounting such data on the fly. Fifth, leveraging recent developments in the RS-theoretic data fusion literature, a novel soft data association algorithm is developed and deployed to extend the proposed target tracking framework to the multi-target case. Finally, the multi-target tracking framework is complemented by introducing a distributed classification approach applicable to target classes described with soft human-generated data.

    In addition, this dissertation presents a novel data-centric taxonomy of data fusion methodologies. In particular, several categories of fusion algorithms are identified and discussed based on the data-related challenges they address. The taxonomy is intended to provide the reader with a generic and comprehensive view of the contemporary data fusion literature, and could also serve as a reference for data fusion practitioners by providing conducive design guidelines, in terms of algorithm choice, for the specific data-related challenges expected in a given application.
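    One standard way to realize the trust-based discounting described above is Shafer's classical discounting of a belief mass function; the sketch below illustrates that operation on a hypothetical soft report, and the dissertation's exact scheme may differ.

```python
# A minimal sketch of Shafer's discounting operation: mass from each focal
# set is scaled by a reliability (trust) factor alpha, with the removed
# mass transferred to the frame of discernment (full ignorance).

def discount(masses, frame, alpha):
    """Discount a belief mass function by reliability alpha in [0, 1].

    masses: dict mapping focal sets (frozensets) to mass values.
    frame:  frozenset, the frame of discernment.
    """
    out = {A: alpha * m for A, m in masses.items() if A != frame}
    out[frame] = 1.0 - alpha + alpha * masses.get(frame, 0.0)
    return out

# Example (hypothetical): a human report "target is a truck" with mass 0.9,
# from a source trusted at alpha = 0.6, is softened toward full ignorance.
frame = frozenset({"truck", "car", "bus"})
report = {frozenset({"truck"}): 0.9, frame: 0.1}
print(discount(report, frame, alpha=0.6))
# truck: 0.54, frame: 0.46 (masses still sum to 1)
```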