
    Multisensor data fusion for joint people tracking and identification with a service robot

    Tracking and recognizing people are essential skills that modern service robots must be provided with. The two tasks are generally performed independently, using ad-hoc solutions that first estimate the location of humans and then proceed with their identification. The solution presented in this paper, instead, is a general framework for tracking and recognizing people simultaneously with a mobile robot, where the estimates of the human location and identity are fused using probabilistic techniques. Our approach takes inspiration from recent implementations of joint tracking and classification, where the targets considered are mainly vehicles and aircraft in military and civilian applications. We illustrate how people can be robustly tracked and recognized with a service robot using improved histogram-based detection and multisensor data fusion. Experiments in challenging real-world scenarios show the good performance of our solution.
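    The core of joint tracking and identification is a Bayesian update of a discrete identity posterior alongside the position estimate. A minimal sketch of that identity-fusion step (the function name, the two-person setup, and the likelihood values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def update_identity_belief(prior, likelihoods):
    """Bayes update of the identity posterior given per-identity
    appearance likelihoods (e.g. from a colour-histogram match).
    Illustrative sketch only; not the paper's full framework."""
    posterior = prior * likelihoods        # elementwise Bayes rule
    return posterior / posterior.sum()     # normalize to a distribution

prior = np.array([0.5, 0.5])               # two known persons, uniform prior
likelihoods = np.array([0.8, 0.2])         # histogram match favours person 0
post = update_identity_belief(prior, likelihoods)
```

    Repeating this update at every tracker step lets identity evidence accumulate while the location filter runs in parallel.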

    Extended Object Tracking: Introduction, Overview and Applications

    This article provides a detailed overview of current research in extended object tracking. We provide a clear definition of the extended object tracking problem and discuss how it is delimited from other types of object tracking. Next, different aspects of extended object modelling are extensively discussed. Subsequently, we give a tutorial introduction to two basic and widely used extended object tracking approaches: the random matrix approach and the Kalman filter-based approach for star-convex shapes. The next part treats the tracking of multiple extended objects and elaborates on how the large number of feasible association hypotheses can be tackled using both Random Finite Set (RFS) and non-RFS multi-object trackers. The article concludes with a summary of current applications, in which four example applications involving camera, X-band radar, light detection and ranging (lidar), and red-green-blue-depth (RGB-D) sensors are highlighted. (30 pages, 19 figures)
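    What distinguishes extended object tracking is that each object generates several detections per scan, so both a centroid and an extent (shape) matrix are estimated. A minimal sketch of the extent-estimation idea underlying the random matrix approach, using the sample scatter of a detection batch (the function and the example points are assumptions for illustration, not the survey's formulation):

```python
import numpy as np

def extent_estimate(Z):
    """Estimate centroid and a 2x2 extent matrix of an extended object
    from a batch of detections Z (n x 2) in a single scan.
    The sample scatter serves as a proxy for the extent."""
    z_bar = Z.mean(axis=0)                      # centroid estimate
    d = Z - z_bar
    X = d.T @ d / max(len(Z) - 1, 1)            # scatter = extent proxy
    return z_bar, X

# lidar-like returns spread along the x-axis -> elongated extent
Z = np.array([[0.0, 0.0], [2.0, 0.1], [4.0, -0.1], [6.0, 0.0]])
c, X = extent_estimate(Z)
```

    In the full random matrix approach this scatter is fused over time with a prior extent via an inverse-Wishart model rather than used raw.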

    Image fusion techniques for remote sensing applications

    Image fusion refers to the acquisition, processing and synergistic combination of information provided by various sensors, or by the same sensor in many measuring contexts. The aim of this survey paper is to describe three typical applications of data fusion in remote sensing. The first case study considers the problem of Synthetic Aperture Radar (SAR) interferometry, where a pair of antennas is used to obtain an elevation map of the observed scene; the second refers to the fusion of multisensor and multitemporal (Landsat Thematic Mapper and SAR) images of the same site acquired at different times, using neural networks; the third presents a processor that fuses multifrequency, multipolarization and multiresolution SAR images, based on the wavelet transform and a multiscale Kalman filter. Each case study also presents results achieved by applying the proposed techniques to real data.
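    A common wavelet-domain fusion rule, in the spirit of the third case study, averages the approximation coefficients of two inputs and keeps the larger-magnitude detail coefficient at each position. A self-contained 1-D Haar sketch (the Haar choice, one decomposition level, and the max-magnitude rule are illustrative assumptions; the paper uses a multiscale Kalman filter on SAR imagery):

```python
import numpy as np

def haar_1level(x):
    """One-level Haar decomposition: approximation a, detail d."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_inv(a, d):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def wavelet_fuse(x, y):
    """Average the approximations, keep the larger-magnitude detail
    coefficient at each position, then reconstruct."""
    ax, dx = haar_1level(x)
    ay, dy = haar_1level(y)
    a = (ax + ay) / 2
    d = np.where(np.abs(dx) >= np.abs(dy), dx, dy)
    return haar_inv(a, d)

x = np.arange(8.0)
fused = wavelet_fuse(x, x)   # fusing a signal with itself recovers it
```

    The same rule extends to 2-D images by applying the transform along rows and columns.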

    ATC Trajectory Reconstruction for Automated Evaluation of Sensor and Tracker Performance

    Currently, most air traffic controller decisions are based on the information provided by the ground support tools of automation systems, which rely on a network of surveillance sensors and the associated tracker. To guarantee surveillance integrity, performance assessments of the different elements of the surveillance system are clearly necessary. The surveillance processing chain has recently grown in complexity with the integration of new sensor types (e.g., automatic dependent surveillance-broadcast [ADS-B], Mode S radars, and wide area multilateration [WAM]), data link applications, and networking technologies. With new sensors, there is a need for system-level performance evaluations as well as methods for assessing each component of the tracking chain. This work was funded by EUROCONTROL's TRES contract, by the Spanish Ministry of Economy and Competitiveness under grants CICYT TEC2008-06732/TEC and CYCIT TEC2011-28626, and by the Government of Madrid under grant S2009/TIC-1485 (CONTEXTS).

    Multisensor-based human detection and tracking for mobile service robots

    One of the fundamental issues for service robots is human-robot interaction. In order to perform such tasks and provide the desired services, these robots need to detect and track people in their surroundings. In the present paper, we propose a solution for human tracking with a mobile robot that implements multisensor data fusion techniques. The system utilizes a new algorithm for laser-based leg detection using the on-board laser range finder (LRF). The approach is based on the recognition of typical leg patterns extracted from laser scans, which are shown to be very discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even while the robot moves. Furthermore, faces are detected using the robot's camera, and this information is fused with the leg positions using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar device configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed within complex indoor environments.
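    The sequential fusion step processes each sensor's measurement one at a time, each update shrinking the uncertainty. A minimal linear-Kalman sketch of that step (the paper uses an Unscented KF and its own sensor models; the function name, measurement values, and noise covariances here are illustrative assumptions):

```python
import numpy as np

def sequential_update(x, P, measurements):
    """Fuse several position measurements one after another
    (sequential Kalman measurement update). Linear sketch of the
    sequential-fusion idea only, not the paper's UKF."""
    H = np.eye(2)                      # both sensors observe (x, y) directly
    for z, R in measurements:
        S = H @ P @ H.T + R            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
        x = x + K @ (z - H @ x)        # state update
        P = (np.eye(2) - K @ H) @ P    # covariance update
    return x, P

x0, P0 = np.zeros(2), np.eye(2) * 10.0
legs = (np.array([1.0, 2.0]), np.eye(2) * 0.1)   # precise laser leg detection
face = (np.array([1.2, 2.1]), np.eye(2) * 0.5)   # noisier camera face detection
x, P = sequential_update(x0, P0, [legs, face])
```

    The fused estimate lands between the two measurements, pulled toward the more precise laser detection, with lower uncertainty than either sensor alone.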

    Multisensor data fusion via Gaussian process models for dimensional and geometric verification

    An increasing number of commercial measurement instruments, implementing a wide range of measurement technologies, is rapidly becoming available for dimensional and geometric verification. Multiple solutions are often acquired within the shop floor to cover a wider array of measurement needs, thus overcoming the limitations of individual instruments and technologies. In such scenarios, multisensor data fusion goes one step further by seeking original ways to analyze and combine multiple measurement datasets taken from the same measurand, in order to produce synergistic effects and ultimately obtain better overall measurement results. In this work an original approach to multisensor data fusion is presented, based on the development of Gaussian process models (a technique also known as kriging), starting from point sets acquired with multiple instruments. The approach is illustrated and validated through application to a simulated test case and to two real-life industrial metrology scenarios involving structured light scanners and coordinate measuring machines. The results show that the proposed approach not only yields final measurement results whose metrological quality transcends that of the original single-sensor datasets, but also allows better characterization of the metrological performance and potential sources of measurement error within each individual sensor.
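    The kriging-based fusion idea can be sketched by fitting one Gaussian process to the pooled point sets, with each sensor contributing its own noise level, so the precise instrument dominates where both measured. A minimal 1-D sketch (the kernel choice, length scale, noise values, and all data below are illustrative assumptions, not the paper's model):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential (RBF) kernel between 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

def gp_fuse(x1, y1, s1, x2, y2, s2, xq):
    """GP (kriging) posterior mean at query points xq, fusing two
    sensors' point sets with per-sensor noise levels s1, s2.
    Minimal sketch of the data-fusion idea only."""
    x = np.concatenate([x1, x2])
    y = np.concatenate([y1, y2])
    noise = np.concatenate([np.full(len(x1), s1**2),
                            np.full(len(x2), s2**2)])
    K = rbf(x, x) + np.diag(noise)     # heteroscedastic noise on diagonal
    Ks = rbf(xq, x)
    return Ks @ np.linalg.solve(K, y)  # posterior mean

# both sensors sample a flat surface of height 1.0
x1, y1 = np.array([0.0, 1.0, 2.0]), np.ones(3)   # precise scanner
x2, y2 = np.array([0.5, 1.5]), np.ones(2)        # noisier scanner
m = gp_fuse(x1, y1, 0.1, x2, y2, 0.5, np.array([1.0]))
```

    The same construction extends to surfaces by using 2-D or 3-D inputs in the kernel; the per-sensor noise terms are what let the fused model weight each instrument by its metrological quality.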