
    Multisensor Poisson Multi-Bernoulli Filter for Joint Target-Sensor State Tracking

    In a typical multitarget tracking (MTT) scenario, the sensor state is either assumed known, or tracking is performed in the sensor's (relative) coordinate frame. This assumption does not hold when the sensor, e.g., an automotive radar, is mounted on a vehicle and the target state should be represented in a global (absolute) coordinate frame. In that case, the uncertain location of the vehicle on which the sensor is mounted must be accounted for in MTT. In this paper, we present a low-complexity multisensor Poisson multi-Bernoulli MTT filter, which jointly tracks the uncertain vehicle state and the target states. Measurements collected by different sensors mounted on multiple vehicles with varying location uncertainty are incorporated sequentially as new sensor measurements arrive. In doing so, targets observed from a sensor mounted on a well-localized vehicle reduce the state uncertainty of other, poorly localized vehicles, provided that a common non-empty subset of targets is observed. A low-complexity filter is obtained by approximating the joint sensor-feature state density so as to minimize the Kullback-Leibler divergence (KLD). Results from synthetic as well as experimental measurement data, collected in a vehicle driving scenario, demonstrate the performance benefits of joint vehicle-target state tracking.
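
    The KLD-minimizing approximation of the joint density mentioned in the abstract can be illustrated, in its simplest form, by Gaussian moment matching: the single Gaussian that minimizes the KLD from a Gaussian mixture matches the mixture's mean and covariance. A minimal Python sketch (illustrative only, not the authors' implementation):

        import numpy as np

        def moment_match(weights, means, covs):
            # Collapse a Gaussian mixture into the single Gaussian that
            # minimizes KLD(mixture || Gaussian): match mean and covariance.
            # weights: array (K,); means: array (K, d); covs: array (K, d, d)
            w = np.asarray(weights) / np.sum(weights)
            mu = np.einsum("k,kd->d", w, means)
            diff = means - mu                                   # (K, d)
            cov = (np.einsum("k,kij->ij", w, covs)
                   + np.einsum("k,ki,kj->ij", w, diff, diff))   # spread-of-means term
            return mu, cov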

    Marker-Less Stage Drift Correction in Super-Resolution Microscopy Using the Single-Cluster PHD Filter

    Fluorescence microscopy is a technique which allows the imaging of cellular and intracellular dynamics through the activation of fluorescent molecules attached to them. It is a very important technique because it can be used to analyze the behavior of intracellular processes in vivo, in contrast to methods like electron microscopy. There are several challenges related to the extraction of meaningful information from images acquired with optical microscopes, due to the low contrast between objects and background and the fact that point-like objects are observed as blurred spots owing to the diffraction limit of the optical system. Another consideration is that, for the study of intracellular dynamics, multiple particles must be tracked at the same time, which is a challenging task due to problems such as false positives and missed detections in the acquired data. Additionally, the objective of the microscope is not completely static with respect to the cover slip, owing to mechanical vibrations or thermal expansion, which introduces bias into the measurements. In this paper, a Bayesian approach is used to simultaneously track the locations of objects with different motion behaviors and the stage drift, using image data obtained from fluorescence microscopy experiments. Namely, detections are extracted from the acquired frames using image processing techniques, and these detections are then used to accurately estimate the particle positions and simultaneously correct the drift introduced by the motion of the sample stage. A single-cluster Probability Hypothesis Density (PHD) filter with object classification is used for the estimation of the multi-target state, assuming different motion behaviors. The detection and tracking methods are tested and their performance is evaluated on both simulated and real data.
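
    The single-cluster PHD filter augments the multi-target state with a shared drift (cluster) state; underneath it sits a standard Gaussian-mixture PHD measurement update. The following is a minimal sketch of the generic GM-PHD update (in the usual Vo-Ma form), not the single-cluster variant used in the paper; the names and the linear-Gaussian measurement model are assumptions:

        import numpy as np
        from scipy.stats import multivariate_normal

        def gm_phd_update(weights, means, covs, detections, H, R, p_d, clutter):
            # One GM-PHD measurement update. weights/means/covs describe the
            # predicted intensity; z = H x + v with v ~ N(0, R); p_d is the
            # detection probability; clutter is the clutter intensity.
            new_w = [w * (1.0 - p_d) for w in weights]   # missed-detection terms
            new_m, new_P = list(means), list(covs)
            for z in detections:
                wz, mz, Pz = [], [], []
                for w, m, P in zip(weights, means, covs):
                    S = H @ P @ H.T + R                  # innovation covariance
                    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
                    wz.append(p_d * w * multivariate_normal.pdf(z, H @ m, S))
                    mz.append(m + K @ (z - H @ m))
                    Pz.append((np.eye(len(m)) - K @ H) @ P)
                norm = clutter + sum(wz)                 # per-measurement normalizer
                new_w += [w / norm for w in wz]
                new_m += mz
                new_P += Pz
            return new_w, new_m, new_P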

    Joint Registration and Fusion of an Infra-Red Camera and Scanning Radar in a Maritime Context

    The number of nodes in sensor networks is continually increasing, and maintaining accurate track estimates inside their common surveillance region is a critical requirement. Modern sensor platforms are likely to carry a range of different sensor modalities, all providing data at differing rates and with varying degrees of uncertainty. These factors complicate the fusion problem, as multiple observation models are required, along with a dynamic prediction model. However, the problem is exacerbated when sensors are not registered correctly with respect to each other, i.e., when they are subject to a static or dynamic bias. In this case, measurements from different sensors may correspond to the same target yet fail to align when placed in the same Frame of Reference (FoR), which degrades track accuracy. This paper presents a method to jointly estimate the state of multiple targets in a surveillance region and to correctly register a radar and an Infrared Search and Track (IRST) system onto the same FoR to perform sensor fusion. Previous work using this type of parent-offspring process has been successful when calibrating a pair of cameras, but it has never been attempted on a heterogeneous sensor network, nor in a maritime environment. This article presents results on both simulated scenarios and a segment of real data that show a significant increase in track quality compared to using incorrectly calibrated sensors or a single radar alone.
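
    As a concrete (if much simplified) picture of what registration corrects, consider a static bias between two sensor frames: detections from the biased sensor must be rotated and translated before fusion in the common FoR. In the joint approach described above, these bias parameters are estimated alongside the target states rather than fixed in advance. A minimal sketch with illustrative names, not the paper's method:

        import numpy as np

        def register_to_common_for(points, theta, offset):
            # Undo an estimated static bias: rotate 2-D Cartesian detections
            # by theta (radians) and translate by offset, mapping them from
            # the biased sensor frame into the common Frame of Reference.
            c, s = np.cos(theta), np.sin(theta)
            rot = np.array([[c, -s], [s, c]])
            return points @ rot.T + offset               # points: (N, 2)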

    A New Wave in Robotics: Survey on Recent mmWave Radar Applications in Robotics

    We survey the current state of millimeter-wave (mmWave) radar applications in robotics, with a focus on unique capabilities, and discuss future opportunities based on the state of the art. Frequency Modulated Continuous Wave (FMCW) mmWave radars operating in the 76–81 GHz range are an appealing alternative to lidars, cameras, and other sensors operating in the near-visual spectrum. Radar has become more widely available in packaging classes that are more convenient for robotics, and its longer wavelengths can penetrate visual clutter such as fog, dust, and smoke. We begin by covering radar principles as they relate to robotics. We then review the relevant new research across a broad spectrum of robotics applications, beginning with motion estimation, localization, and mapping. We then cover object detection and classification, and close with an analysis of current datasets and calibration techniques that provide entry points into radar research.
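
    To make the FMCW principle concrete: a chirp reflected by a target at range R returns delayed, and mixing it with the transmitted chirp yields a beat frequency proportional to range, f_b = 2 R S / c, where S = B / T_c is the chirp slope. A small worked example with assumed, illustrative chirp parameters:

        c = 3e8               # speed of light, m/s
        B = 4e9               # chirp bandwidth, Hz (the 76-81 GHz band allows several GHz)
        T_c = 40e-6           # chirp duration, s
        S = B / T_c           # chirp slope, Hz/s
        f_b = 2 * 50.0 * S / c           # beat frequency for a target at 50 m
        R = f_b * c / (2 * S)            # inverting recovers the range: 50.0 m
        print(f"beat frequency {f_b / 1e6:.1f} MHz -> range {R:.1f} m")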

    Data fusion for unsupervised video object detection, tracking and geo-positioning

    In this work we describe a system and propose a novel algorithm for moving object detection and tracking based on a video feed. Unlike many well-known algorithms, it performs detection in an unsupervised manner, using a velocity criterion to detect objects. The algorithm utilises data from a single camera and Inertial Measurement Unit (IMU) sensors, and performs fusion of the video and sensory data captured from the UAV. The algorithm includes object detection and tracking, augmented by estimation of the objects' geographical coordinates. It can be generalised to any particular video sensor and is not restricted to any specific application. For object tracking, a Bayesian filtering scheme combined with approximate inference is utilised. Object localisation in real-world coordinates is based on the tracking results and the IMU sensor measurements.
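
    One generic way to realise the geo-positioning step (the abstract does not give the exact equations, so the flat-ground assumption and all names below are ours): back-project the tracked pixel through the camera intrinsics, rotate the resulting viewing ray into the world frame using the IMU attitude, and intersect it with the ground plane:

        import numpy as np

        def geolocate(pixel, K, R_wc, cam_pos, ground_z=0.0):
            # pixel: (u, v) image detection; K: 3x3 camera intrinsics;
            # R_wc: camera-to-world rotation from the IMU attitude;
            # cam_pos: camera position in world coordinates.
            ray_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
            ray_w = R_wc @ ray_cam                       # viewing ray, world frame
            t = (ground_z - cam_pos[2]) / ray_w[2]       # scale to hit z = ground_z
            return cam_pos + t * ray_w                   # object position, world frame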