
    A System Implementation and Evaluation of a Cooperative Fusion and Tracking Algorithm based on a Gaussian Mixture PHD Filter

    This paper focuses on a real system implementation, analysis, and evaluation of a cooperative sensor fusion algorithm based on a Gaussian Mixture Probability Hypothesis Density (GM-PHD) filter, using simulated and real vehicles equipped with automotive-grade sensors. We have extended our previously presented cooperative sensor fusion algorithm with a fusion weight optimization method and implemented it on a vehicle that we denote as the ego vehicle. The algorithm fuses information obtained from one or more vehicles located within a certain range (which we call cooperative vehicles), each running a multi-object tracking PHD filter and sharing its object estimates. The algorithm is evaluated on two Citroen C-ZERO prototype vehicles equipped with Mobileye cameras for object tracking and with lidar sensors from which the ground-truth positions of the tracked objects are extracted. Moreover, the algorithm is evaluated in simulation using simulated C-ZERO vehicles and simulated Mobileye cameras; in this case, the ground-truth positions of the tracked objects are provided by the simulator. Multiple experimental runs are conducted in both simulated and real-world conditions, in which a few legacy vehicles are tracked. Results show that the cooperative fusion algorithm extends the sensing field of view while keeping the tracking accuracy and errors similar to the case in which the vehicles act alone.
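    The fusion step described above can be illustrated with a small sketch: combining a local GM-PHD intensity with the Gaussian components shared by a cooperative vehicle, using a single fusion weight in place of the paper's optimized weights. This is a minimal illustration, not the authors' implementation; the fusion weight, pruning threshold, and merging distance are assumptions.

```python
import numpy as np

def fuse_gm_phd(local, remote, omega=0.5, prune_thresh=1e-3, merge_dist=4.0):
    """Fuse two GM-PHD intensities given as lists of (weight, mean, cov) tuples.

    `local` and `remote` must already be expressed in a common coordinate
    frame; `omega` is a single fusion weight standing in for the paper's
    optimized weights (illustrative assumption).
    """
    # Scale component weights: keep local components with (1 - omega),
    # remote components with omega.
    fused = [((1.0 - omega) * w, m, P) for (w, m, P) in local]
    fused += [(omega * w, m, P) for (w, m, P) in remote]

    # Prune components with negligible weight.
    fused = [c for c in fused if c[0] > prune_thresh]

    # Greedily merge components that are close in Mahalanobis distance.
    merged, remaining = [], list(range(len(fused)))
    while remaining:
        j = max(remaining, key=lambda i: fused[i][0])      # strongest component
        _, m0, P0 = fused[j]
        group = [i for i in remaining
                 if (fused[i][1] - m0) @ np.linalg.solve(P0, fused[i][1] - m0) < merge_dist]
        remaining = [i for i in remaining if i not in group]
        w = sum(fused[i][0] for i in group)
        m = sum(fused[i][0] * fused[i][1] for i in group) / w
        P = sum(fused[i][0] * (fused[i][2] + np.outer(fused[i][1] - m, fused[i][1] - m))
                for i in group) / w
        merged.append((w, m, P))
    return merged
```

    In a real pipeline the ego vehicle would transform the received components into its own frame before calling a routine like this, and then carry the merged mixture into its next prediction step.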

    Multisensor Poisson Multi-Bernoulli Filter for Joint Target-Sensor State Tracking

    In a typical multitarget tracking (MTT) scenario, the sensor state is either assumed known, or tracking is performed in the sensor's (relative) coordinate frame. This assumption does not hold when the sensor, e.g., an automotive radar, is mounted on a vehicle and the target state should be represented in a global (absolute) coordinate frame. In that case, the uncertain location of the vehicle on which the sensor is mounted must be accounted for in MTT. In this paper, we present a low-complexity multisensor Poisson multi-Bernoulli MTT filter, which jointly tracks the uncertain vehicle state and the target states. Measurements collected by different sensors mounted on multiple vehicles with varying location uncertainty are incorporated sequentially, based on the arrival of new sensor measurements. In doing so, targets observed by a sensor mounted on a well-localized vehicle reduce the state uncertainty of other, poorly localized vehicles, provided that a common non-empty subset of targets is observed. A low-complexity filter is obtained by approximations of the joint sensor-feature state density that minimize the Kullback-Leibler divergence (KLD). Results from synthetic as well as experimental measurement data, collected in a vehicle driving scenario, demonstrate the performance benefits of joint vehicle-target state tracking.
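    The benefit of joint vehicle-target tracking mentioned above can already be seen in a single linear Kalman update on a stacked vehicle-and-target state: a relative measurement of a well-known target pulls down the vehicle's own position uncertainty. The sketch below is my illustration of that coupling, not the paper's Poisson multi-Bernoulli filter, and all matrices and noise levels are assumed values.

```python
import numpy as np

# Joint state: [sx, sy, tx, ty] = sensor (vehicle) position and target position.
x = np.array([0.0, 0.0, 10.0, 5.0])
P = np.diag([4.0, 4.0, 1.0, 1.0])      # vehicle poorly localized, target well known

# Relative position measurement z = (target - sensor) + noise.
H = np.array([[-1.0,  0.0, 1.0, 0.0],
              [ 0.0, -1.0, 0.0, 1.0]])
R = 0.1 * np.eye(2)
z = np.array([9.8, 5.1])

# Standard Kalman update on the joint density.
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)
x = x + K @ (z - H @ x)
P = (np.eye(4) - K @ H) @ P

print(np.diag(P))   # vehicle-position variances drop well below their prior of 4.0
```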

    5G mmWave Cooperative Positioning and Mapping Using Multi-Model PHD Filter and Map Fusion

    5G millimeter wave (mmWave) signals can enable accurate positioning in vehicular networks when the base station and vehicles are equipped with large antenna arrays. However, radio-based positioning suffers from multipath signals generated by different types of objects in the physical environment. Multipath can be turned into a benefit by building up a radio map (comprising the number of objects, the object types, and the object states) and using this map to exploit all available signal paths for positioning. We propose a new method for cooperative vehicle positioning and mapping of the radio environment, comprising a multiple-model probability hypothesis density filter and a map fusion routine, which is able to consider different types of objects and different fields of view. Simulation results demonstrate the performance of the proposed method.
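    As a rough illustration of the map fusion step, the sketch below fuses per-object-type intensity maps from two vehicles with different fields of view: cells seen by both are averaged, while cells seen by only one keep that vehicle's value. This is not the paper's fusion routine; the grid representation, the field-of-view masks, and the simple averaging rule are assumptions.

```python
import numpy as np

def fuse_maps(map_a, fov_a, map_b, fov_b):
    """map_*: dict {object_type: HxW intensity grid}; fov_*: HxW boolean masks."""
    fused = {}
    for obj_type in set(map_a) | set(map_b):
        a = map_a.get(obj_type, np.zeros(fov_a.shape))
        b = map_b.get(obj_type, np.zeros(fov_b.shape))
        both = fov_a & fov_b
        fused[obj_type] = np.where(both, 0.5 * (a + b),    # average where views overlap
                                   np.where(fov_a, a, b))  # otherwise trust whoever sees it
    return fused

# Illustrative use with two partially overlapping fields of view.
grid = (40, 40)
fov_a = np.zeros(grid, dtype=bool); fov_a[:, :25] = True   # ego vehicle's view
fov_b = np.zeros(grid, dtype=bool); fov_b[:, 15:] = True   # cooperating vehicle's view
fused = fuse_maps({"reflector": np.random.rand(*grid) * fov_a}, fov_a,
                  {"reflector": np.random.rand(*grid) * fov_b}, fov_b)
```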

    Multi-sensor data fusion techniques for RPAS detect, track and avoid

    Accurate and robust tracking of objects is of growing interest in the computer vision scientific community. The ability of a multi-sensor system to detect and track objects, and to accurately predict their future trajectory, is critical in the context of mission- and safety-critical applications. Remotely Piloted Aircraft Systems (RPAS) are currently not equipped to routinely access all classes of airspace, since certified Detect-and-Avoid (DAA) systems are yet to be developed. Such capabilities can be achieved by incorporating both cooperative and non-cooperative DAA functions, as well as by providing enhanced communications, navigation and surveillance (CNS) services. DAA is highly dependent on the performance of CNS systems for Detect, Track and Avoid (DTA) tasks and maneuvers. In order to perform an effective detection of objects, a number of high-performance, reliable and accurate avionics sensors and systems are adopted, including non-cooperative sensors (visual and thermal cameras, laser radar (LIDAR) and acoustic sensors) and cooperative systems (Automatic Dependent Surveillance-Broadcast (ADS-B) and Traffic Collision Avoidance System (TCAS)). In this paper, these candidate sensors and systems are exploited in a Multi-Sensor Data Fusion (MSDF) architecture. An Unscented Kalman Filter (UKF) and a more advanced Particle Filter (PF) are adopted to estimate the state vector of the objects in both maneuvering and non-maneuvering DTA tasks. Furthermore, an artificial neural network is conceptualised to exploit statistical learning methods and combine the information obtained from the UKF and PF. After describing the MSDF architecture, the key mathematical models for data fusion are presented. Conceptual studies are carried out on visual and thermal image fusion architectures.
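    One simple way to combine the UKF and PF outputs, as a stand-in for the neural-network combiner the abstract only conceptualises, is inverse-covariance (information-form) weighting of the two Gaussian state estimates. The sketch below is an illustrative assumption, not the paper's method, and it ignores any cross-correlation between the two trackers.

```python
import numpy as np

def combine_estimates(x_ukf, P_ukf, x_pf, P_pf):
    """Information-form fusion of two Gaussian state estimates.

    Note: this naive rule treats the two estimates as independent and
    ignores their cross-correlation; it is only a stand-in for the
    learned combiner described in the abstract.
    """
    info_ukf, info_pf = np.linalg.inv(P_ukf), np.linalg.inv(P_pf)
    P = np.linalg.inv(info_ukf + info_pf)
    x = P @ (info_ukf @ x_ukf + info_pf @ x_pf)
    return x, P

# Illustrative numbers: two-dimensional (range, range-rate) estimates.
x, P = combine_estimates(np.array([100.0, 20.0]), np.diag([4.0, 1.0]),
                         np.array([102.0, 19.0]), np.diag([2.0, 2.0]))
print(x, np.diag(P))
```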

    Multi-object tracking using sensor fusion

