Precise Localization and Formation Control of Swarm Robots via Wireless Sensor Networks
Precise localization and formation control are among the key technologies for achieving coordination and control of swarm robots, and they remain a bottleneck for practical applications of swarm robotic systems. To overcome the limited perception of individual robots and the difficulty of achieving precise localization and formation, this paper proposes a localization approach that combines dead reckoning (DR) with wireless sensor network (WSN)-based methods. Two WSN localization technologies are adopted: ZigBee-based RSSI (received signal strength indication) global localization, and electronic tag floors for calibrating local positioning. First, the DR localization information is fused with the ZigBee-based RSSI position information using a Kalman filter to achieve precise global localization and maintain the robot formation. Then the electronic tag floors provide the robots with precise coordinates in certain local areas, enabling the swarm to calibrate its formation by reducing the accumulated position errors. The overall localization and formation-control performance of the swarm robotic system is thereby improved. Both simulation results and experimental results on a real system are given to demonstrate the effectiveness of the proposed approach.
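The DR/RSSI fusion described in this abstract can be illustrated with a minimal sketch: a linear Kalman filter whose prediction step applies the odometry increment and whose update step corrects with a coarse RSSI position fix. All noise values, the identity measurement model, and the simulated trajectory below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# Minimal 2-D Kalman filter fusing dead-reckoning (DR) increments with
# noisy RSSI position fixes. Noise levels are illustrative assumptions.

def kf_step(x, P, dr_delta, z_rssi, Q, R):
    # Predict: apply the DR displacement and grow the uncertainty.
    x_pred = x + dr_delta
    P_pred = P + Q
    # Update: correct the prediction with the RSSI fix (H = identity).
    K = P_pred @ np.linalg.inv(P_pred + R)   # Kalman gain
    x_new = x_pred + K @ (z_rssi - x_pred)
    P_new = (np.eye(2) - K) @ P_pred
    return x_new, P_new

x = np.zeros(2)        # initial position estimate
P = np.eye(2) * 0.1    # initial covariance
Q = np.eye(2) * 0.05   # DR process noise (drift accumulates each step)
R = np.eye(2) * 1.0    # RSSI measurement noise (coarse global fix)

truth = np.zeros(2)
rng = np.random.default_rng(0)
for _ in range(20):
    step = np.array([0.1, 0.05])
    truth = truth + step
    dr = step + rng.normal(0.0, 0.02, 2)   # noisy odometry increment
    z = truth + rng.normal(0.0, 0.5, 2)    # noisy RSSI global fix
    x, P = kf_step(x, P, dr, z, Q, R)
```

Because DR noise is small but cumulative while RSSI noise is large but unbiased, the fused covariance settles at a bounded steady state instead of drifting, which is the property the paper exploits to keep the formation consistent between tag-floor calibrations.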
Multisensor-based human detection and tracking for mobile service robots
One of the fundamental issues for service robots is human-robot interaction. To perform such tasks and provide the desired services, these robots need to detect and track people in their surroundings. In this paper, we propose a solution for human tracking with a mobile robot that employs multisensor data fusion techniques. The system uses a new algorithm for laser-based leg detection with the on-board LRF. The approach is based on the recognition of typical leg patterns extracted from laser scans, which are shown to remain discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even while the robot is moving. Furthermore, faces are detected with the robot's camera, and this information is fused with the leg positions using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar sensor configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed in complex indoor environments.
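The leg-pattern idea in this abstract can be sketched very simply: split a range scan into segments at large discontinuities, then keep segments whose spatial width matches a leg diameter. The thresholds and the synthetic scan below are illustrative assumptions, not the paper's detector or its tuned values.

```python
import numpy as np

# Toy sketch of laser-based leg detection: segment a range scan at range
# jumps, then keep leg-width segments. Thresholds are assumptions.

def detect_legs(ranges, angles, jump=0.3, min_w=0.05, max_w=0.25):
    # Split the scan into segments at large range discontinuities.
    breaks = np.where(np.abs(np.diff(ranges)) > jump)[0] + 1
    segments = np.split(np.arange(len(ranges)), breaks)
    legs = []
    for seg in segments:
        if len(seg) < 2:
            continue
        # Cartesian points of the segment; width = endpoint distance.
        x = ranges[seg] * np.cos(angles[seg])
        y = ranges[seg] * np.sin(angles[seg])
        width = np.hypot(x[-1] - x[0], y[-1] - y[0])
        if min_w <= width <= max_w:        # leg-sized blob
            legs.append((x.mean(), y.mean()))
    return legs

# Synthetic scan: flat wall at 3 m with one leg-like blob at ~1 m ahead.
angles = np.linspace(-np.pi / 4, np.pi / 4, 181)
ranges = np.full_like(angles, 3.0)
ranges[85:96] = 1.0                        # 11 beams on a nearby "leg"
legs = detect_legs(ranges, angles)
```

In a real system each such candidate would then feed the filter update alongside the camera's face detections; a sequential update simply processes the two measurements one after the other within the same filter cycle.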
Active SLAM: A Review On Last Decade
This article presents a comprehensive review of the Active Simultaneous
Localization and Mapping (A-SLAM) research conducted over the past decade. It
explores the formulation, applications, and methodologies employed in A-SLAM,
particularly in trajectory generation and control-action selection, drawing on
concepts from Information Theory (IT) and the Theory of Optimal Experimental
Design (TOED). This review includes both qualitative and quantitative analyses
of various approaches, deployment scenarios, configurations, path-planning
methods, and utility functions within A-SLAM research. Furthermore, this
article introduces a novel analysis of Active Collaborative SLAM (AC-SLAM),
focusing on collaborative aspects within SLAM systems. It includes a thorough
examination of collaborative parameters and approaches, supported by both
qualitative and statistical assessments. This study also identifies limitations
in the existing literature and suggests potential avenues for future research.
This survey serves as a valuable resource for researchers seeking insights into
A-SLAM methods and techniques, offering a current overview of A-SLAM
formulation. (Comment: 34 pages, 8 figures, 6 tables.)
Common Data Fusion Framework : An open-source Common Data Fusion Framework for space robotics
Multisensor data fusion plays a vital role in providing autonomous systems with the environmental information crucial for reliable functioning. In this article, we summarize the modular structure of the newly developed and released Common Data Fusion Framework and explain how it is used. Sensor data are registered and fused within the Common Data Fusion Framework to produce comprehensive 3D environment representations and pose estimates. The software components that model this process in a reusable manner are presented through a complete overview of the framework; the provided data fusion algorithms are then listed, and the framework's approach is exemplified through the case of 3D reconstruction from 2D images. The Common Data Fusion Framework has been deployed and tested in various scenarios, including robots performing planetary rover exploration operations and tracking of orbiting satellites.
An annotated bibliography of multisensor integration
In this technical report, we give an annotated bibliography of the multisensor integration literature.
Cooperative localization of drones by using interval methods
In this article, we address the problem of cooperative pose estimation in a group of unmanned aerial vehicles (UAVs) in a bounded-error context. The UAVs are equipped with cameras to track landmarks and with a communication and ranging system to cooperate with their neighbors. Measurements are represented by intervals, and constraints are expressed on the robots' poses (positions and orientations). Pose-domain subpavings are obtained by set inversion via interval analysis. Each robot of the group first computes a pose domain using only its own sensor measurements. Then, through the exchange of position boxes, the positions are cooperatively refined by constraint propagation within the group. Results with real robot data are presented and show that position accuracy is improved thanks to cooperation.
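The box-exchange idea in this abstract can be sketched with elementary interval arithmetic: a poorly localized drone contracts its position box by intersecting it with a neighbor's box inflated by the maximum measured range. The function names, the axis-wise relaxation, and the numbers below are simplified assumptions for illustration, not the paper's SIVIA-based implementation.

```python
# Tiny interval sketch of cooperative position refinement by constraint
# propagation: each drone holds an axis-aligned box containing its
# position; a range measurement to a neighbor contracts that box.

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None   # None = empty intersection

def contract_with_neighbor(box, nbox, rng):
    # The constraint |p - q| in [rng] implies, per axis,
    # p_i in q_i + [-r_max, r_max]: a coarse but sound relaxation.
    r_max = rng[1]
    out = []
    for (lo, hi), (nlo, nhi) in zip(box, nbox):
        out.append(intersect((lo, hi), (nlo - r_max, nhi + r_max)))
    return out

# Drone A only knows it lies in a large box; drone B is well localized
# and the A-B distance is measured as the interval [0.9, 1.1] m.
box_a = [(-5.0, 5.0), (-5.0, 5.0)]
box_b = [(2.0, 2.2), (0.0, 0.2)]
refined = contract_with_neighbor(box_a, box_b, (0.9, 1.1))
```

After contraction, A's box shrinks from a 10 m square to a region around B inflated by the maximum range; iterating such contractions across all pairs in the group is what propagates each well-localized robot's information to its neighbors.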