Real-time Wireless Sensing at the Edge for In Situ Multi-Robot Deployment

Abstract

Autonomous robots are increasingly being deployed in complex, unstructured, and often GPS-denied environments to support critical tasks such as search and rescue, wildlife monitoring, and exploration. In these domains, coordination among multiple robots is essential for improving task efficiency, spatial coverage, and resilience. Achieving effective multi-robot coordination, however, hinges on access to global state information such as relative positions, which is often difficult to obtain when robots operate under communication constraints and must rely on local onboard sensing alone. Visual sensors such as LiDAR and cameras offer a means of local observation, but their utility degrades significantly in the presence of occlusions or poor visibility. To overcome these limitations, wireless signals have emerged as a complementary sensing modality: by leveraging signal phase variations during a robot's motion, it is possible to emulate a virtual antenna array and estimate bearing directions to signal sources (a minimal sketch of this principle follows the abstract). However, integrating this capability into mobile robotic systems, and extending it to multi-robot coordination tasks, presents several key challenges related to sensing accuracy, algorithmic scalability, platform constraints, and real-world deployment.

The central objective of this thesis is to establish wireless signal-based directionality sensing as a robust and scalable onboard modality for coordination in in situ multi-robot tasks, particularly under constrained communication and sensing conditions. To realize this, the thesis makes contributions at the intersection of algorithm design, systems development, and real-world validation.

In the first part, we develop decentralized and distributed coordination algorithms for multi-robot exploration and mapping, relying on bearing estimates derived from signal-phase measurements using commercial off-the-shelf Wi-Fi cards. This approach eliminates the need for a shared map or a global coordinate frame, enabling coordination in GPS-denied and infrastructure-sparse environments. The system is validated under strict communication constraints in which robots share only minimal information.

In the second part, we extend this capability beyond 2D planar or linear motion to unconstrained 3D free-space motion, enabling aerial robots to participate in coordination. We introduce algorithmic and systems-level advances that generalize the bearing estimation process to three-dimensional trajectories and integrate them into a software toolbox deployable on mobile robots with onboard sensing and computation.

Finally, we broaden the applicability of this sensing modality by demonstrating compatibility with low-power, off-the-shelf fish-tracking tags operating in the very high frequency (VHF) band. These tags are widely used in marine wildlife research to enable remote sensing at long range. We integrate this capability into a lightweight drone platform and demonstrate real-world deployment in challenging marine environments while accounting for stringent size, weight, and power (SWaP) constraints.

Overall, this thesis takes a significant step toward making wireless sensing a core primitive for mobile robotic systems. It presents novel algorithmic frameworks, open-source system implementations, and extensive empirical validation, in simulation and in field deployments, that collectively establish wireless-based directionality as a powerful and general-purpose tool for multi-robot coordination in communication-constrained and unstructured environments.
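The virtual-antenna-array idea referenced in the abstract can be illustrated with a short numerical sketch. The Python code below is an assumption-laden illustration, not the thesis's implementation: it assumes a far-field transmitter, complex Wi-Fi channel measurements (CSI) tagged with odometry positions along the robot's trajectory, and an assumed 2.4 GHz wavelength, and it computes a Bartlett-style angle-of-arrival profile over candidate bearings. All names and parameters here are hypothetical.

import numpy as np

# Illustrative sketch (assumptions, not the thesis implementation):
# a robot moving along a known trajectory collects complex channel
# measurements (CSI); combining them against steering hypotheses
# emulates a virtual antenna array and yields a bearing estimate.

WAVELENGTH = 0.125  # metres; assumed ~2.4 GHz Wi-Fi carrier

def bearing_profile(positions, csi, wavelength=WAVELENGTH, n_angles=360):
    """Return candidate bearings (rad) and a Bartlett-style power profile.

    positions : (K, 2) robot positions along the trajectory (e.g. from odometry).
    csi       : (K,) complex channel measurements taken at those positions.
    """
    angles = np.linspace(-np.pi, np.pi, n_angles, endpoint=False)
    # Unit direction vector for each candidate bearing, shape (n_angles, 2).
    directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    # Expected far-field phase at each trajectory point for each hypothesis.
    phase = (2.0 * np.pi / wavelength) * (positions @ directions.T)  # (K, n_angles)
    steering = np.exp(-1j * phase)  # modelled measurement per hypothesis
    # Coherently combine the measurements against every steering hypothesis.
    power = np.abs(np.conj(steering).T @ csi) ** 2  # (n_angles,)
    return angles, power

def estimate_bearing(positions, csi):
    angles, power = bearing_profile(np.asarray(positions), np.asarray(csi))
    return angles[np.argmax(power)]

if __name__ == "__main__":
    # Simulated example: transmitter at 60 degrees, robot moves along a short arc.
    rng = np.random.default_rng(0)
    true_bearing = np.deg2rad(60.0)
    pos = np.column_stack([np.linspace(0.0, 0.5, 64),
                           0.1 * np.sin(np.linspace(0.0, 3.0, 64))])
    u = np.array([np.cos(true_bearing), np.sin(true_bearing)])
    meas = np.exp(-1j * 2.0 * np.pi / WAVELENGTH * (pos @ u))
    meas += 0.1 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
    print(f"estimated bearing: {np.rad2deg(estimate_bearing(pos, meas)):.1f} deg")

With a non-degenerate aperture spanning a few wavelengths, the peak of the profile should fall near the true bearing (roughly 60 degrees in this simulated example). A purely linear trajectory leaves a mirror ambiguity about the line of motion, which is one reason richer 2D or 3D apertures are useful.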

This thesis was published in Harvard University - DASH.
