4 research outputs found
Adaptive and Unsupervised Learning-based 3D Spatio-temporal Filter for Event-driven Cameras
In the evolving landscape of robotics and visual navigation, event cameras have gained
significant traction, notably for their exceptional dynamic range, efficient power consumption
and low latency. Despite these advantages, conventional processing methods oversimplify the
data into two dimensions, neglecting critical temporal information. To overcome this limitation,
we propose a novel method that treats events as 3D time-discrete signals. Drawing inspiration
from the intricate biological filtering systems inherent to the human visual apparatus, we have
developed a 3D spatio-temporal filter based on an unsupervised machine learning algorithm. This
filter effectively reduces noise and shrinks the data volume, with its parameters dynamically
adjusted based on Population Activity. This ensures adaptability and precision under various
conditions, such as changes in motion velocity and ambient lighting.
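To make the filtering idea concrete, the sketch below keeps only events with enough spatio-temporal support, using a threshold that scales with the current population activity. It is a minimal illustration under assumed conventions (events as (x, y, t) rows sorted by time; the function name, window sizes, and scaling constant are hypothetical), not the paper's learned filter.

```python
import numpy as np

def spatio_temporal_filter(events, radius_px=1, window_us=5000, base_threshold=2):
    """Keep events that have enough neighbours in a 3D (x, y, t) window.

    events: array of shape (N, 3) with columns (x, y, t_us), sorted by time.
    The support threshold scales with the population activity (average number
    of events per window), so the filter tightens in busy scenes and relaxes
    in quiet ones.
    """
    events = np.asarray(events, dtype=np.float64)
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    keep = np.zeros(len(events), dtype=bool)

    duration = max(t[-1] - t[0], 1.0)
    activity = len(events) * window_us / duration          # mean events per window
    threshold = max(base_threshold, int(0.01 * activity))  # illustrative scaling

    start = 0
    for i in range(len(events)):
        while t[i] - t[start] > window_us:                  # slide the causal window
            start += 1
        near = (np.abs(x[start:i] - x[i]) <= radius_px) & \
               (np.abs(y[start:i] - y[i]) <= radius_px)
        keep[i] = near.sum() >= threshold
    return events[keep]
```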
In our novel validation approach, we first identify the noise type and determine its power
spectral density in the event stream. We then apply a one-dimensional discrete Fast Fourier
Transform to assess the filtered event data within the frequency domain, ensuring the targeted
noise frequencies are adequately reduced. Our research also delved into the impact of indoor
lighting on event stream noise. Remarkably, our method reduced the size of the event point
cloud by 37% while improving data quality in diverse outdoor settings.
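As an illustration of the frequency-domain check described above, the following sketch bins event timestamps into a one-dimensional rate signal and computes its discrete Fourier spectrum; the function name, bin width, and normalisation are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def event_rate_psd(timestamps_us, bin_us=1000):
    """Bin event timestamps into a 1D rate signal and return its power spectrum.

    timestamps_us: 1D array of event timestamps in microseconds.
    bin_us: temporal bin width; 1 ms bins give a 1 kHz sampling rate.
    """
    t = np.asarray(timestamps_us, dtype=np.float64)
    t -= t.min()
    n_bins = int(np.ceil(t.max() / bin_us)) + 1
    rate, _ = np.histogram(t, bins=n_bins, range=(0, n_bins * bin_us))

    # One-dimensional discrete FFT of the zero-mean rate signal.
    spectrum = np.fft.rfft(rate - rate.mean())
    freqs_hz = np.fft.rfftfreq(n_bins, d=bin_us * 1e-6)
    psd = (np.abs(spectrum) ** 2) / n_bins
    return freqs_hz, psd

# Comparing the spectra of the raw and filtered streams indicates whether the
# targeted noise frequencies were actually attenuated.
```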
Discussion on event-based cameras for dynamic obstacles recognition and detection for UAVs in outdoor environments
To safely navigate and avoid obstacles in a complex dynamic environment, autonomous drones need a reaction time of less than 10 milliseconds. Event-based cameras have therefore become increasingly widespread in academic research on dynamic obstacle detection and avoidance for UAVs, as they outperform their
frame-based counterparts in terms of latency. Several publications have shown significant results using these sensors. However, most of the experiments relied on indoor data. After a short introduction explaining the differences and features of an event-based camera compared to a traditional RGB camera, this work explores the limits of state-of-the-art event-based algorithms for obstacle recognition and detection by extending their results from indoor experiments to real-world outdoor
experiments. Indeed, this paper shows the inaccuracy of event-based
recognition algorithms due to the insufficient number of events generated, and the inefficiency of event-based obstacle detection algorithms due to the high noise ratio.
Exploring the Technical Advances and Limits of Autonomous UAVs for Precise Agriculture in Constrained Environments
In the field of precise agriculture, autonomous unmanned aerial vehicles (UAVs) hold significant potential to transform crop monitoring, management, and harvesting techniques. However, despite the numerous benefits of UAVs in smart farming, there are still several technical challenges that need to be addressed in order to render their widespread adoption possible, especially in constrained environments. This paper provides a study of the technical aspects and limitations of autonomous UAVs in precise agriculture applications for constrained environments.
Autonomous Hybrid Ground/Aerial Mobility in Unknown Environments
Hybrid ground and aerial vehicles can possess distinct advantages over
ground-only or flight-only designs in terms of energy savings and increased
mobility. In this work we outline our unified framework for controls, planning,
and autonomy of hybrid ground/air vehicles. Our contribution is three-fold: 1)
We develop a control scheme for passive two-wheeled hybrid
ground/aerial vehicles. 2) We present a unified planner for both rolling and
flying by leveraging differential flatness mappings. 3) We conduct experiments
leveraging mapping and global planning for hybrid mobility in unknown
environments, showing that hybrid mobility uses up to five times less energy
than flying only.
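To illustrate the differential-flatness idea behind contribution 2, the sketch below maps a planar flat-output trajectory (position and its second derivative) to thrust and pitch for a simplified multirotor model. The model, mass value, and function name are assumptions for illustration; the paper's planner also covers the rolling mode, which is not reproduced here.

```python
import numpy as np

G = 9.81    # gravitational acceleration [m/s^2]
MASS = 1.0  # vehicle mass [kg] (illustrative value)

def flat_outputs_to_inputs(acc_xz):
    """Map a flat-output trajectory (planar position) to thrust and pitch.

    acc_xz: desired (x, z) acceleration of the center of mass [m/s^2].
    Returns (thrust [N], pitch [rad]) for a planar vehicle whose thrust axis
    must carry both gravity compensation and the commanded acceleration.
    This is the standard flatness argument for multirotor flight; a rolling
    mode would map the same flat trajectory to wheel commands instead.
    """
    ax, az = acc_xz
    fx, fz = ax, az + G               # total specific force the thrust must produce
    thrust = MASS * np.hypot(fx, fz)
    pitch = np.arctan2(fx, fz)        # tilt of the thrust axis from vertical
    return thrust, pitch

# Example: hover (zero acceleration) yields thrust = m*g and zero pitch.
print(flat_outputs_to_inputs((0.0, 0.0)))
```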