6 research outputs found

    Hearing What You Cannot See: Acoustic Vehicle Detection Around Corners

    Full text link
    This work proposes passive acoustic perception as an additional sensing modality for intelligent vehicles. We demonstrate that approaching vehicles behind blind corners can be detected by sound before they enter the line of sight. We equipped a research vehicle with a roof-mounted microphone array and show, on data collected with this sensor setup, that wall reflections provide information on the presence and direction of occluded approaching vehicles. A novel method is presented to classify whether, and from which direction, a vehicle is approaching before it is visible, using as input Direction-of-Arrival features that can be efficiently computed from the streaming microphone array data. Since the local geometry around the ego-vehicle affects the perceived patterns, we systematically study several environment types and investigate generalization across them. With a static ego-vehicle, an accuracy of 0.92 is achieved on the hidden-vehicle classification task. Compared to a state-of-the-art visual detector, Faster R-CNN, our pipeline achieves the same accuracy more than one second earlier, providing crucial reaction time for the situations we study. While the ego-vehicle is driving, we demonstrate positive results on acoustic detection, still achieving an accuracy of 0.84 within one environment type. We further study failure cases across environments to identify future research directions.
    Comment: Accepted to IEEE Robotics & Automation Letters (2021), DOI: 10.1109/LRA.2021.3062254. Code, Data & Video: https://github.com/tudelft-iv/occluded_vehicle_acoustic_detectio
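    The Direction-of-Arrival features mentioned in the abstract build on time differences between microphone pairs. As a minimal illustrative sketch (not the authors' pipeline), the delay between two microphones can be estimated with GCC-PHAT; all signals and parameters below are synthetic:

    ```python
    import numpy as np

    def gcc_phat(sig, ref, fs):
        """Estimate the time delay of sig relative to ref via GCC-PHAT."""
        n = len(sig) + len(ref)
        SIG = np.fft.rfft(sig, n=n)
        REF = np.fft.rfft(ref, n=n)
        R = SIG * np.conj(REF)
        R /= np.abs(R) + 1e-12          # PHAT weighting: keep phase only
        cc = np.fft.irfft(R, n=n)
        max_shift = n // 2
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        shift = np.argmax(np.abs(cc)) - max_shift
        return shift / fs               # delay in seconds

    # Synthetic check: a noise burst delayed by 5 samples at 16 kHz
    fs = 16000
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1024)
    y = np.roll(x, 5)                   # y lags x by 5 samples
    tau = gcc_phat(y, x, fs)            # tau * fs should be close to 5
    ```

    For a microphone pair with known spacing, such a delay converts to an arrival angle; an array of many pairs yields the kind of DoA feature stream the paper classifies.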

    Onboard Audio and Video Processing for Secure Detection, Localization, and Tracking in Counter-UAV Applications

    Get PDF
    Nowadays, UAVs are of fundamental importance in numerous civil applications, such as search and rescue, and in military applications, such as monitoring, patrolling, and counter-UAV, where remote UAV nodes collect sensor data. In the latter case, flying UAVs collect environmental data used to counter external attacks launched by adversary drones. However, due to the limited computing resources on board the acquisition UAVs, most of the signal processing is still performed on a central ground unit to which the sensor data is sent wirelessly. This poses serious security problems, such as cyber attacks by malicious entities that exploit vulnerabilities at the application level. One way to reduce the risk is to concentrate part of the computing on board the remote nodes. In this context, we propose a framework in which the detection of nearby drones, as well as their localization and tracking, can be performed in real time on the small computing devices mounted on board the drones. Background subtraction is applied to the video frames as pre-processing, with the objective of onboard UAV detection using machine-vision algorithms. For the localization and tracking of the detected UAV, multi-channel acoustic signals are instead considered, and DOA estimations are obtained through the MUSIC algorithm. In this work, the proposed idea is described in detail along with some experiments, and methods of effective implementation are then provided.
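    The MUSIC algorithm named in the abstract scans candidate angles for directions orthogonal to the noise subspace of the array covariance. A minimal narrowband sketch for a uniform linear array follows; the array geometry, SNR, and source angle are illustrative assumptions, not the paper's setup:

    ```python
    import numpy as np

    def music_spectrum(X, n_sources, d_over_lambda, angles_deg):
        """Narrowband MUSIC pseudo-spectrum for a uniform linear array.

        X: (n_mics, n_snapshots) complex baseband snapshots.
        """
        n_mics = X.shape[0]
        R = X @ X.conj().T / X.shape[1]          # sample covariance
        eigvals, eigvecs = np.linalg.eigh(R)     # eigenvalues ascending
        En = eigvecs[:, : n_mics - n_sources]    # noise subspace
        angles = np.deg2rad(np.asarray(angles_deg))
        m = np.arange(n_mics)[:, None]
        A = np.exp(-2j * np.pi * d_over_lambda * m * np.sin(angles)[None, :])
        proj = np.linalg.norm(En.conj().T @ A, axis=0)
        return 1.0 / (proj ** 2 + 1e-12)         # peaks at source angles

    # Synthetic check: one source at +20 deg, 8-mic half-wavelength ULA
    rng = np.random.default_rng(1)
    n_mics, snaps, theta = 8, 200, np.deg2rad(20.0)
    m = np.arange(n_mics)[:, None]
    steer = np.exp(-2j * np.pi * 0.5 * m * np.sin(theta))
    s = rng.standard_normal((1, snaps)) + 1j * rng.standard_normal((1, snaps))
    X = steer @ s + 0.05 * (rng.standard_normal((n_mics, snaps))
                            + 1j * rng.standard_normal((n_mics, snaps)))
    grid = np.arange(-90.0, 90.5, 0.5)
    est = grid[np.argmax(music_spectrum(X, 1, 0.5, grid))]
    ```

    In a real counter-UAV deployment the snapshots would come from the onboard microphone array, and the pseudo-spectrum peaks would feed the tracker.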

    Outdoor auditory scene analysis using a moving microphone array embedded in a quadrocopter

    No full text

    Research on Vibration Source Localization Using High-Speed Vision (高速ビジョンを用いた振動源定位に関する研究)

    Get PDF
    Doctoral thesis, Doctor of Engineering, Hiroshima University (広島大学)