Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios
Event cameras are bio-inspired vision sensors that output pixel-level
brightness changes instead of standard intensity frames. These cameras do not
suffer from motion blur and have a very high dynamic range, which enables them
to provide reliable visual information during high speed motions or in scenes
characterized by high dynamic range. However, event cameras output little
information when motion is limited, as in near-stationary scenes. Conversely,
standard cameras provide instant and rich information
about the environment most of the time (in low-speed and good lighting
scenarios), but they fail severely in case of fast motions, or difficult
lighting such as high dynamic range or low light scenes. In this paper, we
present the first state estimation pipeline that leverages the complementary
advantages of these two sensors by fusing in a tightly-coupled manner events,
standard frames, and inertial measurements. We show on the publicly available
Event Camera Dataset that our hybrid pipeline leads to an accuracy improvement
of 130% over event-only pipelines, and 85% over standard-frames-only
visual-inertial systems, while still being computationally tractable.
Furthermore, we use our pipeline to demonstrate - to the best of our knowledge
- the first autonomous quadrotor flight using an event camera for state
estimation, unlocking flight scenarios that were not reachable with traditional
visual-inertial odometry, such as low-light environments and high-dynamic range
scenes.

Comment: 8 pages, 9 figures, 2 tables
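The complementarity the abstract describes can be illustrated with a toy sketch (not the paper's tightly-coupled pipeline, which jointly optimizes feature tracks and inertial measurements): inverse-variance fusion of two scalar position estimates, where each sensor's variance grows with a condition-dependent degradation factor. All numbers here are illustrative assumptions.

```python
# Toy illustration, NOT the paper's method: inverse-variance fusion of two
# measurements whose reliability depends on conditions. Event cameras degrade
# at low motion; standard frames degrade under motion blur or extreme HDR.

def sensor_variance(base, degradation):
    """Variance grows as the sensor's degradation factor (0..1) rises."""
    return base * (1.0 + 10.0 * degradation)

def fuse(z_events, var_events, z_frames, var_frames):
    """Inverse-variance weighted fusion of two scalar measurements."""
    w_e = 1.0 / var_events
    w_f = 1.0 / var_frames
    return (w_e * z_events + w_f * z_frames) / (w_e + w_f)

# Fast-motion scenario: frames suffer motion blur, events stay reliable.
var_e = sensor_variance(0.01, degradation=0.0)   # events: reliable
var_f = sensor_variance(0.01, degradation=0.9)   # frames: heavily blurred
x = fuse(1.00, var_e, 1.30, var_f)
# The fused estimate is pulled toward the more reliable event measurement.
```

In the actual paper the fusion is done inside a sliding-window visual-inertial optimizer rather than per-measurement weighting; this sketch only conveys why combining the two modalities helps across regimes.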
Multi-sensor classification of tennis strokes
In this work, we investigate tennis stroke recognition using a single inertial measurement unit attached to a player’s forearm during a competitive match. This paper evaluates the best approach to stroke detection using the accelerometers, gyroscopes, or magnetometers embedded in the inertial measurement unit. This work determines the optimal training data set for stroke classification and shows that classifiers can perform well when tested on players who were not used to train the classifier. This work is a significant step toward our overall goal: to develop next-generation sports coaching tools using both inertial and visual sensors in an instrumented indoor sporting environment.
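A minimal sketch of the kind of pipeline the abstract describes, under heavy simplifying assumptions: strokes are detected as threshold crossings of the accelerometer magnitude, and classified by a single hypothetical peak-magnitude rule. The thresholds and the serve/groundstroke rule are invented for illustration and are not from the paper.

```python
import math

def accel_magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer sample (ax, ay, az)."""
    return math.sqrt(sum(a * a for a in sample))

def detect_strokes(samples, threshold=3.0):
    """Return indices where |accel| crosses the threshold upward --
    a crude stand-in for a stroke-detection stage."""
    hits = []
    above = False
    for i, s in enumerate(samples):
        mag = accel_magnitude(s)
        if mag > threshold and not above:
            hits.append(i)
            above = True
        elif mag <= threshold:
            above = False
    return hits

def classify_stroke(peak_mag, serve_threshold=8.0):
    """Hypothetical rule: serves produce the largest acceleration peaks."""
    return "serve" if peak_mag > serve_threshold else "groundstroke"

# Synthetic trace: quiet, a groundstroke burst, quiet, a serve burst.
trace = [(0.1, 0.0, 1.0)] * 5 + [(4.0, 1.0, 2.0)] + [(0.1, 0.0, 1.0)] * 5 \
        + [(9.0, 2.0, 3.0)] + [(0.1, 0.0, 1.0)] * 3
events = detect_strokes(trace)
labels = [classify_stroke(accel_magnitude(trace[i])) for i in events]
```

A real system would extract windowed features around each detected peak and feed them to a trained classifier; this sketch only shows the detect-then-classify structure.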
Toward next generation coaching tools for court based racquet sports
Even with today’s advances in automatic indexing of multimedia content, existing coaching tools for court sports lack the ability to automatically index a competitive match into key events. This paper proposes an automatic event indexing and event retrieval system for tennis, suitable for coaching players from beginner level upwards. Event indexing is possible using either visual or inertial sensing, with the latter potentially providing system portability. To maximize event-indexing performance, multi-sensor data integration is implemented, in which data from both sensors are merged to automatically index key tennis events. A complete event retrieval system is also presented, allowing coaches to build advanced queries that existing sports coaching solutions cannot facilitate without an inordinate amount of manual indexing.
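To make the "advanced queries" idea concrete, here is a small sketch of composable filters over an indexed event list. The `MatchEvent` fields and the query parameters are illustrative assumptions, not the paper's schema.

```python
from dataclasses import dataclass

@dataclass
class MatchEvent:
    """One indexed key event from a match (fields are illustrative)."""
    t: float     # seconds from match start
    kind: str    # e.g. "serve", "forehand", "backhand"
    player: str

def query(events, kind=None, player=None, t_min=None, t_max=None):
    """Compose filters the way a coach might build an advanced query,
    e.g. 'all serves by player A in the first minute'."""
    out = events
    if kind is not None:
        out = [e for e in out if e.kind == kind]
    if player is not None:
        out = [e for e in out if e.player == player]
    if t_min is not None:
        out = [e for e in out if e.t >= t_min]
    if t_max is not None:
        out = [e for e in out if e.t <= t_max]
    return out

index = [
    MatchEvent(12.0, "serve", "A"),
    MatchEvent(15.5, "forehand", "B"),
    MatchEvent(70.2, "serve", "A"),
]
first_minute_serves = query(index, kind="serve", t_max=60.0)
```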
Transfer Learning-Based Crack Detection by Autonomous UAVs
Unmanned Aerial Vehicles (UAVs) have recently shown great performance
collecting visual data through autonomous exploration and mapping in building
inspection. Yet, few studies consider the post-processing of this data and its
integration with autonomous UAVs, both of which would enable major steps toward
fully automated building inspection. In this regard, this work presents a
decision-making tool for revisiting tasks in
visual building inspection by autonomous UAVs. The tool is an implementation of
fine-tuning a pretrained Convolutional Neural Network (CNN) for surface crack
detection. It offers an optional mechanism for task planning of revisiting
pinpoint locations during inspection. It is integrated into a quadrotor UAV
system that can autonomously navigate in GPS-denied environments. The UAV is
equipped with onboard sensors and computers for autonomous localization,
mapping and motion planning. The integrated system is tested through
simulations and real-world experiments. The results show that the system
achieves crack detection and autonomous navigation in GPS-denied environments
for building inspection.
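The revisit-planning idea can be sketched as follows, with the fine-tuned CNN replaced by a trivial stand-in scorer. Everything here (the dark-pixel score, the threshold, the waypoint format) is a hypothetical illustration of scoring inspected locations and queueing the highest-scoring ones for revisiting.

```python
def crack_probability(image_patch):
    """Stand-in for the fine-tuned CNN's crack score in [0, 1].
    Here: fraction of 'dark' pixels, purely for illustration."""
    dark = sum(1 for p in image_patch if p < 50)
    return dark / len(image_patch)

def plan_revisits(inspections, threshold=0.5):
    """Return waypoints whose patches score above the crack threshold,
    highest score first -- mimicking the optional revisit mechanism."""
    flagged = [(crack_probability(patch), wp) for wp, patch in inspections]
    return [wp for score, wp in sorted(flagged, reverse=True)
            if score > threshold]

# (waypoint, grayscale patch) pairs gathered during a first inspection pass.
inspections = [
    ((1.0, 2.0, 3.0), [200, 210, 190, 40]),   # mostly bright: no crack
    ((4.0, 2.0, 3.0), [30, 20, 210, 10]),     # mostly dark: likely crack
]
revisit_queue = plan_revisits(inspections)
```

In the actual system the scorer is a pretrained CNN fine-tuned on crack imagery, and the waypoints feed the UAV's motion planner; this sketch only captures the score-then-replan decision logic.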