Mixed marker-based/marker-less visual odometry system for mobile robots
When moving in generic indoor environments, robotic platforms generally rely solely on information provided by onboard sensors to determine their position and orientation. However, the lack of absolute references often introduces severe drifts into the computed estimates, making autonomous operation hard to accomplish. This paper proposes a solution that alleviates these issues by combining two vision-based pose estimation techniques working in relative and absolute coordinate systems, respectively. In particular, the unknown ground features in the images captured by the vertical camera of a mobile platform are processed by a vision-based odometry algorithm capable of estimating relative frame-to-frame movements. The errors accumulated in this step are then corrected using artificial markers placed at known positions in the environment. Framing these markers from time to time allows the robot to keep the drift bounded, while additionally providing it with the navigation commands needed for autonomous flight. The accuracy and robustness of the designed technique are demonstrated on an off-the-shelf quadrotor via extensive experimental tests.
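The scheme this abstract describes, relative frame-to-frame odometry whose accumulated drift is bounded by absolute marker fixes, can be sketched in 2D. This is an illustrative toy model under assumed interfaces (the class, its methods, and the simple "snap to marker pose" correction are hypothetical), not the paper's actual algorithm:

```python
import math

class DriftCorrectedOdometry:
    """Toy 2D pose estimator: dead-reckons relative motion and
    snaps to an absolute fix whenever a known marker is framed."""

    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta

    def apply_relative(self, dx, dy, dtheta):
        # Frame-to-frame motion expressed in the robot frame,
        # rotated into the world frame before accumulation.
        c, s = math.cos(self.theta), math.sin(self.theta)
        self.x += c * dx - s * dy
        self.y += s * dx + c * dy
        self.theta += dtheta

    def correct_with_marker(self, marker_world_pose):
        # A marker at a known world position provides an absolute
        # reference, bounding the drift accumulated above.
        self.x, self.y, self.theta = marker_world_pose
```

In practice the correction would blend the absolute fix with the odometry estimate (e.g. via a filter) rather than overwrite it, but the structure, relative integration punctuated by absolute resets, is the same.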
Virtual Borders: Accurate Definition of a Mobile Robot's Workspace Using Augmented Reality
We address the problem of interactively controlling the workspace of a mobile
robot to ensure a human-aware navigation. This is especially of relevance for
non-expert users living in human-robot shared spaces, e.g. home environments,
since they want to keep the control of their mobile robots, such as vacuum
cleaning or companion robots. Therefore, we introduce virtual borders that are
respected by a robot while performing its tasks. For this purpose, we employ an
RGB-D Google Tango tablet as a human-robot interface in combination with an
augmented reality application to flexibly define virtual borders. We evaluated
our system with 15 non-expert users concerning accuracy, teaching time and
correctness and compared the results with other baseline methods based on
visual markers and a laser pointer. The experimental results show that our
method features an equally high accuracy while reducing the teaching time
significantly compared to the baseline methods. This holds for different border
lengths, shapes and variations in the teaching process. Finally, we
demonstrated the correctness of the approach, i.e. the mobile robot changes its
navigational behavior according to the user-defined virtual borders.

Comment: Accepted at the 2018 IEEE/RSJ International Conference on Intelligent
Robots and Systems (IROS), supplementary video: https://youtu.be/oQO8sQ0JBR
MOMA: Visual Mobile Marker Odometry
In this paper, we present a cooperative odometry scheme based on the
detection of mobile markers in line with the idea of cooperative positioning
for multiple robots [1]. To this end, we introduce a simple optimization scheme
that realizes visual mobile marker odometry via accurate fixed marker-based
camera positioning and analyse the characteristics of errors inherent to the
method compared to classical fixed marker-based navigation and visual odometry.
In addition, we provide a specific UAV-UGV configuration that allows for
continuous movement of the UAV without stopping, and a minimal
caterpillar-like configuration that works with a single UGV. Finally, we
present a real-world implementation and evaluation for the proposed UAV-UGV
configuration.
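The geometric core of marker-chained odometry like this is composing rigid transforms: the camera is localized against a fixed marker at a known world pose, and the mobile marker's world pose follows by composition. A hypothetical 2D version (the function names, tuple convention, and example poses are illustrative assumptions, not MOMA's implementation):

```python
import math

def compose(a, b):
    """Compose two 2D rigid transforms (x, y, theta): returns a then b."""
    ax, ay, at = a
    bx, by, bt = b
    c, s = math.cos(at), math.sin(at)
    return (ax + c * bx - s * by, ay + s * bx + c * by, at + bt)

def invert(t):
    """Inverse of a 2D rigid transform (x, y, theta)."""
    x, y, th = t
    c, s = math.cos(th), math.sin(th)
    return (-c * x - s * y, s * x - c * y, -th)

# Chain: world->fixed_marker is known, camera->fixed_marker and
# camera->mobile_marker are estimated from images; composing them
# yields world->mobile_marker, i.e. the mobile marker's absolute pose.
world_T_fixed = (2.0, 0.0, 0.0)
cam_T_fixed = (1.0, 0.0, 0.0)    # fixed marker seen 1 m ahead of the camera
cam_T_mobile = (0.0, 1.0, 0.0)   # mobile marker 1 m to the camera's left
world_T_cam = compose(world_T_fixed, invert(cam_T_fixed))
world_T_mobile = compose(world_T_cam, cam_T_mobile)
```

Each hop in the chain adds its own estimation error, which is why the abstract contrasts the error characteristics of this method with classical fixed-marker navigation and plain visual odometry.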
Performance Study on Natural Marker Detection for Augmented Reality Supported Facility Maintenance
The operation and maintenance phase is the longest and most expensive life-cycle period of building facilities. Operators need to perform activities to provide a comfortable living and working environment and to maintain equipment to prevent functionality failures. For that purpose they manually browse, sort and select dispersed and unformatted facility information before actually going on site. Although some software tools have been introduced, operators still spend 50% of their on-site work on inspection target localization and navigation. To improve these manual, time-consuming and tedious procedures, the authors previously presented a framework that uses BIM-based Augmented Reality (AR) to support facility maintenance tasks. The proposed workflow contains AR-supported activities, namely AR-based indoor navigation and AR-based maintenance instructions. An inherent problem of AR is marker definition and detection. As previously introduced, indoor natural markers such as exit signs, fire extinguisher location signs, and appliance labels were identified as suitable for both navigation and maintenance instructions. However, small markers, changing lighting conditions, and low detection frame rates and accuracies might prevent the proposed approach from being practical. In this paper the performance of natural marker detection will be evaluated under different configurations, varying marker types, marker sizes, camera resolutions and lighting conditions. Detection performance will be measured using pre-defined metrics incorporating detection accuracy, tracking quality, frame rate, and robustness. The result will be a set of recommendations on which configurations are most suitable and practical within the given framework.
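Aggregating per-frame detection results into summary metrics of the kind this study measures (detection rate, mean frame time, effective frame rate) can be sketched as follows; the function name and input convention are illustrative assumptions, not the paper's actual evaluation code:

```python
def summarize_detection(frames):
    """Aggregate per-frame marker detection results.

    `frames` is a list of (detected: bool, frame_time_ms: float) tuples,
    one entry per processed video frame.
    """
    if not frames:
        return {"detection_rate": 0.0, "mean_frame_ms": 0.0, "fps": 0.0}
    detected = sum(1 for d, _ in frames if d)
    mean_ms = sum(t for _, t in frames) / len(frames)
    return {
        "detection_rate": detected / len(frames),   # fraction of frames with a hit
        "mean_frame_ms": mean_ms,                   # average processing time
        "fps": 1000.0 / mean_ms if mean_ms > 0 else float("inf"),
    }
```

Running such a summary separately per configuration (marker type, size, resolution, lighting) is what turns raw detection logs into the comparative recommendations the abstract promises.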
Augmented Reality Based Indoor Positioning Navigation Tool
Indoor navigation has recently gained considerable attention, and many techniques and technologies have been applied to its development. Nevertheless, indoor navigation still lags far behind outdoor navigation: outdoors, GPS can guide users and give directions to a desired place, but it is restricted to outdoor use only. Thus, the main objective of this project is to develop an interactive indoor navigation system in which augmented reality is used to superimpose directional signage. In this project a small computer, a Raspberry Pi, was used as the computing device. In the future, smartphones may commonly offer augmented reality based indoor navigation tools, as they are already equipped with many sensors, such as an accelerometer, gyroscope, and compass, that can improve positioning accuracy. The project was tested at Universiti Teknologi PETRONAS's Information Resource Centre (IRC), where it demonstrated its flexibility as an indoor positioning tool by navigating to 5 different locations across multiple levels.
Use of Augmented Reality in Human Wayfinding: A Systematic Review
Augmented reality technology has emerged as a promising solution to assist
with wayfinding difficulties, bridging the gap between obtaining navigational
assistance and maintaining an awareness of one's real-world surroundings. This
article presents a systematic review of research literature related to AR
navigation technologies. An in-depth analysis of 65 salient studies was
conducted, addressing four main research topics: 1) current state-of-the-art of
AR navigational assistance technologies, 2) user experiences with these
technologies, 3) the effect of AR on human wayfinding performance, and 4)
impacts of AR on human navigational cognition. Notably, studies demonstrate
that AR can decrease cognitive load and improve cognitive map development, in
contrast to traditional guidance modalities. However, findings regarding
wayfinding performance and user experience were mixed. Some studies suggest
little impact of AR on improving outdoor navigational performance, and certain
information modalities may be distracting and ineffective. This article
discusses these nuances in detail, supporting the conclusion that AR holds
great potential in enhancing wayfinding by providing enriched navigational
cues, interactive experiences, and improved situational awareness.

Comment: 52 pages
Augmented Reality Based Indoor Positioning Navigation Tool
The main objective of this project is to design a new method for developing an indoor positioning and navigation system that relies on image processing rather than wireless technology.
- …