Marine Vessel Inspection as a Novel Field for Service Robotics: A Contribution to Systems, Control Methods and Semantic Perception Algorithms.
This cumulative thesis introduces a novel field for service robotics: the inspection of marine vessels using mobile inspection robots. Three scientific contributions are provided and experimentally verified in the field of marine inspection, though they are not limited to this type of application; the inspection scenario serves as a golden thread connecting the cumulative scientific results. The first contribution is an adaptive, proprioceptive control approach for hybrid leg-wheel robots, such as the robot ASGUARD described in this thesis. Thanks to this control concept, the robot is able to deal with rough terrain and stairs, making it a suitable platform for moving inside the cargo holds of bulk carriers and delivering visual data from inside the hold. Its stair-climbing ability also allows the system to move between different decks. The robot adapts its gait pattern dynamically, based on proprioceptive data received from the joint motors and on the pitch and tilt angles of the robot's body during locomotion. The second major contribution is an independent ship inspection system consisting of a magnetic wall-climbing robot for bulkhead inspection, a particle-filter-based localization method, and a spatial content management system (SCMS) for the representation and organization of spatial inspection data. The system was evaluated in several laboratory experiments and in field trials on two different marine vessels, in close collaboration with ship surveyors. The third scientific contribution is a novel approach to structural classification using semantic perception: a structured environment is semantically annotated based on the spatial relationships between spatial entities and spatial features.
This method was verified in the domain of indoor perception (logistics and household environments), for soil sample classification, and for the classification of the structural parts of a marine vessel. The proposed method allows the structural parts of a cargo hold to be described in order to localize the inspection robot or any detected damage. The algorithms proposed in this thesis operate on unorganized 3D point clouds generated by a LIDAR within a ship's cargo hold. Two different semantic perception methods are proposed: one is based on probabilistic constraint networks; the other on Fuzzy Description Logic and spatial reasoning over a spatial ontology of the environment.
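The particle-filter-based localization named among the contributions can be illustrated with a minimal sketch. This is a hypothetical 1-D toy (not the thesis's implementation): particles representing candidate robot positions along a bulkhead are propagated with noisy odometry, weighted by the likelihood of a range measurement to a single known landmark, and resampled. All numeric values are made up for illustration.

```python
import math
import random

def particle_filter_step(particles, control, measurement, landmark, noise=0.2):
    # Motion update: propagate each particle with noisy odometry.
    moved = [p + control + random.gauss(0, noise) for p in particles]
    # Measurement update: weight each particle by the likelihood of the
    # observed range to the known landmark under a Gaussian noise model.
    def likelihood(p):
        expected = abs(landmark - p)
        return math.exp(-((measurement - expected) ** 2) / (2 * noise ** 2))
    weights = [likelihood(p) for p in moved]
    # Resampling: draw particles with probability proportional to weight.
    return random.choices(moved, weights=weights, k=len(moved))

random.seed(0)
particles = [random.uniform(0.0, 5.0) for _ in range(500)]  # initial belief
landmark = 10.0   # known bulkhead feature position [m]
true_pos = 2.0
for _ in range(20):
    true_pos += 0.1                       # robot crawls forward 0.1 m/step
    meas = landmark - true_pos            # range to the landmark
    particles = particle_filter_step(particles, 0.1, meas, landmark)
estimate = sum(particles) / len(particles)
```

After 20 steps the true position is 4.0 m, and the particle mean should have converged close to it; the same weight-resample loop generalizes to 2-D/3-D poses with richer sensor models.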
Coupling Vision and Proprioception for Navigation of Legged Robots
We exploit the complementary strengths of vision and proprioception to
develop a point-goal navigation system for legged robots, called VP-Nav. Legged
systems are capable of traversing more complex terrain than wheeled robots, but
to fully utilize this capability, we need a high-level path planner in the
navigation system to be aware of the walking capabilities of the low-level
locomotion policy in varying environments. We achieve this by using
proprioceptive feedback to ensure the safety of the planned path by sensing
unexpected obstacles like glass walls, terrain properties like slipperiness or
softness of the ground and robot properties like extra payload that are likely
missed by vision. The navigation system uses onboard cameras to generate an
occupancy map and a corresponding cost map to reach the goal. A fast marching
planner then generates a target path. A velocity command generator takes this
as input to generate the desired velocity for the walking policy. A safety
advisor module adds sensed unexpected obstacles to the occupancy map and
environment-determined speed limits to the velocity command generator. We show
superior performance compared to wheeled robot baselines and to ablations
with disjoint high-level planning and low-level control. We also show the
real-world deployment of VP-Nav on a quadruped robot with onboard sensors and
computation. Videos at https://navigation-locomotion.github.io
Comment: CVPR 2022 final version. Website at
https://navigation-locomotion.github.io
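The pipeline described above (occupancy map, cost map, fast marching planner, target path) can be sketched with a simple stand-in. The toy below uses a Dijkstra grid search, which, like fast marching, expands a cost-to-go front outward from the start; the grid, start, and goal are invented values, not VP-Nav's actual maps or code.

```python
import heapq

def plan_path(cost, start, goal):
    """Dijkstra over a 2-D cost map; occupied cells carry cost float('inf')."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    parent = {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float('inf')):
            continue  # stale queue entry
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and cost[nr][nc] != float('inf'):
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    parent[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    # Walk parents back from the goal to recover the path.
    path, cell = [goal], goal
    while cell != start:
        cell = parent[cell]
        path.append(cell)
    return path[::-1]

INF = float('inf')
grid = [
    [1, 1, 1, 1],
    [1, INF, INF, 1],   # a sensed obstacle (e.g. a glass wall) marked occupied
    [1, 1, 1, 1],
]
path = plan_path(grid, (0, 0), (2, 3))
```

In the paper's architecture, the safety advisor would write proprioceptively sensed obstacles into `grid` before replanning, so the planner routes around hazards that vision missed.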
Reimagining Robotic Walkers For Real-World Outdoor Play Environments With Insights From Legged Robots: A Scoping Review
PURPOSE
For children with mobility impairments, without cognitive delays, who want to participate in outdoor activities, existing assistive technology (AT) to support their needs is limited. In this review, we investigate the control and design of a selection of robotic walkers, while also exploring a selection of legged robots, to develop solutions that address this gap in robotic AT.
METHOD
We performed a comprehensive literature search across four main databases: PubMed, Google Scholar, Scopus, and IEEE Xplore. The keywords used in the search were: “walker”, “rollator”, “smart walker”, “robotic walker”, and “robotic rollator”. To be considered, studies were required to discuss the control or design of robotic walkers. A total of 159 papers were analyzed.
RESULTS
Of the 159 papers, 127 were excluded because they failed to meet our inclusion criteria. Since the analyzed publications included several that utilized the same device, we grouped the remaining 32 studies by the type of robotic walker used. This paper reviewed 15 different types of robotic walkers.
CONCLUSIONS
The ability of many-legged robots to negotiate and transition between a range of unstructured substrates suggests several avenues of future consideration whose pursuit could benefit robotic AT, particularly regarding the present limitations of wheeled paediatric robotic walkers for children’s daily outdoor use.
For more information: Kod*lab (kodlab.seas.upenn.edu)
Control Implementation of Dynamic Locomotion on Compliant, Underactuated, Force-Controlled Legged Robots with Non-Anthropomorphic Design
The control of locomotion on legged robots traditionally involves a robot that takes a standard legged form, such as the anthropomorphic humanoid, the dog-like quadruped, or the bird-like biped. Additionally, these systems are often actuated by position-controlled servos or series-elastic actuators connected through rigid links. This work investigates the control implementation of dynamic, force-controlled locomotion on a family of legged systems that deviate significantly from these classic paradigms by incorporating modern, state-of-the-art proprioceptive actuators on uniquely configured compliant legs that do not closely resemble those found in nature. The results can be used to better inform how to implement controllers on legged systems without stiff, position-controlled actuators, and also provide insight into how intelligently designed mechanical features can simplify the control of complex, nonlinear dynamical systems like legged robots. To this end, this work presents the approach to control for a family of non-anthropomorphic bipedal robotic systems developed both in simulation and on physical hardware. The first, the Non-Anthropomorphic Biped, Version 1 (NABi-1), features position-controlled joints along with a compliant foot element on a minimally actuated leg, and is controlled using simple open-loop trajectories based on the Zero Moment Point. The second, NABi-2, utilizes the proprioceptive Back-drivable Electromagnetic Actuator for Robotics (BEAR) modules for actuation and fully realizes feedback-based, force-controlled locomotion. These systems highlight both the strengths and the weaknesses of proprioceptive actuation, and suggest the tradeoffs made when using force control for dynamic locomotion.
These systems also present case studies for different approaches to system design for bipedal legged robots.
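The Zero Moment Point criterion underlying NABi-1's open-loop trajectories can be sketched with the standard cart-table model (a textbook formula, not code from this work): the ZMP is the CoM ground projection shifted opposite to the horizontal CoM acceleration.

```python
def zmp_x(x_com, x_com_accel, z_com, g=9.81):
    """Cart-table ZMP along one axis: x_zmp = x_com - (z_com / g) * x''_com.

    x_com        CoM horizontal position [m]
    x_com_accel  CoM horizontal acceleration [m/s^2]
    z_com        CoM height above the ground [m]
    """
    return x_com - (z_com / g) * x_com_accel

# Static stance: with zero CoM acceleration the ZMP coincides with the
# CoM ground projection.
static = zmp_x(0.05, 0.0, 0.8)
# Accelerating the CoM forward shifts the ZMP backward; trajectories are
# planned so this point stays inside the support polygon.
dynamic = zmp_x(0.05, 1.0, 0.8)
```

A trajectory generator evaluates this constraint over the planned CoM motion and rejects (or reshapes) motions whose ZMP leaves the foot's support area.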
An Intelligent Architecture for Legged Robot Terrain Classification Using Proprioceptive and Exteroceptive Data
In this thesis, we introduce a novel architecture called Intelligent Architecture for Legged Robot Terrain Classification Using Proprioceptive and Exteroceptive Data (iARTEC). The proposed architecture integrates different terrain characterization and classification methods with other robotic system components. Within iARTEC, we consider the problem of having a legged robot autonomously learn to identify different terrains. Robust terrain identification can enhance the capabilities of legged robot systems, both in terms of locomotion and navigation. For example, a robot that has learned to differentiate sand from gravel can autonomously modify its path (or even select a different one) in favor of traversing better terrain; the same knowledge of terrain type can also be used to guide a robot to avoid specific terrains. To tackle this problem, we developed four approaches for terrain characterization, classification, path planning, and control for a mobile legged robot. We developed a particle-system-inspired approach to estimate the robot's foot-ground contact interaction forces. The approach is derived from the well-known Bekker theory, estimates the contact forces based on its point-contact model concepts, and realistically models real-time 3-dimensional contact behaviors between rigid bodies and the soil. For a real-time capable implementation, the approach is reformulated to use a lookup table generated from simple contact experiments of the robot foot with the terrain. We also introduce a short-range terrain classifier using the robot's embodied data. The classifier is based on a supervised machine learning approach that optimizes the classifier parameters and trains it using proprioceptive sensor measurements. The learning framework preprocesses sensor data through channel reduction and filtering, such that the classifier is trained on feature vectors that are closely associated with the terrain class.
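The lookup-table reformulation of the contact model amounts to interpolating precomputed forces at runtime. The sketch below is a hypothetical example, not the thesis's table: a small grid indexed by foot sinkage and normal velocity is queried with bilinear interpolation, with out-of-range inputs clamped to the table edges.

```python
from bisect import bisect_right

# Hypothetical table from foot-soil contact experiments:
# rows indexed by sinkage [m], columns by normal velocity [m/s],
# entries are contact forces [N] (all values invented).
sinkage_axis = [0.00, 0.01, 0.02, 0.03]
velocity_axis = [0.0, 0.1, 0.2]
force_table = [
    [0.0,  2.0,  4.0],
    [5.0,  8.0, 12.0],
    [12.0, 16.0, 22.0],
    [20.0, 26.0, 34.0],
]

def interp_axis(axis, v):
    # Clamp v into the axis range; return lower index and fractional offset.
    if v <= axis[0]:
        return 0, 0.0
    if v >= axis[-1]:
        return len(axis) - 2, 1.0
    i = bisect_right(axis, v) - 1
    return i, (v - axis[i]) / (axis[i + 1] - axis[i])

def contact_force(sinkage, velocity):
    """Bilinear interpolation in the precomputed force table."""
    i, fi = interp_axis(sinkage_axis, sinkage)
    j, fj = interp_axis(velocity_axis, velocity)
    f00, f01 = force_table[i][j], force_table[i][j + 1]
    f10, f11 = force_table[i + 1][j], force_table[i + 1][j + 1]
    top = f00 * (1 - fj) + f01 * fj
    bot = f10 * (1 - fj) + f11 * fj
    return top * (1 - fi) + bot * fi
```

A table query costs two binary searches and a handful of multiplies, which is what makes the per-foot force estimate feasible at control rates.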
For long-range terrain type prediction using the robot's exteroceptive data, we present an online visual terrain classification system. It uses only a monocular camera with a feature-based terrain classification algorithm that is robust to changes in illumination and viewpoint. For this algorithm, we extract local terrain features using Speeded-Up Robust Features (SURF), encode them using the Bag of Words (BoW) technique, and then classify the resulting word histograms using Support Vector Machines (SVMs). In addition, we describe a terrain-dependent navigation and path planning approach that is based on the E* planner and employs a proposed metric specifying the navigation costs associated with terrain types. The generated path naturally avoids obstacles and favors terrains with lower values of the metric. At the low level, a proportional input-scaling controller is designed and implemented to autonomously steer the robot along the desired path in a stable manner. The performance of iARTEC was tested and validated experimentally using several different sensing modalities (proprioceptive and exteroceptive) on the six-legged robotic platform CREX. The results show that the proposed architecture, integrating the aforementioned approaches with the robotic system, allowed the robot to learn both robot-terrain interaction and remote terrain perception models, as well as the relations linking those models. This learning is performed using the robot's own embodied data. Based on the knowledge available, the approach uses the detected remote terrain classes to predict the most probable navigation behavior, and with the assigned metric, the performance of the robot on a given terrain is predicted. This allows the learned models to influence the robot's navigation.
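The SURF-to-BoW stage of the visual pipeline can be sketched in a few lines. This toy substitutes 2-D made-up "descriptors" for 64-D SURF vectors and a nearest-centroid rule for the SVM stage (both substitutions are mine, for brevity); the quantize-and-histogram logic is the BoW idea itself.

```python
import math

def quantize(descriptor, codebook):
    # Assign a descriptor to its nearest visual word (Euclidean distance).
    return min(range(len(codebook)),
               key=lambda k: math.dist(descriptor, codebook[k]))

def bow_histogram(descriptors, codebook):
    # Normalized histogram of visual-word occurrences for one image patch.
    hist = [0.0] * len(codebook)
    for d in descriptors:
        hist[quantize(d, codebook)] += 1.0
    n = sum(hist) or 1.0
    return [h / n for h in hist]

def classify(hist, class_histograms):
    # Nearest-centroid stand-in for the SVM stage: pick the class whose
    # mean training histogram is closest to the query histogram.
    return min(class_histograms,
               key=lambda c: math.dist(hist, class_histograms[c]))

# Toy 2-D "descriptors" standing in for SURF vectors (invented values).
codebook = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
training = {
    "gravel": [(0.1, 0.0), (0.9, 1.1), (0.05, 0.1)],
    "sand":   [(0.1, 0.9), (0.0, 1.1), (0.2, 0.8)],
}
class_histograms = {c: bow_histogram(d, codebook) for c, d in training.items()}
query = bow_histogram([(0.0, 1.0), (0.1, 0.95)], codebook)
label = classify(query, class_histograms)
```

In a real system the codebook would be learned by clustering (e.g. k-means) over training descriptors, and the histogram would feed a trained SVM rather than a centroid comparison.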
Finally, we believe that iARTEC and the methods proposed in this thesis can likely also be implemented on other robot types (such as wheeled robots), although we did not test this option in our work.