6 research outputs found

    Sewer Robotics


    Simultaneous localization and mapping for inspection robots in water and sewer pipe networks: a review

    At present, water and sewer pipe networks are predominantly inspected manually. In the near future, smart cities will perform intelligent autonomous monitoring of buried pipe networks using teams of small robots. These robots, equipped with all necessary computational facilities and sensors (optical, acoustic, inertial, thermal, pressure and others), will be able to inspect pipes whilst navigating, self-localising and communicating information about the pipe condition and faults such as leaks or blockages to human operators for monitoring and decision support. The predominantly manual inspection of pipe networks will be replaced by teams of autonomous inspection robots that can operate for long periods of time over a large spatial scale. Reliable autonomous navigation and reporting of faults at this scale requires effective localization and mapping, which is the estimation of the robot's position and its surrounding environment. This survey presents an overview of state-of-the-art work on robot simultaneous localization and mapping (SLAM) with a focus on water and sewer pipe networks. It considers various aspects of the SLAM problem in pipes, from the motivation and water industry requirements to modern SLAM methods, map types and sensors suited to pipes. Future challenges such as robustness for long-term robot operation in pipes are discussed, including how prior knowledge, e.g. geographic information systems (GIS), can be used to build map estimates and improve multi-robot SLAM in the pipe environment.
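    As an illustration of the estimation problem the survey addresses (not code from any of the reviewed works), the sketch below runs a one-dimensional EKF-SLAM loop along a pipe, where a coarse GIS prior seeds a landmark (manhole) position and the robot and landmark estimates are refined jointly. The function name, noise values and measurement model are assumptions made for this example.

```python
# Minimal 1-D EKF-SLAM sketch along a pipe (illustrative only; not from the survey).
# State: robot position along the pipe plus the positions of a few landmarks
# (e.g. manholes) whose approximate locations come from a GIS prior map.
import numpy as np

def ekf_slam_1d(odometry, detections, gis_prior, odo_var=0.05, meas_var=0.01, prior_var=1.0):
    """odometry: list of distance increments; detections: dict step -> (landmark_id, range)."""
    n = len(gis_prior)
    x = np.concatenate(([0.0], gis_prior))          # [robot, landmark_1 .. landmark_n]
    P = np.diag([0.0] + [prior_var] * n)            # GIS prior is uncertain, so large variance
    for k, u in enumerate(odometry):
        # Predict: only the robot moves, landmarks are static.
        x[0] += u
        P[0, 0] += odo_var
        if k in detections:
            j, z = detections[k]                     # range to landmark j ahead of the robot
            H = np.zeros(n + 1); H[0] = -1.0; H[1 + j] = 1.0
            S = H @ P @ H + meas_var                 # innovation variance
            K = (P @ H) / S                          # Kalman gain
            x += K * (z - (x[1 + j] - x[0]))         # correct robot AND landmark estimates
            P = P - np.outer(K, H @ P)
    return x, P

# Example: robot drives 10 m in 1 m steps, sees a manhole (GIS says ~9 m) at ranges 5.2 m and 1.1 m.
x, P = ekf_slam_1d([1.0] * 10, {4: (0, 5.2), 8: (0, 1.1)}, gis_prior=[9.0])
print(x)   # jointly refined robot and manhole positions
```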

    A robust method for approximate visual robot localization in feature-sparse sewer pipes

    Buried sewer pipe networks present many challenges for robot localization systems, which require non-standard solutions due to the unique nature of these environments: they cannot receive signals from global positioning systems (GPS) and can also lack the visual features necessary for standard visual odometry algorithms. In this paper, we exploit the fact that pipe joints are equally spaced and develop a robot localization method based on pipe joint detection that operates in one degree of freedom along the pipe length. Pipe joints are detected in visual images from an on-board forward-facing (electro-optical) camera using a bag-of-keypoints visual categorization algorithm, which is trained offline by unsupervised learning from images of sewer pipe joints. We augment the pipe joint detection algorithm with drift correction using vision-based manhole recognition. We evaluated the approach using real-world data recorded from three sewer pipes (of lengths 30, 50 and 90 m) and benchmarked against a standard method for visual odometry (ORB-SLAM3). The results demonstrated that our proposed method operates more robustly and accurately in these feature-sparse pipes: ORB-SLAM3 completely failed on one tested pipe due to a lack of visual features and gave a mean absolute localization error of approximately 12%–20% on the other pipes (regularly losing track of features and having to re-initialize multiple times), whilst our method worked successfully on all tested pipes and gave a mean absolute localization error of approximately 2%–4%. In summary, our results highlight an important trade-off between modern visual odometry algorithms, which offer potentially high precision and estimate the full six degree-of-freedom pose but can be fragile in feature-sparse pipes, and simpler, approximate localization methods operating in one degree of freedom along the pipe length, which are more robust and can lead to substantial improvements in accuracy.
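    A minimal sketch of the one degree-of-freedom idea described above, under stated assumptions: the trained bag-of-keypoints classifier is abstracted away as per-frame detection labels, the joint spacing and surveyed manhole positions are taken as known, and all class and variable names are hypothetical rather than the authors' implementation.

```python
# Rough sketch of 1-DoF localization from pipe-joint counting with manhole drift correction.
JOINT_SPACING_M = 1.0                             # assumed nominal spacing between pipe joints

class PipeJointLocalizer:
    def __init__(self, joint_spacing=JOINT_SPACING_M, manhole_positions=()):
        self.spacing = joint_spacing
        self.manholes = list(manhole_positions)   # known distances along the pipe (e.g. from records)
        self.joints_seen = 0
        self.manholes_seen = 0

    def update(self, detection):
        """detection: label from the (hypothetical) visual classifier for one frame."""
        if detection == "joint":
            self.joints_seen += 1                 # advance by one joint spacing
        elif detection == "manhole" and self.manholes_seen < len(self.manholes):
            # Drift correction: snap the estimate to the surveyed manhole position.
            self.joints_seen = round(self.manholes[self.manholes_seen] / self.spacing)
            self.manholes_seen += 1
        return self.position()

    def position(self):
        return self.joints_seen * self.spacing    # 1-DoF estimate along the pipe axis

# Example: three joints are counted, then a manhole surveyed at 5 m corrects accumulated miscounts.
loc = PipeJointLocalizer(manhole_positions=[5.0])
for d in ["joint", None, "joint", "joint", "manhole", "joint"]:
    print(loc.update(d))
```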

    A Robot to Measure Water Parameters in Water Distribution Systems

    Water distribution systems (WDS) are critical infrastructures that transfer drinking water to consumers. In the U.S., around 42 billion gallons of water are delivered per day via one million miles of pipes to be used in different sectors. Incidents in pipelines cause leaks or let contaminants enter the purified water in the pipe, which is harmful to public health. Hence, periodic condition assessment of pipelines and the water inside them is required. However, due to the long and complicated configurations of these networks, access to all parts of the pipelines is a cumbersome task. To this end, in-pipe robots are a promising solution that facilitates access to different locations inside pipelines and can perform different in-pipe missions. In this project, we design and fabricate an in-pipe robotic system that is used for water quality monitoring. The robot is equipped with a wireless sensor module, and the sensor module is synchronized with the motion unit of the robot. The wireless sensor module facilitates bi-directional data transmission between the robot and a base station aboveground. The integrated robotic system navigates different configurations of the pipeline with smart motion. To this end, the mechanical design of the self-powered robot is based on three adjustable arm modules and three actuator modules. The components of the robot are characterized based on real operating conditions in pipes. A multi-phase motion control algorithm is developed for the robot to move in straight paths and non-straight configurations such as bends and T-junctions. A bi-directional wireless sensor module is designed to send data packets through the underground environment. Finally, the multi-phase motion controller is synchronized with the wireless sensor module, and we propose an operation procedure for the robot. In the operation procedure, radio transceivers are located at non-straight configurations of the pipeline; they receive the sensor measurements from the robot and guide the robot in the desired direction. The proposed operation procedure provides smart navigation and data transmission during operation for the robot.
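    The sketch below illustrates, in rough outline only, how a multi-phase motion controller might be synchronized with junction-mounted transceivers and the wireless sensor module. The phase names, beacon messages and actuator commands are placeholders invented for this example and do not come from the paper.

```python
# Illustrative multi-phase in-pipe operation loop: the robot drives straight by default
# and switches motion phase when a junction-mounted transceiver sends a guidance message.
from enum import Enum, auto

class Phase(Enum):
    STRAIGHT = auto()
    BEND = auto()
    T_JUNCTION = auto()

def operation_step(phase, beacon_msg, water_sample):
    """One control/telemetry cycle: pick the motion phase, then report sensor data."""
    # Phase selection driven by guidance received from the nearest transceiver.
    if beacon_msg == "bend_ahead":
        phase = Phase.BEND
    elif beacon_msg in ("t_junction_left", "t_junction_right"):
        phase = Phase.T_JUNCTION
    elif beacon_msg == "clear":
        phase = Phase.STRAIGHT

    # Motion command per phase (placeholders for the real arm/actuator commands).
    if phase is Phase.STRAIGHT:
        command = {"arms": "hold", "wheels": "equal_speed"}
    elif phase is Phase.BEND:
        command = {"arms": "adjust_radius", "wheels": "differential"}
    else:  # T-junction: steer toward the branch indicated by the beacon
        command = {"arms": "retract_one_side", "wheels": "turn"}

    # Sensor packet relayed to the base station via the wireless module.
    packet = {"phase": phase.name, "turbidity": water_sample.get("turbidity")}
    return phase, command, packet

phase = Phase.STRAIGHT
phase, cmd, pkt = operation_step(phase, "bend_ahead", {"turbidity": 1.3})
print(phase, cmd, pkt)
```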

    Vision-Based Control of Unmanned Aerial Vehicles for Automated Structural Monitoring and Geo-Structural Analysis of Civil Infrastructure Systems

    The emergence of wireless sensors capable of sensing, embedded computing, and wireless communication has provided an affordable means of monitoring large-scale civil infrastructure systems with ease. To date, the majority of existing monitoring systems, including those based on wireless sensors, are stationary, with measurement nodes installed without an intention for later relocation. Many monitoring applications involving structural and geotechnical systems require a high density of sensors to provide sufficient spatial resolution for their assessment of system performance. While wireless sensors have made high-density monitoring systems possible, an alternative approach would be to empower the mobility of the sensors themselves to transform wireless sensor networks (WSNs) into mobile sensor networks (MSNs). In doing so, many benefits would be derived, including reducing the total number of sensors needed while introducing the ability to learn from the data obtained to improve the placement of installed sensors. One approach to achieving MSNs is to integrate the use of unmanned aerial vehicles (UAVs) into the monitoring application. UAV-based MSNs have the potential to transform current monitoring practices by improving the speed and quality of data collected while reducing overall system costs. The efforts of this study have been chiefly focused on using autonomous UAVs to deploy, operate, and reconfigure MSNs in a fully autonomous manner for field monitoring of civil infrastructure systems. This study aims to overcome two main challenges pertaining to UAV-enabled wireless monitoring: the need for high-precision localization methods for outdoor UAV navigation and facilitating modes of direct interaction between UAVs and their built or natural environments. A vision-aided UAV positioning algorithm is first introduced to augment traditional inertial sensing techniques, enhancing the ability of UAVs to accurately localize themselves in a civil infrastructure system for placement of wireless sensors. Multi-resolution fiducial markers indicating sensor placement locations are applied to the surface of a structure, serving as navigation guides and precision landing targets for a UAV carrying a wireless sensor. Visual-inertial fusion is implemented via a discrete-time Kalman filter to further increase the robustness of the relative position estimation algorithm, resulting in localization accuracies of 10 cm or better. The precision landing of UAVs that enables MSN topology changes is validated on a simple beam, with the UAV-based MSN collecting ambient response data for extraction of the global mode shapes of the structure. The work also explores the integration of a magnetic gripper with a UAV to drop defined weights from an elevation to provide a high-energy seismic source for MSNs engaged in seismic monitoring applications. Leveraging tailored visual detection and precise position control techniques for UAVs, the work illustrates the ability of UAVs to deploy wireless geophones and introduce an impulsive seismic source, in a repeated and autonomous fashion, for in situ shear wave velocity profiling using the spectral analysis of surface waves (SASW) method. The dispersion curve of the shear wave profile of the geotechnical system is shown to be nearly identical between the autonomous UAV-based MSN architecture and that obtained by a traditional wired and manually operated SASW data collection system. The developments and proof-of-concept systems advanced in this study will extend the body of knowledge on robot-deployed MSNs, with the hope of extending the capabilities of monitoring systems while eradicating the need for human intervention in their design and use.
    PhD thesis, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/169980/1/zhh_1.pd
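    To make the visual-inertial fusion step concrete, the following is a minimal one-dimensional sketch of a discrete-time Kalman filter that predicts with inertial measurements and corrects with position fixes from a fiducial marker. The constant-velocity model, noise values and function name are assumptions made for this example; the thesis system operates across multiple axes with multi-resolution markers.

```python
# Minimal 1-D discrete-time Kalman filter: inertial prediction + marker-based position correction.
import numpy as np

def kf_fuse(accel_meas, marker_fixes, dt=0.02, accel_var=0.5, marker_var=0.01):
    """accel_meas: IMU acceleration per step; marker_fixes: dict step -> position from a marker."""
    x = np.zeros(2)                               # state: [position, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])         # constant-velocity state transition
    B = np.array([0.5 * dt**2, dt])               # acceleration input mapping
    Q = accel_var * np.outer(B, B)                # process noise driven by IMU noise
    H = np.array([[1.0, 0.0]])                    # the marker observes position only
    R = np.array([[marker_var]])
    for k, a in enumerate(accel_meas):
        x = F @ x + B * a                         # predict with the inertial measurement
        P = F @ P @ F.T + Q
        if k in marker_fixes:                     # correct when a marker is in view
            y = marker_fixes[k] - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: constant 0.1 m/s^2 acceleration, with marker fixes arriving at steps 50 and 100.
x, P = kf_fuse([0.1] * 120, {50: np.array([0.03]), 100: np.array([0.11])})
print(x)    # fused position/velocity estimate near the landing marker
```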