848 research outputs found

    Autonomous control of underground mining vehicles using reactive navigation

    Describes how many of the navigation techniques developed by the robotics research community over the last decade may be applied to a class of underground mining vehicles (LHDs and haul trucks). We review the current state of the art in this area and conclude that there are essentially two basic methods of navigation applicable. We describe an implementation of a reactive navigation system on a 30-tonne LHD which has achieved full-speed operation at a production mine.
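A reactive navigation scheme of the kind described above can be illustrated with a minimal steering rule that reacts directly to range readings rather than to a map. This is only a hedged sketch of the general technique, not the paper's system; the function name, the gain value and the two-beam range model are illustrative assumptions.

```python
def reactive_steer(left_range, right_range, gain=0.5, max_steer=1.0):
    """Return a steering command in [-max_steer, max_steer].

    Positive steers right. The vehicle is nudged toward whichever
    side has more clearance, centring it in a drift or tunnel.
    """
    error = right_range - left_range      # > 0: more room on the right
    command = gain * error                # proportional reaction, no map
    return max(-max_steer, min(max_steer, command))
```

Because the command depends only on the current sensor reading, such a controller needs no global localisation, which is one reason reactive methods suit confined underground drifts.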

    An overview of robotics and autonomous systems for harsh environments

    Across a wide range of industries and applications, robotics and autonomous systems can fulfil crucial and challenging tasks such as inspection, exploration, monitoring, drilling, sampling and mapping in areas of scientific discovery, disaster prevention, human rescue and infrastructure management. However, in many situations the associated environment is either too dangerous for or inaccessible to humans, so a wide range of robots have been developed and deployed to replace or aid humans in these activities. A look at these harsh-environment applications of robotics demonstrates the diversity of technologies developed. This paper reviews some key application areas of robotics that involve interactions with harsh environments (such as search and rescue, space exploration, and deep-sea operations), gives an overview of the developed technologies and discusses the key trends and future directions common to many of these areas.

    Viewfinder: final activity report

    The VIEW-FINDER project (2006-2009) is an 'Advanced Robotics' project that seeks to apply a semi-autonomous robotic system to inspect ground safety in the event of a fire. Its primary aim is to gather visual and chemical data in order to assist rescue personnel. A base station combines the gathered information with information retrieved from off-site sources. The project addresses key issues related to map building and reconstruction, interfacing local command information with external sources, human-robot interfaces and semi-autonomous robot navigation. The VIEW-FINDER system is semi-autonomous: the individual robot-sensors operate autonomously within the limits of the task assigned to them; that is, they autonomously navigate through and inspect an area. Human operators monitor their operations and send high-level task requests as well as low-level commands through the interface to any node in the system. The human interface must provide the human supervisor and human interveners with a reduced but relevant overview of the ground, the robots and the human rescue workers therein.

    Collaborative Control: A Robot-Centric Model for Vehicle Teleoperation


    Design of a walking robot

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot, a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this, the most important of which are the high cost of placing a man in space, the high risk associated with human exploration, and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the APEX project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine, was developed by CMU to demonstrate the technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and the use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

    SAFER: Search and Find Emergency Rover

    When disaster strikes and causes a structure to collapse, it poses a unique challenge to search and rescue teams as they assess the situation and search for survivors. Currently there are very few tools these teams can use to gather important information about the situation while staying at a safe distance. SAFER, the Search and Find Emergency Rover, is an unmanned, remotely operated vehicle that can provide early reconnaissance to search and rescue teams so that they have more information to prepare for the dangers that lie inside the wreckage. Over the past year, this team has restored a bare, non-operational chassis inherited from Roverwerx 2012 into a rugged and operational rover with increased functionality and reliability. SAFER uses a 360-degree camera to deliver real-time visual reconnaissance to the operator, who can remain safely stationed on the outskirts of the disaster. With strong drive motors providing enough torque to traverse steep obstacles and enough power to travel at up to 3 ft/s, SAFER can cover ground quickly and effectively over its 1-3 hour battery life, maximizing reconnaissance for the team. Additionally, SAFER carries 3 flashing beacons that the operator can drop when a victim is found, so that team members entering the scene can easily locate victims. In the future, other teams may wish to improve upon this iteration by adding thermal imaging, air-quality sensors, and potentially a robotic arm with a camera that can see into spaces too small for the entire rover to enter.

    Improving the mobility performance of autonomous unmanned ground vehicles by adding the ability to 'Sense/Feel' their local environment.

    This paper follows on from earlier work detailed in output one and critically reviews the sensor technologies used in autonomous vehicles, including robots, to ascertain the physical properties of the environment, including terrain sensing. The paper reports on a comprehensive study of terrain types, how these can be determined, and the appropriate sensor technologies that can be used. It also reports on work currently in progress in applying these sensor technologies and gives details of a prototype system built at Middlesex University on a reconfigurable mobility system, demonstrating the success of the proposed strategies. This full paper was subject to a blind refereed review process and was presented at the 12th HCI International 2007 in Beijing, China, which incorporated 8 other international thematic conferences, involved over 250 parallel sessions and was attended by 2000 delegates. The paper appears in the proceedings of the Second International Conference on Virtual Reality, ICVR 2007, held as part of HCI International 2007, Beijing, China, July 22-27, 2007, published by Springer in the Lecture Notes in Computer Science (LNCS) series as part of a 17-volume paperback edition; the proceedings are available online through the LNCS Digital Library, readily accessible by all subscribing libraries around the world.

    An improved robot for bridge inspection

    This paper presents a significant improvement over the previous submission by the same authors at ISARC 2016. The robot is now equipped with low-cost cameras and a 2D laser scanner, which is used to monitor and survey a bridge bearing. The robot is capable of localising by combining data from a pre-surveyed 3D model of the space with real-time data collected in situ. Autonomous navigation is also performed using the 2D laser scanner in a mapped environment. The Robot Operating System (ROS) framework is used to integrate data collection and communication for navigation.
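Localising a live laser scan against a pre-surveyed model, as described above, is commonly done with iterative closest point (ICP) alignment. The sketch below is a hedged illustration of one translation-only ICP step in 2D, not the authors' implementation; the function name and the point-list representation are assumptions for the example.

```python
def icp_translation_step(scan_pts, map_pts):
    """One translation-only ICP iteration.

    For each laser point, find its nearest neighbour among the
    pre-surveyed map points and return the mean (dx, dy) offset
    that would move the scan onto the map.
    """
    dx = dy = 0.0
    for sx, sy in scan_pts:
        # Brute-force nearest neighbour; real systems use a k-d tree.
        nx, ny = min(map_pts, key=lambda m: (m[0] - sx) ** 2 + (m[1] - sy) ** 2)
        dx += nx - sx
        dy += ny - sy
    n = len(scan_pts)
    return dx / n, dy / n
```

Iterating this step (and, in full ICP, also solving for rotation) converges the robot's pose estimate onto the prior model, which is the essence of localisation against a pre-surveyed map.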