
    A haptic base human robot interaction approach for robotic grit blasting

    This paper proposes a remote operation method for a robot arm in a complex environment using a Virtual Force (VF) based approach. A virtual robot arm is manipulated by a steering force at the end-effector, generated according to the movement of a haptic feedback device. A three-dimensional force field (3D-F2) is employed for collision detection and avoidance. Repulsive forces from the 3D-F2 are produced and fed back to the haptic device, giving the operator a sense of touch on encountered obstacles so that the arm can be steered to avoid them. The resulting collision-free poses of the virtual robot arm are then used to command the real robot. Experiments were conducted in a mock-up bridge environment in which the real robot arm was steered to target points by the operator. Experimental results show successful collision avoidance and close correspondence between the actual command force and the virtual forces during remote operation.
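
    As a rough illustration of the kind of repulsive-force computation such a 3D force field implies, the sketch below sums potential-field contributions from sampled obstacle points around the end-effector. The influence radius, gain and obstacle sampling are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def repulsive_force(effector_pos, obstacle_points, influence_radius=0.3, gain=1.0):
    """Sum repulsive contributions from obstacle points within an influence radius.

    effector_pos:    (3,) end-effector position, in metres.
    obstacle_points: (N, 3) sampled points on nearby obstacle surfaces.
    Returns a (3,) force vector pushing the end-effector away from close obstacles.
    """
    force = np.zeros(3)
    for p in np.atleast_2d(obstacle_points):
        diff = effector_pos - p
        dist = np.linalg.norm(diff)
        if 1e-6 < dist < influence_radius:
            # Classic potential-field form: magnitude grows sharply near the obstacle.
            magnitude = gain * (1.0 / dist - 1.0 / influence_radius) / dist ** 2
            force += magnitude * diff / dist
    return force

# Example: an obstacle just ahead of the end-effector produces a push back along -z.
print(repulsive_force(np.array([0.0, 0.0, 0.5]), np.array([[0.0, 0.0, 0.6]])))
```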

    Study of the urban evolution of Brasilia with the use of LANDSAT data

    The urban growth of Brasilia over the last ten years is analyzed, with special emphasis on the use of orbital remote sensing data and automatic image processing. The urban spatial structure and the monitoring of its temporal changes were examined in a holistic and dynamic way using MSS-LANDSAT images from June 1973, 1978 and 1983. To aid data interpretation, a registration algorithm implemented on the Interactive Multispectral Image Analysis System (IMAGE-100) was used to overlay the multitemporal images. Suitable digital filters, combined with the image overlay, allowed rapid identification of areas of possible urban growth and guided the field work. The results permitted an evaluation of the urban growth of Brasilia, taking as reference the plan proposed for the construction of the city.
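
    The abstract describes overlaying multitemporal images and filtering their differences. A minimal sketch of that idea, assuming two already co-scaled grayscale arrays, is a translation-only registration via phase correlation followed by thresholded differencing; the IMAGE-100 registration algorithm itself is not reproduced here.

```python
import numpy as np

def register_translation(ref, img):
    """Estimate the integer (row, col) shift that aligns img onto ref via phase correlation."""
    cross_power = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(cross_power / (np.abs(cross_power) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative offsets.
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, ref.shape))

def change_map(ref, img, threshold=30):
    """Shift img onto ref and flag pixels whose brightness difference exceeds a threshold."""
    dr, dc = register_translation(ref, img)
    aligned = np.roll(np.roll(img, dr, axis=0), dc, axis=1)
    return np.abs(ref.astype(int) - aligned.astype(int)) > threshold

# Hypothetical usage with two grayscale band arrays (e.g. a 1973 and a 1983 scene):
# growth_mask = change_map(band_1973, band_1983)
```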

    Remotely operated telepresent robotics

    Remotely operated robots with the ability to perform specific tasks are often used in hazardous environments in place of humans to prevent injury or death. Modern remotely operated robots suffer from limitations in accuracy, primarily due to the lack of depth perception and unintuitive hardware controls. This research project suggests an alternative method of vision and control to increase a user's operational performance with remotely controlled robots. The Oculus Rift Development Kit 2.0 is a low-cost device originally developed for the electronic entertainment industry which allows users to experience virtual reality through a head-mounted display. This technology can be adapted to different uses and is primarily utilised here to achieve real-world stereoscopic 3D vision for the user. Additionally, a wearable controller was trialled with the goal of allowing a robotic arm to mimic the position of the user's arm via a master/slave setup. By incorporating the stated vision and control methods, possible improvements in accuracy and speed for users were investigated through experimentation and a conducted study. Results indicated that using the Oculus Rift for stereoscopic vision improved the user's ability to judge distances remotely but was detrimental to the user's ability to operate the robot. The research was conducted under the supervision of the University of Southern Queensland (USQ) and provides useful information for the area of remotely operated telepresent robotics.
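
    A minimal sketch of the master/slave idea described above, assuming the wearable controller reports operator joint angles that are scaled and clamped before being sent to the robot arm. The joint names, limits and scale factors are invented for illustration and are not taken from the project.

```python
# Joint names, limits (degrees) and scale factors are illustrative only.
JOINT_LIMITS = {"shoulder_pitch": (-45.0, 120.0), "elbow": (0.0, 135.0)}
SCALE = {"shoulder_pitch": 1.0, "elbow": 0.8}

def slave_command(master_angles):
    """Map operator joint angles (degrees) to clamped robot joint set-points."""
    command = {}
    for joint, angle in master_angles.items():
        lo, hi = JOINT_LIMITS[joint]
        command[joint] = min(max(angle * SCALE[joint], lo), hi)
    return command

print(slave_command({"shoulder_pitch": 150.0, "elbow": 40.0}))
# {'shoulder_pitch': 120.0, 'elbow': 32.0}
```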

    Development of operator interfaces for a heavy maintenance manipulator

    This dissertation details the development of an intuitive operator interface for a complex serial manipulator to be used in heavy maintenance tasks. The interface allows the operator to control the manipulator in 'task-space', with software handling the conversion to 'joint-space'. Testing of the interfaces shows operator task-space control to be most effective in reducing operator workload and improving the ease of use of a complex machine. These methods are applicable, in concept, to a wider range of manipulators and other machines. A number of operator interfaces were developed: a Joystick Interface, a Master Arm Interface and a 6-D Mouse Interface. The Joystick Interface made use of a task-space to joint-space transformation implemented in software. The Master Arm utilised a scale model to conduct the transformation. Finally, the 6-D Mouse Interface utilised sensors in an Android device with a software-based task-space to joint-space transformation. These interfaces were tested, and the Joystick Interface proved most suitable according to the operators' subjective opinion. Quantitative measurement also showed that it accurately reproduced the operator's commands. The software transformation developed for the Joystick and 6-D Mouse interfaces utilised the Jacobian matrix to complete the task-space to joint-space conversion. However, since the manipulator contained a redundant joint, an additional algorithm was required to handle the redundancy. This algorithm also improved manipulator safety, as it navigated the arm away from singularities which could result in large joint movements. The novelty of the algorithm lies in its pragmatic approach, and it could be modified to achieve a number of safety or performance goals. The control strategy centred on the operator specifying commands to the arm in the frame of the task. The developed algorithm enabled this strategy by ensuring that viable joint-velocity solutions could be found for a manipulator with redundant joints. Furthermore, the algorithm utilised a cost function that minimised the chance of large joint movements due to singularities, improving the safety of the device. Overall, the project has delivered a viable operator interface for controlling a complex, redundant manipulator. This interface was tested against a number of alternative operator interfaces, and the contrasting strengths and weaknesses of the various interfaces yielded a number of key insights, as well as a pragmatic approach to redundancy management.
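
    The dissertation's redundancy-handling algorithm is not reproduced here, but a common sketch of the same ingredients (a Jacobian-based task-space to joint-space conversion, a damping term that stays bounded near singularities, and a null-space cost gradient for the redundant joint) looks roughly like the following; the specific cost function and gains are assumptions.

```python
import numpy as np

def redundant_ik_step(J, xdot, q, cost_gradient, damping=0.05, k_null=0.5):
    """One velocity-level inverse-kinematics step for a redundant arm.

    J:             (6, n) manipulator Jacobian at configuration q, with n > 6.
    xdot:          (6,) commanded task-space velocity (twist).
    cost_gradient: callable returning d(cost)/dq, e.g. distance from joint centres.
    Returns qdot (n,): a damped least-squares task solution plus a null-space
    motion that reduces the cost without disturbing the task-space command.
    """
    JT = J.T
    # Damped least-squares pseudoinverse: stays bounded near singularities.
    J_pinv = JT @ np.linalg.inv(J @ JT + (damping ** 2) * np.eye(J.shape[0]))
    primary = J_pinv @ xdot
    null_projector = np.eye(J.shape[1]) - J_pinv @ J
    secondary = -k_null * null_projector @ cost_gradient(q)
    return primary + secondary

# Hypothetical use for a 7-joint arm: keep joints near the middle of their range.
# qdot = redundant_ik_step(J, xdot, q, lambda q: q - 0.5 * (q_min + q_max))
```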

    Command and Control Systems for Search and Rescue Robots

    The novel application of unmanned systems in the domain of humanitarian Search and Rescue (SAR) operations has created a need to develop specific multi-Robot Command and Control (RC2) systems. This societal application of robotics requires human-robot interfaces for controlling a large fleet of heterogeneous robots deployed in multiple domains of operation (ground, aerial and marine). This chapter provides an overview of the Command, Control and Intelligence (C2I) system developed within the scope of Integrated Components for Assisted Rescue and Unmanned Search operations (ICARUS). The life cycle of the system begins with a description of the use cases and deployment scenarios developed in collaboration with SAR teams as end-users. This is followed by an illustration of the system design and architecture, the core technologies used in implementing the C2I, and the iterative integration phases with field deployments for evaluating and improving the system. The main subcomponents consist of a central Mission Planning and Coordination System (MPCS), field Robot Command and Control (RC2) subsystems with a portable force-feedback exoskeleton interface for robot arm tele-manipulation, and field mobile devices. The distribution of these C2I subsystems and their communication links for unmanned SAR operations is described in detail. Field demonstrations of the C2I system with SAR personnel assisted by unmanned systems provide an outlook for implementing such systems in mainstream SAR operations in the future.
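
    As a purely illustrative sketch of how a central planner might hand per-domain batches of tasks to field command-and-control nodes, the snippet below defines a minimal task-assignment record and a dispatch step. The field names and domains are assumptions and do not reflect the ICARUS message definitions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Tuple

class Domain(Enum):
    GROUND = "ground"
    AERIAL = "aerial"
    MARINE = "marine"

@dataclass
class TaskAssignment:
    robot_id: str
    domain: Domain
    waypoints: List[Tuple[float, float]]  # (lat, lon) pairs
    teleoperation_allowed: bool = False   # e.g. exoskeleton-based arm control

def dispatch(tasks):
    """Group assignments per domain so each field RC2 subsystem receives its own batch."""
    batches = {d: [] for d in Domain}
    for t in tasks:
        batches[t.domain].append(t)
    return batches

plan = [TaskAssignment("ugv-01", Domain.GROUND, [(41.10, -8.60)], True),
        TaskAssignment("uav-02", Domain.AERIAL, [(41.10, -8.60), (41.20, -8.50)])]
print({d.value: len(b) for d, b in dispatch(plan).items()})
# {'ground': 1, 'aerial': 1, 'marine': 0}
```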

    Design Issues and in Field Tests of the New Sustainable Tractor LOCOSTRA

    first, in Italy, focusing on the agricultural application of the machine, in natural scenarios with different ground and vegetation.

    Ultrasound Guided Robot for Human Liver Biopsy using High Intensity Focused Ultrasound for Hemostasis

    Percutaneous liver biopsy is the gold standard among clinicians' tools to diagnose and guide subsequent therapy for liver disease. Ultrasound image guidance is increasingly used to reduce the associated procedural risks, but post-biopsy complications still persist. The major and most common complication is hemorrhage, which is highly unpredictable and may sometimes lead to death. Though the risk of mortality is low, it is too high for a diagnostic procedure. Post-biopsy care and additional surgical intervention to arrest hemorrhage make liver biopsy a costly procedure for health care delivery systems. Non-invasive methods to stop bleeding exist, such as electro-cautery, microwave, lasers, radio frequency, argon beam, and High Intensity Focused Ultrasound (HIFU). All of these methods except HIFU require direct exposure of the needle puncture site for hemostasis. HIFU is an ultrasound modality that uses mechanical sound waves for focused energy delivery. Ultrasound waves are minimally affected by tissue attenuation and can focus on internal targets without direct exposure. However, human error in focusing HIFU makes it unsuitable for a medical procedure, especially a noninvasive one. In this project we designed and developed an ultrasound-guided prototype robot for accurate HIFU targeting to induce hemostasis. The robotic system performs percutaneous needle biopsy, and a 7.5 cm focal length HIFU transducer is fired at the puncture point when the needle tip retracts to the liver surface after sample collection. The robot has 4 degrees of freedom (DOF) for biopsy needle insertion, HIFU positioning, needle angle alignment and US probe image plane orientation. As the needle puncture point is always in the needle path, mechanically constraining the HIFU to focus on the needle reduced the required functionality significantly. Two mini C-arms are designed for needle angle alignment and US probe image plane orientation. This reduced the contact footprint of the robot over the patient, providing greater dexterity for positioning the robot. The robot was validated for HIFU hemostasis through a series of experiments on chicken breasts. HIFU-initiated hemorrhage control with robotic biopsy ensures arrest of post-biopsy hemorrhage and decreases patient anxiety, hospital stay, morbidity, procedure time, and cost. The approach can also be extended to other organs such as kidneys and lungs, and has widespread implications such as control of post-biopsy hemorrhage in patients with reduced ability for hemostasis. This research opens greater scope for further work on automation and design to make the system a physician-friendly tool for eventual clinical use.
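
    The geometric consequence of mechanically constraining the HIFU to the needle path can be sketched as follows: given the puncture point and the direction from the transducer toward it, the transducer face is placed one focal length (7.5 cm, as reported above) back along that direction. The coordinate conventions and numbers in the example are illustrative assumptions, not the robot's actual kinematics.

```python
import numpy as np

FOCAL_LENGTH_CM = 7.5  # HIFU focal length reported in the abstract

def transducer_position(puncture_point, aim_direction):
    """Place the HIFU transducer so its focus falls on the needle puncture point.

    puncture_point: (3,) skin-entry point of the biopsy needle, in cm.
    aim_direction:  (3,) vector pointing from the transducer face toward the target.
    Returns the transducer face position whose focal point, FOCAL_LENGTH_CM along
    aim_direction, coincides with the puncture point.
    """
    d = np.asarray(aim_direction, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(puncture_point, dtype=float) - FOCAL_LENGTH_CM * d

# Example: target at the origin, transducer aimed straight down toward it.
print(transducer_position([0.0, 0.0, 0.0], [0.0, 0.0, -1.0]))
# [0.  0.  7.5]  -> the transducer sits 7.5 cm above the puncture point
```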

    An Augmented Interaction Strategy For Designing Human-Machine Interfaces For Hydraulic Excavators

    Lack of adequate information feedback and work visibility, and fatigue due to repetition, have been identified as the major usability gaps in the human-machine interface (HMI) design of modern hydraulic excavators; these gaps subject operators to undue mental and physical workload, resulting in poor performance. To address them, this work proposed an innovative interaction strategy, termed “augmented interaction”, for enhancing the usability of the hydraulic excavator. Augmented interaction involves the embodiment of heads-up display and coordinated control schemes in an efficient, effective and safe HMI. Augmented interaction was demonstrated using a framework consisting of three phases: Design, Implementation/Visualization, and Evaluation (D.IV.E). Guided by this framework, two alternative HMI design concepts (Design A, featuring a heads-up display and coordinated control; and Design B, featuring a heads-up display and joystick controls), in addition to the existing HMI design (Design C, featuring a monitor display and joystick controls), were prototyped. A mixed-reality seating buck simulator, named the Hydraulic Excavator Augmented Reality Simulator (H.E.A.R.S), was used to implement the designs and simulate a work environment along with a rock excavation task scenario. A usability evaluation was conducted with twenty participants to characterize the impact of the new HMI types using quantitative (task completion time, TCT; and operating error, OER) and qualitative (subjective workload and user preference) metrics. The results indicated that participants had a shorter TCT with Design A. For OER, there was a lower error probability due to collisions (PER1) with Design A, and a lower error probability due to misses (PER2) with Design B. The subjective measures showed a lower overall workload and a high preference for Design B. It was concluded that augmented interaction provides a viable solution for enhancing the usability of the HMI of a hydraulic excavator.
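
    A minimal sketch of how the reported quantitative metrics could be tabulated from trial logs is shown below; the column names and example numbers are hypothetical and do not reflect the study's data.

```python
# Hypothetical trial logs: one row per participant run, with error counts and the
# number of error opportunities ("events") per run.
trials = [
    {"design": "A", "tct_s": 182.0, "collisions": 1, "misses": 0, "events": 20},
    {"design": "A", "tct_s": 175.0, "collisions": 0, "misses": 1, "events": 20},
    {"design": "B", "tct_s": 201.0, "collisions": 2, "misses": 0, "events": 20},
]

def summarise(trials):
    """Per design: mean task completion time, PER1 (collisions) and PER2 (misses)."""
    acc = {}
    for t in trials:
        d = acc.setdefault(t["design"], {"tct": [], "col": 0, "miss": 0, "events": 0})
        d["tct"].append(t["tct_s"])
        d["col"] += t["collisions"]
        d["miss"] += t["misses"]
        d["events"] += t["events"]
    return {k: {"mean_tct_s": sum(v["tct"]) / len(v["tct"]),
                "PER1": v["col"] / v["events"],
                "PER2": v["miss"] / v["events"]}
            for k, v in acc.items()}

print(summarise(trials))
```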