4 research outputs found

    Image-Based Flexible Endoscope Steering

    Manually steering the tip of a flexible endoscope to navigate through an endoluminal path relies on the physician’s dexterity and experience. In this paper we present the realization of a robotic flexible endoscope steering system that uses the endoscopic images to control the tip orientation towards the direction of the lumen. Two image-based control algorithms are investigated: one based on optical flow and the other on image intensity. Both are evaluated using simulations in which the endoscope was steered through the lumen. The RMS distance to the lumen center was less than 25% of the lumen width. An experimental setup was built using a standard flexible endoscope, and the image-based control algorithms were used to actuate the wheels of the endoscope for tip steering. Experiments were conducted in an anatomical model to simulate gastroscopy. The image intensity-based algorithm was capable of accurately steering the endoscope tip through an endoluminal path from the mouth to the duodenum. Compared to manual control, the robotically steered endoscope performed 68% better in terms of keeping the lumen centered in the image.
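    The intensity-based idea can be sketched as follows: in an endoscopic image the lumen appears as the darkest region (farthest from the light source), so the centroid of the darkest pixels gives a steering target relative to the image center. This is a minimal illustrative sketch, not the authors' implementation; the `dark_fraction` threshold and the normalization are assumptions.

    ```python
    import numpy as np

    def lumen_direction(image, dark_fraction=0.1):
        """Estimate a steering direction toward the lumen center.

        The lumen is assumed to be the darkest part of the image, so we
        take the centroid of the darkest pixels as the steering target.
        Returns an (x, y) error vector from the image center, each
        component normalized to [-1, 1].
        """
        h, w = image.shape
        # Intensity threshold separating the darkest fraction of pixels.
        thresh = np.quantile(image, dark_fraction)
        ys, xs = np.nonzero(image <= thresh)
        cy, cx = ys.mean(), xs.mean()
        return np.array([(cx - w / 2) / (w / 2), (cy - h / 2) / (h / 2)])
    ```

    The returned vector would drive the two steering wheels of the endoscope so that the dark region is pulled back toward the image center.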

    Automatic Collision Avoidance for Teleoperated Underactuated Aerial Vehicles using Telemetric Measurements

    The paper deals with the obstacle avoidance problem for unmanned aerial vehicles (UAVs) operating in teleoperated mode. First, a feedback controller that we proposed recently for the stabilization of the UAV's linear velocity is recalled. Then, based on sensory measurements, a control strategy is proposed to modify the reference velocity on-line in the neighborhood of obstacles so as to avoid collisions. Both cases of telemetry and optical flow sensors are addressed. Stability properties of the proposed feedback controller are established based on a Lyapunov analysis. Simulation results are reported to illustrate the approach.
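    The core of such a strategy is to attenuate, as a function of measured range, the component of the operator's reference velocity that points toward a detected obstacle. The sketch below illustrates this with a linear fade between two assumed distance thresholds (`d_safe`, `d_stop`); it is a simplified stand-in, not the paper's Lyapunov-based controller.

    ```python
    import numpy as np

    def modify_reference_velocity(v_ref, obstacle_dir, distance,
                                  d_safe=2.0, d_stop=0.5):
        """Attenuate the velocity component toward an obstacle.

        v_ref: operator's commanded velocity vector.
        obstacle_dir: unit vector toward the nearest obstacle.
        distance: telemetric range to that obstacle.
        The approach component fades linearly between d_safe and d_stop
        and is removed entirely at or below d_stop.
        """
        approach = float(np.dot(v_ref, obstacle_dir))
        if approach <= 0 or distance >= d_safe:
            return np.asarray(v_ref, dtype=float)  # no avoidance needed
        scale = max(0.0, (distance - d_stop) / (d_safe - d_stop))
        # Subtract the attenuated fraction of the approach component.
        return v_ref - (1.0 - scale) * approach * np.asarray(obstacle_dir)
    ```

    Motion tangential to the obstacle is left untouched, so the operator retains control authority while the collision course is neutralized.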

    A new framework for force feedback teleoperation of robotic vehicles based on optical flow

    This paper proposes the use of optical flow from a moving robot to provide force feedback to an operator's joystick to facilitate collision-free teleoperation. Optical flow is measured by wide-angle cameras on board the vehicle and used to generate a virtual environmental force that is reflected to the user through the joystick, as well as feeding back into the control of the vehicle. The coupling between optical flow (velocity) and force is modelled as an impedance, in this case an optical impedance. We show that the proposed control is dissipative and prevents the vehicle from colliding with the environment, as well as providing the operator with a natural feel for the remote environment. The paper focuses on applications to aerial robotic vehicles; however, the ideas apply directly to other force-actuated vehicles such as submersibles or space vehicles, and the authors believe the approach has potential for control of terrestrial vehicles and even teleoperation of manipulators. Experimental results are provided for a simulated aerial robot in a virtual environment controlled by a haptic joystick.
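    The impedance coupling described above can be sketched in its simplest form: the reflected force is the measured flow scaled by an impedance gain and opposed to the direction of motion, since large flow indicates nearby surfaces (flow magnitude grows roughly as speed over distance). This is an illustrative reduction under that assumption, not the paper's full dissipativity analysis.

    ```python
    import numpy as np

    def optical_impedance_force(flow, impedance=0.5):
        """Reflect measured optical flow to the joystick as a force, F = -Z * w.

        flow: average optic-flow vector measured by the onboard cameras;
        large flow implies close obstacles, so the reflected force pushes
        the operator's hand away from them.
        """
        return -impedance * np.asarray(flow, dtype=float)
    ```

    Because the force opposes the flow-inducing motion, the operator literally feels the proximity of the environment through the stick while the same signal damps the vehicle's approach.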

    A New Framework for Force Feedback Teleoperation of Robotic Vehicles based on Optical Flow
