60 research outputs found

    Autonomous Robots for Active Removal of Orbital Debris

    This paper presents a vision guidance and control method for autonomous robotic capture and stabilization of orbital objects in a time-critical manner. The method takes into account various operational and physical constraints, including ensuring a smooth capture, handling line-of-sight (LOS) obstructions of the target, and staying within the acceleration, force, and torque limits of the robot. Our approach involves the development of an optimal control framework for an eye-to-hand visual servoing method, which integrates two sequential sub-maneuvers: a pre-capturing maneuver and a post-capturing maneuver, aimed at achieving the shortest possible capture time. Integrating both control strategies enables a seamless transition between them, allowing for real-time switching to the appropriate control system. Moreover, both controllers are adaptively tuned through vision feedback to account for the unknown dynamics of the target. The integrated estimation and control architecture also facilitates fault detection and recovery of the visual feedback in situations where the feedback is temporarily obstructed. The experimental results demonstrate the successful execution of pre- and post-capturing operations on a tumbling and drifting target, despite multiple operational constraints
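
    A rough illustration of the switching structure described above (pre-capture tracking of a predicted grasp point, post-capture stabilization, and a simple timeout-based fault check on the visual feedback) is sketched below. All function names, gains, and the fault threshold are hypothetical; the paper's optimal-control formulation is not reproduced here.

```python
# Minimal sketch of a pre-/post-capture switching loop with a timeout-based
# visual-feedback fault check. Gains, limits, and names are illustrative only.
import numpy as np

def pre_capture_cmd(ee_pos, grasp_pos, grasp_vel, kp=1.0):
    # Track the predicted grasp point on the tumbling target (feedforward + P term).
    return grasp_vel + kp * (grasp_pos - ee_pos)

def post_capture_cmd(target_rate, kd=0.5):
    # Stabilize after capture: oppose the residual angular rate of the stack.
    return -kd * np.asarray(target_rate)

def control_step(est, vision, captured, dt, cmd_max=0.2, fault_after=0.5):
    if vision is None:
        # Line-of-sight obstruction: propagate the last estimate for a short time,
        # then declare a feedback fault and command zero.
        est["age"] += dt
        if est["age"] > fault_after:
            return np.zeros(3)
        est["grasp_pos"] = est["grasp_pos"] + est["grasp_vel"] * dt
    else:
        est.update(age=0.0,
                   grasp_pos=np.asarray(vision["grasp_pos"], float),
                   grasp_vel=np.asarray(vision["grasp_vel"], float))
    if captured:
        cmd = post_capture_cmd(est["target_rate"])
    else:
        cmd = pre_capture_cmd(est["ee_pos"], est["grasp_pos"], est["grasp_vel"])
    return np.clip(cmd, -cmd_max, cmd_max)   # stay within acceleration/force limits
```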

    Concurrent image-based visual servoing with adaptive zooming for non-cooperative rendezvous maneuvers

    An image-based servo controller for the guidance of a spacecraft during non-cooperative rendezvous is presented in this paper. The controller directly utilizes visual features from image frames of a target spacecraft to compute attitude and orbital maneuvers concurrently. The use of adaptive optics, such as zooming cameras, is also addressed through the development of an invariant-image servo controller. The controller allows rendezvous maneuvers to be performed independently of adjustments to the camera focal length, improving the performance and versatility of the maneuvers. The stability of the proposed control scheme is proven analytically in the invariant space, and its viability is explored through numerical simulations
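
    For context, the classical image-based visual servoing law computes a camera velocity from the feature error, v = -λ L⁺(s - s*); expressing point features in normalized (metric) image coordinates makes the error independent of the focal length, which is the flavour of zoom invariance the abstract alludes to. The sketch below is the textbook formulation, not the paper's controller; the focal length, principal point, and depth values are placeholders.

```python
# Textbook IBVS step with focal-length-normalized point features (zoom-independent
# error). Camera intrinsics and depth below are placeholder values.
import numpy as np

def interaction_matrix(x, y, Z):
    # Interaction (image Jacobian) matrix of one normalized point feature at depth Z.
    return np.array([[-1.0/Z, 0.0,    x/Z, x*y,       -(1.0 + x*x),  y],
                     [0.0,    -1.0/Z, y/Z, 1.0 + y*y, -x*y,         -x]])

def ibvs_velocity(pixels, s_desired, f, c, Z, lam=0.5):
    # Normalize pixels with the *current* focal length; s_desired is already given
    # in normalized coordinates, so the error does not change when the camera zooms.
    s = (np.asarray(pixels, float) - c) / f
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in s])
    e = (s - s_desired).ravel()
    return -lam * np.linalg.pinv(L) @ e   # 6-DOF camera velocity twist (v, w)

# Example with four image points at an assumed depth of 10 m.
f, c, Z = 800.0, np.array([512.0, 512.0]), 10.0
pixels = np.array([[400, 400], [600, 400], [600, 600], [400, 600]], float)
s_desired = (np.array([[450, 450], [570, 450], [570, 570], [450, 570]], float) - c) / f
print(ibvs_velocity(pixels, s_desired, f, c, Z))
```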

    Visual servoing of an Earth observation satellite of the LION constellation

    Satellites for observation missions, or imagery satellites, have increased drastically in number and performance since the beginning of the space age. Recent Earth observation satellites are equipped with new instruments that allow real-time image processing. Problems such as tracking a ground target, moving or not, can now be addressed by precisely controlling the satellite attitude. The satellite camera can thus be used as an input sensor for the real-time attitude control process, through a closed-loop control scheme that includes the image acquisition and image processing stages. Real-time attitude control using such sensors then allows the tracking of static or moving ground targets. In this paper, we propose to address this problem with a visual servoing (VS) approach. This work is thus focused on establishing a visual control law that precisely controls the attitude of a low-orbit, staring Earth observation satellite using images provided by its matrix sensor. The goal is to perform acquisition missions devoted to gazing at an object of interest that is visible in the image before the VS starts. The visual sensor is fixed to the satellite, and we have full control over its three rotational degrees of freedom subject to dynamic constraints, while the satellite moves on an orbit that only influences its position (which is not controlled by our VS scheme). Compensating for the target motion in the image by explicitly embedding it in the control scheme becomes essential when that motion is significant. In our case, the satellite orbit is known, so we can accurately determine its translational motion and compensate for it in the control law. As for the target motion, we propose to decompose it into a known displacement caused by the Earth's dynamics and a residual motion due to the target's own potential motion. The contribution of this paper is a visual servoing scheme able to control the attitude of an agile Earth observation satellite for target tracking. Three visual features are selected for controlling the three attitude parameters, achieving a centering task and an orientation task. The control law deals with the satellite's high translational velocity induced by its orbit and with other external motion, including the Earth's rotation and the target's own motion. A rate saturation algorithm is also proposed to handle the dynamic constraints. Simulations and experiments on an actual robot are presented
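
    As a rough sketch of the kind of rotation-only control law such a scheme relies on, the snippet below computes an attitude-rate command from the feature error plus a feedforward term for the predicted feature drift caused by the known orbital and Earth-rotation motion, and scales the command to respect a rate limit. It uses generic point features and invented gains; it is not the control law of the paper.

```python
# Rotation-only visual servoing sketch: omega = -pinv(L_w) (lambda*e + e_dot_ff),
# where e_dot_ff is the predicted feature drift from known orbital/Earth motion.
# Gains, the rate limit, and the feature choice are illustrative assumptions.
import numpy as np

def rotational_interaction(x, y):
    # Rotational block of the point-feature interaction matrix (depth does not appear).
    return np.array([[x*y,       -(1.0 + x*x),  y],
                     [1.0 + y*y, -x*y,         -x]])

def attitude_rate_cmd(s, s_des, e_dot_ff, lam=0.2, w_max=0.02):
    L = np.vstack([rotational_interaction(x, y) for x, y in np.asarray(s, float)])
    e = (np.asarray(s, float) - np.asarray(s_des, float)).ravel()
    omega = -np.linalg.pinv(L) @ (lam * e + np.asarray(e_dot_ff, float))
    # Rate saturation: uniformly scale the command so no axis exceeds w_max (rad/s),
    # which preserves the commanded rotation direction.
    scale = min(1.0, w_max / max(np.max(np.abs(omega)), 1e-12))
    return omega * scale
```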

    Vision-based rotational control of an agile observation satellite

    Recent Earth observation satellites are equipped with new instruments that allow image feedback in real time. Problems such as tracking a ground target, moving or not, can now be addressed by precisely controlling the satellite attitude. In this paper, we propose to consider this problem using a visual servoing approach. While focusing on the target, the control scheme must also take into account the satellite motion induced by its orbit, the Earth's rotational velocity, the target's own potential motion, as well as the rotational velocity and acceleration constraints of the system. We show the efficiency of our approach using both simulations (with real Earth imagery) and experiments on a robot that replicates the constraints of an actual high-resolution satellite
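
    The snippet below illustrates one simple way to honour rotational velocity and acceleration constraints between control cycles: clamp the per-cycle change of the commanded rate, then the rate itself. The limits are arbitrary placeholders, not those of the satellite studied in the paper.

```python
# Illustrative rate/acceleration limiter for a commanded angular rate; the limits
# w_max (rad/s) and a_max (rad/s^2) are placeholder values.
import numpy as np

def constrain_rate(omega_cmd, omega_prev, dt, w_max=0.03, a_max=0.005):
    # Limit the per-cycle change (acceleration), then the absolute rate.
    d_omega = np.clip(omega_cmd - omega_prev, -a_max * dt, a_max * dt)
    return np.clip(omega_prev + d_omega, -w_max, w_max)

# Example: a large requested rate is slewed toward gradually.
print(constrain_rate(np.array([0.05, 0.0, -0.04]), np.zeros(3), dt=0.1))
```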

    Design and Operational Elements of the Robotic Subsystem for the e.deorbit Debris Removal Mission

    This paper presents a robotic capture concept that was developed as part of the e.deorbit study by ESA. The defective and tumbling satellite ENVISAT was chosen as a potential target to be captured, stabilized, and subsequently de-orbited in a controlled manner. The concept is based on a chaser satellite equipped with a seven-degree-of-freedom dexterous robotic manipulator, holding a dedicated linear two-bracket gripper. The chaser is also equipped with a clamping mechanism for achieving a stiff fixation with the grasped target, following the de-tumbling of the combined satellite stack and prior to the execution of the de-orbit maneuver. Driving elements of the robotic design, operations, and control are described and analyzed. These include pre- and post-capture operations, the task-specific kinematics of the manipulator, the intrinsic mechanical flexibility of the arm and its effect on the arm's positioning accuracy, visual tracking, as well as the interaction between the manipulator controller and that of the chaser satellite. The kinematics analysis yielded robust reachability of the grasp point. The effects of intrinsic arm flexibility turned out to be noticeable but also effectively manageable through adaptation of the robot joint speed throughout the maneuvers. During most of the critical robot arm operations, the internal robot joint torques are shown to be within the design limits. These limits are only reached for a limiting scenario of ENVISAT tumbling motion, consisting of an initial pure spin of 5 deg/s about its unstable intermediate axis of inertia. The computer vision performance was found to be satisfactory with respect to the positioning accuracy requirements. Further developments are necessary, and are being pursued, to meet the stringent mission-related robustness requirements. Overall, the analyses conducted in this study showed that the capture and de-orbiting of ENVISAT using the proposed robotic concept is feasible with respect to the relevant mission requirements and for most of the operational scenarios considered. Future work aims at developing a combined chaser-robot system controller. This will include a visual servo to minimize the positioning errors during the contact phases of the mission (grasping and clamping). Further validation of the visual tracking under orbital lighting conditions will also be pursued
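
    As a loose illustration of the joint-speed adaptation idea mentioned above (slowing the arm so that flexibility-induced tip deflection stays within the positioning accuracy budget), a hypothetical scaling rule is sketched below; the linear deflection model and all numbers are assumptions, not the study's actual scheme.

```python
# Hypothetical joint-speed scaling to bound flexibility-induced tip error.
# The linear "deflection grows with joint-speed norm" model is an assumption.
import numpy as np

def adapt_joint_speed(qdot_cmd, deflection_per_speed, tip_error_budget=0.01):
    predicted_deflection = deflection_per_speed * np.linalg.norm(qdot_cmd)
    scale = min(1.0, tip_error_budget / max(predicted_deflection, 1e-12))
    return scale * np.asarray(qdot_cmd)

# Example: a fast command is halved so a predicted 2 cm deflection drops to 1 cm.
print(adapt_joint_speed(np.array([0.2, -0.1, 0.05, 0.0, 0.1, 0.0, 0.0]), 0.08))
```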

    Task space control for on-orbit space robotics using a new ROS-based framework

    This paper proposes several task space control approaches for complex on-orbit robots with many degrees of freedom. These approaches include redundancy resolution and take the non-linear dynamic model of the on-orbit robotic system into account. The suitability of the proposed task space control approaches is explored in several on-orbit servicing operations requiring visual servoing tasks with complex humanoid robots. A unified open-source framework for space-robotics simulation, called OnOrbitROS, is used to evaluate the proposed control systems and to compare their behaviour with existing state-of-the-art approaches. The adopted framework is based on ROS and reproduces the principal environmental conditions that space robots and manipulators could experience in an on-orbit servicing scenario. The architecture of the different software modules developed and their application to complex space robotic systems are presented. Efficient real-time implementations are achieved using the proposed OnOrbitROS framework. The proposed controllers are applied to perform the guidance of a humanoid robot. The robot dynamics are integrated into the definition of the controllers, and an analysis of the results and their practical properties is provided in the results section
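
    The redundancy-resolution ingredient mentioned above is commonly realized with the textbook resolved-rate form q̇ = J⁺ẋ + (I − J⁺J)q̇₀, where the second term moves the extra degrees of freedom without disturbing the task. The sketch below is that generic formulation, not the OnOrbitROS controllers themselves.

```python
# Generic task-space resolved-rate control with null-space redundancy resolution.
# J is the task Jacobian, x_err the task-space error, qdot_secondary a lower-priority
# joint motion (e.g. joint-limit avoidance). Names and gain are illustrative.
import numpy as np

def task_space_qdot(J, x_err, qdot_secondary, lam=1.0):
    J = np.asarray(J, float)
    J_pinv = np.linalg.pinv(J)
    primary = J_pinv @ (lam * np.asarray(x_err, float))   # drives the task error to zero
    null_proj = np.eye(J.shape[1]) - J_pinv @ J           # motions invisible to the task
    return primary + null_proj @ np.asarray(qdot_secondary, float)
```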

    Autonomous Visual Servo Robotic Capture of Non-cooperative Target

    This doctoral research develops and experimentally validates a vision-based control scheme for the autonomous capture of a non-cooperative target by robotic manipulators, for active space debris removal and on-orbit servicing. It focuses on the final capture stage by robotic manipulators, after the orbital rendezvous and proximity maneuvers have been completed. Two challenges are identified and investigated in this stage: the dynamic estimation of the non-cooperative target and the autonomous visual servo control of the robot. First, an integrated algorithm combining photogrammetry and an extended Kalman filter is proposed for the dynamic estimation of the non-cooperative target, whose motion is unknown in advance. To improve the stability and precision of the algorithm, the extended Kalman filter is enhanced by dynamically correcting the distribution of its process noise. Second, the concept of incremental kinematic control is proposed to avoid the multiple solutions that arise in solving the inverse kinematics of robotic manipulators. The proposed target motion estimation and visual servo control algorithms are validated experimentally on a custom-built visual servo manipulator-target system. Electronic hardware for the robotic manipulator and computer software for the visual servoing are custom designed and developed. The experimental results demonstrate the effectiveness and advantages of the proposed vision-based robotic control for the autonomous capture of a non-cooperative target. Furthermore, a preliminary study is conducted for the future extension of the robotic control to take flexible joints into consideration
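
    The incremental kinematic control idea can be pictured as a resolved-rate style update: rather than solving the full inverse kinematics (which admits multiple solutions), each cycle maps a small Cartesian error into a small joint increment through the Jacobian pseudo-inverse. The sketch below is a generic version of that idea, not the thesis implementation; the gain, step limit, and jacobian_fn are placeholders.

```python
# Generic incremental kinematic update: small Cartesian error -> small joint step.
# jacobian_fn, the gain, and the step limit are illustrative placeholders.
import numpy as np

def incremental_joint_update(q, x_err, jacobian_fn, gain=0.5, dq_max=0.05):
    J = jacobian_fn(q)                                    # Jacobian at the current joints
    dq = gain * np.linalg.pinv(J) @ np.asarray(x_err, float)
    dq = np.clip(dq, -dq_max, dq_max)                     # keep each increment small
    return np.asarray(q, float) + dq                      # no global IK, so no branch ambiguity
```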

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 344)

    This bibliography lists 125 reports, articles and other documents introduced into the NASA Scientific and Technical Information System during January, 1989. Subject coverage includes: aerospace medicine and psychology, life support systems and controlled environments, safety equipment, exobiology and extraterrestrial life, and flight crew behavior and performance

    Visual Tracking and Motion Estimation for an On-orbit Servicing of a Satellite

    This thesis addresses visual tracking of a non-cooperative as well as a partially cooperative satellite, to enable close-range rendezvous between a servicer and a target satellite. Visual tracking and estimation of the relative motion between a servicer and a target satellite are critical abilities for rendezvous and proximity operations such as repair and deorbiting. For this purpose, lidar has been widely employed in cooperative rendezvous and docking missions. Despite its robustness to harsh space illumination, lidar is heavy, has rotating parts, and consumes considerable power, which conflicts with the stringent requirements of satellite design. On the other hand, inexpensive on-board cameras can provide an effective solution and work over a wide range of distances. However, space lighting conditions are particularly challenging for image-based tracking algorithms because of direct sunlight exposure and the glossy surface of the satellite, which creates strong reflections and image saturation and complicates the tracking procedures. To address these difficulties, the relevant literature in computer vision and in satellite rendezvous and docking is examined. Two classes of problems are identified, and solutions implemented on a standard computer are provided. First, in the absence of a geometric model of the satellite, the thesis presents a robust feature-based method with prediction capability for the case of insufficient features, relying on a point-wise motion model. Second, a robust model-based hierarchical position localization method is employed to handle the change of image features over a range of distances and to localize an attitude-controlled (partially cooperative) satellite. Moreover, the thesis presents a pose tracking method that addresses ambiguities in edge matching, and a pose detection algorithm based on appearance model learning. For the validation of the methods, real camera images and ground truth data, generated with a laboratory test bed that reproduces space-like conditions, are used. The experimental results indicate that camera-based methods provide robust and accurate tracking for the approach of malfunctioning satellites, despite the difficulties associated with specularities and direct sunlight. Exceptional lighting conditions related to the sun angle are also discussed, with the aim of achieving a fully reliable localization system for a given mission
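
    The "predict when features are insufficient" behaviour with a point-wise motion model can be pictured as a per-feature alpha-beta filter: propagate each tracked point with its estimated image velocity when matching fails, and correct position and velocity when a measurement returns. The sketch below only illustrates that idea with arbitrary filter constants and is far simpler than the thesis methods.

```python
# Per-feature alpha-beta tracker illustrating prediction through feature dropouts.
# The filter constants are arbitrary placeholders.
import numpy as np

class PointTrack:
    def __init__(self, p0):
        self.p = np.asarray(p0, float)   # image position (pixels)
        self.v = np.zeros(2)             # image velocity (pixels per frame)

    def update(self, meas=None, alpha=0.85, beta=0.05):
        pred = self.p + self.v           # point-wise constant-velocity prediction
        if meas is None:                 # feature not matched this frame: coast
            self.p = pred
        else:                            # correct position and velocity from the match
            r = np.asarray(meas, float) - pred
            self.p = pred + alpha * r
            self.v = self.v + beta * r
        return self.p

# Example: two matched frames, one dropout, then a recovery.
track = PointTrack([100.0, 50.0])
for meas in ([102.0, 50.5], [104.0, 51.0], None, [108.0, 52.0]):
    print(track.update(meas))
```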

    A bio-plausible design for visual attitude stabilization

    We consider the problem of attitude stabilization using exclusively visual sensory input, and we look for a solution that satisfies the constraints of a "bio-plausible" computation. We obtain a PD controller that is a bilinear form of the goal image and of the current and delayed visual inputs. Moreover, this controller can be learned using classic neural network algorithms. The structure of the resulting computation, derived from general principles by imposing a bilinear computation, bears a striking resemblance to existing models of visual information processing in insects (Reichardt correlators and lobula plate tangential cells). We validate the algorithms using faithful simulations of the fruit fly's visual input
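
    The "bilinear form of the goal image and the current and delayed visual input" can be written schematically, per control axis, as u = y*ᵀ P y_t + y*ᵀ D (y_t − y_{t−δ}). The snippet below only demonstrates that computational structure with random placeholder tensors; the actual controller in the paper is learned, and its gains are not reproduced here.

```python
# Structure-only sketch of a PD command built from bilinear forms of the goal
# image and the current/delayed visual input. P and D are random placeholders,
# standing in for tensors that would be learned.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_axes = 64, 3
P = rng.normal(scale=1e-3, size=(n_axes, n_pixels, n_pixels))  # "proportional" tensor
D = rng.normal(scale=1e-3, size=(n_axes, n_pixels, n_pixels))  # "derivative" tensor

def pd_command(y_goal, y_now, y_delayed):
    prop = np.einsum('i,kij,j->k', y_goal, P, y_now)               # bilinear P term
    deriv = np.einsum('i,kij,j->k', y_goal, D, y_now - y_delayed)  # bilinear D term
    return prop + deriv

y_goal, y_now, y_prev = rng.random(n_pixels), rng.random(n_pixels), rng.random(n_pixels)
print(pd_command(y_goal, y_now, y_prev))
```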