20 research outputs found

    Robust feature extraction pose estimation during fly-around and straight-line approach in close range.

    Get PDF
    This paper addresses visual navigation for On-Orbit Servicing (OOS) and/or Active Debris Removal (ADR) missions. A robust feature-extraction pose estimation technique is proposed to estimate the target pose while approaching the target. The method is tested in open loop with two image datasets from different sensors. The stable tracking during the fly-around and the straight-line approach supports considering this technique as a possible candidate for future missions.
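
    The abstract does not disclose the exact pipeline, so the following is only a minimal sketch of a generic feature-based pose estimation step: ORB features are matched against a catalogue of model keypoints with known 3D coordinates, and the relative pose is recovered with RANSAC-based PnP. The model arrays, camera intrinsics and threshold values are illustrative assumptions, not taken from the paper.

```python
import cv2
import numpy as np

def estimate_pose(image, model_descriptors, model_points_3d, K, dist_coeffs=None):
    """Generic feature-based pose estimation sketch (not the paper's exact method).

    model_descriptors : ORB descriptors of target-model keypoints (N x 32, uint8)
    model_points_3d   : corresponding 3D points in the target body frame (N x 3)
    K                 : 3x3 camera intrinsic matrix
    """
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(image, None)
    if descriptors is None:
        return None

    # Match observed features against the model catalogue
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([model_points_3d[m.trainIdx] for m in matches])

    # Robust PnP rejects outlier matches and returns the camera-to-target pose
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts, image_pts, K, dist_coeffs, reprojectionError=3.0)
    return (rvec, tvec) if ok else None
```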

    A robust navigation filter fusing delayed measurements from multiple sensors and its application to spacecraft rendezvous

    Get PDF
    A filter is an essential part of many control systems. For example, guidance, navigation and control systems for spacecraft rendezvous require a robust navigation filter that generates state estimates in a smooth and stable way. This is important for safe spacecraft navigation during rendezvous missions. Delayed, asynchronous measurements from possibly different sensors require a new filter technique that can handle these challenges. A new method is developed, based on an Extended Kalman Filter with several adaptations in the prediction and correction steps. Two key aspects are the extrapolation of delayed measurements and sensor fusion in the filter correction. The new filter technique is applied to different close-range rendezvous examples and tested at the hardware-in-the-loop facility EPOS 2.0 (European Proximity Operations Simulator) with two different rendezvous sensors. Even with the realistic delays introduced by an ARM-based on-board computer in the hardware-in-the-loop tests, the filter provides accurate, stable and smooth state estimates in all test scenarios.
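
    The abstract names measurement extrapolation as one key adaptation but does not give the exact formulation; the sketch below shows one simple way to fold a delayed position measurement into a standard EKF correction, by shifting the measurement forward over the delay with the currently estimated velocity. The state layout, dynamics and noise values are illustrative assumptions.

```python
import numpy as np

class DelayedMeasurementEKF:
    """Minimal EKF sketch: state x = [position (3), velocity (3)]."""

    def __init__(self, x0, P0, q=1e-4, r=1e-2):
        self.x, self.P = x0, P0
        self.Q = q * np.eye(6)           # process noise (assumed value)
        self.R = r * np.eye(3)           # position measurement noise (assumed value)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)       # constant-velocity model
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def correct_delayed(self, z_pos, delay):
        # Extrapolate the delayed position measurement to the current filter
        # time using the estimated velocity (one simple approximation).
        z_now = z_pos + delay * self.x[3:]
        y = z_now - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```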

    10-Year Anniversary of the European Proximity Operations Simulator 2.0 - Looking Back at Test Campaigns, Rendezvous Research and Facility Improvements

    Get PDF
    Completed in 2009, the European Proximity Operations Simulator 2.0 (EPOS 2.0) succeeded EPOS 1.0 at the German Space Operations Center (GSOC). One of the many contributions the old EPOS 1.0 facility made to spaceflight rendezvous was the verification of the Jena-Optronik laser-based sensors used by the Automated Transfer Vehicle. While EPOS 2.0 builds upon this heritage, it is a completely new design aimed at considerably more complex rendezvous scenarios. During the last ten years, GSOC's On-Orbit-Servicing and Autonomy group, which operates, maintains and evolves EPOS 2.0, has made numerous contributions to the field of uncooperative rendezvous, using EPOS as its primary tool. After general research in optical navigation in the early 2010s, the OOS group took a leading role in the DLR project On-Orbit-Servicing End-to-End Simulation (E2E) in 2014. EPOS 2.0 served as the hardware-in-the-loop simulator of the rendezvous phase and contributed substantially to the project's remarkable success. Over the years, the E2E project has revealed demanding requirements, leading to numerous facility improvements and extensions. In addition to the OOS group's research work, numerous and diverse open-loop test campaigns for industry and internal (DLR) customers have significantly shaped the capabilities of EPOS 2.0.

    Operations of On-Orbit Servicing Missions

    No full text
    This chapter gives a comprehensive insight into the operational aspects of On-Orbit Servicing as well as rendezvous and docking missions. By means of several examples, the operational challenges are explained, and solutions are outlined. The orbit mechanics of a rendezvous mission are described in the local orbital frame. We use the Clohessy-Wiltshire equations to explain the different elements of the approach navigation. The influence of the sensor technology on the approach strategy is also discussed. In the context of robotic capture, we address the necessary changes in the communication concept, e.g., to ensure teleoperation. Finally, we describe the use of test and validation facilities for the critical maneuvers of a rendezvous and docking mission.
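
    As a worked illustration of the Clohessy-Wiltshire equations mentioned above, the sketch below numerically integrates the linearized relative dynamics in the local orbital frame (x radial, y along-track, z cross-track) for a chaser near a target on a circular orbit; the orbital radius and initial conditions are example values only.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 3.986004418e14           # m^3/s^2
n = np.sqrt(MU_EARTH / 7000e3**3)   # mean motion of an example 7000 km circular orbit

def cw_dynamics(t, s):
    """Clohessy-Wiltshire equations, s = [x, y, z, vx, vy, vz]:
    x radial, y along-track, z cross-track, target-centered frame."""
    x, y, z, vx, vy, vz = s
    ax = 3 * n**2 * x + 2 * n * vy
    ay = -2 * n * vx
    az = -n**2 * z
    return [vx, vy, vz, ax, ay, az]

# Example: chaser 100 m behind and 10 m above the target, drifting freely
s0 = [10.0, -100.0, 0.0, 0.0, 0.0, 0.0]
sol = solve_ivp(cw_dynamics, (0.0, 3000.0), s0, max_step=10.0)
print(sol.y[:3, -1])                # relative position after 3000 s
```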

    Smoothed normal distribution transform for efficient point cloud registration during space rendezvous

    Get PDF
    Next to the iterative closest point (ICP) algorithm, the normal distribution transform (NDT) algorithm is becoming a second standard for 3D point cloud registration in mobile robotics. Both methods are effective; however, they require a sufficiently good initialization to successfully converge. In particular, the discontinuities in the NDT cost function can lead to difficulties when performing the optimization. In addition, when the size of the point clouds increases, performing the registration in real time becomes challenging. This work introduces a Gaussian smoothing technique of the NDT map, which can be done prior to the registration process. A kd-tree adaptation of the typical octree representation of NDT maps is also proposed. The performance of the modified smoothed NDT (S-NDT) algorithm for pairwise scan registration is assessed on two large-scale outdoor datasets and compared to the performance of a state-of-the-art ICP implementation. S-NDT is around four times faster and as robust as ICP while reaching similar precision. The algorithm is thereafter applied to the problem of LiDAR tracking of a spacecraft in close range in the context of space rendezvous, demonstrating the performance and applicability to real-time applications.
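
    The exact smoothing of the NDT map is not given in the abstract; the sketch below shows a plain NDT-style voxel map (one Gaussian per cell) and one possible smoothing step that blends each cell with its neighbours. It is only an illustration of the idea, not the authors' formulation, and the cell size, point-count threshold and blending weight are assumed values.

```python
import numpy as np
from collections import defaultdict

def build_ndt_map(points, cell_size=1.0):
    """Group points into voxels and fit a Gaussian (mean, covariance) per voxel."""
    cells = defaultdict(list)
    for p in points:
        cells[tuple(np.floor(p / cell_size).astype(int))].append(p)
    ndt = {}
    for idx, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) >= 5:                       # need enough points for a covariance
            ndt[idx] = (pts.mean(axis=0), np.cov(pts.T) + 1e-6 * np.eye(3))
    return ndt

def smooth_ndt_map(ndt, weight=0.5):
    """Illustrative smoothing: blend each cell's Gaussian with its 6-neighbours."""
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    smoothed = {}
    for idx, (mu, cov) in ndt.items():
        mus, covs = [mu], [cov]
        for off in offsets:
            nb = tuple(np.add(idx, off))
            if nb in ndt:
                mus.append(ndt[nb][0])
                covs.append(ndt[nb][1])
        w = np.full(len(mus), (1 - weight) / max(len(mus) - 1, 1))
        w[0] = weight if len(mus) > 1 else 1.0
        smoothed[idx] = (np.average(mus, axis=0, weights=w),
                         np.average(covs, axis=0, weights=w))
    return smoothed
```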

    Lidar Pose Tracking of a Tumbling Spacecraft Using the Smoothed Normal Distribution Transform

    Get PDF
    Lidar sensors enable precise pose estimation of an uncooperative spacecraft in close range. In this context, the iterative closest point (ICP) algorithm is usually employed as a tracking method. However, when the size of the point clouds increases, the required computation time of the ICP can become a limiting factor. The normal distribution transform (NDT) is an alternative algorithm which can be more efficient than the ICP, but suffers from robustness issues. In addition, lidar sensors are also subject to motion blur effects when tracking a spacecraft tumbling with a high angular velocity, leading to a loss of precision in the relative pose estimation. This work introduces a smoothed formulation of the NDT to improve the algorithm's robustness while maintaining its efficiency. Additionally, two strategies are investigated to mitigate the effects of motion blur. The first consists in un-distorting the point cloud, while the second is a continuous-time formulation of the NDT. Hardware-in-the-loop tests at the European Proximity Operations Simulator demonstrate the capability of the proposed methods to precisely track an uncooperative spacecraft under realistic conditions within tens of milliseconds, even when the spacecraft tumbles at a significant angular rate.
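
    The abstract mentions un-distorting the point cloud to counter motion blur but leaves the details to the paper; the sketch below shows one common approach, rotating each lidar return back to a common reference time using per-point timestamps and an estimated relative angular velocity. The constant-rate assumption and all variable names are illustrative, not the authors' method.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def undistort_point_cloud(points, timestamps, omega, t_ref):
    """Rotate each point back to the reference time t_ref.

    points     : N x 3 lidar returns expressed in the target frame
    timestamps : per-point acquisition times (s), length N
    omega      : estimated relative angular velocity of the target (rad/s, 3-vector)
    """
    corrected = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        # Assume a constant rate over the scan: the target rotated by
        # omega * (t - t_ref) between the reference time and this return.
        rot = Rotation.from_rotvec(-np.asarray(omega) * (t - t_ref))
        corrected[i] = rot.apply(p)
    return corrected
```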

    LiDAR based pose tracking of an uncooperative spacecraft using the smoothed normal distribution transform

    Get PDF
    Lidar sensors provide precise 3D point cloud measurements and can be used for pose estimation of an uncooperative satellite during space rendezvous. For updating the estimated pose of the target, the iterative closest point (ICP) algorithm or one of its variants is usually applied as a tracking method. However, for dense point clouds and space hardware with reduced computing power, the execution time of ICP can become a limiting factor. The normal distribution transform (NDT) is an alternative algorithm for fine point cloud registration, which can be faster than ICP. Yet, NDT can be less robust than ICP due to the discontinuities in its cost function. This work proposes a smoothing method of the NDT map in order to mitigate this robustness problem. In addition, a strategy for correcting the motion blur observed with lidar sensors when recording a tumbling target is developed. Experiments at the European Proximity Operations Simulator demonstrate the efficiency, precision and robustness of the smoothed NDT algorithm when compared to ICP, as well as the importance of motion blur correction for precise pose estimation.
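
    Since ICP is the baseline against which the smoothed NDT is compared, a minimal point-to-point ICP sketch is given below (nearest neighbours via a k-d tree, closed-form rigid alignment by SVD). It is a textbook illustration, not the implementation used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Closed-form least-squares rotation/translation aligning src to dst (Kabsch/SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(source, target, iterations=30):
    """Point-to-point ICP: iteratively match nearest neighbours and re-align."""
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    src = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```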

    A robust navigation filter fusing delayed measurements from multiple sensors

    No full text
    An extended Kalman filter is modified to handle delayed and out-of-sync measurements from multiple sensors. It is integrated into an existing guidance, navigation and control system for close-range satellite rendezvous navigation based on optical sensors. This navigation filter is evaluated in a hardware-in-the-loop (HiL) simulation using the European Proximity Operations Simulator (EPOS). The resulting solution is especially useful for autonomous navigation on slow on-board computers, such as the Scalable On-Board System for Space Avionics (SCOSA) developed by DLR.
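
    The abstract does not spell out how out-of-sync measurements are handled; a commonly used alternative to measurement extrapolation is to buffer filter snapshots, roll back to the snapshot preceding a late measurement, and reprocess everything since. The sketch below only illustrates that buffering pattern around a generic filter object; the `predict_to`/`correct` interface and the horizon value are assumptions, not taken from the paper.

```python
import bisect
import copy

class RollbackBuffer:
    """Keeps timestamped filter snapshots so late measurements can be replayed."""

    def __init__(self, filt, horizon=5.0):
        self.filt = filt                    # object with predict_to(t) and correct(z)
        self.snapshots = []                 # time-ordered list of (t, filter copy, measurement)
        self.horizon = horizon              # seconds of history to keep

    def process(self, t_meas, z):
        times = [t for t, _, _ in self.snapshots]
        pos = bisect.bisect(times, t_meas)
        if pos == 0 and self.snapshots:
            return                          # older than the whole buffer: discard (sketch choice)
        replay = [(t, m) for t, _, m in self.snapshots[pos:]]
        if replay:
            # Late measurement: roll back to the snapshot just before it.
            self.filt = copy.deepcopy(self.snapshots[pos - 1][1])
            self.snapshots = self.snapshots[:pos]
        # Apply the new measurement, then replay the newer ones in time order.
        for t, m in [(t_meas, z)] + replay:
            self.filt.predict_to(t)
            self.filt.correct(m)
            self.snapshots.append((t, copy.deepcopy(self.filt), m))
        # Drop snapshots older than the buffering horizon.
        latest = self.snapshots[-1][0]
        self.snapshots = [s for s in self.snapshots if latest - s[0] <= self.horizon]
```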

    Segmentation-driven spacecraft pose estimation for vision-based relative navigation in space

    No full text
    Vision-based relative navigation technology is a key enabler of several areas of the space industry such as on-orbit servicing, space debris removal, and formation flying. A particularly demanding scenario is navigating relative to a non-cooperative target that does not offer any navigational aid and is unable to stabilize its attitude. Previously, the state of the art in vision-based relative navigation has relied on image processing and template matching techniques. However, outside of the space industry, state-of-the-art object pose estimation techniques are dominated by convolutional neural networks (CNNs). This is due to CNNs' flexibility towards arbitrary pose estimation targets, their ability to use whatever target features are available, and their robustness towards varied lighting conditions, damage to targets, occlusions, and other effects that might interfere with the image. The use of CNNs for visual relative navigation is still relatively unexplored in terms of how their unique advantages can best be exploited. This research aims to integrate a state-of-the-art CNN-based pose estimation architecture into a relative navigation system. The system's navigation performance is benchmarked on realistic images gathered from the European Proximity Operations Simulator 2.0 (EPOS 2.0) robotic hardware-in-the-loop laboratory. A synthetic dataset is generated using Blender as a rendering engine. A segmentation-based 6D pose estimation CNN is trained using the synthetic dataset, and the resulting pose estimation performance is evaluated on a set of real images gathered from the cameras of the EPOS 2.0 robotic close-range relative navigation laboratory. It is demonstrated that a synthetic-image-trained CNN-based pose estimation pipeline is able to perform successfully in a close-range visual navigation setting on real camera images of a spacecraft, though with some limitations that still have to be overcome before the system is ready for operation. Furthermore, it is able to do so with a symmetric target, a common difficulty for neural networks in a pose estimation setting.
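
    The abstract states that the training set was rendered in Blender but gives no scripting details; the snippet below is a minimal sketch of how pose-labelled synthetic images could be generated with Blender's Python API, assuming a scene that already contains a spacecraft model object named "Target" and a configured camera. The object name, output paths, sample count and pose ranges are all illustrative assumptions.

```python
import json
import math
import random
import bpy  # Blender's Python API; run this inside Blender

scene = bpy.context.scene
target = bpy.data.objects["Target"]        # assumed name of the spacecraft model
labels = []

for i in range(1000):
    # Randomize the target pose within illustrative ranges (metres / radians).
    target.location = (random.uniform(-0.5, 0.5),
                       random.uniform(-0.5, 0.5),
                       random.uniform(5.0, 20.0))
    target.rotation_euler = tuple(random.uniform(0, 2 * math.pi) for _ in range(3))

    # Render the frame and record the ground-truth pose label.
    scene.render.filepath = f"/tmp/dataset/img_{i:05d}.png"
    bpy.ops.render.render(write_still=True)
    labels.append({"image": scene.render.filepath,
                   "location": list(target.location),
                   "rotation_euler": list(target.rotation_euler)})

with open("/tmp/dataset/labels.json", "w") as f:
    json.dump(labels, f, indent=2)
```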