
    A Survey of LIDAR Technology and Its Use in Spacecraft Relative Navigation

    This paper provides a survey of modern LIght Detection And Ranging (LIDAR) sensors from the perspective of how they can be used for spacecraft relative navigation. In addition to LIDAR technology commonly used in space applications today (e.g. scanning, flash), this paper reviews emerging LIDAR technologies gaining traction in other, non-aerospace fields. The discussion includes an overview of sensor operating principles and the specific pros and cons of each type of LIDAR. The problem of orbital rendezvous and docking has been a consistent challenge for complex space missions since before the Gemini 8 spacecraft performed the first successful on-orbit docking of two spacecraft in 1966. Over the years, a great deal of effort has been devoted to advancing technology associated with all aspects of the rendezvous, proximity operations, and docking (RPOD) flight phase. After years of perfecting the art of crewed rendezvous with the Gemini, Apollo, and Space Shuttle programs, NASA began investigating the problem of autonomous rendezvous and docking (AR&D) to support a host of different mission applications. Some of these applications include autonomous resupply of the International Space Station (ISS), robotic servicing/refueling of existing orbital assets, and on-orbit assembly. The push towards a robust AR&D capability has led to an intensified interest in a number of different sensors capable of providing insight into the relative state of two spacecraft. The present work focuses on exploring the state of the art in one of these sensors: LIght Detection And Ranging (LIDAR) sensors. It should be noted that the military community frequently uses the acronym LADAR (LAser Detection And Ranging) to refer to what this paper calls LIDARs.
A LIDAR is an active remote sensing device that is typically used in space applications to obtain the range to one or more points on a target spacecraft. As the name suggests, LIDAR sensors use light (typically a laser) to illuminate the target and measure the time it takes for the emitted signal to return to the sensor. Because the light must travel from the source to the target and back, the measured time of flight is directly proportional to the range.
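The time-of-flight ranging principle described in this abstract can be sketched in a few lines. This is a minimal illustration of the general principle, not code from the paper; the 100 m target range in the example is an arbitrary illustrative value.

```python
# Round-trip time-of-flight to range: the basic LIDAR ranging principle.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(round_trip_time_s: float) -> float:
    """Convert a measured round-trip time of flight to a one-way range."""
    # The pulse travels to the target and back, so divide by two.
    return C * round_trip_time_s / 2.0

# A target at 100 m returns the pulse after roughly 667 ns.
t = 2 * 100.0 / C
print(tof_to_range(t))  # ~100.0 m
```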

    Autonomous RPOD Technology Challenges for the Coming Decade

    Rendezvous, Proximity Operations and Docking (RPOD) technologies are important to a wide range of future space endeavors. This paper will review some of the recent and ongoing activities related to autonomous RPOD capabilities and summarize the current state of the art. Gaps are identified where future investments are necessary to successfully execute some of the missions likely to be conducted within the next ten years. A proposed RPOD technology roadmap that meets the broad needs of NASA's future missions will be outlined, and ongoing activities at GSFC in support of a future satellite servicing mission are presented. The case presented shows that an evolutionary, stair-step technology development program, including a robust campaign of coordinated ground tests and space-based system-level technology demonstration missions, will ultimately yield a multi-use, main-stream autonomous RPOD capability suite with cross-cutting benefits across a wide range of future applications.

    Relative pose determination algorithm for space on-orbit close range autonomous operation using LiDAR

    Non-cooperative on-orbit operations, such as rendezvous, docking or berthing operations, have become more relevant, mainly due to the necessity of extending mission lifetimes, the increase of space debris and the reduction of human dependency. In order to automate these operations, the relative pose between the target and the chaser must be determined autonomously. In recent years, LiDAR sensors have been introduced for this problem, achieving good accuracies. The critical part of this operation is the first relative pose calculation, since there is no previous information about the attitude of the target. In this work, a methodology to carry out this first relative pose calculation using LiDAR sensors is presented. A template matching algorithm has been developed, which uses the 3D model of the target to calculate the relative pose of the target with respect to the LiDAR sensor. Three different study cases, with different distances and rotations, have been simulated in order to validate the algorithm, reaching an average error of 0.0383 m.
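A template-matching approach like the one described ultimately needs a pose-fit step: given matched 3D points from the target's template model and from the LiDAR scan, recover the rigid rotation and translation. The sketch below shows the standard SVD (Kabsch) solution under the assumption of known correspondences; the point values are synthetic, not data from the paper.

```python
import numpy as np

def fit_rigid_transform(model_pts, scan_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto scan_pts."""
    mu_m = model_pts.mean(axis=0)
    mu_s = scan_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (scan_pts - mu_s)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_s - R @ mu_m
    return R, t

# Synthetic check: rotate and shift a toy "template", then recover the pose.
rng = np.random.default_rng(0)
model = rng.normal(size=(50, 3))
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
scan = model @ R_true.T + t_true
R_est, t_est = fit_rigid_transform(model, scan)
```

With noise-free correspondences the recovered pose matches the true transform to machine precision; in practice the correspondences themselves are the hard part, which is where the template matching comes in.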

    Raven: An On-Orbit Relative Navigation Demonstration Using International Space Station Visiting Vehicles

    Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets: spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous non-cooperative relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload on the Space Test Program's STP-H5 mission, mounted on an external ExPRESS Logistics Carrier (ELC), and will image the many visiting vehicles arriving at and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six-degree-of-freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we cover top-level requirements, the experimental concept of operations, the system design, and the status of Raven integration and test activities.

    Pose Performance of LIDAR-Based Relative Navigation for Non-Cooperative Objects

    Flash LIDAR is an important new sensing technology for relative navigation; these sensors have shown promising results during rendezvous and docking applications involving a cooperative vehicle. An area of recent interest is the application of this technology to pose estimation with non-cooperative client vehicles, in support of on-orbit satellite servicing activities and asteroid redirect missions. The capability for autonomous rendezvous with non-cooperative satellites will enable refueling and servicing of satellites (particularly those designed without servicing in mind), allowing these vehicles to continue operating rather than being retired. Rendezvous with an asteroid will give further insight into the origin of individual asteroids. This research investigates numerous issues surrounding pose performance using LIDAR. To begin analyzing the characteristics of the data produced by Flash LIDAR, simulated and laboratory testing have been completed. Observations of common asteroid materials were made with a surrogate LIDAR, characterizing the reflectivity of the materials. A custom Iterative Closest Point (ICP) algorithm was created to estimate the position and orientation of the LIDAR relative to the observed object. The performance of standardized pose estimation techniques (including ICP) has been examined using non-cooperative data, as well as the characteristics of the materials that will potentially be observed during missions. For the hardware tests, a SwissRanger ToF camera was used as a surrogate Flash LIDAR.
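The ICP technique named in this abstract can be sketched as a simple point-to-point loop: match every scan point to its nearest model point, solve for the best rigid transform, apply it, and repeat. This is a generic brute-force sketch, not the paper's custom implementation; the grid "model" and small perturbation are illustrative assumptions chosen so the nearest-neighbour matches are unambiguous.

```python
import itertools
import numpy as np

def best_rigid(A, B):
    """Least-squares rotation and translation mapping point set A onto B."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation, det = +1
    return R, cb - R @ ca

def icp(scan, model, iters=20):
    """Brute-force nearest-neighbour point-to-point ICP aligning scan to model."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = scan.copy()
    for _ in range(iters):
        # Match every current scan point to its nearest model point.
        d2 = ((cur[:, None, :] - model[None, :, :]) ** 2).sum(axis=-1)
        matched = model[d2.argmin(axis=1)]
        R, t = best_rigid(cur, matched)
        cur = cur @ R.T + t
        # Compose the incremental transform into the running total.
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Synthetic check: a 3x3x3 grid "model" and a slightly rotated, shifted scan.
model = np.array(list(itertools.product(range(3), repeat=3)), dtype=float) - 1.0
a = np.deg2rad(5.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
scan = model @ R_true.T + np.array([0.03, -0.02, 0.04])
R_est, t_est = icp(scan, model)
```

Note that plain ICP only converges from a good initial guess, which is why a separate first-pose (template matching) step matters for non-cooperative targets.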

    Attitude Determination Using Imaging Lidar

    The purpose of this study is to determine the attitude of an out-of-control object using lidar (Light Detection and Ranging). As the number of spacecraft continues to grow, it is paramount to introduce a new type of autonomous on-orbit satellite inspection and repair involving docking. Traditional space vision technology is based on video systems. This method is limited by the necessity of operating when the target is illuminated by sunlight or by using its own source of illumination. The use of laser imaging technology offers an elegant solution to these challenges. This approach allows the collection of range data while scanning the lidar field-of-view, together with the transmitted laser beam, across the required solid angle. A lidar simulator was implemented to generate point clouds of digital 3D models. This thesis describes methods that can be used to detect features such as edges, boundaries, surfaces and corners in the point cloud. From those features it was possible to define a reference frame and associate it with the object. By observing the evolution of this body frame, the changes in orientation can be deduced in direction cosine matrix form. It was desired to retrieve angular rates in Euler angle form, but since the conversion from rotation matrix to Euler angles is not a bijection, no satisfying results were obtained. The results are therefore expressed in terms of rotation matrices. It was found that the accuracy of the results varied depending on the orientation of the spacecraft. The results indicate that filtering of the direction cosine matrices might yield good data for determining attitude rates.
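The non-bijectivity noted in this abstract can be shown numerically: for a 3-2-1 (yaw-pitch-roll) Euler sequence, at pitch = 90° two different (yaw, roll) pairs with the same difference produce the identical direction cosine matrix, so no unique Euler extraction exists there. The specific angles below are illustrative.

```python
import numpy as np

def dcm_321(yaw, pitch, roll):
    """Rotation matrix for a 3-2-1 (yaw-pitch-roll) Euler angle sequence."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# Two different (yaw, roll) pairs at pitch = 90 deg; only roll - yaw
# is observable at the singularity, and it is 30 deg in both cases.
A = dcm_321(np.deg2rad(10.0), np.pi / 2, np.deg2rad(40.0))
B = dcm_321(np.deg2rad(30.0), np.pi / 2, np.deg2rad(60.0))
print(np.allclose(A, B))  # True: the DCM-to-Euler map has no unique inverse
```

This is the gimbal-lock singularity; away from pitch = ±90° the extraction is unique only up to the usual angle-range conventions, which is consistent with the thesis reporting results as rotation matrices instead.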

    Robotics and AI-Enabled On-Orbit Operations With Future Generation of Small Satellites

    The low cost and short lead time of small satellites have led to their use in science-based missions, earth observation, and interplanetary missions. Today, they are also key instruments in orchestrating technological demonstrations for On-Orbit Operations (O3) such as inspection and spacecraft servicing, with planned roles in active debris removal and on-orbit assembly. This paper provides an overview of the robotics and autonomous systems (RAS) technologies that enable robotic O3 on smallsat platforms. Major RAS topics such as sensing & perception, guidance, navigation & control (GN&C), microgravity mobility and mobile manipulation, and autonomy are discussed from the perspective of relevant past and planned missions.

    CEU Session #4 - Space Robotics for On-Orbit Servicing and Space Debris Removal

    The next ten years will see an unprecedented increase in the number of spacecraft deployed in Earth orbit and the number of commercial ventures operating space assets. The large increase in the number of spacecraft and in the commercial value of space will lead to renewed interest in robotic on-orbit servicing (OOS) and active debris removal (ADR). The lecture will provide a brief overview of the history of crewed and robotic OOS and discuss the missions planned for the near future. It will then proceed to identify the critical enabling technologies for a future, operational OOS and ADR infrastructure, discuss the technical challenges, and present promising concepts and demonstrated technologies that can make routine OOS and ADR a possibility. The focus will be on robotics technologies and spacecraft guidance, navigation and control systems.

    Evaluation of Coarse Sun Sensor in a Miniaturized Distributed Relative Navigation System: An Experimental and Analytical Investigation

    Observing the relative state of two space vehicles has been an active field of research since the earliest attempts at space rendezvous and docking during the 1960s. A number of techniques have been successfully employed by several space agencies, and the importance of these systems has been repeatedly demonstrated during the on-orbit assembly and continuous re-supply of the International Space Station. More recent efforts are focused on technologies that can enable fully automated navigation and control of space vehicles. Technologies which have previously been investigated or are actively researched include Video Guidance Systems (VGS), Light Detection and Ranging (LIDAR), RADAR, Differential GPS (DGPS) and Visual Navigation Systems. The proposed system leverages the theoretical foundation advanced in the development of VisNav, invented at Texas A&M University, and the miniaturized, commercially available Northstar sensor from Evolution Robotics. The dissertation first surveys contemporary technology, followed by an analytical investigation of the coarse sun sensor and the errors associated with utilizing it in the near field. Next, the commercial Northstar sensor is investigated, utilizing fundamentals to generate a theoretical model of its behavior, followed by the development of an experiment for the purpose of investigating and characterizing the sensor's performance. Experimental results are then presented and compared with a numerical simulation of single-sensor system performance. A case study of a two-sensor implementation is presented, evaluating the proposed system's performance in a multisensor configuration. The initial theoretical analysis relied on use of the cosine model, which proved inadequate in fully capturing the response of the coarse sun sensor. Fresnel effects were identified as a significant source of unmodeled sensor behavior and were subsequently incorporated into the model.
Additionally, near-field effects were studied and modeled. The near-field effects of significance include unequal incidence angle, unequal incidence power, and non-uniform radiated power. It was found that the sensor displayed inherent instabilities in the 0.3 degree range. However, it was also shown that the sensor could be calibrated to this level. Methods for accomplishing calibration of the sensor in the near field were introduced, and the feasibility of achieving better than 1 cm and 1 degree relative position and attitude accuracy in close proximity, even on a small satellite platform, was determined.
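The inadequacy of the pure cosine model can be illustrated with standard Fresnel equations: a detector behind a dielectric window also sees an angle-dependent transmittance that suppresses the response at large incidence angles. This sketch uses the textbook unpolarized air-to-dielectric formulas; the refractive index n = 1.5 is an illustrative assumption, not a value from the dissertation.

```python
import numpy as np

def fresnel_transmittance(theta, n=1.5):
    """Unpolarized Fresnel power transmittance from air into a dielectric."""
    theta_t = np.arcsin(np.sin(theta) / n)  # Snell's law refraction angle
    # Amplitude reflection coefficients for s- and p-polarization.
    rs = (np.cos(theta) - n * np.cos(theta_t)) / (np.cos(theta) + n * np.cos(theta_t))
    rp = (np.cos(theta_t) - n * np.cos(theta)) / (np.cos(theta_t) + n * np.cos(theta))
    # Transmitted power = 1 - unpolarized reflectance.
    return 1.0 - 0.5 * (rs**2 + rp**2)

def sensor_response(theta, n=1.5):
    """Cosine sun-sensor model corrected by the Fresnel transmittance."""
    return np.cos(theta) * fresnel_transmittance(theta, n)
```

At normal incidence the Fresnel term is only a gain factor (0.96 for n = 1.5), so it is invisible to a cosine-only fit; at large incidence angles it rolls the response off faster than cos(theta) alone, producing exactly the kind of systematic residual described above.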