Achieving Real-Time Mode Estimation through Offline Compilation
As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex, which increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, thus achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components, a process limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes explicit the underlying system models, which remain implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled in an offline process into a set of concurrent, localized diagnostic rules. These are then combined online with sensor information to reconstruct the diagnosis of the system. These rules enable a human to inspect the diagnostic consequences of CME.
Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, resulting in detection of failures beyond that of the rules. This evolution in fault detection will enable future missions to explore the farthest reaches of the solar system without the burden of human intervention to repair failed components.
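The offline/online split described above can be illustrated with a toy sketch (the component models and observation names here are invented for illustration, not drawn from CME itself): each component model is inverted once, offline, into observation-to-mode lookup rules, so that online diagnosis reduces to fast per-component table lookups rather than runtime model reasoning.

```python
# Toy sketch of offline rule compilation for mode estimation.
# COMPONENT_MODELS maps each component's mode to the observation it produces;
# these models are hypothetical examples, not actual CME spacecraft models.
COMPONENT_MODELS = {
    "valve":  {"open": "flow", "closed": "no_flow", "stuck": "no_flow"},
    "sensor": {"ok": "reading", "failed": "no_reading"},
}

def compile_rules(models):
    """Offline step: invert each component model into
    observation -> set of candidate modes."""
    rules = {}
    for comp, model in models.items():
        inv = {}
        for mode, obs in model.items():
            inv.setdefault(obs, set()).add(mode)
        rules[comp] = inv
    return rules

def diagnose(rules, observations):
    """Online step: per-component table lookup, no model reasoning at runtime."""
    return {comp: sorted(rules[comp].get(obs, set()))
            for comp, obs in observations.items()}
```

Note how an ambiguous observation ("no_flow") yields multiple candidate modes, mirroring how compiled diagnostic rules keep the fault hypotheses a human can inspect.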
Achieving Real-Time Mode Estimation through Offline Compilation
Thesis (S.M.), Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2003. Includes bibliographical references (p. 225-226). By John M. Van Eepoel.
Range and Intensity Image-Based Terrain and Vehicle Relative Pose Estimation System
A navigation system includes an image acquisition device for acquiring a range image of a target vehicle, at least one processor, and a memory containing a target vehicle model and computer readable program code. The processor and the computer readable program code are configured to cause the navigation system to convert the range image to a three-dimensional point cloud, compute a transform from the target vehicle model to the point cloud, and use the transform to estimate the target vehicle's attitude and position for capture of the target vehicle.
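The model-to-point-cloud transform at the heart of such a system can be sketched as a least-squares rigid registration (the Kabsch/Procrustes solution). This is a simplified illustration that assumes point correspondences between the model and the cloud are already known; a full pipeline such as ICP would iterate between correspondence assignment and this alignment step.

```python
import numpy as np

def estimate_pose(model_pts, cloud_pts):
    """Least-squares rigid transform (Kabsch) from model points to cloud points.

    Assumes model_pts[i] corresponds to cloud_pts[i] (N x 3 arrays).
    Returns rotation R and translation t such that cloud ~= model @ R.T + t,
    i.e., the target's attitude (R) and position (t) relative to the model frame.
    """
    mu_m = model_pts.mean(axis=0)
    mu_c = cloud_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (model_pts - mu_m).T @ (cloud_pts - mu_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_c - R @ mu_m
    return R, t
```

The SVD-based closed form is the standard choice here because it is non-iterative and numerically robust, which matters for real-time capture operations.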
Flight Results from the HST SM4 Relative Navigation Sensor System
On May 11, 2009, Space Shuttle Atlantis roared off of Launch Pad 39A en route to the Hubble Space Telescope (HST) to undertake its final servicing of HST, Servicing Mission 4. Onboard Atlantis was a small payload called the Relative Navigation Sensor experiment, which included three cameras of varying focal ranges and avionics to record images and estimate, in real time, the relative position and attitude (aka "pose") of the telescope during rendezvous and deployment. The avionics package, known as SpaceCube and developed at the Goddard Space Flight Center, performed image processing using field programmable gate arrays to accelerate this process, and in addition executed two different pose algorithms in parallel: the Goddard Natural Feature Image Recognition algorithm and the ULTOR Passive Pose and Position Engine (P3E) algorithm.
Performance Characterization of a Landmark Measurement System for ARRM Terrain Relative Navigation
This paper describes the landmark measurement system being developed for terrain relative navigation on NASA's Asteroid Redirect Robotic Mission (ARRM), and the results of a performance characterization study given realistic navigational and model errors. The system is called Retina, and is derived from the stereophotoclinometry methods widely used on other small-body missions. The system is simulated using synthetic imagery of the asteroid surface, and various algorithmic design choices are discussed. Unlike other missions, ARRM's Retina is the first planned autonomous use of these methods during the close-proximity and descent phase of a mission.
Raven: An On-Orbit Relative Navigation Demonstration Using International Space Station Visiting Vehicles
Since the last Hubble Servicing Mission five years ago, the Satellite Servicing Capabilities Office (SSCO) at the NASA Goddard Space Flight Center (GSFC) has been focusing on maturing the technologies necessary to robotically service orbiting legacy assets: spacecraft not necessarily designed for in-flight service. Raven, SSCO's next orbital experiment to the International Space Station (ISS), is a real-time autonomous non-cooperative relative navigation system that will mature the estimation algorithms required for rendezvous and proximity operations for a satellite-servicing mission. Raven will fly as a hosted payload as part of the Space Test Program's STP-H5 mission; it will be mounted on an external ExPRESS Logistics Carrier (ELC) and will image the many visiting vehicles arriving at and departing from the ISS as targets for observation. Raven will host multiple sensors: a visible camera with a variable field of view lens, a long-wave infrared camera, and a short-wave flash lidar. This sensor suite can be pointed via a two-axis gimbal to provide a wide field of regard to track the visiting vehicles as they make their approach. Various real-time vision processing algorithms will produce range, bearing, and six degree of freedom pose measurements that will be processed in a relative navigation filter to produce an optimal relative state estimate. In this overview paper, we cover top-level requirements, the experimental concept of operations, the system design, and the status of Raven integration and test activities.
Satellite Servicing's Autonomous Rendezvous and Docking Testbed on the International Space Station
The Space Servicing Capabilities Project (SSCP) at NASA's Goddard Space Flight Center (GSFC) has been tasked with developing systems for servicing space assets. Starting in 2009, the SSCP completed a study documenting potential customers and the business case for servicing, as well as defining several notional missions and required technologies. In 2010, SSCP moved to the implementation stage by completing several ground demonstrations and commencing development of two International Space Station (ISS) payloads, the Robotic Refueling Mission (RRM) and the Dextre Pointing Package (DPP), to mitigate new technology risks for a robotic mission to service existing assets in geosynchronous orbit. This paper introduces the DPP, scheduled to fly in July of 2012 on the third operational SpaceX Dragon mission, and its Autonomous Rendezvous and Docking (AR&D) instruments. The combination of sensors and advanced avionics provides valuable on-orbit demonstrations of essential technologies for servicing existing vehicles, both cooperative and non-cooperative.
Fast Kalman Filtering for Relative Spacecraft Position and Attitude Estimation for the Raven ISS Hosted Payload
The Raven ISS Hosted Payload will feature several pose measurement sensors on a pan/tilt gimbal which will be used to autonomously track resupply vehicles as they approach and depart the International Space Station. This paper discusses the derivation of a Relative Navigation Filter (RNF) to fuse measurements from the different pose measurement sensors and produce relative position and attitude estimates. The RNF relies on relative translation and orientation kinematics and careful pose sensor modeling to eliminate dependence on orbital position information and associated orbital dynamics models. The filter state is augmented with sensor biases to provide a mechanism for the filter to estimate and mitigate the offsets between the measurements from different pose sensors.
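The bias-augmentation idea can be illustrated with a minimal linear Kalman filter (a hypothetical 1-D example, not the actual RNF formulation): the state carries both a relative position and the bias of one sensor, so fusing two sensors that observe the same quantity lets the filter separate the true position from the constant offset.

```python
import numpy as np

def run_filter(meas_a, meas_b):
    """Minimal bias-augmented Kalman filter sketch. State = [position, bias_b].
    Sensor A measures position directly; sensor B measures position + bias.
    Illustrative 1-D example only; Raven's RNF estimates full 6-DOF relative pose."""
    x = np.zeros(2)                  # [position, sensor-B bias]
    P = np.eye(2) * 10.0             # large initial uncertainty
    F = np.eye(2)                    # static motion model, for simplicity
    Q = np.diag([1e-2, 1e-6])        # position wanders; bias nearly constant
    H_a = np.array([[1.0, 0.0]])     # sensor A observes position only
    H_b = np.array([[1.0, 1.0]])     # sensor B observes position + its bias
    R_a, R_b = 0.05, 0.05            # measurement noise variances
    for za, zb in zip(meas_a, meas_b):
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Sequentially update with each sensor's scalar measurement
        for H, z, R in ((H_a, za, R_a), (H_b, zb, R_b)):
            y = z - H @ x            # innovation
            S = H @ P @ H.T + R      # innovation covariance (1x1)
            K = P @ H.T / S          # Kalman gain (2x1)
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
    return x
```

Because sensor A pins down the position while sensor B's innovations then expose its own offset, the augmented state is observable and the filter converges to both the position and the bias.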
Pose Measurement Performance of the Argon Relative Navigation Sensor Suite in Simulated Flight Conditions
Argon is a flight-ready sensor suite with two visual cameras, a flash LIDAR, an on-board flight computer, and associated electronics. Argon was designed to provide sensing capabilities for relative navigation during proximity, rendezvous, and docking operations between spacecraft. A rigorous ground test campaign assessed the capability of the Argon navigation suite to measure the relative pose of high-fidelity satellite mock-ups during a variety of simulated rendezvous and proximity maneuvers, facilitated by robot manipulators under lighting conditions representative of the orbital environment. A brief description of the Argon suite and test setup is given, as well as an analysis of the performance of the system in simulated proximity and rendezvous operations.
Relative Terrain Imaging Navigation for the Asteroid Redirect Robotic Mission (ARRM)
No abstract available.