
    Doctor of Philosophy

    The need for position and orientation information in a wide variety of applications has led to the development of equally varied methods for providing it. Amongst the alternatives, inertial navigation is a solution that offers self-contained operation and provides angular rate, orientation, acceleration, velocity, and position information. Until recently, the size, cost, and weight of inertial sensors have limited their use to vehicles with relatively large payload capacities and instrumentation budgets. However, the development of microelectromechanical system (MEMS) inertial sensors now offers the possibility of using inertial measurement in smaller, even human-scale, applications. Though much progress has been made toward this goal, there are still many obstacles. While operating independently from any outside reference, inertial measurement suffers from unbounded errors that grow at rates up to cubic in time. Since the reduced size and cost of these new miniaturized sensors come at the expense of accuracy and stability, the problem of error accumulation becomes more acute. Nevertheless, researchers have demonstrated that useful results can be obtained in real-world applications. The research presented herein provides several contributions to the development of human-scale inertial navigation. A calibration technique allowing complex sensor models to be identified using inexpensive hardware and linear solution techniques has been developed. This is shown to provide significant improvements in the accuracy of the calibrated outputs from MEMS inertial sensors. Error correction algorithms based on easily identifiable characteristics of the sensor outputs have also been developed. These are demonstrated in both one- and three-dimensional navigation. The results show significant improvements in the levels of accuracy that can be obtained using these inexpensive sensors.
    The algorithms also eliminate the empirical, application-specific simplifications and heuristics upon which many existing techniques have depended, making inertial navigation a more viable solution for tracking the motion around us.
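
The cubic-in-time growth mentioned in this abstract follows directly from integrating constant sensor errors. As an illustrative sketch (the bias magnitudes below are hypothetical, not taken from the thesis), a constant accelerometer bias integrates twice into a quadratic position error, while a constant gyro bias tilts the attitude estimate so that gravity leaks into the horizontal channels and integrates into a cubic term:

```python
import math

def accel_bias_position_error(bias, t):
    """Constant accelerometer bias b integrates twice: 0.5 * b * t^2."""
    return 0.5 * bias * t ** 2

def gyro_bias_position_error(gyro_bias, t, g=9.81):
    """Constant gyro bias w tilts the platform by w*t; the misresolved
    gravity g*w*t then integrates twice: (1/6) * g * w * t^3."""
    return g * gyro_bias * t ** 3 / 6.0

# Hypothetical MEMS-grade biases over one minute of dead reckoning
t = 60.0
print(accel_bias_position_error(0.01, t))               # 18.0 m
print(gyro_bias_position_error(math.radians(0.01), t))  # ~61.6 m
```

Even these modest biases dominate after a minute, which is why the calibration and error-correction contributions above matter so much for low-cost sensors.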

    Development and Flight of a Robust Optical-Inertial Navigation System Using Low-Cost Sensors

    This research develops and tests a precision navigation algorithm fusing optical and inertial measurements of unknown objects at unknown locations. It provides an alternative to the Global Positioning System (GPS) as a precision navigation source, enabling passive and low-cost navigation in situations where GPS is denied or unavailable. This work makes two new contributions. First, a rigorous study of the fundamental nature of optical/inertial navigation is accomplished by examining the observability Gramian of the underlying measurement equations. This analysis yields a set of design principles guiding the development of optical/inertial navigation algorithms. The second contribution is the development and flight test of an optical-inertial navigation system using low-cost, passive sensors (including an inexpensive commercial-grade inertial sensor, which is unsuitable for navigation by itself). This prototype system was built and flight tested at the U.S. Air Force Test Pilot School. The algorithm that was implemented leveraged the design principles described above and used images from a single camera. It was shown (and explained by the observability analysis) that the system gained significant performance by aiding it with a barometric altimeter and magnetic compass, and by using a digital terrain database (DTED). The still low-cost and passive system demonstrated performance comparable to high-quality navigation-grade inertial navigation systems, which cost an order of magnitude more than this optical-inertial prototype. The resultant performance of the system tested provides a robust and practical navigation solution for Air Force aircraft.
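
The kind of observability analysis this abstract describes can be illustrated, in a much simplified linear form, by a rank test on the stacked observability matrix. The toy constant-velocity model below is illustrative only, not the thesis's optical/inertial measurement equations:

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, ..., CA^(n-1); full column rank => state observable."""
    n = A.shape[0]
    blocks = [C @ np.linalg.matrix_power(A, k) for k in range(n)]
    return np.vstack(blocks)

# Toy 1-D constant-velocity model: state = [position, velocity]
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])

C_pos = np.array([[1.0, 0.0]])   # measuring position: both states observable
C_vel = np.array([[0.0, 1.0]])   # measuring only velocity: position is not

print(np.linalg.matrix_rank(observability_matrix(A, C_pos)))  # 2
print(np.linalg.matrix_rank(observability_matrix(A, C_vel)))  # 1
```

The rank deficiency in the second case is the same phenomenon, in miniature, that motivates aiding the optical/inertial system with an altimeter, compass, and terrain database.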

    Innovative Solutions for Navigation and Mission Management of Unmanned Aircraft Systems

    The last decades have witnessed a significant increase in Unmanned Aircraft Systems (UAS) of all shapes and sizes. UAS are finding many new applications in support of human activities, offering solutions to many dirty, dull, and dangerous missions carried out by military and civilian users. However, limited access to the airspace is the principal barrier to realizing the full potential of UAS capabilities. The aim of this thesis is to support the safe integration of UAS operations, taking into account both user requirements and flight regulations. The main technical and operational issues, considered among the principal inhibitors to the integration and widespread acceptance of UAS, are identified, and two solutions for safe UAS operations are proposed. A. Improving navigation performance of UAS by exploiting low-cost sensors. To enhance the performance of the low-cost and light-weight integrated navigation system based on Global Navigation Satellite System (GNSS) and Micro Electro-Mechanical Systems (MEMS) inertial sensors, an efficient calibration method for MEMS inertial sensors is required. Two solutions are proposed: 1) the innovative Thermal Compensated Zero Velocity Update (TCZUPT) filter, which embeds the compensation of thermal effects on bias in the filter itself and uses back-propagation neural networks to build the calibration function. Experimental results show that the TCZUPT filter is faster than the traditional ZUPT filter in mapping significant bias variations and presents better performance over the overall testing period. Moreover, no calibration pre-processing stage is required to keep measurement drift under control, improving the accuracy, reliability, and maintainability of the processing software; 2) a redundant configuration of consumer-grade inertial sensors to obtain a self-calibration of typical inertial sensor biases.
The result is a significant reduction of uncertainty in attitude determination. In conclusion, both methods improve dead-reckoning performance for handling intermittent GNSS coverage. B. Proposing novel solutions for mission management to support the Unmanned Traffic Management (UTM) system in monitoring and coordinating the operations of a large number of UAS. Two solutions are proposed: 1) a trajectory prediction tool for small UAS, based on Learning Vector Quantization (LVQ) neural networks. By exploiting flight data collected while the UAS executes a pre-assigned flight path, the tool is able to predict the time taken to fly generic trajectory elements. Moreover, being self-adaptive in constructing a mathematical model, LVQ neural networks allow different models to be created for different UAS types under several environmental conditions; 2) a software tool supporting a standardized decision-making procedure for identifying UAS/payload configurations suitable for any type of mission that can be authorized under standing flight regulations. The proposed methods improve the management and safe operation of large-scale UAS missions, speeding up the flight authorization process by the UTM system and supporting the increasing level of autonomy in UAS operations.
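
For readers unfamiliar with the baseline technique the TCZUPT filter improves upon: a zero velocity update (ZUPT), in its simplest form, detects stationary periods from the specific-force magnitude and pins the velocity estimate to zero there. The 1-D sketch below is a minimal illustration with a hypothetical threshold and bias; the thermal compensation and neural-network calibration of the TCZUPT filter are not modeled:

```python
def zupt_dead_reckon(accel, dt, g=9.81, tol=0.05):
    """1-D dead reckoning with zero-velocity updates.

    accel: specific-force magnitudes (m/s^2). When |a - g| < tol the
    sensor is assumed stationary and velocity is reset to zero, which
    bounds velocity drift between detected stance phases.
    """
    v, velocities = 0.0, []
    for a in accel:
        if abs(a - g) < tol:
            v = 0.0                # zero-velocity update: reset the drift
        else:
            v += (a - g) * dt      # integrate the net specific force
        velocities.append(v)
    return velocities

# Stationary samples with a small bias: ZUPT keeps velocity pinned at zero,
# whereas plain integration would accumulate 0.02 m/s^2 of bias indefinitely
biased = [9.83] * 100
print(zupt_dead_reckon(biased, 0.01)[-1])   # 0.0
```

Shrinking `tol` below the bias magnitude disables the update and the drift reappears, which is exactly the failure mode that bias compensation in the filter is meant to prevent.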

    Coupling Vanishing Point Tracking with Inertial Navigation to Estimate Attitude in a Structured Environment

    This research aims to obtain accurate and stable estimates of a vehicle's attitude by coupling consumer-grade inertial and optical sensors. This goal is pursued by first modeling both the inertial and optical sensors and then developing a technique for identifying vanishing points in perspective images of a structured environment. The inertial and optical processes are then coupled to enable each one to aid the other. The vanishing point measurements are combined with the inertial data in an extended Kalman filter to produce overall attitude estimates. This technique is experimentally demonstrated in an indoor corridor setting using a motion profile designed to simulate flight. Through comparison with a tactical-grade inertial sensor, the combined consumer-grade inertial and optical data are shown to produce a stable attitude solution accurate to within 1.5 degrees. A measurement bias is manifested which degrades the accuracy by up to another 2.5 degrees.
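
A common way to estimate a vanishing point, sketched here as a generic illustration rather than as the thesis's detector, is to intersect the supporting lines of edge segments in homogeneous coordinates: each line is the cross product of its endpoints, and the point best satisfying all line equations is the smallest right singular vector of the stacked line matrix:

```python
import numpy as np

def vanishing_point(segments):
    """Estimate the common intersection of a pencil of image lines.

    segments: list of ((x1, y1), (x2, y2)). Each line in homogeneous
    coordinates is the cross product of its endpoints; the point best
    satisfying l.T @ p = 0 for all lines is the smallest right singular
    vector of the stacked line matrix.
    """
    lines = []
    for (x1, y1), (x2, y2) in segments:
        lines.append(np.cross([x1, y1, 1.0], [x2, y2, 1.0]))
    _, _, vt = np.linalg.svd(np.array(lines))
    p = vt[-1]
    return p[0] / p[2], p[1] / p[2]

# Three lines that all pass through (5, 3) converge there
segs = [((0.0, 0.0), (10.0, 6.0)),
        ((0.0, 6.0), (10.0, 0.0)),
        ((5.0, 0.0), (5.0, 1.0))]
print(vanishing_point(segs))  # approximately (5.0, 3.0)
```

In a corridor scene the segments would come from an edge detector along the dominant structural directions, and the recovered vanishing point constrains attitude relative to the building frame.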

    Multimodal Noncontact Tracking of Surgical Instruments

    For many procedures, open surgery is being replaced with minimally invasive surgical (MIS) techniques. The advantages of MIS include reduced operative trauma and fewer complications, leading to faster patient recovery, better cosmetic results, and shorter hospital stays. As the demand for MIS procedures increases, effective surgical training tools must be developed to improve procedure efficiency and patient safety. Motion tracking of laparoscopic instruments can provide objective skills assessment for novices and experienced users. The most common approaches to noncontact motion capture are optical and electromagnetic (EM) tracking systems, though each approach has operational limitations. Optical trackers are prone to occlusion, and the performance of EM trackers degrades in the presence of magnetic and ferromagnetic material. The cost of these systems also limits their availability for surgical training and clinical environments. This thesis describes the development and validation of a novel, noncontact laparoscopic tracking system as an inexpensive alternative to current technology. The system is based on the fusion of inertial, magnetic, and distance sensing to generate real-time, 6-DOF pose data. Orientation is estimated using a Kalman-filtered attitude-heading reference system (AHRS), and restricted motion at the trocar provides a datum from which position information can be recovered. The Inertial and Range-Enhanced Surgical (IRES) Tracker was prototyped, then validated using a MIS training box and by comparison to an EM tracking system. Results of IRES tracker testing showed performance similar to an EM tracker, with position error as low as 1.25 mm RMS and orientation error <0.58 degrees RMS along each axis. The IRES tracker also displayed greater precision and superior magnetic interference rejection capabilities. At a fraction of the cost of current laparoscopic tracking methods, the IRES tracking system would provide an excellent alternative for use in surgical training and skills assessment.
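
The trocar constraint that makes position recoverable can be sketched in a few lines: the trocar fixes a point on the instrument shaft, so orientation (from the AHRS) plus insertion depth (from the range sensor) determine the tip position. This is only a geometric illustration of the idea; the function and frame conventions below are assumptions, not the IRES implementation:

```python
import numpy as np

def tip_position(trocar, shaft_direction, depth):
    """Recover the instrument tip from the trocar pivot constraint.

    The trocar fixes a point on the shaft, so the tip lies at
    trocar + depth * (unit shaft direction). Orientation would come
    from the AHRS, insertion depth from the range sensor.
    """
    d = np.asarray(shaft_direction, dtype=float)
    return np.asarray(trocar, dtype=float) + depth * d / np.linalg.norm(d)

# Trocar at the origin, instrument pointing straight down, 50 mm inserted
print(tip_position([0.0, 0.0, 0.0], [0.0, 0.0, -1.0], 50.0))  # tip at (0, 0, -50)
```

Because the pivot point is a fixed datum, a full 3-D position fix needs only one scalar range measurement once orientation is known, which is what makes the sensor suite so inexpensive.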

    Development and Validation of an IMU/GPS/Galileo Integration Navigation System for UAV

    Several distinct Unmanned Aerial Vehicle (UAV) applications are emerging, demanding steps to be taken to allow those platforms to operate in un-segregated airspace. The key risk component hindering the widespread integration of UAV in un-segregated airspace is autonomy: the need for a level of autonomy in the UAV high enough to guarantee safe and secure integration. Here, accurate UAV state estimation plays a fundamental role for autonomous UAV, being one of the main responsibilities of the onboard autopilot. Given the 21st-century global economic paradigm, academic projects based on inexpensive UAV platforms but expensive commercial autopilots are becoming uneconomical. Consequently, there is a pressing need to overcome this problem through, on the one hand, the development of navigation systems using the wide availability of low-cost, low-power-consumption, small-size navigation sensors offered on the market and, on the other hand, the use of Global Navigation Satellite System Software Receivers (GNSS SR). Since the performance required to allow UAV to fly in un-segregated airspace is not yet defined for several applications, for most UAV academic applications the navigation system accuracy should be at least the same as that provided by the available commercial autopilots. This research investigates the performance of an integrated navigation system composed of a low-performance inertial measurement unit (IMU) and a GNSS SR. A strapdown mechanization algorithm, transforming raw inertial data into a navigation solution, was developed, implemented, and evaluated. To fuse the data provided by the strapdown algorithm with that provided by the GNSS SR, an Extended Kalman Filter (EKF) was implemented in a loosely coupled closed-loop architecture and then evaluated.
Moreover, to improve the quality of the IMU raw data, Allan variance and denoising techniques were considered, both for studying the IMU error model and for improving the inertial sensors' raw measurements. To carry out the study, a starting question was posed and, based on it, eight questions were derived. These eight secondary questions led to five hypotheses, which have been successfully tested throughout the thesis. This research provides a deliverable to the Project of Research and Technologies on Unmanned Air Vehicles (PITVANT) Group, consisting of a well-documented UAV navigation algorithm, implemented and evaluated in the MATLAB environment, together with Allan variance and denoising algorithms to improve inertial raw data, enabling full implementation in the existing Portuguese Air Force Academy (PAFA) UAV. The deliverable provided by this thesis answers the main research question by setting out a step-by-step procedure for how the Strapdown IMU (SIMU)/GNSS SR should be developed and implemented in order to replace the commercial autopilot. The developed integrated SIMU/GNSS SR solution was evaluated in post-processing mode, in a van-test scenario using real data signals, at the Galileo Test and Development Environment (GATE) test area in Berchtesgaden, Germany; when confronted with the solution provided by the commercial autopilot, it proved to be of better quality. Although centimetre-level accuracy was not obtained for position and velocity, the results confirm that the integration strategy outperforms the Piccolo system, which was the ultimate goal of this research work.
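
The Allan variance used to study the IMU error model has a compact definition: average the signal over clusters of a given length and take half the mean squared difference of adjacent cluster means. Below is a minimal non-overlapped sketch; a real analysis would typically use overlapped estimates over a sweep of cluster sizes:

```python
import numpy as np

def allan_variance(data, m):
    """Non-overlapped Allan variance for clusters of m samples.

    Averages the signal over consecutive clusters of m samples and
    returns half the mean squared difference of adjacent cluster means.
    """
    n = len(data) // m
    means = np.asarray(data[:n * m]).reshape(n, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

# For white noise the Allan variance falls roughly as 1/m, the classic
# -1/2 slope (angle/velocity random walk) on a log-log Allan deviation plot
rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)
print(allan_variance(noise, 10) > allan_variance(noise, 1000))  # True
```

Plotting the Allan deviation across cluster sizes separates noise terms by their slopes (random walk, bias instability, rate random walk), which is how the IMU error model parameters are read off.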

    Vision-Aided Autonomous Precision Weapon Terminal Guidance Using a Tightly-Coupled INS and Predictive Rendering Techniques

    This thesis documents the development of the Vision-Aided Navigation using Statistical Predictive Rendering (VANSPR) algorithm, which seeks to enhance the endgame navigation solution possible by inertial measurements alone. The eventual goal is a precision weapon that does not rely on GPS, functions autonomously, thrives in complex 3-D environments, and is impervious to jamming. The predictive rendering is performed by viewpoint manipulation of computer-generated images of target objects. A navigation solution is determined by an Unscented Kalman Filter (UKF) which corrects positional errors by comparing camera images with a collection of statistically significant virtual images. Results indicate that the test algorithm is a viable method of aiding an inertial-only navigation system to achieve the precision necessary for most tactical strikes. On 14 flight test runs, the average positional error was 166 feet at endgame, compared with an inertial-only error of 411 feet.
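
At the heart of the approach is the UKF, which propagates a deterministic set of sigma points through the nonlinear model instead of linearizing it. The sketch below is the generic textbook unscented transform on a toy function, not the VANSPR filter or its rendering-based measurement model:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through f via sigma points."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)       # scaled matrix square root
    sigmas = [mean] + [mean + L[:, i] for i in range(n)] \
                    + [mean - L[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))      # mean weights
    wc = wm.copy()                                # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    ys = np.array([f(s) for s in sigmas])
    y_mean = wm @ ys
    y_cov = sum(w * np.outer(y - y_mean, y - y_mean)
                for w, y in zip(wc, ys))
    return y_mean, y_cov

# For a linear map the transform is exact: mean doubles, covariance quadruples
ym, yc = unscented_transform(np.array([1.0, 0.0]), np.eye(2), lambda x: 2 * x)
```

In a rendering-based filter, `f` would be replaced by the image-prediction pipeline, and the sigma points would each generate a candidate virtual view to compare against the camera image.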

    Automatic Landing without GPS

    Sagem Défense et Sécurité (now Safran Electronics & Defense), a French space and defense company of the SAFRAN group, is working on the next generation of Unmanned Aerial Systems (UAS). This UAS features a fully automatic Unmanned Aerial Vehicle (UAV) equipped with a state-of-the-art navigation system. The navigation system relies mainly on a high-accuracy Inertial Measurement Unit (IMU) coupled with a GPS receiver. But GPS is known to be easy to disrupt, whether naturally (by a solar flare, for example) or through intentional jamming. In the event of a loss of GPS signal, the navigation system is no longer able to provide accurate position and speed information to the Flight Controller (FC). Deprived of reliable position and speed information, the FC cannot guide the UAV safely to the ground. The goal of the project detailed in this report is therefore to add to the existing UAS the ability to land safely in the event of a GPS loss. At the core of the solution described in this report is a sensor fusion algorithm taking as input inertial, vision-based, barometric, laser, and azimuthal measurements. The filter uses all these measurements to establish reliable position and speed estimates. Although very reliable systems enabling automatic landing without GPS exist today, they all require heavy and expensive ground equipment. This is why SAGEM decided to develop its own solution using more embedded sensors and less ground equipment. This is a first step toward a fully embedded automatic landing system independent of GPS availability, a very active field of research today. All the tests done during the thesis and presented in this report show the efficiency and robustness of this solution.
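
Among the fused measurements, the barometric channel is the simplest to sketch: under the International Standard Atmosphere model, static pressure maps to altitude with the usual closed form. The constants below are the standard ISA values, not anything specific to the system described above:

```python
def pressure_altitude(p_hpa, p0_hpa=1013.25):
    """International Standard Atmosphere pressure altitude in metres.

    Inverts the ISA troposphere pressure law: h = 44330 * (1 - (p/p0)^(1/5.255)).
    p0_hpa is sea-level reference pressure; real systems calibrate it locally.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

print(pressure_altitude(1013.25))  # 0.0 at the reference pressure
```

On its own this channel is noisy and weather-dependent, which is why it is fused with the inertial, vision, laser, and azimuthal measurements rather than used directly.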

    Head Tracking for 3D Audio Using a GPS-Aided MEMS IMU

    Audio systems have been developed which use stereo headphones to project sound in three dimensions. When using these 3D audio systems, audio cues sound as though they originate from a particular direction. There is a desire to apply 3D audio to general aviation applications, such as projecting control tower transmissions in the direction of the tower or providing an audio orientation cue for VFR pilots who find themselves in emergency zero-visibility conditions. 3D audio systems, however, require real-time knowledge of the pilot's head orientation in order to be effective. This research describes the development and testing of a low-cost head tracking system for 3D audio rendering applied in general aviation. The system uses a low-cost MEMS IMU combined with a low-cost, single-frequency GPS receiver. Real-time data from both of these systems was sent to a laptop computer where a real-time Kalman filter was implemented in MATLAB to solve for position, velocity, and attitude. The attitude information was then sent to a 3D audio system for sound direction rendering. The system was flight tested on board a Raytheon C-12C aircraft. The accuracy of the system was measured by comparing its output to truth data from a high-accuracy post-processed navigation-grade INS/DGPS solution. Results showed that roll and pitch were accurate to within 1-2 degrees, but that heading error was dependent upon the flight trajectory. During straight-and-level flight, the heading error would drift up to 10-15 degrees because of heading unobservability. However, even with heading error, the ability of a pilot to determine the correct direction of a 3D audio cue was significantly improved when using the developed head tracking system rather than the navigation-grade INS/GPS system fixed to the aircraft.
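
The heading unobservability seen in straight-and-level flight has a simple geometric explanation: roll and pitch are observable from the gravity vector alone, while heading rotates about gravity and needs an extra reference (a maneuvering trajectory, a magnetometer, ...). The static-tilt sketch below is a standard textbook construction, not the flight-test filter:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch (radians) from a static accelerometer triad.

    Gravity fixes roll and pitch, so both are observable from the
    specific force alone; heading rotates about gravity and is not.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Level and stationary: the sensor sees +1 g on its z axis, so zero tilt
roll, pitch = tilt_from_accel(0.0, 0.0, 9.81)
```

This is why the roll and pitch channels stayed within 1-2 degrees while heading drifted: during unaccelerated flight no measurement in the filter rotates with yaw, so the heading error is free to wander until the trajectory changes.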

    Automated Driftmeter Fused with Inertial Navigation

    The motivation of this research is to address the use of bearing-only measurements taken by an optical sensor to aid an Inertial Navigation System (INS) whose accelerometers and gyroscopes are subject to drift and bias errors. The concept of Simultaneous Localization And Mapping (SLAM) is employed in a bootstrapping manner: the bearing measurements are used to geolocate ground features, following which the bearings taken over time of the said ground features are used to improve the navigation state provided by the INS. In this research, the INS-aiding action of tracking stationary, but unknown, ground features over time is evaluated. It does not, however, address the critical image registration issue associated with image processing. It is assumed that stationary ground features can be detected and tracked as pixel representations by a real-time image processing algorithm. Simulations are performed which indicate the potential of this research. It is shown that during wings-level flight at constant speed and fixed altitude, an aircraft that geolocates and tracks ground objects can significantly reduce the error in two of its three dimensions of flight, relative to an Earth-fixed navigation frame. The aiding action of geolocating and tracking ground features in line with the direction of flight, with a downward-facing camera, did not provide improvement in the aircraft's x-position estimate. However, the aircraft's y-position estimate, as well as the altitude estimate, were significantly improved.
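
The geolocation step can be sketched under a flat-earth assumption: intersect the camera's line-of-sight ray with the ground plane, given the aircraft's position and altitude. This is an illustrative simplification of the bootstrapping idea; the function and frame conventions below are made up for the sketch:

```python
import numpy as np

def geolocate(aircraft_pos, ray_dir):
    """Intersect a bearing ray with the flat ground plane z = 0.

    aircraft_pos: (x, y, z) with z = altitude above ground.
    ray_dir: direction of the line of sight (must point downward).
    """
    p = np.asarray(aircraft_pos, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    if d[2] >= 0:
        raise ValueError("line of sight must point toward the ground")
    t = -p[2] / d[2]               # ray parameter where z reaches 0
    return p + t * d

# 1000 m altitude, looking 45 degrees forward-down along the x axis
print(geolocate([0.0, 0.0, 1000.0], [1.0, 0.0, -1.0]))  # -> (1000, 0, 0)
```

Once a feature is geolocated this way, repeated bearings to it over time constrain the INS error states, which is the aiding loop the abstract describes.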