
    Inertial Navigation System Aiding Using Vision

    The aiding of an INS using measurements over time of the line of sight of ground features as they come into view of an onboard camera is investigated. The objective is to quantify the reduction in the navigation states' errors achieved by using bearings-only measurements over time of terrain features in the aircraft's field of view. INS aiding is achieved through the use of a Kalman Filter. The design of the Kalman Filter is presented, and it is shown that during a long-range, wings-level cruising flight at constant velocity and altitude, a 90% reduction in the aided INS-calculated navigation state errors, compared to a free INS, is possible.
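
    The aiding loop described above can be sketched as a standard linear Kalman filter predict/update cycle. The model below is a generic two-state (position/velocity error) illustration with made-up noise values and geometry, not the paper's actual filter design.

```python
import numpy as np

def kf_step(x, P, F, Q, z, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate the INS error state and its covariance.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: fuse one bearing-derived measurement of the error.
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # position/velocity error model
Q = 1e-4 * np.eye(2)
H = np.array([[1.0, 0.0]])               # bearing linearized to a position error
R = np.array([[0.05]])

x = np.zeros(2)
P = np.diag([100.0, 1.0])                # large initial position uncertainty
for _ in range(50):
    z = np.array([0.0])                  # stand-in for a real bearing residual
    x, P = kf_step(x, P, F, Q, z, H, R)

print(P[0, 0])                           # position error variance shrinks
```

    Repeated updates drive the position error covariance well below its initial value, which is the mechanism behind the reported error reduction.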

    Inertial navigation aided by simultaneous localization and mapping

    Unmanned aerial vehicle technologies are getting smaller and cheaper to use, and the challenges of payload limitation in unmanned aerial vehicles are being overcome. Integrated navigation system design requires the selection of a set of sensors and computational power that provides reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth in a small and cost-effective manner. Many of today's operational unmanned aerial vehicle navigation systems rely on inertial sensors as a primary measurement source. Inertial Navigation alone, however, suffers from slow divergence with time. This divergence is often compensated for by employing some additional source of navigation information external to Inertial Navigation. From the 1990s to the present day, the Global Positioning System has been the dominant navigation aid for Inertial Navigation. In a number of scenarios, Global Positioning System measurements may be completely unavailable, or they may simply not be precise (or reliable) enough to adequately update the Inertial Navigation; hence, alternative methods have seen great attention. Aiding Inertial Navigation with vision sensors has been the favoured solution over the past several years. Inertial and vision sensors, with their complementary characteristics, have the potential to meet the requirements for reliable and accurate navigation parameters. In this thesis we address Inertial Navigation position divergence. The information for updating the position comes from a combination of vision and motion. When using such a combination, many of the difficulties of vision sensors (relative depth, geometry and size of objects, image blur, etc.) can be circumvented. Motion grants the vision sensors many cues that can help them better acquire information about the environment, for instance creating a precise map of the environment and localizing within it.
We propose changes to the Simultaneous Localization and Mapping augmented state vector in order to take repeated measurements of the map point. We show that these repeated measurements, combined with certain manoeuvres (motion) around or by the map point, are crucial for constraining the Inertial Navigation position divergence (bounded estimation error) while manoeuvring in the vicinity of the map point. This eliminates some of the uncertainty of the map point estimates, i.e. it reduces the covariance of the map point estimates. This concept brings a different parameterization (feature initialisation) of the map points in Simultaneous Localization and Mapping, and we refer to it as the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping. We show that building such an integrated navigation system requires coordination with the guidance and control measurements and the vehicle task itself in order to perform the required vehicle manoeuvres (motion) and achieve better navigation accuracy. This fact brings new challenges to the practical design of these modern jam-proof, Global Positioning System-free autonomous navigation systems. Further to the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping, we have investigated how a bearing-only sensor such as a single camera can be used for aiding Inertial Navigation. The results of the concept of Inertial Navigation aided by Simultaneous Localization and Mapping were used. A new parameterization of the map point in Bearing Only Simultaneous Localization and Mapping is proposed. Because of the number of significant problems that appear when implementing the Extended Kalman Filter in Inertial Navigation aided by Bearing Only Simultaneous Localization and Mapping, other algorithms such as the Iterated Extended Kalman Filter, the Unscented Kalman Filter and Particle Filters were implemented.
From the results obtained, the conclusion can be drawn that nonlinear filters should be the estimators of choice for this application.
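
    A common feature initialisation for bearing-only SLAM of the kind discussed above is the inverse-depth parameterization, where a map point is stored as an anchor position, two bearing angles, and an inverse depth. The sketch below illustrates that idea with made-up values; it is not the specific parameterization proposed in the thesis.

```python
import numpy as np

def point_from_inverse_depth(anchor, azimuth, elevation, rho):
    """Recover a 3-D map point from (anchor, bearing angles, inverse depth)."""
    m = np.array([np.cos(elevation) * np.cos(azimuth),
                  np.cos(elevation) * np.sin(azimuth),
                  np.sin(elevation)])           # unit ray from the anchor
    return anchor + m / rho                     # depth = 1 / rho

anchor = np.array([0.0, 0.0, 0.0])
p = point_from_inverse_depth(anchor, azimuth=0.0, elevation=0.0, rho=0.1)
print(p)   # a point 10 m straight ahead along x
```

    Storing the inverse depth rho rather than the depth keeps the measurement model closer to linear for distant, poorly constrained points, which is one reason nonlinear filters behave better with it.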

    Vision-Aided Autonomous Precision Weapon Terminal Guidance Using a Tightly-Coupled INS and Predictive Rendering Techniques

    This thesis documents the development of the Vision-Aided Navigation using Statistical Predictive Rendering (VANSPR) algorithm, which seeks to enhance the endgame navigation solution possible by inertial measurements alone. The eventual goal is a precision weapon that does not rely on GPS, functions autonomously, thrives in complex 3-D environments, and is impervious to jamming. The predictive rendering is performed by viewpoint manipulation of computer-generated imagery of target objects. A navigation solution is determined by an Unscented Kalman Filter (UKF), which corrects positional errors by comparing camera images with a collection of statistically significant virtual images. Results indicate that the test algorithm is a viable method of aiding an inertial-only navigation system to achieve the precision necessary for most tactical strikes. Over 14 flight test runs, the average positional error was 166 feet at endgame, compared with an inertial-only error of 411 feet.
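
    The core of a UKF like the one above is the unscented transform: deterministic sigma points that capture the state mean and covariance and are pushed through the nonlinear rendering/measurement model. The sketch below shows standard textbook sigma-point generation with default tuning parameters, not the VANSPR implementation.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, kappa=0.0):
    """Generate 2n+1 sigma points and their mean weights for state (x, P)."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)   # matrix square root of scaled P
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wm[0] = lam / (n + lam)
    return np.array(pts), wm

x = np.zeros(3)                             # e.g. a 3-D position error state
P = np.eye(3)
pts, wm = sigma_points(x, P)
print(wm @ pts)                             # weighted mean recovers x
```

    In a VANSPR-style filter, each sigma point would correspond to a candidate viewpoint from which a virtual image is rendered and compared against the camera frame.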

    Satellite Navigation for the Age of Autonomy

    Global Navigation Satellite Systems (GNSS) brought navigation to the masses. Coupled with smartphones, the blue dot in the palm of our hands has forever changed the way we interact with the world. Looking forward, cyber-physical systems such as self-driving cars and aerial mobility are pushing the limits of what localization technologies, including GNSS, can provide. This autonomous revolution requires a solution that supports safety-critical operation, centimeter positioning, and cyber-security for millions of users. To meet these demands, we propose a navigation service from Low Earth Orbiting (LEO) satellites which delivers precision in part through faster motion, higher-power signals for added robustness to interference, constellation autonomous integrity monitoring for integrity, and encryption/authentication for resistance to spoofing attacks. This paradigm is enabled by the 'New Space' movement, where highly capable satellites and components are now built on assembly lines and launch costs have decreased by more than tenfold. Such a ubiquitous positioning service enables a consistent and secure standard where trustworthy information can be validated and shared, extending the electronic horizon from sensor line of sight to an entire city. This enables the situational awareness needed for true safe operation to support autonomy at scale.
    Comment: 11 pages, 8 figures, 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS)

    Pilot Assisted Inertial Navigation System Aiding Using Bearings-Only Measurements Taken Over Time

    The objective of this work is to develop an alternative INS aiding source other than GPS, while preserving the autonomy of the integrated navigation system. It is proposed to develop a modernized method of aerial navigation using driftmeter measurements from an E/O system for ground feature tracking, together with an independent altitude sensor, in conjunction with the INS. The pilot tracks a ground feature with the E/O system while the aircraft is on autopilot, holding constant airspeed, altitude, and heading during an INS aiding session. The ground feature measurements from the E/O system and the INS output form the measurements provided to a linear KF running on the navigation computer to accomplish the INS aiding action. Aiding the INS will be repeated periodically, as operationally permissible, at pilot discretion. Little to no modeling error will be present when implementing the linear Kalman filter, indicating that the strength of the INS aiding action will be determined exclusively by the prevailing degree of observability.
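
    The observability that governs the aiding strength can be checked with the standard discrete-time observability matrix of the error model pair (F, H). The sketch below uses a generic two-state INS error model and a position-type measurement as an assumed example, not the paper's derived model.

```python
import numpy as np

def observability_matrix(F, H):
    """Stack H, HF, HF^2, ... to test observability of the pair (F, H)."""
    n = F.shape[0]
    rows = [H @ np.linalg.matrix_power(F, k) for k in range(n)]
    return np.vstack(rows)

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # position/velocity error propagation
H = np.array([[1.0, 0.0]])               # position-type measurement

O = observability_matrix(F, H)
print(np.linalg.matrix_rank(O))          # prints 2: both states observable
```

    A rank deficit here would mean some error states are invisible to the measurements, capping what any Kalman filter can correct.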

    Multihop Rendezvous Algorithm for Frequency Hopping Cognitive Radio Networks

    Cognitive radios allow the possibility of increasing utilization of the wireless spectrum, but because of their dynamic access nature they require new techniques for establishing and joining networks; these are known as rendezvous. Existing rendezvous algorithms assume that rendezvous can be completed in a single round, or hop, of time. However, cognitive radio networks using frequency hopping that is too fast for synchronization packets to be exchanged in a single hop require a rendezvous algorithm that supports multiple-hop rendezvous. We propose the Multiple Hop (MH) rendezvous algorithm, based on a pre-shared sequence of random numbers, bounded timing differences, and similar channel lists, to successfully match a percentage of hops. It was tested in simulation against other well-known rendezvous algorithms and implemented in GNU Radio for the HackRF One. We found from our simulation results that at 100 hops per second the MH algorithm is faster than the other tested algorithms at 50 or more channels with timing within ±50 milliseconds, at 250 or more channels with timing within ±500 milliseconds, and at 2000 channels with timing within ±5000 milliseconds. In an asymmetric environment with 100 hops per second, a 500 millisecond timing difference, and 1000 channels, the MH algorithm was faster than the other tested algorithms as long as the channel overlap was 35% or higher, for a 50% required packet success rate to complete rendezvous. We recommend the Multihop algorithm for use cases with a fast frequency hop rate and a slow data transmission rate requiring multiple hops to rendezvous, or for use cases where the channel count equals or exceeds 250, as long as timing data is available and all of the radios to be connected to the network can be pre-loaded with a shared seed.
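
    The pre-shared-seed idea at the heart of MH can be sketched simply: two radios that share a seed and a channel list generate identical pseudo-random hop sequences and therefore land on the same channel every hop. The timing offsets and partial channel overlap that MH actually handles are omitted from this illustration.

```python
import random

def hop_sequence(seed, channels, n_hops):
    """Deterministic hop sequence derived from a pre-shared seed."""
    rng = random.Random(seed)                 # same seed -> same stream
    return [channels[rng.randrange(len(channels))] for _ in range(n_hops)]

channels = list(range(50))
a = hop_sequence(seed=42, channels=channels, n_hops=100)
b = hop_sequence(seed=42, channels=channels, n_hops=100)
print(a == b)   # prints True: both radios agree on every hop
```

    With mismatched clocks or channel lists, only a fraction of hops align, which is why MH requires only a percentage of hops to match rather than all of them.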

    Visual-INS Using a Human Operator and Converted Measurements

    A method of human-operated INS aiding is explored in which the pilot identifies and tracks a ground feature of unknown position over a short measurement epoch using an E/O sensor; we refer to this as Visual-INS. In contrast to current research trends, a human operator is entrusted with visually tracking the ground feature. In addition, a less conventional measurement linearization technique is applied to generate “converted” measurements. A linear regression algorithm is then applied to the converted measurements, providing an estimate of the INS horizontal velocity error and accelerometer biases. At the completion of the measurement epoch, the INS is corrected by subtracting out the estimated errors. Aiding the INS in this manner provides a significant improvement in the accuracy of the INS-provided aircraft navigation state estimates when compared to those of a free/unaided INS. A number of scenarios are simulated, including with and without a constrained flight path, with single versus multiple ground feature tracking sessions, and with a navigation-grade versus tactical-grade INS. Applications for this autonomous navigation approach include navigation in GPS-denied environments and/or when RF emitting/receiving sensors are undesirable.
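
    The regression step can be illustrated as follows: over a short epoch, an INS position-type error grows roughly as dv*t + 0.5*b*t^2 for velocity error dv and accelerometer bias b, so a least-squares fit in the regressors (t, t^2/2) recovers both. All numbers below are made up for illustration; this is not the paper's converted-measurement model.

```python
import numpy as np

t = np.linspace(0.0, 30.0, 31)              # a 30 s measurement epoch
dv_true, b_true = 0.5, 0.02                 # assumed velocity error and bias
y = dv_true * t + 0.5 * b_true * t**2       # noiseless converted measurements

A = np.column_stack([t, 0.5 * t**2])        # regression design matrix
dv_est, b_est = np.linalg.lstsq(A, y, rcond=None)[0]
print(dv_est, b_est)                        # recovers 0.5 and 0.02
```

    With real, noisy converted measurements, the same fit returns estimates whose quality depends on epoch length and feature geometry.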

    Urban Environment Navigation with Real-Time Data Utilizing Computer Vision, Inertial, and GPS Sensors

    The purpose of this research was to obtain a navigation solution that used real data, in a degraded or denied Global Positioning System (GPS) environment, from low-cost commercial off-the-shelf sensors. The sensors integrated together were a commercial inertial measurement unit (IMU), a monocular camera computer vision algorithm, and GPS. Furthermore, the monocular camera computer vision algorithm had to be robust enough to handle any camera orientation presented to it. This research develops a visual odometry 2-D zero velocity measurement that is derived from both the feature points extracted from a monocular camera and the rotation values given by an IMU. By presenting measurements as 2-D zero velocity measurements, errors associated with scale, which is unobservable by a monocular camera, can be removed from the measurements. The 2-D zero velocity measurements are represented as two normalized velocity vectors that are orthogonal to the vehicle's direction of travel, and are used to determine the error in the INS's measured velocity vector. This error is produced by knowing, from the 2-D zero velocity measurements, the directions in which the vehicle is not moving and comparing them to the direction in which the vehicle is thought to be moving. The performance was evaluated by comparing the results obtained when different sensor pairings of a commercial IMU, GPS, and the monocular computer vision algorithm were used to obtain the vehicle's trajectory. Three separate monocular cameras, each pointing in a different direction, were tested independently. Finally, the solutions provided by the GPS were degraded (i.e., the number of satellites available from the GPS was limited) to determine the effectiveness of adding a monocular computer vision algorithm to a system operating with a degraded GPS solution.
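
    The 2-D zero-velocity idea can be sketched by constructing two unit vectors orthogonal to the vision-derived direction of travel and projecting the INS velocity onto them: any nonzero projection is velocity error, and no monocular scale is needed. The vectors and values below are illustrative assumptions.

```python
import numpy as np

def zero_velocity_residual(direction, v_ins):
    """Project the INS velocity onto two unit vectors orthogonal to travel."""
    d = direction / np.linalg.norm(direction)
    # Any vector not parallel to d works to seed the orthogonal basis.
    helper = np.array([0.0, 0.0, 1.0]) if abs(d[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    u1 = np.cross(d, helper); u1 /= np.linalg.norm(u1)
    u2 = np.cross(d, u1)                        # second orthogonal unit vector
    return np.array([u1 @ v_ins, u2 @ v_ins])   # ~0 if the INS agrees with vision

direction = np.array([1.0, 0.0, 0.0])           # vision says: moving along x
v_ins = np.array([10.0, 0.3, -0.1])             # INS velocity with cross-track error
r = zero_velocity_residual(direction, v_ins)
print(r)   # nonzero components expose the cross-track velocity errors
```

    Note that the 10 m/s along-track component never enters the residual, which is exactly how the scale ambiguity is sidestepped.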

    A factorization approach to inertial affine structure from motion

    We consider the problem of reconstructing a 3-D scene from a moving camera with a high frame rate using the affine projection model. This problem is traditionally known as Affine Structure from Motion (Affine SfM), and can be solved using an elegant low-rank factorization formulation. In this paper, we assume that an accelerometer and gyro are rigidly mounted with the camera, so that synchronized linear acceleration and angular velocity measurements are available together with the image measurements. We extend the standard Affine SfM algorithm to integrate these measurements through the use of image derivatives.
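
    The low-rank factorization the paper builds on can be sketched with synthetic data: centered affine image measurements stack into a matrix of rank at most 3, which an SVD factors into motion (camera) and structure (point) parts. This shows only the classic factorization; the paper's inertial extension is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_points = 8, 20
X = rng.standard_normal((3, n_points))           # 3-D scene points
W_rows = []
for _ in range(n_frames):
    A = rng.standard_normal((2, 3))              # affine camera per frame
    W_rows.append(A @ X)                         # 2 x n_points projections
W = np.vstack(W_rows)                            # 2F x P measurement matrix
W -= W.mean(axis=1, keepdims=True)               # center rows (removes translation)

U, s, Vt = np.linalg.svd(W)
M = U[:, :3] * s[:3]                             # motion, 2F x 3
S = Vt[:3]                                       # structure, 3 x P
print(np.allclose(M @ S, W))                     # rank-3 factorization is exact here
```

    The factorization is only defined up to a 3x3 affine ambiguity; metric upgrade constraints, or the inertial measurements used in the paper, are needed to resolve it.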