Non-global parameter estimation using local ensemble Kalman filtering
We study parameter estimation for non-global parameters in a low-dimensional
chaotic model using the local ensemble transform Kalman filter (LETKF). By
modifying existing techniques for using observational data to estimate global
parameters, we present a methodology whereby spatially-varying parameters can
be estimated using observations only within a localized region of space. Taking
a low-dimensional nonlinear chaotic conceptual model for atmospheric dynamics
as our numerical testbed, we show that this parameter estimation methodology
accurately estimates parameters that vary in both space and time, as well as
parameters representing physics absent from the model.
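A common way to realize such parameter estimation is state augmentation: the unknown parameter is appended to the ensemble state, so the filter's cross-covariances transfer observation information onto it. A minimal sketch of that idea, using a toy scalar model of our own rather than the paper's chaotic testbed, and with localization omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: estimate an unknown forcing parameter F of a scalar model
# x_{k+1} = 0.9*x_k + F + noise, by augmenting the ensemble state with F.
# (Illustrative stand-in for the paper's testbed; names are ours.)
F_true = 2.0
n_ens, n_steps = 50, 200
obs_var = 0.1

# Augmented ensemble: column 0 = state x, column 1 = parameter estimate F
ens = np.column_stack([rng.normal(0, 1, n_ens), rng.normal(0, 2, n_ens)])
x_true = 0.0
for _ in range(n_steps):
    # Forecast: propagate state; the parameter simply persists
    x_true = 0.9 * x_true + F_true + rng.normal(0, 0.05)
    ens[:, 0] = 0.9 * ens[:, 0] + ens[:, 1] + rng.normal(0, 0.05, n_ens)
    # Analysis: observe x only; the x-F ensemble cross-covariance carries the
    # innovation onto the parameter (stochastic EnKF update)
    y = x_true + rng.normal(0, np.sqrt(obs_var))
    P = np.cov(ens, rowvar=False)          # 2x2 augmented covariance
    K = P[:, 0] / (P[0, 0] + obs_var)      # Kalman gain for the scalar obs
    perturbed = y + rng.normal(0, np.sqrt(obs_var), n_ens)
    ens += np.outer(perturbed - ens[:, 0], K)

# Ensemble mean of the parameter column should approach F_true
print(ens[:, 1].mean())
```

The LETKF variant in the paper additionally restricts each analysis to observations within a local region, which is what makes spatially varying parameters recoverable.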
Computer simulation of a pilot in V/STOL aircraft control loops
The objective was to develop a computerized adaptive pilot model for the computer model of the research aircraft, the Harrier II AV-8B V/STOL, with special emphasis on propulsion control. Two versions of the adaptive pilot are given. The first, called the Adaptive Control Model (ACM) of a pilot, includes a parameter estimation algorithm for the parameters of the aircraft and an adaptation scheme based on the root locus of the poles of the pilot-controlled aircraft. The second, called the Optimal Control Model of the pilot (OCM), includes an adaptation algorithm and an optimal control algorithm. These computer simulations were developed as part of the ongoing research program in pilot model simulation supported by NASA Lewis from April 1, 1985 to August 30, 1986 under NASA Grant NAG 3-606 and from September 1, 1986 through November 30, 1988 under NASA Grant NAG 3-729. Once installed, these pilot models permitted the computer simulation to close all of the control loops normally closed by a pilot manipulating the control variables. The current version has permitted a baseline comparison of various qualitative and quantitative performance indices for propulsion control, the control loops, and the pilot's workload. Actual data furnished by NASA for an aircraft flown by a human pilot compared favorably with the outputs of the computerized pilot.
Implementation of a Kalman filter for tracking a custom four-wheel-drive four-wheel-steering robotic platform
Vehicle tracking is an important component of autonomy in the robotics field, requiring integration of hardware and software and the application of advanced algorithms. Sensors are often plagued with noise and require filtering, and no single sensor is sufficient for effective tracking, so data from multiple sensors must be fused. The Kalman filter provides a convenient and efficient solution for filtering and fusing sensor data as well as estimating noise error covariances. Consequently, it has been essential in tracking algorithms since its introduction in 1960.
This thesis presents an application of the Kalman filter to tracking a custom four-wheel-drive four-wheel-steering vehicle using a limited sensor suite. Sensor selection is discussed, along with the characteristics of the sensor noise as they relate to the Kalman filter's requirements for guaranteeing optimality. The filter requires a dynamical model, which is derived from empirical data and evaluated. Tracking results are presented and compared to unfiltered data.
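For orientation, the predict/update cycle at the heart of such a tracking filter can be sketched with a generic one-axis constant-velocity model (the matrices below are illustrative, not the thesis's empirically identified model):

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition: [pos, vel]
H = np.array([[1.0, 0.0]])                  # we observe position only
Q = 1e-3 * np.array([[dt**3/3, dt**2/2],    # process noise (white acceleration)
                     [dt**2/2, dt]])
R = np.array([[0.25]])                      # position sensor noise variance

x = np.zeros((2, 1))                        # initial state estimate
P = np.eye(2)                               # initial covariance

rng = np.random.default_rng(1)
true_pos, true_vel = 0.0, 1.0
for _ in range(100):
    true_pos += true_vel * dt
    z = np.array([[true_pos + rng.normal(0, 0.5)]])  # noisy position fix
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

# Velocity is never measured directly, yet the filter recovers it
print(float(x[1, 0]))
```

Note how the unmeasured velocity state is inferred purely through the model's coupling of position and velocity, which is the same mechanism the thesis relies on when fusing its limited sensor suite.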
Large-area visually augmented navigation for autonomous underwater vehicles
Submitted to the Joint Program in Applied Ocean Science & Engineering
in partial fulfillment of the requirements for the degree of Doctor of Philosophy
at the Massachusetts Institute of Technology
and the Woods Hole Oceanographic Institution
June 2005
This thesis describes a vision-based, large-area, simultaneous localization and mapping (SLAM) algorithm that respects the low-overlap imagery constraints typical of autonomous underwater vehicles (AUVs) while exploiting the inertial sensor information that is routinely available on such platforms. We adopt a systems-level approach exploiting the complementary aspects of inertial sensing and visual perception from a calibrated pose-instrumented platform. This systems-level strategy yields a robust solution to underwater imaging that
overcomes many of the unique challenges of a marine environment (e.g., unstructured terrain, low-overlap imagery, moving light source). Our large-area SLAM algorithm recursively incorporates relative-pose constraints using a view-based representation that exploits exact sparsity in the Gaussian canonical form. This sparsity allows for efficient O(n) update complexity in the number of images composing the view-based map by utilizing recent multilevel relaxation techniques. We show that our algorithmic formulation is inherently sparse, unlike other feature-based canonical SLAM algorithms, which impose sparseness via pruning approximations. In particular, we investigate
the sparsification methodology employed by sparse extended information filters (SEIFs)
and offer new insight as to why, and how, its approximation can lead to inconsistencies in
the estimated state errors. Lastly, we present a novel algorithm for efficiently extracting consistent marginal covariances useful for data association from the information matrix. In summary, this thesis advances the current state of the art in underwater visual navigation by demonstrating end-to-end automatic processing of the largest visually navigated dataset to date, using data collected from a survey of the RMS Titanic (path length over 3 km and 3100 m² of mapped area). This accomplishment embodies the summed contributions of this thesis to several current SLAM research issues, including scalability, six-degree-of-freedom motion, unstructured environments, and visual perception.
This work was funded in part by the CenSSIS ERC of the National Science Foundation
under grant EEC-9986821, in part by the Woods Hole Oceanographic Institution through a
grant from the Penzance Foundation, and in part by an NDSEG Fellowship awarded through
the Department of Defense.
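The marginal-covariance point above has a compact linear-algebra core: in the Gaussian canonical form the map is stored as an information matrix Λ = Σ⁻¹, and a pose's marginal covariance is a block of Σ, which in information form requires a Schur complement rather than simple block extraction. A small self-contained illustration (our own toy matrices, not the thesis's solver):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a valid 4x4 covariance for two 2-dim "poses" and its information matrix
A = rng.normal(size=(4, 4))
Sigma = A @ A.T + 4 * np.eye(4)       # symmetric positive definite
Lam = np.linalg.inv(Sigma)            # canonical (information) form

# Marginal covariance of the first pose, two equivalent routes:
marg_direct = Sigma[:2, :2]           # trivial in covariance form
# In information form: marginalization is a Schur complement,
# (Laa - Lab Lbb^{-1} Lba)^{-1}
Laa, Lab, Lbb = Lam[:2, :2], Lam[:2, 2:], Lam[2:, 2:]
marg_schur = np.linalg.inv(Laa - Lab @ np.linalg.inv(Lbb) @ Lab.T)

print(np.allclose(marg_direct, marg_schur))  # True
```

The expense of that Schur complement at scale is why efficient consistent covariance recovery from the information matrix is a research contribution in its own right.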
Image Dependent Relative Formation Navigation for Autonomous Aerial Refueling
This research tests the feasibility, accuracy, and reliability of a predictive rendering and holistic comparison algorithm that uses an optical sensor to provide relative distance and position behind a lead or tanker aircraft. Using an accurate model of a tanker, the algorithm renders images for comparison with actual images collected by a camera installed on the receiver aircraft. Based on this comparison, the information used to create the rendered images provides the relative navigation solution required for autonomous aerial refueling. Given enough predicted images and processing time, this approach should reliably find an accurate solution. Building on previous work, this research aims to minimize the number of rendered images required to provide a real-time navigation solution with sufficient accuracy for an autopilot controller installed on future Unmanned Aircraft Systems.
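The predict-and-compare loop described above can be caricatured in a few lines: synthesize one candidate image per hypothesized relative position, score each against the camera frame, and take the best-scoring hypothesis as the fix. The "renderer" below is a stand-in of our own; a real system would render the tanker model at each candidate pose.

```python
import numpy as np

def render(offset, size=32):
    """Hypothetical renderer: a square 'tanker' whose position shifts
    with the candidate relative offset (purely illustrative)."""
    img = np.zeros((size, size))
    c = size // 2 + offset
    img[c-4:c+4, c-4:c+4] = 1.0
    return img

# Simulated camera frame: the true offset plus sensor noise
true_offset = 3
camera_frame = render(true_offset) + np.random.default_rng(3).normal(0, 0.05, (32, 32))

# Holistic comparison: sum-of-squared-differences over the whole image,
# evaluated for each candidate rendering
candidates = list(range(-8, 9))
scores = [np.sum((render(o) - camera_frame) ** 2) for o in candidates]
best = candidates[int(np.argmin(scores))]
print(best)  # recovers true_offset = 3
```

Minimizing the number of candidates evaluated per frame, e.g. by seeding the search from the previous solution, is exactly the real-time concern the abstract raises.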
Long Distance GNSS-Denied Visual Inertial Navigation for Autonomous Fixed Wing Unmanned Air Vehicles: SO(3) Manifold Filter based on Virtual Vision Sensor
This article proposes a visual inertial navigation algorithm intended to
diminish the horizontal position drift experienced by autonomous fixed wing
UAVs (Unmanned Air Vehicles) in the absence of GNSS (Global Navigation
Satellite System) signals. In addition to accelerometers, gyroscopes, and
magnetometers, the proposed navigation filter relies on the accurate
incremental displacement outputs generated by a VO (Visual Odometry) system,
denoted here as a Virtual Vision Sensor or VVS, which relies on images of the
Earth surface taken by an onboard camera and is itself assisted by the filter
inertial estimations. Although not a full replacement for a GNSS receiver since
its position observations are relative instead of absolute, the proposed system
enables major reductions in the GNSS-Denied attitude and position estimation
errors. In order to minimize the accumulation of errors in the absence of
absolute observations, the filter is implemented in the manifold of rigid body
rotations or SO (3). Stochastic high fidelity simulations of two representative
scenarios involving the loss of GNSS signals are employed to evaluate the
results. The authors release the C++ implementation of both the visual inertial
navigation filter and the high fidelity simulation as open-source software.
Comment: 27 pages, 14 figures. arXiv admin note: substantial text overlap with
arXiv:2205.1324
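The motivation for filtering on the SO(3) manifold can be shown in miniature: composing incremental rotations through the exponential map keeps the attitude estimate a valid rotation matrix by construction, with no renormalization drift. A minimal sketch (generic gyro integration, not the article's full filter):

```python
import numpy as np

def hat(w):
    """Map an angular-velocity vector to its skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def expm_so3(w):
    """Rodrigues' formula: exponential map of a rotation vector w."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = hat(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

# Integrate gyro increments R <- R * exp(w*dt); R stays orthonormal throughout
R = np.eye(3)
dt, w = 0.01, np.array([0.0, 0.0, 1.0])   # 1 rad/s yaw rate
for _ in range(100):
    R = R @ expm_so3(w * dt)

print(np.allclose(R.T @ R, np.eye(3)))         # True: still a rotation matrix
print(np.arctan2(R[1, 0], R[0, 0]))            # accumulated yaw ≈ 1.0 rad
```

Parameterizing the filter's attitude error in the tangent space of SO(3), rather than in Euler angles or unconstrained matrices, is what avoids singularities and constraint violations during long GNSS-denied flights.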