2,868 research outputs found

    Advancing the Standards for Unmanned Air System Communications, Navigation and Surveillance

    Get PDF
    Under NASA program NNA16BD84C, new architectures were identified and developed to support reliable and secure Communications, Navigation and Surveillance (CNS) needs for Unmanned Air Systems (UAS) operating in both controlled and uncontrolled airspace. An analysis of architectures for the two categories of airspace and an implementation technology readiness analysis were performed. These studies produced NASA reports that have been placed in the public domain and briefed at previous conferences. We now consider how the products of the study are influencing emerging directions in the aviation standards communities. The International Civil Aviation Organization (ICAO) Communications Panel (CP), Working Group I (WG-I) is currently developing a communications network architecture known as the Aeronautical Telecommunications Network with Internet Protocol Services (ATN/IPS). The target use case for this service is secure and reliable Air Traffic Management (ATM) for manned aircraft operating in controlled airspace. However, the work increasingly also considers the emerging class of airspace users known as Remotely Piloted Aircraft Systems (RPAS), which refers to certain UAS classes. In addition, two Special Committees (SCs) in the Radio Technical Commission for Aeronautics (RTCA) are developing Minimum Aviation System Performance Standards (MASPS) and Minimum Operational Performance Standards (MOPS) for UAS. RTCA SC-223 is investigating an Internet Protocol Suite (IPS) and AeroMACS aviation data link for interoperable (INTEROP) UAS communications. Meanwhile, RTCA SC-228 is developing Detect And Avoid (DAA) equipment and a Command and Control (C2) Data Link MOPS establishing L-Band and C-Band solutions. These RTCA Special Committees and ICAO CP WG-I therefore overlap in the CNS alternatives they seek to provide for an integrated manned and unmanned air traffic management service as well as remote pilot command and control. This paper presents UAS CNS architecture concepts developed under the NASA program that apply to all three of the aforementioned committees. It discusses the similarities and differences in the problem spaces under consideration in each committee and identifies a common set of CNS alternatives that can be widely applied. As the work of these committees progresses, it is clear that the overlap will need to be addressed to ensure a consistent and safe framework for worldwide aviation. In this study, we discuss similarities and differences among the various operational models and show how the CNS architectures developed under the NASA program apply to each.

    Two Dimensional Positioning and Heading Solution for Flying Vehicles Using a Line-Scanning Laser Radar (LADAR)

    Get PDF
    Emerging technology in small autonomous flying vehicles requires a precise navigation solution in order to perform tasks. In many critical environments, such as indoors, GPS is unavailable, necessitating supplemental aiding sensors to determine precise position. This research investigates the use of a line-scanning laser radar (LADAR) as a standalone two-dimensional position and heading navigation solution and prepares the device for augmentation into existing navigation systems. A fast histogram correlation method is developed to operate in real time on board the vehicle, providing position and heading updates at a rate of 10 Hz. The LADAR navigation methods are extended to three dimensions, with a simulation built to analyze the performance loss due to attitude changes during flight. These simulations are then compared to experimental results collected using a SICK LD-OEM 1000 mounted on a moving cart. The histogram correlation algorithm applied in this work was shown to successfully navigate a realistic environment, representative of a quadrotor performing short flights of less than 5 min in larger rooms. Application in hallways shows great promise, providing a stable heading along with tracking of movement perpendicular to the hallway.
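
    To make the histogram correlation idea above concrete, the following is a minimal Python sketch, not the thesis implementation: it assumes two planar LADAR scans given as N x 2 point arrays (ref_scan, new_scan), recovers the heading change from a circular correlation of angle histograms, and then the translation from correlations of the x and y histograms. Bin sizes, variable names, and the use of numpy are illustrative assumptions.

        import numpy as np

        def angle_histogram(scan, bins=180):
            # Histogram of local surface orientations, estimated from consecutive points.
            d = np.diff(scan, axis=0)
            ang = np.arctan2(d[:, 1], d[:, 0]) % np.pi        # line directions in [0, pi)
            hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi))
            return hist.astype(float)

        def circular_shift(h_ref, h_new):
            # Bin shift that best aligns the new histogram onto the reference one.
            corr = np.fft.ifft(np.fft.fft(h_ref) * np.conj(np.fft.fft(h_new))).real
            k = int(np.argmax(corr))
            return k - len(h_ref) if k > len(h_ref) // 2 else k   # wrap to a signed shift

        def estimate_pose_offset(ref_scan, new_scan, angle_bins=180, xy_bin=0.05):
            # 1) Heading: circular correlation of the angle histograms.
            shift = circular_shift(angle_histogram(ref_scan, angle_bins),
                                   angle_histogram(new_scan, angle_bins))
            dtheta = shift * np.pi / angle_bins
            c, s = np.cos(dtheta), np.sin(dtheta)
            derotated = new_scan @ np.array([[c, -s], [s, c]]).T
            # 2) Translation: 1-D correlation of the x and y histograms after de-rotation.
            offsets = []
            for axis in (0, 1):
                lo = min(ref_scan[:, axis].min(), derotated[:, axis].min())
                hi = max(ref_scan[:, axis].max(), derotated[:, axis].max())
                edges = np.arange(lo, hi + xy_bin, xy_bin)
                h_ref, _ = np.histogram(ref_scan[:, axis], bins=edges)
                h_new, _ = np.histogram(derotated[:, axis], bins=edges)
                corr = np.correlate(h_ref.astype(float), h_new.astype(float), mode="full")
                offsets.append((int(np.argmax(corr)) - (len(h_new) - 1)) * xy_bin)
            return offsets[0], offsets[1], dtheta   # dx, dy, dheading (sign conventions illustrative)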

    2007 Annual Report of the Graduate School of Engineering and Management, Air Force Institute of Technology

    Get PDF
    The Graduate School's Annual Report highlights research focus areas, new academic programs, faculty accomplishments and news, and provides top-level sponsor-funded research data and information.

    Model-Based Control Using Model and Mechanization Fusion Techniques for Image-Aided Navigation

    Get PDF
    Unmanned aerial vehicles are no longer used only for reconnaissance. Current requirements call for smaller autonomous vehicles that replace the human in high-risk activities. Many times these activities are performed in GPS-degraded environments. Without GPS providing today's most accurate navigation solution, autonomous navigation in tight areas is more difficult. Today, image-aided navigation is used and other methods are explored to navigate more accurately in such areas (e.g., indoors). This thesis explores the use of inertial measurements and navigation solution updates from cameras with a model-based Linear Quadratic Gaussian controller. To demonstrate the methods behind this research, the controller provides inputs to a micro-sized helicopter, allowing the vehicle to maintain hover. A new method for obtaining a more accurate navigation solution was devised, originating from the following basic setup. To begin, a nonlinear system model was identified for a micro-sized, commercial, off-the-shelf helicopter. This model was verified, then linearized about the hover condition to construct a Linear Quadratic Regulator (LQR). The state error estimates, provided by an Unscented Kalman Filter (UKF) using simulated image measurement updates, are used to update the navigation solution provided by inertial measurement sensors using strapdown mechanization equations. The navigation solution is used with a reference signal to determine the position and heading error. This error, along with other states, is fed to the LQR, which controls the helicopter. Research revealed that by combining the navigation solution from the INS mechanization block with a model-based navigation solution, and combining the INS error model and system model during the time propagation in the UKF, the navigation solution error decreases by 20%. The equations used for this modification stem from the state and covariance combination methods utilized in the Federated Kalman Filter.
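
    The federated combination mentioned in the last sentence reduces to an information-weighted average of the two solutions. The snippet below is a minimal sketch under that assumption (it is not the thesis code, and the two-state example values are hypothetical): each solution contributes in proportion to the inverse of its covariance.

        import numpy as np

        def federated_fusion(states, covariances):
            # x_f = P_f * sum_i(P_i^-1 x_i),   P_f = (sum_i P_i^-1)^-1
            infos = [np.linalg.inv(P) for P in covariances]
            P_fused = np.linalg.inv(sum(infos))
            x_fused = P_fused @ sum(Y @ x for Y, x in zip(infos, states))
            return x_fused, P_fused

        # Hypothetical two-state example: [north position (m), north velocity (m/s)].
        x_ins,   P_ins   = np.array([10.2, 1.1]), np.diag([4.0, 0.25])   # INS mechanization solution
        x_model, P_model = np.array([ 9.6, 1.0]), np.diag([1.0, 0.50])   # model-based solution
        x_f, P_f = federated_fusion([x_ins, x_model], [P_ins, P_model])
        # x_f lies between the two inputs, weighted toward whichever has the smaller variance,
        # and the diagonal of P_f is smaller than that of either input covariance.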

    Unmanned Aircraft System Navigation in the Urban Environment: A Systems Analysis

    Full text link
    Peer Reviewed. https://deepblue.lib.umich.edu/bitstream/2027.42/140665/1/1.I010280.pd

    Mapping and classification of ecologically sensitive marine habitats using unmanned aerial vehicle (UAV) imagery and object-based image analysis (OBIA)

    Get PDF
    Nowadays, emerging technologies, such as long-range transmitters, increasingly miniaturized components for positioning, and enhanced imaging sensors, have led to an upsurge in the availability of new ecological applications for remote sensing based on unmanned aerial vehicles (UAVs), sometimes referred to as “drones”. In fact, structure-from-motion (SfM) photogrammetry coupled with imagery acquired by UAVs offers a rapid and inexpensive tool to produce high-resolution orthomosaics, giving ecologists a new way for responsive, timely, and cost-effective monitoring of ecological processes. Here, we adopted a lightweight quadcopter as an aerial survey tool and an object-based image analysis (OBIA) workflow to demonstrate the strength of such methods in producing very high spatial resolution maps of sensitive marine habitats. To this end, three different coastal environments were mapped using the autonomous flight capability of a lightweight UAV equipped with a fully stabilized consumer-grade RGB digital camera. In particular, we investigated a Posidonia oceanica seagrass meadow, a rocky coast with nurseries for juvenile fish, and two sandy areas showing biogenic reefs of Sabellaria alveolata. We adopted, for the first time, UAV-based raster thematic maps of these key coastal habitats, produced after OBIA classification, as a new method for fine-scale, low-cost, and time-saving characterization of sensitive marine environments, which may lead to more effective and efficient monitoring and management of natural resources.
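
    As a rough illustration of the OBIA workflow described above, the sketch below segments an RGB orthomosaic into image objects, computes a simple per-object feature vector, and classifies the objects with a supervised learner. The choice of SLIC segmentation, a random forest classifier, and the scikit-image/scikit-learn libraries are assumptions for illustration; the study's own segmentation parameters, features, and class scheme are not reproduced here.

        import numpy as np
        from skimage.segmentation import slic
        from sklearn.ensemble import RandomForestClassifier

        def segment_objects(rgb, n_segments=2000, compactness=10.0):
            # Group pixels into image objects (segments): the defining step of OBIA,
            # as opposed to classifying pixels individually.
            return slic(rgb, n_segments=n_segments, compactness=compactness, start_label=1)

        def object_features(rgb, segments):
            # One feature vector per object; here simply the per-band mean colour.
            labels = np.unique(segments)
            feats = np.array([rgb[segments == lab].mean(axis=0) for lab in labels])
            return labels, feats

        def classify_objects(feats, train_rows, train_classes):
            # train_rows/train_classes: manually labelled objects (e.g. "seagrass",
            # "sand", "rock", "reef"); all objects are then predicted.
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            clf.fit(feats[train_rows], train_classes)
            return clf.predict(feats)

        # Writing each object's predicted class back onto its pixels yields the
        # raster thematic habitat map referred to in the abstract.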

    Reliable Navigation for SUAS in Complex Indoor Environments

    Get PDF
    Indoor environments are a particular challenge for Unmanned Aerial Vehicles (UAVs). Effective navigation through these GPS-denied environments requires alternative localization systems, as well as methods of sensing and avoiding obstacles while remaining on-task. Additionally, the relatively small clearances and human presence characteristic of indoor spaces necessitate a higher level of precision and adaptability than is common in traditional UAV flight planning and execution. This research blends the optimization of individual technologies, such as state estimation and environmental sensing, with system integration and high-level operational planning. The combination of AprilTag visual markers, multi-camera visual odometry, and IMU data can be used to create a robust state estimator that describes the position, velocity, and rotation of a multicopter within an indoor environment. However, these data sources have unique, nonlinear characteristics that must be understood to plan effectively for their use in an automated environment. The research described herein begins by analyzing the unique characteristics of these data streams in order to create a highly accurate, fault-tolerant state estimator. Upon this foundation, the system built, tested, and described herein uses visual markers as navigation anchors, visual odometry for motion estimation and control, and depth sensors to maintain an up-to-date map of the UAV's immediate surroundings. It develops and continually refines navigable routes through a novel combination of pre-defined and sensory environmental data. Emphasis is placed on the real-world development and testing of the system, through discussion of computational resource management and risk reduction.
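
    One ingredient of a fault-tolerant estimator of the kind described above is gating each measurement against its predicted uncertainty before fusing it. The sketch below shows that idea in isolation, assuming a plain linear Kalman filter and treating an AprilTag detection as an absolute position fix; the actual estimator in this work is more elaborate, and the state layout, noise values, and SciPy dependency are assumptions.

        import numpy as np
        from scipy.stats import chi2

        def gated_update(x, P, z, H, R, gate_prob=0.997):
            # Standard Kalman measurement update, applied only if the innovation
            # passes a chi-square consistency test (rejects e.g. a bad tag detection).
            y = z - H @ x                                  # innovation
            S = H @ P @ H.T + R                            # innovation covariance
            d2 = float(y @ np.linalg.solve(S, y))          # squared Mahalanobis distance
            if d2 > chi2.ppf(gate_prob, df=z.size):
                return x, P, False                         # measurement rejected
            K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
            x = x + K @ y
            P = (np.eye(P.shape[0]) - K @ H) @ P
            return x, P, True

        # Hypothetical usage: state [px, py, vx, vy]; an AprilTag fix observes position only.
        H = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])
        R = np.diag([0.02, 0.02])                          # tag position noise (m^2), assumed
        x0, P0 = np.zeros(4), np.eye(4)
        x1, P1, used = gated_update(x0, P0, np.array([0.1, -0.2]), H, R)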

    GNSS/LiDAR-Based Navigation of an Aerial Robot in Sparse Forests

    Get PDF
    Autonomous navigation of unmanned vehicles in forests is a challenging task. In such environments, due to the canopies of the trees, information from Global Navigation Satellite Systems (GNSS) can be degraded or even unavailable. Also, because of the large number of obstacles, a prior detailed map of the environment is not practical. In this paper, we solve the complete navigation problem of an aerial robot in a sparse forest, where there is enough space for flight and GNSS signals can be sporadically detected. For localization, we propose a state estimator that merges information from GNSS, Attitude and Heading Reference Systems (AHRS), and odometry based on Light Detection and Ranging (LiDAR) sensors. In our LiDAR-based odometry solution, the trunks of the trees are used in a feature-based scan matching algorithm to estimate the relative movement of the vehicle. Our method employs a robust adaptive fusion algorithm based on the unscented Kalman filter. For motion control, we adopt a strategy that integrates a vector field, used to impose the main direction of movement for the robot, with an optimal probabilistic planner, which is responsible for obstacle avoidance. Experiments with a quadrotor equipped with a planar LiDAR in an actual forest environment are used to illustrate the effectiveness of our approach.
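
    To illustrate the trunk-based scan matching step, the sketch below recovers the 2-D rigid motion between two consecutive planar LiDAR scans from the centres of the same tree trunks seen in both. Trunk extraction and data association are assumed to have been done already, and the closed-form SVD (Kabsch) alignment shown here stands in for the paper's actual feature-based matcher.

        import numpy as np

        def rigid_transform_2d(trunks_prev, trunks_curr):
            # Least-squares R (2x2) and t (2,) with trunks_curr ~= trunks_prev @ R.T + t,
            # i.e. the apparent motion of the trunk centres in the sensor frame.
            mu_p, mu_c = trunks_prev.mean(axis=0), trunks_curr.mean(axis=0)
            H = (trunks_prev - mu_p).T @ (trunks_curr - mu_c)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:       # guard against a reflection solution
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mu_c - R @ mu_p
            return R, t

        # Since the trunks are static landmarks, the vehicle's own planar motion between
        # the two scans is the inverse of (R, t); accumulating it over consecutive scans
        # gives the LiDAR odometry that feeds the adaptive UKF fusion described above.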