
    Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

    For the last few decades, growing interest has returned to the quite challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires new means of ensuring a safe descent under stringent final conditions and aerospace-related constraints in terms of mass, cost and computational resources. In this paper, a two-phase approach is presented: first, a biomimetic method inspired by the neuronal and sensory systems of flying insects is presented as a solution to perform safe lunar landing. In order to design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors allow us to accurately estimate the orientation of the velocity vector, which is mandatory to control the lander's pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects' visual systems, performing local 1-D angular speed measurements ranging from 1.5°/s to 25°/s and weighing only 2.8 g, is presented. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that, despite the complex disturbances encountered, the measured optic flow closely matched the ground-truth optic flow.
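
    The two-sensor velocity-orientation estimation mentioned above can be illustrated with a small geometric sketch. Assuming flat ground, an OF sensor gazing at angle phi from nadir measures of(phi) = (v/h)·cos(phi)·cos(phi + theta), where theta is the flight-path angle below the horizontal; the unknown scale v/h cancels in the ratio of two such measurements, leaving theta in closed form. The formula and function below are our illustration of this principle, not the authors' published estimator.

```python
import math

def velocity_orientation(of1, of2, phi1, phi2):
    """Estimate the flight-path angle theta (rad, below horizontal) from two
    optic-flow readings of1, of2 taken at gaze angles phi1, phi2 (rad from
    nadir), assuming flat ground:

        of(phi) = (v/h) * cos(phi) * cos(phi + theta)

    The scale v/h cancels in the ratio r = of1/of2, giving

        tan(theta) = (cos^2(phi1) - r*cos^2(phi2))
                     / (cos(phi1)sin(phi1) - r*cos(phi2)sin(phi2)).
    """
    r = of1 / of2
    num = math.cos(phi1) ** 2 - r * math.cos(phi2) ** 2
    den = math.cos(phi1) * math.sin(phi1) - r * math.cos(phi2) * math.sin(phi2)
    return math.atan(num / den)
```

    For instance, a nadir sensor paired with one tilted 30° forward recovers the descent angle from the two readings alone, with no knowledge of height or speed.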

    Bio-inspired Landing Approaches and Their Potential Use On Extraterrestrial Bodies

    Automatic landing on extraterrestrial bodies is still a challenging and hazardous task. Here we propose a new type of autopilot designed to solve landing problems, based on neurophysiological, behavioral, and biorobotic findings on flying insects. Flying insects excel in optic flow sensing and cope with highly parallel data at low energy and computational cost using lightweight, dedicated motion-processing circuits. In the first part of this paper, we present our biomimetic approach in the context of a lunar landing scenario, assuming a 2-degree-of-freedom spacecraft approaching the Moon, simulated with the PANGU software. The autopilot we propose relies only on optic flow (OF) and inertial measurements, and aims at regulating the OF generated during the landing approach by means of a feedback control system whose sensor is an OF sensor. We put forward an estimation method based on a two-sensor setup to accurately estimate the orientation of the lander's velocity vector, which is mandatory to control the lander's pitch in a near-optimal way with respect to fuel consumption. In the second part, we present a lightweight Visual Motion Sensor (VMS) which draws on the results of neurophysiological studies of the insect visual system. The VMS was able to perform local 1-D angular speed measurements in the range 1.5°/s - 25°/s. The sensor was mounted on an 80 kg unmanned helicopter and test-flown outdoors over various fields. The OF measured onboard was shown to match the ground-truth optic flow despite the dramatic disturbances and vibrations experienced by the sensor.
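
    The OF-regulation principle described above can be sketched in a few lines: holding the ventral optic flow (downward speed divided by height) at a constant setpoint automatically produces a descent in which speed shrinks with altitude. The gains, setpoint and simplified vertical-only dynamics below are illustrative assumptions, not the paper's actual controller (which also steers the lander's pitch from inertial measurements).

```python
def simulate_of_regulation(h0=100.0, v0=10.0, of_ref=0.1,
                           kp=50.0, g=1.62, dt=0.005, t_end=120.0):
    """Hold the ventral optic flow of = v/h at of_ref with a proportional
    thrust law; v is downward speed (m/s), h altitude (m), g lunar gravity.
    Returns the (t, h, v, of) trajectory. All parameters are illustrative."""
    h, v, t = h0, v0, 0.0
    log = []
    while t < t_end and h > 0.5:
        of = v / h                                 # ventral optic flow (rad/s)
        thrust = max(0.0, g + kp * (of - of_ref))  # brake harder when of > of_ref
        v = max(0.0, v + (g - thrust) * dt)        # vertical dynamics only
        h -= v * dt
        t += dt
        log.append((t, h, v, of))
    return log
```

    Because the controller keeps v/h roughly constant, the lander's speed decays in proportion to its height, so it arrives near the surface at a small fraction of its initial descent speed without ever measuring either quantity separately.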

    Towards an autonomous landing system in presence of uncertain obstacles in indoor environments

    The landing task is fundamental for micro air vehicles (MAVs) attempting to land in an unpredictable environment (e.g., in the presence of static or moving obstacles). The MAV should immediately sense the environment through its sensors and decide its actions for landing. This paper addresses the autonomous landing of a commercial AR.Drone 2.0 in the presence of uncertain obstacles in an indoor environment. A localization methodology is proposed to estimate the drone's pose, based on sensor fusion techniques that fuse IMU and Poxyz signals. In addition, a vision-based approach is presented to detect a moving obstacle in the drone's working environment and estimate its velocity and position. To control the drone landing accurately, a cascade control based on an Accelerated Particle Swarm Optimization (APSO) algorithm is designed. The simulation and experimental results demonstrate that the obtained model is appropriate for the measured data.
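
    The IMU-plus-position fusion step can be illustrated with a classic one-axis complementary filter: integrated accelerometer data drives the prediction, and an absolute position fix (standing in here for the Poxyz signal) corrects drift through two feedback gains. The structure and gains below are our illustrative assumptions, not the paper's estimator.

```python
class ComplementaryFilter1D:
    """Fuse IMU acceleration with an absolute position fix on one axis."""

    def __init__(self, k_pos=2.0, k_vel=1.0):
        self.x = 0.0        # position estimate (m)
        self.v = 0.0        # velocity estimate (m/s)
        self.k_pos = k_pos  # position-error feedback gain
        self.k_vel = k_vel  # velocity-error feedback gain

    def update(self, accel, pos_meas, dt):
        err = pos_meas - self.x           # innovation from the position fix
        self.v += (accel + self.k_vel * err) * dt
        self.x += (self.v + self.k_pos * err) * dt
        return self.x, self.v
```

    With these gains the error dynamics are critically damped (characteristic polynomial s² + 2s + 1), so an initial position error decays smoothly within a few seconds while high-frequency motion is still tracked by the accelerometer path.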

    Surveillance Mission Planning for UAVs in GPS-Denied Urban Environment

    Ph.D. thesis (Doctor of Philosophy)

    Comparative Study of Indoor Navigation Systems for Autonomous Flight

    Recently, Unmanned Aerial Vehicles (UAVs) have attracted society and researchers due to their capability to perform in economic, scientific and emergency scenarios, and they are being employed in a large number of applications, especially in hostile environments. They can operate autonomously in both indoor and outdoor applications, mainly including search and rescue, manufacturing, forest fire tracking, remote sensing, etc. In both environments, precise localization plays a critical role in achieving high-performance flight and interacting with surrounding objects. However, in indoor areas where the Global Navigation Satellite System (GNSS) is degraded or denied, it becomes challenging to control a UAV autonomously, especially where obstacles are unidentified. A large number of techniques using various technologies have been proposed to overcome these limits. This paper provides a comparison of the existing solutions and technologies available for this purpose, with their strengths and limitations. Further, a summary of the current research status, with unresolved issues and opportunities, is provided to give research directions to researchers with similar interests.

    Adaptive Airborne Separation to Enable UAM Autonomy in Mixed Airspace

    The excitement and promise generated by Urban Air Mobility (UAM) concepts have inspired both new entrants and large aerospace companies throughout the world to invest hundreds of millions in research and development of air vehicles, both piloted and unpiloted, to fulfill these dreams. The management and separation of all these new aircraft have received much less attention, however, and even though NASA's lead is advancing some promising concepts for Unmanned Aircraft Systems (UAS) Traffic Management (UTM), most operations today are limited to line of sight with the vehicle, airspace reservation and geofencing of individual flights. Various schemes have been proposed to control this new traffic, some modeled after conventional air traffic control and some proposing fully automatic management, either from a ground-based entity or carried out on board among the vehicles themselves. Previous work examined vehicle-based traffic management in the very low altitude airspace within a metroplex, called UTM airspace, in which piloted traffic is rare. A management scheme was proposed in that work that takes advantage of the homogeneous nature of the traffic operating in UTM airspace. This paper expands that concept to include a traffic management plan usable at all altitudes desired for electric Vertical Takeoff and Landing urban and short-distance, inter-city transportation. The interactions with piloted aircraft operating under both visual and instrument flight rules are analyzed, and the role of Air Traffic Control services in the postulated mixed-traffic environment is covered. Separation values that adapt to each type of traffic encounter are proposed, and the relationship between required airborne surveillance range and closure speed is given. Finally, realistic scenarios are presented illustrating how this concept can reliably handle the density and traffic mix that fully implemented and successful UAM operations would entail.
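
    The surveillance-range/closure-speed relationship mentioned above is, in its simplest form, linear: an intruder must be detected far enough out that the full alert-plus-maneuver time elapses before the separation minimum is reached. The function and numbers below are illustrative placeholders, not the paper's values.

```python
def required_surveillance_range(closure_speed, alert_time=30.0,
                                separation_min=600.0):
    """Minimum detection range (m) for a given closure speed (m/s), assuming
    a fixed alert-plus-maneuver time (s) and a separation minimum (m).
    All parameter values are illustrative placeholders."""
    return closure_speed * alert_time + separation_min
```

    The linear dependence on closure speed is why a head-on encounter between two fast vehicles demands several times the surveillance range of an overtaking encounter between similar slow vehicles.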

    A Vision-Based Automatic Safe Landing-Site Detection System

    An automatic safe landing-site detection system is proposed for aircraft emergency landing, based on visible information acquired by aircraft-mounted cameras. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the unpowered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing-site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing-site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on external environmental factors that can impair human vision, and on the pilot's flight experience, which can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient detection system that is expected to alleviate the negative impact of these factors. In this study, we focus on the detection mechanism of the proposed system and assume that image enhancement for increased visibility and image stitching for a larger field-of-view have already been performed on the terrain images acquired by aircraft-mounted cameras. Specifically, we first propose a hierarchical elastic horizon detection algorithm to identify the ground in the image. Then the terrain image is divided into non-overlapping blocks, which are clustered according to a roughness measure. Adjacent smooth blocks are merged to form potential landing-sites, whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of a candidate region exceed the minimum requirement for safe landing, the potential landing-site is considered a safe candidate and highlighted on the human-machine interface. In the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction.
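
    The block-roughness stage can be sketched as follows, using per-block grey-level standard deviation as the roughness measure and a 4-connected merge of adjacent smooth blocks. Block size and threshold are illustrative, and the paper's PCA-based dimension check is omitted.

```python
import numpy as np

def find_smooth_regions(img, block=16, rough_thresh=8.0):
    """Divide a grayscale image into non-overlapping blocks, flag blocks
    whose intensity std-dev is below rough_thresh as smooth, and merge
    4-connected smooth blocks. Returns regions as sets of (row, col)."""
    h, w = img.shape
    rows, cols = h // block, w // block
    smooth = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            patch = img[r*block:(r+1)*block, c*block:(c+1)*block]
            smooth[r, c] = patch.std() < rough_thresh
    # merge adjacent smooth blocks with a depth-first flood fill
    seen = np.zeros_like(smooth)
    regions = []
    for r in range(rows):
        for c in range(cols):
            if smooth[r, c] and not seen[r, c]:
                stack, region = [(r, c)], set()
                seen[r, c] = True
                while stack:
                    rr, cc = stack.pop()
                    region.add((rr, cc))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = rr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and smooth[nr, nc] and not seen[nr, nc]):
                            seen[nr, nc] = True
                            stack.append((nr, nc))
                regions.append(region)
    return regions
```

    Each returned region would then be screened for minimum landing dimensions before being offered to the pilot as a candidate.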

    An Open Source, Autonomous, Vision-Based Algorithm for Hazard Detection and Avoidance for Celestial Body Landing

    Planetary exploration is one of the main goals humankind has established as a must for space exploration, in order to be prepared for colonizing new places and to provide scientific data for a better understanding of the formation of our solar system. To provide a safe approach, several safety measures must be undertaken to guarantee not only the success of the mission but also the safety of the crew. One of these safety measures is the autonomous Hazard Detection and Avoidance (HDA) sub-system for celestial body landers, which will enable different spacecraft to complete solar system exploration. The main objective of the HDA sub-system is to assemble a map of the local terrain during the descent of the spacecraft so that a safe landing site can be marked down. This thesis focuses on a passive method using a monocular camera as its primary detection sensor, chosen for its form factor and weight, which enable its implementation alongside the proposed HDA algorithm in the Intuitive Machines lunar lander NOVA-C as part of the Commercial Lunar Payload Services technological demonstration in 2021 for the NASA Artemis program to take humans back to the Moon. The algorithm draws on two different sources for making decisions: a two-dimensional (2D) vision-based HDA map and a three-dimensional (3D) HDA map obtained through a Structure from Motion process in combination with a plane-fitting sequence. These two maps provide different metrics that give the lander a better probability of performing a safe touchdown. The metrics are processed to optimize a cost function.
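
    A plane-fitting check of the kind used in the 3D branch can be sketched with a least-squares fit over a reconstructed terrain patch: the fitted plane's normal gives the local slope, and the fit residual gives the local roughness, the two hazard metrics typically thresholded when marking a safe site. Function names and threshold values below are ours, not the thesis code.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to an Nx3 point cloud.
    Returns (unit_normal, slope_deg, rms_residual)."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    a, b, _ = coef
    normal = np.array([-a, -b, 1.0])
    normal /= np.linalg.norm(normal)
    slope_deg = np.degrees(np.arccos(normal[2]))            # tilt from vertical
    rms = np.sqrt(np.mean((A @ coef - points[:, 2]) ** 2))  # local roughness
    return normal, slope_deg, rms

def is_safe(points, max_slope_deg=10.0, max_rms=0.05):
    """Flag a terrain patch as a safe-landing candidate (thresholds assumed)."""
    _, slope, rms = fit_plane(points)
    return slope <= max_slope_deg and rms <= max_rms
```

    Running this over every cell of the descent-time terrain map yields a binary hazard map, which can then feed the cost function alongside the 2D vision-based map.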