
    Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

    For the last few decades, growing interest has returned to the quite challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires the development of new means of ensuring safe descent with stringent final conditions and aerospace-related constraints in terms of mass, cost and computational resources. In this paper, a two-phase approach is presented: first, a biomimetic method inspired by the neuronal and sensory systems of flying insects is presented as a solution for performing safe lunar landing. In order to design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors allow us to accurately estimate the orientation of the velocity vector, which is mandatory to control the lander's pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects' visual systems, performing local angular 1-D speed measurements ranging from 1.5°/s to 25°/s and weighing only 2.8 g, is presented. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that the optic flow measured, despite the complex disturbances encountered, closely matched the ground-truth optic flow.
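The two-sensor estimation step described above can be sketched under a flat-ground assumption: a sensor gazing at angle γ below the horizon sees the ground at distance h/sin γ, so it measures an optic flow ω(γ) = (V/h)·sin γ·sin(γ − θ), where θ is the flight-path angle. The ratio of two such measurements is independent of the unknown V/h and yields θ in closed form. The model, gaze angles, and function names below are illustrative assumptions, not taken from the paper.

```python
import math

def velocity_orientation(omega1, omega2, gamma1, gamma2):
    """Estimate the flight-path angle theta (velocity direction below the
    horizontal) from two optic-flow magnitudes measured at gaze angles
    gamma1, gamma2 below the horizon, assuming flat ground.
    Model: omega(gamma) = (V/h) * sin(gamma) * sin(gamma - theta); the
    ratio omega1/omega2 cancels the unknown V/h."""
    a1, c1 = math.sin(gamma1), math.cos(gamma1)
    a2, c2 = math.sin(gamma2), math.cos(gamma2)
    num = omega1 * a2**2 - omega2 * a1**2
    den = omega1 * a2 * c2 - omega2 * a1 * c1
    return math.atan2(num, den)

# Synthetic check: V/h = 0.1 s^-1, true flight-path angle 0.2 rad,
# one sensor looking straight down, one at 60 deg below the horizon.
theta, g1, g2 = 0.2, math.pi / 2, math.pi / 3
w1 = 0.1 * math.sin(g1) * math.sin(g1 - theta)
w2 = 0.1 * math.sin(g2) * math.sin(g2 - theta)
print(round(velocity_orientation(w1, w2, g1, g2), 6))  # → 0.2
```

The closed form follows from expanding sin(γ − θ) and eliminating V/h, which is why no altitude or velocity measurement is needed.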

    Bio-inspired Landing Approaches and Their Potential Use On Extraterrestrial Bodies

    Automatic landing on extraterrestrial bodies is still a challenging and hazardous task. Here we propose a new type of autopilot designed to solve landing problems, which is based on neurophysiological, behavioral, and biorobotic findings on flying insects. Flying insects excel in optic flow sensing techniques and cope with highly parallel data at a low energy and computational cost using lightweight dedicated motion processing circuits. In the first part of this paper, we present our biomimetic approach in the context of a lunar landing scenario, assuming a 2-degree-of-freedom spacecraft approaching the moon, which is simulated with the PANGU software. The autopilot we propose relies only on optic flow (OF) and inertial measurements, and aims at regulating the OF generated during the landing approach, by means of a feedback control system whose sensor is an OF sensor. We put forward an estimation method based on a two-sensor setup to accurately estimate the orientation of the lander's velocity vector, which is mandatory to control the lander's pitch in a near-optimal way with respect to fuel consumption. In the second part, we present a lightweight Visual Motion Sensor (VMS) which draws on the results of neurophysiological studies on the insect visual system. The VMS was able to perform local 1-D angular speed measurements in the range 1.5°/s to 25°/s. The sensor was mounted on an 80 kg unmanned helicopter and test-flown outdoors over various fields. The OF measured onboard was shown to match the ground-truth optic flow despite the dramatic disturbances and vibrations experienced by the sensor.
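The OF-regulation principle behind this family of autopilots can be illustrated with a purely kinematic sketch: if an inner velocity loop (assumed here, not described in the abstract) tracks v = ω_ref·h, the ventral optic flow v/h stays constant, altitude decays exponentially, and touchdown speed tends to zero. Parameter values and function names are illustrative.

```python
def simulate_of_landing(h0, omega_ref=0.3, dt=0.01):
    """Kinematic sketch of constant-optic-flow descent: an assumed inner
    velocity loop tracks v_cmd = omega_ref * h, which keeps omega = v/h
    constant. Altitude then decays exponentially, so descent speed
    shrinks in proportion to altitude and touchdown is soft."""
    h, v, t = h0, 0.0, 0.0
    while h > 0.1:                 # stop just above the surface
        v = omega_ref * h          # descent speed that holds omega = v/h
        h -= v * dt
        t += dt
    return v, t

v_td, t_td = simulate_of_landing(100.0)
print(round(v_td, 3))  # → 0.03  (touchdown speed ≈ omega_ref * h_final)
```

This is why OF regulation is attractive for landing: the controller needs no explicit altitude or velocity measurement, only the ratio v/h that the sensor provides directly.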

    Towards an autonomous landing system in presence of uncertain obstacles in indoor environments

    The landing task is fundamental to micro air vehicles (MAVs) when attempting to land in an unpredictable environment (e.g., in the presence of static or moving obstacles). The MAV should immediately sense the environment through its sensors and decide its actions for landing. This paper addresses the problem of the autonomous landing approach of a commercial AR.Drone 2.0 in the presence of uncertain obstacles in an indoor environment. A localization methodology to estimate the drone's pose, based on sensor fusion techniques fusing IMU and Poxyz signals, is proposed. In addition, a vision-based approach to detect and estimate the velocity and position of a moving obstacle in the drone's working environment is presented. To control the drone landing accurately, a cascade control based on an Accelerated Particle Swarm Optimization (APSO) algorithm is designed. The simulation and experimental results demonstrate that the obtained model is appropriate for the measured data.
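The IMU-plus-position-beacon fusion described above is commonly realized with a Kalman filter; the following 1-D predict/update cycle is a minimal sketch of that idea, not the paper's actual filter, and the measurement model for the Poxyz-style position fix is an assumption.

```python
import numpy as np

def kf_step(x, P, a_meas, z_pos, dt, q=0.05, r=0.04):
    """One cycle of a 1-D Kalman filter on state x = [position, velocity]:
    predict with the IMU acceleration a_meas, then correct with an
    external position fix z_pos (e.g. an indoor beacon system).
    q scales process noise, r is the position-measurement variance."""
    # Predict: constant-acceleration kinematics over dt
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q
    # Update: position-only measurement
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r
    K = (P @ H.T) / S                       # Kalman gain, shape (2, 1)
    x = x + (K * (z_pos - x[0])).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a target moving at 1 m/s with zero measured acceleration:
x, P = np.array([0.0, 0.0]), np.eye(2)
for k in range(1, 201):
    x, P = kf_step(x, P, a_meas=0.0, z_pos=k * 0.1, dt=0.1)
print(round(x[0], 2), round(x[1], 2))  # converges toward position 20, velocity 1
```

In a full pipeline this per-axis filter would run on each of x, y, z, with the IMU driving the prediction between beacon fixes.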

    Incorporating Safety Excellence into Urban Air Mobility (UAM): Insights from Commercial Aviation, Rotorcraft, and Unmanned Aerial Systems (UAS)

    This paper focused on safety considerations in Urban Air Mobility (UAM) through a cross-industry examination of commercial aviation, rotorcraft, and unmanned aerial systems (UAS). Although UAM promises transformative benefits, safety concerns remain. Based on the Federal Aviation Administration (FAA)'s Concept of Operations (ConOps), the literature review explained the fundamental concepts of UAM. In commercial aviation, the regulatory framework, pilot training and certification, vehicle design and maintenance, and emergency response planning are emphasized. For rotorcraft, safety requirements for vertical flight, collision avoidance systems, heliport standards, and weather adaptability are crucial. Leveraging UAS advancements, the study suggested autonomous systems, sense-and-avoid technology, and remote piloting for enhanced safety in the UAM sector.

    Surveillance Mission Planning for UAVs in GPS-Denied Urban Environment

    Ph.D. thesis (Doctor of Philosophy)

    Comparative Study of Indoor Navigation Systems for Autonomous Flight

    Recently, Unmanned Aerial Vehicles (UAVs) have attracted society and researchers due to their capability to perform in economic, scientific and emergency scenarios, and are being employed in a large number of applications, especially in hostile environments. They can operate autonomously for both indoor and outdoor applications, mainly including search and rescue, manufacturing, forest fire tracking, remote sensing, etc. For both environments, precise localization plays a critical role in achieving high-performance flight and interacting with surrounding objects. However, in indoor areas with degraded or denied Global Navigation Satellite System (GNSS) coverage, it becomes challenging to control a UAV autonomously, especially where obstacles are unidentified. A large number of techniques using various technologies have been proposed to overcome these limitations. This paper provides a comparison of such existing solutions and technologies available for this purpose, with their strengths and limitations. Further, a summary of the current research status with unresolved issues and opportunities is provided that would give research directions to researchers with similar interests.

    Adaptive Airborne Separation to Enable UAM Autonomy in Mixed Airspace

    The excitement and promise generated by Urban Air Mobility (UAM) concepts have inspired both new entrants and large aerospace companies throughout the world to invest hundreds of millions in research and development of air vehicles, both piloted and unpiloted, to fulfill these dreams. The management and separation of all these new aircraft have received much less attention, however, and even though NASA's lead is advancing some promising concepts for Unmanned Aircraft Systems (UAS) Traffic Management (UTM), most operations today are limited to line of sight with the vehicle, airspace reservation and geofencing of individual flights. Various schemes have been proposed to control this new traffic, some modeled after conventional air traffic control and some proposing fully automatic management, either from a ground-based entity or carried out on board among the vehicles themselves. Previous work has examined vehicle-based traffic management in the very low altitude airspace within a metroplex, called UTM airspace, in which piloted traffic is rare. A management scheme was proposed in that work that takes advantage of the homogeneous nature of the traffic operating in UTM airspace. This paper expands that concept to include a traffic management plan usable at all altitudes desired for electric Vertical Takeoff and Landing urban and short-distance, inter-city transportation. The interactions with piloted aircraft operating under both visual and instrument flight rules are analyzed, and the role of Air Traffic Control services in the postulated mixed traffic environment is covered. Separation values that adapt to each type of traffic encounter are proposed, and the relationship between required airborne surveillance range and closure speed is given. Finally, realistic scenarios are presented illustrating how this concept can reliably handle the density and traffic mix that fully implemented and successful UAM operations would entail.
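The abstract mentions a relationship between required airborne surveillance range and closure speed. One simple assumed form of such a relationship (not the paper's actual formula; the reaction time and separation minimum below are illustrative placeholders) is that the surveillance range must cover the distance closed during detection and maneuvering, plus the separation minimum itself:

```python
def required_surveillance_range(v_closure, t_react=15.0, sep_min=0.5 * 1852):
    """Minimum airborne surveillance range in meters: the ownship must
    detect a threat far enough out that, after t_react seconds of closure
    at v_closure m/s, the adaptive separation minimum sep_min (here an
    assumed 0.5 NM) is still preserved. Illustrative model only."""
    return v_closure * t_react + sep_min

# Example: 100 m/s closure speed (two fast eVTOLs head-on)
print(required_surveillance_range(100.0))  # → 2426.0 meters
```

The linear dependence on closure speed is why head-on encounters between fast vehicles dominate the surveillance-range requirement in mixed traffic.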

    A Vision-Based Automatic Safe landing-Site Detection System

    An automatic safe landing-site detection system is proposed for aircraft emergency landing, based on visible information acquired by aircraft-mounted cameras. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the unpowered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing-site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing-site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on the external environmental factors that can impair human vision, and on the pilot's flight experience, which can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient detection system that is expected to alleviate the negative impact of these factors. In this study, we focus on the detection mechanism of the proposed system and assume that the image enhancement for increased visibility and image stitching for a larger field-of-view have already been performed on terrain images acquired by aircraft-mounted cameras. Specifically, we first propose a hierarchical elastic horizon detection algorithm to identify ground in the image. Then the terrain image is divided into non-overlapping blocks which are clustered according to a roughness measure. Adjacent smooth blocks are merged to form potential landing-sites whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of a candidate region exceed the minimum requirement for safe landing, the potential landing-site is considered a safe candidate and highlighted on the human machine interface. At the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction, etc.
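Two of the steps above, roughness-based block classification and PCA-based dimension measurement, can be sketched as follows. The block size, variance threshold, roughness proxy, and extent scaling are illustrative assumptions; the paper's actual roughness measure and geometric transformations may differ.

```python
import numpy as np

def smooth_blocks(img, bs=8, thresh=25.0):
    """Split a grayscale image into non-overlapping bs x bs blocks and
    flag blocks whose intensity variance (a simple roughness proxy) is
    below thresh. Returns a boolean grid of smooth blocks."""
    h, w = img.shape
    mask = np.zeros((h // bs, w // bs), dtype=bool)
    for i in range(h // bs):
        for j in range(w // bs):
            block = img[i * bs:(i + 1) * bs, j * bs:(j + 1) * bs]
            mask[i, j] = block.var() < thresh
    return mask

def region_dimensions(ys, xs):
    """Approximate principal-axis extents of a merged candidate region
    via PCA on its pixel coordinates: eigenvalues of the coordinate
    covariance give variances along the principal axes; 4*sqrt(var) is
    a rough full-extent proxy for a near-uniform region."""
    pts = np.stack([ys, xs], axis=1).astype(float)
    pts -= pts.mean(axis=0)
    cov = pts.T @ pts / len(pts)
    evals = np.sort(np.clip(np.linalg.eigvalsh(cov), 0, None))[::-1]
    return 4.0 * np.sqrt(evals)   # [major extent, minor extent]
```

A candidate would be accepted when both returned extents exceed the minimum landing-strip dimensions for the aircraft type.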

    Visual Environment Assessment for Safe Autonomous Quadrotor Landing

    Autonomous identification and evaluation of safe landing zones are of paramount importance for ensuring the safety and effectiveness of aerial robots in the event of system failures, low battery, or the successful completion of specific tasks. In this paper, we present a novel approach for detection and assessment of potential landing sites for safe quadrotor landing. Our solution efficiently integrates 2D and 3D environmental information, eliminating the need for external aids such as GPS and computationally intensive elevation maps. The proposed pipeline combines semantic data derived from a Neural Network (NN), to extract environmental features, with geometric data obtained from a disparity map, to extract critical geometric attributes such as slope, flatness, and roughness. We define several cost metrics based on these attributes to evaluate the safety, stability, and suitability of regions in the environment and identify the most suitable landing area. Our approach runs in real-time on quadrotors equipped with limited computational capabilities. Experimental results conducted in diverse environments demonstrate that the proposed method can effectively assess and identify suitable landing areas, enabling the safe and autonomous landing of a quadrotor. (7 pages, 5 figures, 1 table; submitted to the IEEE International Conference on Robotics and Automation, ICRA.)
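The geometric cost metrics mentioned above (slope, flatness, roughness) can be sketched as a least-squares plane fit over a local elevation patch: the fitted plane's tilt gives the slope and the residual RMS gives the roughness. The weighting, units, and function names below are illustrative assumptions, not the paper's actual metrics.

```python
import numpy as np

def landing_cost(patch, w_slope=1.0, w_rough=1.0):
    """Score a local elevation patch (2-D array of heights): fit a plane
    z = a*x + b*y + c by least squares, take slope = ||(a, b)|| and
    roughness = residual RMS, and return their weighted sum.
    Lower cost means a flatter, smoother (safer) candidate region."""
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.stack([xx.ravel(), yy.ravel(), np.ones(h * w)], axis=1)
    coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
    slope = np.hypot(coef[0], coef[1])                      # rise per pixel
    rough = np.sqrt(np.mean((A @ coef - patch.ravel())**2)) # residual RMS
    return w_slope * slope + w_rough * rough

# A perfectly flat patch scores ~0; a tilted plane scores its slope.
print(round(landing_cost(np.zeros((8, 8))), 6))  # → 0.0
```

In a full pipeline this geometric score would be combined with the NN's semantic labels (e.g. penalizing water or people) before selecting the minimum-cost region.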