
    Collaborating with Autonomous Agents

    With the anticipated increase of small unmanned aircraft systems (sUAS) entering the National Airspace System, it is highly likely that vehicle operators will be teaming with fleets of small autonomous vehicles. These may consist of sUAS, which weigh 55 pounds or less and typically fly at altitudes of 400 feet and below, and small ground vehicles typically operating in buildings or on defined small campuses. Typically, vehicle operators are not concerned with manual control of the vehicle; instead, they are concerned with the overall mission. For this vision of high-level mission operators working with fleets of vehicles to come to fruition, many human factors challenges must be investigated and solved. First, the interface between the human operator and the autonomous agent must operate at a level that meets the operator's needs and that the agents can understand. This paper details the natural language human factors efforts that NASA Langley's Autonomy Incubator is focusing on. In particular, these efforts focus on allowing the operator to interact with the system using speech and gestures rather than a mouse and keyboard. With the system able to understand both speech and gestures, operators unfamiliar with the vehicle dynamics will be able to easily plan, initiate, and change missions using a language familiar to them, rather than having to learn and converse in the vehicle's language. This will foster better teaming between the operator and the autonomous agent, which will help lower workload, increase situation awareness, and improve performance of the system as a whole.
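    The kind of mission-level speech interface the abstract describes can be illustrated with a minimal sketch. This is not NASA Langley's actual system; the grammar, command names, and waypoint labels below are illustrative assumptions about how a transcribed utterance might be mapped to a structured mission action.

```python
# Illustrative sketch (NOT NASA Langley's actual system): map a
# transcribed operator utterance to a structured mission-level command,
# so the operator states intent rather than vehicle-level control inputs.
import re

def parse_command(utterance: str) -> dict:
    """Map a transcribed utterance to an {action, target} mission command."""
    text = utterance.lower().strip()
    # Hypothetical command grammar: "<verb> ... <named waypoint>"
    patterns = {
        "survey": r"survey (?:the )?(\w+)",
        "goto":   r"(?:fly|go) to (?:the )?(\w+)",
        "return": r"return (?:to )?(home|base)",
    }
    for action, pattern in patterns.items():
        match = re.search(pattern, text)
        if match:
            return {"action": action, "target": match.group(1)}
    return {"action": "unknown", "target": None}

print(parse_command("Fly to the greenhouse"))  # {'action': 'goto', 'target': 'greenhouse'}
print(parse_command("return to base"))         # {'action': 'return', 'target': 'base'}
```

    A real system would replace the regular expressions with a speech recognizer and a natural language understanding stage, and would fuse the result with gesture input, but the operator-facing contract is the same: intent in, mission command out.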

    Transfer Learning-Based Crack Detection by Autonomous UAVs

    Unmanned Aerial Vehicles (UAVs) have recently shown great performance in collecting visual data through autonomous exploration and mapping for building inspection. Yet few studies consider the post-processing of these data and their integration with autonomous UAVs, both of which are essential steps toward fully automated building inspection. In this regard, this work presents a decision-making tool for revisiting tasks in visual building inspection by autonomous UAVs. The tool is an implementation of fine-tuning a pretrained Convolutional Neural Network (CNN) for surface crack detection. It offers an optional mechanism for task planning of revisiting pinpoint locations during inspection. It is integrated into a quadrotor UAV system that can autonomously navigate in GPS-denied environments. The UAV is equipped with onboard sensors and computers for autonomous localization, mapping and motion planning. The integrated system is tested through simulations and real-world experiments. The results show that the system achieves crack detection and autonomous navigation in GPS-denied environments for building inspection.
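    The revisit-planning idea can be sketched as a simple thresholding rule on classifier confidence. This is a hedged illustration, not the paper's implementation: the CNN is stubbed out, and the function names and threshold are assumptions.

```python
# Hedged sketch of the revisit mechanism: a crack classifier (stubbed
# here; the paper fine-tunes a pretrained CNN) scores inspection images,
# and locations whose score exceeds a threshold are queued as revisit
# tasks for the planner. Names and the threshold value are illustrative.

CRACK_THRESHOLD = 0.8  # assumed confidence cutoff for scheduling a revisit

def crack_score(observation: dict) -> float:
    """Stand-in for CNN inference; returns an estimated P(crack)."""
    # In the real system this would run the fine-tuned CNN on the image.
    return observation["score"]

def plan_revisits(inspected: list) -> list:
    """Return waypoints to revisit, highest-confidence cracks first."""
    flagged = [(crack_score(obs), obs["waypoint"]) for obs in inspected]
    flagged = [(s, wp) for s, wp in flagged if s >= CRACK_THRESHOLD]
    return [wp for s, wp in sorted(flagged, reverse=True)]

observations = [
    {"waypoint": (2.0, 1.0, 3.5), "score": 0.95},
    {"waypoint": (4.0, 1.0, 3.5), "score": 0.30},
    {"waypoint": (6.0, 1.0, 3.5), "score": 0.85},
]
print(plan_revisits(observations))  # [(2.0, 1.0, 3.5), (6.0, 1.0, 3.5)]
```

    Sorting by confidence lets the motion planner visit the most suspect locations first when inspection time is limited.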

    MusA: Using Indoor Positioning and Navigation to Enhance Cultural Experiences in a Museum

    In recent years there has been a growing interest in the use of multimedia mobile guides in museum environments. Mobile devices can detect the user's context and provide information that helps visitors discover and follow the logical and emotional connections that develop during the visit. In this scenario, location-based services (LBS) represent an asset, and the choice of the technology used to determine users' positions, combined with the definition of methods that can effectively convey information, becomes a key issue in the design process. In this work, we present MusA (Museum Assistant), a general framework for the development of multimedia interactive guides for mobile devices. Its main feature is a vision-based indoor positioning system that enables several LBS, from way-finding to the contextualized communication of cultural contents, aimed at providing a meaningful exploration of exhibits according to visitors' personal interests and curiosity. Starting from a thorough description of the system architecture, the article presents the implementation of two mobile guides, developed to address adults and children respectively, and discusses the evaluation of the user experience and the visitors' appreciation of these applications.

    Fleets of robots for environmentally-safe pest control in agriculture

    Feeding the growing global population requires an annual increase in food production. This requirement suggests an increase in the use of pesticides, which represents an unsustainable chemical load for the environment. To reduce pesticide input and preserve the environment while maintaining the necessary level of food production, the efficiency of the relevant processes must be drastically improved. Within this context, this research strove to design, develop, test and assess a new generation of automatic and robotic systems for effective weed and pest control, aimed at diminishing the use of agricultural chemical inputs, increasing crop quality and improving the health and safety of production operators. To achieve this overall objective, a fleet of heterogeneous ground and aerial robots was developed and equipped with innovative sensors, enhanced end-effectors and improved decision-control algorithms to cover a large variety of agricultural situations. This article describes the scientific and technical objectives, challenges and outcomes achieved in three common crops.

    Close Formation Flight Missions Using Vision-Based Position Detection System

    In this thesis, a formation flight architecture is described, along with the implementation and evaluation of a state-of-the-art vision-based algorithm for estimating and tracking a leader vehicle within a close-formation configuration. A vision-based algorithm that uses the Darknet architecture and a formation flight control law that tracks and follows a leader at desired forward and lateral clearances are developed and implemented. The architecture runs on a flight computer that handles the process in real time while integrating navigation sensors and a stereo camera. Numerical simulations, along with indoor and outdoor flight tests, demonstrate the detection and tracking capabilities, providing a low-cost, compact, lightweight solution to the problem of estimating the location of other cooperative or non-cooperative flying vehicles within a formation architecture.
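    The formation-keeping step can be illustrated with a simple proportional law on the vision-estimated relative position. This is an assumption for illustration, not the thesis's actual control law; the gains and frame conventions below are made up.

```python
# Illustrative proportional formation-keeping law (an assumed stand-in;
# the thesis's actual controller is not reproduced here): given the
# vision-estimated leader position relative to the follower, command
# body-frame velocities that drive the forward and lateral clearances
# to their desired values.

K_FWD, K_LAT = 0.5, 0.5  # assumed proportional gains

def formation_command(rel_fwd, rel_lat, des_fwd, des_lat):
    """Velocity commands from relative-position error (body frame, m -> m/s)."""
    err_fwd = rel_fwd - des_fwd  # + means the leader is farther ahead than desired
    err_lat = rel_lat - des_lat  # + means the leader is farther right than desired
    return K_FWD * err_fwd, K_LAT * err_lat

# Leader estimated 12 m ahead and 1 m right; desired clearance 10 m, 0 m:
vx, vy = formation_command(12.0, 1.0, 10.0, 0.0)
print(vx, vy)  # 1.0 0.5 -> speed up and sidestep toward the desired offset
```

    In practice the relative position would come from the stereo camera and the Darknet-based detector, filtered against the navigation sensors, and the commands would feed the vehicle's inner-loop autopilot.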

    Use of Unmanned Aerial Systems in Civil Applications

    Interest in drones has grown exponentially in the last ten years, and these machines are often presented as the optimal solution for a huge number of civil applications (monitoring, agriculture, emergency management, etc.). However, the promises still do not match the data coming from the consumer market, suggesting that the only big field in which the use of small unmanned aerial vehicles is actually profitable is video making. This may be explained partly by the strict limits imposed by existing (and often obsolete) national regulations, but also, and perhaps mainly, by the lack of real autonomy. The vast majority of vehicles on the market today are in fact autonomous only in the sense that they can follow a predetermined list of latitude-longitude-altitude coordinates. The aim of this thesis is to demonstrate that complete autonomy for UAVs can be achieved only with high-performance control, reliable and flexible planning platforms, and strong perception capabilities. These topics are introduced and discussed by presenting the results of the main research activities performed by the candidate in the last three years, which have resulted in 1) the design, integration and control of a test bed for validating and benchmarking vision-based algorithms for space applications; 2) the implementation of a cloud-based platform for multi-agent mission planning; and 3) the on-board use of a multi-sensor fusion framework based on an Extended Kalman Filter architecture.
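    The sensor-fusion building block mentioned at the end can be sketched in its simplest form: a one-dimensional Kalman filter, the linear special case of the Extended Kalman Filter architecture. The thesis's on-board filter fuses a full state vector from multiple sensors; the scalar version below only illustrates the predict/update cycle.

```python
# Minimal 1-D Kalman filter illustration (the linear special case of the
# EKF architecture the thesis describes; the real on-board filter has a
# full state vector and multiple sensors). Noise values are assumed.

def kf_step(x, P, u, z, q=0.01, r=1.0):
    """One predict/update cycle for position x with velocity input u."""
    # Predict: propagate the state with the motion model, inflate covariance.
    x_pred = x + u   # x_k = x_{k-1} + u (unit timestep)
    P_pred = P + q   # add process noise q
    # Update: blend prediction with the measurement z via the Kalman gain.
    K = P_pred / (P_pred + r)        # r is the measurement noise variance
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
for z, u in [(1.1, 1.0), (2.0, 1.0), (2.9, 1.0)]:  # noisy position fixes
    x, P = kf_step(x, P, u, z)
print(x, P)  # estimate converges toward the true position (about 3)
```

    The EKF generalizes this by linearizing nonlinear motion and measurement models at each step, which is what makes it suitable for fusing heterogeneous sensors on a UAV.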

    The Design Fabrication and Flight Testing of an Academic Research Platform for High Resolution Terrain Imaging

    This thesis addresses the design, construction, and flight testing of an Unmanned Aircraft System (UAS) created to serve as a testbed for Intelligence, Surveillance, and Reconnaissance (ISR) research topics that require the rapid acquisition and processing of high-resolution aerial imagery and are to be performed by academic research institutions. An analysis of the requirements of various ISR research applications and the practical limitations of academic research yields a consolidated set of requirements by which the UAS is designed. An iterative design process is used to transition from these requirements through cycles of component selection, systems integration, flight tests, diagnostics, and subsystem redesign. The resulting UAS is designed as an academic research platform to support a variety of ISR research applications, ranging from human-machine interaction with UAS technology to orthorectified mosaic imaging. The lessons learned are provided to enable future researchers to create similar systems.

    Unmanned Aerial Systems Research, Development, Education and Training at Embry-Riddle Aeronautical University

    With technological breakthroughs in miniaturized aircraft-related components, including but not limited to communications, computer systems and sensors, state-of-the-art unmanned aerial systems (UAS) have become a reality. This fast-growing industry is anticipating and responding to a myriad of societal applications that will provide either new or more cost-effective solutions that previous technologies could not, or will replace activities that involved humans in flight with the associated risks. Embry-Riddle Aeronautical University has a long history of aviation-related research and education, and is heavily engaged in UAS activities. This document provides a summary of these activities and is divided into two parts. The first part provides a brief summary of each of the various activities, while the second part lists the faculty associated with those activities. Within the first part we have separated the UAS activities into two broad areas, Engineering and Applications; each of these is further broken down into six sub-areas, which are listed in the Table of Contents. The second part lists the faculty associated with the UAS activities, sorted by campus (Daytona Beach: D, Prescott: P, Worldwide: W). The UAS activities and the corresponding faculty are cross-referenced. We have chosen to provide very short summaries of the UAS activities rather than lengthy descriptions. Should more information be desired, please contact me directly or visit our research web pages (http://research.erau.edu) and contact the appropriate faculty member directly.