3 research outputs found

    A hardware-in-the-loop testing facility for unmanned aerial vehicle sensor suites and control algorithms

    In the past decade Unmanned Aerial Vehicles (UAVs) have rapidly grown into a major field of robotics in both industry and academia. Many well-established platforms have been developed, and the demand continues to grow. However, the UAVs utilized in industry are predominantly remotely piloted aircraft offering very limited levels of autonomy. In contrast, fully autonomous flight has been achieved in research, and the degree of autonomy continues to grow, with research now focusing on advanced tasks such as navigating cluttered terrain and formation flying. The gap between academia and industry is the robustness of control algorithms. Academic research often focuses on proof-of-concept demonstrations with little or no consideration of real-world concerns such as adverse weather or sensor integration. One of the goals of this thesis is to integrate real-world issues into the design process. A testing environment was designed and built that allows sensors and control algorithms to be tested against real obstacles and environmental conditions in a controlled, repeatable fashion. The use of this facility is demonstrated in the implementation of a safe-landing-zone algorithm for a robotic helicopter equipped with a laser scanner. Results from tests conducted in the testing facility are used to analyze results from flights in the field. Controlling the testing environment also provides a baseline for evaluating different control solutions. In the current research paradigm, it is difficult to determine which research questions have been solved because testing conditions vary from researcher to researcher. A common testing environment eliminates ambiguities and allows solutions to be characterized based on their performance in different terrains and environmental conditions. This thesis explores how flight tests can be conducted in the lab using the actual hardware and control algorithms.
The sensor package is attached to a 6 DOF gantry whose motion is governed by the dynamic model of the aircraft. To provide an expansive terrain over which the flight can be conducted, a scaled model of the environment was created. The feasibility of using a scaled environment is demonstrated with a common sensor package and control task: using computer vision to guide an autonomous helicopter. The effects of scaling are investigated, and the approach is validated by comparing results in the scaled model to actual flights. Finally, it is demonstrated how the facility can be used to investigate the effect of adverse conditions on control algorithm performance. The overarching philosophy of this work is that incorporating real-world concerns into the design process leads to more fully developed and robust solutions. Ph.D., Mechanical Engineering -- Drexel University, 201
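The hardware-in-the-loop arrangement described in this abstract, a real sensor closing the loop with a simulated vehicle whose state drives a 6 DOF gantry, can be sketched roughly as follows. The point-mass model, the function names, and the position-only gantry command are illustrative assumptions for exposition, not the thesis's actual implementation:

```python
import numpy as np

def step_vehicle_model(state, control, dt):
    """Integrate a simplified point-mass vehicle model one time step.
    state = [x, y, z, vx, vy, vz]; control = commanded acceleration.
    A stand-in for a full helicopter dynamic model."""
    pos, vel = state[:3], state[3:]
    new_vel = vel + np.asarray(control, dtype=float) * dt
    new_pos = pos + new_vel * dt
    return np.concatenate([new_pos, new_vel])

def hil_step(state, sensor_reading, controller, dt):
    """One hardware-in-the-loop cycle: the physical sensor's reading feeds
    the control algorithm, whose output advances the simulated vehicle;
    the resulting position is what the gantry would be commanded to track."""
    control = controller(sensor_reading)
    next_state = step_vehicle_model(state, control, dt)
    gantry_command = next_state[:3]  # gantry tracks the model's position
    return next_state, gantry_command

# Example cycle with a trivial controller commanding 1 m/s^2 along x.
state = np.zeros(6)
next_state, cmd = hil_step(state, None, lambda s: [1.0, 0.0, 0.0], dt=0.1)
```

Because the sensor is real but the vehicle is simulated, the same flight can be repeated under controlled terrain and weather conditions, which is the repeatability argument the abstract makes.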

    Mixed-reality for unmanned aerial vehicle operations in near earth environments

    Future applications will bring unmanned aerial vehicles (UAVs) to near Earth environments such as urban areas, causing a change in the way UAVs are currently operated. Of concern is that UAV accidents still occur at a much higher rate than the accident rate for commercial airliners. A number of these accidents can be attributed to a UAV pilot's low situation awareness (SA) due to the limitations of UAV operating interfaces. The main limitation is the physical separation between the vehicle and the pilot, which eliminates any motion and exteroceptive sensory feedback to the pilot. These limitations, on top of a small field of view from the onboard camera, result in low SA, making near Earth operations difficult and dangerous. Autonomy has been proposed as a solution for near Earth tasks, but state-of-the-art artificial intelligence still requires very structured and well-defined goals to allow safe autonomous operations. Therefore, there is a need to better train pilots to operate UAVs in near Earth environments and to augment their performance for increased safety and minimization of accidents. In this work, simulation software, motion platform technology, and UAV sensor suites were integrated to produce mixed-reality systems that address current limitations of UAV piloting interfaces. The mixed-reality definition is extended in this work to encompass not only the visual aspects but also a motion aspect. A training and evaluation system for UAV operations in near Earth environments was developed. Modifications were made to flight simulator software to recreate current UAV operating modalities (internal and external). The training and evaluation system has been combined with Drexel's Sensor Integrated Systems Test Rig (SISTR) to allow simulated missions while incorporating real-world environmental effects and UAV sensor hardware. To address the lack of motion feedback to a UAV pilot, a system was developed that integrates a motion simulator into UAV operations.
The system is designed such that during flight, the angular rate of a UAV is captured by an onboard inertial measurement unit (IMU) and is relayed to a pilot controlling the vehicle from inside the motion simulator. Efforts to further increase pilot SA led to the development of a mixed-reality chase view piloting interface. Chase view is similar to a view of being towed behind the aircraft. It combines real-world onboard camera images with a virtual representation of the vehicle and the surrounding operating environment. A series of UAV piloting experiments were performed using the training and evaluation systems described earlier. Subjects' behavioral performance while using the onboard camera view and the mixed-reality chase view interface during missions was analyzed. Subjects' cognitive workload during missions was also assessed using subjective measures such as the NASA Task Load Index and non-subjective brain activity measurements using a functional near-infrared spectroscopy (fNIR) system. Behavioral analysis showed that the chase view interface improved pilot performance in near Earth flights and increased their situational awareness. fNIR analysis showed that a subject's cognitive workload was significantly lower while using the chase view interface. Real-world flight tests were conducted in a near Earth environment with buildings and obstacles to evaluate the chase view interface with real-world data. The interface performed very well with real-world, real-time data in close-range scenarios. The mixed-reality approaches presented follow studies on human factors performance and cognitive loading. The resulting designs serve as test beds for studying UAV pilot performance, creating training programs, and developing tools to augment UAV operations and minimize UAV accidents during operations in near Earth environments. Ph.D., Mechanical Engineering -- Drexel University, 201
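The motion-feedback pipeline described above, IMU angular rates relayed from the vehicle to the motion simulator, implies some mapping from measured body rates to platform commands. A minimal sketch is given below; the attenuation gain, the rate limit, and the function name are invented for illustration, since the abstract does not specify how the relay scales or limits the motion:

```python
import numpy as np

def rates_to_platform_command(gyro_rates_dps, platform_limit_dps=30.0, gain=0.5):
    """Map UAV body angular rates (deg/s, roll/pitch/yaw) from the onboard
    IMU to motion-platform rate commands. The gain attenuates the motion cue
    and the clip respects the simulator's actuator limits; both values are
    illustrative assumptions, not parameters from the thesis."""
    cmd = gain * np.asarray(gyro_rates_dps, dtype=float)
    return np.clip(cmd, -platform_limit_dps, platform_limit_dps)

# A fast roll rate saturates at the platform limit; gentler rates pass
# through scaled by the gain.
cmd = rates_to_platform_command([100.0, -10.0, 0.0])
```

Real motion-cueing systems typically add washout filtering so the platform returns to neutral between maneuvers; that refinement is omitted here for brevity.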

    A multidisciplinary framework for mission effectiveness quantification and assessment of micro autonomous systems and technologies

    Micro Autonomous Systems and Technologies (MAST) is an Army Research Laboratory (ARL) sponsored project based on a consortium of revolutionary academic and industrial research institutions working together to develop new technologies in the field of microelectronics, autonomy, micromechanics and integration. The overarching goal of the MAST consortium is to develop autonomous, multifunctional, and collaborative ensembles of microsystems to enhance small unit tactical situational awareness in urban and complex terrain. Unmanned systems are used to obtain intelligence at the macro level, but there is no real-time intelligence asset at the squad level. MAST seeks to provide that asset. Consequently, multiple integrated MAST heterogeneous platforms (e.g. crawlers, flyers, etc.) working together synergistically as an ensemble shall be capable of autonomously performing a wide spectrum of operational functions based on the latest developments in micro-mechanics, micro-electronics, and power technologies to achieve the desired operational objectives. The design of such vehicles is, by nature, highly constrained in terms of size, weight and power. Technologists are trying to understand the impacts of developing state-of-the-art technologies on the MAST systems while the operators are trying to define strategies and tactics on how to use these systems. These two different perspectives create an integration gap. The operators understand the capabilities needed on the field of deployment but not necessarily the technologies, while the technologists understand the physics of the technologies but not necessarily how they will be deployed, utilized, and operated during a mission. This not only results in a major requirements disconnect, representing the difference of perspectives between soldiers and the researchers, but also demonstrates the lack of quantified means to assess the technology gap in terms of mission requirements. 
This necessitates the quantification and resolution of the requirements disconnect and technology gap leading to re-definitions of the requirements based on mission scenarios. A research plan, built on a technical approach based on the simultaneous application of decomposition and re-composition or 'Top-down' and 'Bottom-up' approaches, was used for development of a structured and traceable methodology. The developed methodology is implemented through an integrated framework consisting of various decision-making tools, modeling and simulation, and experimental data farming and validation. The major obstacles in the development of the presented framework stemmed from the fact that all MAST technologies are revolutionary in nature, with no available historical data, sizing and synthesis codes or reliable physics-based models. The inherently multidisciplinary, multi-objective and uncertain nature of MAST technologies makes it very difficult to map mission level objectives to measurable engineering metrics. It involves the optimization of multiple disciplines such as Aero, CS/CE, ME, EE, Biology, etc., and of multiple objectives such as mission performance, tactics, vehicle attributes, etc. Furthermore, the concept space is enormous with hundreds of billions of alternatives, and largely includes future technologies with low Technology Readiness Level (TRL) resulting in high uncertainty. The presented framework is a cyber-physical design and analysis suite that combines Warfighter mission needs and expert technologist knowledge with a set of design and optimization tools, models, and experiments in order to provide a quantitative measure of the requirements disconnect and technology gap mentioned above. This quantification provides the basis for re-definitions of the requirements that are realistic in nature and ensure mission success. The research presents the development of this methodology and framework to address the core research objectives. 
The developed framework was then implemented on two mission scenarios that are of interest to the MAST consortium and Army Research Laboratory, namely, Joppa Urban Dwelling and Black Hawk Down Interior Building Reconnaissance. Results demonstrate the framework’s validity and serve as proof of concept for bridging the requirements disconnect between the Warfighter and the technologists. Billions of alternative MAST vehicles, composed of current and future technologies, were modeled and simulated, as part of a swarm, to evaluate their mission performance. In-depth analyses of the experiments, conducted as part of the research, present quantitative technology gaps that need to be addressed by technologists for successful mission completion. Quantitative values for vehicle specifications and systems' Measures of Performance were determined for an acceptable level of performance for the given missions. The consolidated results were used for defining mission-based requirements of MAST systems.Ph.D
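The trade-space exploration this abstract describes, enumerating vehicle alternatives and checking them against mission-level requirements, can be illustrated at toy scale. Everything below is hypothetical: the component names, masses, power and energy figures, and the mass and endurance thresholds are invented to show the filtering pattern, not data from the MAST framework:

```python
from itertools import product

# Invented component catalog for a micro air vehicle trade study.
motors = [{"name": "A", "mass_g": 20, "power_w": 15},
          {"name": "B", "mass_g": 35, "power_w": 25}]
batteries = [{"name": "S", "mass_g": 40, "energy_wh": 5},
             {"name": "L", "mass_g": 80, "energy_wh": 12}]

def feasible(motor, battery, max_mass_g=120, min_endurance_min=20):
    """Keep only designs meeting illustrative SWaP and mission constraints:
    total mass under a cap and endurance (battery energy / motor power)
    above a mission-driven minimum."""
    mass = motor["mass_g"] + battery["mass_g"]
    endurance_min = 60.0 * battery["energy_wh"] / motor["power_w"]
    return mass <= max_mass_g and endurance_min >= min_endurance_min

candidates = [(m["name"], b["name"])
              for m, b in product(motors, batteries)
              if feasible(m, b)]
```

The real framework sweeps billions of such alternatives with modeling-and-simulation tools and maps the surviving designs back to mission measures of performance; this sketch only shows the enumerate-and-filter skeleton of that idea.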