3 research outputs found

    Remote Pilot Situational Awareness with Augmented Reality Glasses: An Observational Field Study

    Get PDF
    With the use of small unmanned aerial systems (sUAS) proliferating throughout industry and public safety, it is imperative to ensure safety of flight. The Federal Aviation Administration (FAA) published regulations for commercial sUAS operations in 2016, including the requirement to maintain visual line of sight with the aircraft at all times. However, because of the nature of sUAS ground control stations (GCS), remote pilots time-share between observing the aircraft and interacting with the GCS display. This time-sharing is similar to the cross-check a pilot performs when flying a manned aircraft. While manned-aircraft designers have invested in ergonomics and an understanding of the cognitive process to optimize situational awareness, this has not been a design requirement for sUAS. As a result, the remote pilot must change head orientation and eye focus, and lose the aircraft from peripheral vision, during the cross-check between the GCS and the aircraft. This, coupled with the limited field of view of the camera feed displayed on the GCS, leads to loss of situational awareness through task saturation and misprioritization. Mixed reality, virtual reality, and augmented reality devices are being adopted in the gaming and technical worlds, and applying them to the sUAS GCS could mitigate some of this degradation. Specifically, augmented reality devices, in which a synthetic display is overlaid on the real world, allow the remote pilot to observe the aircraft, manipulate the camera, and interact with the GCS without changing head position. This participant observational study evaluated the difference in the remote pilot's cross-check when flying with a typical GCS display versus an augmented reality headset in a field setting. The results indicate a significant difference: with augmented reality glasses, pilots kept the aircraft in their field of view 56.7% of the time, compared with 20.5% without the glasses.

    Development of an Indoor Positioning System Using Wireless Signals and Augmented Reality

    Get PDF
    Indoor positioning systems address the task of finding the exact location of a user (or a device) in an indoor space, using technologies other than GPS. These systems have attracted considerable interest because of the many scenarios in which they apply. Many technologies have been studied for building such systems, along with various techniques for processing the data they produce in order to maximize accuracy.
    In this thesis we study and evaluate the technologies used to create indoor positioning systems and develop a system of our own. We first examined the novel technologies of object detection via machine-learning models and augmented reality on the Android operating system. We ultimately combined augmented reality with WiFi and radio-frequency signals using the fingerprinting technique. Our system maps the signals of a space at specific points and places augmented reality objects in the space; these signals, combined with the distances to the augmented objects, are then used to determine a user's position. The system was tested thoroughly and compared against other systems. We concluded that, when tuned, it achieves very good accuracy and could be used as a commercial application if a certain restriction is lifted.
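The fingerprinting technique described above has a well-known two-phase structure: an offline phase that records signal-strength vectors at known positions, and an online phase that matches a live measurement against them. The following is a minimal sketch of that idea, not the thesis's actual implementation; the access-point names, RSSI values, and coordinates are invented, and a weighted k-nearest-neighbor match stands in for whatever matching the thesis uses:

```python
# Minimal RSSI fingerprinting sketch. All AP names, RSSI values (dBm),
# and (x, y) coordinates below are illustrative placeholders.
import math

# Offline phase: RSSI vectors recorded at known (x, y) positions.
fingerprints = {
    (0.0, 0.0): {"ap1": -40, "ap2": -70, "ap3": -60},
    (5.0, 0.0): {"ap1": -65, "ap2": -45, "ap3": -55},
    (0.0, 5.0): {"ap1": -55, "ap2": -75, "ap3": -42},
}

def distance(sample, reference, missing=-100):
    """Euclidean distance in signal space; unseen APs default to a floor value."""
    aps = set(sample) | set(reference)
    return math.sqrt(sum((sample.get(a, missing) - reference.get(a, missing)) ** 2
                         for a in aps))

def locate(sample, k=2):
    """Online phase: weighted k-nearest-neighbor position estimate."""
    ranked = sorted(fingerprints, key=lambda p: distance(sample, fingerprints[p]))[:k]
    weights = [1.0 / (distance(sample, fingerprints[p]) + 1e-6) for p in ranked]
    total = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, ranked)) / total
    y = sum(w * p[1] for w, p in zip(weights, ranked)) / total
    return (x, y)

# A reading taken near (0, 0) should resolve close to that reference point.
print(locate({"ap1": -42, "ap2": -68, "ap3": -61}))
```

In a real system the thesis additionally fuses distances to augmented reality anchor objects, which this sketch omits.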

    Dynamic Coverage Control and Estimation in Collaborative Networks of Human-Aerial/Space Co-Robots

    Full text link
    In this dissertation, the author presents a set of control, estimation, and decision-making strategies that enable small unmanned aircraft systems and free-flying space robots to act as intelligent mobile wireless sensor networks. These agents are primarily tasked with gathering information from their environments in order to increase the situational awareness of both the network and its human collaborators. Information is gathered through an abstract sensing model, a forward-facing anisotropic spherical sector, which can be generalized to various sensing models by adjusting its tuning parameters.
    First, a hybrid control strategy is derived whereby a team of unmanned aerial vehicles can dynamically cover a designated airspace, i.e., sweep their sensing footprints through all points of a domain over time. Because these vehicles have finite power resources, an agent deployment and scheduling protocol is proposed that allows agents to return periodically to a charging station while covering the environment. Rules are also prescribed for energy-aware domain partitioning and agent waypoint selection so as to distribute the coverage load across the network, with increased priority on agents whose remaining power supply is larger. This work is extended to the coverage of 2D manifolds embedded in 3D space that are subject to collision by stochastic intruders, with formal guarantees on collision avoidance, timely convergence upon charging stations, and timely interception of intruders by friendly agents. This chapter concludes with a case study in which a human acts as a dynamic coverage supervisor, using hand gestures to direct which regions the robot should survey.
    Second, the concept of situational awareness is extended to networks of humans working in close proximity with aerial or space robots.
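The dynamic coverage notion from the first part above, sweeping a sensing footprint until every point of the domain has accrued enough coverage, can be sketched with simple bookkeeping. This is an illustration only, not the dissertation's hybrid controller: the grid size, gain, and threshold are invented, an isotropic disk stands in for the anisotropic spherical-sector sensing model, and a lawnmower sweep stands in for the energy-aware waypoint selection:

```python
# Toy dynamic-coverage bookkeeping on a 2D grid. The disk footprint,
# coverage gain, and lawnmower path are illustrative stand-ins for the
# dissertation's anisotropic sensor model and hybrid control strategy.
import math

nx, ny = 10, 10
radius = 1.5            # sensing footprint radius (isotropic disk stand-in)
gain, C_star = 1.0, 1.0  # coverage accrued per step, and the target level
coverage = [[0.0] * ny for _ in range(nx)]

def sense(px, py):
    """One sensing step: cells inside the footprint accrue coverage."""
    for i in range(nx):
        for j in range(ny):
            if math.hypot(i - px, j - py) <= radius:
                coverage[i][j] = min(C_star, coverage[i][j] + gain)

# Lawnmower sweep: visit every cell once, alternating row direction.
for row in range(nx):
    cols = range(ny) if row % 2 == 0 else range(ny - 1, -1, -1)
    for col in cols:
        sense(row, col)

# The domain is covered once every cell has reached the target level.
print(all(c >= C_star for r in coverage for c in r))
```

The dissertation's contribution lies in doing this with finite-power agents, charging schedules, and formal guarantees, none of which this sketch attempts.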
    In this work, the robot acts as an assistant to a human attempting to complete a set of interdependent and spatially separated multitasking objectives. The human wears an augmented reality display, and the robot must learn the human's task locations online and broadcast camera views of those tasks to the human. Task locations are learned using a parallel implementation of expectation maximization for Gaussian mixture models, and tasks are selected from this learned set by a Markov decision process trained by the human through Q-learning. This method of robot task selection is compared against a supervised method in IRB-approved (HUM00145810) experimental trials with 24 human subjects. The dissertation concludes with an additional case study, by the author, in Bayesian-inferred path planning, and with a discussion of open problems in dynamic coverage and human-robot interaction as an avenue for future work.
    PhD, Aerospace Engineering
    University of Michigan, Horace H. Rackham School of Graduate Studies
    https://deepblue.lib.umich.edu/bitstream/2027.42/155147/1/wbentz_1.pd
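The Q-learning-trained task selection described above can be illustrated with a toy tabular sketch. Everything here is invented for illustration and is not the dissertation's MDP: the state is simply which task the human is working on, the action is which task's camera view to broadcast, and a synthetic reward of +1 for a matching view stands in for the human's training feedback:

```python
# Toy tabular Q-learning for task-view selection. States, actions, reward,
# and hyperparameters are all illustrative, not the dissertation's setup.
import random

random.seed(0)
n_tasks = 3
# Q[state][action]: value of broadcasting view `action` while the human
# works on task `state`.
Q = [[0.0] * n_tasks for _ in range(n_tasks)]
alpha, gamma, eps = 0.5, 0.9, 0.1

def reward(state, action):
    # Stand-in for human feedback: +1 when the broadcast view matches
    # the human's current task, else 0.
    return 1.0 if action == state else 0.0

for episode in range(2000):
    state = random.randrange(n_tasks)
    for _ in range(10):
        # Epsilon-greedy action selection.
        if random.random() < eps:
            action = random.randrange(n_tasks)
        else:
            action = max(range(n_tasks), key=lambda a: Q[state][a])
        r = reward(state, action)
        next_state = random.randrange(n_tasks)  # human moves between tasks
        # Standard Q-learning update.
        Q[state][action] += alpha * (r + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

# Greedy policy: for each task the human works on, which view to broadcast.
policy = [max(range(n_tasks), key=lambda a: Q[s][a]) for s in range(n_tasks)]
print(policy)  # converges to broadcasting the view matching the current task
```

In the dissertation, the states come from task locations learned online via EM over Gaussian mixture models, and the reward comes from the human trainer rather than a hand-coded function.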