
    Human factors in instructional augmented reality for intravehicular spaceflight activities and How gravity influences the setup of interfaces operated by direct object selection

    In human spaceflight, advanced user interfaces are becoming an interesting means of facilitating human-machine interaction, both making it more effective and helping ensure a correct workflow during intravehicular space operations. Efforts to ease such operations have shown strong interest in novel human-computer interaction technologies such as Augmented Reality (AR). The work presented in this thesis is directed towards a user-driven design for AR-assisted space operations, iteratively solving issues arising from the problem space, which also includes consideration of the effect of altered gravity on handling such interfaces.

    Electroencephalography (EEG), electromyography (EMG) and eye-tracking for astronaut training and space exploration

    The ongoing push to send humans back to the Moon and to Mars is giving rise to a wide range of novel technical solutions in support of prospective astronaut expeditions. Against this backdrop, the European Space Agency (ESA) has recently launched an investigation into unobtrusive interface technologies as a potential answer to such challenges. Three particular technologies have shown promise in this regard: EEG-based brain-computer interfaces (BCI) provide a non-invasive method of utilizing the recorded electrical activity of a user's brain, electromyography (EMG) enables monitoring of the electrical signals generated by the user's muscle contractions, and finally, eye tracking enables, for instance, the tracking of the user's gaze direction via camera recordings to convey commands. Beyond simply improving the usability of prospective technical solutions, our findings indicate that EMG, EEG, and eye-tracking could also serve to monitor and assess a variety of cognitive states, including attention, cognitive load, and mental fatigue of the user, while EMG could furthermore be utilized to monitor the physical state of the astronaut. In this paper, we elaborate on the key strengths and challenges of these three enabling technologies, and in light of ESA's latest findings, we reflect on their applicability in the context of human spaceflight. Furthermore, a timeline of technological readiness is provided. In so doing, this paper feeds into the growing discourse on emerging technology and its role in paving the way for a human return to the Moon and expeditions beyond the Earth's orbit
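    As an illustration of the last of these modalities, the sketch below shows one common way a stream of gaze samples can be turned into discrete commands via dwell-time selection. It is a minimal, hypothetical example: the GazeSample fields, target regions, command names, and dwell threshold are assumptions made for illustration and are not taken from the paper or from any ESA system.

        # Dwell-based gaze selection sketch: fire a command once the gaze has
        # rested inside a target region for a minimum dwell time.
        from dataclasses import dataclass

        @dataclass
        class GazeSample:
            x: float   # normalized horizontal gaze coordinate, 0..1
            y: float   # normalized vertical gaze coordinate, 0..1
            t: float   # timestamp in seconds

        # Hypothetical on-screen targets mapped to commands: (x_min, y_min, x_max, y_max)
        TARGETS = {
            "confirm_step": (0.70, 0.70, 0.95, 0.95),
            "next_page":    (0.05, 0.70, 0.30, 0.95),
        }
        DWELL_SECONDS = 0.8  # assumed dwell threshold

        def dwell_select(samples):
            """Yield a command name once gaze has stayed inside one target long enough."""
            current, entered_at = None, None
            for s in samples:
                hit = next((name for name, (x0, y0, x1, y1) in TARGETS.items()
                            if x0 <= s.x <= x1 and y0 <= s.y <= y1), None)
                if hit != current:
                    current, entered_at = hit, s.t      # gaze moved to a new region
                elif hit is not None and s.t - entered_at >= DWELL_SECONDS:
                    yield hit                           # dwell threshold reached
                    current, entered_at = None, None    # reset after firing

        if __name__ == "__main__":
            # Simulated gaze stream dwelling on the "confirm_step" region for ~1 s.
            stream = [GazeSample(0.8, 0.8, 0.05 * i) for i in range(25)]
            print(list(dwell_select(stream)))  # -> ['confirm_step']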

    Wearable and interactive mixed reality solutions for fault diagnosis and assistance in manufacturing systems: Implementation and testing in an aseptic bottling line

    The topic of fault diagnosis and assistance in industrial contexts has benefited from the spread of technologies stemming from the fourth industrial revolution. Indeed, several smart tools have been developed to assist with maintenance and troubleshooting without interfering with operations, and to facilitate such tasks. In line with that, the present manuscript presents a smart web solution with two possible applications, installed on an Android smartphone and on a Microsoft HoloLens. The solution alerts operators through notifications when an alarm occurs on a machine and then provides the instructions needed to resolve the detected alarm. The two devices were tested in real working conditions by the operators of an industrial aseptic bottling line consisting of five machines. The usability of both devices was rated positively by these users based on the System Usability Scale (SUS) and additional appropriate statements. Moreover, the in situ application brought out the main difficulties and open issues for the practical implementation of the solutions tested
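    A minimal sketch of the alarm-to-instruction flow described above follows, assuming a simple mapping from (machine, alarm code) to troubleshooting steps. The machine names, alarm codes, and instruction texts are invented for illustration; the manuscript does not specify its actual data model or API.

        # Sketch of an alarm-notification flow: alert the operator when a machine
        # raises an alarm, then look up the matching troubleshooting procedure.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Alarm:
            machine: str   # e.g. "filler" in a five-machine bottling line (hypothetical)
            code: str      # machine-specific alarm identifier (hypothetical)

        # Hypothetical instruction catalogue keyed by (machine, alarm code)
        INSTRUCTIONS = {
            ("filler", "E042"): ["Stop the line.", "Check the nozzle seal.", "Acknowledge the alarm."],
        }

        def notify(alarm: Alarm) -> None:
            # Stand-in for a push notification to the smartphone / HoloLens client.
            print(f"[ALERT] {alarm.machine}: alarm {alarm.code}")

        def handle_alarm(alarm: Alarm) -> List[str]:
            """Alert the operator, then return the step-by-step instructions, if any."""
            notify(alarm)
            return INSTRUCTIONS.get((alarm.machine, alarm.code),
                                    ["No procedure found; contact maintenance."])

        if __name__ == "__main__":
            for step in handle_alarm(Alarm("filler", "E042")):
                print(" -", step)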

    Towards Gamification for Signed Languages

    This project created an educational game to support the Mexican Deaf community in teaching and learning Mexican Sign Language (MSL) with their hearing family and friends. Visual elements understandable by both hearing and non-hearing players were used to bridge the gap between Signed Language and Written Language. The game uses SignWriting® and Augmented Reality to afford a convivial learning experience and to promote MSL in Mexico. A Design Thinking approach was used to empathically examine signals, trends, and drivers, and to gather input from global experts. Integrating digital and tangible tools, a game was created using Augmented Reality to provide a three-dimensional experience of Sign Language. The app encourages players to create their own word collections, which could potentially form an MSL crowd-sourced library. Combining gamification, SignWriting®, and crowdsourcing, the tool can be easily customized across different countries and different Sign Languages

    Dynamic analysis of astronaut motions during extravehicular activity

    Thesis (M.S.) -- Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 1995. Includes bibliographical references (leaves 122-125). By Grant Schaffner. M.S.

    Incorporation of privacy elements in space station design

    Privacy exists to the extent that individuals can control the degree of social contact that they have with one another. The opportunity to withdraw from other people serves a number of important psychological and social functions, and is in the interests of safety, high performance, and high quality of human life. Privacy requirements for Space Station crew members are reviewed, and architectural and other guidelines for helping astronauts achieve desired levels of privacy are suggested. In turn, four dimensions of privacy are discussed: the separation of activities by areas within the Space Station, controlling the extent to which astronauts have visual contact with one another, controlling the extent to which astronauts have auditory contact with one another, and odor control. Each section presents a statement of the problem, a review of general solutions, and specific recommendations. The report concludes with a brief consideration of how selection, training, and other procedures can also help Space Station occupants achieve satisfactory levels of seclusion

    Framework for autonomous navigation through MS HoloLenses

    In recent years, the immense development of virtual reality technologies seems to be sweeping through the technological community. The possibilities that the virtual reality family brings to the table pose a life-changing experience for both daily and industrial life. More particularly, Augmented Reality (AR) is considered by a large portion of the scientific community to be the reigning technology of User Interfaces (UI). The key feature of AR is that it adds digital content to the real environment without isolating the user from it, providing a very realistic interaction close to the user's perception. Considering these features, AR technology can be used, for instance, for enhanced learning, machine control, and human/vehicle navigation. For example, an AR UI deployed on AR glasses can help the operator control a machine easily, and without risk, from a distance. In addition, this functionality can be enriched by using an unmanned vehicle, a robot, as the machine to be controlled. Robotics is a field of technology whose intervention in people's lives seems unstoppable in more and more aspects. Nowadays, unmanned vehicles are used in the majority of industrial operations and daily routines. Consider a situation in which harmful waste must be extracted from a specific area. The use of an unmanned vehicle is mandatory for the collection and removal of the waste. On top of this, an Augmented Reality UI for the remote control of the unmanned vehicle allows the operator to make the most of his skills without risking his life, offering very natural and intimate control. In this thesis, we examine the scenario where the user controls and navigates an unmanned ground vehicle with the aid of an AR headset. The AR headset projects a specially designed UI for the robot's movement control. The vehicle's navigation depends solely on the user's perception and experience; that is where AR technology comes in handy, as it does not impair the user's vision or perception of the surroundings. More specifically, a series of experiments is carried out in which the user wears the AR headset and navigates the robot by giving a series of movement commands; the robot must always remain in the user's field of view. Experiments were executed in both simulated and real-world settings. For the simulation, the Gazebo simulator was used with a virtual Turtlebot 2 running the ROS operating system, and Unity was used to simulate the AR headset. The real-world experiments were executed with a Turtlebot 2 running ROS and the Microsoft HoloLens AR headset, on which our AR application was deployed
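    For context, the sketch below shows how discrete movement commands of the kind an AR UI might issue can be translated into ROS velocity messages for a TurtleBot-class base, using the standard geometry_msgs/Twist message. It is a minimal example under stated assumptions: the topic name, speed values, and command set are invented for illustration, and the thesis does not list its exact interface.

        #!/usr/bin/env python
        # Translate discrete UI commands into geometry_msgs/Twist messages.
        import rospy
        from geometry_msgs.msg import Twist

        # Forward speed (m/s) and turn rate (rad/s), chosen arbitrarily for illustration.
        COMMANDS = {
            "forward": ( 0.2,  0.0),
            "back":    (-0.2,  0.0),
            "left":    ( 0.0,  0.5),
            "right":   ( 0.0, -0.5),
            "stop":    ( 0.0,  0.0),
        }

        def send_command(pub, name, duration=1.0, rate_hz=10):
            """Publish the Twist corresponding to one UI command for a short burst."""
            linear, angular = COMMANDS[name]
            msg = Twist()
            msg.linear.x = linear
            msg.angular.z = angular
            rate = rospy.Rate(rate_hz)
            for _ in range(int(duration * rate_hz)):
                pub.publish(msg)
                rate.sleep()

        if __name__ == "__main__":
            rospy.init_node("ar_teleop_sketch")
            # "/cmd_vel" is a common TurtleBot velocity topic; the actual name
            # depends on the robot's velocity-mux configuration.
            pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
            rospy.sleep(0.5)                # give the publisher time to connect
            send_command(pub, "forward")    # example: drive forward for one second
            send_command(pub, "stop")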

    The 1990 progress report and future plans

    This document describes the progress and plans of the Artificial Intelligence Research Branch (RIA) at ARC in 1990. Activities span a range from basic scientific research to engineering development and to fielded NASA applications, particularly those applications that are enabled by basic research carried out at RIA. Work is conducted in-house and through collaborative partners in academia and industry. Our major focus is on a limited number of research themes with a dual commitment to technical excellence and proven applicability to NASA's short-, medium-, and long-term problems. RIA acts as the Agency's lead organization for research aspects of artificial intelligence, working closely with a second research laboratory at JPL and AI applications groups at all NASA centers

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 388)

    This bibliography lists 132 reports, articles and other documents introduced into the NASA Scientific and Technical Information Database. Subject coverage includes: aerospace medicine and physiology, life support systems and man/system technology, protective clothing, exobiology and extraterrestrial life, planetary biology, and flight crew behavior and performance