
    AFFECTIVE COMPUTING AND AUGMENTED REALITY FOR CAR DRIVING SIMULATORS

    Car driving simulators are essential for training and for analyzing a driver's behavior, responses, and performance. Augmented Reality (AR) is the technology that overlays virtual images on views of the real world. Affective Computing (AC) is the technology that enables computer systems to read emotions by analyzing body gestures, facial expressions, speech, and physiological signals. The key aspect of the research is investigating novel interfaces that help build situational awareness and emotional awareness, enabling affect-driven remote collaboration in AR for car driving simulators. The problem addressed is how to build situational awareness (using AR technology) and emotional awareness (using AC technology), and how to integrate these two distinct technologies [4] into a unified affective framework for training in a car driving simulator.
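    The abstract describes AC as reading emotions from physiological and behavioral signals and AR as the layer that presents that information to a trainer. As a minimal, purely illustrative sketch (the paper's actual framework is not detailed here), the snippet below maps hypothetical heart-rate and skin-conductance readings to an affective label that an AR overlay could display; every class name, threshold, and message is an assumption.

```python
# Illustrative sketch only: all names, thresholds, and messages are assumptions,
# not the paper's implementation.

from dataclasses import dataclass


@dataclass
class PhysioSample:
    heart_rate_bpm: float        # e.g. from a chest strap or wrist sensor
    skin_conductance_us: float   # electrodermal activity in microsiemens


def estimate_driver_state(sample: PhysioSample) -> str:
    """Very rough rule-based mapping from physiology to an affective label."""
    if sample.heart_rate_bpm > 110 and sample.skin_conductance_us > 8.0:
        return "stressed"
    if sample.heart_rate_bpm < 65 and sample.skin_conductance_us < 2.0:
        return "drowsy"
    return "calm"


def ar_overlay_message(state: str) -> str:
    """Text a trainer's AR view could overlay next to the trainee's avatar."""
    return {
        "stressed": "Trainee appears stressed: consider easing the scenario.",
        "drowsy": "Trainee appears drowsy: consider a break.",
        "calm": "Trainee appears calm.",
    }[state]


if __name__ == "__main__":
    sample = PhysioSample(heart_rate_bpm=118, skin_conductance_us=9.4)
    print(ar_overlay_message(estimate_driver_state(sample)))
```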

    SUPPORTING MISSION PLANNING WITH A PERSISTENT AUGMENTED ENVIRONMENT

    Includes supplementary material. The Department of the Navy relies on current naval practices such as briefs, chat, and voice reports to provide an overall operational assessment of the fleet. That assessment includes the cyber domain, or battlespace, depicted as a single snapshot of a ship's network equipment and service statuses. However, the information can be outdated and inaccurate, creating confusion among decision-makers about the availability of equipment and services in the cyber domain. We examine the ability of a persistent augmented environment (PAE) and 3D visualization to support communications and cyber network operations, reporting, and resource-management decision-making. We designed and developed a PAE prototype and tested the usability of its interface. Our study examined users' comprehension of a 3D visualization of the naval cyber battlespace onboard multiple ships and evaluated the PAE's ability to assist effective mission planning at the tactical level. The results are highly encouraging: participants completed their tasks successfully, found the interface easy to understand and operate, and characterized the prototype as a valuable alternative to their current practices. Our research provides insight into the feasibility and effectiveness of this novel form of data representation and its capability to support faster and better situational awareness and decision-making in a complex operational technology (OT) environment shared by diverse communities. Approved for public release. Distribution is unlimited.
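    The abstract centers on 3D visualization of equipment and service statuses across ships. As a hedged sketch of how such status data might be modeled before rendering (the thesis's actual data model is not described here), the snippet below rolls per-equipment statuses up to a per-ship status that a 3D scene could use for coloring; the class names, status values, and hull number are illustrative assumptions.

```python
# Illustrative sketch only: class and field names are assumptions, not the
# thesis prototype's data model.

from dataclasses import dataclass, field
from enum import Enum


class ServiceStatus(Enum):
    OPERATIONAL = "operational"
    DEGRADED = "degraded"
    OFFLINE = "offline"


@dataclass
class Equipment:
    name: str
    status: ServiceStatus
    last_report_utc: str  # timestamp of the most recent status report


@dataclass
class Ship:
    hull_number: str
    equipment: list[Equipment] = field(default_factory=list)

    def worst_status(self) -> ServiceStatus:
        """Roll up equipment statuses so a 3D view can color the whole ship."""
        for status in (ServiceStatus.OFFLINE, ServiceStatus.DEGRADED, ServiceStatus.OPERATIONAL):
            if any(e.status is status for e in self.equipment):
                return status
        return ServiceStatus.OPERATIONAL


if __name__ == "__main__":
    ship = Ship("DDG-1000", [
        Equipment("router-1", ServiceStatus.OPERATIONAL, "2023-01-01T12:00Z"),
        Equipment("chat-server", ServiceStatus.DEGRADED, "2023-01-01T11:45Z"),
    ])
    print(ship.hull_number, ship.worst_status().value)  # DDG-1000 degraded
```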

    Learning in a Mixed Reality System in the Context of ‚Industrie 4.0‘

    This contribution in the field of innovative approaches to training and education in technical subjects focuses on the potential of modern teaching and learning environments. It is based on a theoretical introduction to Mixed Reality systems and virtual teaching and learning systems, and as such provides an overview of current research on modern learning environments. In particular, it takes a close look at motivational effects in the context of web-based learning structures, human-object interaction, gamification, and immersion. The article discusses technical, user-relevant, and pedagogical aspects, as well as suggestions for further research in the context of Vocational Training 4.0 ('Ausbildung 4.0'). Keywords: Industry 4.0, Vocational Training 4.0, Mixed Reality System, virtual learning. Acknowledgement: The author would like to thank the China Scholarship Council (CSC) for its financial support (No. 201406030091).

    An Augmented Reality Human-Robot Collaboration System

    This article discusses an experimental comparison of three user interface techniques for interaction with a remotely located robot. A typical interface for such a situation is to teleoperate the robot using a camera that displays the robot's view of its work environment. However, the operator often has difficulty maintaining situation awareness with this single egocentric view. Hence, a multimodal system was developed that enables the human operator to view the robot in its remote work environment through an augmented reality interface: the augmented reality human-robot collaboration (AR-HRC) system. The operator uses spoken dialogue, reaches into the 3D representation of the remote work environment, and discusses the intended actions of the robot. The comparison found the AR-HRC interface to be the most effective, increasing accuracy by 30% while reducing the number of close calls in operating the robot by a factor of roughly three. It thus provides the means to maintain spatial awareness and gives users the feeling of working in a true collaborative environment.
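    The abstract mentions that the operator uses spoken dialogue and a 3D representation to discuss the robot's intended actions. As an illustrative sketch only (the AR-HRC dialogue pipeline is not specified here), the snippet below maps an already-transcribed utterance to a structured robot command that an interface could confirm before execution; the verbs, parsing rule, and class names are assumptions.

```python
# Illustrative sketch only: a toy keyword-based parser, not the AR-HRC system's
# actual speech or dialogue components.

from dataclasses import dataclass
from typing import Optional


@dataclass
class RobotCommand:
    action: str   # e.g. "move", "pick", "place"
    target: str   # object or location the operator referred to


def parse_utterance(text: str) -> Optional[RobotCommand]:
    """Map a transcribed utterance to a command, or None if not understood."""
    words = text.lower().split()
    verbs = {"move": "move", "pick": "pick", "grab": "pick", "place": "place"}
    action = next((verbs[w] for w in words if w in verbs), None)
    if action is None:
        return None
    # Naive target extraction: everything after the first "the", else the last word.
    target = " ".join(words[words.index("the") + 1:]) if "the" in words else words[-1]
    return RobotCommand(action=action, target=target)


if __name__ == "__main__":
    print(parse_utterance("please pick up the red block"))
    # -> RobotCommand(action='pick', target='red block')
```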

    Using Augmented Reality as a Medium to Assist Teaching in Higher Education

    In this paper we describe the use of a high-level augmented reality (AR) interface for the construction of collaborative educational applications that can be used in practice to enhance current teaching methods. A combination of multimedia information, including spatial three-dimensional models, images, textual information, video, animations, and sound, can be superimposed onto the learning environment in a student-friendly manner. In several case studies, different learning scenarios have been carefully designed based on human-computer interaction principles so that meaningful virtual information is presented in an interactive and compelling way. Collaboration between the participants is achieved through a tangible AR interface that uses marker cards, as well as an immersive AR environment based on software user interfaces (UIs) and hardware devices. The interactive AR interface has been piloted in the classroom at two UK universities, in departments of Informatics and Information Science.
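    The abstract notes that the tangible interface uses marker cards to bring multimedia content into the learning environment. As an illustrative sketch under stated assumptions (marker detection itself and the paper's actual content set are out of scope), the snippet below keeps a registry from marker IDs to lesson content and returns what should be superimposed for the markers currently in view; all IDs, titles, and file paths are hypothetical.

```python
# Illustrative sketch only: marker IDs and content entries are hypothetical,
# and marker detection is assumed to be handled by an external AR toolkit.

from dataclasses import dataclass


@dataclass
class LessonContent:
    title: str
    media: list[str]  # paths/URLs to 3D models, video, audio, or text panels


# Registry linking physical marker cards to virtual teaching material.
CONTENT_BY_MARKER: dict[int, LessonContent] = {
    7: LessonContent("Solar system overview", ["models/solar_system.obj", "audio/intro.ogg"]),
    12: LessonContent("Planet formation video", ["video/formation.mp4"]),
}


def content_for_markers(detected_ids: list[int]) -> list[LessonContent]:
    """Return the content to superimpose for every recognized marker card."""
    return [CONTENT_BY_MARKER[i] for i in detected_ids if i in CONTENT_BY_MARKER]


if __name__ == "__main__":
    for item in content_for_markers([12, 3]):
        print(item.title, item.media)
```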