15,170 research outputs found

    UX Evaluation of a Tractor Cabin Digital Twin Using Mixed Reality

    Understanding user experience (UX) is essential to designing engaging and attractive products, so interest in user-centred design approaches has grown; in this perspective, digital technologies such as Virtual Reality (VR) and Mixed Reality (MR) can help designers and engineers create a digital prototype through which user feedback can be considered during the product design stage. This research aims at creating an interactive Digital Twin (DT) using MR to enable a tractor driving simulation and involve real users in an early UX evaluation, with the aim of validating the design of the control dashboard through a transdisciplinary approach. MR combines virtual simulation with real physical hardware devices that the user can interact with and control through both visual and tactile feedback. The result is an MR simulator that combines virtual content and physical controls, capable of reproducing a plowing activity close to reality. The principles of UX design were applied throughout this research for a continuous and dynamic UX evaluation during project development.

    The contribution of closed loop tracking control of motion platform on laterally induced postural instability of the drivers at SAAM dynamic simulator

    This paper examines the effect of closed-loop motion-platform control, compared to the static condition, on drivers' postural instability in driving simulators. The postural instability of the participants (N=18; 15 male and 3 female subjects) was measured as the lateral displacement of the subject's body centre of pressure (YCP) just before and after each driving session via a balance platform. After the experiments were completed, the two-tailed Mann-Whitney U test was applied to the objective data for the post-exposure cases only. The analysis revealed that YCP for the dynamic case was significantly lower than for the static condition (U(18), p < 0.0001). It can be concluded that closed-loop tracking control of the driving simulator's hexapod platform (dynamic platform condition) significantly decreased lateral postural instability compared to the static operating condition. However, the two-tailed Mann-Whitney U test showed no significant difference between the two conditions in terms of psychophysical perception.
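The post-exposure comparison described above can be sketched with SciPy's implementation of the two-tailed Mann-Whitney U test. The YCP values below are invented for illustration only; they are not the study's data.

```python
# Illustrative two-tailed Mann-Whitney U test on synthetic lateral
# centre-of-pressure (YCP) displacements. The numbers are hypothetical,
# chosen only to show the procedure, not taken from the paper.
from scipy.stats import mannwhitneyu

# Hypothetical post-exposure YCP displacements (mm) per participant
ycp_dynamic = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.5, 3.6]
ycp_static  = [7.9, 8.3, 7.1, 8.8, 7.5, 9.0, 8.1, 7.7, 8.6]

# Two-sided test: does either condition tend to produce larger YCP?
u_stat, p_value = mannwhitneyu(ycp_dynamic, ycp_static,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.5f}")
# A small p-value indicates the two conditions differ significantly.
```

Because the test is rank-based, it makes no normality assumption about the displacement data, which suits the small sample size reported in the study.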

    Prototype gesture recognition interface for vehicular head-up display system


    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time, while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting the nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest. Comment: An initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. The revision includes a description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.

    Virtual reality training and assessment in laparoscopic rectum surgery

    Background: Virtual-reality (VR) based simulation techniques offer an efficient and low-cost alternative to conventional surgical training. This article describes a VR training and assessment system for laparoscopic rectum surgery. Methods: To give a realistic visual rendering of the interaction between membrane tissue and surgical tools, a generalized-cylinder-based collision detection method and a multi-layer mass-spring model are presented. A dynamic assessment model is also designed for hierarchical training evaluation. Results: With this simulator, trainees can operate on the virtual rectum with simultaneous visual and haptic feedback. The system also gives surgeons real-time instructions when improper manipulation occurs. The simulator has been tested and evaluated by ten subjects. Conclusions: This prototype system has been verified by colorectal surgeons through a pilot study. They believe the visual performance and the tactile feedback are realistic. It exhibits the potential to effectively improve the surgical skills of trainee surgeons and significantly shorten their learning curve. © 2014 John Wiley & Sons, Ltd.
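The general idea behind a mass-spring tissue model, as referenced in the abstract above, can be sketched with a one-dimensional chain of point masses integrated with explicit Euler steps. All parameters, the 1-D layout, and the pinned-node boundary condition are invented for this demo; the paper's multi-layer model is more elaborate.

```python
# Minimal 1-D mass-spring chain sketch (hypothetical parameters, not the
# paper's model): nodes connected by Hookean springs, with viscous damping,
# integrated by explicit Euler. The free end is displaced, as if by tool
# contact, and relaxes back toward rest.
import numpy as np

n = 5            # point masses along one layer
k = 50.0         # spring stiffness (N/m), hypothetical
c = 0.5          # viscous damping coefficient (N*s/m), hypothetical
m = 0.01         # mass per node (kg)
rest = 0.02      # rest length between neighbours (m)
dt = 1e-3        # time step (s)

pos = np.arange(n) * rest      # chain starts at rest spacing
vel = np.zeros(n)
pos[-1] += 0.01                # displace the end node (tool contact)

for _ in range(2000):          # simulate 2 s
    force = -c * vel           # damping on every node
    for i in range(n - 1):
        stretch = (pos[i + 1] - pos[i]) - rest
        f = k * stretch        # Hooke's law along the chain
        force[i] += f          # equal and opposite spring forces
        force[i + 1] -= f
    force[0] = 0.0             # pin the first node (anchored tissue)
    vel += dt * force / m
    vel[0] = 0.0
    pos += dt * vel
```

With damping the displaced end settles back near its rest position (4 × rest for the last node); stiffer springs or larger time steps would require an implicit integrator for stability, one reason real surgical simulators rarely use plain explicit Euler.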