
    HoloLens 2 Sensor Streaming

    We present a HoloLens 2 server application for streaming device data via TCP in real time. The server can stream data from the four grayscale cameras, depth sensor, IMU, front RGB camera, microphone, head tracking, eye tracking, and hand tracking. Each sent data frame has a timestamp and, optionally, the instantaneous pose of the device in 3D space. The server allows downloading device calibration data, such as camera intrinsics, and can be integrated into Unity projects as a plugin, with support for basic upstream capabilities. To achieve real-time video streaming at full frame rate, we leverage the video encoding capabilities of the HoloLens 2. Finally, we present a Python library for receiving and decoding the data, which includes utilities that facilitate passing the data to other libraries. The source code, Python demos, and precompiled binaries are available at https://github.com/jdibenes/hl2ss. Comment: Technical report
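    The abstract describes timestamped data frames streamed over TCP. A minimal sketch of parsing such a stream, assuming a hypothetical wire layout (an 8-byte little-endian timestamp followed by a 4-byte payload length; this is an illustration, not the documented hl2ss protocol):

    ```python
    import struct

    # Hypothetical frame format (an assumption for illustration only):
    # 8-byte little-endian timestamp, 4-byte payload length, then the payload.
    HEADER = struct.Struct('<QI')

    def pack_frame(timestamp, payload):
        """Serialize one frame in the hypothetical format."""
        return HEADER.pack(timestamp, len(payload)) + payload

    def unpack_frames(buffer):
        """Yield (timestamp, payload) tuples from a byte stream.
        Frames arrive back to back on a TCP connection, so we walk the
        buffer and stop at any incomplete trailing frame."""
        offset = 0
        while offset + HEADER.size <= len(buffer):
            timestamp, length = HEADER.unpack_from(buffer, offset)
            offset += HEADER.size
            payload = buffer[offset:offset + length]
            if len(payload) < length:  # partial frame: wait for more bytes
                break
            offset += length
            yield timestamp, payload

    stream = pack_frame(1000, b'abc') + pack_frame(2000, b'defg')
    frames = list(unpack_frames(stream))
    ```

    In practice a receiver would accumulate bytes from the socket into the buffer and retain any partial trailing frame for the next read.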

    Assessing Mixed Reality Voice Dictation with Background Noise

    Mixed reality devices, such as the Microsoft HoloLens 2, are growing in popularity and have been adopted in domains like education, aviation, and medicine. Text entry interaction is essential for daily activities: texting, using websites, and taking notes. However, typing long messages using the HoloLens’ virtual keyboard can be slow and cumbersome. Voice dictation provides a speech-to-text interaction that requires less physical effort and time, allowing users to verbalize messages that are translated into text without manual input. Still, HoloLens 1’s dictation method was particularly problematic when background noise exceeded 60 dB, which is common in many work environments. Consequently, less than half of the participants could dictate phrases in the high-noise condition. However, the HoloLens 2 has improved voice dictation to function at background noise levels of up to 90 dB (Strange, 2019). This study will examine the user experience and the extent to which different background noise conditions impact voice dictation efficiency and effectiveness using the Microsoft HoloLens 2. The results will be compared to past research on the HoloLens 1 (Derby et al., 2020). In addition, user perceptions and feedback will help create recommendations for future improvements.

    HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions

    This work analyzes the use of Microsoft HoloLens 2 in orthopedic oncological surgeries and compares it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. They contained a small adaptor for a 3D-printed AR marker, the characteristic patterns of which were easily recognized by both Microsoft HoloLens devices. The newest model improved the AR projection accuracy by almost 25%, and both devices yielded an RMSE below 3 mm. After ascertaining the enhancement of the second model in this aspect, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons’ feedback in terms of comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newest model facilitate its implementation in actual surgical scenarios. All of the results point to Microsoft HoloLens 2 being better in all the aspects affecting surgical interventions and support its use in future experiences. This work was supported by projects PI18/01625, AC20/00102-3 and Era Permed PerPlanRT (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid).
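    The reported sub-3 mm RMSE measures the residual between AR-projected points and their real-world references after registration. A minimal sketch of that computation over paired 3D points (the coordinates below are illustrative, not data from the study):

    ```python
    import math

    def rmse(projected, reference):
        """Root-mean-square error between paired 3D points (same units,
        e.g. mm): sqrt of the mean squared Euclidean distance."""
        squared = [sum((p - r) ** 2 for p, r in zip(pp, rr))
                   for pp, rr in zip(projected, reference)]
        return math.sqrt(sum(squared) / len(squared))

    # Illustrative AR-projected vs. reference landmark positions (mm):
    proj = [(0.0, 0.0, 1.5), (10.0, 0.0, 0.0)]
    ref  = [(0.0, 0.0, 0.0), (10.0, 2.0, 0.0)]
    err = rmse(proj, ref)
    ```

    With distances of 1.5 mm and 2.0 mm, the RMSE here is about 1.77 mm, comfortably under the 3 mm threshold the study reports.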

    An Assessment of the Eye Tracking Signal Quality Captured in the HoloLens 2

    We present an analysis of the eye tracking signal quality of the HoloLens 2's integrated eye tracker. Signal quality was measured from eye movement data captured during a random saccades task from a new eye movement dataset collected on 30 healthy adults. We characterize the eye tracking signal quality of the device in terms of spatial accuracy, spatial precision, temporal precision, linearity, and crosstalk. Most notably, our evaluation of spatial accuracy reveals that the eye movement data in our dataset appears to be uncalibrated. Recalibrating the data using a subset of our dataset produces notably better eye tracking signal quality. Comment: 10 pages, 10 figures
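    Spatial accuracy and spatial precision of gaze are conventionally reported as angular errors in degrees. A sketch of both metrics over unit gaze direction vectors, following the common definitions (mean angular offset from the target for accuracy; RMS of sample-to-sample angular differences for precision) rather than the paper's exact pipeline:

    ```python
    import math

    def _angle_deg(u, v):
        """Angle in degrees between two unit vectors, via the dot product."""
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
        return math.degrees(math.acos(dot))

    def accuracy_deg(gaze, target):
        """Spatial accuracy: mean angular offset of gaze samples from the
        true target direction (unit vectors assumed)."""
        return sum(_angle_deg(g, target) for g in gaze) / len(gaze)

    def precision_rms_deg(gaze):
        """Spatial precision: RMS of successive sample-to-sample angular
        differences, capturing dispersion rather than offset."""
        diffs = [_angle_deg(gaze[i], gaze[i + 1])
                 for i in range(len(gaze) - 1)]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    # Two illustrative gaze samples: one on target, one 1 degree off.
    rad = math.radians(1.0)
    gaze = [(0.0, 0.0, 1.0), (math.sin(rad), 0.0, math.cos(rad))]
    acc = accuracy_deg(gaze, (0.0, 0.0, 1.0))   # mean of 0 and 1 deg
    prec = precision_rms_deg(gaze)              # single 1 deg step
    ```

    A systematic offset (poor accuracy) with low sample-to-sample jitter (good precision) is the signature of an uncalibrated tracker, which is consistent with the recalibration finding in the abstract.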