
    Natural Walking in Virtual Reality: A Review


    An expandable walking in place platform

    The control of locomotion in 3D virtual environments should be an ordinary task from the user's point of view. Several navigation metaphors have been explored to control locomotion naturally, such as real walking, the use of simulators, and walking in place. These have shown that the more natural the approach used to control locomotion, the more immersed the user feels inside the virtual environment. To overcome the high cost and complexity of most approaches in the field, we introduce a walking-in-place platform that identifies the orientation, displacement speed, and lateral steps of a person mimicking a walking pattern. This information is detected without additional sensors attached to the user's body. Our device is simple to assemble, inexpensive, and allows almost natural use with lazy (small) steps, thus freeing the hands for other tasks. We also explore and test a passive tactile surface for safe use of our platform. The platform was conceived as an interface to control navigation in virtual and augmented reality environments. Extending our device and techniques, we elaborated a redirected-walking metaphor to be used together with a cave automatic virtual environment (CAVE). Another metaphor allowed our technique to be used for navigating point clouds to tag data, as part of the 3D User Interface (3DUI) 2014 contest held in Minnesota, USA. We tested our technique with two navigation modes: human walking and vehicle driving. In the human walking mode, the virtual orientation inhibits displacement when the user makes sharp turns. In vehicle mode, virtual orientation and displacement occur together, similar to driving a vehicle. We tested 52 subjects to determine their preferred navigation mode and their ability to use our device, and identified a preference for the vehicle-driving mode. Statistical tests revealed that users easily learned to navigate with our technique. Users walked faster in vehicle mode, but human mode allowed more precise movement in the virtual test environment. The tactile platform proved to allow safe use of our device, being an effective and simple solution for the field. More than 200 people tested our device: at UFRGS Portas Abertas in 2013 and 2014, an event that presents the university's academic work to the local community, and during 3DUI 2014, where our work was used together with a tool for point cloud manipulation. The main contributions of our work are a new approach for walking-in-place detection that allows simple use with natural movements, is expandable to large areas (such as public spaces), and efficiently supplies orientation and speed for use in virtual or augmented reality environments with inexpensive hardware.
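    The two navigation modes can be pictured as different mappings from detected orientation change and step speed to virtual displacement. The sketch below is an illustration under assumed values, not the authors' implementation; the sharp-turn threshold and the per-frame update rule are assumptions:

    ```python
    import math

    # Assumed threshold (rad/s) beyond which a turn counts as "sharp".
    SHARP_TURN_RATE = math.radians(60)

    def update_pose(x, y, heading, step_speed, turn_rate, dt, mode="vehicle"):
        """Advance the virtual pose by one frame of walking-in-place input.

        "human" mode: displacement is inhibited while the user makes a
        sharp turn (turn in place). "vehicle" mode: turning and moving
        combine, as when steering a car.
        """
        heading += turn_rate * dt
        speed = step_speed
        if mode == "human" and abs(turn_rate) > SHARP_TURN_RATE:
            speed = 0.0  # sharp turn: rotate without displacing
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        return x, y, heading
    ```

    With this mapping, a 90°/s turn in human mode rotates the avatar without moving it, while the same input in vehicle mode produces a curved path.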

    Detecting head movement using gyroscope data collected via in-ear wearables

    Abstract. Head movement is an effective, natural, and simple way to indicate pointing toward an object. Head-movement detection technology has significant potential across diverse fields of application, and studies in this field support that claim. Applications include human-computer interaction, external control of devices, power-wheelchair operation, driver drowsiness detection, video surveillance, and many more. Given this diversity of applications, the methods for detecting head movement are also wide-ranging: over the years, researchers have introduced acoustic-based, video-based, computer-vision-based, and inertial-sensor-based approaches. To generate inertial sensor data, various wearables are available, for example wrist bands, smart watches, and head-mounted devices. This thesis employs eSense, a representative earable device with a built-in inertial sensor that generates gyroscope data. The eSense is a True Wireless Stereo (TWS) earbud augmented with key components including a 6-axis inertial motion unit, a microphone, and dual-mode Bluetooth (Bluetooth Classic and Bluetooth Low Energy). Features are extracted from gyroscope data collected via the eSense device, and four machine learning models (Random Forest (RF), Support Vector Machine (SVM), Naïve Bayes, and Perceptron) are applied to detect head movement. Their performance is evaluated with four metrics: Accuracy, Precision, Recall, and F1 score. The results show that all four models can detect head movement. Random Forest performs best, detecting head movement with approximately 77% accuracy; the other three models, Support Vector Machine, Naïve Bayes, and Perceptron, achieve similar accuracies of about 42%, 40%, and 39%, respectively. The Precision, Recall, and F1 scores further confirm that these models can detect different head directions such as left, right, or straight.
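    The pipeline described here, windowed gyroscope features fed to the four classifiers, can be sketched as follows. The window features and the synthetic data are illustrative assumptions, not the thesis's actual feature set or recordings:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.naive_bayes import GaussianNB
    from sklearn.linear_model import Perceptron
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, f1_score

    rng = np.random.default_rng(0)

    def window_features(gyro):
        """Summarize one window of 3-axis gyroscope samples (N x 3)."""
        return np.hstack([gyro.mean(axis=0), gyro.std(axis=0),
                          np.abs(gyro).max(axis=0)])

    # Synthetic stand-in for eSense gyroscope windows: left / right / straight.
    def make_window(label):
        base = {"left": [-1, 0, 0], "right": [1, 0, 0], "straight": [0, 0, 0]}[label]
        return np.array(base) + rng.normal(0.0, 0.5, size=(50, 3))

    labels = ["left", "right", "straight"] * 100
    X = np.array([window_features(make_window(l)) for l in labels])
    y = np.array(labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

    for name, model in [("RF", RandomForestClassifier(random_state=0)),
                        ("SVM", SVC()), ("NB", GaussianNB()),
                        ("Perceptron", Perceptron())]:
        model.fit(X_tr, y_tr)
        pred = model.predict(X_te)
        print(name, accuracy_score(y_te, pred),
              f1_score(y_te, pred, average="macro"))
    ```

    On real data, the relative ranking of the models depends on the feature set and window length; the loop structure simply mirrors the four-model comparison reported in the thesis.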

    Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena

    Earables have emerged as a unique platform for ubiquitous computing by augmenting ear-worn devices with state-of-the-art sensing. This new platform has spurred a wealth of new research exploring what can be sensed with a small, wearable form factor. As a sensing location, the ears are less susceptible to motion artifacts and are in close proximity to a number of important anatomical structures, including the brain, blood vessels, and facial muscles, which reveal a wealth of information. They can be easily reached by the hands, and the ear canal itself is affected by mouth, face, and head movements. We conducted a systematic literature review of 271 earable publications from the ACM and IEEE libraries. These were synthesized into an open-ended taxonomy of 47 different phenomena that can be sensed in, on, or around the ear. Through analysis, we identify 13 fundamental phenomena from which all other phenomena can be derived, and discuss the different sensors and sensing principles used to detect them. We comprehensively review the phenomena in four main areas: (i) physiological monitoring and health, (ii) movement and activity, (iii) interaction, and (iv) authentication and identification. This breadth highlights the potential that earables offer as a ubiquitous, general-purpose platform.

    VR-Fit: Walking-in-Place Locomotion with Real Time Step Detection for VR-Enabled Exercise

    With recent advances in mobile and wearable technologies, virtual reality (VR) has found many applications in daily use. Today, a mobile device can be converted into a low-cost immersive VR kit thanks to do-it-yourself viewers in the shape of simple cardboards and compatible software for 3D rendering. These applications involve interacting with stationary scenes or moving between spaces within a VR environment. VR locomotion can be enabled through a variety of methods, such as head-movement tracking, joystick-triggered motion, and the mapping of natural movements to virtual locomotion. In this study, we implemented a walk-in-place (WIP) locomotion method for a VR-enabled exercise application. We investigate the utility of WIP for exercise purposes and compare it with joystick-based locomotion in terms of step performance and subjective qualities of the activity, such as enjoyment, encouragement for exercise, and ease of use. Our technique uses vertical accelerometer data to estimate steps taken during walking or running and moves the user's avatar accordingly in virtual space. We evaluated our technique in a controlled experimental study with 12 people. Results indicate that the way users control the simulated locomotion affects how they interact with the VR simulation and influences the subjective sense of immersion and the perceived quality of the interaction. In particular, WIP encourages users to move further and creates a more enjoyable and interesting experience compared with joystick-based navigation.
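    A common way to realize step estimation from vertical accelerometer data is threshold crossing with a refractory period. The sketch below is a minimal illustration under assumed values; the threshold and minimum step interval are not those of VR-Fit:

    ```python
    def count_steps(accel_z, timestamps, threshold=11.5, min_interval=0.3):
        """Count steps from vertical acceleration samples (m/s^2).

        A step is registered when the signal rises above `threshold`
        (gravity ~9.8 m/s^2 plus the impact spike of a footfall), with a
        refractory period of `min_interval` seconds so that one footfall
        is not counted twice.
        """
        steps = 0
        last_step_t = None
        above = False
        for a, t in zip(accel_z, timestamps):
            if a > threshold and not above:
                if last_step_t is None or t - last_step_t >= min_interval:
                    steps += 1
                    last_step_t = t
                above = True
            elif a <= threshold:
                above = False
        return steps
    ```

    Each detected step can then translate the avatar forward by a fixed stride length, which is how WIP input becomes virtual locomotion.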