
    Potential of consumer EEG for real-time interactions in immersive VR

    Abstract. Virtual reality is an active research subject and has received considerable attention in recent years. Multiple commercial VR devices, each improving upon the last iteration, have become available to the wider public. In addition, interest in brain-computer interface (BCI) devices has increased rapidly. As these devices become more affordable and easier to use, more accessible options for measuring brain activity are emerging. In this study, our aim was to combine these two technologies to enhance interaction within a virtual environment, using EEG signals to estimate the user's level of focus. Applying this concept to VR, we designed two use cases for further exploration: telekinesis and teleportation. Telekinesis seemed an applicable option for this study, since it allows the utilization of the EEG while maintaining a captivating and engaging user experience. With teleportation, the goal was to explore different options for locomotion in VR. To test our solution, we built a test environment using the Unity engine and invited participants to give feedback on the usability and accuracy of our methodology. For evaluation, 13 study participants were divided into two groups: one group tested our actual solution for focus estimation, while the other group used randomized values for the same purpose. Some key differences between the test groups were identified. We were able to create a working prototype in which users could interact with the environment using their EEG signals. With some improvements, this could be expanded into a more refined solution with a better user experience.
There is a lot of potential in combining the use of human brain signals with virtual environments to both enrich the interaction and increase the immersion of virtual reality.
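The abstract does not specify how the level of focus was estimated from the EEG. A common attention proxy in consumer-EEG work, shown here purely as a hedged sketch (the band limits and the ratio itself are assumptions, not taken from the study), is the beta/(theta+alpha) band-power ratio over a short signal window:

```python
import numpy as np

def focus_index(eeg, fs=256.0):
    """Estimate a focus score from one channel of EEG sampled at fs Hz.

    Uses the beta/(theta+alpha) band-power ratio, a common attention
    proxy; the study itself does not publish its exact estimator.
    """
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2

    def band_power(lo, hi):
        # Sum spectral power over the [lo, hi) frequency band
        mask = (freqs >= lo) & (freqs < hi)
        return power[mask].sum()

    theta = band_power(4, 8)
    alpha = band_power(8, 13)
    beta = band_power(13, 30)
    return beta / (theta + alpha + 1e-12)  # epsilon avoids divide-by-zero
```

A score like this can be smoothed over consecutive windows and thresholded to drive an interaction such as telekinetic object movement.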

    Kinesthetic Illusion of Being Pulled Sensation Enables Haptic Navigation for Broad Social Applications

    Many handheld force-feedback devices have been proposed to provide a rich experience with mobile devices. However, previously reported devices have been unable to generate both constant and translational force; they can only generate transient rotational force, since they rely on a change in angular momentum. Here, we exploit the nonlinearity of human perception to generate both constant and translational force. Specifically, a strong acceleration is generated for a very brief period in the desired direction, while a weaker acceleration is generated over a longer period in the opposite direction. The internal human haptic sensors do not detect the weaker acceleration, so the original position of the mass is "washed out". The result is that the user is tricked into perceiving a unidirectional force. This force can be made continuous by repeating the motions. This chapter describes the pseudo-attraction force technique, a new force-feedback technique that enables mobile devices to create the sensation of two-dimensional force. A prototype was fabricated in which four slider-crank mechanism pairs were arranged in a cross shape and embedded in a force-feedback display. Each slider-crank mechanism generates a force vector; by using the sum of the generated vectors, which are linearly independent, the force-feedback display can create a force sensation in any arbitrary direction on a two-dimensional plane. We also introduce an interactive application with the force-feedback display, an interactive robot, and a vision-based positioning system.
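The asymmetric acceleration described above can be sketched as one cycle of a waveform: a short, strong pulse in the desired direction balanced by a longer, weaker pulse in the opposite direction, so the net impulse is zero and the mass returns to its start. The specific amplitudes, period, and duty cycle below are hypothetical illustrations, not values from the chapter:

```python
import numpy as np

def asymmetric_accel(period=0.1, duty=0.2, fs=1000.0, peak=20.0):
    """One cycle of an asymmetric acceleration profile (illustrative only).

    The brief strong phase is perceived; the long weak phase stays below
    the perceptual threshold, and the two phases integrate to zero so the
    moving mass is "washed out" back to its original position.
    """
    n = int(period * fs)
    n_strong = int(n * duty)
    a = np.empty(n)
    a[:n_strong] = peak                                # short, strong pulse
    a[n_strong:] = -peak * n_strong / (n - n_strong)   # long, weak recoil
    return a
```

Repeating this cycle, and summing scaled copies along the four slider-crank axes, yields a continuous perceived force in an arbitrary 2D direction.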

    Home-based rehabilitation of the shoulder using auxiliary systems and artificial intelligence: an overview

    Advancements in modern medicine have bolstered the usage of home-based rehabilitation services for patients, particularly those recovering from diseases or conditions that necessitate a structured rehabilitation process. Understanding the technological factors that can influence the efficacy of home-based rehabilitation is crucial for optimizing patient outcomes. As technologies continue to evolve rapidly, it is imperative to document the current state of the art and elucidate the key features of the hardware and software employed in these rehabilitation systems. This narrative review aims to provide a summary of the modern technological trends and advancements in home-based shoulder rehabilitation scenarios. It specifically focuses on wearable devices, robots, exoskeletons, machine learning, virtual and augmented reality, and serious games. Through an in-depth analysis of existing literature and research, this review presents the state of the art in home-based rehabilitation systems, highlighting their strengths and limitations. Furthermore, this review proposes hypotheses and potential directions for future upgrades and enhancements in these technologies. By exploring the integration of these technologies into home-based rehabilitation, this review aims to shed light on the current landscape and offer insights into the future possibilities for improving patient outcomes and optimizing the effectiveness of home-based rehabilitation programs.

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state of the art of self-adaptive technologies within virtual reality (VR) training. Virtual reality training and assessment is increasingly used in five key areas: medical, industrial and commercial training, serious games, rehabilitation, and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR, including haptic devices, stereo graphics, adaptive content, assessment, and autonomous agents. Automation of VR training can contribute to automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves the accuracy of the procedure. Automated haptic interaction can enable tele-presence and tactile interaction with virtual artefacts from either remote or simulated environments. Automation, machine learning, and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can form an input to autonomous systems for customised training and automated difficulty levels that match individual requirements. Self-adaptive technology has previously been developed within individual technologies of VR training. One conclusion of this research is that, although no such framework currently exists, an enhanced portable framework is needed; it would be beneficial to combine the automation of core technologies into a reusable automation framework for VR training.
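The idea of feeding trainee assessment data into automated difficulty levels can be sketched as a minimal proportional controller that nudges a difficulty parameter toward a target success rate. The review describes no single concrete algorithm; the function, its gain, and the target value here are assumptions for illustration:

```python
def adapt_difficulty(difficulty, success_rate, target=0.7, gain=0.5,
                     lo=0.0, hi=1.0):
    """Nudge a normalized training-difficulty parameter toward a target
    trainee success rate; a minimal sketch of data-driven adaptation,
    not an algorithm taken from the overview.
    """
    # Raise difficulty when the trainee succeeds more often than targeted,
    # lower it when they succeed less often, and clamp to [lo, hi].
    new = difficulty + gain * (success_rate - target)
    return min(hi, max(lo, new))
```

Run once per assessment cycle, this keeps the task challenging without overwhelming the individual trainee; richer systems would replace the proportional update with a learned model.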

    Portable dVRK: an augmented V-REP simulator of the da Vinci Research Kit

    The da Vinci Research Kit (dVRK) is a first-generation da Vinci robot repurposed as a research platform and coupled with software and controllers developed by research users. A fairly wide community already shares the dVRK (32 systems in 28 sites worldwide). Access to the robotic system for training surgeons and for developing new surgical procedures, tools, and control modalities is still difficult due to limited availability and high maintenance costs. The development of simulation tools provides a low-cost, easy, and safe alternative to the real platform for preliminary research and training activities. The Portable dVRK, described in this work, is based on a V-REP simulator of the dVRK patient-side and endoscopic-camera manipulators, which are controlled through two haptic interfaces and a 3D viewer, respectively. The V-REP simulator is augmented with a physics engine, allowing it to render the interaction of newly developed tools with soft objects. Full integration in the ROS control architecture makes the simulator flexible and easy to interface with other devices. Several scenes have been implemented to illustrate the performance and potential of the developed simulator.

    A Scoping Review on Virtual Reality-Based Industrial Training

    The fourth industrial revolution has forced most companies to evolve technologically, applying new digital tools so that their workers can have the necessary skills to face changing work environments. This article presents a scoping review of the literature on virtual reality-based training systems. The methodology consisted of four steps: posing research questions, document search, paper selection, and data extraction. From a total of 350 peer-reviewed articles in databases such as SpringerLink, IEEEXplore, MDPI, Scopus, and ACM, 44 were eventually chosen, mostly using virtual reality headsets and controllers from Oculus Rift and HTC VIVE. It was concluded that, among the advantages of using this digital tool in industry, are commitment, speed, measurability, preservation of the integrity of the workers, customization, and cost reduction. Even though several research gaps were found, virtual reality is presented as a present and future alternative for the efficient training of human resources in the industrial field. This work was supported by Instituto Superior Tecnológico Victoria Vásconez Cuvi. The authors appreciate the opportunity to analyze topics related to this paper. The authors also acknowledge the support provided by Universidad Tecnica de Ambato (UTA) and their Research and Development Department (DIDE) under project CONIN-P-256-2019, and by SENESCYT through grants "Convocatoria Abierta 2011" and "Convocatoria Abierta 2013".

    Addressing the problem of Interaction in fully immersive Virtual Environments: from raw sensor data to effective devices

    Immersion into Virtual Reality is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system with images, sound, or other stimuli that provide an engrossing total environment. The use of technological devices such as stereoscopic cameras, head-mounted displays, tracking systems, and haptic interfaces allows for user experiences providing a physical feeling of being in a realistic world, and the term "immersion" is a metaphoric use of the experience of submersion applied to representation, fiction, or simulation. One of the main peculiarities of fully immersive virtual reality is the enhancement of the simple passive viewing of a virtual environment with the ability to manipulate virtual objects inside it. This thesis project investigates such interfaces and metaphors for interaction and manipulation tasks. In particular, the research activity conducted allowed the design of a thimble-like interface that can be used to recognize the human hand's orientation in real time and infer a simplified but effective model of the relative hand's motion and gestures. Inside the virtual environment, users provided with the developed systems will therefore be able to operate with natural hand gestures in order to interact with the scene; for example, they could perform positioning tasks by moving, rotating, and resizing existing objects, or create new ones from scratch. This approach is particularly suitable when the user needs to operate in a natural way, performing smooth and precise movements. Possible applications of the system in industry are immersive design, in which the user can perform Computer-Aided Design (CAD) totally immersed in a virtual environment, and operator training, in which the user can be trained on a 3D model in assembling or disassembling complex mechanical machinery, following predefined sequences.
The thesis has been organized around the following project plan:
- Collection of the relevant state of the art
- Evaluation of design choices and alternatives for the interaction hardware
- Development of the necessary embedded firmware
- Integration of the resulting devices in a complex interaction test-bed
- Development of demonstrative applications implementing the device
- Implementation of advanced haptic feedback
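Real-time hand-orientation tracking from a thimble-like IMU device is typically done by fusing gyroscope and accelerometer readings. The thesis does not publish its exact filter, so the complementary filter below is only a plausible sketch of the approach, with an assumed blend coefficient:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse one gyroscope rate sample with one accelerometer-derived
    angle into an updated pitch estimate (degrees).

    The gyro integration tracks fast motion but drifts; the accelerometer
    angle is noisy but drift-free. Blending with weight alpha keeps the
    best of both. alpha=0.98 is an assumed, typical value.
    """
    gyro_estimate = pitch + gyro_rate * dt      # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch
```

Called once per sensor sample (e.g. at 100 Hz), the estimate slowly converges to the accelerometer reference while remaining responsive to fast hand rotations reported by the gyroscope.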