
    The use of the Nintendo Wii in motor rehabilitation for virtual reality interventions: a literature review

    Several review articles have been published on the use of Virtual Reality (VR) in motor rehabilitation. Most focus on the effectiveness of VR in improving motor function using relatively expensive commercial tools and technologies, including robotics, cybergloves, cybergrasps, joysticks, force sensors and motion capture systems. In this chapter, however, we make the case that customizable and reconfigurable game sensors and VR technologies, such as the Nintendo Wii, provide an alternative and affordable VR intervention for rehabilitation. While the performance of many Wii-based interventions in motor rehabilitation is currently under investigation by researchers, an extensive and holistic discussion of this subject does not yet exist. The purpose of this chapter is therefore to give readers an understanding of the advantages and limitations of the Nintendo Wii game sensor device (and its associated accessories) for motor rehabilitation, and to outline the potential for incorporating these devices into clinical interventions for the benefit of patients and therapists.

    Interactive natural user interfaces

    For many years, science fiction entertainment has showcased holographic technology and futuristic user interfaces that have stimulated the world's imagination. Movies such as Star Wars and Minority Report portray characters interacting with free-floating 3D displays and manipulating virtual objects as though they were tangible. While these futuristic concepts are intriguing, no commercial, interactive holographic video solution can be found in an everyday electronics store. As used in this work, the term holography refers to artificially created, free-floating objects, whereas the traditional term refers to the recording and reconstruction of 3D image data from 2D media. This research addresses the need for a feasible technological solution that allows users to work with projected, interactive and touch-sensitive 3D virtual environments. It aims to construct an interactive holographic user interface system by consolidating existing commodity hardware and interaction algorithms, and it also studies best design practices for human-centric factors related to 3D user interfaces. The problem of 3D user interfaces has been well researched. When portrayed in science fiction, futuristic user interfaces usually consist of a holographic display, interaction controls and feedback mechanisms; in reality, holographic displays are usually realized with volumetric or multi-parallax technology. In this work, a novel holographic display is presented that uses a mini-projector to produce a free-floating image on a fog-like surface. The holographic user interface system consists of a display component, which projects a free-floating image; a tracking component, which allows the user to interact with the 3D display via gestures; and a software component, which drives the complete hardware system. After examining this research, readers will be well informed on how to build an intuitive, eye-catching holographic user interface system for various application arenas.
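
    To make the three-part architecture above concrete, the following is a minimal sketch of how a display, tracking and software component could be wired together. All class names, method names and the stubbed hardware access are hypothetical illustrations under that assumption, not the system's actual code.

```cpp
// Minimal sketch of the three-component architecture: a display that projects
// an image, a tracker that reports gestures, and a software loop tying them
// together. Names and behavior are illustrative stand-ins only.
#include <iostream>

struct Gesture {                      // result of one tracking update
    bool  detected = false;
    float x = 0.f, y = 0.f, z = 0.f;  // fingertip position in display space
};

class DisplayComponent {              // drives the mini-projector / fog screen
public:
    void project(float cursorX, float cursorY) {
        // The real system would render the scene and highlight the touched
        // region; here we simply log the cursor position.
        std::cout << "projecting frame, cursor at ("
                  << cursorX << ", " << cursorY << ")\n";
    }
};

class TrackingComponent {             // wraps the gesture/depth sensor
public:
    Gesture poll() {
        Gesture g;
        g.detected = true;            // stub: pretend a fingertip was found
        g.x = 0.5f; g.y = 0.5f; g.z = 0.1f;
        return g;
    }
};

// Software component: the loop that drives the complete hardware system.
int main() {
    DisplayComponent display;
    TrackingComponent tracker;
    for (int frame = 0; frame < 3; ++frame) {   // a few iterations for demo
        Gesture g = tracker.poll();
        if (g.detected)
            display.project(g.x, g.y);          // map the gesture to the display
    }
    return 0;
}
```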

    Development and Testing of a Self-Contained, Portable Instrumentation System for a Fighter Pilot Helmet

    A self-contained, portable, inertial and positional measurement system was developed and tested for an HGU-55 model fighter pilot helmet. The system, designated the Portable Helmet Instrumentation System (PHIS), demonstrated the recording of accelerations and rotational rates experienced by the human head in a flight environment. A compact, self-contained, “knee-board” sized computer recorded these accelerations and rotational rates during flight. The present research reports the results of a limited evaluation of this helmet-mounted instrumentation system flown in an Extra 300 fully aerobatic aircraft. The accuracy of the helmet-mounted inertial head-tracker was compared to an aircraft-mounted reference system, and the ability of the PHIS to record position, orientation and inertial information with sufficient fidelity in ground and flight conditions was evaluated. The concepts demonstrated in this system are: 1) calibration of the inertial sensing element without external equipment; 2) the use of differential inertial sensing to remove the accelerations and rotational rates of a moving vehicle from the pilot’s head-tracking measurements; and 3) the determination of three-dimensional position and orientation from three corresponding points using a range sensor. The range sensor did not operate as planned: the helmet remained within the range sensor’s field of view for only 37% of flight time. Vertical accelerations showed the greatest correlation when comparing helmet measurements to aircraft measurements, and the PHIS operated well during level flight.
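
    The differential inertial sensing concept (item 2 above) can be illustrated with a small sketch: the aircraft-mounted IMU's measurements are subtracted from the helmet-mounted IMU's measurements to isolate head motion relative to the cockpit. The code below is a simplification that assumes both sensor frames are aligned and ignores lever-arm effects; it is not the PHIS implementation.

```cpp
// Simplified differential inertial sensing: subtract the vehicle IMU's motion
// from the helmet IMU's motion to obtain head motion relative to the cockpit.
// Assumes aligned sensor frames; lever-arm and misalignment terms are ignored.
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

struct ImuSample {
    Vec3 accel;   // specific force, m/s^2
    Vec3 rate;    // angular rate, rad/s
};

// Head motion relative to the vehicle = helmet measurement - aircraft measurement.
ImuSample differential(const ImuSample& helmet, const ImuSample& aircraft) {
    ImuSample head{};
    for (int i = 0; i < 3; ++i) {
        head.accel[i] = helmet.accel[i] - aircraft.accel[i];
        head.rate[i]  = helmet.rate[i]  - aircraft.rate[i];
    }
    return head;
}

int main() {
    // Example: the aircraft pulls ~2 g while rolling; the pilot also nods,
    // adding a small pitch rate on top of the vehicle motion.
    ImuSample helmet   {{0.3, 0.1, 19.8}, {0.50, 0.25, 0.02}};
    ImuSample aircraft {{0.2, 0.0, 19.6}, {0.50, 0.05, 0.02}};

    ImuSample head = differential(helmet, aircraft);
    std::printf("head-relative pitch rate: %.2f rad/s\n", head.rate[1]);
    std::printf("head-relative vertical accel: %.2f m/s^2\n", head.accel[2]);
    return 0;
}
```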

    Enabling natural interaction for virtual reality

    This research focuses on the exploration of software and methods to support natural interaction within a virtual environment. Natural interaction refers to the ability of the technology to support human interactions with computer-generated simulations that most accurately reflect interactions with real objects. In the years since the invention of computer-aided design tools, computers have become ubiquitous in the product design process, and engineers and designers increasingly use immersive virtual reality to evaluate virtual products throughout the entire design process. The goal of this research is to develop tools that support verisimilitude, or likeness to reality, particularly with respect to human interaction with virtual objects. Increasing the verisimilitude of the interactions and experiences in a virtual environment has the potential to increase the external validity of such data, resulting in more reliable decisions and better products. First, interface software is presented that extends the potential reach of virtual reality to include low-cost, consumer-grade motion sensing devices, thus enabling virtual reality on a broader scale. Second, a software platform, VR JuggLua, is developed to enable rapid and iterative creation of natural interactions in virtual environments, including by end-user programmers. Building on this platform, the remainder of the research focuses on supporting virtual assembly and decision making. The SPARTA software incorporates a powerful physically-based modeling simulation engine tuned for haptic interaction. The workspace of a haptic device is both virtually expanded, through an extension to the bubble technique, and physically expanded, through integration of a haptic device with a multi-directional mobile platform. Finally, a class of hybrid methods for haptic collision detection and response is characterized in terms of five independent tasks. One such novel hybrid method, which selectively restores degrees of freedom in haptic assembly, is developed and assessed with respect to low-clearance CAD assembly; unlike previous related approaches, it maintains the 1000 Hz update rate required for stable haptics. Overall, this work forms a pattern of contributions towards enabling natural interaction for virtual reality and advances the ability to use an immersive environment for decision making during product design.
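
    The bubble technique referenced above maps handle motion directly while the haptic handle stays inside a spherical region of the device workspace, and switches to rate control once the handle crosses that boundary, drifting the virtual workspace outward. The sketch below illustrates only that basic mapping; the radius, gain and control structure are assumed values and do not reflect the dissertation's extension of the technique.

```cpp
// Minimal sketch of the basic bubble-technique mapping: position control
// inside the bubble, rate-controlled workspace drift outside it.
#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;

struct BubbleMapping {
    double radius = 0.05;   // bubble radius in metres (assumed)
    double gain   = 2.0;    // rate gain, (m/s) per metre of overshoot (assumed)
    Vec3 offset{};          // accumulated virtual-workspace translation

    // devicePos: handle position relative to the bubble centre.
    // dt: control-loop period, e.g. 0.001 s for a 1000 Hz haptic loop.
    Vec3 cursor(const Vec3& devicePos, double dt) {
        double dist = std::sqrt(devicePos[0] * devicePos[0] +
                                devicePos[1] * devicePos[1] +
                                devicePos[2] * devicePos[2]);
        if (dist > radius) {
            double overshoot = dist - radius;              // penetration past bubble
            for (int i = 0; i < 3; ++i) {
                double dir = devicePos[i] / dist;          // outward unit direction
                offset[i] += gain * overshoot * dir * dt;  // drift the workspace
            }
        }
        Vec3 c;
        for (int i = 0; i < 3; ++i) c[i] = devicePos[i] + offset[i];
        return c;
    }
};

int main() {
    BubbleMapping bubble;
    Vec3 handle{0.06, 0.0, 0.0};                // held 1 cm past the bubble edge
    for (int step = 0; step < 1000; ++step) {   // one simulated second at 1 kHz
        Vec3 c = bubble.cursor(handle, 0.001);
        if (step == 999)
            std::printf("cursor x after 1 s: %.3f m\n", c[0]);
    }
    return 0;
}
```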

    Wearable and IoT technologies application for physical rehabilitation

    This research consists of the development of an IoT physical rehabilitation solution based on wearable devices, combining a set of smart gloves and a smart headband for natural interaction with a set of VR therapeutic serious games developed on the Unity 3D gaming platform. The system allows patients to perform training sessions for hand and finger motor rehabilitation. Data acquisition is performed by an Arduino Nano microcontroller platform, with its ADC connected to analog measurement channels materialized by piezo-resistive force sensors and an IMU module connected via I2C; data communication uses the Bluetooth wireless protocol. The smart headband, designed to act as a first-person controller in the game scenes, collects the patient's head rotation; this value drives the rotation of the player's avatar head, bringing the user and the virtual environment together in a semi-immersive way. The acquired data are stored and processed on a remote server, which helps the physiotherapist evaluate the patients' performance across the different physical activities of a rehabilitation session, using a mobile application developed for game configuration and visualization of results. The use of serious games allows a patient with motor impairments to perform exercises in a highly interactive and non-intrusive way, based on different Virtual Reality scenarios, which contributes to increased motivation during the rehabilitation process. The system supports an unlimited number of training sessions, making it possible to visualize historical values and compare the results of different sessions for an objective evaluation of rehabilitation outcome. Some metrics associated with upper-limb exercises were also considered to characterize the patient's movement during a session.
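
    The glove acquisition path described above (piezo-resistive force sensors read through the Nano's ADC, an IMU on the I2C bus, samples streamed over a Bluetooth link) can be illustrated with an Arduino-style sketch. Pin assignments, the sample rate and the packet format are assumptions for illustration, and the IMU read is only stubbed, so this is not the project's firmware.

```cpp
// Illustrative Arduino-style acquisition loop for the smart glove: force
// sensors sampled via the ADC and streamed as comma-separated packets over a
// serial-connected Bluetooth module. Pins, rate and format are assumed.
#include <Arduino.h>
#include <Wire.h>

const int kForcePins[] = {A0, A1, A2, A3, A4};   // one channel per finger (assumed)
const int kNumChannels = sizeof(kForcePins) / sizeof(kForcePins[0]);

void setup() {
    Serial.begin(9600);   // Bluetooth module wired to the UART (assumed)
    Wire.begin();         // I2C bus for the IMU module (not read in this sketch)
}

void loop() {
    // Read each piezo-resistive force sensor through the ADC and emit one packet.
    Serial.print("F");
    for (int i = 0; i < kNumChannels; ++i) {
        int raw = analogRead(kForcePins[i]);     // 10-bit value, 0..1023
        Serial.print(',');
        Serial.print(raw);
    }
    Serial.println();

    delay(10);            // ~100 Hz sampling, adequate for grasp exercises
}
```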