
    Realnav: Exploring Natural User Interfaces For Locomotion In Video Games

    We present an exploration into realistic locomotion interfaces in video games using spatially convenient input hardware. In particular, we use Nintendo Wii Remotes to create natural mappings between user actions and their representation in a video game. Targeting American football video games, we used the role of the quarterback as an exemplar, since the player needs to maneuver effectively in a small area, run down the field, and perform evasive gestures such as spinning, jumping, or the juke. In our study, we developed three locomotion techniques. The first used a single Wii Remote, placed anywhere on the user's body, using only its acceleration data. The second used only the Wii Remote's infrared sensor and had to be placed on the user's head. The third combined a Wii Remote's acceleration and infrared data using a Kalman filter. The Wii MotionPlus was also integrated to bring the user's orientation into the video game. To evaluate the techniques, we compared each against a cost-effective six degree-of-freedom (6DOF) optical tracker combined with two Wii Remotes placed on the user's feet. Finally, a user study was performed to determine whether players preferred any of these techniques. The results showed that the second and third techniques matched the location accuracy of the cost-effective 6DOF tracker, but the first was too inaccurate for video game players. Furthermore, the range of the Wii Remote's infrared sensor and MotionPlus exceeded that of the optical tracker. Finally, the user study showed that video game players preferred the third technique over the second, but were split on the use of the MotionPlus when the tasks did not require it.
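The accelerometer/infrared fusion in the third technique can be sketched as a one-dimensional Kalman filter, where the accelerometer drives the prediction step and the infrared camera supplies the position update. This is a minimal illustration, not the authors' implementation; the noise values, time step, and function name are assumptions.

```python
import numpy as np

def kalman_fuse(ir_positions, accels, dt=0.01,
                accel_noise=0.5, ir_noise=0.02):
    """Fuse noisy IR position fixes with accelerometer readings.

    State x = [position, velocity]; the accelerometer is the control
    input for the prediction, and the IR camera provides the
    position measurement for the correction. (All parameters are
    illustrative assumptions, not values from the paper.)
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input model
    H = np.array([[1.0, 0.0]])              # we only measure position
    Q = accel_noise**2 * np.outer(B, B)     # process noise
    R = np.array([[ir_noise**2]])           # measurement noise

    x = np.array([ir_positions[0], 0.0])
    P = np.eye(2)
    estimates = []
    for z, a in zip(ir_positions, accels):
        # predict using the accelerometer reading
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # correct using the IR position fix
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates
```

The filtered position is smoother than the raw IR fixes and keeps updating between IR detections via the accelerometer, which is the practical motivation for combining the two sensors.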

    Airborne Infrared Target Tracking with the Nintendo Wii Remote Sensor

    Intelligence, surveillance, and reconnaissance unmanned aircraft systems (UAS) are the most common variety of UAS in use today and provide invaluable capabilities to both the military and civil services. Keeping the sensors centered on a point of interest for an extended period of time is a demanding task requiring the full attention and cooperation of the UAS pilot and sensor operator. There is great interest in developing technologies which allow an operator to designate a target and let the aircraft automatically maneuver and track it without further operator intervention. Presently, the barriers to entry for developing these technologies are high: expertise in aircraft dynamics and control as well as in real-time motion video analysis is required, and the cost of the systems needed to flight-test these technologies is prohibitive. However, if the research intent is purely to develop a vehicle maneuvering controller, then it is possible to obviate the video analysis problem entirely. This research presents a solution to the target tracking problem which reliably provides automatic target detection and tracking with low expense and computational overhead by making use of the infrared sensor from a Nintendo Wii Remote controller.
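The Wii Remote's infrared camera reports target centroids as pixel coordinates in a 1024x768 frame, so a tracking controller can be as simple as a proportional law on the pixel error. The sketch below is a hypothetical illustration of that idea; the gains, sign conventions, and function name are assumptions, not the thesis's controller.

```python
def track_command(blob_x, blob_y, img_w=1024, img_h=768,
                  k_roll=0.05, k_pitch=0.05):
    """Turn a Wii Remote IR blob detection into steering commands.

    The IR camera reports the target centroid in a 1024x768 pixel
    frame; a proportional law steers the vehicle so the target
    returns to the image centre. (Gains and sign conventions here
    are illustrative assumptions.)
    """
    err_x = blob_x - img_w / 2      # pixels right of centre
    err_y = blob_y - img_h / 2      # pixels below centre
    roll_cmd = k_roll * err_x       # bank toward the target
    pitch_cmd = -k_pitch * err_y    # pitch to re-centre vertically
    return roll_cmd, pitch_cmd
```

Because the camera does the detection in hardware, the host only handles these few arithmetic operations per frame, which is what keeps the computational overhead low.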

    Dual-camera infrared guidance for computed tomography biopsy procedures

    A CT-guided biopsy is a specialised surgical procedure in which a needle is used to withdraw a tissue or fluid specimen from a lesion of interest. The needle is guided while being viewed by a clinician on a computed tomography (CT) scan. CT-guided biopsies invariably expose patients and operators to high doses of radiation, and they are lengthy procedures; the lack of spatial referencing while guiding the needle along the required entry path is one of the difficulties currently encountered. This research addresses two of the challenges clinicians face when performing CT-guided biopsy procedures. The first is the lack of spatial referencing during a biopsy procedure, together with the need for improved accuracy and a reduction in the number of repeated scans. To achieve this, an infrared navigation system was designed and implemented, and an existing approach was extended to help guide the clinician in advancing the biopsy needle. The extended algorithm computes a scaled estimate of the needle endpoint and assists with navigating the biopsy needle through a dedicated, custom-built graphical user interface. The second challenge was to design and implement a training environment in which clinicians could practice different entry angles and scenarios. A prototype training module was designed and built to provide simulated biopsy procedures and help improve spatial referencing. Various experiments and scenarios were designed and tested to demonstrate the correctness of the algorithm and to provide realistic simulated settings in which operators could practice different entry angles and familiarise themselves with the equipment. A comprehensive survey was also undertaken to investigate the advantages and disadvantages of the system.
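One common way to estimate a needle endpoint from optical tracking is to track two markers on the visible part of the shaft and extrapolate along the shaft direction by the known needle length. The sketch below illustrates that geometric idea only; the marker layout, naming, and scaling scheme are assumptions, not the thesis's algorithm.

```python
import numpy as np

def needle_endpoint(marker_a, marker_b, needle_length):
    """Extrapolate the needle tip from two tracked shaft markers.

    marker_a and marker_b are 3D positions of two markers on the
    needle hub/shaft, with marker_b nearer the tip. The tip estimate
    scales the unit shaft direction by the known needle length from
    marker_b. (The marker arrangement is an illustrative assumption.)
    """
    a = np.asarray(marker_a, dtype=float)
    b = np.asarray(marker_b, dtype=float)
    direction = (b - a) / np.linalg.norm(b - a)  # unit vector along shaft
    return b + needle_length * direction
```

Since the tip itself is inside the patient and invisible to the cameras, this kind of scaled extrapolation is what lets a navigation display show the clinician where the endpoint is heading without a repeat scan.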

    3D Capturing Performances of Low-Cost Range Sensors for Mass-Market Applications

    Since the advent of the first Kinect as a motion-controller device for the Microsoft XBOX platform (November 2010), several similar active, low-cost range-sensing devices have been introduced on the mass market for various purposes, including gesture-based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design, and proximity sensing for automotive applications. Given their capability to generate a real-time stream of range images, however, these devices have also been used in some projects as general-purpose range sensors, with performances that may be satisfactory for some applications. This paper describes the working principle of the various devices, analysing their systematic and random errors to explore their applicability to standard 3D capturing problems. Five actual devices were tested, featuring three different technologies: i) the Kinect V1 by Microsoft, the Structure Sensor by Occipital, and the Xtion PRO by ASUS, all based on different implementations of the PrimeSense sensor; ii) the F200 by Intel/Creative, implementing the RealSense pattern-projection technology; and iii) the Kinect V2 by Microsoft, equipped with the Canesta ToF camera. A critical analysis of the results first compares the devices and then identifies the range of applications for which they could actually serve as a viable solution.
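The systematic/random error split the paper analyses can be computed from repeated range measurements against a known reference: the mean residual is the systematic error (bias), and the spread of the residuals is the random error. A minimal sketch of that decomposition, with hypothetical naming:

```python
import numpy as np

def depth_errors(measured_mm, reference_mm):
    """Split range-sensor error into systematic and random parts.

    systematic error = mean deviation from the reference (bias)
    random error     = sample standard deviation of the residuals
    (Function name and units are illustrative assumptions.)
    """
    residuals = np.asarray(measured_mm, dtype=float) - \
                np.asarray(reference_mm, dtype=float)
    return residuals.mean(), residuals.std(ddof=1)
```

In practice the bias can often be calibrated out, while the random error sets the floor on achievable precision, which is why the two are reported separately when judging a device's suitability for 3D capture.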

    Development of 3D communication

    The main problem of the project is to find the correct solution for 3D communication between the user and the machine. The student has to design the devices needed for correct communication between the Wii console and the user: 3D glasses with a movement detector, gloves with detectors, and a stand for the Wii Remote.

    Development and Testing of a Self-Contained, Portable Instrumentation System for a Fighter Pilot Helmet

    A self-contained, portable, inertial and positional measurement system was developed and tested for an HGU-55 fighter pilot helmet. The system, designated the Portable Helmet Instrumentation System (PHIS), demonstrated the recording of accelerations and rotational rates experienced by the human head in a flight environment. A compact, self-contained, “knee-board”-sized computer recorded these accelerations and rotational rates during flight. The present research presents the results of a limited evaluation of this helmet-mounted instrumentation system flown in an Extra 300 fully aerobatic aircraft. The accuracy of the helmet-mounted inertial head tracker was compared to an aircraft-mounted reference system, and the ability of the PHIS to record position, orientation, and inertial information with sufficient fidelity was evaluated in both ground and flight conditions. The concepts demonstrated in this system are: 1) calibration of the inertial sensing element without external equipment; 2) the use of differential inertial sensing to remove the accelerations and rotational rates of a moving vehicle from the pilot’s head-tracking measurements; and 3) the determination of three-dimensional position and orientation from three corresponding points using a range sensor. The range sensor did not operate as planned: the helmet remained within the range sensor’s field of view for only 37% of the flight time. Vertical accelerations showed the greatest correlation when comparing helmet measurements to aircraft measurements. The PHIS operated well during level flight.
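The differential inertial sensing concept amounts to subtracting the aircraft-mounted IMU's angular rates from the helmet-mounted IMU's rates, leaving only the head's motion relative to the airframe. A minimal sketch of that idea, assuming both rate vectors are already expressed in a common body frame (the function name is a hypothetical label, not from the thesis):

```python
import numpy as np

def head_relative_rates(helmet_rates, aircraft_rates):
    """Differential inertial sensing of head motion.

    Subtracting the vehicle IMU's angular rates from the helmet
    IMU's rates removes the aircraft's own motion, leaving the
    pilot's head rotation relative to the airframe. Both inputs are
    assumed to be in (or rotated into) the same body frame.
    """
    return (np.asarray(helmet_rates, dtype=float) -
            np.asarray(aircraft_rates, dtype=float))
```

During a steady maneuver the two IMUs report nearly identical rates, so the difference stays near zero unless the head actually moves, which is the property the helmet tracker relies on.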