3,581 research outputs found
Evaluation of the Oculus Rift S tracking system in room scale virtual reality
In specific virtual reality applications that require high accuracy, it may be advisable to replace the built-in tracking system of the HMD with a third-party solution. The purpose of this work is to evaluate the accuracy of the built-in tracking system of the Oculus Rift S Head Mounted Display (HMD) in room-scale environments against a motion capture system. In particular, an experimental evaluation of the Oculus Rift S inside-out tracking technology was carried out and compared to the performance of an outside-in tracking method based on the OptiTrack motion capture system. To track the pose of the HMD with the motion capture system, the Oculus Rift S was instrumented with passive retro-reflective markers and calibrated. Experiments were performed on a dataset of multiple paths, including simple motions as well as more complex trajectories; each recorded path contained simultaneous changes in both the position and the orientation of the HMD. Our results indicate that in room-scale environments the average translation error of the Oculus Rift S tracking system is about 1.83 cm and the average rotation error is about 0.77°, roughly two orders of magnitude larger than the errors achievable with a motion capture system.
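The translation and rotation errors reported above can be computed by comparing time-aligned pose samples from the headset and the motion capture system. The sketch below is a minimal illustration of that comparison, not the authors' actual evaluation pipeline; the function name, the sample data, and the (w, x, y, z) quaternion convention are assumptions.

```python
import numpy as np

def pose_errors(est_pos, gt_pos, est_quat, gt_quat):
    """Per-sample translation error (input units) and rotation error
    (degrees) between estimated and ground-truth HMD poses.
    Quaternions are unit-normalized (w, x, y, z); samples time-aligned."""
    # Translation error: Euclidean distance per sample.
    t_err = np.linalg.norm(est_pos - gt_pos, axis=1)
    # Rotation error: angle of the relative rotation between the two
    # orientations, from the absolute dot product of unit quaternions
    # (the absolute value handles the q / -q double cover).
    dots = np.abs(np.sum(est_quat * gt_quat, axis=1))
    r_err = np.degrees(2.0 * np.arccos(np.clip(dots, -1.0, 1.0)))
    return t_err, r_err

# Hypothetical time-aligned samples (positions in metres).
gt_p = np.array([[0.0, 1.6, 0.0], [0.5, 1.6, 0.2]])
est_p = gt_p + np.array([[0.018, 0.0, 0.0], [0.0, 0.02, 0.0]])
gt_q = np.array([[1.0, 0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0]])
est_q = gt_q.copy()

t_err, r_err = pose_errors(est_p, gt_p, est_q, gt_q)
print(t_err.mean() * 100)  # mean translation error in cm → 1.9
```

Averaging these per-sample errors over all recorded paths yields summary figures of the kind quoted in the abstract.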
Explorable 3D Model of SCU Campus
My project is an interactive 3D model of the SCU campus, which prospective students and their parents can explore without actually having to make the journey to campus. The university's architecture is made traversable by running under Unreal Engine 4, a 3D game development framework that supports the Oculus Rift. The Oculus Rift is a virtual reality headset that enhances the immersive experience for users of the SCU campus application. It accomplishes this by displaying the rendered images in immersive 3D right in front of the user's eyes and by tracking head motion, moving the viewpoint in the virtual world accordingly, so it is as if the user were actually there. Oculus Rift compatibility is fully integrated into the Unreal Engine, so it is only natural to take advantage of the technology for this project.
Libraries and the rift: Oculus Rift and 4D devices in libraries
Oculus Rift is a headset device that goes over the eyes; created with immersive gaming in mind, it gives the user a four-dimensional experience. This isn't the same as watching 3D television; this is a headset designed to take peripheral vision into account. One hundred percent of the wearer's field of view is covered, every turn of the head is tracked, and the encounter is totally engaging. Virtual reality is not new, but 4D systems are making this type of gaming and learning both fun and exciting. The user experience is very realistic, and the possibilities are wide open. There are multiple options in the 4D realm. Oculus Rift is new, and applications are still being built. The development kit, which consists of a headset as well as a camera for tracking, is available. This type of technology may be new and seem a little daunting, but integrating it into libraries and education is certain to lead to student discovery and excitement.
Implementation Of Camera Arm Control By An Oculus Rift On A Da Vinci Surgical System Simulation
Camera control methods play a significant role in remote surgery. Two methods have been developed to control the camera arm of the da Vinci Surgical System: a standard clutch-based method for manual movement of the camera and an autonomous camera (auto-camera) method. In the standard method, the surgeon positions the camera manually using a pair of hand controllers. This happens frequently during surgery and can distract from the surgical procedure. The second method was developed to relieve the surgeon of this distraction: the auto-camera method enables the system to move the camera autonomously. In this method, the camera is moved with respect to the center of the surgical tool arms, with automatic zoom control. There are still many issues with automatically moving a camera. We will show the feasibility of an intermediate solution using an Oculus Rift head-mounted stereo display.
Achieving the optimal camera viewpoint with simple control methods is of utmost importance for remote surgical systems. We propose a new method to move the camera arm based on the sensors within the Oculus Rift. Can a surgeon put on the Oculus Rift (a virtual reality headset), get a stereoscopic view, and control the camera with simple head gestures? In this case, the surgeon would see the 3D camera view from the scope inside the Oculus Rift and move the viewpoint with his or her head orientation. The position and orientation of the Oculus Rift are measured by an inertial measurement unit and optical tracking sensors within the Oculus platform. These data can be used to control the position and orientation of the camera arm.
In this thesis, a complete system will be created based on the Robot Operating System (ROS) and a 3D simulation of the da Vinci robot in RViz. In addition, a usability study will be conducted to analyze system accuracy. For this evaluation, the headset orientation will be compared to the corresponding orientation of the camera in simulation. We will also check whether subjects can use the system comfortably during a simple operation.
In this study, we propose controlling the camera arm with the Oculus Rift as a new camera control method. It is anticipated that the headset movement will match the corresponding motion in RViz (the simulation environment for the robot). We anticipate that our results will demonstrate the feasibility of this method for controlling a camera, and we will propose next steps for testing the system on da Vinci hardware, leading towards a system for the operating room of the future.
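A head-gesture camera controller of this kind typically shapes the raw headset angles before commanding the arm, for example with a deadband that ignores small involuntary head motion, a gain that scales head movement down to gentler camera motion, and a clamp on the commanded range. The sketch below illustrates one plausible mapping; the deadband, gain, and pitch limit are illustrative values, not da Vinci or Oculus specifications, and the function name is an assumption.

```python
import math

def head_to_camera(head_yaw, head_pitch, deadband=math.radians(2),
                   pitch_limit=math.radians(30), gain=0.5):
    """Map headset yaw/pitch (radians) to camera-arm angle targets.
    All parameter values are illustrative, not device specifications."""
    def shape(angle):
        # Ignore small angles entirely (deadband against head tremor).
        if abs(angle) < deadband:
            return 0.0
        # Subtract the deadband so the command ramps up from zero,
        # then scale by the gain for gentler camera motion.
        sign = 1.0 if angle > 0 else -1.0
        return sign * (abs(angle) - deadband) * gain

    cam_yaw = shape(head_yaw)
    # Clamp pitch to the arm's assumed safe range.
    cam_pitch = max(-pitch_limit, min(pitch_limit, shape(head_pitch)))
    return cam_yaw, cam_pitch

# A 10-degree head turn commands a scaled yaw; a 1-degree pitch
# falls inside the deadband and produces no camera motion.
print(head_to_camera(math.radians(10), math.radians(-1)))
```

In a ROS-based system such as the one described, the resulting targets would be published as joint or pose commands for the simulated camera arm in RViz.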
Exploring Oculus Rift: A Historical Analysis of the ‘Virtual Reality’ Paradigm
This paper first provides background information about Virtual Reality in order to better analyze its development throughout history and into the future. Next, it begins an in-depth historical analysis of how virtual reality developed prior to 1970, a pivotal year in Virtual Reality history, followed by an exploration of how this development paradigm shifted between the 1970s and the turn of the century. The historical analysis of virtual reality concludes by covering the modern period from 2000 to the present. Finally, the paper examines the layout of the virtual reality field with respect to the history and innovations presented.
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time and lets users feel objects through passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current virtual reality (VR) environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
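Building the virtual world on top of a 3D scan implies a one-time rigid alignment between tracker coordinates and virtual-world coordinates, so that tracked skeleton joints and physical objects land on their virtual counterparts. The sketch below shows the standard Kabsch/SVD solution to that alignment over a few hypothetical calibration points; it illustrates the general technique, not the MS2 implementation, and all names and values are assumptions.

```python
import numpy as np

def align_rigid(real_pts, virt_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    tracker coordinates to virtual-world coordinates via the standard
    Kabsch/SVD method. Assumes >= 3 non-collinear corresponding points
    collected during a one-time calibration."""
    rc, vc = real_pts.mean(axis=0), virt_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (real_pts - rc).T @ (virt_pts - vc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = vc - R @ rc
    return R, t

# Hypothetical calibration: the virtual world is the scan shifted
# by (1, 0, 2) metres, with no rotation.
real = np.array([[0., 0., 0.], [1., 0., 0.], [0., 2., 0.], [0., 0., 3.]])
virt = real + np.array([1.0, 0.0, 2.0])
R, t = align_rigid(real, virt)

joint = np.array([0.3, 1.6, 0.3])   # a tracked skeleton joint
print(R @ joint + t)                 # its virtual-world position
```

Once calibrated, the same transform maps every tracked joint and object into the virtual scene each frame, which is what lets users walk around obstacles they can also see in VR.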
Viewing the Future? Virtual Reality In Journalism
Journalism underwent a flurry of virtual reality content creation, production, and distribution starting in the final months of 2015. The New York Times distributed more than 1 million cardboard virtual reality viewers and released an app showing a spherical video short about displaced refugees. The Los Angeles Times landed people next to a crater on Mars. USA TODAY took visitors on a ride-along in the "Back to the Future" car on the Universal Studios lot and on a spin through Old Havana in a bright pink '57 Ford. ABC News went to North Korea for a spherical view of a military parade and to Syria to see artifacts threatened by war. The Emblematic Group, a company that creates virtual reality content, followed a woman navigating a gauntlet of anti-abortion demonstrators at a family planning clinic and allowed people to witness a murder-suicide stemming from domestic violence. In short, the period from October 2015 through February 2016 was one of significant experimentation with virtual reality (VR) storytelling. These efforts are part of an initial foray into determining whether VR is a feasible way to present news. The year 2016 is shaping up as a period of further testing and careful monitoring of potential growth in the use of virtual reality among consumers.