
Gesture Controlled Virtual Reality Desktop

By Jordan Cheney and Davis Ancona


Today, computers are operated through one- or two-dimensional input devices such as a keyboard, mouse, or touch screen. This, however, is not how humans naturally perceive and interact with their environment. To move beyond this paradigm and evolve human-computer interaction, we propose a new system that allows three-dimensional interaction with a computer using intuitive, natural gestures similar to the movements of everyday life. To create a proof of concept for this style of interface, we combine the MYO Gesture Control Armband, the Unity game engine, and the Oculus Rift virtual reality headset to build an interactive File System Manager application in a three-dimensional, virtual reality world. We first use the MYO to read muscle activity in the forearm and determine the user's hand pose. This pose information is then sent to Unity, which creates the virtual reality environment and maps the poses to functions on interactive objects. Finally, the Oculus Rift uses stereoscopic rendering to display this environment to the user in a way that immerses them in the world we have created.
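The pose-to-function mapping at the heart of this pipeline can be sketched as a simple dispatch table. The sketch below is a minimal illustration, not the authors' implementation: all pose names and action functions are hypothetical stand-ins for poses the MYO can recognize and for file-manager operations in the Unity scene.

```python
# Minimal sketch of routing recognized hand poses to file-manager actions,
# in the spirit of the MYO -> Unity pipeline described above.
# Pose names and action functions are illustrative assumptions.

def open_item(name):
    return f"opening {name}"

def grab_item(name):
    return f"grabbing {name}"

def release_item(name):
    return f"releasing {name}"

# Dispatch table: each recognized pose triggers one interaction.
POSE_ACTIONS = {
    "fingers_spread": open_item,
    "fist": grab_item,
    "rest": release_item,
}

def handle_pose(pose, target):
    """Route a recognized hand pose to its file-system action."""
    action = POSE_ACTIONS.get(pose)
    if action is None:
        return f"unrecognized pose: {pose}"
    return action(target)
```

In a real Unity integration the dispatch would happen in a C# script subscribed to the armband's pose events, but the structure is the same: a recognized pose selects one interaction on the currently targeted object.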

Year: 2014