Multi-person Spatial Interaction in a Large Immersive Display Using Smartphones as Touchpads
In this paper, we present a multi-user interaction interface for a large
immersive space that supports simultaneous screen interactions by combining (1)
user input via personal smartphones and Bluetooth microphones, (2) spatial
tracking via an overhead array of Kinect sensors, and (3) WebSocket interfaces
to a webpage running on the large screen. Users are automatically, dynamically
assigned personal and shared screen sub-spaces based on their tracked location
with respect to the screen, and use a webpage on their personal smartphone for
touchpad-type input. We report user experiments using our interaction framework
that involve image selection and placement tasks, with the ultimate goal of
realizing display-wall environments as viable, interactive workspaces with
natural multimodal interfaces.

Comment: 8 pages with references
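The abstract's location-based assignment of personal screen sub-spaces could be sketched as a simple nearest-slot mapping from each user's tracked position along the wall to an equal-width screen region. This is a hypothetical illustration, not the authors' algorithm; the function name, the equal-width slot scheme, and all parameters are assumptions.

```python
# Hypothetical sketch of location-based sub-space assignment: map each
# tracked user's x-position (metres along the display wall) to the index
# of the nearest of n_slots equal-width screen slots. The equal-width
# scheme is an assumption, not the method described in the paper.

def assign_subspaces(user_positions_x, wall_width, n_slots):
    """Return {user_id: slot_index} for users tracked along the wall."""
    slot_width = wall_width / n_slots
    assignments = {}
    for user_id, x in user_positions_x.items():
        # Integer-divide position by slot width; clamp to the last slot
        # so users standing at the far edge still get a valid index.
        slot = min(int(x / slot_width), n_slots - 1)
        assignments[user_id] = slot
    return assignments

# Two users in front of a 6 m wall split into 3 slots:
print(assign_subspaces({"alice": 0.8, "bob": 5.5}, wall_width=6.0, n_slots=3))
# → {'alice': 0, 'bob': 2}
```

In the real system the tracked positions would come from the overhead Kinect array, and the resulting assignments would be pushed to the large-screen webpage over its WebSocket interface.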