Universal Gesture Tracking Framework in OpenISS and ROS and its Applications
In this work, we present a common, extensible framework that abstracts several vision-based gesture recognition middleware packages and exposes the gesture data they produce through a single, simpler API, enabling hand-gesture interaction in diverse kinds of applications. We demonstrate various aspects of the framework via instrumentation and enable gesture interaction for two specific yet different needs.
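The abstraction described above can be sketched as an adapter pattern: each middleware-specific adapter normalizes its own data into one uniform gesture event type that applications consume. This is a minimal, hypothetical sketch; the names (`GestureEvent`, `GestureSource`, `FakeMiddlewareAdapter`) are illustrative assumptions, not the actual OpenISS API.

```python
# Hypothetical sketch of a uniform gesture-data API over multiple
# middleware backends. All class and field names are assumptions.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class GestureEvent:
    """Uniform gesture data, regardless of the underlying middleware."""
    gesture: str       # e.g. "swipe_left", "grab"
    hand: str          # "left" or "right"
    confidence: float  # 0.0 .. 1.0


class GestureSource(ABC):
    """Abstracts a vision-based gesture recognition middleware."""

    @abstractmethod
    def poll(self) -> list:
        """Return the gesture events observed since the last poll."""


class FakeMiddlewareAdapter(GestureSource):
    """Stand-in adapter: converts middleware-specific records
    into uniform GestureEvent objects."""

    def __init__(self, raw_records):
        self._raw = raw_records

    def poll(self):
        # Normalize middleware-specific dicts to the uniform event type.
        return [GestureEvent(r["name"], r["hand"], r["score"])
                for r in self._raw]


# An application consumes GestureEvents without knowing which
# middleware produced them.
source = FakeMiddlewareAdapter(
    [{"name": "swipe_left", "hand": "right", "score": 0.92}]
)
events = source.poll()
print(events[0].gesture)  # → swipe_left
```

Applications written against `GestureSource` need no changes when a different recognition backend is plugged in, which is the point of the uniform API.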
First, we alleviate the limited gesture-tracking functionality of ISSv2 (Illimitable Space System v2), an interactive, configurable artists' toolbox used to create music visualizations, visual effects, and interactive documentary film driven by user input such as gestures, voice, and motion.
Second, we provide a proof-of-concept solution that demonstrates and improves the limited usability of FORENSIC LUCID program composition and compiler interactivity: a forensic investigator can create partial FORENSIC LUCID-encoded programs by manipulating preloaded digital evidence objects in a 3D warehouse-like application (DigiEVISS) through hand-gesture interaction.
We also leverage the Robot Operating System (ROS), an open-source set of tools and libraries, using its communication middleware to broadcast our framework's data over the network. We provide this framework as a specialization of the OpenISS core framework and evaluate it on several aspects, employing metrics such as effective frame rate and delay in exemplar scenarios that represent the two needs above.
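The two metrics named above can be computed from per-frame timestamps. The following is a minimal sketch under assumed definitions (effective frame rate as frames delivered per second of wall time; delay as capture-to-display latency); the function names and timestamp values are illustrative, not the paper's actual measurement code.

```python
# Hypothetical sketch of the evaluation metrics mentioned above.
# Timestamps are in seconds; the sample values are made up.

def effective_frame_rate(frame_timestamps):
    """Frames actually delivered per second of elapsed wall-clock time."""
    if len(frame_timestamps) < 2:
        return 0.0
    elapsed = frame_timestamps[-1] - frame_timestamps[0]
    return (len(frame_timestamps) - 1) / elapsed


def mean_delay(capture_times, display_times):
    """Average capture-to-display latency per frame (seconds)."""
    delays = [d - c for c, d in zip(capture_times, display_times)]
    return sum(delays) / len(delays)


# Example: 5 frames over 0.2 s of wall time -> 20 fps effective.
stamps = [0.00, 0.05, 0.10, 0.15, 0.20]
print(effective_frame_rate(stamps))                    # → 20.0
print(mean_delay(stamps, [t + 0.03 for t in stamps]))  # ≈ 0.03
```

Such wall-clock metrics capture the end-to-end cost of the abstraction layer and the ROS network hop, rather than only the raw sensor frame rate.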
DEMO_109: The CBC Newsworld Holodeck Exploratory Demonstration
For the past 73 years, the CBC has disseminated a unique Canadian perspective across the world, producing a phenomenally rich multimedia record of the country and of our social, political, and cultural heritage and news. This project uses visualization and sonification of portions of an enormous historical CBC Newsworld data corpus to create an "on this day" experience for viewers. The digitized collection of 24-hour news videos spans a 24-year period (1989-2013) and is presented within an immersive multiscreen environment that enables gesture-driven, context-aware browsing, information seeking, and segment review. Employing natural language processing technologies, the interface displays keywords and key phrases identified in the transcripts, enabling serendipitous video search and display and offering a unique browsing opportunity within this rich "big data" corpus.