GALLAG Strip: A Mobile, Programming With Demonstration Environment for Sensor-Based Context-Aware Application Programming
abstract: The Game As Life - Life As Game (GALLAG) project investigates how people might change their lives if they think of, or experience, their life as a game. The GALLAG system aims to help people reach their personal goals through context-aware computing and tailored games and applications. To accomplish this, the system combines sensing technologies, remote audio/video feedback, mobile devices, and an application programming interface (API) that empowers users to create their own context-aware applications. However, the API requires programming in source code, a task that is too complicated and abstract for many users. This thesis presents GALLAG Strip, a novel approach to programming sensor-based context-aware applications that combines the Programming With Demonstration technique with a mobile device, enabling users to experience their applications as they program them. GALLAG Strip lets users create sensor-based context-aware applications in an intuitive and appealing way without computer programming skills; instead, they program their applications by physically demonstrating the envisioned interactions within a space, using the same interface they will later use to interact with the system, that is, GALLAG-compatible sensors and mobile devices. GALLAG Strip was evaluated in a real-world study with end users, measuring their ability to program simple and complex applications accurately and in a timely manner, together with a benchmark against expert GALLAG system programmers creating the same applications. The data and feedback collected show that GALLAG Strip allows users to create sensor-based context-aware applications easily and accurately, without the prior programming skills the GALLAG system currently requires, and enables them to create almost all of their envisioned applications.
Dissertation/Thesis. M.S. Computer Science, 201
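The programming-by-demonstration idea the abstract describes can be illustrated with a minimal sketch. This is not the GALLAG Strip implementation; all names (`SensorEvent`, `DemonstratedApp`, the sensor IDs) are assumptions chosen for illustration. The user's physical demonstration is recorded as an ordered sequence of sensor events, and the resulting application fires its action whenever that sequence recurs:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of programming by demonstration: the demonstrated
# sequence of sensor events becomes the trigger of a context-aware app.
@dataclass(frozen=True)
class SensorEvent:
    sensor_id: str   # e.g. a GALLAG-compatible contact or motion sensor
    state: str       # e.g. "opened", "closed", "motion"

@dataclass
class DemonstratedApp:
    trigger: List[SensorEvent]  # events captured during the demonstration
    action: str                 # feedback to deliver when the trigger recurs
    _progress: int = 0          # how far through the sequence we are

    def observe(self, event: SensorEvent) -> Optional[str]:
        """Advance through the demonstrated sequence; fire on completion."""
        if event == self.trigger[self._progress]:
            self._progress += 1
            if self._progress == len(self.trigger):
                self._progress = 0  # re-arm for the next occurrence
                return self.action
        else:
            self._progress = 0      # wrong event resets the sequence
        return None

# Demonstration phase: the user physically opens the cabinet, then the fridge.
demo = [SensorEvent("cabinet", "opened"), SensorEvent("fridge", "opened")]
app = DemonstratedApp(trigger=demo, action="play reminder audio")

# Use phase: repeating the same interactions triggers the recorded action.
print(app.observe(SensorEvent("cabinet", "opened")))  # -> None (partial match)
print(app.observe(SensorEvent("fridge", "opened")))   # -> play reminder audio
```

The key property this sketch captures is that the demonstration interface and the runtime interface are the same: the user never writes the trigger, they perform it.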
Tangible User Interfaces and Metaphors for 3D Navigation
The most fundamental and common 3D interaction is control of the virtual camera or viewpoint, commonly referred to as navigation. The navigational requirements of controlling multiple degrees of freedom and maintaining adequate spatial awareness pose significant challenges for many users. Many tasks additionally demand much of the user's cognitive effort for non-navigational aspects. New solutions that are simple and naturally efficient are therefore in high demand. These major challenges to 3D navigation have yet to be satisfactorily addressed, and as a result, no suitable unified 3D interaction technique or metaphor has yet emerged.
We present a new domain- and task-independent 3D navigation metaphor, Navigational Puppetry, which we intend as a candidate for the navigational portion of a unifying 3D interaction metaphor. The major components of the metaphor - the puppet, puppeteer, stage, and puppet-view - enable a new meta-navigational perspective and provide the user with a graspable navigational avatar, within a multiple-view perspective, that lets them 'reach' into the virtual world and manipulate the viewpoint directly. We position this metaphor as a distinct articulation of the leading edge of a puppetry-related trend in recent 3D navigation solutions. The metaphor was implemented in a tangible user interface prototype called the Navi-Teer. Two usability studies and a unique spatial audio experiment were conducted to observe and demonstrate, respectively, the metaphor's benefits: tactile intimacy, spatial orientation, easy capture of complex input, and support for collaboration.
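The core of the metaphor - moving a graspable avatar on a stage to move the virtual viewpoint - can be sketched as a pose mapping. This is an illustrative assumption, not the Navi-Teer implementation: the tracked position and heading of the physical puppet are scaled into world coordinates and reused directly as the camera's eye point and view direction.

```python
import math

# Illustrative sketch (not the Navi-Teer's actual mapping): the puppet's
# pose on the physical stage is mapped onto the virtual camera, so moving
# the puppet moves the viewpoint in the virtual world.
def puppet_to_camera(puppet_pos, puppet_yaw_deg, stage_to_world_scale=10.0):
    """Scale the puppet's stage position into world coordinates and
    reuse its heading as the camera's view direction."""
    x, y, z = (c * stage_to_world_scale for c in puppet_pos)
    yaw = math.radians(puppet_yaw_deg)
    forward = (math.sin(yaw), 0.0, math.cos(yaw))  # unit view direction
    return {"eye": (x, y, z), "forward": forward}

# A puppet held 0.2 m right and 0.5 m forward on the stage, facing right.
cam = puppet_to_camera((0.2, 0.1, 0.5), 90.0)
print(cam["eye"])  # -> (2.0, 1.0, 5.0)
```

Because the mapping is direct rather than rate-controlled, the puppet behaves as a navigational avatar: where the user places it is where the viewpoint goes.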
Storytelling: The Human Experience Through Data-Driven Instruments
Digital Portfolio Dissertation files include: the dissertation text document; zipped computer files; and 7 performance videos linked for streaming at this URL on the University of Oregon Panopto service: https://uoregon.hosted.panopto.com/Panopto/Pages/Sessions/List.aspx#folderID=%22eba8a7df-69b3-4e67-a924-b0a001853032%22. Archival copies of the videos have been preserved by the UO Libraries.

This Digital Portfolio Dissertation is a collection of seven original real-time, interactive, multichannel compositions featuring data-driven instruments. The dissertation includes video recordings of seven individual performances, associated files needed for the performance of each work, and a descriptive text document for each of the seven compositions. The text document in this digital portfolio dissertation describes the storytelling components, the musical ideas and compositional structure of each composition, the design and implementation of each data-driven instrument, the sonic materials and data mapping strategies, as well as other extra-musical elements associated with each composition.
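A common data mapping strategy for data-driven instruments is to rescale an incoming data stream linearly onto musical parameters. The sketch below is an illustration of that general strategy only; the parameter names, sensor range, and target ranges are assumptions, not details taken from the dissertation.

```python
# Illustrative data-mapping sketch: rescale a data stream onto musical
# parameters, a common strategy in data-driven instrument design.
def lin_map(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)          # clamp out-of-range readings
    return out_lo + t * (out_hi - out_lo)

# Map a hypothetical 10-bit sensor reading (0-1023) to a MIDI pitch and
# a normalized amplitude.
reading = 512
pitch = lin_map(reading, 0, 1023, 36, 96)   # assumed C2..C7 pitch range
amp = lin_map(reading, 0, 1023, 0.0, 1.0)   # assumed amplitude range
print(round(pitch, 1), round(amp, 2))  # -> 66.0 0.5
```

In practice such mappings are one layer of an instrument; nonlinear curves, smoothing, and one-to-many mappings are frequent refinements, but the linear rescale is the usual starting point.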