
    A Mimetic Strategy to Engage Voluntary Physical Activity In Interactive Entertainment

    We describe the design and implementation of a vision-based interactive entertainment system that makes use of both involuntary and voluntary control paradigms. Unintentional input to the system from a potential viewer is used to drive attention-getting output and encourage the transition to voluntary interactive behaviour. The iMime system consists of a character animation engine, based on the interaction metaphor of a mime performer, that simulates non-verbal communication strategies, without spoken dialogue, to capture and hold the attention of a viewer. The system was developed in the context of a project studying care of dementia sufferers. Care for a dementia sufferer can place unreasonable demands on the time and attentional resources of their caregivers or family members. Our study contributes to the eventual development of a system aimed at providing relief to dementia caregivers, while at the same time serving as a source of pleasant interactive entertainment for viewers. The work reported here is also aimed at a more general study of the design of interactive entertainment systems involving a mixture of voluntary and involuntary control.
    Comment: 6 pages, 7 figures, ECAG08 workshop
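    The abstract does not detail how the hand-off from involuntary to voluntary control is implemented. One minimal way to structure it is a small mode machine, sketched below in Python; the mode names, the inputs, and the 3-second attention threshold are illustrative assumptions of this sketch, not taken from the paper.

```python
from enum import Enum, auto
import time

class Mode(Enum):
    IDLE = auto()      # no viewer detected
    ATTRACT = auto()   # involuntary input drives attention-getting output
    INTERACT = auto()  # viewer has engaged voluntarily

class EngagementController:
    """Switches behaviour as a viewer moves from incidental
    (involuntary) to deliberate (voluntary) input."""

    def __init__(self, engage_threshold=3.0):
        self.mode = Mode.IDLE
        self.engage_threshold = engage_threshold  # seconds of sustained attention
        self._attended_since = None

    def update(self, motion_detected: bool, facing_camera: bool) -> Mode:
        now = time.monotonic()
        if self.mode is Mode.IDLE and motion_detected:
            self.mode = Mode.ATTRACT              # a passer-by triggers attract behaviour
        elif self.mode is Mode.ATTRACT:
            if facing_camera:
                self._attended_since = self._attended_since or now
                if now - self._attended_since >= self.engage_threshold:
                    self.mode = Mode.INTERACT     # sustained attention => voluntary engagement
            else:
                self._attended_since = None
                if not motion_detected:
                    self.mode = Mode.IDLE
        elif self.mode is Mode.INTERACT and not motion_detected:
            self.mode = Mode.IDLE
        return self.mode
```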

    Dynamic Lighting for Tension in Games

    Video and computer games are among the most complex forms of interactive media. Games simulate many elements of traditional media, such as plot, characters, sound and music, lighting, and mise-en-scene. However, games are digital artifacts played through graphic interfaces and controllers. As interactive experiences, games present a host of player challenges, ranging from deliberate decision-making and problem-solving strategies to the immediate charge of reflex action. Games thus draw upon a unique mix of player resources, contributing to what Lindley refers to as the "game-play gestalt": "a particular way of thinking about the game state from the perspective of a player, together with a pattern of repetitive perceptual, cognitive, and motor operations" (Lindley, 2003).

    Automating Lighting Design for Interactive Entertainment

    Recent advances in computer graphics, particularly in real-time rendering, have resulted in major improvements in 3D graphics and rendering techniques in interactive entertainment. In this article we focus on the scene-lighting process, which we define as configuring the number of lights in a scene, their properties (e.g., range and attenuation), positions, angles, and colors. Lighting design is well known among designers, directors, and visual artists for its vital role in influencing viewers' perception by evoking moods, directing their gaze to important areas (i.e., providing visual focus), and conveying visual tension. It is, however, difficult to set positions, angles, or colors for lights within interactive scenes to accommodate these goals, because an interactive scene's spatial and dramatic configuration, including mood, dramatic intensity, and the relative importance of different characters, changes unpredictably in real time. The game industry has developed several techniques that establish spectacular real-time lighting effects within 3D interactive environments. These techniques are often time- and labor-intensive. In addition, they are not easily used to dynamically mold the visual design to convey communicative, dramatic, and aesthetic functions as addressed in creative disciplines such as art, film, and theatre. In this article we present a new real-time lighting design model based on cinematic and theatrical lighting design theory. The proposed model automatically, and in real time, adjusts lighting in an interactive scene to accommodate the dramatic, aesthetic, and communicative functions described by traditional lighting design theories, while taking into account artistic constraints on style, visual continuity, and aesthetic function.
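    As a rough illustration of the scene-lighting parameters the article enumerates, and of what one bounded per-frame adjustment step might look like, here is a minimal Python sketch. The `Light` fields follow the article's own list; the drift bound and the warmth-for-intensity mapping are assumptions of this sketch, not the authors' model.

```python
from dataclasses import dataclass

@dataclass
class Light:
    """One scene light, using the parameters the article lists:
    position, angle, color, range, and attenuation."""
    position: tuple     # (x, y, z) world coordinates
    angle: tuple        # (pitch, yaw) aim direction in degrees
    color: tuple        # (r, g, b), each channel in [0, 1]
    range: float        # distance at which the light's contribution ends
    attenuation: float  # falloff exponent over that range

def retarget_key_light(light: Light, focus_pos: tuple,
                       dramatic_intensity: float, max_step: float = 0.05) -> Light:
    """One per-frame adjustment: drift the key light toward the current
    dramatic focus, bounded by max_step to preserve visual continuity
    (an artistic constraint the article emphasizes)."""
    new_pos = tuple(
        p + max(-max_step, min(max_step, f - p))
        for p, f in zip(light.position, focus_pos)
    )
    # Warm the color as dramatic intensity rises; this mapping is an assumption.
    r, g, b = light.color
    warmed = (min(1.0, r + 0.3 * dramatic_intensity), g,
              max(0.0, b - 0.3 * dramatic_intensity))
    return Light(new_pos, light.angle, warmed, light.range, light.attenuation)
```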

    DigitalBeing – Using the Environment as an Expressive Medium for Dance

    Dancers express their feelings and moods through gestures and body movements. We seek to extend this mode of expression by dynamically and automatically adjusting music and lighting in the dance environment to reflect the dancer's arousal state. Our intention is to offer a space that performance artists can use as a creative tool that extends the grammar of dance. To enable the dynamic manipulation of lighting and music, the performance space will be augmented with several sensors: physiological sensors worn by a dancer to measure her arousal state, as well as pressure sensors installed in a floor mat to track the dancers' locations and movements. Data from these sensors will be passed to a three-layered architecture. Layer 1 is composed of a sensor analysis system that analyzes and synthesizes physiological and pressure sensor signals. Layer 2 is composed of intelligent systems that adapt lighting and music to portray the dancer's arousal state: the intelligent on-stage lighting system dynamically adjusts on-stage lighting direction and color, the intelligent virtual lighting system dynamically adapts virtual lighting in the projected imagery, and the intelligent music system dynamically and unobtrusively adjusts the music. Layer 3 translates the high-level adjustments made by the intelligent systems in Layer 2 into appropriate lighting board, image rendering, and audio box commands. In this paper, we describe this architecture in detail, as well as the equipment and control systems used.
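    A minimal sketch of how the three layers could compose into a single pass is given below. The field names (`heart_rate`, `mat_hits`) and every arousal-to-output mapping are toy assumptions; the abstract does not specify the real analysis or adaptation logic.

```python
def analyze_sensors(raw):
    """Layer 1 (sketch): reduce physiological and floor-mat readings to
    an arousal estimate in [0, 1] plus a list of dancer positions."""
    arousal = min(1.0, raw["heart_rate"] / 180.0)  # toy normalization, an assumption
    return {"arousal": arousal, "positions": raw["mat_hits"]}

def adapt_lighting_and_music(state):
    """Layer 2 (sketch): map arousal to high-level lighting and music targets."""
    a = state["arousal"]
    return {
        "stage_color": (1.0, 1.0 - 0.5 * a, 1.0 - 0.8 * a),  # whiter -> redder with arousal
        "stage_intensity": 0.4 + 0.6 * a,
        "music_tempo_scale": 0.8 + 0.4 * a,
    }

def emit_commands(targets):
    """Layer 3 (sketch): translate high-level targets into device commands.
    A real system would address a lighting board, a renderer, and an audio box."""
    return [
        ("lightboard", "set_rgb", targets["stage_color"]),
        ("lightboard", "set_intensity", targets["stage_intensity"]),
        ("audiobox", "scale_tempo", targets["music_tempo_scale"]),
    ]

# One pass through the pipeline with fabricated sensor readings:
frame = {"heart_rate": 150, "mat_hits": [(2.0, 3.5)]}
for cmd in emit_commands(adapt_lighting_and_music(analyze_sensors(frame))):
    print(cmd)
```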

    Projecting Tension in Virtual Environments through Lighting.

    Interactive synthetic environments are currently used in a wide variety of applications, including video games, exposure therapy, education, and training. Their success in such domains relies on their immersive and engaging qualities. Filmmakers and theatre directors use many techniques to project tension in the hope of affecting audiences' affective states. These techniques include narrative, sound effects, camera movements, and lighting. This paper focuses on temporal variation of lighting color and its use in evoking tension within interactive virtual worlds. Many game titles adopt cinematic lighting effects to evoke certain moods, particularly saturated red lighting, flickering lights, and very dark lighting. Such effects may result in user frustration due to the lack of balance between the desire to project tension and the desire to use lighting for other goals, such as visibility and depth projection. In addition, many of the lighting effects used in game titles are very obvious and obtrusive. In this paper, the author identifies several lighting color patterns, both obtrusive and subtle, based on a qualitative study of several movies and on lighting design theories. In addition to identifying these patterns, the author presents a system that dynamically modulates the lighting within an interactive environment to project the desired tension while balancing other lighting goals, such as establishing visibility, projecting depth, and providing motivation for lighting direction. This work extends the author's previous work on the Expressive Lighting Engine [1-3]. Results of incorporating this system within a game are discussed.
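    As one concrete example of an obtrusive pattern of this kind, the Python sketch below shifts a base light color toward saturated red as tension rises while holding a luminance floor so the scene stays readable; the coefficients and the luma clamp are this sketch's assumptions, not the paper's film-derived patterns.

```python
def tension_lighting(base_color, tension, min_luminance=0.25):
    """Shift lighting color toward a saturated warm hue as tension rises,
    but clamp overall luminance so visibility is preserved; the clamp is
    how this sketch balances tension against the visibility goal."""
    r, g, b = base_color
    # Drain the cool channels and boost red in proportion to tension.
    r2 = r + (1.0 - r) * 0.6 * tension
    g2 = g * (1.0 - 0.7 * tension)
    b2 = b * (1.0 - 0.7 * tension)
    # Rec. 709 luma; if the result is too dark to see by, scale it back up.
    luma = 0.2126 * r2 + 0.7152 * g2 + 0.0722 * b2
    if luma < min_luminance:
        k = min_luminance / max(luma, 1e-6)
        r2, g2, b2 = min(r2 * k, 1.0), min(g2 * k, 1.0), min(b2 * k, 1.0)
    return (r2, g2, b2)

# A neutral gray key light under high tension drifts toward red:
print(tension_lighting((0.9, 0.9, 0.9), tension=0.8))
```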

    DigitalBeing: an Ambient Intelligent Dance Space.

    DigitalBeing is an ambient intelligent system that aims to use stage lighting and lighting in projected imagery within a dance performance to portray a dancer's arousal state. The dance space will be augmented with pressure sensors to track dancers' movements; dancers will also wear physiological sensors. Sensor data will be passed to a three-layered architecture. Layer 1 is composed of a system that analyzes sensor data. Layer 2 is composed of two intelligent lighting systems that use the analyzed sensor information to adapt on-stage and virtual lighting to reflect the dancer's arousal level. Layer 3 translates lighting changes into appropriate lighting board commands, as well as rendering commands to render the projected imagery.
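    For the Layer 3 step, a DMX-style lighting board ultimately receives 8-bit channel levels. The sketch below shows that conversion under an assumed (intensity, R, G, B) channel layout; the abstract does not specify the actual board protocol or patching.

```python
def to_dmx(channel_base, rgb, intensity):
    """Layer-3-style translation (sketch): turn a high-level lighting
    change into 8-bit channel values as a DMX-style board expects.
    The (intensity, r, g, b) channel layout is an assumption."""
    levels = [intensity] + list(rgb)
    return {channel_base + i: int(max(0.0, min(1.0, v)) * 255 + 0.5)
            for i, v in enumerate(levels)}

# A calm (low-arousal) cue on a fixture patched at channel 1:
print(to_dmx(1, rgb=(0.6, 0.7, 1.0), intensity=0.5))
# -> {1: 128, 2: 153, 3: 179, 4: 255}
```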

    Dynamic Intelligent Lighting for Directing Visual Attention in Interactive 3D Scenes

    Recent enhancements in real-time graphics have facilitated the design of high-fidelity game environments with complex 3D worlds inhabited by animated characters. In such settings it is hard, especially for the untrained eye, to attend to an object of interest. Neuroscience research, as well as film and theatre practice, has identified several visual properties, such as contrast, orientation, and color, that play a major role in channeling attention. In this paper, we discuss an adaptive lighting design system called ALVA (Adaptive Lighting for Visual Attention) that dynamically adjusts lighting color and brightness to enhance visual attention within game environments, using features identified by the neuroscience, psychophysics, and visual design literature. We also discuss preliminary results showing the utility of ALVA in directing the player's attention to important elements in a fast-paced 3D game, and thus enhancing the game experience, especially for non-gamers who are not visually trained to spot objects or characters in such complex 3D worlds.
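    ALVA's actual model draws on several visual features; as a simplified illustration of the contrast cue alone, the sketch below brightens the target's light (or dims its competitors) until an intensity-difference contrast goal is met. The plain-difference contrast measure and the rebalancing rule are assumptions of this sketch, not ALVA's algorithm.

```python
def direct_attention(lights, target_id, contrast_goal=0.3):
    """Adjust per-object light intensities (all in [0, 1]) so the target
    out-contrasts every other lit object by at least contrast_goal.
    Contrast here is a plain intensity difference, a simplifying assumption."""
    competing = max(v for k, v in lights.items() if k != target_id)
    needed = competing + contrast_goal
    out = dict(lights)
    if needed <= 1.0:
        out[target_id] = max(out[target_id], needed)  # brighten the target
    else:
        # The target cannot exceed full intensity, so dim the competition.
        out[target_id] = 1.0
        scale = (1.0 - contrast_goal) / competing
        for k in out:
            if k != target_id:
                out[k] *= scale
    return out

# The hero is under-lit relative to the rim light, so the rest is dimmed:
print(direct_attention({"hero": 0.5, "fill": 0.6, "rim": 0.9}, "hero"))
```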

    Music Maker – A Camera-based Music Making Tool for Physical Rehabilitation

    The therapeutic effects of playing music are being recognized increasingly in the field of rehabilitation medicine. People with physical disabilities, however, often do not have the motor dexterity needed to play an instrument. We developed a camera-based human-computer interface called "Music Maker" to provide such people with a means to make music by performing therapeutic exercises. Music Maker uses computer vision techniques to convert the movements of a patient's body part, for example, a finger, hand, or foot, into musical and visual feedback using the open software platform EyesWeb. It can be adjusted to a patient's particular therapeutic needs and provides quantitative tools for monitoring the recovery process and assessing therapeutic outcomes. We tested the potential of Music Maker as a rehabilitation tool with six subjects who responded to or created music in various movement exercises. In these proof-of-concept experiments, Music Maker has performed reliably and shown its promise as a therapeutic device.
    Funding: National Science Foundation (IIS-0308213, IIS-039009, IIS-0093367, P200A01031, EIA-0202067 to M.B.); National Institutes of Health (DC-03663 to E.S.); Boston University Dudley Allen Sargent Research Fund (to A.L.).
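    The paper implements its tracking in EyesWeb; as a rough stand-in, the Python sketch below uses OpenCV frame differencing to map the vertical position of a moving body part to a MIDI note number. The motion threshold and the note range are arbitrary choices for illustration, not the system's calibrated values.

```python
import cv2

def movement_to_pitch(prev_gray, gray, low_note=48, high_note=72):
    """Map the vertical position of detected movement to a MIDI note.
    Simple frame differencing stands in for the vision techniques the
    paper implements in EyesWeb."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] < 1:            # not enough motion this frame
        return None
    cy = m["m01"] / m["m00"]    # centroid row of the moving region
    # Higher on screen -> higher pitch.
    return low_note + round((1.0 - cy / mask.shape[0]) * (high_note - low_note))

cap = cv2.VideoCapture(0)       # assumes a webcam at index 0
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    note = movement_to_pitch(prev, gray)
    if note is not None:
        print("MIDI note:", note)  # a real system would send this to a synth
    prev = gray
```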