    GEMINI: A Generic Multi-Modal Natural Interface Framework for Videogames

    In recent years videogame companies have recognized the role of player engagement as a major factor in user experience and enjoyment. This encouraged a greater investment in new types of game controllers such as the WiiMote, Rock Band instruments and the Kinect. However, the native software of these controllers was not originally designed to be used in other game applications. This work addresses this issue by building a middleware framework, which maps body poses or voice commands to actions in any game. This not only affords a more natural and customized user experience but also defines an interoperable virtual controller. In this version of the framework, body poses and voice commands are respectively recognized through the Kinect's built-in cameras and microphones. The acquired data is then translated into the native interaction scheme in real time using a lightweight method based on spatial restrictions. The system is also prepared to use Nintendo's Wiimote as an auxiliary and unobtrusive gamepad for physically or verbally impractical commands. System validation was performed by analyzing the performance of certain tasks and examining user reports. Both confirmed this approach as a practical and alluring alternative to the game's native interaction scheme. In sum, this framework provides a game-controlling tool that is totally customizable and very flexible, thus expanding the market of game consumers. Comment: WorldCIST'13 International Conference
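    The abstract describes a middleware layer that maps recognized body poses or voice commands onto a game's native inputs, with poses detected via spatial restrictions on tracked joints. A minimal sketch of that idea follows; all names, bindings, and the sphere-based restriction test are illustrative assumptions, not the paper's actual API.

```python
import math

# User-customizable virtual controller: recognized pose/voice label -> native game command.
# The labels and key names here are hypothetical placeholders.
BINDINGS = {
    "jump_pose": "KEY_SPACE",
    "lean_left": "KEY_A",
    "say_pause": "KEY_ESC",
}

def within_restriction(joint, center, radius):
    """Spatial restriction: is a tracked joint inside a sphere around a reference point?"""
    dx, dy, dz = (j - c for j, c in zip(joint, center))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

def translate(event):
    """Map a recognized pose/voice event to the game's native command, if bound."""
    return BINDINGS.get(event)
```

    Because the mapping table is plain data, rebinding a pose to a different native command (or to a different game) only means editing the dictionary, which is one way to read the paper's claim of a "totally customizable" virtual controller.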

    EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

    While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: In "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study that shows that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds -- a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.
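    The core mechanism described above is keeping a rail-mounted tracker aligned with the user's lateral position as they move. A simple proportional-control sketch of such alignment is shown below; the gain, rail limits, and function names are illustrative assumptions and not EyeScout's actual implementation.

```python
RAIL_MIN, RAIL_MAX = 0.0, 4.0   # rail travel in metres (assumed)
GAIN = 0.5                      # fraction of the tracking error corrected per update

def next_rail_position(rail_x, user_x):
    """Move the tracker carriage a fraction of the way toward the user's
    lateral position, clamped to the physical ends of the rail."""
    target = rail_x + GAIN * (user_x - rail_x)
    return max(RAIL_MIN, min(RAIL_MAX, target))
```

    Run at each sensing update, a controller like this converges on a stationary user ("Walk then Interact") and continuously chases a moving one ("Walk and Interact"), which is the behaviour the two interaction modes require.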

    Exploring the Affective Loop

    Research in psychology and neurology shows that both body and mind are involved when experiencing emotions (Damasio 1994, Davidson et al. 2003). People are also very physical when they try to communicate their emotions. Somewhere in between being consciously and unconsciously aware of it ourselves, we produce both verbal and physical signs to make other people understand how we feel. Simultaneously, this production of signs involves us in a stronger personal experience of the emotions we express. Emotions are also communicated in the digital world, but there is little focus on users' personal as well as physical experience of emotions in the available digital media. In order to explore whether and how we can expand existing media, we have designed, implemented and evaluated /eMoto/, a mobile service for sending affective messages to others. With eMoto, we explicitly aim to address both cognitive and physical experiences of human emotions. Through combining affective gestures for input with affective expressions that make use of colors, shapes and animations for the background of messages, the interaction "pulls" the user into an /affective loop/. In this thesis we define what we mean by affective loop and present a user-centered design approach expressed through four design principles inspired by previous work within Human Computer Interaction (HCI) but adjusted to our purposes: /embodiment/ (Dourish 2001) as a means to address how people communicate emotions in real life, /flow/ (Csikszentmihalyi 1990) to reach a state of involvement that goes further than the current context, /ambiguity/ of the designed expressions (Gaver et al. 2003) to allow for open-ended interpretation by the end-users instead of simplistic, one-emotion one-expression pairs, and /natural but designed expressions/ to address people's natural couplings between cognitively and physically experienced emotions.
    We also present results from an end-user study of eMoto which indicate that subjects got both physically and emotionally involved in the interaction and that the designed "openness" and ambiguity of the expressions was appreciated and understood by our subjects. Through the user study, we identified four potential design problems that have to be tackled in order to achieve an affective loop effect: the extent to which users /feel in control/ of the interaction, /harmony and coherence/ between cognitive and physical expressions, /timing/ of expressions and feedback in a communicational setting, and effects of users' /personality/ on their emotional expressions and experiences of the interaction.

    Media Presence and Inner Presence: The Sense of Presence in Virtual Reality Technologies

    Abstract. Presence is widely accepted as the key concept to be considered in any research involving human interaction with Virtual Reality (VR). Since its original description, the concept of presence has developed over the past decade to be considered by many researchers as the essence of any experience in a virtual environment. The VR generating systems comprise two main parts: a technological component and a psychological experience. The different relevance given to them produced two different but coexisting visions of presence: the rationalist and the psychological/ecological points of view. The rationalist point of view considers a VR system as a collection of specific machines that necessitates the inclusion of the concept of presence. The researchers agreeing with this approach describe the sense of presence as a function of the experience of a given medium (Media Presence). The main result of this approach is the definition of presence as the perceptual illusion of non-mediation produced by means of the disappearance of the medium from the conscious attention of the subject. At the other extreme, there is the psychological or ecological perspective (Inner Presence). Specifically, this perspective considers presence as a neuropsychological phenomenon, evolved from the interplay of our biological and cultural inheritance, whose goal is the control of the human activity. Given its key role and the rate at which new approaches to understanding and examining presence are appearing, this chapter draws together current research on presence to provide an up-to-date overview of the most widely accepted approaches to its understanding and measurement.

    Interacting Unities: An Agent-Based System

    Recently architects have been inspired by Thompson's Cartesian deformations and Waddington's flexible topological surface to work within a dynamic field characterized by forces. In this more active space of interactions, movement is the medium through which form evolves. This paper explores the interaction between pedestrians and their environment by regarding it as a process occurring between the two. It is hypothesized that the recurrent interaction between pedestrians and environment can lead to a structural coupling between those elements. Every time a change occurs in each one of them, as an expression of its own structural dynamics, it triggers changes to the other one. An agent-based system has been developed in order to explore that interaction, where the two interacting elements, agents (pedestrians) and environment, are autonomous units with a set of internal rules. The result is a landscape where each agent locally modifies its environment, which in turn affects its movement, while the other agents respond to the new environment at a later time, indicating that the phenomenon of stigmergy can take place in interactions with a human analogy. It is found that it is the environment's internal rules that determine the nature and extent of change.
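    The coupling described above, where agents locally modify an environment whose state in turn biases their later movement, is the classic stigmergy loop. A toy sketch of one interaction cycle follows; the one-dimensional grid, deposit, decay, and movement rules are illustrative assumptions, not the paper's actual model.

```python
SIZE = 10
env = [0.0] * SIZE            # 1-D environment: trace intensity per cell

def step(pos, deposit=1.0, decay=0.95):
    """One interaction cycle: the agent deposits a trace where it stands,
    the whole environment decays, and the agent then moves toward the
    neighbouring cell with the stronger trace (ties go right)."""
    env[pos] += deposit
    for i in range(SIZE):
        env[i] *= decay       # environment's own internal rule (decay rate)
    left, right = (pos - 1) % SIZE, (pos + 1) % SIZE
    return left if env[left] > env[right] else right
```

    Note that changing only the environment's internal rule (the decay rate) changes whether traces persist long enough for other agents to respond to them, which mirrors the paper's finding that the environment's rules determine the nature and extent of change.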

    The interaction between a sexually transferred steroid hormone and a female protein regulates oogenesis in the malaria mosquito Anopheles gambiae

    Molecular interactions between male and female factors during mating profoundly affect the reproductive behavior and physiology of female insects. In natural populations of the malaria mosquito Anopheles gambiae, blood-fed females direct nutritional resources towards oogenesis only when inseminated. Here we show that the mating-dependent pathway of egg development in these mosquitoes is regulated by the interaction between the steroid hormone 20-hydroxy-ecdysone (20E) transferred by males during copulation and a female Mating-Induced Stimulator of Oogenesis (MISO) protein. RNAi silencing of MISO abolishes the increase in oogenesis caused by mating in blood-fed females, causes a delay in oocyte development, and impairs the function of male-transferred 20E. Co-immunoprecipitation experiments show that MISO and 20E interact in the female reproductive tract. Moreover, MISO expression after mating is induced by 20E via the Ecdysone Receptor, demonstrating a close cooperation between the two factors. Male-transferred 20E therefore acts as a mating signal that females translate into an increased investment in egg development via a MISO-dependent pathway. The identification of this male–female reproductive interaction offers novel opportunities for the control of mosquito populations that transmit malaria.