8 research outputs found

    Navigating a Maze with Balance Board and Wiimote

    No full text
    Input from the lower body in human-computer interfaces can be beneficial, enjoyable and even entertaining when users are expected to perform multiple tasks simultaneously. Users can navigate a virtual (game) world or even an (empirical) dataset while keeping their hands free to issue commands. We compared the Wii Balance Board to a hand-held Wiimote for navigating a maze and found that users completed the task more slowly with the Balance Board. However, the Balance Board was considered more intuitive, easier to learn and ‘much fun’.
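    As a minimal sketch only (not the authors' implementation; all names below are hypothetical), leaning on a balance board can be turned into a 2D steering signal by computing the centre of pressure from the four corner load sensors and applying a dead zone so that standing still does not move the avatar:

        def center_of_pressure(top_left, top_right, bottom_left, bottom_right):
            """Return (x, y) in [-1, 1] from the four corner load readings (kg)."""
            total = top_left + top_right + bottom_left + bottom_right
            if total <= 0:
                return 0.0, 0.0
            x = ((top_right + bottom_right) - (top_left + bottom_left)) / total
            y = ((top_left + top_right) - (bottom_left + bottom_right)) / total
            return x, y

        def steering(x, y, dead_zone=0.1):
            """Ignore small weight shifts; otherwise the lean direction becomes the move direction."""
            if abs(x) < dead_zone and abs(y) < dead_zone:
                return 0.0, 0.0
            return x, y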

    Come, see and experience affective interactive art

    No full text
    The progress in the field of affective computing enables the realization of affective consumer products, affective games, and affective art. This paper describes the affective interactive art system Mood Swings, which interprets and visualizes affect expressed by a person. Mood Swings is founded on the integration of a framework for affective movements and a color model. This enables Mood Swings to recognize affective movement characteristics as expressed by a person and to display a color that matches the expressed emotion. With that, a unique interactive system is introduced, which can be considered art, a game, or a combination of both.
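    The abstract does not spell out the mapping from movement to color; the sketch below is an assumption-laden illustration in which affect derived from movement is read as a valence/arousal pair and converted to a display color via an HSV color model (colorsys is Python's standard color module; the mapping itself is hypothetical, not the one used in Mood Swings):

        import colorsys

        def affect_to_rgb(valence, arousal):
            """Map valence and arousal, both in [-1, 1], to an RGB color.

            Hypothetical mapping: hue follows valence (red for negative,
            green for positive); saturation follows arousal (vivid when
            aroused, pale when calm).
            """
            hue = (valence + 1.0) / 2.0 * 0.33           # 0.0 = red, 0.33 = green
            saturation = 0.3 + 0.7 * (arousal + 1.0) / 2.0
            r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)
            return int(r * 255), int(g * 255), int(b * 255)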

    Supporting Engagement and Floor Control in Hybrid Meetings

    No full text
    Remote participants in hybrid meetings often have difficulty following what is going on in the (physical) meeting room they are connected with. This paper describes a videoconferencing system for participation in hybrid meetings. The system has been developed as a research vehicle to see how technology based on automatic real-time recognition of conversational behavior in meetings can be used to improve engagement and floor control by remote participants. The system uses modules for online speech recognition and real-time visual focus of attention, as well as a module that signals who is being addressed by the speaker. A built-in keyword spotter allows an automatic meeting assistant to call the remote participant’s attention when a topic of interest is raised, pointing at the transcription of the fragment to help him catch up.
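    As a rough illustration of the keyword-spotting step (a sketch under assumed interfaces, not the system's actual code), the assistant could scan each recognised transcript fragment for the remote participant's topics of interest and, on a match, raise an alert that carries the fragment so the participant can catch up:

        def spot_keywords(fragment, topics_of_interest, notify):
            """Alert the remote participant when a topic of interest is mentioned.

            fragment: one recognised utterance from the online speech recogniser.
            topics_of_interest: keywords the remote participant asked to follow.
            notify: hypothetical callback that shows the alert and the transcript fragment.
            """
            words = set(fragment.lower().split())
            hits = [topic for topic in topics_of_interest if topic.lower() in words]
            if hits:
                notify(topics=hits, fragment=fragment)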

    Engagement and Floor Control in Hybrid Meetings

    Get PDF
    The Human Media Interaction group of the University of Twente has developed a User Engagement and Floor Control Demonstrator, a system that uses modules for online speech recognition and real-time visual focus of attention, as well as a module that signals who is being addressed by the speaker. A built-in keyword spotter allows an automatic meeting assistant to call the remote participant’s attention when a topic of interest is raised, pointing at the transcription of the fragment to help him catch up.