    Investigating tangible user interaction in mixed-reality robotic games

    Among the emerging trends in Human-Robot Interaction, some of the most frequently used interaction paradigms involve Tangible User Interfaces. This is especially true in the field of robotic gaming and, more specifically, in application domains where commercial off-the-shelf robots are combined with projected Mixed Reality (MR) technology. The popularity of such interfaces, also in other domains of Human-Machine Interaction, has led to an abundance of gestures that can be used to perform tangible actions with them. However, there is not yet sufficient evidence on how these different modalities affect the user experience, particularly when interacting with a robot in a "phygital play" environment. Starting from this consideration, this paper reports on ongoing efforts to investigate the impact of diverse gesture sets (which can be performed with the same physical prop) on the perceived interaction with the robotic system. It also presents preliminary insights that could be exploited to orient further research on the use of such interfaces for interaction in MR-based robotic gaming and related scenarios.