5 research outputs found
Customizing by Doing for Responsive Video Game Characters
This paper presents a game in which players can customize the behavior of their characters using their own movements while playing the game. Players' movements are recorded with a motion capture system. The player then labels the movements and uses them as input to a machine learning algorithm that generates a responsive behavior model. This interface supports a more embodied approach to character design that we call "Customizing by Doing". We present a user study showing that using their own movements made users feel more engaged with the game and the design process, due in large part to a feeling of personal ownership of the movements.
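The record–label–train pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the nearest-centroid model, the two-dimensional pose features, and all names here are illustrative assumptions.

```python
# Toy sketch of "Customizing by Doing": label recorded movement clips,
# train a behaviour model, then map live poses to the learned behaviours.
import math

class NearestCentroidModel:
    """Maps a pose feature vector to the label whose training
    examples it sits closest to, on average (nearest centroid)."""
    def __init__(self):
        self.centroids = {}  # label -> centroid feature vector

    def train(self, labelled_clips):
        # labelled_clips: list of (label, feature_vector) pairs, e.g.
        # averaged joint positions extracted from a mocap take
        sums, counts = {}, {}
        for label, features in labelled_clips:
            acc = sums.setdefault(label, [0.0] * len(features))
            for i, v in enumerate(features):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {
            label: [v / counts[label] for v in acc]
            for label, acc in sums.items()
        }

    def respond(self, features):
        # Trigger the behaviour whose centroid is nearest the live pose
        return min(self.centroids,
                   key=lambda lb: math.dist(features, self.centroids[lb]))

# A player records and labels two movements (hypothetical 2-D features)
model = NearestCentroidModel()
model.train([
    ("wave",   [0.9, 0.1]),
    ("wave",   [0.8, 0.2]),
    ("crouch", [0.1, 0.9]),
])
print(model.respond([0.85, 0.15]))  # -> wave
```

A real system would use full skeleton data and a richer classifier, but the loop is the same: perform, label, train, respond.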
Programming by Moving: Interactive Machine Learning for Embodied Interaction Design
Interactive Machine Learning is a promising approach for designing movement interaction because it allows creators to implement even complex movement designs simply by performing them with their bodies. We introduce a new tool, InteractML, and an accompanying ideation method, being developed to make movement interaction design faster, more adaptable, and accessible to creators of varying experience and backgrounds. By closing the gap between the ideation and implementation stages of designing movement interaction, we hope to apply embodied sketching all the way through the creation process, supporting the design of a more expressive and reflective range of movement interactions.
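One way to picture "programming by moving" is as learning a mapping from performed example poses to target outputs, which the system then interpolates for new poses. The inverse-distance weighting below is an illustrative choice for this sketch, not necessarily what InteractML uses, and all names are assumptions.

```python
# Sketch: a creator demonstrates poses and assigns each a target output
# (say, an animation or sound parameter); new poses are mapped by
# inverse-distance-weighted interpolation over the demonstrations.

def train_mapping(examples):
    """examples: list of (pose_vector, output_value) demonstrations."""
    def mapping(pose):
        weights, total = [], 0.0
        for demo_pose, value in examples:
            d2 = sum((a - b) ** 2 for a, b in zip(pose, demo_pose))
            if d2 == 0:
                return value          # exact demonstration: return it
            w = 1.0 / d2              # nearer demonstrations weigh more
            weights.append((w, value))
            total += w
        return sum(w * v for w, v in weights) / total
    return mapping

# Two demonstrations: arms low -> quiet (0.0), arms high -> loud (1.0)
control = train_mapping([([0.0, 0.0], 0.0), ([1.0, 1.0], 1.0)])
print(control([0.5, 0.5]))  # halfway pose -> 0.5
```

The point of the tool is that the creator never writes this code: they perform the demonstrations and the mapping is fitted for them.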
Player–video game interaction: A systematic review of current concepts
Video game design requires a user-centered approach to ensure that the experience enjoyed by players is as good as possible. However, the nature of player–video game interactions has not yet been clearly defined in the scientific literature. The purpose of the present study was to provide a systematic review of empirical evidence on current concepts of player–video game interactions in entertainment situations. A total of 72 articles published in scientific journals dealing with human–computer interaction met the criteria for inclusion in the present review. The major findings of these articles are presented in a narrative synthesis. Results showed that player–video game interactions can be defined through multiple concepts that are closely linked and intertwined. These concepts concern player aspects of player–video game interactions, namely engagement and enjoyment, and video game aspects, namely information input/output techniques, game content, and multiplayer games. Global approaches, such as playability, also exist to qualify player–video game interactions. Limitations of these findings are discussed to help researchers plan future advances in the field and to encourage further effort to better understand the role of less-studied aspects. Practical implications are also discussed to help game designers optimize the design of player–video game interactions.
Understanding the role of Interactive Machine Learning in Movement Interaction Design
Interaction based on human movement has the potential to become an important new paradigm of human–computer interaction, but for it to become mainstream there need to be effective tools and techniques to support designers. A promising approach to movement interaction design is Interactive Machine Learning, in which designing is done by physically performing. This paper brings together many different perspectives on understanding human movement knowledge and movement interaction. This understanding shows that the embodied knowledge involved in movement interaction is very different from the representational knowledge involved in a traditional interface, so a very different approach to design is needed. We apply this knowledge to understanding why interactive machine learning is an effective tool for movement interaction designers and to make a number of suggestions for future development of the technique.
Recommended from our members
A Review of User Interface Design for Interactive Machine Learning
Interactive Machine Learning (IML) seeks to complement human perception and intelligence by tightly integrating these strengths with the computational power and speed of computers. The interactive process is designed to involve input from the user but does not require the background knowledge or experience that might be necessary to work with more traditional machine learning techniques. Under the IML process, non-experts can apply their domain knowledge and insight over otherwise unwieldy datasets to find patterns of interest or develop complex data-driven applications. This process is co-adaptive in nature and relies on careful management of the interaction between human and machine. User interface design is fundamental to the success of this approach, yet there is a lack of consolidated principles on how such an interface should be implemented. This article presents a detailed review and characterisation of Interactive Machine Learning from an interactive systems perspective. We propose and describe a structural and behavioural model of a generalised IML system and identify solution principles for building effective interfaces for IML. Where possible, these emergent solution principles are contextualised by reference to the broader human–computer interaction literature. Finally, we identify strands of user interface research key to unlocking more efficient and productive non-expert interactive machine learning applications.
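The co-adaptive loop this review characterises — the user supplies a few labelled examples, the system updates immediately, and the user corrects its mistakes — can be sketched as follows. The 1-nearest-neighbour model and the session API are assumptions made for this illustration, not the article's generalised model.

```python
# Hedged sketch of a co-adaptive IML loop: label, predict, correct, repeat.

class IMLSession:
    def __init__(self):
        self.examples = []  # list of (feature_vector, label) pairs

    def add_example(self, features, label):
        """User action: label one data point. With 1-NN there is no
        separate fitting step, which keeps the loop interactive."""
        self.examples.append((features, label))

    def predict(self, features):
        """System action: classify by the nearest labelled example."""
        def sq_dist(xs):
            return sum((a - b) ** 2 for a, b in zip(features, xs))
        return min(self.examples, key=lambda ex: sq_dist(ex[0]))[1]

session = IMLSession()
session.add_example([1.0, 0.0], "interesting")
session.add_example([0.0, 1.0], "noise")

guess = session.predict([0.9, 0.1])
if guess != "interesting":
    # User corrects a wrong prediction, closing the co-adaptive loop:
    # the correction becomes a new training example on the spot.
    session.add_example([0.9, 0.1], "interesting")
print(guess)  # -> interesting
```

The interface-design questions the article raises live around this loop: how to elicit good examples, how to surface the model's current behaviour, and how to make corrections cheap.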