
    Turning the (virtual) world around: Patterns in saccade direction vary with picture orientation and shape in virtual reality

    Research investigating gaze in natural scenes has identified a number of spatial biases in where people look, but it is unclear whether these are partly due to constrained testing environments (e.g., a participant with their head restrained and looking at a landscape image framed within a computer monitor). We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influence eye and head movements in virtual reality (VR). Both the eyes and head were tracked while observers looked at natural scenes in a virtual environment. In line with previous work, we found a bias for saccade directions parallel to the image horizon, regardless of image shape or content. We found that, when allowed to do so, observers move both their eyes and head to explore images. Head rotation, however, was idiosyncratic; some observers rotated a lot, whereas others did not. Interestingly, the head rotated in line with the rotation of landscape but not fractal images. That head rotation and gaze direction respond differently to image content suggests that they may be under different control systems. We discuss our findings in relation to current theories on head and eye movement control and how insights from VR might inform more traditional eye-tracking studies.
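    The horizon-parallel saccade bias described above can be illustrated with a minimal sketch. Everything here is hypothetical (the data, the function name, and the 45-degree binning are assumptions, not the authors' analysis pipeline): each saccade's direction is computed from consecutive gaze positions, expressed relative to a possibly rotated image horizon, and binned into a direction histogram.

    ```python
    import math
    from collections import Counter

    def saccade_direction_histogram(fixations, horizon_angle_deg=0.0, bin_width=45):
        """Bin saccade directions, relative to the image horizon, into sectors.

        fixations: list of (x, y) gaze positions in image coordinates.
        horizon_angle_deg: rotation of the image horizon (e.g. a rotated image).
        Returns a Counter mapping bin centre (degrees) -> saccade count.
        """
        counts = Counter()
        for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
            angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
            # Express the direction relative to the (possibly rotated) horizon.
            rel = (angle - horizon_angle_deg) % 360
            bin_centre = (int(rel // bin_width) * bin_width + bin_width // 2) % 360
            counts[bin_centre] += 1
        return counts

    # A horizontal scan path produces saccades parallel to a 0-degree horizon:
    # two saccades to the right (near 0 deg) and one back to the left (180 deg).
    path = [(0, 0), (10, 0), (20, 0), (10, 0)]
    hist = saccade_direction_histogram(path)
    ```

    A horizon-parallel bias would show up as mass concentrated in the bins around 0 and 180 degrees, which rotates with `horizon_angle_deg` when the image is rotated.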

    Recognizing Affiliation: Using Behavioural Traces to Predict the Quality of Social Interactions in Online Games

    Online social interactions in multiplayer games can be supportive and positive or toxic and harmful; however, few methods can easily assess interpersonal interaction quality in games. We use behavioural traces to predict affiliation between dyadic strangers, facilitated through their social interactions in an online gaming setting. We collected audio, video, in-game, and self-report data from 23 dyads, extracted 75 features, trained Random Forest and Support Vector Machine models, and evaluated their performance in predicting binary (high/low) as well as continuous affiliation toward a partner. The models can predict both binary and continuous affiliation with up to 79.1% accuracy (F1) and 20.1% explained variance (R2) on unseen data, with features based on verbal communication demonstrating the highest potential. Our findings can inform the design of multiplayer games and game communities, and guide the development of systems for matchmaking and mitigating toxic behaviour in online games.
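    The modelling pipeline in this abstract (behavioural features in, binary affiliation label out, Random Forest classifier, F1 on held-out data) can be sketched with scikit-learn. The data here is synthetic and the feature construction is an assumption; only the overall shape (75 features, binary labels, F1 evaluation on unseen data) comes from the abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the study's data: 75 behavioural features per
    # sample, with a binary high/low affiliation label driven by two features.
    X = rng.normal(size=(200, 75))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    # Hold out unseen data for evaluation, as the abstract describes.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    score = f1_score(y_test, clf.predict(X_test))
    ```

    For the continuous-affiliation case, the same structure applies with a regressor (e.g. `RandomForestRegressor`) and explained variance (R2) as the metric.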

    Hybrid Prototype-in-the-Loop


    Clusters, Trends, and Outliers : How Immersive Technologies Can Facilitate the Collaborative Analysis of Multidimensional Data

    Immersive technologies such as augmented reality devices are opening up a new design space for the visual analysis of data. This paper studies the potential of an augmented reality environment for the purpose of collaborative analysis of multidimensional, abstract data. We present ART, a collaborative analysis tool to visualize multidimensional data in augmented reality using an interactive, 3D parallel coordinates visualization. The visualization is anchored to a touch-sensitive tabletop, benefiting from well-established interaction techniques. The results of group-based, expert walkthroughs show that ART can facilitate immersion in the data, a fluid analysis process, and collaboration. Based on the results, we provide a set of guidelines and discuss future research areas to foster the development of immersive technologies as tools for the collaborative analysis of multidimensional data.
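    The core mapping behind any parallel coordinates visualization, including the 3D variant this abstract describes, is from one multidimensional record to a polyline across normalised axes. The sketch below is a hypothetical illustration of that mapping, not ART's implementation; the function name and data format are assumptions.

    ```python
    def parallel_coordinates_polyline(record, ranges):
        """Map one multidimensional record to normalised axis heights in [0, 1].

        record: mapping of dimension name -> value.
        ranges: mapping of dimension name -> (min, max) over the data set.
        Returns a list of (axis_index, height) points. In an AR setting each
        axis could be a vertical line rising from the tabletop, with the
        polyline rendered in 3D between adjacent axes.
        """
        points = []
        for i, (dim, (lo, hi)) in enumerate(ranges.items()):
            value = record[dim]
            # Normalise to [0, 1]; degenerate ranges collapse to the midpoint.
            height = (value - lo) / (hi - lo) if hi > lo else 0.5
            points.append((i, height))
        return points

    ranges = {"speed": (0, 100), "mass": (0, 10), "cost": (0, 50)}
    polyline = parallel_coordinates_polyline(
        {"speed": 50, "mass": 2.5, "cost": 50}, ranges)
    ```

    Each record becomes one polyline; overplotting many such polylines is what makes clusters, trends, and outliers visible across dimensions.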

    User guided movement analysis in games using semantic trajectories

    Understanding how players navigate through virtual worlds can offer useful guidance for map and level design of video games. One way to handle large-scale movement data obtained within games is by modelling movement as a sequence of visited locations instead of focusing on raw trajectory data. In this paper, we introduce a visualization approach for movement analysis based on semantic trajectories derived from a user-guided segmentation of the game environment. Based on this concept, the visualization offers an aggregated view of movement patterns together with the possibility of viewing individual paths for detailed inspection. We report on a user study with six experts from the game industry and compare the insights they have gleaned from the visualization with feedback from players. Our results indicate that the approach is successful in localizing problematic areas and that semantic trajectories can be a valuable addition to existing approaches for player movement analysis.
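    The central transformation here, from raw trajectory samples to a semantic trajectory over a user-guided segmentation, can be sketched as follows. The region format (axis-aligned rectangles) and function name are assumptions for illustration; the paper's segmentation need not look like this.

    ```python
    def semantic_trajectory(positions, regions):
        """Turn raw (x, y) positions into a sequence of visited region names.

        positions: list of (x, y) points from game telemetry.
        regions: mapping of region name -> (xmin, ymin, xmax, ymax), a
                 hypothetical encoding of a user-guided map segmentation.
        Consecutive samples inside the same region collapse to one visit, so
        the result is a sequence of visited locations, not raw samples.
        """
        visits = []
        for x, y in positions:
            name = next((r for r, (x0, y0, x1, y1) in regions.items()
                         if x0 <= x <= x1 and y0 <= y <= y1), None)
            if name is not None and (not visits or visits[-1] != name):
                visits.append(name)
        return visits

    regions = {"spawn": (0, 0, 10, 10), "bridge": (10, 0, 20, 10)}
    path = [(2, 2), (5, 5), (12, 3), (15, 4), (4, 4)]
    visits = semantic_trajectory(path, regions)
    ```

    Aggregating such sequences across many players (e.g. counting region-to-region transitions) yields the kind of aggregated movement-pattern view the abstract describes, while individual sequences remain available for detailed inspection.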