
    Through the Eyes of Women in Engineering: An immersive VR experience

    Young women frequently face implicit behavioral biases, conveyed through unfriendly gaze, gestures, or speech, in all aspects of their daily lives. This is particularly true in engineering, where women are a minority, making it emotionally difficult for them to fit in and feel at home, and deterring young women from entering these fields. Yet these systematic implicit biases are very hard to observe for anyone who is not their target and who is accustomed to society's current “norms” and “ways of acting”. At their root is the misalignment between the stereotypes associated with femininity and those associated with computer engineering, a misalignment whose keystone is how the gaze of others objectifies women's bodies, and whose consequence and means of perpetuation is discomfort. Our objective is to create an interactive piece that communicates the discomfort of women in engineering, tells their stories, and exposes this misalignment. To this end, we harness virtual reality to create an experience that embodies the user in the place of a woman character who travels through three common scenes in her life as an engineer. We design different gaze interactions, gestures, and speech styles that reflect common patterns of implicit behavioral bias, including objectification, isolation, and belittlement. The piece is an application that can be experienced in a headset or in a web browser.

    PTVR : a user-friendly open-source script programming package to create Virtual Reality experiments

    Using Virtual Reality (VR) to investigate visual processing is a growing trend, driven by the high scientific potential of VR (allowing experiments in highly controlled environments and ecological scenarios) and by the increasing power of ever cheaper VR headsets. However, implementing VR experiments requires non-trivial programming skills that take a long time to learn. Easing this implementation process is thus a great challenge, and should be guided by the success of existing script programming packages used to display stimuli on 2D monitors (e.g., the free, open-source package PsychoPy). A step in this direction was achieved by the “Perception Toolbox for Virtual Reality” (PTVR) package (first presented at ECVP 2018), with the ambition to follow the same Open Science philosophy as PsychoPy, but applied to VR. At ECVP 2022, we propose a consolidated and extended version of PTVR with many new features. We will describe, from start to finish, how any researcher familiar with Python programming can create and analyze a sophisticated VR experiment with parsimonious code. A 3D visual search experiment will illustrate the ease with which: (1) 3D stimuli are positioned using different coordinate systems, (2) the online positions of the head, gaze, or remote controllers are used to point at the target interactively, and (3) all the data are accurately recorded over time. We will also present the resources that allow researchers to quickly learn PTVR, notably the hands-on demos included with PTVR and a website (https://ptvr.inria.fr/) offering an extensive user manual.
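As an illustration of point (1), placing a stimulus by eccentricity and meridian angle rather than by Cartesian coordinates reduces to a small trigonometric conversion. The sketch below is a generic, self-contained illustration in plain Python; the function name and signature are hypothetical and are not PTVR's actual API.

```python
import math

def perimetric_to_cartesian(eccentricity_deg, half_meridian_deg, distance_m):
    """Convert perimetric coordinates (common in vision science) to
    Cartesian (x, y, z), with +z pointing away from the observer.
    Hypothetical helper for illustration, not part of PTVR."""
    e = math.radians(eccentricity_deg)
    m = math.radians(half_meridian_deg)
    x = distance_m * math.sin(e) * math.cos(m)
    y = distance_m * math.sin(e) * math.sin(m)
    z = distance_m * math.cos(e)
    return x, y, z

# A stimulus at 0 deg eccentricity lies straight ahead of the observer:
# perimetric_to_cartesian(0, 0, 2.0) -> (0.0, 0.0, 2.0)
```

Expressing positions this way lets the same script place targets at controlled retinal eccentricities regardless of viewing distance.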

    Through the Eyes of a Visually Impaired Patient: an awareness experience focused on social interactions in virtual reality

    Virtual reality (VR) has strong potential for raising awareness, thanks to its ability to immerse users in situations far removed from their everyday experience, such as living with various disabilities. In this work, we present a VR application designed to raise awareness of low vision, and more specifically of Age-related Macular Degeneration (AMD), the leading cause of visual impairment in industrialized countries, which results in a loss of central vision. Existing work describes only low-vision simulations that reproduce the perceptual effects of the disease, with little or no inclusion of the real situations patients face, in particular those involving social interactions. Our application is a VR experience in which participants embody an AMD patient who must interact with several virtual agents in realistic situations. These scenarios were designed and developed on the basis of interviews with AMD patients and analyses of the literature on social interactions involving AMD. We tested our application with normally sighted participants equipped with a gaze-contingent simulator of central vision loss. We collected and analyzed physiological, behavioral, and subjective data from each participant to evaluate how effectively our experience raises awareness. We believe this new approach can raise awareness among the general public and help vision professionals better understand the needs of their patients.
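The core idea of a gaze-contingent central vision loss simulator can be sketched very simply: a pixel is masked when it falls within a fixed radius of the current gaze position, so the occluded region follows the eyes. The circular mask and the function names below are illustrative assumptions; real simulators rely on calibrated eye tracking and perceptually shaped (e.g., blurred-edge) masks.

```python
def in_scotoma(px, py, gaze_x, gaze_y, radius):
    """Return True if pixel (px, py) lies inside a circular scotoma
    of the given radius centred on the current gaze position."""
    dx, dy = px - gaze_x, py - gaze_y
    return dx * dx + dy * dy <= radius * radius

def mask_row(row_pixels, y, gaze_x, gaze_y, radius, fill=0):
    """Replace the pixels of one image row that fall inside the scotoma."""
    return [fill if in_scotoma(x, y, gaze_x, gaze_y, radius) else p
            for x, p in enumerate(row_pixels)]
```

Because the mask is re-centred on every gaze sample, the participant can never look "around" it, which is what makes the simulated central loss compelling.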

    PTVR - a visual perception software in Python to make virtual reality experiments easier to build and more reproducible

    Researchers increasingly use Virtual Reality (VR) to perform behavioral experiments, especially in Vision Science. These experiments are usually programmed directly in game engines, which are extremely powerful, but the process is tricky and time-consuming because it requires solid knowledge of those engines. Consequently, the anticipated prohibitive effort discourages many researchers who want to engage in VR. This paper introduces the Perception Toolbox for Virtual Reality (PTVR) library, which allows visual perception studies in VR to be created using high-level Python scripting. A crucial consequence of using a script is that an experiment can be described by a single, easy-to-read piece of code, thus improving VR studies' transparency, reproducibility, and reusability. We built our library upon a seminal open-source library released in 2018 that we have considerably developed since then. This paper provides the first comprehensive overview of the PTVR software. We introduce the main objects and features of PTVR, along with some general concepts related to the 3D world. This new library should dramatically reduce the difficulty of programming VR experiments and elicit a whole new set of visual perception studies with high ecological validity.
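Interactive pointing with the head, gaze, or a controller, one of the behaviors such scripts must express, typically reduces to checking the angle between the pointing direction and the vector toward a target. The sketch below shows that geometric core in plain Python; the names are hypothetical and do not reflect PTVR's actual API.

```python
import math

def angle_to_target(origin, direction, target):
    """Angle in degrees between a pointing direction and the vector
    from origin to target. Hypothetical helper for illustration."""
    to_target = [t - o for t, o in zip(target, origin)]
    dot = sum(d * v for d, v in zip(direction, to_target))
    norm_d = math.sqrt(sum(d * d for d in direction))
    norm_t = math.sqrt(sum(v * v for v in to_target))
    cos_a = max(-1.0, min(1.0, dot / (norm_d * norm_t)))  # clamp rounding error
    return math.degrees(math.acos(cos_a))

def is_pointed_at(origin, direction, target, threshold_deg=2.0):
    """True when the pointing ray passes within threshold_deg of the target."""
    return angle_to_target(origin, direction, target) <= threshold_deg
```

An angular threshold (rather than an exact ray-object intersection) is a common design choice because it tolerates tracker noise and makes small, distant targets selectable.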
