
    Health literacy practices in social virtual worlds and the influence on health behaviour

    This study explored how health information accessed via a 3D social virtual world, and the representation of 'self' through an avatar, impacts physical-world health behaviour. In-depth interviews were conducted with a sample of 25 people across 10 countries who had accessed health information in a virtual world (VW): 12 females and 13 males. Interviews were audio-recorded via private in-world voice chat or private instant message, and thematic analysis was used to analyse the data. The social skills and practices evidenced demonstrate how the collective knowledge and skills of VW communities can improve individual and community health literacy through a distributed model. The findings support moving away from the idea of health literacy as a set of skills residing within an individual towards a sociocultural model of health literacy. Social VWs offer a place where people can access health information in multiple formats through an avatar, which can influence behaviour change in both the physical world and the VW. This can improve social skills and health literacy practices and represents a social model of health literacy.

    Designing a 3D Gestural Interface to Support User Interaction with Time-Oriented Data as Immersive 3D Radar Chart

    The design of intuitive three-dimensional user interfaces is vital for interaction in virtual reality, allowing users to effectively close the loop between themselves and the virtual environment. 3D gestural input enables useful hand interaction with virtual content, either by directly grasping visible objects or through invisible gestural commands associated with corresponding features in the immersive 3D space. The design of such interfaces remains complex and challenging. In this article, we present a design approach for a three-dimensional user interface using 3D gestural input, with the aim of facilitating user interaction in the context of Immersive Analytics. Based on a scenario of exploring time-oriented data in immersive virtual reality using 3D Radar Charts, we implemented a rich set of features closely aligned with relevant 3D interaction techniques, data analysis tasks, and aspects of hand-posture comfort. We conducted an empirical evaluation (n=12), featuring a series of representative tasks, to evaluate the developed user interface design prototype. The results, based on questionnaires, observations, and interviews, indicate good usability and an engaging user experience. We reflect on the implemented hand-based grasping and gestural command techniques, identifying aspects for improvement in hand detection and precision, and emphasizing the prototype's ability to infer user intent for better prevention of unintentional gestures. Comment: 30 pages, 6 figures, 2 tables
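    As a hypothetical illustration of the kind of gestural command inference discussed above (the article does not publish this code), a pinch/grasp command is commonly detected from the distance between tracked thumb and index fingertips; the 2 cm threshold here is an assumption, not the prototype's parameter:

```python
import math

# Hypothetical sketch: inferring a "pinch" grasp command from tracked
# fingertip positions in metres, as hand-tracking SDKs typically report.
# The 2 cm threshold is an illustrative assumption.
def is_pinch(thumb_tip: tuple, index_tip: tuple, threshold: float = 0.02) -> bool:
    """Return True when thumb and index fingertips are closer than `threshold`."""
    return math.dist(thumb_tip, index_tip) < threshold

# Example: fingertips 1 cm apart -> pinch detected
print(is_pinch((0.10, 0.00, 0.30), (0.11, 0.00, 0.30)))  # True
```

A real system would additionally debounce the detection over several frames to prevent the unintentional activations the authors mention.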

    Rewilding with AR and VR: Facilitating Care with Photography in Physically Immersive Apps

    In this dissertation I analyze two AR apps from Internet of Elephants, Safari Central and Wildeverse, and one VR app from National Geographic, National Geographic: Explore VR. These three apps use photography as the central tool for engagement, attempt to educate users, and prompt them to care about wildlife and wilderness. However, the ethical consideration of design has largely ignored representations of the environment, especially as it intersects with facilitating care for wilderness and wildlife experiencing habitat destruction and environmental degradation. This project begins developing a critical discussion of how wilderness and wildlife are selectively created through CGI by asking two research questions: What kinds of relationships are facilitated between users and representations of wildlife by designing around photography in AR and VR? And how can these designs be revised or leveraged for more beneficial environmental communication through care ethics? To answer these questions, I use methods from game studies and a methodological lens informed by care ethics, new materialism, and feminist materialism. My results show that these apps facilitate underdeveloped researcher/subject and patron/recipient roles. In answering my second research question, I craft three approaches for applying care ethics: 1) designing based on performances, 2) modeling behavior, and 3) engaging in a reflective photographic review process. This dissertation attempts to support the rewilding of media, which helps people reconnect (rewild) with (other) forms of wildlife and wilderness.

    Virtual reality for assembly methods prototyping: a review

    Assembly planning and evaluation is an important component of the product design process, in which details about how parts of a new product will be put together are formalized. A well-designed assembly process should take into account factors such as optimum assembly time and sequence, tooling and fixture requirements, ergonomics, operator safety, and accessibility, among others. Existing computer-based tools to support virtual assembly either concentrate solely on representing the geometry of parts and fixtures and evaluating clearances and tolerances, or use simulated human mannequins to approximate human interaction in the assembly process. Virtual reality technology has the potential to support the integration of natural human motions into the computer-aided assembly planning environment (Ritchie et al. in Proc I MECH E Part B J Eng 213(5):461–474, 1999). This would allow evaluation of an assembler's ability to manipulate and assemble parts, and result in reduced time and cost for product design. This paper provides a review of the research in virtual assembly and categorizes the different approaches. Finally, critical requirements and directions for future research are presented.

    Validity of a Fully-Immersive VR-Based Version of the Box and Blocks Test for Upper Limb Function Assessment in Parkinson's Disease

    In recent decades, gaming technology has been accepted as a feasible method for complementing traditional clinical practice, especially in neurorehabilitation; however, the viability of using 3D Virtual Reality (VR) for the assessment of upper limb motor function has not been fully explored. For that purpose, we developed a VR-based version of the Box and Blocks Test (BBT), a clinical test for the assessment of manual dexterity, as an automated alternative to the classical procedure. Our VR-based BBT (VR-BBT) integrates the traditional BBT mechanics into gameplay, using the Leap Motion Controller (LMC) to capture the user's hand motion and the Oculus Rift headset to provide a fully immersive experience. This paper focuses on evaluating the validity of our VR-BBT to reliably measure manual dexterity in a sample of patients with Parkinson's Disease (PD). For this study, a group of twenty individuals in a mild to moderate stage of PD were recruited. Participants were asked to perform the physical BBT (once) and our proposed VR-BBT (twice), separately. Correlation analysis of the collected data was carried out. Statistical analysis showed that the performance data collected by the VR-BBT significantly correlated with the conventional assessment of the BBT. The VR-BBT scores showed a significant association with PD severity measured by the Hoehn and Yahr scale. This suggests that the VR-BBT could be used as a reliable indicator of health improvements in patients with PD. Finally, the VR-BBT system showed high usability and acceptability, as rated by clinicians and patients. This work was supported in part by the Spanish Ministry of Economy and Competitiveness via the ROBOESPAS project (DPI2017-87562-C2-1-R), and in part by the RoboCity2030-DIH-CM, Madrid Robotics Digital Innovation Hub (S2018/NMT-4331), which is funded by the Programas de Actividades I+D Comunidad de Madrid and cofunded by the Structural Funds of the EU.

    Designing 3D scenarios and interaction tasks for immersive environments

    Immersive reality, such as virtual and mixed reality, is today one of the most attractive research fields. Virtual Reality (VR) has huge potential for use in scientific and educational domains by providing users with real-time interaction and manipulation. The key challenge in immersive technologies is to provide a high level of immersive sensation to the user. Wearable technologies play a key role in enhancing the immersive sensation and the degree of embodiment in virtual and mixed reality interaction tasks. This project report presents an application study where the user interacts with virtual objects, such as grabbing objects and opening or closing doors and drawers, while wearing a sensory cyberglove developed in our lab (Cyberglove-HT). Furthermore, it presents the development of a methodology that provides inertial measurement unit (IMU)-based gesture recognition. The interaction tasks and 3D immersive scenarios were designed in Unity 3D. Additionally, we developed inertial sensor-based gesture recognition employing a Long Short-Term Memory (LSTM) network. In order to distinguish the effect of wearable technologies on the user experience in immersive environments, we conducted an experimental study comparing the Cyberglove-HT to standard VR controllers (HTC Vive Controller). The quantitative and subjective results indicate that we were able to enhance the immersive sensation and self-embodiment with the Cyberglove-HT. A publication resulted from this work [1], which has been developed in the framework of the R&D project Human Tracking and Perception in Dynamic Immersive Rooms (HTPDI).
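    As a minimal sketch of the preprocessing that typically precedes an IMU-based LSTM gesture classifier like the one described, the raw sensor stream is segmented into fixed-length overlapping windows before being fed to the network. The window size and stride below are illustrative assumptions, not the project's parameters:

```python
from typing import List, Sequence, Tuple

# One IMU sample: accelerometer (ax, ay, az) and gyroscope (gx, gy, gz).
IMUSample = Tuple[float, float, float, float, float, float]

def sliding_windows(stream: Sequence[IMUSample],
                    window: int = 50, stride: int = 25) -> List[List[IMUSample]]:
    """Split an IMU stream into overlapping fixed-length windows
    (hypothetical sizes: 0.5 s windows with 50% overlap at 100 Hz)."""
    return [list(stream[i:i + window])
            for i in range(0, len(stream) - window + 1, stride)]

# Example: 200 dummy samples (gravity only) -> 7 windows of 50 samples each
stream = [(0.0, 0.0, 9.8, 0.0, 0.0, 0.0)] * 200
windows = sliding_windows(stream)
print(len(windows), len(windows[0]))  # 7 50
```

Each window would then be passed as one input sequence (50 timesteps × 6 features) to the LSTM, which outputs a gesture label per window.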