
    Piloting Multimodal Learning Analytics using Mobile Mixed Reality in Health Education

    © 2019 IEEE. Mobile mixed reality has been shown to increase achievement and lower cognitive load within spatial disciplines. However, traditional methods of assessment restrict examiners' ability to holistically assess spatial understanding. Multimodal learning analytics seeks to investigate how combinations of data types, such as spatial data and traditional assessment, can be combined to better understand both the learner and the learning environment. This paper explores the pedagogical possibilities of a smartphone-enabled mixed reality multimodal learning analytics case study for health education, focused on learning the anatomy of the heart. The context for this study is the first loop of a design-based research study exploring the acquisition and retention of knowledge by piloting the proposed system with practicing health experts. Outcomes from the pilot study showed engagement and enthusiasm for the method among the experts, but also demonstrated problems to overcome in the pedagogical method before deployment with learners.

    Tangible user interfaces: past, present and future directions

    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from the cognitive sciences, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.

    Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain

    Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to the same sized physical space) and 3x (10.5m*10.5m virtual mapped to 3.5m*3.5m physical). Results show that reaching accuracy is maintained for gains of up to 2x; going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
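    To make the translational-gain idea concrete, the following minimal Python sketch shows how a per-frame physical displacement could be scaled by a gain factor before being applied to the virtual viewpoint. The function name, tuple layout, and default gain are illustrative assumptions, not the paper's implementation.

        # Illustrative sketch (not from the paper): translational gain applied per frame.
        def apply_translational_gain(prev_physical, curr_physical, virtual_pos, gain=1.75):
            """Scale the frame-to-frame physical displacement by `gain` and
            accumulate it into the virtual position (horizontal axes only)."""
            dx = curr_physical[0] - prev_physical[0]   # left/right movement in metres
            dz = curr_physical[2] - prev_physical[2]   # forward/back movement in metres
            return (virtual_pos[0] + gain * dx,
                    virtual_pos[1],                    # height is left unscaled
                    virtual_pos[2] + gain * dz)

        # With gain = 3.0, walking across a 3.5 m room traverses 10.5 m of virtual
        # space, matching the 3x condition described in the abstract above.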

    Exploring a Cognitive Basis for Learning Spatial Relationships with Augmented Reality

    Augmented reality (AR) is an emergent class of interface that presents compelling possibilities for advancing spatial visualization. We offer a brief overview of AR technology and current research within the educational realm. AR interfaces appear to provide a unique combination of visual display properties, modes of user manipulation, and interaction with spatial information. Drawing upon aspects of proprioception and sensorimotor function, we discuss how AR may have a unique and powerful link to spatial knowledge acquisition through visuo-motor involvement in the processing of information. We identify key properties of AR interfaces and how they differ from conventional visualization interfaces, followed by a discussion of theoretical perspectives that make a case for learning spatial relationships using first-person manipulative AR. Recent research provides evidence that this form of AR holds cognitive advantages for learning when compared with traditional desktop 2D interfaces. We review the visual-physical connections to learning using first-person manipulative AR within educational contexts. We then provide some suggestions for building future research in this area and explore its significance in the realm of spatial knowledge acquisition.

    Planar Refrains

    My practice explores phenomenal poetic truths that exist in fissures between the sensual and physical qualities of material constructs. Magnifying this confounding interspace, my work activates specific instruments within mutable, relational systems of installation, movement, and documentation. The tools I fabricate function within variable orientations and are implemented as both physical barriers and thresholds into alternate, virtual domains. Intersecting fragments of sound and moving image build a nexus of superimposed spatialities, while material constructions are enveloped in ephemeral intensities. Within this compounded environment, both mind and body are charged as active sites through which durational, contemplative experiences can pass. Reverberation, the ghostly refrain of a sound calling back to our ears from a distant plane, can intensify our emotional experience of place. My project Planar Refrains utilizes four electro-mechanical reverb plates, analog audio filters designed to simulate expansive acoustic arenas. Historically these devices have provided emotive voicings to popular studio recordings, dislocating the performer from the commercial studio and into a simulated reverberant territory of mythic proportions. The material resonance of steel is used to filter a recorded signal, shaping the sound of a human performance into something more transformative, a sound embodying otherworldly dynamics. In subverting the designed utility of reverb plates, I am exploring their value as active surfaces extending across different spatial realities. The background of ephemeral sonic residue is collapsed into the foreground, a filter becomes sculpture, and this sculpture becomes an instrument in an evolving soundscape.

    An augmented reality interface for visualising and interacting with virtual content

    In this paper, a novel AR interface is proposed that provides generic solutions to the tasks involved in simultaneously augmenting different types of virtual information and processing tracking data for natural interaction. Participants within the system can experience a real-time mixture of 3D objects, static video, images, textual information and 3D sound with the real environment. The user-friendly AR interface can achieve maximum interaction using simple but effective forms of collaboration based on combinations of human-computer interaction techniques. To prove the feasibility of the interface, indoor AR techniques are employed to construct innovative applications and demonstrate examples from heritage to learning systems. Finally, an initial evaluation of the AR interface, including some initial results, is presented.

    In-home and remote use of robotic body surrogates by people with profound motor deficits

    By controlling robots comparable to the human body, people with profound motor deficits could potentially perform a variety of physical tasks for themselves, improving their quality of life. The extent to which this is achievable has been unclear due to the lack of suitable interfaces by which to control robotic body surrogates and a dearth of studies involving substantial numbers of people with profound motor deficits. We developed a novel, web-based augmented reality interface that enables people with profound motor deficits to remotely control a PR2 mobile manipulator from Willow Garage, which is a human-scale, wheeled robot with two arms. We then conducted two studies to investigate the use of robotic body surrogates. In the first study, 15 novice users with profound motor deficits from across the United States controlled a PR2 in Atlanta, GA to perform a modified Action Research Arm Test (ARAT) and a simulated self-care task. Participants achieved clinically meaningful improvements on the ARAT and 12 of 15 participants (80%) successfully completed the simulated self-care task. Participants agreed that the robotic system was easy to use, was useful, and would provide a meaningful improvement in their lives. In the second study, one expert user with profound motor deficits had free use of a PR2 in his home for seven days. He performed a variety of self-care and household tasks, and also used the robot in novel ways. Taking both studies together, our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates, and that they can gain benefit with only low-level robot autonomy and without invasive interfaces. However, methods to reduce the rate of errors and increase operational speed merit further investigation.