
    The State of the Art of Spatial Interfaces for 3D Visualization

    We survey the state of the art of spatial interfaces for 3D visualization. Interaction techniques are crucial to data visualization processes, and the visualization research community has been calling for more research on interaction for years. Yet research papers focusing on interaction techniques, in particular for 3D visualization purposes, are not always published in visualization venues, sometimes making it challenging to synthesize the latest interaction and visualization results. We therefore introduce a taxonomy of interaction techniques for 3D visualization. The taxonomy is organized along two axes: the primary source of input on the one hand, and the visualization task supported on the other. Surveying the state of the art allows us to highlight specific challenges and missed opportunities for research in 3D visualization. In particular, we call for additional research in: (1) controlling 3D visualization widgets to help scientists better understand their data; (2) 3D interaction techniques for dissemination, which are under-explored yet show great promise for helping museums and science centers in their mission to share recent knowledge; and (3) developing new measures that move beyond traditional time and error metrics for evaluating visualizations that include spatial interaction.
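A two-axis taxonomy like the one the survey describes can be pictured as a grid of cells, each indexed by (input source, visualization task): classified techniques fill cells, and empty cells expose research gaps. The sketch below is illustrative only; the axis values and class names are assumptions, not the survey's actual categories.

```python
from collections import defaultdict

# Hypothetical axis values for illustration; the survey defines its own.
AXES = {
    "input_source": {"touch", "tangible", "gesture", "pen", "controller"},
    "task": {"selection", "navigation", "filtering", "annotation"},
}

class Taxonomy:
    """Grid of (input_source, task) cells, each holding techniques."""

    def __init__(self):
        self._cells = defaultdict(list)  # (input_source, task) -> [techniques]

    def classify(self, technique, input_source, task):
        if input_source not in AXES["input_source"] or task not in AXES["task"]:
            raise ValueError("unknown axis value")
        self._cells[(input_source, task)].append(technique)

    def gaps(self):
        # Empty cells highlight under-explored combinations.
        return [(s, t) for s in AXES["input_source"] for t in AXES["task"]
                if not self._cells[(s, t)]]

tax = Taxonomy()
tax.classify("3D lasso selection", "gesture", "selection")
```

With one technique classified, `tax.gaps()` lists the remaining 19 of the 20 cells, which is how such a grid surfaces missed opportunities.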

    Portallax: Bringing 3D display capabilities to handhelds

    We present Portallax, a clip-on technology that retrofits mobile devices with 3D display capabilities. Available technologies (e.g. the Nintendo 3DS or LG Optimus) and clip-on solutions (e.g. 3DeeSlide and Grilli3D) force users to keep their head and the device in fixed positions. This is contradictory to the nature of a mobile scenario and limits the use of interaction techniques such as tilting the device to control a game. Portallax uses an actuated parallax barrier and face tracking to realign the barrier's position to the user's position. This allows us to provide stereo, motion parallax, and perspective correction cues within 60 degrees in front of the device. Our optimized barrier design minimizes colour distortion, maximizes resolution, and produces bigger view zones, which support ~81% of adults' interpupillary distances and allow eye tracking to be implemented with the front camera. We present a reference implementation, evaluate its key features, and provide example applications illustrating the potential of Portallax.
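The core idea of realigning a parallax barrier to a tracked head can be sketched with simple thin-barrier geometry: shifting the barrier by (gap x tan theta), where theta is the viewer's off-axis angle, re-centres the stereo view zones on the viewer. This is a minimal sketch under that assumed geometry; the actual Portallax design also optimizes for colour distortion, resolution, and view-zone size, which this ignores.

```python
import math

def barrier_offset(head_x_mm, head_z_mm, barrier_gap_mm=0.3):
    """Lateral shift (mm) for an actuated parallax barrier so that the
    stereo view zones follow a face-tracked head.

    head_x_mm: lateral head offset from the screen centre axis.
    head_z_mm: viewing distance from the screen.
    barrier_gap_mm: assumed gap between barrier and pixel plane.
    """
    # Off-axis viewing angle of the tracked head.
    theta = math.atan2(head_x_mm, head_z_mm)
    # Thin-barrier approximation: shift by gap * tan(theta).
    return barrier_gap_mm * math.tan(theta)
```

For a head centred on the axis the shift is zero; at 45 degrees off-axis the barrier shifts by exactly one barrier gap.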

    Exploring the Front Touch Interface for Virtual Reality Headsets

    In this paper, we propose a new interface for virtual reality headsets: a touchpad on the front of the headset. To demonstrate the feasibility of the front touch interface, we built a prototype device, explored the expanded VR UI design space, and performed various user studies. We started with preliminary tests to see how intuitively and accurately people can interact with the front touchpad. Then, we further experimented with various user interfaces, such as a binary selection, a typical menu layout, and a keyboard. Two-Finger and Drag-n-Tap techniques were also explored to find an appropriate selection technique. As a low-cost, lightweight, and low-power technology, a touch sensor can make an ideal interface for mobile headsets. Also, the front touch area can be large enough to allow a wide range of interaction types, such as multi-finger interactions. With this novel front touch interface, we pave the way to new virtual reality interaction methods.
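A drag-then-tap selection technique like the one named above can be modelled as a tiny state machine: dragging moves a highlight across candidate targets, and a tap commits the highlighted one. The behaviour below is an assumption for illustration; the paper's exact Drag-n-Tap mechanics are not specified in this abstract.

```python
class DragNTap:
    """Illustrative drag-to-highlight, tap-to-confirm selection model
    for a front touchpad with a linear row of targets."""

    def __init__(self, items):
        self.items = items
        self.highlight = 0      # index of the currently highlighted item
        self.selected = None

    def drag(self, delta_items):
        # A drag moves the highlight by a number of item slots,
        # clamped to the valid range.
        self.highlight = max(0, min(len(self.items) - 1,
                                    self.highlight + delta_items))

    def tap(self):
        # A tap confirms whatever is highlighted.
        self.selected = self.items[self.highlight]
        return self.selected

pad = DragNTap(["back", "play", "menu"])
pad.drag(2)
```

After the drag, `pad.tap()` confirms "menu"; a separation of highlight and commit like this is what distinguishes drag-then-tap from direct tapping, where an eyes-free touch might land on the wrong target.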

    Piloting Multimodal Learning Analytics using Mobile Mixed Reality in Health Education

    © 2019 IEEE. Mobile mixed reality has been shown to increase achievement and lower cognitive load within spatial disciplines. However, traditional methods of assessment restrict examiners' ability to holistically assess spatial understanding. Multimodal learning analytics investigates how combinations of data types, such as spatial data and traditional assessment, can be combined to better understand both the learner and the learning environment. This paper explores the pedagogical possibilities of a smartphone-enabled mixed reality multimodal learning analytics case study for health education, focused on learning the anatomy of the heart. The context for this study is the first loop of a design-based research study exploring the acquisition and retention of knowledge by piloting the proposed system with practicing health experts. Outcomes from the pilot study showed engagement with and enthusiasm for the method among the experts, but also revealed problems to overcome in the pedagogical method before deployment with learners.

    Interaction for Immersive Analytics

    In this chapter, we briefly review the development of natural user interfaces and discuss their role in providing human-computer interaction that is immersive in various ways. We then examine some opportunities for how these technologies might be used to better support data analysis tasks. Specifically, we review and suggest interaction design guidelines for immersive analytics. We also review some hardware setups for data visualization that have already become archetypal. Finally, we look at some emerging system designs that suggest future directions.

    Advanced displays and natural user interfaces to support learning

    Advanced displays and Natural User Interfaces (NUI) are a very suitable combination for developing systems that provide an enhanced and richer user experience. This combination can be appropriate in several fields but has not been extensively exploited. One field for which this combination is especially suitable is education. Nowadays, children grow up playing computer games and using mobile devices and other technological devices. New learning methods that use these technologies can help in the learning process. In this paper, two new methods that use advanced displays and NUI for learning about a period of history are presented. The first method is an autostereoscopic system that lets children see themselves as a background in the game and renders the elements in 3D without the need for special glasses; the second is a frontal projection system that projects the image on a table in 2D and works similarly to a touch table. The Microsoft Kinect© is used in both systems for the interaction. A comparative study was carried out to check different aspects of the two systems. A total of 128 children from 7 to 11 years old participated in the study. From the results, we observed that the differing characteristics of the systems did not influence the children's acquired knowledge, engagement, or satisfaction. There were statistically significant differences for depth perception and presence, for which the autostereoscopic system was scored higher. However, of the two systems, the children considered the frontal projection easier to use. We would like to highlight that the scores for the two systems, and for all the questions, were very high. These results suggest that games of this kind (advanced displays and NUI) could be appropriate educational games and that autostereoscopy is a technology to exploit in their development.
    This work was funded by the Spanish Ministry of Science and Innovation through the APRENDRA project (TIN2009-14319-C02-01). Martín San José, JF.; Juan, M.; Mollá Vayá, RP.; Vivó Hernando, RA. (2017). Advanced displays and natural user interfaces to support learning. Interactive Learning Environments. https://doi.org/10.1080/10494820.2015.1090455

    MUITC - IIF 2013-2014

    Principal Investigator: Newton D'souza, PhD, Associate Professor, Department of Architectural Studies. "Final Project Report." Final report of a 2013/2014 IIF project, "iSTUDIO: An Interactive Form-making Environment for Art and Architectural Teaching." Proposal Summary: Our MUITC-IIF proposal involved a proof-of-concept for an interactive form-making environment for Art and Architectural Studies. We created a 3D digital environment intended to foster interaction between students and instructors for the creative generation of ideas, critique, and debate. This is a platform in which students and instructors can 'visualize' their designs in a shared forum. The 3D environment is intended to extend and enhance the physical artifacts that students already produce when visualizing art and space. MU Interdisciplinary Innovations Fund.