
    Gesture-Controlled Interaction with Aesthetic Information Sonification

    Information representation in augmented and virtual reality systems, and in social physical (building) spaces, can enhance the efficacy of interacting with and assimilating abstract, non-visual data. Sonification is the process of automatically generating real-time auditory representations of information. There is a gap in our implementation and knowledge of auditory display systems used to enhance interaction in virtual and augmented reality. This paper addresses that gap by examining methodologies for mapping socio-spatial data to spatialised sonification manipulated with gestural controllers. This is a system of interactive knowledge representation that completes the human integration loop, enabling the user to interact with and manipulate data using 3D spatial gesture and 3D auditory display. Benefits include: 1) added immersion in an augmented or virtual reality interface; 2) auditory display avoids visual overload in visually saturated tasks such as designing, emergency evacuation, flying aircraft, and computer gaming; and 3) bi-modal or auditory representation, due to its time-based character, facilitates cognition of complex information.
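
    As a rough illustration of the kind of parameter mapping this paper studies, the sketch below converts one socio-spatial data sample into pitch, stereo pan, and amplitude. The value ranges, base frequency, and attenuation model are illustrative assumptions, not the authors' mapping.

```python
import math

def map_to_sonification(value, x, y, v_min=0.0, v_max=1.0):
    """Map one socio-spatial data sample to basic spatial-audio parameters.

    `value` drives pitch; the (x, y) position drives pan and attenuation.
    All ranges and scales here are illustrative assumptions.
    """
    # Normalise the data value into [0, 1].
    t = (value - v_min) / (v_max - v_min)
    # Pitch: a two-octave range above 220 Hz (exponential, so equal data
    # steps give roughly equal perceived pitch steps).
    frequency_hz = 220.0 * (2.0 ** (2.0 * t))
    # Pan: x position in [-1, 1] becomes left/right balance.
    pan = max(-1.0, min(1.0, x))
    # Amplitude: fade with distance from a listener at the origin.
    distance = math.hypot(x, y)
    amplitude = 1.0 / (1.0 + distance)
    return frequency_hz, pan, amplitude

# Example: a mid-range value slightly to the listener's right.
print(map_to_sonification(0.5, x=0.4, y=1.0))
```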

    Designing a 3D Gestural Interface to Support User Interaction with Time-Oriented Data as Immersive 3D Radar Chart

    The design of intuitive three-dimensional user interfaces is vital for interaction in virtual reality, effectively closing the loop between a human user and the virtual environment. The utilization of 3D gestural input allows for useful hand interaction with virtual content, either by directly grasping visible objects or through invisible gestural commands that are associated with corresponding features in the immersive 3D space. The design of such interfaces remains complex and challenging. In this article, we present a design approach for a three-dimensional user interface using 3D gestural input with the aim of facilitating user interaction within the context of Immersive Analytics. Based on a scenario of exploring time-oriented data in immersive virtual reality using 3D Radar Charts, we implemented a rich set of features that is closely aligned with relevant 3D interaction techniques, data analysis tasks, and aspects of hand posture comfort. We conducted an empirical evaluation (n=12), featuring a series of representative tasks to evaluate the developed user interface design prototype. The results, based on questionnaires, observations, and interviews, indicate good usability and an engaging user experience. We reflect on the implemented hand-based grasping and gestural command techniques, identifying aspects for improvement with regard to hand detection and precision, and emphasizing the prototype's ability to infer user intent for better prevention of unintentional gestures. (Comment: 30 pages, 6 figures, 2 tables)
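
    One way to make the grasping interaction concrete is a hysteresis-based pinch detector over tracked thumb and index fingertip positions. The distance thresholds and the two-threshold (engage/release) scheme are assumptions for illustration; the paper's own hand-detection pipeline is not reproduced here.

```python
import math

PINCH_ON = 0.02   # metres; engage threshold (assumed value)
PINCH_OFF = 0.03  # metres; release threshold, wider to avoid jitter

class PinchDetector:
    """Hysteresis-based pinch detector over thumb/index tip positions."""

    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        d = math.dist(thumb_tip, index_tip)
        if not self.pinching and d < PINCH_ON:
            self.pinching = True   # grab begins
        elif self.pinching and d > PINCH_OFF:
            self.pinching = False  # grab ends
        return self.pinching

detector = PinchDetector()
print(detector.update((0.0, 0.0, 0.0), (0.015, 0.0, 0.0)))  # True: fingers close
print(detector.update((0.0, 0.0, 0.0), (0.05, 0.0, 0.0)))   # False: released
```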

    Deep Learning-Based Human Pose Estimation: A Survey

    Human pose estimation aims to locate the human body parts and build a human body representation (e.g., a body skeleton) from input data such as images and videos. It has drawn increasing attention during the past decade and has been utilized in a wide range of applications including human-computer interaction, motion analysis, augmented reality, and virtual reality. Although recently developed deep learning-based solutions have achieved high performance in human pose estimation, challenges remain due to insufficient training data, depth ambiguities, and occlusion. The goal of this survey is to provide a comprehensive review of recent deep learning-based solutions for both 2D and 3D pose estimation via a systematic analysis and comparison of these solutions based on their input data and inference procedures. More than 240 research papers since 2014 are covered. Furthermore, 2D and 3D human pose estimation datasets and evaluation metrics are included. Quantitative performance comparisons of the reviewed methods on popular datasets are summarized and discussed. Finally, the remaining challenges, applications, and future research directions are discussed. We also provide a regularly updated project page: https://github.com/zczcwh/DL-HPE.
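
    Many surveyed 2D methods are heatmap-based; as a minimal sketch of the final inference step such methods share, the snippet below decodes per-joint heatmaps into keypoints via an argmax. This is a generic illustration, not code from the survey or its project page.

```python
import numpy as np

def decode_heatmaps(heatmaps):
    """Convert per-joint heatmaps of shape (J, H, W) into (x, y, confidence).

    Heatmap-based 2D estimators typically end with a step like this: the
    predicted location of each joint is the peak of its heatmap.
    """
    joints = []
    for hm in heatmaps:
        y, x = np.unravel_index(np.argmax(hm), hm.shape)
        joints.append((float(x), float(y), float(hm[y, x])))
    return joints

# Toy example: one 8x8 heatmap with a single confident peak.
hm = np.zeros((1, 8, 8), dtype=np.float32)
hm[0, 3, 5] = 0.9
print(decode_heatmaps(hm))  # [(5.0, 3.0, 0.9...)]
```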

    Designing interactive virtual environments with feedback in health applications.

    One of the most important factors influencing user experience in human-computer interaction is the user's emotional reaction. Interactive environments, including serious games, that respond to user emotions improve their effectiveness and user satisfaction. Testing and training for emotional competence is meaningful in the healthcare field, which motivated us to analyze immersive affective games using emotional feedback. In this dissertation, a systematic model for designing interactive environments is presented, consisting of three essential modules: affect modeling, affect recognition, and affect control. To collect data for analysis and construct these modules, a series of experiments was conducted using virtual reality (VR) to evoke user emotional reactions and monitor those reactions through physiological data. The analysis results lead to a novel framework for designing affective games in virtual reality, covering the interaction mechanism, the graph-based structure, and user modeling. An Oculus Rift was used in the experiments to provide immersive virtual reality with affective scenarios, and a sample application was implemented as a cross-platform VR physical-training serious game for elderly people to demonstrate the essential parts of the framework. Measurements of playability and effectiveness are discussed. The introduced framework can be used as a guiding principle for designing affective VR serious games. Possible healthcare applications include emotional competence training, educational software, and therapy methods.
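
    A toy sketch of the affect-recognition and affect-control loop described above: a crude arousal score derived from heart rate and skin conductance drives a difficulty adjustment. The baselines, weights, and thresholds are invented for illustration and are not the dissertation's model.

```python
from dataclasses import dataclass

@dataclass
class PhysioSample:
    heart_rate_bpm: float       # e.g. from a chest strap or PPG sensor
    skin_conductance_us: float  # galvanic skin response, microsiemens

def estimate_arousal(sample, hr_rest=65.0, sc_rest=2.0):
    """Return a crude arousal score in [0, 1] relative to resting baselines."""
    hr_term = max(0.0, (sample.heart_rate_bpm - hr_rest) / 60.0)
    sc_term = max(0.0, (sample.skin_conductance_us - sc_rest) / 10.0)
    return min(1.0, 0.5 * hr_term + 0.5 * sc_term)

def adapt_difficulty(arousal, current_level):
    """Affect-control step: relax the game when the player is over-aroused."""
    if arousal > 0.7:
        return max(1, current_level - 1)  # ease off
    if arousal < 0.3:
        return current_level + 1          # add challenge
    return current_level

sample = PhysioSample(heart_rate_bpm=95.0, skin_conductance_us=6.0)
a = estimate_arousal(sample)
print(a, adapt_difficulty(a, current_level=3))
```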

    Enabling natural interaction for virtual reality

    This research explores software and methods to support natural interaction within a virtual environment. Natural interaction refers to the ability of the technology to support human interactions with computer-generated simulations that most accurately reflect interactions with real objects. In the years since the invention of computer-aided design tools, computers have become ubiquitous in the product design process. Increasingly, engineers and designers are using immersive virtual reality to evaluate virtual products throughout the entire design process. The goal of this research is to develop tools that support verisimilitude, or likeness to reality, particularly with respect to human interaction with virtual objects. Increasing the verisimilitude of the interactions and experiences in a virtual environment has the potential to increase the external validity of such data, resulting in more reliable decisions and better products. First, interface software is presented that extends the potential reach of virtual reality to include low-cost, consumer-grade motion sensing devices, thus enabling virtual reality on a broader scale. Second, a software platform, VR JuggLua, is developed to enable rapid and iterative creation of natural interactions in virtual environments, including by end-user programmers. Based on this software platform, the remainder of the research focuses on supporting virtual assembly and decision making. The SPARTA software incorporates a powerful physically-based modeling simulation engine tuned for haptic interaction. The workspace of a haptic device is both virtually expanded, through an extension to the bubble technique, and physically expanded, through integration of a haptic device with a multi-directional mobile platform. Finally, a class of hybrid methods for haptic collision detection and response is characterized in terms of five independent tasks. One such novel hybrid method, which selectively restores degrees of freedom in haptic assembly, is developed and assessed with respect to low-clearance CAD assembly. Unlike previous related approaches, it maintains the high 1000 Hz update rate required for stable haptics. Overall, this work forms a pattern of contributions toward enabling natural interaction for virtual reality and advances the ability to use an immersive environment for decision making during product design.
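
    To make the 1000 Hz requirement concrete, here is a minimal fixed-rate haptic loop with a spring-damper penalty force and stubbed device I/O. The stiffness and damping constants are illustrative assumptions; the hybrid collision-response method itself is not reproduced.

```python
import time

STIFFNESS = 800.0   # N/m, spring constant (illustrative)
DAMPING = 2.0       # N*s/m, damping coefficient (illustrative)
RATE_HZ = 1000.0    # stable haptic rendering needs ~1 kHz updates
DT = 1.0 / RATE_HZ

def penalty_force(penetration_m, velocity_m_s):
    """Spring-damper response for a proxy pressed into a virtual surface."""
    if penetration_m <= 0.0:
        return 0.0
    return STIFFNESS * penetration_m - DAMPING * velocity_m_s

def haptic_loop(read_device, send_force, steps=1000):
    """Fixed-rate loop: read position, compute force, command the device."""
    next_tick = time.perf_counter()
    for _ in range(steps):
        depth, vel = read_device()
        send_force(penalty_force(depth, vel))
        next_tick += DT
        sleep = next_tick - time.perf_counter()
        if sleep > 0:
            time.sleep(sleep)

# Stub device I/O so the sketch runs standalone.
haptic_loop(lambda: (0.002, 0.0), lambda f: None, steps=5)
```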

    An Augmented Reality Platform for Preoperative Surgical Planning

    Research into new technologies for diagnosis, planning, and medical treatment has allowed the development of computer tools that provide new ways of representing data obtained from a patient's medical images, such as computed tomography (CT) and magnetic resonance imaging (MRI). In this sense, augmented reality (AR) technologies provide a new form of data representation by combining conventional image-based analysis with the ability to superimpose virtual 3D representations of the organs of the human body onto the real environment. This paper presents the development of a generic computer platform based on augmented reality technology for preoperative surgical planning. In particular, the surgeon can navigate the 3D models of the patient's organs in order to fully understand the anatomy and plan the surgical procedure in the best way. In addition, touchless interaction with the virtual organs is available thanks to an armband equipped with electromyographic muscle sensors. To validate the system, we focused on navigation through the aorta for mitral valve repair surgery.
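
    A minimal sketch of the touchless interaction layer: mapping gesture labels, such as an EMG armband SDK might report, to planning-view commands. The gesture names and commands below are hypothetical, not the platform's actual vocabulary.

```python
# Hypothetical gesture labels as an EMG armband SDK might report them.
GESTURE_COMMANDS = {
    "fist":           "grab_model",      # grab and drag the 3D organ model
    "wave_out":       "rotate_right",    # rotate the model clockwise
    "wave_in":        "rotate_left",
    "fingers_spread": "reset_view",      # return to the default viewpoint
    "double_tap":     "toggle_overlay",  # show/hide the AR overlay
}

def on_gesture(gesture, dispatch):
    """Translate a recognised gesture into a planning-view command."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is not None:
        dispatch(command)

on_gesture("fist", dispatch=print)  # prints: grab_model
```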

    Visualizing Big Data with augmented and virtual reality: challenges and research agenda

    This paper provides a multi-disciplinary overview of the research issues and achievements in the field of Big Data and its visualization techniques and tools. The main aim is to summarize challenges in visualization methods for existing Big Data, as well as to offer novel solutions for issues related to the current state of Big Data visualization. This paper provides a classification of existing data types, analytical methods, visualization techniques, and tools, with particular emphasis placed on surveying the evolution of visualization methodology over the past years. Based on the results, we reveal disadvantages of existing visualization methods. Despite the technological development of the modern world, human involvement (interaction), judgment, and logical thinking remain necessary while working with Big Data. Therefore, the role of human perceptual limitations when dealing with large amounts of information is evaluated. Based on the results, a non-traditional approach is proposed: we discuss how the capabilities of Augmented Reality and Virtual Reality could be applied to the field of Big Data visualization. We discuss the promising utility of integrating Mixed Reality technology with applications in Big Data visualization. Placing the most essential data in the central area of the human visual field in Mixed Reality would allow one to take in the presented information in a short period of time without significant data losses due to human perceptual issues. Furthermore, we discuss the impact of new technologies, such as Virtual Reality displays and Augmented Reality helmets, on Big Data visualization, as well as the classification of the main challenges of integrating these technologies.
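
    The central-field placement idea can be sketched as a simple ranking-to-angle assignment: the most important items land nearest the gaze centre, and less important ones fan outward. The field-of-view numbers and the spacing rule are assumptions for illustration.

```python
def assign_view_angles(items, central_fov_deg=30.0, full_fov_deg=100.0):
    """Place items by importance: most important nearest the gaze centre.

    `items` is a list of (label, importance) pairs. Returns (label, angle)
    pairs, where angle is the horizontal offset from the view centre in
    degrees. The field-of-view values are illustrative assumptions.
    """
    ranked = sorted(items, key=lambda it: it[1], reverse=True)
    placements = []
    for rank, (label, _) in enumerate(ranked):
        # Fan out alternately left/right; spacing grows with rank so the
        # central ~30 degrees stays reserved for the top items.
        side = -1 if rank % 2 else 1
        offset = side * min(full_fov_deg / 2, rank * (central_fov_deg / 4))
        placements.append((label, offset))
    return placements

data = [("revenue", 0.9), ("latency", 0.7), ("errors", 0.95), ("load", 0.4)]
for label, angle in assign_view_angles(data):
    print(f"{label}: {angle:+.1f} deg from centre")
```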

    Towards a taxonomy of virtual reality user interfaces

    Virtual Reality-based user interfaces (VRUIs) are expected to bring about a revolution in computing. VR can potentially communicate large amounts of data in an easily understandable format. VR looks very promising, but it is still a very new interface technology for which very little application-oriented knowledge is available. As a basis for a future VRUI design theory, a taxonomy of VRUIs is required. In this paper, a general model of human-computer communication is formulated. This model constitutes a frame for the integration of partial taxonomies of human-computer interaction found in the literature. The whole constitutes a general user interface taxonomy. The field of VRUIs is described and delimited with respect to this taxonomy.
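
    As one way to picture what a VRUI taxonomy records per interface, the sketch below encodes a classification as a small data structure. The dimensions and categories shown are illustrative assumptions, not the paper's taxonomy.

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative dimensions a VRUI taxonomy might record; the paper's own
# categories are not reproduced here.
class InputModality(Enum):
    TRACKED_HANDS = "tracked hands"
    CONTROLLER = "controller"
    VOICE = "voice"
    GAZE = "gaze"

class OutputModality(Enum):
    STEREO_HMD = "stereoscopic HMD"
    CAVE = "CAVE projection"
    AUDIO = "spatial audio"
    HAPTIC = "haptic"

@dataclass(frozen=True)
class VRUIClassification:
    input_modality: InputModality
    output_modality: OutputModality
    immersion: str  # e.g. "desktop", "semi-immersive", "fully immersive"

example = VRUIClassification(InputModality.TRACKED_HANDS,
                             OutputModality.STEREO_HMD,
                             immersion="fully immersive")
print(example)
```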