
    Embodied geosensification-models, taxonomies and applications for engaging the body in immersive analytics of geospatial data

    This thesis examines how we can use immersive multisensory displays and body-focused interaction technologies to analyze geospatial data. It merges relevant aspects from an array of interdisciplinary research areas, from cartography to the cognitive sciences, to form three taxonomies that describe the senses, data representations, and interactions made possible by these technologies. These taxonomies are then integrated into an overarching design model for such "Embodied Geosensifications". This model provides guidance for system specification and is validated with practical examples.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Multisensory Integration as per Technological Advances: A Review

    Multisensory integration research has allowed us to better understand how humans integrate sensory information to produce a unitary experience of the external world. However, this field is often challenged by the limited ability to deliver and control sensory stimuli, especially when going beyond audio–visual events and outside laboratory settings. In this review, we examine the scope and challenges of new technology in the study of multisensory integration in a world that is increasingly characterized as a fusion of physical and digital/virtual events. We discuss multisensory integration research through the lens of novel multisensory technologies and, thus, bring research in human–computer interaction, experimental psychology, and neuroscience closer together. Today, for instance, displays have become volumetric so that visual content is no longer limited to 2D screens, new haptic devices enable tactile stimulation without physical contact, olfactory interfaces provide users with smells precisely synchronized with events in virtual environments, and novel gustatory interfaces enable taste perception through levitating stimuli. These technological advances offer new ways to control and deliver sensory stimulation for multisensory integration research beyond traditional laboratory settings and open up new experimentation on naturally occurring events in everyday life. Our review then summarizes these multisensory technologies and discusses initial insights to introduce a bridge between the disciplines in order to advance the study of multisensory integration.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science, haptic technology, and haptic applications.

    Audio-Material Modeling and Reconstruction for Multimodal Interaction

    Interactive virtual environments enable the creation of training simulations, games, and social applications. These virtual environments can create a sense of presence: a sensation that the user is truly in another location. To maintain presence, interactions with virtual objects should engage multiple senses. Furthermore, multisensory input should be consistent, e.g., a virtual bowl that visually appears plastic should also sound like plastic when dropped on the floor. In this dissertation, I propose methods to improve the perceptual realism of virtual object impact sounds and ensure consistency between those sounds and the input from other senses. Recreating the impact sound of a real-world object requires an accurate estimate of that object's material parameters. The material parameters that affect impact sound, collectively forming the audio-material, include the material damping parameters for a damping model. I propose and evaluate damping models and use them to estimate material damping parameters for real-world objects. I also consider how interaction with virtual objects can be made more consistent across the senses of sight, hearing, and touch. First, I present a method for modeling the damping behavior of impact sounds, using generalized proportional damping both to estimate more expressive material damping parameters from recorded impact sounds and to perform impact sound synthesis. Next, I present a method for estimating material damping parameters in the presence of confounding factors and with no knowledge of the object's shape. To accomplish this, a probabilistic damping model captures various external effects to produce robust damping parameter estimates. Next, I present a method for consistent multimodal interaction with textured surfaces. Texture maps serve as a single unified representation of mesoscopic detail for the purposes of visual rendering, sound synthesis, and rigid-body simulation. Finally, I present a method for geometry and material classification using multimodal audio-visual input. Using this method, a real-world scene can be scanned and virtually reconstructed while accurately modeling both the visual appearance and the audio-material parameters of each object.
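
    A minimal sketch of the impact-sound idea described above, assuming a standard sum-of-damped-sinusoids (modal) formulation with Rayleigh proportional damping; the dissertation's generalized proportional damping model is not reproduced here, and the function name, mode frequencies, and parameter values below are illustrative assumptions only.

        import numpy as np

        def modal_impact_sound(freqs_hz, amps, alpha, beta, duration=1.0, sr=44100):
            """Sum of exponentially damped sinusoids. Per-mode decay follows the
            Rayleigh (proportional) damping relation d_i = (alpha + beta*omega_i^2)/2,
            with alpha and beta playing the role of audio-material damping parameters."""
            t = np.arange(int(duration * sr)) / sr
            out = np.zeros_like(t)
            for f, a in zip(freqs_hz, amps):
                omega = 2.0 * np.pi * f
                d = (alpha + beta * omega ** 2) / 2.0  # modal decay rate (1/s)
                out += a * np.exp(-d * t) * np.sin(omega * t)
            peak = np.max(np.abs(out))
            return out / peak if peak > 0 else out  # normalize to avoid clipping

        # Illustrative modes of a small object; alpha and beta are made-up values.
        audio = modal_impact_sound([420.0, 1150.0, 2300.0], [1.0, 0.6, 0.3],
                                   alpha=8.0, beta=1e-7)

    Under this formulation, estimating the audio-material from a recording amounts to fitting the per-mode decay rates and solving for alpha and beta.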

    Gesture Interaction at a Distance

    The aim of this work is to explore, from a perspective of human behavior, which gestures are suited to controlling large display surfaces from a short distance away; why that is so; and, equally important, how such an interface can be made a reality. A well-known example of the type of interface that is the focus of this thesis is portrayed in the science fiction movie 'Minority Report'. The lead character of this movie uses hand gestures such as pointing, picking up, and throwing away to interact with a wall-sized display in a believable way. Believable, because the gestures are familiar from everyday life and because the interface responds predictably. Although only fictional in this movie, such gesture-based interfaces can, when realized, be applied in any environment that is equipped with large display surfaces. For example, in a laboratory for analyzing and interpreting large data sets; in interactive shopping windows to casually browse a product list; and in the operating room to easily access a patient's MRI scans. The common denominator is that the user cannot or may not touch the display: the interaction occurs at arm's length and greater distances.

    Merging the Real and the Virtual: An Exploration of Interaction Methods to Blend Realities

    We investigate, build, and design interaction methods to merge the real with the virtual. An initial investigation looks at spatial augmented reality (SAR) and its effects on pointing with a real mobile phone. A study reveals a set of trade-offs between the raycast, viewport, and direct pointing techniques. To further investigate the manipulation of virtual content within a SAR environment, we design an interaction technique that utilizes the distance at which a user holds a mobile phone away from their body. Our technique enables pushing virtual content from a mobile phone to an external SAR environment, interacting with that content, applying rotate-scale-translate manipulations, and pulling the content back into the mobile phone. This is all done in a way that ensures seamless transitions between the real environment of the mobile phone and the virtual SAR environment. To investigate the issues that occur when the physical environment is hidden by a fully immersive virtual reality (VR) HMD, we design and investigate a system that merges a real-time 3D reconstruction of the real world with a virtual environment. This allows users to freely move, manipulate, observe, and communicate with people and objects situated in their physical reality without losing their sense of immersion or presence inside a virtual world. A study with VR users demonstrates the affordances provided by the system and how it can be used to enhance current VR experiences. We then move to AR, to investigate the limitations of optical see-through HMDs and the problem of communicating the internal state of the virtual world to unaugmented users. To address these issues and enable new ways to visualize, manipulate, and share virtual content, we propose a system that combines an optical see-through HMD with a wearable SAR projector. Demonstrations showcase ways to utilize the projected and head-mounted displays together, such as expanding the field of view, distributing content across depth surfaces, and enabling bystander collaboration. We then turn to videogames to investigate how spectatorship of these virtual environments can be enhanced through expanded video rendering techniques. We extract and combine additional data to form a cumulative 3D representation of the live game environment for spectators, which enables each spectator to individually control a personal view into the stream while in VR. A study shows that users prefer spectating in VR when compared with a comparable desktop rendering.
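
    A minimal sketch of the phone-to-body distance mapping described above, assuming a simple threshold-with-hysteresis design; the threshold values, names, and state representation are illustrative assumptions, not details taken from the thesis.

        from dataclasses import dataclass

        # Hypothetical thresholds (metres); the thesis does not specify these values.
        PUSH_THRESHOLD = 0.45  # arm extended: push content out to the SAR surface
        PULL_THRESHOLD = 0.20  # phone drawn back to the body: pull content back

        @dataclass
        class ContentState:
            on_sar_surface: bool = False

        def update(state: ContentState, phone_to_body_m: float) -> ContentState:
            """Toggle where the content lives based on how far the phone is held from
            the body, with hysteresis so small hand jitter does not flip the state."""
            if not state.on_sar_surface and phone_to_body_m > PUSH_THRESHOLD:
                state.on_sar_surface = True   # "push" gesture completes
            elif state.on_sar_surface and phone_to_body_m < PULL_THRESHOLD:
                state.on_sar_surface = False  # "pull" gesture completes
            return state

    The gap between the two thresholds is what makes the transition feel deliberate rather than accidental when the hand hovers near a single cutoff.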

    Small UAS Detect and Avoid Requirements Necessary for Limited Beyond Visual Line of Sight (BVLOS) Operations

    Potential small Unmanned Aircraft Systems (sUAS) beyond visual line of sight (BVLOS) operational scenarios/use cases and Detect And Avoid (DAA) approaches were collected through a number of industry-wide data calls. Every Section 333 exemption holder was solicited for this same information. Summary information from more than 5,000 exemption holders is documented; the information received had varying levels of detail but provided relevant experiential information from which to generalize use cases. A plan was developed and testing completed to assess Radio Line Of Sight (RLOS), a potential key limiting factor for safe BVLOS operations. Details of the equipment used, the flight test area, the test payload, and the fixtures for testing at different altitudes are presented, and the resulting comparison of a simplified mathematical model, an online modeling tool, and flight data is provided. An Operational Framework that defines the environment, conditions, constraints, and limitations under which the recommended requirements will enable sUAS BVLOS operations is presented. The framework includes strategies that can build upon Federal Aviation Administration (FAA) and industry actions and that should result in an increase in BVLOS flights in the near term. Evaluating approaches to sUAS DAA was accomplished through five subtasks: a literature review of pilot and ground observer see-and-avoid performance, a survey of DAA criteria and recommended baseline performance, a survey of existing and developing DAA technologies and their performance, an assessment of the risks of selected DAA approaches, and flight testing. Pilot and ground observer see-and-avoid performance was evaluated through a literature review. Development of DAA criteria (the emphasis here being well clear) was accomplished through working with the Science And Research Panel (SARP) and through simulations of manned and unmanned aircraft interactions. Information regarding sUAS DAA approaches was collected through a literature review, requests for information, and direct interactions; these were analyzed through delineation of system type and definition of metrics and metric values. Risks associated with sUAS DAA systems were assessed by focusing on the Safety Risk Management (SRM) pillar of the Safety Management System (SMS) process. This effort (1) identified hazards related to sUAS BVLOS operations, (2) offered a preliminary risk assessment considering existing controls, and (3) recommended additional controls and mitigations to further reduce risk to the lowest practical level. Finally, flight tests were conducted to collect preliminary data regarding well-clear and DAA system hazards.
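
    A common simplified RLOS model of the kind the report compares against an online tool and flight data is the 4/3-effective-Earth-radius radio-horizon approximation; the sketch below uses that textbook formula with illustrative heights and may not match the report's exact model.

        import math

        def radio_horizon_km(ground_antenna_m: float, aircraft_agl_m: float) -> float:
            """Radio horizon under the standard 4/3-effective-Earth-radius assumption:
            d [km] ~= 4.12 * (sqrt(h_tx [m]) + sqrt(h_rx [m]))."""
            return 4.12 * (math.sqrt(ground_antenna_m) + math.sqrt(aircraft_agl_m))

        # Example: 2 m ground control station antenna, sUAS at 120 m AGL (illustrative)
        print(f"{radio_horizon_km(2.0, 120.0):.1f} km")  # ~51 km theoretical upper bound

    Terrain, obstructions, and the actual link budget reduce usable range well below this geometric bound, which is why flight-test comparison against the simplified model matters.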