    User quality of experience of mulsemedia applications

    User Quality of Experience (QoE) is of fundamental importance in multimedia applications and has been extensively studied for decades. However, user QoE in the context of the emerging multiple-sensorial media (mulsemedia) services, which involve different media components than traditional multimedia applications, has not been comprehensively studied. This article presents the results of subjective tests which investigated user perception of mulsemedia content. In particular, the impact of the intensity of certain mulsemedia components, including haptic and airflow, on user-perceived experience is studied. Results demonstrate that by making use of mulsemedia, overall user enjoyment levels increased by up to 77%.

    Beyond multimedia adaptation: Quality of experience-aware multi-sensorial media delivery

    Multiple sensorial media (mulsemedia) combines multiple media elements which engage three or more of the human senses and, like most other media content, requires support for delivery over existing networks. This paper proposes an adaptive mulsemedia framework (ADAMS) for delivering scalable video and sensorial data to users. Unlike existing two-dimensional joint source-channel adaptation solutions for video streaming, the ADAMS framework includes three joint adaptation dimensions: video source, sensorial source, and network optimization. Using an MPEG-7 description scheme, ADAMS recommends the integration of multiple sensorial effects (e.g., haptic, olfaction, air motion) as metadata into multimedia streams. The ADAMS design includes both coarse- and fine-grained adaptation modules on the server side: mulsemedia flow adaptation and packet priority scheduling. Feedback from subjective quality evaluation and network conditions is used to drive the two modules. The subjective evaluation investigated users' enjoyment levels when exposed to mulsemedia and multimedia sequences, respectively, and users' preference levels for certain sensorial effects in the context of mulsemedia sequences with video components at different quality levels. Results of the subjective study inform guidelines for an adaptive strategy that selects the optimal combination of video segments and sensorial data for a given bandwidth constraint and user requirement. User perceptual tests show how ADAMS outperforms existing multimedia delivery solutions in terms of both user-perceived quality and user enjoyment during adaptive streaming of various mulsemedia content. In doing so, it highlights the case for tailored, adaptive mulsemedia delivery over traditional multimedia adaptive transport mechanisms.
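The abstract describes selecting an optimal combination of video quality and sensorial streams under a bandwidth constraint. The following is a minimal illustrative sketch of that selection idea; the bitrates, stream names, weights, and the `select_combination` function are all hypothetical assumptions, not taken from the ADAMS framework itself.

```python
# Hypothetical bitrates in kbps; illustrative only, not from ADAMS.
VIDEO_LAYERS = {"low": 500, "medium": 1500, "high": 4000}
SENSORIAL_STREAMS = {"haptic": 64, "airflow": 16, "olfaction": 8}

# Assumed relative utility of each video layer (higher quality = higher score).
VIDEO_WEIGHT = {"low": 1.0, "medium": 2.0, "high": 3.0}

def select_combination(bandwidth_kbps, user_preferences):
    """Pick the video layer plus the subset of sensorial streams that
    maximizes a simple additive utility score without exceeding the
    bandwidth budget. `user_preferences` maps stream name -> weight."""
    best, best_score = None, -1.0
    names = list(SENSORIAL_STREAMS)
    for layer, rate in VIDEO_LAYERS.items():
        # Enumerate every subset of sensorial streams (2^n bitmasks).
        for mask in range(1 << len(names)):
            subset = [n for i, n in enumerate(names) if mask >> i & 1]
            total = rate + sum(SENSORIAL_STREAMS[n] for n in subset)
            if total > bandwidth_kbps:
                continue  # combination does not fit the budget
            score = VIDEO_WEIGHT[layer] + sum(
                user_preferences.get(n, 0.0) for n in subset)
            if score > best_score:
                best, best_score = (layer, subset), score
    return best
```

A real system would replace the brute-force subset enumeration and the additive score with the paper's subjectively derived preference model, but the shape of the decision (quality level plus effect subset under one budget) is the same.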

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through the association between the real and virtual world, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii

    Perceived synchronization of mulsemedia services

    Multimedia synchronization involves a temporal relationship between audio and visual media components. The presentation of "in-sync" data streams is essential to achieve a natural impression, as "out-of-sync" effects are often associated with a decrease in user quality of experience (QoE). Recently, multi-sensory media (mulsemedia) has been demonstrated to provide a highly immersive experience for its users. Unlike traditional multimedia, mulsemedia consists of other media types (e.g., haptic, olfaction, taste) in addition to audio and visual content. Therefore, the goal of achieving high-quality mulsemedia transmission is to present few or no synchronization errors between the multiple media components. In order to achieve this ideal synchronization, there is a need for comprehensive knowledge of the synchronization requirements at the user interface. This paper presents the results of a subjective study carried out to explore the temporal boundaries within which haptic and air-flow media objects can be successfully synchronized with video media. Results show that skews between sensorial media and multimedia might still give the effect that the mulsemedia sequence is "in-sync" and provide certain constraints under which synchronization errors might be tolerated. The outcomes of the paper are used to provide recommendations for mulsemedia service providers in order for their services to be associated with acceptable user experience levels, e.g., haptic media could be presented with a delay of up to 1 s behind video content, while air-flow media could be released either 5 s ahead of or 3 s behind video content.
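The tolerance windows reported above (haptic up to 1 s behind video; air-flow from 5 s ahead to 3 s behind) can be expressed as a simple skew check. This is a sketch under assumptions: the sign convention, the `is_in_sync` helper, and the zero lower bound for haptic (the abstract gives no ahead-of-video tolerance for haptic) are all choices made here, not part of the paper.

```python
# Skew tolerance windows in seconds, from the reported study results.
# Convention assumed here: positive skew = effect rendered AFTER the
# video event ("behind"); negative = BEFORE ("ahead"). The haptic
# lower bound of 0.0 is a conservative assumption, since the abstract
# only states a "behind" tolerance for haptic media.
TOLERANCE = {
    "haptic": (0.0, 1.0),    # up to 1 s behind video
    "airflow": (-5.0, 3.0),  # 5 s ahead to 3 s behind video
}

def is_in_sync(media_type: str, effect_time: float, video_event_time: float) -> bool:
    """Return True if the effect's skew relative to the video event
    falls inside the tolerated window for that media type."""
    lo, hi = TOLERANCE[media_type]
    skew = effect_time - video_event_time
    return lo <= skew <= hi
```

A delivery system could use such a check to decide whether a late or early sensorial packet should still be rendered or silently dropped.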

    Impact of haptic 'touching' technology on cultural applications

    No abstract available

    The Perceptual Experience Of Slope By Foot And By Finger

    Historically, the bodily senses have often been regarded as impeccable sources of spatial information and as being the teacher of vision. Here, the authors report that the haptic perception of slope by means of the foot is greatly exaggerated. The exaggeration is present in verbal as well as proprioceptive judgments. It is shown that this misperception of pedal slope is not caused by calibration to the well-established visual misperception of slope, because it is present in congenitally blind individuals as well. The pedal misperception of slope is contrasted with the perception of slope by dynamic touch with a finger in a force-feedback device. Although slopes feel slightly exaggerated even when explored by finger, they tend to show much less exaggeration than when equivalent slopes are stood on. The results are discussed in terms of a theory of coding efficiency.

    Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning

    We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback – haptics – to provide cues and constraints that guide the user's interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used to specify more complex, highly segmented (e.g. lawnmower survey) paths.