11 research outputs found

    Improved Touchless Respiratory Rate Sensing

    Recently, remote respiratory rate (RR) measurement techniques have gained much attention, as they were developed to overcome the limitations of classical device-based methods and manual counting. Many approaches for RR extraction from the video stream of a visible-light camera have been proposed, including the pixel intensity changes method. In this paper, we propose a new method of 1D profile creation for the pixel intensity changes-based method, which significantly increases the algorithm's performance. An additional accuracy gain is obtained via a new method of motion signal grouping presented in this work. We introduce several changes to the standard pipeline, which enable real-time continuous RR monitoring and allow applications in human-computer interaction systems. Evaluation results on two internal datasets and one public dataset showed MAEs of 0.7 BPM, 0.6 BPM, and 1.4 BPM, respectively. Comment: 5 pages, 1 figure, 2 tables. This work was presented at the IMET 2022 workshop on Haptics, AI and RR.
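
    To illustrate the general pixel-intensity-changes idea (not the paper's specific 1D profile creation or motion-signal grouping), a minimal sketch in Python follows: average the intensity of an assumed chest region of interest per frame to form a 1D signal, band-pass filter it around plausible breathing frequencies, and take the dominant spectral peak as the RR estimate. The ROI coordinates, frame rate, and band limits are illustrative assumptions.

        # Minimal sketch of pixel-intensity-based RR estimation (assumptions noted above).
        import numpy as np
        from scipy.signal import butter, filtfilt

        def estimate_rr(frames, fps=30.0, roi=(100, 200, 150, 300)):
            """frames: iterable of grayscale images (H x W numpy arrays)."""
            y0, y1, x0, x1 = roi  # assumed chest region
            signal = np.array([f[y0:y1, x0:x1].mean() for f in frames])
            signal = signal - signal.mean()

            # Band-pass around typical breathing rates (0.1-0.7 Hz, i.e. 6-42 BPM).
            b, a = butter(2, [0.1 / (fps / 2), 0.7 / (fps / 2)], btype="band")
            filtered = filtfilt(b, a, signal)

            # Dominant spectral peak converted to breaths per minute.
            spectrum = np.abs(np.fft.rfft(filtered))
            freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
            band = (freqs >= 0.1) & (freqs <= 0.7)
            return freqs[band][np.argmax(spectrum[band])] * 60.0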

    An Exploration of Just Noticeable Differences in Mid-Air Haptics

    Mid-air haptic feedback technology produces tactile sensations that are felt without the need for physical interactions, wearables or controllers. When designing mid-air haptic stimuli, it is important that they are sufficiently different in terms of their perceived sensation. This paper presents the results of two user studies on mid-air haptic feedback technology, with a focus on the sensations of haptic strength and haptic roughness. More specifically, we used the acoustic pressure intensity and the rotation frequency of the mid-air haptic stimulus as proxies for the two sensations of interest and investigated their Just Noticeable Difference (JND) and Weber fractions. Our results indicate statistical significance in the JND for frequency, with a finer resolution compared to intensity. Moreover, correlations are observed in terms of participants' sensitivity to small changes across the different stimuli presented. We conclude that frequency and intensity are mid-air haptic dimensions of depth 5 and 3, respectively, which we can use to design distinct stimuli that convey perceptually different tactile information to the user.
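
    For reference, the Weber fraction is the JND divided by the reference stimulus magnitude. The hedged sketch below shows one way such a value could be estimated from two-alternative comparison data, by fitting a logistic psychometric function and reading off the 75%-response increment; the criterion, the function choice, and the example numbers are assumptions, not the study's exact analysis.

        # Hedged sketch: estimating a JND and Weber fraction from comparison data.
        import numpy as np
        from scipy.optimize import curve_fit

        def psychometric(delta, alpha, beta):
            # Probability of judging the comparison stimulus as "stronger" than the reference.
            return 1.0 / (1.0 + np.exp(-(delta - alpha) / beta))

        def weber_fraction(deltas, p_stronger, reference):
            """deltas: stimulus increments; p_stronger: proportion of 'stronger' responses."""
            (alpha, beta), _ = curve_fit(psychometric, deltas, p_stronger, p0=[0.0, 1.0])
            jnd = alpha + beta * np.log(0.75 / 0.25)  # increment giving 75% "stronger" responses
            return jnd / reference

        # Made-up example: rotation-frequency increments (Hz) around an assumed 100 Hz reference.
        deltas = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
        p = np.array([0.50, 0.58, 0.70, 0.82, 0.91, 0.97])
        print(weber_fraction(deltas, p, reference=100.0))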

    Object selection and scaling using multimodal interaction in mixed reality

    Mixed Reality (MR) is the next evolution of human-computer interaction, as MR can combine the physical and digital environments and make them coexist. Interaction is still a large research area in Augmented Reality (AR) but far less explored in MR, because current MR display techniques are not yet robust and intuitive enough to let users interact naturally with 3D content. New user interaction techniques have been widely studied; interaction becomes more advanced when the system can invoke more than one input modality. Multimodal interaction aims to deliver intuitive manipulation of multiple objects through gestures. This paper discusses a multimodal interaction technique using gesture and speech, together with the proposed experimental setup for implementing multimodal input in the MR interface. Real hand gestures are combined with speech input in MR to perform spatial object manipulations. The paper explains the implementation stage, which involves interaction using gesture and speech inputs to enhance the user experience in the MR workspace. After gesture input and speech commands are acquired, spatial manipulation for selection and scaling is invoked through multimodal interaction, and the paper ends with a discussion.
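
    As a rough illustration of how such a pipeline might fuse the two modalities (the command vocabulary, the pinch-distance gesture representation, and the fusion time window here are assumptions, not the paper's implementation), a minimal sketch:

        # Hedged sketch of a simple gesture + speech fusion step for MR object manipulation.
        import time
        from dataclasses import dataclass

        @dataclass
        class GestureSample:
            timestamp: float
            target_id: str         # object currently pointed at or pinched
            pinch_distance: float  # metres between thumb tip and index tip

        class MultimodalFusion:
            def __init__(self, scene, window_s=1.5):
                self.scene = scene        # dict: object id -> object with .selected and .scale
                self.window_s = window_s  # assumed window for pairing speech with a gesture
                self.last_gesture = None
                self.grab_distance = None

            def on_gesture(self, sample: GestureSample):
                self.last_gesture = sample

            def on_speech(self, command: str):
                g = self.last_gesture
                if g is None or time.time() - g.timestamp > self.window_s:
                    return  # speech without a recent gesture is ignored
                obj = self.scene[g.target_id]
                if command == "select":
                    obj.selected = True
                    self.grab_distance = g.pinch_distance
                elif command == "scale" and self.grab_distance:
                    # Scale proportionally to how far the pinch has opened or closed.
                    obj.scale *= g.pinch_distance / self.grab_distance
                    self.grab_distance = g.pinch_distance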

    Manifesto for Digital Social Touch in Crisis

    This qualitative exploratory research paper presents a Manifesto for Digital Social Touch in Crisis - a provocative call to action to designers, developers and researchers to rethink and reimagine social touch through a deeper engagement with the social and sensory aspects of touch. This call is motivated by concerns that social touch is in a crisis, signaled by a decline in social touch over the past two decades, the problematics of inappropriate social touch, and the well-documented impact of a lack of social touch on communication, relationships, well-being and health. These concerns shape how social touch enters the digital realm and raise questions about how and when the complex space of social touch is mediated by technologies, as well as the societal implications. The paper situates the manifesto in the key challenges facing haptic designers and developers, identified through a series of interdisciplinary collaborative workshops with participants from computer science, design, engineering, HCI and social science, from both industry and academia, and in the research literature on haptics. The features and purpose of the manifesto form are described, along with our rationale for its use and the method of the manifesto's development. The starting points, opportunities and challenges, dominant themes and tensions that shaped the manifesto statements are then elaborated on. The paper shows the potential of the manifesto form to bridge HCI, computer science and engineering, and the social sciences on the topic of social touch.

    An empirical evaluation of two natural hand interaction systems in augmented reality

    Human-computer interaction based on hand gesture tracking is not uncommon in Augmented Reality. In fact, the most recent optical Augmented Reality devices include this type of natural interaction. However, due to hardware and system limitations, these devices, more often than not, settle for semi-natural interaction techniques, which may not always be appropriate for some of the tasks needed in Augmented Reality applications. For this reason, we compare two different optical Augmented Reality setups equipped with hand tracking. The first is based on a Microsoft HoloLens (released in 2016) and the other on a Magic Leap One (released more than two years later). Both devices offer similar solutions for the visualization and registration problems but differ in the hand tracking approach, since the former uses metaphoric hand-gesture tracking and the latter relies on an isomorphic approach. We raise seven research questions regarding these two setups, which we answer after performing two task-based experiments using virtual elements of different sizes that are moved using natural hand interaction. The questions deal with the accuracy and performance achieved with these setups, and also with user preference, recommendation and perceived usefulness. For this purpose, we collect both subjective and objective data about the completion of these tasks. Our initial hypothesis was that there would be differences, in favor of the isomorphic and newer setup, in the use of hand interaction. However, the results surprisingly show that there are very small objective differences between these setups, and the isomorphic approach is not significantly better in terms of accuracy and mistakes, although it allows faster completion of one of the tasks. In addition, no remarkable statistically significant differences can be found between the two setups in the subjective datasets gathered through a specific questionnaire. We also analyze the opinions of the participants in terms of usefulness, preference and recommendation. The results show that, although the Magic Leap-based system gets more support, the differences are not statistically significant.

    Haptic Interfaces for Virtual Reality: Challenges and Research Directions

    The sense of touch (haptics) has been applied in several areas such as tele-haptics, telemedicine, training, education, and entertainment. Today, haptics is used and explored by researchers in many more multi-disciplinary and inter-disciplinary areas. The use of haptics is also enhanced with other forms of media such as audio, video, and even the sense of smell. For example, the use of haptics is prevalent in virtual reality environments to increase the immersive experience for users. However, while there has been significant progress in haptic interfaces throughout the years, there are still many challenges that limit their development. This review highlights haptic interfaces for virtual reality, ranging from wearables, handhelds, encountered-type devices, and props to mid-air approaches. We discuss and summarize these approaches, along with interaction domains such as skin receptors, object properties, and force, in order to arrive at design challenges for each interface along with existing research gaps.

    Multisensory Integration as per Technological Advances: A Review

    Multisensory integration research has allowed us to better understand how humans integrate sensory information to produce a unitary experience of the external world. However, this field is often challenged by the limited ability to deliver and control sensory stimuli, especially when going beyond audio–visual events and outside laboratory settings. In this review, we examine the scope and challenges of new technology in the study of multisensory integration in a world that is increasingly characterized as a fusion of physical and digital/virtual events. We discuss multisensory integration research through the lens of novel multisensory technologies and, thus, bring research in human–computer interaction, experimental psychology, and neuroscience closer together. Today, for instance, displays have become volumetric so that visual content is no longer limited to 2D screens, new haptic devices enable tactile stimulation without physical contact, olfactory interfaces provide users with smells precisely synchronized with events in virtual environments, and novel gustatory interfaces enable taste perception through levitating stimuli. These technological advances offer new ways to control and deliver sensory stimulation for multisensory integration research beyond traditional laboratory settings and open up new experimentation with naturally occurring events in everyday life. Our review summarizes these multisensory technologies and discusses initial insights in order to build a bridge between the disciplines and advance the study of multisensory integration.

    Social Touch

    Interpersonal or social touch is an intuitive and powerful way to express and communicate emotions, comfort a friend, bond with teammates, comfort a child in pain, and soothe someone who is stressed. If there is one thing that the current pandemic is showing us, it is that social distancing can make some people crave physical interaction through social touch. The notion of “skin-hunger” has become tangible for many. Social touch differs at a functional and anatomical level from discriminative touch, and has clear effects at physiological, emotional, and behavioural levels. Social touch is a topic in psychology (perception, emotion, behaviour), neuroscience (neurophysiological pathways), computer science (mediated touch communication), engineering (haptic devices), robotics (social robots that can touch), humanities (science and technology studies), and sociology (the social implications of touch). Our current scientific knowledge of social touch is scattered across disciplines and not yet adequate for the purpose of meeting today's challenges of connecting human beings through the mediating channel of technology.