Force-Aware Interface via Electromyography for Natural VR/AR Interaction
While tremendous advances in visual and auditory realism have been made for
virtual and augmented reality (VR/AR), introducing a plausible sense of
physicality into the virtual world remains challenging. Closing the gap between
real-world physicality and immersive virtual experience requires a closed
interaction loop: applying user-exerted physical forces to the virtual
environment and generating haptic sensations back to the users. However,
existing VR/AR solutions either completely ignore the force inputs from the
users or rely on obtrusive sensing devices that compromise user experience.
By identifying users' muscle activation patterns while engaging in VR/AR, we
design a learning-based neural interface for natural and intuitive force
inputs. Specifically, we show that lightweight electromyography sensors,
resting non-invasively on users' forearm skin, inform and establish a robust
understanding of their complex hand activities. Fuelled by a
neural-network-based model, our interface can decode finger-wise forces in
real-time with 3.3% mean error, and generalize to new users with little
calibration. Through an interactive psychophysical study, we show that human
perception of virtual objects' physical properties, such as stiffness, can be
significantly enhanced by our interface. We further demonstrate that our
interface enables ubiquitous control via finger tapping. Ultimately, we
envision our findings to push forward research towards more realistic
physicality in future VR/AR.
Comment: ACM Transactions on Graphics (SIGGRAPH Asia 2022)
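The paper's decoder is a learned neural network mapping forearm EMG to finger-wise forces. As an illustration only, the sketch below pairs a standard windowed-RMS EMG feature with a tiny untrained MLP; the window length, channel count, network size, and all names are assumptions, not the authors' implementation.

```python
import numpy as np

def rms_features(emg, win=200):
    """Windowed root-mean-square of each EMG channel.
    emg: (channels, samples) array; win: window length in samples."""
    n = emg.shape[1] // win
    windows = emg[:, :n * win].reshape(emg.shape[0], n, win)
    return np.sqrt((windows ** 2).mean(axis=2))  # (channels, n_windows)

class ForceDecoder:
    """Tiny MLP mapping 8 EMG features to 5 finger-wise forces
    (random, untrained weights; a placeholder for the learned model)."""
    def __init__(self, n_in=8, n_hidden=16, n_out=5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def __call__(self, x):
        h = np.tanh(self.W1 @ x + self.b1)   # hidden activation
        return self.W2 @ h + self.b2          # forces, arbitrary units

# Usage: decode one feature frame from simulated 8-channel EMG.
emg = np.random.default_rng(1).normal(size=(8, 2000))
feats = rms_features(emg)              # (8, 10) feature frames
forces = ForceDecoder()(feats[:, 0])   # 5 finger-force estimates
```

In the real system the weights would be fit to force-plate ground truth, with a short per-user calibration as the abstract describes.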
Toward Optimized VR/AR Ergonomics: Modeling and Predicting User Neck Muscle Contraction
Ergonomic efficiency is essential to the mass and prolonged adoption of VR/AR
experiences. While VR/AR head-mounted displays unlock users' natural wide-range
head movements during viewing, their neck muscle comfort is inevitably
compromised by the added hardware weight. Unfortunately, little quantitative
knowledge for understanding and addressing such an issue is available so far.
Leveraging electromyography devices, we measure, model, and predict VR users'
neck muscle contraction levels (MCL) while they move their heads to interact
with the virtual environment. Specifically, by learning from collected
physiological data, we develop a bio-physically inspired computational model to
predict neck MCL under diverse head kinematic states. Beyond quantifying the
cumulative MCL of completed head movements, our model can also predict
potential MCL requirements with target head poses only. A series of objective
evaluations and user studies demonstrate its prediction accuracy and
generality, as well as its ability in reducing users' neck discomfort by
optimizing the layout of visual targets. We hope this research will motivate
new ergonomic-centered designs for VR/AR and interactive graphics applications.
Source code is released at:
https://github.com/NYU-ICL/xr-ergonomics-neck-comfort
Comment: ACM SIGGRAPH 2023 Conference Proceedings
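The paper's model is learned from physiological data; purely as intuition for why added headset weight raises neck load, here is a naive static-torque proxy. All masses and lever arms are illustrative guesses, not values from the paper.

```python
import math

def neck_torque(pitch_deg, head_mass=4.5, hmd_mass=0.6,
                head_lever=0.10, hmd_lever=0.13, g=9.81):
    """Static gravitational torque (N*m) about the neck pivot for a
    forward head pitch. Masses (kg) and lever arms (m) are
    illustrative, not measured values."""
    s = math.sin(math.radians(pitch_deg))
    return g * s * (head_mass * head_lever + hmd_mass * hmd_lever)

# A heavier HMD raises the holding torque at any nonzero pitch:
base = neck_torque(30, hmd_mass=0.0)
with_hmd = neck_torque(30, hmd_mass=0.6)
```

Such a static model ignores dynamics and co-contraction, which is precisely why the paper learns MCL from EMG data rather than computing it analytically.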
Ono: an open platform for social robotics
In recent times, the focal point of research in robotics has shifted from industrial robots toward robots that interact with humans in an intuitive and safe manner. This evolution has resulted in the subfield of social robotics, which pertains to robots that function in a human environment and that can communicate with humans in an intuitive way, e.g. with facial expressions. Social robots have the potential to impact many different aspects of our lives, but one particularly promising application is the use of robots in therapy, such as the treatment of children with autism. Unfortunately, many of the existing social robots are suited neither for practical use in therapy nor for large-scale studies, mainly because they are expensive, one-of-a-kind robots that are hard to modify to suit a specific need. We created Ono, a social robotics platform, to tackle these issues. Ono is composed entirely of off-the-shelf components and cheap materials, and can be built at a local FabLab at a fraction of the cost of other robots. Ono is also entirely open source, and its modular design further encourages modification and reuse of parts of the platform.
Design and Implementation of Bio-inspired Underwater Electrosense
Underwater electrosense, the manipulation of underwater electric fields for sensing purposes, is a growing technology bio-inspired by weakly electric fish, which can navigate in dark or cluttered water. We studied its theoretical foundations and developed sophisticated sensing algorithms, including techniques introduced to this field for the first time such as the discrete dipole approximation (DDA) and convolutional neural networks (CNN), which were tested and validated in simulation and on a planar sensor prototype. This work paves the way for applications on practical underwater robots.
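The dipole-approximation idea can be sketched at its simplest: a small object in the emitted field acts as a single induced point dipole (the Rayleigh approximation), whose potential perturbs the sensor reading. This is textbook electrostatics, not the paper's DDA algorithm, and the field, contrast, and geometry values below are illustrative.

```python
import numpy as np

EPS0 = 8.85e-12  # vacuum permittivity, F/m

def induced_dipole(E0, radius, k):
    """Dipole moment induced in a small sphere by a uniform field E0.
    k: conductivity contrast of object vs. water (Rayleigh model)."""
    chi = (k - 1.0) / (k + 2.0)
    return 4 * np.pi * EPS0 * chi * radius ** 3 * np.asarray(E0)

def potential_perturbation(p, r_vec):
    """Dipole potential at displacement r_vec (m) from the object."""
    r = np.linalg.norm(r_vec)
    return float(p @ r_vec) / (4 * np.pi * EPS0 * r ** 3)

# A conductive target (k >> 1) strengthens the reading; an insulating
# one (k -> 0) weakens it, flipping the perturbation's sign.
E0 = np.array([1.0, 0.0, 0.0])       # V/m, ambient field
sensor = np.array([0.2, 0.0, 0.0])   # m, object-to-sensor vector
dphi_cond = potential_perturbation(induced_dipole(E0, 0.02, 1e6), sensor)
dphi_insu = potential_perturbation(induced_dipole(E0, 0.02, 0.0), sensor)
```

A full DDA would tile an extended object with many coupled dipoles and solve for them jointly; this single-dipole case is the limiting behaviour that makes conductive and insulating targets distinguishable.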
A survey on bio-signal analysis for human-robot interaction
The use of bio-signal analysis in human-robot interaction (HRI) is rapidly increasing, driven by urgent demand across health care, rehabilitation, research, technology, and manufacturing. Despite several state-of-the-art bio-signal analyses in HRI research, it remains unclear which approach is best. This paper discusses, first, why robotic systems should be prioritized for the rehabilitation and assistance of amputees and disabled people; second, the feature extraction approaches currently in use, grouped into three main domains (time, frequency, and time-frequency). Each domain is discussed in turn along with its benefits and drawbacks, and the paper concludes with a recommendation of a new strategy for robotic systems.
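The three feature-extraction domains the survey covers have standard minimal instances: mean absolute value and zero crossings (time), power-weighted mean frequency of the spectrum (frequency), and windowed spectral energy as a crude STFT (time-frequency). The sketch below shows one of each; the specific features and window length are common textbook choices, not the survey's taxonomy.

```python
import numpy as np

def time_features(x):
    """Time domain: mean absolute value and zero-crossing count."""
    mav = np.mean(np.abs(x))
    zc = np.count_nonzero(np.diff(np.sign(x)))
    return mav, zc

def mean_frequency(x, fs):
    """Frequency domain: power-weighted mean frequency (Hz)."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return (freqs * spec).sum() / spec.sum()

def stft_energy(x, win=64):
    """Time-frequency: per-window spectral magnitudes
    (non-overlapping rectangular windows; a crude STFT)."""
    n = len(x) // win
    frames = x[:n * win].reshape(n, win)
    return np.abs(np.fft.rfft(frames, axis=1))  # (n_windows, bins)

# Usage on a synthetic 50 Hz "muscle" oscillation sampled at 1 kHz:
fs = 1000
x = np.sin(2 * np.pi * 50 * np.arange(fs) / fs)
mav, zc = time_features(x)
mf = mean_frequency(x, fs)   # ~50 Hz for a pure 50 Hz tone
E = stft_energy(x)
```

Real EMG pipelines add overlap, tapered windows, and many more features, but each fits into one of these three domains.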