Levitating Particle Displays with Interactive Voxels
Levitating objects can be used as the primitives in a new type of display. We present levitating particle displays and show how research into object levitation is enabling a new way of presenting and interacting with information. We identify novel properties of levitating particle displays and give examples of the interaction techniques and applications they allow. We then discuss design challenges for these displays, potential solutions, and promising areas for future research.
Trajectory Deformations from Physical Human-Robot Interaction
Robots are finding new applications where physical interaction with a human is necessary: manufacturing, healthcare, and social tasks. Accordingly, the field of physical human-robot interaction (pHRI) has leveraged impedance control approaches, which support compliant interactions between human and robot. However, a limitation of traditional impedance control is that, despite provisions for the human to modify the robot's current trajectory, the human cannot affect the robot's future desired trajectory through pHRI. In this paper, we present an algorithm for physically interactive trajectory deformations which, when combined with impedance control, allows the human to modulate both the actual and desired trajectories of the robot. Unlike related works, our method explicitly deforms the future desired trajectory based on forces applied during pHRI, but does not require constant human guidance. We present our approach and verify that this method is compatible with traditional impedance control. Next, we use constrained optimization to derive the deformation shape. Finally, we describe an algorithm for real-time implementation, and perform simulations to test the arbitration parameters. Experimental results demonstrate a reduction in the human's effort and an improvement in movement quality when compared to pHRI with impedance control alone.
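The abstract describes an algorithmic pipeline: deform the future desired trajectory in response to applied forces, with the deformation shape derived from constrained optimization and blended via arbitration parameters. The sketch below (Python/NumPy) illustrates the general idea under stated assumptions: the jerk-style finite-difference smoothness matrix, the scalar arbitration parameter `mu`, and the endpoint pinning are illustrative choices, not the authors' exact formulation.

```python
import numpy as np

def deform_trajectory(waypoints, force, mu=0.1):
    """Smoothly deform future desired waypoints toward a sensed human force.

    Sketch of a physically interactive trajectory deformation:
    `waypoints` is an (N, d) array of the robot's future desired
    trajectory, `force` is the (d,) human force sensed during pHRI,
    and `mu` is a scalar arbitration parameter trading robot autonomy
    against human influence (all illustrative assumptions).
    """
    n = len(waypoints)
    # Finite-difference matrix whose product with a deformation
    # approximates its jerk; small ||R @ delta|| means a smooth delta.
    R = np.zeros((n + 3, n))
    stencil = np.array([1.0, -3.0, 3.0, -1.0])
    for i in range(n):
        R[i:i + 4, i] = stencil
    A = R.T @ R
    # Solve for a smooth bump shape, then pin its endpoints so the
    # deformation vanishes at the boundary of the deformed window.
    shape = np.linalg.solve(A, np.ones(n))
    shape[0] = shape[-1] = 0.0
    shape /= np.abs(shape).max()
    # Shift each future waypoint along the force direction, scaled by
    # the bump shape and the arbitration parameter.
    return waypoints + mu * np.outer(shape, force)

# Example: nudge a straight-line plan sideways after a lateral push.
plan = np.stack([np.linspace(0.0, 1.0, 20), np.zeros(20)], axis=1)
pushed = deform_trajectory(plan, force=np.array([0.0, 5.0]), mu=0.02)
```

Raising `mu` hands more authority to the human; lowering it keeps the robot closer to its original plan, which is the trade the arbitration parameters in the abstract are testing.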
Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning
We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback (haptics) to provide cues and constraints that guide the user's interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used in specifying more complex, highly segmented (e.g. lawnmower survey) paths.
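For readers unfamiliar with the survey-path terminology, the sketch below generates the waypoint structure of a lawnmower pattern. It is a minimal, hypothetical helper: the function name, parameters, and geometry are assumptions for illustration; in Haptic-GeoZui3D such paths are laid out interactively with haptic cues rather than generated programmatically.

```python
import numpy as np

def lawnmower_waypoints(origin, width, height, spacing):
    """Generate waypoints for a boustrophedon ('lawnmower') survey
    over a rectangular area.

    origin  : (x, y) corner of the survey rectangle
    width   : extent along x (the transit direction of each line)
    height  : extent along y (the direction lines step across)
    spacing : distance between adjacent survey lines
    """
    x0, y0 = origin
    n_lines = int(height // spacing) + 1
    waypoints = []
    for i in range(n_lines):
        y = y0 + i * spacing
        # Alternate transit direction on each line to avoid dead-heading.
        x_start, x_end = (x0, x0 + width) if i % 2 == 0 else (x0 + width, x0)
        waypoints.append((x_start, y))
        waypoints.append((x_end, y))
    return np.array(waypoints)

# Example: a 100 m x 40 m survey area with 10 m line spacing.
path = lawnmower_waypoints(origin=(0.0, 0.0), width=100.0,
                           height=40.0, spacing=10.0)
```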
Towards a multi-layer architecture for multi-modal rendering of expressive actions
Expressive content has multiple facets that can be conveyed by music, gestures, and actions. Different application scenarios can require different metaphors for controlling expressiveness. To meet the requirements for flexible representation, we propose a multi-layer architecture structured into three main levels of abstraction. At the top (user level) is a semantic description, adapted to specific user requirements and conceptualizations. At the other end are low-level features that describe parameters strictly tied to the rendering model. Between these two extremes, we propose an intermediate layer that provides a description shared by the various high-level representations on one side and that can be instantiated to the various low-level rendering models on the other. To provide a common representation of different expressive semantics and different modalities, we propose a physically-inspired description specifically suited for expressive actions.
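The three-level abstraction lends itself to a small structural sketch. The Python below is a minimal, hypothetical rendering of the idea: the class names, the particular physical descriptors (energy, smoothness, mass), and the mapping values are assumptions for illustration, not the paper's actual parameter set.

```python
from dataclasses import dataclass

# User level: a semantic description adapted to the user's vocabulary.
@dataclass
class SemanticIntent:
    label: str  # e.g. "hesitant", "energetic"

# Intermediate level: a physically-inspired description shared across
# high-level semantics and instantiable by different renderers.
@dataclass
class PhysicalDescription:
    energy: float      # overall activation of the action
    smoothness: float  # continuity of the movement or phrase
    mass: float        # perceived inertia or weight

# Low level: parameters strictly tied to one rendering model (audio here).
@dataclass
class AudioParams:
    tempo_scale: float
    articulation: float

# Hypothetical authored mapping; a real system might learn or tune these.
SEMANTIC_TO_PHYSICAL = {
    "hesitant":  PhysicalDescription(energy=0.2, smoothness=0.4, mass=1.5),
    "energetic": PhysicalDescription(energy=0.9, smoothness=0.7, mass=0.5),
}

def to_audio(phys: PhysicalDescription) -> AudioParams:
    # Instantiate the shared description for an audio renderer; a gesture
    # or animation renderer would define its own instantiation.
    return AudioParams(tempo_scale=0.5 + phys.energy,
                       articulation=phys.smoothness)

intent = SemanticIntent("energetic")
params = to_audio(SEMANTIC_TO_PHYSICAL[intent.label])
```

The point of the middle layer is that the `SEMANTIC_TO_PHYSICAL` table can be swapped per user or application while every renderer keeps consuming the same shared description.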
Trends in virtual reality technologies for the learning patient
NextMed convened the Medicine Meets Virtual Reality 22 (MMVR 22) conference in 2016. Since 1992, the conference has brought together a diverse group of researchers to share creative solutions for the evolving challenge of integrating virtual reality tools into medical education. Virtual reality (VR) and its enabling technologies use hardware and software to simulate environments and encounters in which users can interact and learn. The MMVR 22 symposium proceedings contain projects that support a variety of learners: medical students, practitioners, soldiers, and patients. This report considers trends in virtual reality technologies for patients navigating their medical and healthcare learning. The learning patient seeks more than intervention; they seek prevention. From virtual humans and environments to motion sensors and haptic devices, patients are surrounded by increasingly rich and transformative data-driven tools. Applied data enables VR applications to simulate experience, predict health outcomes, and motivate new behavior. The MMVR 22 proceedings present investigations into the usability of wearable devices, the efficacy of avatar inclusion, and the viability of multi-player gaming. With an increasing need for individualized and scalable programming, only committed open-source efforts will align instructional designers, technology integrators, trainers, and clinicians.