363 research outputs found

    Learning to Navigate Cloth using Haptics

    Full text link
    We present a controller that allows an arm-like manipulator to navigate deformable cloth garments in simulation through the use of haptic information. The main challenge for such a controller is to avoid getting tangled in, tearing, or punching through the deforming cloth. Our controller aggregates force information from a number of haptic-sensing spheres placed all along the manipulator for guidance. Based on haptic forces, each individual sphere updates its target location, and the conflicts that arise between this set of desired positions are resolved by solving an inverse kinematics problem with constraints. Reinforcement learning is used to train the controller for a single haptic-sensing sphere, where a training run is terminated (and thus penalized) when large forces are detected due to contact between the sphere and a simplified model of the cloth. In simulation, we demonstrate successful navigation of a robotic arm through a variety of garments, including an isolated sleeve, a jacket, a shirt, and shorts. Our controller outperforms two baseline controllers: one without haptics and another that was trained on large forces between the sphere and cloth, but without early termination. Comment: Supplementary video available at https://youtu.be/iHqwZPKVd4A. Related publications: http://www.cc.gatech.edu/~karenliu/Robotic_dressing.htm
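    The per-sphere target update and constrained-IK reconciliation described above can be sketched as follows. This is an illustrative assumption of the scheme, not the authors' implementation: the gain, the damping term, and the toy Jacobian are all made up, and the constrained IK is simplified to a damped least-squares solve.

```python
import numpy as np

def update_targets(positions, forces, gain=0.01):
    """Shift each sphere's target location away from its sensed
    contact force (illustrative gain)."""
    return positions - gain * forces

def resolve_ik(jacobian, delta_targets, damping=1e-2):
    """Damped least-squares joint update that trades off the spheres'
    conflicting desired displacements (stand-in for constrained IK)."""
    J = jacobian
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + damping * np.eye(n), J.T @ delta_targets)

positions = np.zeros((3, 3))            # three sensing spheres along the arm
forces = np.array([[0.0, 2.0, 0.0],     # cloth pushing sphere 0 in +y
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
targets = update_targets(positions, forces)   # sphere 0 retreats along -y
J = np.random.default_rng(0).standard_normal((9, 4))  # toy 4-DoF Jacobian
dq = resolve_ik(J, (targets - positions).ravel())     # one joint-space step
```

    A single least-squares solve stands in here for the paper's constraint handling; the point is only that all spheres' desired displacements are stacked into one task vector and resolved jointly.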

    Manipulating Highly Deformable Materials Using a Visual Feedback Dictionary

    Full text link
    The complex physical properties of highly deformable materials such as clothes pose significant challenges for autonomous robotic manipulation systems. We present a novel visual feedback dictionary-based method for manipulating deformable objects towards a desired configuration. Our approach is based on visual servoing, and we use an efficient technique to extract key features from the RGB sensor stream in the form of a histogram of deformable model features. These histogram features serve as high-level representations of the state of the deformable material. Next, we collect manipulation data and use a visual feedback dictionary that maps the velocity in the high-dimensional feature space to the velocity of the robotic end-effectors for manipulation. We have evaluated our approach on a set of complex manipulation tasks and human-robot manipulation tasks on different cloth pieces with varying material characteristics. Comment: The video is available at goo.gl/mDSC4
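    The dictionary idea above — pairing recorded feature-space velocities with the end-effector velocities that produced them, then looking up the closest match at run time — can be sketched as below. The nearest-neighbour lookup and all values are illustrative assumptions; the paper's actual feature extraction and interpolation are not reproduced.

```python
import numpy as np

def build_dictionary(feature_vels, ee_vels):
    """Pair each recorded feature-space velocity with the
    end-effector velocity that produced it."""
    return list(zip(feature_vels, ee_vels))

def lookup(dictionary, query):
    """Return the end-effector velocity whose stored feature-space
    velocity is closest to the queried one."""
    return min(dictionary, key=lambda pair: np.linalg.norm(pair[0] - query))[1]

# Toy dictionary: 2-D feature velocities mapped to 3-D end-effector velocities.
feat = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
ee = [np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])]
d = build_dictionary(feat, ee)
cmd = lookup(d, np.array([0.9, 0.1]))   # closest to the first entry
```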

    Incorporating haptic features into physics-based simulation

    Get PDF
    In our graphics lab, we have developed many physics-based animations focusing on muscles, and we hope to create an interactive interface with tactile feedback so that users can not only see those physical features but also experience the forces in the muscle line. They will be able to touch the surface of the muscles and feel the muscle texture, and they will also be able to drag the muscle line and feel the tension and forces. This is especially important for the co-contraction of two opposing muscles, since co-contractions do not produce any motion but change the stiffness of the joint. We therefore used the Geomagic Touch (tm) haptic device to generate the haptic feedback and OpenHaptics for haptic programming
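    The kind of force feedback described — feeling a surface and its stiffness through the device — is typically rendered with a penalty-based contact force, sketched below. The flat surface, the stiffness value, and the function names are illustrative assumptions; the actual work uses OpenHaptics on a Geomagic Touch, which is not reproduced here.

```python
def contact_force(probe_pos, surface_z=0.0, stiffness=400.0):
    """Penalty-based haptic rendering: when the probe penetrates the
    (assumed flat) surface, push it back out along +z with a spring
    force proportional to penetration depth."""
    depth = surface_z - probe_pos[2]
    if depth <= 0.0:
        return (0.0, 0.0, 0.0)       # no contact, no force
    return (0.0, 0.0, stiffness * depth)

f_free = contact_force((0.0, 0.0, 0.5))     # probe above the surface
f_touch = contact_force((0.0, 0.0, -0.01))  # probe 1 cm below the surface
```

    Increasing the stiffness constant is how a stiffer joint (as in co-contraction) would be conveyed through the device.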

    Sensory Substitution, Key to Inclusive Learning

    Get PDF
    Visually impaired students in primary education encounter unique challenges while learning creative skills, exploring artistic expression and developing problem-solving skills, because so much instructional content is delivered visually. Sensory substitution—an approach that replaces visual information with feedback from other intact senses like touch, sound, taste or smell—provides an opportunity to address those challenges. Through the use of sensory substitution, this thesis proposes concrete ways to capitalize on the enhanced abilities of visually impaired primary school students. The research outcome of this thesis is a system of templates that puts these enhanced abilities to work for visually impaired students, to support them while they learn creative skills and practice problem-solving in a classroom setting. Each template contains a lesson that can be learned by using the process of paper quilling. The templates work equally well for sighted and visually impaired students, since all will be able to understand the lesson by using the sense of touch, as they learn by making

    Testing a Shape-Changing Haptic Navigation Device With Vision-Impaired and Sighted Audiences in an Immersive Theater Setting

    Get PDF
    Flatland was an immersive “in-the-wild” experimental theater and technology project, undertaken with the goal of developing systems that could assist “real-world” pedestrian navigation for both vision-impaired (VI) and sighted individuals, while also exploring inclusive and equivalent cultural experiences for VI and sighted audiences. A novel shape-changing handheld haptic navigation device, the “Animotus,” was developed. The device has the ability to modify its form in the user's grasp to communicate heading and proximity to navigational targets. Flatland provided a unique opportunity to comparatively study the use of novel navigation devices with a large group of individuals (79 sighted, 15 VI) who were primarily attending a theater production rather than an experimental study. In this paper, we present our findings on comparing the navigation performance (measured in terms of efficiency, average pace, and time facing targets) and opinions of VI and sighted users of the Animotus as they negotiated the 112 m² production environment. Differences in navigation performance were nonsignificant across VI and sighted individuals, and a similar range of opinions on device function and engagement spanned both groups. We believe that more structured device familiarization, particularly for VI users, could improve performance and correct mistaken technology expectations (such as an assumed obstacle-avoidance capability), which influenced overall opinion. This paper is intended to aid the development of future inclusive technologies and cultural experiences

    Wearable performance

    Get PDF
    This is the post-print version of the article. The official published version can be accessed from the link below - Copyright @ 2009 Taylor & Francis. Wearable computing devices worn on the body provide the potential for digital interaction in the world. A new stage of computing technology at the beginning of the 21st Century links the personal and the pervasive through mobile wearables. The convergence between the miniaturisation of microchips (nanotechnology), intelligent textile or interfacial materials production, advances in biotechnology and the growth of wireless, ubiquitous computing emphasises not only mobility but integration into clothing or the human body. In artistic contexts one expects such integrated wearable devices to have the two-way function of interface instruments (e.g. sensor data acquisition and exchange) worn for particular purposes, either for communication with the environment or various aesthetic and compositional expressions. 'Wearable performance' briefly surveys the context for wearables in the performance arts and distinguishes display and performative/interfacial garments. It then focuses on the authors' experiments with 'design in motion' and digital performance, examining prototyping at the DAP-Lab which involves transdisciplinary convergences between fashion and dance, interactive system architecture, electronic textiles, wearable technologies and digital animation. The concept of an 'evolving' garment design that is materialised (mobilised) in live performance between partners originates from DAP Lab's work with telepresence and distributed media addressing the 'connective tissues' and 'wearabilities' of projected bodies through a study of shared embodiment and perception/proprioception in the wearer (tactile sensory processing). Such notions of wearability are applied both to the immediate sensory processing on the performer's body and to the processing of the responsive, animate environment

    Physically Interacting With Four Dimensions

    Get PDF
    Thesis (Ph.D.) - Indiana University, Computer Sciences, 2009. People have long been fascinated with understanding the fourth dimension. While making pictures of 4D objects by projecting them to 3D can help reveal basic geometric features, 3D graphics images by themselves are of limited value. For example, just as 2D shadows of 3D curves may have lines crossing one another in the shadow, 3D graphics projections of smooth 4D topological surfaces can be interrupted where one surface intersects another. The research presented here creates physically realistic models for simple interactions with objects and materials in a virtual 4D world. We provide methods for the construction, multimodal exploration, and interactive manipulation of a wide variety of 4D objects. One basic achievement of this research is to exploit the free motion of a computer-based haptic probe to support a continuous motion that follows the 'local continuity' of a 4D surface, allowing collision-free exploration in the 3D projection. In 3D, this interactive probe follows the full local continuity of the surface as though we were in fact 'physically touching' the actual static 4D object. Our next contribution is to support dynamic 4D objects that can move, deform, and collide with other objects as well as with themselves. By combining graphics, haptics, and collision-sensing physical modeling, we can thus enhance our 4D visualization experience. Since we cannot actually place interaction devices in 4D, we develop fluid methods for interacting with a 4D object in its 3D shadow image using adapted reduced-dimension 3D tools for manipulating objects embedded in 4D. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D interactive or haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the real-world experience accessible to human beings
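    The "shadow" images of 4D objects mentioned above come from projecting 4D points down to 3D, by direct analogy with projecting 3D points to a 2D image. A minimal perspective projection along the w axis can be sketched as follows; the eye distance and the choice of projection axis are illustrative assumptions, not the thesis's actual rendering pipeline.

```python
import numpy as np

def project_4d_to_3d(p, eye_w=3.0):
    """Perspective-project a 4D point to 3D by dividing out its
    distance from an assumed eye point on the w axis, just as a
    pinhole camera divides out depth when projecting 3D to 2D."""
    x, y, z, w = p
    scale = eye_w / (eye_w - w)
    return np.array([x, y, z]) * scale

flat = project_4d_to_3d((1.0, 2.0, 0.0, 0.0))   # w = 0: no foreshortening
near = project_4d_to_3d((1.0, 0.0, 0.0, 1.5))   # closer to the eye: magnified
```

    Points at w = 0 project unchanged, while points nearer the 4D eye appear enlarged, which is why distinct 4D surfaces can overlap and appear to intersect in the 3D shadow.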

    Haptics: Science, Technology, Applications

    Get PDF
    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility