894 research outputs found

    Enabling audio-haptics

    This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics. Applications with haptic force-feedback…

    Haptic Experience and the Design of Drawing Interfaces

    Haptic feedback has the potential to enhance users’ sense of being engaged and creative in their artwork. Current work on providing haptic feedback in computer-based drawing applications has focused mainly on the realism of the haptic sensation rather than the users’ experience of that sensation in the context of their creative work. We present a study that focuses on user experience of three haptic drawing interfaces. These interfaces were based on two different haptic metaphors, one of which mimicked familiar drawing tools (such as pen, pencil or crayon on smooth or rough paper) and the other of which drew on abstract descriptors of haptic experience (roughness, stickiness, scratchiness and smoothness). It was found that users valued having control over the haptic sensation; that each metaphor was preferred by approximately half of the participants; and that the real-world metaphor interface was considered more helpful than the abstract one, whereas the abstract interface was considered to better support creativity. This suggests that future interfaces for artistic work should have user-modifiable interaction styles for controlling the haptic sensation.

    The digitally 'Hand Made' object

    This article will outline the author’s investigations of types of computer interfaces in practical three-dimensional design practice. The paper contains a description of two main projects in glass and ceramic tableware design, using a Microscribe G2L digitising arm as an interface to record three-dimensional spatial design input. The article will provide critical reflections on the results of the investigations and will argue that new approaches in digital design interfaces could have relevance in developing design methods which incorporate more physical ‘human’ expressions in a three-dimensional design practice. The research builds on concepts identified in traditional craft practice as foundations for constructing new types of creative practices based on the use of digital technologies, as outlined by McCullough (1996).

    Biosensing and Actuation—Platforms Coupling Body Input-Output Modalities for Affective Technologies

    Research in the use of ubiquitous technologies, tracking systems and wearables within mental health domains is on the rise. In recent years, affective technologies have gained traction and garnered the interest of interdisciplinary fields as research on such technologies has matured. However, while the role of movement and bodily experience in affective experience is well established, it has been unclear how best to address movement and engagement beyond measuring cues and signals in technology-driven interactions. In a joint industry-academia effort, we aim to remodel how affective technologies can help address body and emotional self-awareness. We present an overview of biosignals that have become standard in low-cost physiological monitoring and show how these can be matched with methods and engagements used by interaction designers skilled in designing for bodily engagement and aesthetic experiences. Taking both strands of work together offers unprecedented design opportunities that inspire further research. Through first-person soma design, an approach that draws upon the designer’s felt experience and puts the sentient body at the forefront, we outline a comprehensive programme for the creation of novel interactions in the form of couplings that combine biosensing and body feedback modalities of relevance to affective health. These couplings lie within the creation of design toolkits that have the potential to render rich embodied interactions to the designer/user. As a result, we introduce the concept of “orchestration”. By orchestration, we refer to the design of the overall interaction: coupling sensors to actuation of relevance to the affective experience; initiating and closing the interaction; habituating; helping improve the users’ body awareness and engagement with emotional experiences; and soothing, calming, or energising, depending on the affective health condition and the intentions of the designer. Through the creation of a range of prototypes and couplings, we elicited requirements for broader orchestration mechanisms. First-person soma design lets researchers look afresh at biosignals that, when experienced through the body, can reshape affective technologies with novel ways to interpret biodata, feel it, understand it and reflect upon our bodies.
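
    As a rough illustration of the kind of sensor-to-actuation coupling and orchestration described above, the sketch below simulates one minimal coupling in Python: a breathing-rate biosignal mapped to vibration feedback, with the interaction started, habituated and closed over a short session. The signal, mapping and thresholds are illustrative assumptions added for this listing, not code from the paper or any associated toolkit.

        # Hypothetical sketch of a single "coupling": a simulated breathing-rate
        # biosignal driving a vibration actuator, with simple orchestration
        # (start the interaction, habituate the feedback, close it). All names
        # and values are illustrative, not taken from the paper or a toolkit.
        import math
        import random
        import time

        def read_breathing_rate(t: float) -> float:
            """Simulate a breathing-rate signal in breaths per minute."""
            return 14 + 3 * math.sin(t / 10) + random.uniform(-0.5, 0.5)

        def actuate_vibration(intensity: float) -> None:
            """Stand-in for driving a haptic actuator at intensity 0.0-1.0."""
            print(f"vibration intensity: {intensity:.2f}")

        def orchestrate(duration_s: float = 6.0, step_s: float = 1.0) -> None:
            """Run one coupling session: map the biosignal to feedback, fade out, stop."""
            start = time.time()
            while (elapsed := time.time() - start) < duration_s:
                bpm = read_breathing_rate(elapsed)
                # Map faster breathing to stronger feedback; a calming design might invert this.
                intensity = min(max((bpm - 10) / 10, 0.0), 1.0)
                # Habituation: gently fade the feedback over the session.
                intensity *= max(0.0, 1.0 - elapsed / duration_s)
                actuate_vibration(intensity)
                time.sleep(step_s)
            actuate_vibration(0.0)  # close the interaction

        if __name__ == "__main__":
            orchestrate()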

    Physical contraptions as social interaction catalysts


    I’m sensing in the rain: spatial incongruity in visual-tactile mid-air stimulation can elicit ownership in VR users

    Major virtual reality (VR) companies are trying to enhance the sense of immersion in virtual environments by implementing haptic feedback in their systems (e.g., Oculus Touch). It is known that tactile stimulation adds realism to a virtual environment. In addition, when users are not limited by wearing any attachments (e.g., gloves), it is possible to create even more immersive experiences. Mid-air haptic technology provides contactless haptic feedback and offers the potential for creating such immersive VR experiences. However, one of the limitations of mid-air haptics resides in the need for freehand tracking systems (e.g., Leap Motion) to deliver tactile feedback to the user's hand. These tracking systems are not accurate, limiting designers' ability to deliver spatially precise tactile stimulation. Here, we investigated an alternative way to convey incongruent visual-tactile stimulation that can be used to create the illusion of a congruent visual-tactile experience, while participants experience the phenomenon of the rubber hand illusion in VR.

    Semefulness: A social semiotics of touch

    This paper explores the multiple significances (semefulness) of touch, as experienced by us as embodied subjects. Prompted by the development of a range of touch-based technologies, I consider the current writings about touch in a range of fields and how these have contributed to contemporary understandings of the meanings of touch. I then explore a number of these meanings - connection, engagement, contiguity, differentiation, positioning - for their contribution to our understanding of the world and of our own embodied subjectivity. I also explore the deployment of these meanings by contemporary technologies. © 2011 Taylor & Francis
