From presence to consciousness through virtual reality
Immersive virtual environments can break the deep, everyday connection between where our senses tell us we are and where we are actually located and whom we are with. The concept of 'presence' refers to the phenomenon of behaving and feeling as if we are in the virtual world created by computer displays. In this article, we argue that presence is worthy of study by neuroscientists, and that it might aid the study of perception and consciousness
An aesthetics of touch: investigating the language of design relating to form
How well can designers communicate qualities of touch?
This paper presents evidence that they have some capability to do so, much of which appears to have been learned, but that they currently make limited use of such language. Interviews with graduate designer-makers suggest that they are aware of and value the importance of touch and materiality in their work, but lack a vocabulary that matches the detail of their explanations of other aspects, such as their intent or selection of materials. We believe that more attention should be paid to the verbal dialogue that happens in the design process, particularly as other researchers show that making-based learning also has a strong verbal element. However, verbal language alone does not appear to be adequate for a comprehensive language of touch. Graduate designer-makers' descriptive practices combined non-verbal manipulation with verbal accounts. We thus argue that haptic vocabularies do not simply describe material qualities, but rather are situated competences that physically demonstrate the presence of haptic qualities. Such competences are more important than verbal vocabularies in isolation. Design support for developing and extending haptic competences must take this wide range of considerations into account to comprehensively improve designers' capabilities
Silent Light, Luminous Noise: Photophonics, Machines and the Senses
This research takes the basic physical premise that sound can be synthesized using light, explores how this has historically been, and still is achieved, and how it can still be a fertile area for creative, theoretical and critical exploration in sound and the arts. Through the author's own artistic practice, different techniques of generating sound using the sonification of light are explored, and these techniques are then contextualised by their historical and theoretical setting in the time-based arts. Specifically, this text draws together diverse strands of scholarship on experimental sound and film practices, cultural histories, the senses, media theory and engineering to address effects and outcomes specific to photophonic sound and its relation to the moving image, and the sculptural and media works devised to produce it.
The sonifier, the device that effects the transformations discussed, is specifically addressed in its many forms, and a model is proposed whereby these devices and systems are an integral, readably inscribed component, both materially and culturally, of the works they produce and, via our reflexive understanding of the processes involved, of the images or light signals used to produce them. Other practitioners' works are critically engaged to demonstrate how a sense of touch, or the haptic, can be thought of as an emergent property of moving-image works which readably and structurally make use of photophonic sound (including the author's), and sound's essential role in this is examined.
In developing, through an integration of theory and practice, a new approach in this under-researched field of sound studies, the author hopes to show how photophonic sound can act as both a metaphorical and material interface between experimental sound and image, and to point the way towards a more comprehensive study of both
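The basic physical premise above, that sound can be synthesized from light, can be sketched in a few lines of code (this is an illustration, not a method from the thesis; the function name and parameters are hypothetical): treat a sequence of brightness readings as the amplitude envelope of an audio carrier tone.

```python
import numpy as np

def sonify_brightness(brightness, sample_rate=44100, duration=0.5, base_freq=440.0):
    """Map a sequence of light-intensity readings (0..1) to an
    amplitude-modulated sine tone, a crude analogue of photophonic
    sound generation."""
    brightness = np.asarray(brightness, dtype=float)
    n = int(sample_rate * duration)
    t = np.arange(n) / sample_rate
    # Interpolate the intensity readings across the whole tone so the
    # light signal becomes the amplitude envelope of the carrier.
    envelope = np.interp(np.linspace(0, len(brightness) - 1, n),
                         np.arange(len(brightness)), brightness)
    return envelope * np.sin(2 * np.pi * base_freq * t)

# A light signal that rises and falls produces a tone that swells and fades.
wave = sonify_brightness([0.0, 0.5, 1.0, 0.5, 0.0])
```

Historical photophonic instruments did this optically and electrically rather than digitally, but the mapping from light intensity to sound amplitude is the same in spirit.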
Haptics Rendering and Applications
There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce and science, would change forever if we learned how to capture, manipulate and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information through a physically based language that has never been explored before. Owing to constant improvement in haptics technology and increasing levels of research into and development of haptics-related algorithms, protocols and devices, haptics technology is believed to have a promising future
Towards new modes of collective musical expression through audio augmented reality
We investigate how audio augmented reality can engender new collective modes of musical expression in the context of a sound art installation, Listening Mirrors, exploring the creation of interactive sound environments for musicians and non-musicians alike. Listening Mirrors is designed to incorporate physical objects and computational systems for altering the acoustic environment, to enhance collective listening and challenge traditional musician-instrument performance. At a formative stage in exploring audio AR technology, we conducted an audience experience study investigating questions around the potential of audio AR in creating sound installation environments for collective musical expression.
We collected interview evidence about the participants' experience and analysed the data using a grounded theory approach. The results demonstrated that the technology has the potential to create immersive spaces where an audience can feel safe to experiment musically, and showed how AR can intervene in sound perception to instrumentalise an environment. The results also revealed caveats about the use of audio AR, mainly centred on social inhibition and seamlessness of experience, and on finding a balance between mediated worlds to create space for interplay between the two
Biosensing and Actuation: Platforms Coupling Body Input-Output Modalities for Affective Technologies
Research in the use of ubiquitous technologies, tracking systems and wearables within mental health domains is on the rise. In recent years, affective technologies have gained traction and garnered the interest of interdisciplinary fields as the research on such technologies matured. However, while the role of movement and bodily experience in affective experience is well established, how best to address movement and engagement beyond measuring cues and signals in technology-driven interactions has been unclear. In a joint industry-academia effort, we aim to remodel how affective technologies can help address body and emotional self-awareness. We present an overview of biosignals that have become standard in low-cost physiological monitoring and show how these can be matched with methods and engagements used by interaction designers skilled in designing for bodily engagement and aesthetic experiences. Taking both strands of work together offers unprecedented design opportunities that inspire further research. Through first-person soma design, an approach that draws upon the designer's felt experience and puts the sentient body at the forefront, we outline a comprehensive framework for the creation of novel interactions in the form of couplings that combine biosensing and body-feedback modalities of relevance to affective health. These couplings lie within the creation of design toolkits that have the potential to render rich embodied interactions to the designer/user. As a result, we introduce the concept of 'orchestration'. By orchestration, we refer to the design of the overall interaction: coupling sensors to actuation of relevance to the affective experience; initiating and closing the interaction; habituating; helping improve the user's body awareness and engagement with emotional experiences; and soothing, calming, or energising, depending on the affective health condition and the intentions of the designer. Through the creation of a range of prototypes and couplings we elicited requirements for broader orchestration mechanisms. First-person soma design lets researchers look afresh at biosignals that, when experienced through the body, can reshape affective technologies with novel ways to interpret biodata, feel it, understand it and reflect upon our bodies
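The idea of a sensing-to-actuation coupling with habituation can be sketched as follows (a minimal illustration under assumed parameters, not a toolkit from the paper; the class name, baseline and gain values are hypothetical): a smoothed biosignal drives an actuator level, and a habituation term gradually fades the feedback so it does not dominate the wearer's attention.

```python
class Orchestration:
    """Minimal sketch of a biosensing->actuation coupling: deviation of a
    biosignal from a resting baseline drives an actuator level, while a
    habituation factor slowly fades the feedback over a session."""

    def __init__(self, baseline=60.0, gain=0.02, habituation=0.995):
        self.baseline = baseline      # e.g. resting heart rate (bpm), assumed
        self.gain = gain              # bpm deviation -> actuator level
        self.habituation = habituation
        self.strength = 1.0           # fades toward 0 as the session runs

    def step(self, heart_rate):
        """Return an actuator level in [0, 1] for one sensor reading."""
        level = self.gain * abs(heart_rate - self.baseline) * self.strength
        self.strength *= self.habituation
        return max(0.0, min(1.0, level))
```

A real orchestration would also handle initiating and closing the interaction and selecting soothing versus energising mappings; this sketch only shows the core sensor-to-actuator coupling and habituation.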
A Device for Mimicking the Contact Force/Contact Area Relationship of Different Materials with Applications to Softness Rendering
In this paper a fabric yielding softness display (FYD-2) is proposed, in which the stretching state of the fabric is controlled using two motors while the contact area is measured in real time. In previous work, the authors proposed a fabric-based device with an embedded contact-area measurement system, which was shown to provide subjects with a compelling and naturalistic softness perception. Compared to that device, FYD-2 exhibits reduced dimensions, a more accurate sensorisation scheme and an increased actuation velocity, which allows fast changes in the stretching state to be implemented. Such changes are necessary, for example, to properly track the typically quadratic force/area curves of real materials. Furthermore, FYD-2 is endowed with an additional degree of freedom that can be used to convey supplementary haptic cues, such as directional cues, which can be exploited to produce more immersive haptic interactions. In this work we describe the mechanical design and the mathematical model of the device. Reliable real-time tracking of the stiffness and force-area curves of real objects is also demonstrated
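The quadratic force/area curves mentioned above can be illustrated with a small fitting sketch (the calibration numbers below are hypothetical, not measurements from the paper): fit F(a) = c2*a^2 + c1*a + c0 to sampled area/force pairs, then use the fitted curve as the force target the display should render at a measured contact area.

```python
import numpy as np

# Hypothetical calibration samples for a soft material: contact area (cm^2)
# versus contact force (N). A real FYD-2 curve would come from measurements.
area = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
force = np.array([0.0, 0.3, 1.2, 2.7, 4.8])

# Fit the quadratic force/area relation F(a) = c2*a^2 + c1*a + c0 that the
# display must track when commanding the fabric stretching state.
c2, c1, c0 = np.polyfit(area, force, deg=2)

def target_force(a):
    """Force the display should render for a measured contact area a."""
    return c2 * a**2 + c1 * a + c0
```

In a control loop, the measured contact area would be fed through `target_force` at each cycle, and the motors commanded so the rendered force follows the curve.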
Enabling audio-haptics
This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics. Applications with haptic force-feedback, s…
Da Vinci robot at Hospital Clinic. Manoeuvrability devices and performance in robotic tech
Bachelor's final project in Biomedical Engineering. Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona. Academic year: 2020-2021. Supervisor: Manel Puig Vidal. Robot-assisted surgical systems are becoming increasingly common in medical procedures as they offer many of the benefits of minimally invasive surgery, including less trauma, shorter recovery time and lower financial costs associated with post-surgical treatment. These robotic systems allow surgeons to navigate within confined spaces where an operator's hand would normally be greatly limited. This dexterity is further strengthened through motion scaling, which translates large motions by the operator into diminutive actions of the robotic end effector. An example of this is the Da Vinci System, which is coupled to the EndoWrist end-effector tool.
Nevertheless, these systems also have drawbacks, such as the high cost of the surgery itself and the lack of tactile or haptic feedback. Because the surgeon performs the procedure from outside the patient's body, he or she cannot feel the resistance of the human tissue when cutting. The surgeon therefore risks damaging healthy tissue if force is not controlled or, when sewing, may exert excessive force and break the thread.
In this project, a new system is created based on the UR5 robot (Universal Robots) and an EndoWrist needle to mimic the behaviour of the Da Vinci System and to implement some improvements to its manoeuvrability and haptic-feedback performance
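The motion scaling described above can be sketched in a few lines (an illustrative analogue, not code from the project; the function name, scale factor and jitter threshold are assumptions): scale the operator's hand displacement down by a fixed factor and suppress sub-threshold jitter before commanding the end effector.

```python
import numpy as np

def scale_motion(operator_delta, scale=0.2, tremor_cutoff=1e-4):
    """Translate an operator hand displacement (metres, xyz vector) into
    a scaled end-effector displacement, zeroing components below a jitter
    threshold, as a crude analogue of surgical motion scaling."""
    delta = np.asarray(operator_delta, dtype=float)
    # Suppress hand tremor: displacements smaller than the cutoff are ignored.
    delta = np.where(np.abs(delta) < tremor_cutoff, 0.0, delta)
    return scale * delta

# A 5 cm hand motion becomes a 1 cm tool motion; the 50 nm jitter is dropped.
tool_delta = scale_motion([0.05, 0.0, 5e-8])
```

Real systems such as the Da Vinci also filter tremor in the frequency domain and enforce workspace limits; this sketch shows only the core scaling step.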
- …