Sonic Interactions in Virtual Environments
This open access book tackles the design of 3D spatial interactions from an audio-centred, audio-first perspective, providing the fundamental notions for creating and evaluating immersive sonic experiences. The key elements that enhance the sensation of place in a virtual environment (VE) are: immersive audio, the computational aspects of the acoustical-space properties of Virtual Reality (VR) technologies; sonic interaction, the human-computer interplay through auditory feedback in VEs; and VR systems, which naturally support multimodal integration, impacting different application domains. Sonic Interactions in Virtual Environments features state-of-the-art research on real-time auralization, sonic interaction design in VR, quality of experience in multimodal scenarios, and applications. Contributors and editors include interdisciplinary experts from computer science, engineering, acoustics, psychology, design, the humanities, and beyond. Their mission is to shape an emerging field of study at the intersection of sonic interaction design and immersive media, embracing an archipelago of research spread across different audio communities and raising awareness among VR researchers and practitioners of the importance of sonic elements when designing immersive environments.
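The spatial rendering the book is concerned with can be caricatured, in a much-simplified two-channel form, by a constant-power panning law. This sketch is illustrative only; the function name and angle convention are our own assumptions, not material from the book.

```python
import math

def constant_power_pan(azimuth_deg):
    """Return (left, right) gains for a source azimuth in [-45, 45] degrees.

    Constant-power pan law: the squared gains always sum to 1, so the
    perceived loudness stays roughly constant as the source moves.
    -45 degrees is fully left, +45 fully right.
    """
    # Map azimuth from [-45, 45] degrees onto a pan angle in [0, pi/2].
    theta = (azimuth_deg + 45.0) / 90.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

# A centred source feeds both channels equally (about -3 dB each).
left, right = constant_power_pan(0.0)
```

Real immersive-audio engines go far beyond this, using head-related transfer functions and room acoustics, but the constant-power constraint above is the same energy-preservation idea in miniature.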
An environment for studying the impact of spatialising sonified graphs on data comprehension
We describe AudioCave, an environment for exploring the impact of spatialising sonified graphs on a set of numerical data comprehension tasks. Its design builds on findings regarding the effectiveness of sonified graphs for numerical data overview and discovery by visually impaired and blind students. We demonstrate its use as a test bed for comparing the approach of accessing a single sonified numerical datum at a time with one where multiple sonified numerical data can be accessed concurrently. Results from this experiment show that concurrent access facilitates tackling our set of multivariate data comprehension tasks. AudioCave also demonstrates how the spatialisation of the sonified graphs provides opportunities for sharing the representation. We present two experiments investigating users collaboratively solving a set of data comprehension tasks by sharing the data representation.
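The two ingredients of such a system, pitch-mapping a data series and spreading concurrent series across the stereo field so they remain separable, can be sketched in a few lines. The function names, frequency range, and linear mappings below are our own illustrative assumptions, not AudioCave's actual design.

```python
def sonify_series(values, f_min=220.0, f_max=880.0):
    """Map a numeric series to tone frequencies (Hz) by linear interpolation.

    Higher data values become higher pitches, the usual convention
    in auditory graphs.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat series
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

def assign_pans(n_series):
    """Spread n concurrent series evenly across the stereo field [-1, 1],
    so simultaneously playing graphs stay spatially separable."""
    if n_series == 1:
        return [0.0]
    return [-1.0 + 2.0 * i / (n_series - 1) for i in range(n_series)]
```

For example, `sonify_series([0, 1, 2])` yields 220, 550, and 880 Hz, and `assign_pans(3)` places three concurrent graphs hard left, centre, and hard right.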
Exploring haptic interfacing with a mobile robot without visual feedback
Search and rescue scenarios are often complicated by low- or no-visibility conditions. The lack of visual feedback hampers orientation and causes significant stress for human rescue workers. The Guardians project [1] pioneered a group of autonomous mobile robots assisting a human rescue worker operating within close range. Trials were held with firefighters of South Yorkshire Fire and Rescue. It became clear that the subjects were by no means prepared to give up their procedural routines and the sense of security these provide: they simply ignored instructions that contradicted their routines.
Acoustic-based Smart Tactile Sensing in Social Robots
International Mention in the doctoral degree.
The sense of touch is a crucial component of human social interaction and is unique
among the five senses. As the only proximal sense, touch requires close or direct physical
contact to register information. This fact makes touch an interaction modality
full of possibilities regarding social communication. Through touch, we are able to ascertain
the other person’s intention and communicate emotions. From this idea emerges the concept
of social touch as the act of touching another person in a social context. It can serve various purposes, such as greeting, showing affection, persuading, and regulating emotional and physical well-being.
Recently, the number of people interacting with artificial systems and agents has increased,
mainly due to the rise of technological devices, such as smartphones or smart speakers. Still,
these devices are limited in their interaction capabilities. To deal with this issue, recent developments
in social robotics have improved the interaction possibilities to make agents more seamless
and useful. In this sense, social robots are designed to facilitate natural interactions between
humans and artificial agents. In this context, the sense of touch emerges as a natural interaction vehicle that can improve Human-Robot Interaction (HRI) due to its communicative relevance. Moreover, for a social robot, the relationship between social touch and embodiment is direct, since it has a physical body with which to apply or receive touches.
From a technical standpoint, tactile sensing systems have recently attracted renewed research attention, much of it devoted to understanding this sense in order to create intelligent systems that can improve people's lives. Currently, social robots are popular devices that include touch-sensing technologies. This is motivated by the fact that robots may encounter expected or unexpected physical contact with humans, which can either enhance or interfere with the execution of their behaviours. There is, therefore, a need to detect human touch in robot applications. Some methods even include touch-gesture recognition, although they often require significant hardware deployments relying on multiple sensors. Additionally, the reliability of those sensing technologies is constrained, because most of them still struggle with issues like false positives or poor recognition rates. Acoustic sensing, in this sense, can provide a
set of features that can alleviate the aforementioned shortcomings. Even though it is a technology that has been utilised in various research fields, it has yet to be integrated into human-robot
touch interaction.
Therefore, in this work, we propose the Acoustic Touch Recognition (ATR) system, a smart tactile sensing system based on acoustic sensing and designed to improve human-robot social interaction. Our system is developed to classify touch gestures and locate their source. It has also been integrated into real social robotic platforms and tested successfully in real-world applications. Our proposal is approached from two standpoints, one technical and the other related to social touch. On the one hand, the technical motivation of this work is centred on achieving a cost-efficient, modular and portable tactile system; to that end, we explore the fields of touch sensing technologies, smart tactile sensing systems and their application in HRI. On the other hand, part of the research is centred on the affective impact of touch during human-robot interaction, resulting in two studies exploring this idea.
Doctoral Programme in Electrical, Electronic and Automatic Engineering, Universidad Carlos III de Madrid. Chair: Pedro Manuel Urbano de Almeida Lima. Secretary: María Dolores Blanco Rojas. Member: Antonio Fernández Caballer
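The touch-gesture classification idea described in the abstract can be caricatured in a few lines: derive simple features from the acoustic signal and threshold them. The function name, features, and thresholds below are illustrative assumptions of ours, not taken from the ATR system.

```python
def classify_touch(samples, rate=16000, energy_gate=0.01):
    """Toy acoustic touch classifier: 'tap' for short bursts, 'stroke'
    for sustained contact, 'none' below the energy gate.

    Thresholds are illustrative, not taken from the ATR system.
    """
    # Mean signal energy: silence and sensor noise fall below the gate.
    energy = sum(s * s for s in samples) / len(samples)
    if energy < energy_gate:
        return "none"
    # Duration of the active (above-threshold) portion, in seconds.
    active = [abs(s) > 0.05 for s in samples]
    duration = sum(active) / rate
    return "tap" if duration < 0.1 else "stroke"
```

A real system would replace the hand-set thresholds with a learned classifier over richer spectral features, but the pipeline shape (signal, features, decision) is the same.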
“I always wanted to see the night sky”: blind user preferences for Sensory Substitution Devices
Sensory Substitution Devices (SSDs) convert visual information into another sensory channel (e.g. sound) to improve the everyday functioning of blind and visually impaired persons (BVIP). However, the range of possible functions and options for translating vision into sound is largely open-ended. To provide constraints on the design of this technology, we interviewed ten BVIPs who were briefly trained in the use of three novel devices that, collectively, showcase a large range of design permutations. The SSDs include the 'Depth-vOICe,' 'Synaestheatre' and 'Creole,' which offer high spatial, temporal, and colour resolutions respectively via a variety of sound outputs (electronic tones, instruments, vocals). The participants identified a range of practical concerns in relation to the devices (e.g. curb detection, recognition, mental effort) but also highlighted experiential aspects. This included both curiosity about the visual world (e.g. understanding shades of colour, the shape of cars, seeing the night sky) and the desire for the substituting sound to be responsive to movement of the device and aesthetically engaging.
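The basic vision-to-sound mapping behind vOICe-style devices (a left-to-right scan in which vertical position becomes pitch and brightness becomes loudness) can be sketched as follows. The function name, frequency range, and linear row-to-pitch mapping are our own illustrative assumptions, not the parameters of any of the three devices above.

```python
def image_to_sound_events(image, f_low=200.0, f_high=3200.0):
    """Sketch of a vOICe-style mapping: scan a grayscale image column by
    column (left to right), turning each bright pixel into a tone whose
    frequency encodes the row (top = high pitch) and whose amplitude
    encodes brightness. Returns (column, frequency_hz, amplitude) tuples.
    """
    rows = len(image)
    events = []
    for col in range(len(image[0])):
        for row in range(rows):
            amp = image[row][col]  # brightness in [0, 1]
            if amp > 0.0:
                # Row 0 is the top of the image -> highest frequency.
                frac = 1.0 - row / (rows - 1) if rows > 1 else 1.0
                freq = f_low + frac * (f_high - f_low)
                events.append((col, freq, amp))
    return events
```

Playing the columns back in sequence turns each image into a short soundscape; the design questions the interviews probe (resolution, responsiveness, aesthetics) are all choices layered on top of a mapping like this.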
Feel it in my bones: Composing multimodal experience through tissue conduction
We outline here the feasibility of coherently utilising tissue conduction for spatial audio and tactile input. Tissue conduction display-specific compositional concerns are discussed; it is hypothesised that the qualia available through this medium substantively differ from those of conventional artificial means of appealing to auditory spatial perception. The implications include that spatial music experienced in this manner constitutes a new kind of experience, and that the ground rules of composition are yet to be established. We refer to results from listening experiences with one hundred listeners in an unstructured attribute elicitation exercise, where prominent themes such as "strange", "weird", "positive", "spatial" and "vibrations" emerged. We speculate on future directions aimed at taking maximal advantage of the principle of multimodal perception to broaden the informational bandwidth of the display system. Some implications of composing for the hearing-impaired are elucidated.