
    Tailoring Interaction. Sensing Social Signals with Textiles.

    Nonverbal behaviour is an important part of conversation and can reveal much about the nature of an interaction. It includes phenomena ranging from large-scale posture shifts to small-scale nods. Capturing these often spontaneous phenomena requires unobtrusive sensing techniques that do not interfere with the interaction. We propose an underexploited modality for sensing nonverbal behaviours: textiles. As a material in close contact with the body, textiles provide ubiquitous, large surfaces that make them a suitable soft interface. Although the literature on nonverbal communication focuses on upper-body movements such as gestures, observations of multi-party, seated conversations suggest that sitting postures and leg and foot movements are also systematically related to patterns of social interaction. This thesis addresses the following questions: Can the textiles surrounding us measure social engagement? Can they tell who is speaking, and who, if anyone, is listening? Furthermore, how should wearable textile sensing systems be designed, and what behavioural signals could textiles reveal? To address these questions, we have designed and manufactured bespoke chairs and trousers with integrated textile pressure sensors, which are introduced here. The designs are evaluated in three user studies that produce multi-modal datasets for the exploration of fine-grained interactional signals. Two approaches to using these bespoke textile sensors are explored. First, hand-crafted sensor patches in chair covers serve to distinguish speakers from listeners. Second, a pressure-sensitive matrix in custom-made smart trousers is developed to detect static sitting postures, dynamic bodily movement, and basic conversational states. Statistical analyses, machine learning approaches, and ethnographic methods show that by monitoring patterns of pressure change alone it is possible not only to classify postures with high accuracy, but also to identify a wide range of behaviours reliably in individuals and groups. These findings establish textiles as a novel, wearable sensing system for applications in the social sciences, and contribute towards a better understanding of nonverbal communication, especially the significance of posture shifts when seated. If chairs know who is speaking, if our trousers can capture our social engagement, what role can smart textiles have in the future of human interaction? How can we build new ways to map social ecologies and tailor interactions?
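
    As a rough illustration of the kind of pipeline the abstract describes (classifying sitting postures from patterns of pressure change), the sketch below trains a classifier on summary features of textile pressure-matrix frames. The 8x8 grid, the synthetic data generator, the feature set, and the classifier choice are assumptions for demonstration only, not the sensor layout or models used in the thesis.

```python
# Illustrative sketch: posture classification from textile pressure-matrix frames.
# Grid size, synthetic data, and features are assumptions, not the thesis pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synthetic_frame(posture: int) -> np.ndarray:
    """Fake 8x8 pressure frame whose load centre depends on the posture label."""
    frame = rng.normal(0.1, 0.02, size=(8, 8))
    row, col = [(2, 2), (2, 5), (5, 3)][posture]      # lean-left, lean-right, upright
    frame[row:row + 3, col:col + 3] += rng.normal(1.0, 0.1, size=(3, 3))
    return np.clip(frame, 0.0, None)

def features(frame: np.ndarray) -> np.ndarray:
    """Summary features: total load, spread, and centre of pressure."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return np.array([total, frame.std(),
                     (ys * frame).sum() / total, (xs * frame).sum() / total])

X, y = [], []
for posture in range(3):
    for _ in range(200):
        X.append(features(synthetic_frame(posture)))
        y.append(posture)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("posture accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```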

    Biosignal‐based human–machine interfaces for assistance and rehabilitation : a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. This survey reviews the large literature of the last two decades on biosignal‐based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and 144 journal papers and 37 conference papers were eventually included. Four macrocategories were used to classify the biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever‐growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets show only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMI complexity, so their usefulness should be carefully evaluated for the specific application.

    Low-Cost Sensors and Biological Signals

    Many sensors are currently available at prices below USD 100 and cover a wide range of biological signals: motion, muscle activity, heart rate, etc. Such low-cost sensors have metrological features that allow them to be used in everyday life and in clinical applications where gold-standard equipment is too expensive or too time-consuming to use. The selected papers present current applications of low-cost sensors in domains such as physiotherapy, rehabilitation, and affective technologies. The results cover various aspects of low-cost sensor technology, from hardware design to software optimization.

    Acoustic-based Smart Tactile Sensing in Social Robots

    The sense of touch is a crucial component of human social interaction and is unique among the five senses. As the only proximal sense, touch requires close or direct physical contact to register information, which makes it an interaction modality full of possibilities for social communication. Through touch, we are able to ascertain another person's intention and communicate emotions. From this idea emerges the concept of social touch as the act of touching another person in a social context. It can serve various purposes, such as greeting, showing affection, persuasion, and regulating emotional and physical well-being. Recently, the number of people interacting with artificial systems and agents has increased, mainly due to the rise of technological devices such as smartphones or smart speakers. Still, these devices are limited in their interaction capabilities. To deal with this issue, recent developments in social robotics have improved interaction possibilities to make agents more seamless and useful. In this sense, social robots are designed to facilitate natural interactions between humans and artificial agents. In this context, the sense of touch is revealed as a natural interaction vehicle that can improve Human-Robot Interaction (HRI) due to its communicative relevance. Moreover, for a social robot the relationship between social touch and its embodiment is direct, since it has a physical body with which to apply or receive touches. From a technical standpoint, tactile sensing systems have recently been the subject of further research, mostly devoted to understanding this sense in order to create intelligent systems that can improve people's lives. Social robots have become popular devices that include technologies for touch sensing, motivated by the fact that robots may encounter expected or unexpected physical contact with humans, which can either enhance or interfere with the execution of their behaviours. There is, therefore, a need to detect human touch in robot applications. Some methods include touch-gesture recognition, although they often require significant hardware deployments involving multiple sensors. Additionally, the dependability of those sensing technologies is constrained, because most of them still struggle with issues such as false positives or poor recognition rates. Acoustic sensing can provide a set of features that alleviate these shortcomings; although it has been used in various research fields, it has yet to be integrated into human-robot touch interaction. Therefore, in this work we propose the Acoustic Touch Recognition (ATR) system, a smart tactile sensing system based on acoustic sensing and designed to improve human-robot social interaction. Our system is developed to classify touch gestures and locate their source. It has also been integrated into real social robotic platforms and tested successfully in real-world applications.
    Our proposal is approached from two standpoints, one technical and the other related to social touch. On the one hand, the technical motivation of this work centred on achieving a cost-efficient, modular, and portable tactile system; to that end, we explore the fields of touch sensing technologies, smart tactile sensing systems, and their application in HRI. On the other hand, part of the research is centred on the affective impact of social touch during human-robot interaction, resulting in two studies exploring this idea.
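
    To give a flavour of an ATR-style pipeline (classifying touch gestures from the sound a contact produces), the sketch below summarises short audio windows with a few spectral features and feeds them to a classifier. The synthetic "tap" and "stroke" signals, the sample rate, and the feature set are assumptions for illustration, not the thesis's actual sensor data or model.

```python
# Illustrative sketch of acoustic touch-gesture classification on synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 16_000          # assumed sample rate
rng = np.random.default_rng(1)

def synth_touch(kind: str) -> np.ndarray:
    """Fake 0.2 s contact sound: taps are short broadband bursts,
    strokes are longer, low-frequency-weighted noise."""
    n = int(0.2 * FS)
    noise = rng.normal(0, 1, n)
    if kind == "tap":
        return noise * np.exp(-np.arange(n) / (0.01 * FS))   # fast decay
    lowpass = np.ones(32) / 32                                # crude smoothing filter
    return np.convolve(noise, lowpass, mode="same") * np.hanning(n)

def features(x: np.ndarray) -> np.ndarray:
    """Spectral centroid, RMS energy, and zero-crossing rate of one window."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / FS)
    centroid = (freqs * spec).sum() / spec.sum()
    rms = np.sqrt(np.mean(x ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2
    return np.array([centroid, rms, zcr])

X = np.array([features(synth_touch(k)) for k in ["tap", "stroke"] for _ in range(100)])
y = np.array([0] * 100 + [1] * 100)
print("cv accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())
```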

    Digital fabrication of custom interactive objects with rich materials

    As ubiquitous computing becomes a reality, people interact with a growing number of computer interfaces embedded in physical objects. Today, interaction with those objects largely relies on integrated touchscreens. In contrast, humans are capable of rich interaction with physical objects and their materials through sensory feedback and dexterous manipulation skills. However, developing physical user interfaces that offer versatile interaction and leverage these capabilities is challenging. It requires novel technologies for prototyping interfaces with custom interactivity that support the rich materials of everyday objects. Moreover, such technologies need to be accessible to empower a wide audience of researchers, makers, and users. This thesis investigates digital fabrication as a key technology to address these challenges. It contributes four novel design and fabrication approaches for interactive objects with rich materials. The contributions enable easy, accessible, and versatile design and fabrication of interactive objects with custom stretchability, input and output on complex geometries and diverse materials, tactile output on 3D object geometries, and the capability to change their shape and material properties. Together, the contributions of this thesis advance the fields of digital fabrication, rapid prototyping, and ubiquitous computing towards the larger goal of exploring interactive objects with rich materials as a new generation of physical interfaces.

    Connecting Free Improvisation Performance and Drumming Gestures through Digital Wearables

    High-level improvising musicians master idiosyncratic gesture vocabularies that allow them to express themselves in unique ways. The full use of such vocabularies is nevertheless challenged when improvisers incorporate electronics into their performances: to control electronic sounds and effects, they typically use commercial interfaces whose physicality is likely to limit their freedom of movement. Based on Jim Black's descriptions of his ideal digital musical instrument, embodied improvisation gestures, and stage performance constraints, we develop the concept of a modular wearable MIDI interface that closely meets the needs of professional improvisers, rather than proposing a new generic instrument that would require substantial practice to adapt already acquired improvisational techniques. Our research draws upon different bodies of knowledge, from theoretical principles on collaboration and embodiment to wearable interface design, to create a digital vest called Track It, Zip It (TIZI) that features two innovative on-body sensors. These sensors allow for sound control and integrate seamlessly with Black's improvisational gesture vocabulary. We then detail the design process of three TIZI prototypes, structured by the outcomes of a performance test with Black, a public performance by a novice improviser at the 2017 International Guthman Musical Instrument Competition, and measurements of sensor responses. After commenting on the strengths and weaknesses of the final TIZI prototype, we discuss how our interdisciplinary, collective process, with a world-class improviser at the very center of the design, can provide recommendations to designers who wish to create interfaces better adapted to high-level performers. Finally, we present our goals for the future creation of a wireless version of the vest for a female body, based on Diana Policarpo's artistic vision.
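
    As a hypothetical illustration of how a wearable MIDI interface of this kind maps a body-worn sensor onto sound control, the sketch below scales a normalised sensor reading into a 7-bit MIDI control-change value. The sensor readings, the controller number, and the smoothing constant are invented for the example and are not the TIZI firmware.

```python
# Hedged sketch: mapping a wearable sensor reading to MIDI control-change messages.
import mido

CC_NUMBER = 1          # assumed controller number (mod wheel) for illustration
SMOOTH = 0.3           # simple exponential smoothing factor

def sensor_to_cc(raw: float, lo: float, hi: float) -> int:
    """Clamp and scale a raw sensor reading into the 0-127 MIDI CC range."""
    norm = min(max((raw - lo) / (hi - lo), 0.0), 1.0)
    return round(norm * 127)

smoothed = 0.0
for raw in [0.05, 0.20, 0.42, 0.80, 0.95]:        # fake zip-sensor readings
    smoothed = SMOOTH * raw + (1 - SMOOTH) * smoothed
    msg = mido.Message("control_change", channel=0,
                       control=CC_NUMBER, value=sensor_to_cc(smoothed, 0.0, 1.0))
    print(msg.bytes())    # on real hardware, send via mido.open_output().send(msg)
```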

    Development of a practical electrical tomography system for flexible contact sensing applications

    Tactile sensing is seeing an increase in potential applications, such as humanoid and industrial robots; health care systems and medical instrumentation; prosthetic devices; and human-machine interaction. These applications require the integration of tactile sensors over objects with different surface shapes, which emphasises the need to develop sensors that are flexible, in contrast with the common rigid type. Moreover, flexible sensing research is still in its infancy: many technological and system issues remain open, chiefly conformability, scalability, system integration, high system cost, sensor size, and power consumption. In light of the above, this thesis is concerned with the development of a flexible fabric-based contact sensor system. This is done through an interdisciplinary approach in which electronics, systems engineering, electrical tomography, and machine learning are combined. The result is a practical flexible sensor that accurately detects contact locations with high temporal resolution and requires low power consumption. The sensor is based on the principle of electrical tomography. This is essential because the technique allows us to eliminate electrodes and wiring from within the sensing area, confining them to the periphery of the sensor, which improves flexibility while also eliminating electrode fatigue and deterioration due to repeated loading. We start by developing an electrical tomography sensor system. It comprises a piezoresistive flexible fabric material, a data acquisition card, and a custom printed circuit board for managing both current injection and data collection. We show that current injection and voltage measurement protocols respond differently to different positions of the contact region of interest, which in turn affects the overall performance of the tomography sensor system. We then present an approach for classifying contact location over the sensor using supervised machine learning, namely discriminant analysis. Accurate touch location identification is achieved, along with an increase in detection speed and sensor versatility. Finally, the sensor is placed over different surfaces to demonstrate and validate its efficiency. The main finding of this work is that flexible electrical tomography sensor systems are a very promising technology and can be practically and effectively used to develop inexpensive and durable flexible sensors for tactile applications. The main advantage of this approach is the complete absence of wires in the internal area of the sensor, which allows it to be placed over surfaces of different shapes without losing its functionality. The sensor's applicability can be further improved by using machine learning strategies, thanks to their ability to learn empirically and extract meaningful tactile information. The research in this thesis was motivated by problems faced by industrial partners in the sustainable manufacturing and advanced robotics training network in Europe (SMART-e).
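
    A minimal sketch of the classification step described above: each frame of boundary voltage measurements from the tomography electrodes is treated as a feature vector, and a discriminant-analysis classifier predicts which contact region produced it. The 16-electrode adjacent-protocol frame size, the nine contact regions, and the synthetic perturbation model are assumptions for illustration, not the thesis's hardware or data.

```python
# Hedged sketch: contact-location classification from tomography voltage frames.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

N_MEAS = 16 * 13        # assumed measurements per frame (16-electrode adjacent protocol)
N_LOCATIONS = 9         # assumed number of discrete contact regions on the fabric
rng = np.random.default_rng(2)
baseline = rng.normal(1.0, 0.05, N_MEAS)     # boundary voltages with no contact

def frame_for(location: int) -> np.ndarray:
    """Fake frame: a touch perturbs a location-specific subset of measurements."""
    v = baseline + rng.normal(0, 0.01, N_MEAS)   # measurement noise
    start = location * 20
    v[start:start + 20] -= 0.1                   # conductivity-change signature
    return v

X = np.array([frame_for(loc) for loc in range(N_LOCATIONS) for _ in range(60)])
y = np.array([loc for loc in range(N_LOCATIONS) for _ in range(60)])
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print("contact-location accuracy:", scores.mean())
```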

    Intonaspacio : comprehensive study on the conception and design of digital musical instruments : interaction between space and musical gesture

    Site-specific art understands that the place where an artwork is presented cannot be excluded from the artwork itself: the work is only complete when artwork and place intersect. Acoustically, sound has a natural relation with place. The perception of sound is the result of the place's modulation of its spectral content; likewise, the perception of a place depends on the sounds present in it. Even so, the number of sound artworks in which place plays a primary role is still very small. We therefore propose to create a tool for composing inherently place-specific sounds. Inherently, because the sound is the result of the interaction between place and performer; place, because it is the concept closest to human perception and to the idea of intimacy. Throughout this thesis we suggest that this interaction can be mediated by a digital musical instrument, Intonaspacio, which allows the performer to compose and control place-specific sounds. In the first part we describe the process of constructing and designing Intonaspacio: how to access the sound present in the place, which gestures to measure, which sensors to use and where to place them, and which mappings to design in order to compose place-specific sound. We start by suggesting two different mappings to combine place and sound, looking at different approaches to exciting the structural sound of the place, i.e., its resonant frequencies. The first uses a process in which the performer can record a sample of the sound ambiance and reproduce it, creating a feedback loop that excites the resonances of the room at each iteration. The second analyses the input sound and extracts the set of frequencies of the place with the highest amplitudes; these are mapped to control several sound-effect parameters. To evaluate Intonaspacio we conducted an experiment with participants who played the instrument during several trial sessions. The analysis of this experiment led us to propose a third mapping that combines the previous two. The second part of the thesis aims to create the conditions that give Intonaspacio longevity, starting from the premise that a musical instrument, to be classified as such, needs a dedicated instrumental technique and repertoire. These two conditions were addressed, first, by suggesting a gestural vocabulary of Intonaspacio's idiomatic gestures based on direct observation of the gestures most repeated by the participants in our experiment, and second, by collaborating with two composers who wrote two pieces for Intonaspacio.
    Site-specific art is an artistic discipline, traditionally linked to installation, that seeks to create works maintaining a direct relationship with the space in which they are presented; the artwork cannot be separated from that space without losing its original meaning. Owing to its physical characteristics, sound naturally reflects the space in which it is emitted: the perception we have of a sound results from the combination of the direct sound with its reflections in the space, whose delays and amplitudes are directly related to the architecture of that space. By this logic, sound art should be the practice that most directly seeks to compose situated sound; nevertheless, space is rarely used as an intentional creative phenomenon. The work presented here therefore investigates the possibility of creating situated sounds. The term space is often associated with something of vast, unlimited dimensions.
    For that reason, and from the perspective of site-specific art, where a relationship needs to be established, place seems to us a more adequate term for framing this research. Besides representing a space in which relations of intimacy (proximity) can be established, place has dimensions that are shaped by perception and by the human body: as people move through a place, they are at the same time defining its boundaries. This view of place emerges at the end of the nineteenth century, when philosophy begins to orient thought towards the human being and human perception. Place then comes to represent something established in action and through human perception, where relations of intimacy are possible, in contrast to non-places (more or less characterless sites through which people merely pass). We therefore re-adapted our initial question, both to highlight this idea of place and to reflect a perceptual bi-directionality that is central to site-specific art: how can we create and control inherently localised sounds? Inherently, because for an interaction between place and sound artwork to exist, two conditions are required: the sound must be able to provoke a response from the place, and the place must be able to modify our perception of it. The existence of an interactive relationship opens up a point we had not considered before and which we added to the new question: control. As a possible answer we propose the construction of a digital musical instrument, Intonaspacio, which mediates this interaction and enables the performer to create and control localised sounds. First, because a musical instrument augments human capabilities by extending the human body (just as a fork extends the hand). Second, because the digital musical instrument, by separating the control system from the sound-generation system, opens up sonic possibilities previously excluded by mechanical or human limitations, allowing us to envisage broader access to new spatial and temporal dimensions. This thesis is divided into two parts: in the first we describe the construction of Intonaspacio, and in the second we establish the basis for its longevity. The first part begins by investigating ways of accessing the sound of the place, composed of its ambient sounds and its structural sounds (the resonances resulting from its architecture). We believe that one possible way of composing localised sounds is precisely to let the ambient sounds generate and amplify the structural sounds. Two technical questions then arise: how can ambient sound be integrated into the sound work in real time, and how can it be made to excite the response of the space? To answer them we designed two different mappings. In the first, the performer can record short excerpts of ambient sound, which are played back and re-recorded, creating a feedback loop that excites the resonances of the place at each iteration. In the second, a spectral analysis is performed on the captured sound and a set of frequencies with the highest amplitudes is extracted; these are then used to control the parameters of several sound effects.
    We also placed a set of different sensors on the instrument to capture the performer's gestures. These are located in different areas of the instrument's body so as to provide larger sensitive areas and, consequently, a greater number of degrees of freedom for the performer. At present Intonaspacio can extract around 17 different features, grouped into three sections (orientation, impact, and distance), which can be used to shape the sound generated by the instrument through the different mappings. Both mapping proposals were evaluated by a group of participants during a user test of Intonaspacio. The results led us to a third mapping proposal that combines characteristics of the two previous ones: the analysis of the sound captured by the instrument is kept, but the extracted information is used as sound material for an additive synthesis algorithm. The second part of the thesis starts from a premise established during this work: to be considered as such, a musical instrument must possess its own instrumental technique and a dedicated repertoire. Accordingly, and based on direct observation of the most common gestures among the participants in our study, we proposed a gestural vocabulary of Intonaspacio's idiomatic gestures, that is, gestures that depend exclusively on the shape of the instrument and the location of the sensors on its structure (its sensitive zones), and that are independent of the mapping. We also collaborated with two composers, who wrote two pieces for Intonaspacio. Intonaspacio proved to be a complex and expressive instrument that allows performers to include place as a creative parameter, although it still presents some control problems. In the first mapping, although the integration of place is felt more directly and yields more interesting sonic results (according to the study participants), the sense of control is very low; in the second, although control is easier, the presence of the place is very subtle and barely perceptible. We hope the third mapping will help to solve this problem and increase interest in the instrument, particularly among the composers with whom we have collaborated and will collaborate in the future.
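
    To make the second mapping idea concrete, the sketch below analyses a captured buffer, keeps the highest-amplitude spectral peaks, and maps them onto effect parameters. The synthetic room capture, the choice of three peaks, and the delay, filter, and reverb parameters are illustrative assumptions, not Intonaspacio's actual implementation.

```python
# Hedged sketch: extract the strongest spectral peaks of a captured buffer and
# map them onto hypothetical sound-effect parameters.
import numpy as np
from scipy.signal import find_peaks

FS = 44_100
rng = np.random.default_rng(3)

# Fake one-second room capture: three "resonances" plus background noise.
t = np.arange(FS) / FS
capture = (0.8 * np.sin(2 * np.pi * 180 * t)
           + 0.5 * np.sin(2 * np.pi * 440 * t)
           + 0.3 * np.sin(2 * np.pi * 1200 * t)
           + 0.05 * rng.normal(size=FS))

spectrum = np.abs(np.fft.rfft(capture * np.hanning(FS)))
freqs = np.fft.rfftfreq(FS, d=1 / FS)

# Keep the three most prominent spectral peaks (the "frequencies of the place").
peaks, props = find_peaks(spectrum, height=spectrum.max() * 0.1)
strongest = peaks[np.argsort(props["peak_heights"])[::-1][:3]]
peak_freqs = freqs[strongest]
peak_amps = spectrum[strongest] / spectrum.max()

# Map the peaks onto illustrative effect parameters.
delay_ms = 1000.0 / peak_freqs[0]                     # delay tuned to strongest peak
cutoff_hz = float(np.clip(peak_freqs[1] * 4, 100, 8000))
reverb_mix = float(peak_amps[2])
print(f"delay={delay_ms:.1f} ms  cutoff={cutoff_hz:.0f} Hz  reverb mix={reverb_mix:.2f}")
```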