
    Data-driven body–machine interface for the accurate control of drones

    The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator’s and the robot’s actions, which are determined by the design and characteristics of the interface, and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones, which outperformed a standard joystick in terms of learning time and steering abilities. Implementing this procedure to identify body–machine patterns for specific applications could support the development of more intuitive and effective interfaces.
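    To make the idea of a body-to-machine mapping concrete, here is a minimal illustrative sketch, not the interface described in the paper: it assumes a chest-mounted IMU and maps torso pitch and roll to forward and lateral drone velocity commands, with gains, dead zone, and limits chosen arbitrarily.

```python
import numpy as np

# Hypothetical linear body-to-drone mapping: torso pitch/roll (radians,
# from a chest-mounted IMU) -> forward/lateral velocity commands (m/s).
# Gains, dead zone, and limits are illustrative assumptions, not values
# from the paper.
PITCH_GAIN = 2.0    # m/s per radian of forward lean
ROLL_GAIN = 2.0     # m/s per radian of sideways lean
DEAD_ZONE = 0.05    # radians ignored around the neutral posture
V_MAX = 1.5         # m/s command saturation

def body_to_velocity(pitch, roll):
    """Map a torso posture to a (v_forward, v_lateral) drone command."""
    def shape(angle, gain):
        if abs(angle) < DEAD_ZONE:
            return 0.0
        v = gain * (angle - np.sign(angle) * DEAD_ZONE)
        return float(np.clip(v, -V_MAX, V_MAX))
    return shape(pitch, PITCH_GAIN), shape(roll, ROLL_GAIN)

# Example: a slight forward lean with no sideways lean.
print(body_to_velocity(pitch=0.2, roll=0.0))  # approx (0.3, 0.0)
```

    The point of the paper's data-driven approach is that such mappings are derived from recorded spontaneous behavior rather than from hand-picked gains like these.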

    MosAIck: Staging Contemporary AI Performance - Connecting Live Coding, E-Textiles and Movement

    This paper introduces our collective work “Patterns in Between Intelligences”, a performance piece that builds an artistic practice between live coding sound and coding through dance, mediated and shaped through e-textile sensors. This creates a networked system of which both live-coded processes and human bodies are part. The paper describes in detail the implementation of the technology used in the prototype performance at No Bounds Festival in Sheffield, UK, in October 2022, as well as discussions and concerns the team had related to the use of AI technology on stage. The paper concludes with a narrative account of the Sheffield performance and reflections on it.

    Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

    This paper contributes to a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, research often remains focused on individual explorations and key design strategies, and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field along the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.

    Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies

    How can we determine highly effective and intuitive gesture sets for interactive systems, tailored to end users’ preferences? A substantial body of knowledge is available on this topic, among which gesture elicitation studies stand out distinctively. In these studies, end users are invited to propose gestures for specific referents, which are the functions to control in an interactive system. The vast majority of gesture elicitation studies conclude with a consensus gesture set identified through a process of consensus or agreement analysis. However, the information about specific gesture sets determined for specific applications is scattered across a wide landscape of disconnected scientific publications, which makes it challenging for researchers and practitioners to effectively harness this body of knowledge. To address this challenge, we conducted a systematic literature review and examined a corpus of N=267 studies encompassing a total of 187,265 gestures elicited from 6,659 participants for 4,106 referents. To understand similarities in users’ gesture preferences within this extensive dataset, we analyzed a sample of 2,304 gestures extracted from the studies identified in our literature review. Our approach consisted of (i) identifying the context of use represented by end users, devices, platforms, and gesture sensing technology, (ii) categorizing the referents, (iii) classifying the gestures elicited for those referents, and (iv) cataloging the gestures based on their representation and implementation modalities. Drawing from the findings of this review, we propose guidelines for conducting future end-user gesture elicitation studies.
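    The agreement analysis mentioned above is commonly operationalized with the agreement rate AR(r) of Vatavu and Wobbrock: for a referent r whose |P| elicited proposals partition into groups P_i of identical gestures, AR(r) = sum_i |P_i|*(|P_i|-1) / (|P|*(|P|-1)). A minimal sketch of that computation follows; the flat list-of-labels input format is an assumption, and individual studies may use other agreement measures.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent (Vatavu & Wobbrock, 2015).

    `proposals` is a list of gesture labels, one per participant, where
    identical labels mean the participants proposed the same gesture
    (how labels are assigned depends on the study's coding scheme).
    """
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)          # partition P into groups P_i
    pairs_in_agreement = sum(k * (k - 1) for k in groups.values())
    return pairs_in_agreement / (n * (n - 1))

# Example: 20 participants; 12 propose "swipe-left", 5 "point", 3 others.
sample = ["swipe-left"] * 12 + ["point"] * 5 + ["a", "b", "c"]
print(round(agreement_rate(sample), 3))  # -> 0.4, i.e. (12*11 + 5*4) / (20*19)
```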

    Discoverable Free Space Gesture Sets for Walk-Up-and-Use Interactions

    Advances in technology are fueling a movement toward ubiquity for beyond-the-desktop systems. Novel interaction modalities, such as free-space or full-body gestures, are becoming more common, as demonstrated by the rise of systems such as the Microsoft Kinect. However, much of the interaction design research for such systems is still focused on desktop and touch interactions. Current thinking in free-space gestures is limited in capability and imagination, and most gesture studies have not attempted to identify gestures appropriate for public walk-up-and-use applications. A walk-up-and-use display must be discoverable, such that first-time users can use the system without any training; flexible; and not fatiguing, especially in the case of longer-term interactions. One mechanism for defining gesture sets for walk-up-and-use interactions is a participatory design method called gesture elicitation. This method has been used to identify several user-generated gesture sets and has shown that user-generated sets are preferred by users over those defined by system designers. However, for these studies to be successfully implemented in walk-up-and-use applications, there is a need to understand which components of these gestures are semantically meaningful (i.e., do users distinguish between using their left and right hand, or are those semantically the same thing?). Thus, defining a standardized gesture vocabulary for coding, characterizing, and evaluating gestures is critical. This dissertation presents three gesture elicitation studies for walk-up-and-use displays that employ a novel gesture elicitation methodology, alongside a novel coding scheme for gesture elicitation data that focuses on the features most important to users’ mental models. Generalizable design principles, based on the three studies, are then derived and presented (e.g., changes in speed are meaningful for scroll actions in walk-up-and-use displays but not for paging or selection). The major contributions of this work are: (1) an elicitation methodology that aids users in overcoming biases from existing interaction modalities; (2) a better understanding of the gestural features that matter, i.e., those that capture the intent of the gestures; and (3) generalizable design principles for walk-up-and-use public displays.
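    As an illustration of what a standardized coding vocabulary for elicited gestures might capture, the hypothetical record below tracks a handful of candidate features per gesture; the field names and example values are assumptions for illustration, not the dissertation’s actual scheme.

```python
from dataclasses import dataclass

# Hypothetical feature record for one elicited gesture. Which features are
# semantically meaningful (e.g., speed for scrolling but not for paging, per
# the abstract's example) is exactly what such a coding scheme lets analysts
# test; the field names here are illustrative assumptions.
@dataclass
class GestureCode:
    referent: str            # the command the gesture was proposed for
    handedness: str          # "left", "right", or "either"
    hand_shape: str          # e.g. "flat palm", "point", "fist"
    path: str                # e.g. "lateral swipe", "circle", "push"
    speed_meaningful: bool   # does varying speed change the intent?
    repeated: bool           # is the stroke repeated?

g = GestureCode(
    referent="scroll down",
    handedness="either",
    hand_shape="flat palm",
    path="downward swipe",
    speed_meaningful=True,   # per the abstract: speed matters for scroll
    repeated=False,
)
```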

    Finding Music in Chaos: Designing and Composing with Virtual Instruments Inspired by Chaotic Equations

    Using chaos theory to design novel audio synthesis engines has been explored little in computer music. This may be because of the difficulty of obtaining harmonic tones, or the tendency of chaos-based synthesis engines to “explode”, which then requires re-instantiating the engine to resume sound production. This is undesirable when composing, because time is wasted fixing the synthesis engine instead of being spent on the creative aspects of composition. One way to remedy these issues is to connect chaotic equations to individual parts of the synthesis engine instead of relying on the chaos as the primary source of all sound-producing procedures. To do this, one can create a physically based synthesis model and connect chaotic equations to individual parts of the model. The goal of this project is to design a physically inspired virtual instrument, based on a conceptual percussion instrument model, that utilizes chaos theory in the synthesis engine to explore novel sounds in a reliable and repeatable way for other composers and performers to use. This project presents a two-movement composition utilizing these concepts and a modular set of virtual instruments that can be used by anyone, playable with a new electronic music controller called the Hexapad controller as well as with standard MIDI controllers. The physically inspired instrument created for the Hexapad controller is called the Ambi-Drum; standard MIDI controllers are used to control synthesis parameters and other virtual instruments.
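    To illustrate the “chaos drives only part of the engine” idea, the sketch below uses the logistic map, a standard chaotic recurrence that is not necessarily one of the equations used in this project, to modulate only the frequency of an ordinary sine oscillator, so the output stays bounded even though the modulation source is chaotic.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def logistic_map(r=3.9, x0=0.5, n=200):
    """Generate n values of the logistic map x_{k+1} = r * x_k * (1 - x_k).

    For r near 3.9 the sequence is chaotic but stays within (0, 1), so it
    can safely modulate a synthesis parameter without "exploding".
    """
    xs = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        xs[k] = x
    return xs

def chaotic_sine(duration=2.0, base_freq=220.0, depth=80.0, rate=20):
    """Sine oscillator whose frequency is offset by a chaotic control signal.

    `rate` chaotic values per second are generated and interpolated up to
    audio rate; all constants here are illustrative, not from the paper.
    """
    n_ctrl = int(duration * rate)
    ctrl = logistic_map(n=n_ctrl)
    t = np.arange(int(duration * SR)) / SR
    # Interpolate the control signal to audio rate, map (0, 1) -> Hz offset.
    freq = base_freq + depth * np.interp(t, np.linspace(0, duration, n_ctrl), ctrl)
    phase = 2 * np.pi * np.cumsum(freq) / SR   # integrate frequency to phase
    return 0.3 * np.sin(phase)

audio = chaotic_sine()   # a bounded, chaotic-sounding test tone
```

    Because the chaotic signal is confined to a single, bounded parameter, this kind of patch cannot “explode” in the way the abstract describes for fully chaos-driven engines.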

    Towards Intelligent Playful Environments for Animals based on Natural User Interfaces

    The study of animals’ interactions with technology and the development of animal-centered technological systems has been gaining attention since the emergence of the research area of Animal Computer Interaction (ACI). ACI aims to improve animals’ welfare and wellbeing in several scenarios by developing suitable technology for the animal, following an animal-centered approach. Among all the research lines ACI is exploring, there has been significant interest in animals’ playful interactions with technology. Technologically mediated playful activities have the potential to provide mental and physical stimulation for animals in different environmental contexts, which could in turn help to improve their wellbeing. As we embark on the era of the Internet of Things, current technological playful activities for animals have not yet explored the development of pervasive solutions that could provide animals with more adaptation to their preferences as well as offer more varied technological stimuli. Instead, playful technology for animals is usually based on digital interactions rather than exploring tangible devices or augmenting the interactions with different stimuli. In addition, these playful activities are predefined and do not change over time, and they require a human to provide the device or technology to the animal. If humans could focus more on their participation as active players of an interactive system aimed at animals, instead of being concerned with holding a device for the animal or keeping the system running, this might help to create stronger bonds between species and foster better relationships with animals. Moreover, animals’ mental and physical stimulation are important aspects that could be fostered if the playful systems designed for animals offered a varied range of outputs, were tailored to the animal’s behaviors, and prevented the animal from getting used to the system and losing interest. Therefore, this thesis proposes the design and development of technological playful environments based on Natural User Interfaces that can adapt and react to the animals’ natural interactions. These pervasive scenarios would allow animals to play by themselves or with a human, providing more engaging and dynamic playful activities that are capable of adapting over time.
    Pons Tomás, P. (2018). Towards Intelligent Playful Environments for Animals based on Natural User Interfaces [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/113075

    Interaction Design for Digital Musical Instruments

    The thesis aims to elucidate the process of designing interactive systems for musical performance that combine software and hardware in an intuitive and elegant fashion. The original contribution to knowledge consists of: (1) a critical assessment of recent trends in digital musical instrument design, (2) a descriptive model of interaction design for the digital musician, and (3) a highly customisable multi-touch performance system that was designed in accordance with the model. Digital musical instruments are composed of a separate control interface and a sound generation system that exchange information. When designing the way in which a digital musical instrument responds to the actions of a performer, we are creating a layer of interactive behaviour that is abstracted from the physical controls. Often, the structure of this layer depends heavily upon: (1) the accepted design conventions of the hardware in use; (2) established musical systems, acoustic or digital; and (3) the physical configuration of the hardware devices and the grouping of controls that such configuration suggests. This thesis proposes an alternate way to approach the design of digital musical instrument behaviour: examining the implicit characteristics of its composite devices. When we separate the conversational ability of a particular sensor type from its hardware body, we can look in a new way at the actual communication tools at the heart of the device. We can subsequently combine these separate pieces using a series of generic interaction strategies in order to create rich interactive experiences that are not immediately obvious or directly inspired by the physical properties of the hardware. This research ultimately aims to enhance and clarify the existing toolkit of interaction design for the digital musician.
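    A toy sketch of that separation is shown below: a sensor’s normalized value stream is treated independently of the device it belongs to and is combined with interchangeable interaction strategies before reaching a synthesis parameter. The strategy names and the mapping layer are illustrative assumptions, not the model proposed in the thesis.

```python
# Toy illustration of decoupling a sensor's "conversational ability" (a stream
# of normalized values) from the hardware it belongs to, and combining it with
# interchangeable interaction strategies. Names and strategies are
# illustrative assumptions, not the thesis's model.

def toggle(threshold=0.5):
    """Discrete strategy: crossing the threshold flips an on/off state."""
    state = {"on": False, "prev": 0.0}
    def step(x):
        if state["prev"] < threshold <= x:
            state["on"] = not state["on"]
        state["prev"] = x
        return 1.0 if state["on"] else 0.0
    return step

def smooth(alpha=0.1):
    """Continuous strategy: one-pole smoothing of the incoming value."""
    state = {"y": 0.0}
    def step(x):
        state["y"] += alpha * (x - state["y"])
        return state["y"]
    return step

def bind(strategy, set_parameter):
    """Connect a strategy's output to any synthesis parameter setter."""
    return lambda x: set_parameter(strategy(x))

# The same normalized stream (e.g. from a touch strip or an accelerometer
# axis) can drive very different behaviours depending on the strategy:
cutoff_control = bind(smooth(), lambda v: print(f"filter cutoff -> {v:.2f}"))
mute_control = bind(toggle(), lambda v: print(f"mute -> {bool(v)}"))

for sample in [0.1, 0.4, 0.8, 0.6, 0.2]:
    cutoff_control(sample)
    mute_control(sample)
```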