
    Overview of some Command Modes for Human-Robot Interaction Systems

    Interaction and command modes, as well as their combinations, are essential features of modern and future robotic systems that interact with human beings in varied, dynamic environments. This paper presents a synthetic overview of the command modes most commonly used in Human-Robot Interaction Systems (HRIS). It covers the earliest command modes, namely tele-manipulation, off-line robot programming, and traditional elementary teaching by demonstration. It then introduces more recent command modes, fostered by artificial intelligence techniques implemented on increasingly powerful computers. In this context, we consider specifically the following modes: interactive programming based on graphical user interfaces, and voice-based, pointing-on-image-based, gesture-based, and finally brain-based commands.

    Motion Modeling for Expressive Interaction

    While human-human or human-object interactions involve very rich, complex and nuanced gestures, gestures as they are captured for human-computer interaction remain relatively simplistic. Our approach is to consider the study of variation in motion input as a way of understanding expression and expressivity in human-computer interaction, and to propose computational solutions for capturing and using these expressive variations. The paper reports an attempt at drawing design guidelines for modeling systems that adapt to motion variations. We illustrate them through two case studies: the first model is used to estimate temporal and geometrical motion variations, while the second is used to track variations in motion dynamics. These case studies are illustrated in two applications.

    A Fuzzy Logic Architecture for Rehabilitation Robotic Systems

    Robots have been widely incorporated into rehabilitation over the last decade to compensate for lost functions in disabled individuals. Controlling rehabilitation robots remotely brings many benefits, including, but not restricted to, shorter hospital stays, lower cost, and a higher level of care. The main goal of this work is an effective solution for caring for patients remotely. The remote control of rehabilitation robots remains an ongoing and highly challenging problem. In this paper, a remote wrist rehabilitation system is presented. The developed system is a robot supporting the two wrist movements (flexion/extension and abduction/adduction). Additionally, the proposed system provides a software interface enabling physiotherapists to control the rehabilitation process remotely. The patient's safety during therapy is ensured through the integration of a fuzzy controller into the system control architecture. The fuzzy controller adjusts the robot's action according to the pain felt by the patient; by using a fuzzy logic approach, the system can adapt effectively to the patient's condition. The Message Queuing Telemetry Transport protocol (MQTT) is used to mitigate latency during the human-robot interaction. The control technique is gestural, based on a Kinect camera: the physiotherapist's gestures are detected and transmitted to the software interface, where they are processed and sent to the robot. The acquired measurements are recorded in a database that can later be used to monitor patient progress during the treatment protocol. The experimental results show the effectiveness of the developed remote rehabilitation system.
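The pain-adaptive behaviour described in this abstract can be sketched as a minimal fuzzy controller that maps a patient-reported pain score to a robot speed scaling factor. This is an illustrative sketch only: the membership functions, the 0-10 pain scale, and the speed consequents are assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def robot_speed_factor(pain):
    """Map a pain score (0-10) to a speed scaling factor in [0, 1] using
    three fuzzy rules: low pain -> full speed, moderate -> slow, high -> stop.
    Defuzzification is a weighted average of singleton consequents."""
    low  = tri(pain, -1, 0, 4)    # membership of "low pain"
    mid  = tri(pain, 2, 5, 8)     # membership of "moderate pain"
    high = tri(pain, 6, 10, 11)   # membership of "high pain"
    rules = [(low, 1.0), (mid, 0.4), (high, 0.0)]  # (firing strength, speed)
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.0
```

With these (assumed) rules, a pain score of 0 leaves the exercise at full speed, a moderate score slows it, and a high score halts the robot, which is the safety behaviour the abstract attributes to the controller.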

    A computational approach to gestural interactions of the upper limb on planar surfaces

    There are many compelling reasons for proposing new gestural interactions: one might want to use a novel sensor that affords access to data that could not previously be captured, or transpose a well-known task into a different, unexplored scenario. After an initial design phase, the creation, optimisation and understanding of new interactions remain, however, a challenge. Models have been used to foresee interaction properties: Fitts' law, for example, accurately predicts movement time in pointing and steering tasks. But what happens when no existing models apply? The core assertion of this work is that a computational approach provides the frameworks and associated tools needed to model such interactions. This is supported through three research projects, in which discriminative models are used to enable interactions, optimisation is included as an integral part of their design, and reinforcement learning is used to explore the motions users produce in such interactions.
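As a concrete instance of the predictive models this abstract mentions, Fitts' law estimates movement time from target distance and width. The coefficients below are illustrative placeholders; in practice they are fit per user and device by linear regression.

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time in seconds under Fitts' law
    (Shannon formulation): MT = a + b * log2(D/W + 1).
    a and b are device/user-specific regression coefficients."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty
```

For example, a 300 px reach to a 20 px target has an index of difficulty of log2(16) = 4 bits, so with the placeholder coefficients the predicted time is 0.6 s; doubling the distance increases the prediction, as the law requires.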

    Practical, appropriate, empirically-validated guidelines for designing educational games

    There has recently been a great deal of interest in the potential of computer games to function as innovative educational tools. However, there is very little evidence of games fulfilling that potential. Indeed, the process of merging the disparate goals of education and game design appears problematic, and there are currently no practical guidelines for how to do so in a coherent manner. In this paper, we describe the successful, empirically validated teaching methods developed by behavioural psychologists and point out how they are uniquely suited to take advantage of the benefits that games offer to education. We conclude by proposing some practical steps for designing educational games, based on the techniques of Applied Behaviour Analysis. It is intended that this paper can both focus educational game designers on the features of games that are genuinely useful for education, and also introduce a successful form of teaching with which this audience may not yet be familiar.

    Haptic Media Scenes

    The aim of this thesis is to apply new media phenomenological and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in influencing cognitive processes, and does not frame the richness of haptic communication in interaction design: haptic interactivity in HCI has historically been designed and analyzed from a perspective on communication as transmission, the sending and receiving of haptic signals. The haptic sense may mediate not only contact confirmation and affirmation but also rich semiotic and affective messages; yet there is a strong contrast between this inherent ability of haptic perception and current support for such haptic communication interfaces. I therefore ask: how do the haptic senses (touch and proprioception) impact our cognitive faculties when mediated through digital and sensor technologies? How may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings that embrace two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. I use these insights to develop the concept of the haptic figura, an analytical tool for framing the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems with a material solution supporting active haptic perception, and the mediation of semiotic and affective messages that are understood and felt. As such, the concept may function as a design tool for developers, but also for media critics evaluating haptic media.
The tool is used to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and for the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element in personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages, beyond notification and confirmation interactivity.

    Adaptive Gesture Recognition with Variation Estimation for Interactive Systems

    This paper presents a gesture recognition/adaptation system for Human-Computer Interaction applications that goes beyond activity classification and, complementary to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Contrary to standard template-based methods based on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human-machine interaction. The technique is evaluated in several ways: recognition and early recognition are evaluated on 2D on-screen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is not only robust to noise and successfully adapts to parameter variation, but also performs recognition as well as or better than non-adapting offline template-based methods.
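The align-while-adapting idea can be illustrated with a toy Sequential Monte Carlo tracker over a one-dimensional template. This is a deliberately simplified sketch, not the paper's algorithm: the state model (position plus local speed), the noise levels, and the multinomial resampling scheme are all assumptions.

```python
import math
import random

def smc_align(template, observations, n=200, seed=0):
    """Toy Sequential Monte Carlo alignment of a live 1D signal to a
    template. Each particle carries an alignment position and a local
    speed; particles are weighted by how well the template value at
    their position matches the current observation, then resampled.
    Returns the mean estimated alignment position after all observations."""
    rng = random.Random(seed)
    particles = [[0.0, 1.0] for _ in range(n)]  # [position, speed]
    for obs in observations:
        weights = []
        for p in particles:
            p[1] = max(0.1, p[1] + rng.gauss(0, 0.05))   # drift the speed
            p[0] = min(len(template) - 1, p[0] + p[1])   # advance position
            predicted = template[int(p[0])]
            # Gaussian likelihood of the observation given this particle
            weights.append(math.exp(-0.5 * ((obs - predicted) / 0.2) ** 2) + 1e-12)
        # multinomial resampling proportional to the weights
        particles = [list(rng.choices(particles, weights=weights, k=1)[0])
                     for _ in range(n)]
    return sum(p[0] for p in particles) / n
```

Feeding the tracker the first part of the template itself should keep the position estimate near the true playback index; in the full method the particle state would also carry the gesture class and richer variation parameters (e.g. scale), which is what enables simultaneous recognition and adaptation.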

    The Design and Implementation of a Kinect-Based Rehabilitation Exercise Monitoring and Guidance System

    In preventive and rehabilitative healthcare, physical exercise is a powerful intervention. However, a program may require on the order of thousands of practice repetitions, and many people do not adhere to the program or perform their home exercises incorrectly, making the exercise ineffective or even dangerous. This thesis research aims to develop a Kinect-based system for rehabilitation exercise monitoring and guidance. In the first step, a feasibility study was carried out on using the Kinect for real-time monitoring of rehabilitation exercises, with a multi-camera motion tracking system used to establish the ground truth. In the second step, a Unity-based system was developed to provide real-time monitoring and guidance to patients. The Unity framework was chosen because it enables us to use virtual reality techniques to demonstrate detailed movements to the patient, and to facilitate examination of the quality and quantity of the patient's sessions by the clinician. The avatar-based rendering of motion also preserves the privacy of the patients, which is essential for healthcare systems.
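A basic building block of this kind of monitoring is computing a joint angle from three tracked skeleton points and checking it against the prescribed exercise angle. The sketch below assumes such a pipeline; the 10-degree tolerance and the joint naming are illustrative assumptions, not values from the thesis.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist positions from a Kinect skeleton frame."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for safety
    return math.degrees(math.acos(cos_angle))

def within_tolerance(measured, reference, tol=10.0):
    """Flag whether a measured joint angle matches the prescribed
    exercise angle within a clinician-set tolerance (degrees)."""
    return abs(measured - reference) <= tol
```

Per-frame checks like this can drive the real-time guidance (e.g. highlighting an avatar limb when the angle drifts out of tolerance) and be logged to the session database for later review by the clinician.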