9 research outputs found

    Simulation of Human and Artificial Emotion (SHArE)

    The framework for Simulation of Human and Artificial Emotion (SHArE) describes the architecture of emotion in terms of parameters transferable between psychology, neuroscience, and artificial intelligence. These parameters can be defined as abstract concepts or granularized down to the voltage levels of individual neurons. The model enables emotional trajectory design for humans, which may lead to novel therapeutic solutions for various mental health concerns. For artificial intelligence, this work provides a compact notation that can be applied to neural networks as a means to observe the emotions and motivations of machines.

    A Review of Verbal and Non-Verbal Human-Robot Interactive Communication

    This paper presents an overview of human-robot interactive communication, covering both verbal and non-verbal aspects of human-robot interaction. Following a historical introduction and motivation toward fluid human-robot communication, ten desiderata are proposed, which provide an organizational axis for both recent and future research on human-robot communication. The ten desiderata are then examined in detail, culminating in a unifying discussion and a forward-looking conclusion.

    RoboAct: Action Control for a Robotic Actor

    The uncanny valley is a hypothesis in robotics which holds that anthropomorphic robots that look almost, but not quite, human provoke a sense of rejection and unease in people, whereas a robot that is only partially human-like is treated as a machine, producing the opposite reaction and degrading interaction with it. This work addresses the problem of human-robot interaction with humanoid robots by taking an approach grounded in human psychology and applying its results through robot kinematics algorithms, making the robot's gestures more human-like.

    Emotional Interface for the Darwin Mini Robot in the Context of Robotic Theater

    This work presents the design and implementation of action control for a Darwin Mini humanoid robot, enabling it to portray emotions in the context of robotic theater. To this end, the robot's forward kinematics was computed and a modular Java application was developed that separates the graphical interface components from the functionality required by the system; the application communicates with the Darwin Mini over Bluetooth. The robot is controlled by the OpenCM9.04 board, which runs the program that drives its movements. The robot's movements were modeled as a directed graph in which each node represents a possible robot pose, and a routine is generated by traversing a path through this graph. To evaluate the success of the project, Likert-scale tests were conducted with children in the target age range to verify the robot's effectiveness at conveying different emotions through specific actions; the results were satisfactory and positive.
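The directed-graph movement model described in this abstract can be sketched in a few lines: nodes are robot poses, edges are allowed transitions, and a routine is valid only if it is a walk through the graph. The pose names and class structure below are illustrative assumptions, not the thesis's actual code.

```python
# Minimal sketch of a pose graph: nodes are robot poses, directed edges
# are allowed motions, and a routine is a walk along those edges.
# Pose names ("home", "arms_up", "wave") are hypothetical examples.

class MotionGraph:
    def __init__(self):
        self.edges = {}  # pose -> set of poses reachable in one motion

    def add_transition(self, src, dst):
        self.edges.setdefault(src, set()).add(dst)

    def is_valid_routine(self, poses):
        """A routine is valid if every consecutive pose pair is an edge."""
        return all(b in self.edges.get(a, set())
                   for a, b in zip(poses, poses[1:]))

g = MotionGraph()
g.add_transition("home", "arms_up")
g.add_transition("arms_up", "wave")
g.add_transition("wave", "home")

print(g.is_valid_routine(["home", "arms_up", "wave", "home"]))  # True
print(g.is_valid_routine(["home", "wave"]))                     # False
```

Encoding routines this way guarantees the robot only ever moves between poses that were explicitly declared reachable, which keeps gesture sequences physically safe by construction.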

    The development of a human-robot interface for industrial collaborative system

    Industrial robots have been identified as one of the most effective solutions for optimising output and quality within many industries. However, a number of manufacturing applications involve complex tasks and inconstant components that prohibit fully automated solutions for the foreseeable future. Breakthroughs in robotic technologies and changes in safety legislation have supported the creation of robots that coexist with and assist humans in industrial applications. It has been broadly recognised that human-robot collaborative systems would be a realistic solution for an advanced production system with a wide range of applications and high economic impact. This type of system can utilise the best of both worlds: the robot performs simple tasks that require high repeatability, while the human performs tasks that require judgement and the dexterity of the human hands. Robots in such a system operate as "intelligent assistants". In a collaborative working environment, robot and human share the same working area and interact with each other. This level of interface requires effective ways of communicating and collaborating to avoid unwanted conflicts. This project aims to create a user interface for an industrial collaborative robot system through the integration of current robotic technologies. The robotic system is designed for seamless collaboration with a human in close proximity. The system is capable of communicating with the human via the exchange of gestures, as well as visual signals that operators can observe and comprehend at a glance. The main objective of this PhD is to develop a Human-Robot Interface (HRI) for communication with an industrial collaborative robot during collaboration in proximity. The system is developed in conjunction with a small-scale collaborative robot system integrated from off-the-shelf components.
    The system should be capable of receiving input from the human user via an intuitive method as well as indicating its status to the user effectively. The HRI is developed using a combination of hardware integration and software development. The software and the control framework were developed in a way that is applicable to other industrial robots in the future. The developed gesture command system is demonstrated on a heavy-duty industrial robot.

    Cognition, Affects and Interaction

    This volume brings together the study and research projects carried out as part of the course "Cognition, Affects and Interaction", which we ran in the first semester of 2015-2016. This second edition of the course continues the principle inaugurated in 2014: the lectures on the theme "Cognition, Interaction & Affects", which provide the methodological tools for the components of socio-communicative interaction, were coupled with an introduction to social robotics and active learning through research work in pairs. The principle of these projects is to carry out a bibliographic search and write a review article on one aspect of human-robot interaction. While several topics were proposed to the students at the start of the year, some pairs chose to approach interaction from an original angle that often reflects the varied educational backgrounds of cognitive science students (engineering, sociology, psychology, etc.). The result exceeded our expectations: the reader will find a compilation of solidly argued articles, clearly written and carefully presented. These first "publications" reflect the singular capacity for reflection of this cohort, markedly improved over the previous year. We hope that this series of volumes, available on HAL, can serve as an entry point for students or researchers interested in exploring this multidisciplinary field of research.

    Automatic extraction of constraints in manipulation tasks for autonomy and interaction

    Tasks routinely executed by humans involve sequences of actions performed with high dexterity and coordination. Fully specifying these actions so that a robot could replicate the task is often difficult. Furthermore, the uncertainties introduced by the use of different tools or changing configurations demand that the specification be generic while emphasising the important task aspects, i.e. the constraints. The first challenge of this thesis is therefore inferring these constraints from repeated demonstrations. In addition, humans explaining a task to another person rely on that person's ability to apprehend missing or implicit information, so observations contain user-specific cues alongside knowledge of how to perform the task. Our second challenge is thus correlating the task constraints with the user's behavior to improve the robot's performance. We address these challenges using a Programming by Demonstration framework. In the first part of the thesis we describe an approach for decomposing demonstrations into actions and extracting task-space constraints as continuous features that apply throughout each action. The constraints consist of: (1) the reference frame for performing manipulation, (2) the variables of interest relative to this frame, allowing a decomposition into force and position control, and (3) a stiffness gain modulating the contribution of force and position. We then extend this approach to asymmetrical bimanual tasks by extracting features that enable arm coordination: the master-slave role that enables precedence, and the motion-motion or force-motion coordination that facilitates physical interaction through an object. The set of constraints and the time-independent encoding of each action form a task prototype, used to execute the task.
    In the second part of the thesis we focus on discovering additional features implicit in the demonstrations with respect to two aspects of the teaching interactions: (1) characterizing the user's performance and (2) improving the user's behavior. For the first goal, we assess the skill of the user, and implicitly the quality of the demonstrations, using objective task-specific metrics related directly to the constraints. We further analyze ways of making the user aware of the robot's state during teaching by providing task-related feedback, which has a direct influence on both teaching efficiency and the user's perception of the interaction. We evaluated our approaches in robotic experiments encompassing daily activities, using two 7-degree-of-freedom KUKA LWR robotic arms and a 53-degree-of-freedom iCub humanoid robot.
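The constraint triple described in this abstract (reference frame, force/position decomposition, stiffness gain) can be illustrated with a minimal per-axis blend of position and force tracking. The variable names, gains, and the simple linear blend below are illustrative assumptions, not the thesis's actual controller.

```python
import numpy as np

def blended_command(x, x_des, f, f_des, alpha, k_p=1.0, k_f=0.5):
    """Blend position and force control per task-space axis.

    alpha in [0, 1] plays the role of the stiffness-gain modulation:
    alpha = 1 gives pure position control, alpha = 0 pure force control.
    All quantities are expressed in the task's reference frame.
    """
    u_pos = k_p * (x_des - x)    # position-tracking term
    u_force = k_f * (f_des - f)  # force-tracking term
    return alpha * u_pos + (1.0 - alpha) * u_force

# Hypothetical numbers: the z axis is mostly force-controlled (contact),
# while x and y remain position-controlled (free motion).
x     = np.array([0.0, 0.0, 0.10])   # current end-effector position (m)
x_des = np.array([0.0, 0.0, 0.12])
f     = np.array([0.0, 0.0, 2.0])    # measured contact force (N)
f_des = np.array([0.0, 0.0, 5.0])
alpha = np.array([1.0, 1.0, 0.2])
u = blended_command(x, x_des, f, f_des, alpha)
```

Modulating `alpha` per axis is one simple way to realize the abstract's idea that each variable of interest can be governed by position or force control to a varying degree within the same reference frame.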