13 research outputs found

    Physiologically attentive user interface for robot teleoperation: real time emotional state estimation and interface modification using physiology, facial expressions and eye movements

    We developed a framework for Physiologically Attentive User Interfaces to reduce the interaction gap between humans and machines in life-critical robot teleoperation. Our system draws on the emotional-state-awareness capabilities of psychophysiology and classifies three emotional states (Resting, Stress, and Workload) by analysing physiological data together with facial expressions and eye movements. The estimated emotional state is then used to drive a dynamic interface that updates in real time according to the user's emotional state. The results of a preliminary evaluation of the developed emotional state classifier for robot teleoperation are presented, and its future possibilities are discussed.

    Physiologically attentive user interface for improved robot teleoperation

    User interfaces (UIs) are shifting from being attention-hungry to being attentive to users' needs during interaction. Interfaces developed for robot teleoperation can be particularly complex, often displaying large amounts of information, which can increase the cognitive overload that impairs the operator's performance. This paper presents the development of a Physiologically Attentive User Interface (PAUI) prototype, preliminarily evaluated with six participants. A case study on Urban Search and Rescue (USAR) operations that teleoperate a robot was used, although the proposed approach aims to be generic. The robot considered provides an overly complex Graphical User Interface (GUI) and does not allow access to its source code. This represents a recurring and challenging scenario in which robots are still in use but technical updates are no longer offered, which usually leads to their abandonment. A major contribution of the approach is the possibility of recycling old systems while improving the UI made available to end users, taking their physiological data as input. The proposed PAUI analyses physiological data, facial expressions, and eye movements to classify three mental states (rest, workload, and stress). An Attentive User Interface (AUI) is then assembled by recycling a pre-existing GUI, which is dynamically modified according to the predicted mental state to improve the user's focus during mentally demanding situations. In addition to the novelty of PAUIs that take advantage of pre-existing GUIs, this work also contributes the design of a user experiment comprising mental state induction tasks that successfully trigger high and low cognitive overload states. Results from the preliminary user evaluation revealed a tendency towards improvement in the usefulness and ease of use of the PAUI, although without statistical significance due to the small number of participants.
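    The state-estimation-to-interface-adaptation loop described in the two abstracts above can be sketched as follows. This is a minimal illustration only: the feature names (heart rate, electrodermal activity, blink rate), the per-state centroids, the nearest-centroid rule, and the UI adjustments are all assumptions for the sketch, not the authors' actual features, classifier, or interface logic.

    ```python
    # Illustrative sketch: classify rest / workload / stress from a small
    # physiological feature vector, then pick a GUI adaptation.
    # Feature set, centroids, and adaptations are hypothetical.
    from math import dist

    # Hypothetical per-state centroids learned from training data:
    # (mean heart rate in bpm, electrodermal activity in uS, blinks/min)
    CENTROIDS = {
        "rest":     (65.0, 2.0, 15.0),
        "workload": (80.0, 4.5, 10.0),
        "stress":   (95.0, 8.0, 22.0),
    }

    def classify(features):
        """Return the mental state whose centroid is closest to the features."""
        return min(CENTROIDS, key=lambda s: dist(features, CENTROIDS[s]))

    def adapt_ui(state):
        """Map a predicted state to a hypothetical interface adjustment."""
        return {
            "rest": "show full GUI",
            "workload": "hide secondary panels",
            "stress": "highlight critical controls only",
        }[state]

    sample = (92.0, 7.5, 20.0)   # a high-arousal reading
    state = classify(sample)
    print(state, "->", adapt_ui(state))   # stress -> highlight critical controls only
    ```

    A real system would replace the centroid rule with a classifier trained on labelled multimodal recordings, but the closed loop (sense, classify, adapt) has this shape.
    
    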

    User emotional interaction processor: a tool to support the development of GUIs through physiological user monitoring

    Ever since computers entered humans' daily lives, activity between the human and digital ecosystems has increased. This increase encourages the development of smarter and more user-friendly human-computer interfaces. However, the means of testing these interfaces have been limited, for the most part restricted to the conventional "manual" interface, which requires physical input: participants use a keyboard, mouse, or touch screen, and must communicate with the designers. Another method, applied in this dissertation and known as Affective Computing, requires no physical input from participants. This dissertation presents the development of a tool to support the development of graphical interfaces based on monitoring psychological and physiological aspects of the user (emotions and attention), aiming to improve the end user's experience, with the ultimate goal of improving interface design. The development of this tool is described. The results, provided by designers from an IT company, suggest that the tool is useful but that the optimized interface it generates still has some flaws, mainly related to the lack of consideration of a general context in the interface generation process.

    Graphical user interface redefinition addressing users’ diversity

    Improvements can still be made in the development of Interactive Computing Systems (ICSs) to ease their use. This is particularly true when trying to address users' diversity. Most ICSs neither adjust themselves to the user nor consider the user's particularities. Some, however, provide solutions that better address the specificities of expert and novice users; others adjust themselves based on the user's interaction history, but this does not always improve usability. One aspect that prevents addressing users' diversity broadly is that most existing ICSs do not provide source code access, which means that only their owners can improve them. This paper proposes an approach (based on both affective computing and computer vision) to broadly improve design for diversity, without source code access, for both existing and yet-to-be-developed ICSs. The results are twofold: (i) an example of an initial set of design guidelines; and (ii) a path towards runtime Graphical User Interface (GUI) redefinition and adjustment based on both the user's features and emotions, thereby reducing designers' restrictions when addressing users' diversity.

    Towards graphical user interface redefinition without source code access: System design and evaluation

    Nowadays, several interactive computing systems (ICSs) still have Graphical User Interfaces (GUIs) that are inadequate in terms of usability and user experience. Numerous improvements have been made in the development of better GUIs; however, little has been done to improve existing ones. This might be explained by the fact that most ICSs do not provide source code access, which in most cases means that only people with such access can (easily) enhance the respective GUI. This paper presents a tool that uses computer vision (CV) algorithms to semi-automatically redefine existing GUIs without accessing their source code. The evaluation of a new GUI obtained by redefining an existing GUI with the tool is described. Results show statistically significant improvements in usability (a reduction of interaction mistakes), an improved task completion success rate, and improved user satisfaction.
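    The core computer-vision idea behind source-code-free GUI redefinition, as described in the two abstracts above, can be sketched with a toy example: locate a known widget's bitmap inside a screenshot, then forward interactions from the redefined GUI to the widget's original on-screen position. Exact template matching on tiny 2-D arrays stands in here for the tool's real CV algorithms, which the abstracts do not detail; the data and function names are invented for illustration.

    ```python
    # Sketch: exact template matching to find a widget in a "screenshot"
    # (a 2-D grid of pixel values), plus click remapping from a redefined
    # GUI back to the original GUI's coordinates.

    def find_widget(screen, template):
        """Return (row, col) of template's top-left corner in screen, or None."""
        th, tw = len(template), len(template[0])
        for r in range(len(screen) - th + 1):
            for c in range(len(screen[0]) - tw + 1):
                if all(screen[r + i][c + j] == template[i][j]
                       for i in range(th) for j in range(tw)):
                    return (r, c)
        return None

    def remap_click(new_pos, widget_origin):
        """Translate a click in the redefined GUI to original-GUI coordinates."""
        return (widget_origin[0] + new_pos[0], widget_origin[1] + new_pos[1])

    screen = [
        [0, 0, 0, 0, 0],
        [0, 1, 2, 0, 0],
        [0, 3, 4, 0, 0],
        [0, 0, 0, 0, 0],
    ]
    button = [[1, 2],
              [3, 4]]

    origin = find_widget(screen, button)
    print(origin)                        # (1, 1): the button's location
    print(remap_click((0, 1), origin))   # (1, 2): the click lands inside the button
    ```

    A production tool would use robust matching (e.g. normalized cross-correlation, as in OpenCV's template matching) rather than exact equality, so that anti-aliasing and theme changes do not break widget detection.
    
    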

    How Can Physiological Computing Benefit Human-Robot Interaction?

    As systems grow more automated, the human operator is all too often overlooked. Although human-robot interaction (HRI) can be quite demanding in terms of cognitive resources, the mental states (MS) of operators are not yet taken into account by existing systems. As humans are not providential agents, this gap can lead to hazardous situations. The growing number of neurophysiology and machine learning tools now allows for efficient monitoring of operators' MS, so sending feedback on MS in a closed-loop solution is at hand. Involving a consistent automated planning technique to handle such a process could be a significant asset. This perspective article provides the reader with a synthesis of the significant literature, with a view to implementing systems that adapt to the operator's MS to improve the safety and performance of human-robot operations. First, the need for this approach is detailed with regard to remote operation, an example of HRI. Then, several MS identified as crucial for this type of HRI are defined, along with relevant electrophysiological markers. A focus is placed on prime degraded MS linked to time-on-task and task demands, as well as collateral MS linked to system outputs (i.e. feedback and alarms). Lastly, the principle of symbiotic HRI is detailed, and one solution is proposed for including the operator state vector in the system, using a mixed-initiative decisional framework to drive such an interaction.

    Towards Multi-UAV and Human Interaction Driving System Exploiting Human Mental State Estimation

    This paper addresses the growing issue of human-multi-UAV interaction. Current active approaches towards a reliable multi-UAV system are reviewed, leading to the conclusion that the control paradigm for multiple Unmanned Aerial Vehicles (UAVs) is segmented into two main scopes: i) autonomous control and coordination within the group of UAVs, and ii) a human-centered approach with helping agents and overt behavior monitoring. Therefore, to advance the human-multi-UAV interaction problem, a new perspective is put forth. The following sections provide a brief understanding of the system, followed by the current state of multi-UAV research and how taking the human pilot's physiology into account could improve the interaction. This idea is developed first by detailing what physiological computing is, including the mental states of interest and their associated physiological markers. The article then concludes with the proposed approach for human-multi-UAV interaction control and future plans.

    Humanoid Robots

    For many years, humans have tried, in every way, to recreate the complex mechanisms that make up the human body. This task is extremely complicated and the results are not entirely satisfactory. However, with increasing technological advances grounded in theoretical and experimental research, man has managed, to some extent, to copy or imitate some systems of the human body. This research is intended not only to create humanoid robots, a great part of them autonomous systems, but also to offer deeper knowledge of the systems that form the human body, with possible applications in rehabilitation technology, bringing together studies related not only to Robotics but also to Biomechanics, Biomimetics, Cybernetics, and other areas. This book presents a series of studies inspired by this ideal, carried out by researchers worldwide, that analyse and discuss diverse subjects related to humanoid robots. The contributions explore aspects of robotic hands, learning, language, vision, and locomotion.

    Proceedings of the 5th international conference on disability, virtual reality and associated technologies (ICDVRAT 2004)

    The proceedings of the conference.