    Progressive co-adaptation in human-machine interaction

    In this paper we discuss the concept of co-adaptation between a human operator and a machine interface, and we summarize its application in two domains: teleoperation and assistive technology. The analysis of the literature reveals that only in a few cases has the possibility of a temporal evolution of the co-adaptation parameters been considered. In particular, the role of time-related indexes that capture changes in the motor and cognitive abilities of the human operator has been overlooked. We argue that for a more effective long-term co-adaptation process, the interface should be able to predict and adjust its parameters according to the evolution of human skills and performance. We thus propose a novel approach termed progressive co-adaptation, whereby human performance is continuously monitored and the system makes inferences about changes in the users' cognitive and motor skills. We illustrate the features of progressive co-adaptation in two possible applications: robotic telemanipulation and active vision for the visually impaired.
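    The monitoring-and-inference loop described in the abstract could be sketched as follows. This is a minimal illustration under assumed semantics, not the authors' method: the class name, the sliding-window heuristic, and the threshold are all hypothetical.

```python
from collections import deque

class ProgressiveCoAdapter:
    """Illustrative sketch of progressive co-adaptation (hypothetical
    names and thresholds): monitor operator performance over time and
    adjust an interface parameter when a sustained skill change is seen."""

    def __init__(self, window=20, threshold=0.1, gain=1.0):
        self.history = deque(maxlen=window)  # recent performance samples
        self.baseline = None                 # reference skill level
        self.threshold = threshold           # relative change triggering adaptation
        self.gain = gain                     # interface parameter being co-adapted

    def observe(self, score):
        """Record one performance sample (e.g. task completion accuracy)."""
        self.history.append(score)
        if len(self.history) == self.history.maxlen:
            current = sum(self.history) / len(self.history)
            if self.baseline is None:
                self.baseline = current
            elif abs(current - self.baseline) / self.baseline > self.threshold:
                # Skill drifted: scale assistance inversely to measured skill.
                self.gain *= self.baseline / current
                self.baseline = current
        return self.gain
```

    In this sketch, a sustained rise in operator skill reduces the assistance gain, and a decline raises it again, capturing the idea of an interface that tracks the temporal evolution of the user's abilities.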

    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles of human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

    Enaction-Based Artificial Intelligence: Toward Coevolution with Humans in the Loop

    This article deals with the links between the enaction paradigm and artificial intelligence. Enaction is considered a metaphor for artificial intelligence, as a number of the notions which it deals with are deemed incompatible with the phenomenal field of the virtual. After explaining this stance, we shall review previous works regarding this issue in terms of artificial life and robotics. We shall focus on the lack of recognition of co-evolution at the heart of these approaches. We propose to explicitly integrate the evolution of the environment into our approach in order to refine the ontogenesis of the artificial system, and to compare it with the enaction paradigm. The growing complexity of the ontogenetic mechanisms to be activated can therefore be compensated by an interactive guidance system emanating from the environment. This proposition does not, however, resolve the question of the relevance of the meaning created by the machine (sense-making). Such reflections lead us to integrate human interaction into this environment in order to construct relevant meaning in terms of participative artificial intelligence. This raises a number of questions with regard to setting up an enactive interaction. The article concludes by exploring a number of issues, thereby enabling us to associate current approaches with the principles of morphogenesis, guidance, the phenomenology of interactions, and the use of minimal enactive interfaces in setting up experiments which will deal with the problem of artificial intelligence in a variety of enaction-based ways.

    Construals as a complement to intelligent tutoring systems in medical education

    This is a preliminary version of a report prepared by Meurig and Will Beynon in conjunction with a poster paper, "Mediating Intelligence through Observation, Dependency and Agency in Making Construals of Malaria", at the 11th International Conference on Intelligent Tutoring Systems (ITS 2012) and a paper, "Construals to Support Exploratory and Collaborative Learning in Medicine", at the associated workshop on Intelligent Support for Exploratory Environments (ISEE 2012). A final version of the report will be published at a later stage, after feedback from presentations at these events has been taken into account and the experimental versions of the JS-EDEN interpreter used in making construals have been developed to a more mature and stable form.

    Progressive Analytics: A Computation Paradigm for Exploratory Data Analysis

    Exploring data requires a fast feedback loop from the analyst to the system, with a latency below about 10 seconds because of human cognitive limitations. When data becomes large or analysis becomes complex, sequential computations can no longer be completed in a few seconds and data exploration is severely hampered. This article describes a novel computation paradigm called Progressive Computation for Data Analysis or, more concisely, Progressive Analytics, which brings a low-latency guarantee to the programming-language level by performing computations in a progressive fashion. Moving this progressive computation to the language level relieves the programmer of exploratory data analysis systems from implementing the whole analytics pipeline in a progressive way from scratch, streamlining the implementation of scalable exploratory data analysis systems. This article describes the new paradigm through a prototype implementation called ProgressiVis, and explains the requirements it implies through examples.
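    The core idea of a latency bound enforced by progressive execution can be sketched in a few lines. This is a toy illustration of the paradigm's assumed semantics, not the ProgressiVis API: the function name and the time-quantum scheme are hypothetical.

```python
import time

def progressive_mean(data, quantum=0.01):
    """Sketch of progressive computation (assumed semantics, not the
    ProgressiVis API): process the input in slices bounded by a time
    quantum and yield a refined partial result after each slice."""
    total, count, i = 0.0, 0, 0
    while i < len(data):
        deadline = time.monotonic() + quantum
        # Consume as many items as the time quantum allows.
        while i < len(data) and time.monotonic() < deadline:
            total += data[i]
            count += 1
            i += 1
        yield total / count  # partial estimate, refined at each step
```

    Each yielded value arrives within roughly one quantum, so an exploratory front end can render an intermediate estimate immediately and refine it as the computation proceeds.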

    Adaptivity through alternate freeing and freezing of degrees of freedom

    Starting with fewer degrees of freedom has been shown to enable a more efficient exploration of the sensorimotor space. While not necessarily leading to optimal task performance, it results in a smaller number of directions of stability, which guide the coordination of additional degrees of freedom. The developmental release of additional degrees of freedom is then expected to allow for optimal task performance and more tolerance and adaptation to environmental interaction. In this paper, we test this assumption with a small-sized humanoid robot that learns to swing under environmental perturbations. Our experiments show that a progressive release of degrees of freedom alone is not sufficient to cope with environmental perturbations. Instead, alternate freezing and freeing of the degrees of freedom is required. This finding is consistent with observations made during transitional periods in the acquisition of skills in infants.

    Portable navigation system with adaptive multimodal interface for the blind

    Recent advances in mobile technology have the potential to radically change the quality of tools available for people with sensory impairments, in particular the blind. Nowadays almost every smartphone and tablet is equipped with high-resolution cameras, which are typically used for photos and videos, communication purposes, games, and virtual reality applications. Very little has been proposed, however, to exploit these sensors for user localisation and navigation. To this end, the “Active Vision with Human-in-the-Loop for the Visually Impaired” (ActiVis) project aims to develop a novel electronic travel aid to tackle the “last 10 yards problem” and enable the autonomous navigation of blind users in unknown environments, ultimately enhancing or replacing existing solutions such as guide dogs and white canes. This paper describes some of the project’s key challenges, in particular with respect to the design of the user interface that translates visual information from the camera into guiding instructions for the blind person, taking into account limitations due to the visual impairment and proposing a multimodal interface that embeds human-machine co-adaptation.