5 research outputs found

    An evaluation of an adaptive learning system based on multimodal affect recognition for learners with intellectual disabilities

    Artificial intelligence tools for education (AIEd) have been used to automate the provision of learning support to mainstream learners. One of the most innovative approaches in this field is the use of data and machine learning to detect a student's affective state and move them out of negative states that inhibit learning into positive states such as engagement. Despite their obvious potential to provide the personalisation that would give extra support to learners with intellectual disabilities, little work on AIEd systems that utilise affect recognition currently addresses this group. Our system used multimodal sensor data and machine learning to, first, identify three affective states linked to learning (engagement, frustration and boredom) and, second, determine the presentation of learning content so that the learner is maintained in an optimal affective state and the rate of learning is maximised. To evaluate this adaptive learning system, 67 participants aged between 6 and 18 years, each acting as their own control, took part in a series of sessions using the system. Sessions alternated between using both affect detection and learning achievement to drive the selection of learning content (intervention) and using learning achievement alone (control). Lack of boredom was the state most strongly linked to achievement, with both frustration and engagement also positively related to achievement. There was significantly more engagement and less boredom in intervention sessions than in control sessions, but no significant difference in achievement. These results suggest that engagement does increase when activities are tailored to the personal needs and emotional state of the learner, and that the system was promoting affective states that in turn promote learning. However, longer exposure is necessary to determine the effect on learning.
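    As an illustration only (not the authors' implementation), the minimal Python sketch below shows how a detected affective state and recent achievement could jointly drive the selection of learning content in the intervention condition; the affect labels follow the abstract, but the function names, thresholds and difficulty levels are assumptions.

    from dataclasses import dataclass
    from typing import Literal

    Affect = Literal["engagement", "frustration", "boredom"]

    @dataclass
    class LearnerState:
        affect: Affect          # output of a multimodal affect classifier
        recent_accuracy: float  # proportion of recent items answered correctly (0..1)

    def select_next_level(state: LearnerState, current_level: int) -> int:
        """Pick the difficulty of the next item from affect plus achievement.

        A control condition would use recent_accuracy alone.
        """
        if state.affect == "frustration" or state.recent_accuracy < 0.4:
            return max(1, current_level - 1)   # step down to relieve frustration
        if state.affect == "boredom" and state.recent_accuracy > 0.8:
            return current_level + 1           # step up to re-engage the learner
        return current_level                   # engaged and succeeding: stay put

    # Example: a bored learner answering most items correctly is moved up a level.
    print(select_next_level(LearnerState("boredom", 0.9), current_level=3))  # -> 4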

    Industry Led Use-Case Development for Human-Swarm Operations

    In the domain of unmanned vehicles, autonomous robotic swarms promise to deliver increased efficiency and collective autonomy. How these swarms will operate in the future, and what communication requirements and operational boundaries will arise, are yet to be sufficiently defined. A workshop was conducted with 11 professional unmanned-vehicle operators and designers with the objective of identifying use cases for developing and testing robotic swarms. Three scenarios were defined by the experts and then compiled into a single use case outlining the scenario, objectives, agents, communication requirements and stages of operation involved when collaborating with highly autonomous swarms. Our compiled use case is intended for researchers, designers and manufacturers alike to test and tailor their design pipelines to accommodate some of the key issues in human-swarm interaction. Examples of application include informing simulation development, forming the basis of further design workshops, and identifying trust issues that may arise between human operators and the swarm.
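    As a purely illustrative sketch, the compiled use case's structure (scenario, objectives, agents, communication requirements and stages of operation) could be captured in a simple record such as the Python dataclass below; the field names and example values are assumptions, not content taken from the workshop.

    from dataclasses import dataclass

    @dataclass
    class SwarmUseCase:
        scenario: str                          # narrative setting for the operation
        objectives: list[str]                  # what the human-swarm team must achieve
        agents: list[str]                      # operator roles and vehicle types involved
        communication_requirements: list[str]  # information exchanged between operator and swarm
        stages_of_operation: list[str]         # ordered phases of the mission

    # Hypothetical example instance for exercising a design pipeline.
    example = SwarmUseCase(
        scenario="Autonomous UAV swarm surveying a large outdoor area",
        objectives=["Cover the area efficiently", "Keep the operator aware of swarm status"],
        agents=["Human operator", "UAV swarm"],
        communication_requirements=["Swarm status summaries", "Operator tasking commands"],
        stages_of_operation=["Planning", "Deployment", "Monitoring", "Recovery"],
    )
    print(example.scenario)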

    37th International Symposium on Intensive Care and Emergency Medicine (part 3 of 3)


    Fragile X Syndrome: The GABAergic System and Circuit Dysfunction
