    Brain-coupled Interaction for Semi-autonomous Navigation of an Assistive Robot

    This paper presents a novel semi-autonomous navigation strategy designed for low-throughput interfaces. A mobile robot (e.g. an intelligent wheelchair) proposes the most probable action, as inferred from the environment, to a human user, who can either accept or reject the proposition. If the user refuses, the robot proposes another action, until both entities agree on what needs to be done. In an unknown environment, the robotic system first extracts features in order to recognize places of interest where a human-robot interaction should take place (e.g. crossings). Based on the local topology, relevant actions are then proposed, with the user answering by means of a button or a brain-computer interface (BCI). Our navigation strategy is successfully tested both in simulation and with a real robot, and a feasibility study confirms the potential of a BCI as such an interface.
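    The propose/accept strategy described above amounts to a simple confirmation loop. The sketch below illustrates it in Python; the robot and interface objects and every method name are hypothetical, invented for illustration rather than taken from the paper.

        # Sketch of the propose/accept interaction loop. All objects and
        # method names here are hypothetical, not the paper's API.
        def interaction_loop(robot, interface):
            while robot.is_navigating():
                place = robot.detect_place_of_interest()  # e.g. a crossing
                if place is None:
                    continue  # drive autonomously between decision points
                # Rank candidate actions by probability given the local topology.
                candidates = sorted(place.actions,
                                    key=lambda a: a.probability, reverse=True)
                for action in candidates:
                    # Propose the most probable remaining action; the user
                    # accepts or rejects via a button or BCI.
                    if interface.ask(action) == "accept":
                        robot.execute(action)
                        break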

    Overcoming barriers and increasing independence: service robots for elderly and disabled people

    This paper discusses the potential for service robots to overcome barriers and increase the independence of elderly and disabled people. It gives a brief overview of existing uses of service robots by disabled and elderly people and of advances in technology that will make new uses possible, and it suggests some of these new applications. The paper also considers the design and other conditions to be met for user acceptance, discusses the complementarity of assistive service robots and personal assistance, and considers the types of applications and users for which service robots are and are not suitable.

    Explainable shared control in assistive robotics

    Shared control plays a pivotal role in designing assistive robots that complement human capabilities during everyday tasks. However, traditional shared control relies on users forming an accurate mental model of expected robot behaviour. Without this accurate mental image, users may encounter confusion or frustration whenever their actions do not elicit the intended system response, creating a misalignment between the respective internal models of the robot and the human. The Explainable Shared Control paradigm introduced in this thesis attempts to resolve such model misalignment by jointly considering assistance and transparency. There are two perspectives on transparency in Explainable Shared Control: the human's and the robot's. Augmented reality is presented as an integral component that addresses the human viewpoint by visually unveiling the robot's internal mechanisms. The robot perspective requires an awareness of human "intent", so a clustering framework built around a deep generative model is developed for human intention inference. Both transparency constructs are implemented atop a real assistive robotic wheelchair and tested with human users. An augmented reality headset is incorporated into the robotic wheelchair, and different interface options are evaluated across two user studies to explore their influence on mental model accuracy. Experimental results indicate that this setup facilitates transparent assistance by improving recovery times from adverse events associated with model misalignment. As for human intention inference, the clustering framework is applied to a dataset collected from users operating the robotic wheelchair. Findings from this experiment demonstrate that the learnt clusters are interpretable and meaningful representations of human intent. This thesis serves as a first step in the interdisciplinary area of Explainable Shared Control. The contributions to shared control, augmented reality and representation learning contained within this thesis are likely to help future research advance the proposed paradigm, and thus bolster the prevalence of assistive robots.
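    As a rough illustration of intent clustering, the sketch below groups windowed control trajectories with a Gaussian mixture. This is a stand-in, not the thesis's deep generative framework: a real implementation would first learn a latent embedding and cluster there, and the data here is synthetic.

        # Minimal stand-in for intent clustering (synthetic data; the
        # thesis uses a deep generative model rather than a plain GMM).
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Pretend each row is a feature vector from a windowed joystick
        # trajectory recorded on the wheelchair.
        trajectories = rng.normal(size=(500, 16))

        gmm = GaussianMixture(n_components=4, random_state=0)
        gmm.fit(trajectories)
        intent_labels = gmm.predict(trajectories)            # hard assignments
        intent_posteriors = gmm.predict_proba(trajectories)  # soft assignments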

    A reconfigurable wheelchair for mobility and rehabilitation: Design and development

    This paper presents the design and development of a prototype reconfigurable wheelchair for rehabilitation and self-assistance, sized for a seven-year-old child (average weight 35 kg). Although the prototype is sized for a child at this stage, it can be rescaled, after accounting for variations in weight and size, to fit an older adult. The prototype has a mechanism that enables the user to transition from a sit-to-stand (STS) posture and vice versa. The wheelchair also lets the user adjust the posture of the upper body via an adjustable back support driven by two linear actuators. This configuration allows the wheelchair to serve both as a mobility device and for rehabilitation, without the need for external support. The availability of the STS and back-adjustment mechanisms allows the user to exercise regularly, which enhances blood circulation, since sitting for long periods aggravates lower-limb disability. The proposed configuration will help enhance the functional capabilities of end users, allowing for increased independence and, ultimately, quality of life.
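    To make the back-support adjustment concrete, the sketch below synchronizes two linear actuators through a shared ramp of set-points. The stroke length, step size, and actuator interface are all assumptions for illustration; the paper does not publish control code.

        # Illustrative only: synchronized ramp for the two back-support
        # linear actuators (all numbers and interfaces are assumptions).
        def ramp(start_mm, end_mm, step_mm):
            """Yield set-points from start_mm towards end_mm."""
            sign = 1 if end_mm >= start_mm else -1
            pos = start_mm
            while (end_mm - pos) * sign > 0:
                pos += sign * step_mm
                yield min(pos, end_mm) if sign > 0 else max(pos, end_mm)

        def adjust_back_support(left, right, target_mm, step_mm=2):
            # Drive both actuators through identical set-points so the
            # back support moves without twisting.
            for setpoint in ramp(0, target_mm, step_mm):
                left.move_to(setpoint)
                right.move_to(setpoint)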

    Haptic Interaction with a Guide Robot in Zero Visibility

    Search and rescue operations are often undertaken in dark and noisy environments in which the rescue team must rely on haptic feedback for exploration and safe exit. However, little attention has been paid specifically to haptic sensitivity in such contexts, or to the possibility of enhancing communicational proficiency in the haptic mode as a life-preserving measure. The potential of robot swarms for search and rescue was shown by the Guardians project (EU, 2006-2010); however, the project also revealed the problem of human-robot interaction in smoky (zero-visibility) and noisy conditions. The REINS project (UK, 2011-2015) focused on human-robot interaction in such conditions. This research is a body of work (done as part of the REINS project) that investigates the haptic interaction of a person with a guide robot in zero visibility. The thesis first reflects upon real-world scenarios where people make use of the haptic sense to interact in zero visibility (such as interaction among firefighters and the symbiotic relationship between visually impaired people and guide dogs). In addition, it reflects on the sensitivity and trainability of the haptic sense for use in the interaction. The thesis presents an analysis and evaluation of the design of a physical interface (designed by the consortium of the REINS project) connecting the human and the robotic guide in poor visibility conditions. Finally, it lays a foundation for the design of test cases to evaluate human-robot haptic interaction, taking into consideration the two aspects of the interaction, namely locomotion guidance and environmental exploration.

    Making the most of context-awareness in brain-computer interfaces

    For brain-computer interfaces (BCIs) to be used reliably over extended periods of time, they must be able to adapt to the user's evolving needs. This adaptation should be a function not only of the environmental (external) context but also of the internal context, such as cognitive states and brain-signal reliability. In this work, we propose three different shared-control frameworks that have been used for BCI applications: contextual fusion, contextual gating, and contextual regulation. We review recently published results in the light of these three context-awareness frameworks. We then discuss important issues to consider when designing a shared controller for BCIs.
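    A toy sketch of how the three frameworks differ, using a 1-D steering command in [-1, 1]; the context variables and weighting rules below are invented for illustration and are not taken from the paper.

        # Toy contrast of the three context-aware shared-control schemes.
        # Context terms and weights below are invented, not the paper's.

        def contextual_fusion(user_cmd, robot_cmd, bci_reliability):
            # Fusion: blend both commands, trusting the BCI-decoded
            # command in proportion to its current reliability.
            return bci_reliability * user_cmd + (1 - bci_reliability) * robot_cmd

        def contextual_gating(user_cmd, robot_cmd, near_obstacle):
            # Gating: context selects outright which command is executed.
            return robot_cmd if near_obstacle else user_cmd

        def contextual_regulation(user_cmd, fatigue):
            # Regulation: context modulates a controller parameter, here
            # the gain on the user's command as fatigue grows.
            gain = max(0.2, 1.0 - fatigue)
            return gain * user_cmd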