    Overcoming barriers and increasing independence: service robots for elderly and disabled people

    This paper discusses the potential for service robots to overcome barriers and increase the independence of elderly and disabled people. It gives a brief overview of existing uses of service robots by disabled and elderly people and of technological advances that will make new uses possible, and it suggests some of these new applications. The paper also considers the design and other conditions to be met for user acceptance, discusses the complementarity of assistive service robots and personal assistance, and considers the types of applications and users for which service robots are and are not suitable.

    The Promotoer: a successful story of translational research in BCI for motor rehabilitation

    Several groups have recently demonstrated in the context of randomized controlled trials (RCTs) how sensorimotor Brain-Computer Interface (BCI) systems can be beneficial for post-stroke motor recovery. Following a successful RCT, a further translational effort was made at Fondazione Santa Lucia (FSL) with the implementation of the Promotoer, an all-in-one BCI-supported motor imagery (MI) training station. To date, 25 patients have undergone training with the Promotoer during their admission for rehabilitation (as an add-on to standard therapy). Two illustrative cases are presented. Though currently limited to FSL, the Promotoer represents a successful story of translational research in BCI for stroke rehabilitation. Results are promising both in terms of the feasibility of BCI training within a real rehabilitation program and in terms of the clinical and neurophysiological benefits observed in the patients.

    A Low-Cost Tele-Presence Wheelchair System

    This paper presents the architecture and implementation of a tele-presence wheelchair system based on tele-presence robot, intelligent wheelchair, and touch screen technologies. The system consists of a commercial electric wheelchair, an add-on tele-presence interaction module, and a touchable live-video-image-based user interface (called TIUI). The tele-presence interaction module provides video chatting between an elderly or disabled person and family members or caregivers, and also captures live video of the environment for tele-operation and semi-autonomous navigation. The user interface developed in our lab allows an operator to access the system from anywhere and to directly touch the live video image of the wheelchair to push it, as if pushing it in person. The paper also discusses the evaluation of the user experience.
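
    As a rough illustration of the touch-to-drive idea described above, the sketch below maps a touch point on the live video image to linear and angular velocity commands for the wheelchair. This is a minimal Python sketch, not the authors' TIUI implementation; the mapping, gain values, and command structure are illustrative assumptions.

```python
# Hedged sketch (not the authors' TIUI code): one plausible way to turn a touch
# on the live video image into wheelchair velocity commands. Gains, the command
# format, and the mapping itself are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # forward speed, m/s (positive = forward)
    angular: float  # turn rate, rad/s (positive = left)

def touch_to_velocity(x: float, y: float, frame_w: int, frame_h: int,
                      max_linear: float = 0.5, max_angular: float = 1.0) -> VelocityCommand:
    """Interpret a touch at pixel (x, y) of the live image as a push direction.

    Touching above the image centre drives forward, below drives backward;
    horizontal offset from the centre steers left or right.
    """
    # Normalise the touch point to [-1, 1] relative to the image centre.
    dx = (x - frame_w / 2) / (frame_w / 2)
    dy = (frame_h / 2 - y) / (frame_h / 2)  # screen y grows downward
    return VelocityCommand(linear=max_linear * dy, angular=-max_angular * dx)

if __name__ == "__main__":
    # Touch in the upper-left quadrant of a 640x480 frame: drive forward, turn left.
    cmd = touch_to_velocity(160, 120, 640, 480)
    print(f"linear={cmd.linear:.2f} m/s, angular={cmd.angular:.2f} rad/s")
```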

    Future bathroom: A study of user-centred design principles affecting usability, safety and satisfaction in bathrooms for people living with disabilities

    Research and development work relating to assistive technology 2010-11 (Department of Health), presented to Parliament pursuant to Section 22 of the Chronically Sick and Disabled Persons Act 1970.

    A mosaic of eyes

    Autonomous navigation is a traditional research topic in intelligent robotics and vehicles, which requires a robot to perceive its environment through onboard sensors such as cameras or laser scanners, to enable it to drive to its goal. Most research to date has focused on the development of a large and smart brain to gain autonomous capability for robots. There are three fundamental questions to be answered by an autonomous mobile robot: 1) Where am I going? 2) Where am I? and 3) How do I get there? To answer these basic questions, a robot requires a massive spatial memory and considerable computational resources to accomplish perception, localization, path planning, and control. It is not yet possible to deliver the centralized intelligence required for our real-life applications, such as autonomous ground vehicles and wheelchairs in care centers. In fact, most autonomous robots try to mimic how humans navigate, interpreting images taken by cameras and then taking decisions accordingly. They may encounter the following difficulties
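
    For readers unfamiliar with the structure the abstract refers to, the following minimal sketch shows the conventional perceive/localize/plan/control cycle behind the three questions. Every class and function name is a placeholder assumed for illustration, not code from the paper.

```python
# Minimal sketch of the classic perceive -> localize -> plan -> act cycle behind
# the three questions in the abstract. All classes and functions are placeholders
# assumed for illustration, not code from the paper.

class Lidar:
    def scan(self):
        return []               # obstacle list from a laser scanner (empty stub)

class Odometry:
    def integrate(self):
        return (0.0, 0.0, 0.0)  # estimated pose (x, y, heading)

def perceive(sensor):
    """Build a world model from onboard sensing (cameras, laser scanners)."""
    return {"obstacles": sensor.scan()}

def localize(odometry):
    """'Where am I?' -- estimate the robot pose."""
    return odometry.integrate()

def plan(pose, goal, world_model):
    """'How do I get there?' -- compute a path; here, head straight for the goal."""
    return [goal]

def control(pose, path):
    """Turn the next waypoint into a motion command."""
    return {"from": pose, "towards": path[0]}

def navigation_step(sensor, odometry, goal):
    world_model = perceive(sensor)
    pose = localize(odometry)
    path = plan(pose, goal, world_model)
    return control(pose, path)

print(navigation_step(Lidar(), Odometry(), goal=(5.0, 2.0)))
```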

    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of human-computer interaction (HCI) principles to improve BCI usability, and the development of novel BCI technology including better EEG devices.
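
    One of the directions highlighted above, the use of confidence measures to make BCI control more reliable, can be illustrated with a small sketch: a decoded BCI command is forwarded to the assistive device only when its confidence exceeds a threshold, and a secondary assistive-technology input is used otherwise. The function, threshold, and channel names below are assumptions for illustration, not material from the review.

```python
# Hedged sketch of confidence-gated hybrid control: prefer the BCI command when
# the decoder is confident, otherwise fall back to a secondary AT input channel
# (e.g. a switch or eye tracker). Threshold and names are illustrative assumptions.

from typing import Optional

def fuse_commands(bci_command: str, bci_confidence: float,
                  fallback_command: Optional[str],
                  threshold: float = 0.7) -> Optional[str]:
    """Return the command to execute, preferring the BCI when it is reliable."""
    if bci_confidence >= threshold:
        return bci_command
    return fallback_command  # may be None: issue no command rather than a wrong one

# A low-confidence "turn_left" from the BCI is overridden by the user's switch
# input; with no fallback available, no command is issued at all.
print(fuse_commands("turn_left", 0.55, "stop"))   # -> "stop"
print(fuse_commands("turn_left", 0.91, "stop"))   # -> "turn_left"
print(fuse_commands("turn_left", 0.55, None))     # -> None
```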

    Brain-Computer Interfacing for Wheelchair Control by Detecting Voluntary Eye Blinks

    The human brain is considered one of the most powerful quantum computers, and combining the human brain with technology can even outperform artificial intelligence. Using a Brain-Computer Interface (BCI) system, brain signals can be analyzed and programmed for specific tasks. This research employs BCI technology for a medical application that gives paralyzed individuals the ability to interact with their surroundings solely through voluntary eye blinks. The work contributes to existing technology by making it more feasible through a modular design with three physically separated components: a headwear unit, a computer, and a wheelchair. As the signal-to-noise ratio (SNR) of existing systems is too low to separate the eye blink artifacts from the regular EEG signal, a precise ThinkGear module is used, which acquires the raw EEG signal through a single dry electrode. This chip offers advanced filtering technology with high noise immunity, along with an embedded Bluetooth module through which the acquired signal is transferred wirelessly to a computer. A MATLAB program captures voluntary eye blink artifacts from the brain waves and commands the movement of a miniature wheelchair via Bluetooth. To distinguish voluntary eye blinks from involuntary ones, blink strength thresholds are determined. A Graphical User Interface (GUI) designed in MATLAB displays the EEG waves in real time and enables the user to control the movements of the wheelchair, which is specially designed to take commands from the GUI. The findings from the testing phase show the advantages of a modular design and the efficacy of using eye blink artifacts as the control element for brain-controlled wheelchairs. The work presented here gives a basic understanding of the functionality of a BCI system and provides eye blink-controlled navigation of a wheelchair for patients suffering from severe paralysis.
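
    The blink-strength thresholding described above can be sketched as follows. The original system is built in MATLAB around a ThinkGear module; the Python sketch below only illustrates the idea, and the sampling rate, threshold value, refractory period, and blink-to-command mapping are assumed for illustration.

```python
# Hedged sketch of threshold-based blink detection along the lines the abstract
# describes. Sampling rate, threshold, refractory period, and the command mapping
# are illustrative assumptions, not values from the paper.

import numpy as np

FS = 512                     # assumed sampling rate of the raw EEG stream (Hz)
VOLUNTARY_THRESHOLD = 150.0  # assumed blink-artifact amplitude threshold (microvolts)
REFRACTORY_S = 0.4           # minimum spacing between detected blinks (s)

def detect_voluntary_blinks(raw_eeg_uv: np.ndarray) -> list:
    """Return times (s) of blink artifacts whose amplitude exceeds the threshold."""
    blink_times = []
    last_t = -np.inf
    for i, sample in enumerate(raw_eeg_uv):
        t = i / FS
        if abs(sample) >= VOLUNTARY_THRESHOLD and (t - last_t) >= REFRACTORY_S:
            blink_times.append(t)
            last_t = t
    return blink_times

def blinks_to_command(n_blinks: int) -> str:
    """Map the blink count within a decision window to a wheelchair command."""
    return {1: "forward", 2: "left", 3: "right"}.get(n_blinks, "stop")

# Example: two strong artifacts in a two-second window -> "left".
signal = np.zeros(2 * FS)
signal[300] = 200.0
signal[700] = 180.0
print(blinks_to_command(len(detect_voluntary_blinks(signal))))
```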

    Color-based classification of EEG Signals for people with the severe locomotive disorder

    The neurons in the brain produce electric signals, and the collective firing of these signals gives rise to brainwaves. These brainwave signals are captured by EEG (electroencephalogram) devices as microvolt-level potentials. The sequences of signals captured by EEG sensors have embedded features that can be used for classification, and the signals can serve as an alternative input for people suffering from severe locomotive disorders. Classification of different colors can be mapped to functions such as directional movement. In this paper, raw EEG signals from a NeuroSky Mindwave headset (a single-electrode EEG sensor) are classified with an attention-based deep learning network. Attention-based LSTM networks are implemented for the classification of two different colors and of four different colors. An accuracy of 93.5% was obtained for the classification of two colors, and an accuracy of 65.75% was obtained for the classification of four colors using the attention-based LSTM network.
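
    A minimal sketch of an attention-based LSTM classifier of the kind described above is given below, written in PyTorch rather than taken from the authors' code. The layer sizes, attention formulation, window length, and four-class output are assumptions for illustration.

```python
# Hedged sketch of an attention-based LSTM classifier for single-electrode EEG
# windows. Architecture details are illustrative assumptions, not the authors' model.

import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    def __init__(self, input_size: int = 1, hidden_size: int = 64, n_classes: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.attn = nn.Linear(hidden_size, 1)   # scores each time step
        self.out = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) raw EEG samples
        h, _ = self.lstm(x)                           # (batch, time, hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time steps
        context = (weights * h).sum(dim=1)            # weighted summary of the sequence
        return self.out(context)                      # class logits, one per colour

# Example: a batch of 8 one-second windows from a single-electrode headset
# sampled at 512 Hz would be shaped (8, 512, 1).
model = AttentionLSTMClassifier()
logits = model(torch.randn(8, 512, 1))
print(logits.shape)  # torch.Size([8, 4])
```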