3,320 research outputs found

    Powered Wheelchair Platform for Assistive Technology Development

    The literature shows that numerous wheelchair platforms of varying complexity have been developed and evaluated for Assistive Technology purposes. However, little consideration has been given to providing researchers with an embedded system that is fully compatible with, and communicates seamlessly with, current manufacturers' wheelchair systems. We present our powered wheelchair platform, which allows researchers to mount various inertial and environmental sensors and to run guidance and navigation algorithms that can modify the user's desired joystick trajectory, so as to assist users in negotiating obstacles and moving from room to room. We can also directly access other currently manufactured human input devices and integrate new and novel input devices into the powered wheelchair platform for clinical and research assessment.
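    To make the idea of modifying the joystick trajectory concrete, here is a minimal sketch of one common blending scheme; the function, gain, and command format are illustrative assumptions, not the platform's actual algorithm.

```python
import numpy as np

def blend_command(joystick, correction, assist_gain=0.3):
    """Blend the user's joystick command with an assistance correction.

    joystick    -- (forward, turn) pair from the wheelchair joystick, each in [-1, 1]
    correction  -- (forward, turn) pair proposed by the obstacle-avoidance module
    assist_gain -- 0.0 leaves the user in full control, 1.0 hands control to the assistance
    """
    joystick = np.asarray(joystick, dtype=float)
    correction = np.asarray(correction, dtype=float)
    blended = (1.0 - assist_gain) * joystick + assist_gain * correction
    # Clip so the output stays within the range the wheelchair controller expects.
    return np.clip(blended, -1.0, 1.0)

# Example: the user pushes straight ahead while the sensors suggest veering slightly right.
print(blend_command(joystick=(1.0, 0.0), correction=(0.8, -0.5)))
```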

    User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control

    This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons focuses mainly on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from the basic technical functions to the real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable exoskeleton that is easy to don and doff and capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia who required a ventilator. The users confirmed the usability of the EXOTIC.
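    As a rough illustration of the semi-automated shared control described above, the sketch below switches between the continuous tongue command and the vision-guided command depending on whether the user has enabled guidance; the names and the simple switching rule are assumptions for illustration, not the EXOTIC implementation.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ExoCommand:
    """Joint-velocity command for a 5-DoF upper-limb exoskeleton (one value per joint)."""
    joint_velocities: Tuple[float, float, float, float, float]

def arbitrate(tongue_cmd: ExoCommand, vision_cmd: ExoCommand, guidance_enabled: bool) -> ExoCommand:
    """Choose the command sent to the exoskeleton.

    The user keeps full volitional tongue control by default; the computer vision
    guidance only takes over while the user has explicitly enabled it.
    """
    return vision_cmd if guidance_enabled else tongue_cmd

# Example: guidance is off, so the tongue command drives all five joints.
manual = ExoCommand((0.10, 0.00, -0.05, 0.00, 0.02))
auto = ExoCommand((0.00, 0.20, 0.00, 0.10, 0.00))
print(arbitrate(manual, auto, guidance_enabled=False))
```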

    Research on Application of Cognitive-Driven Human-Computer Interaction

    Human-computer interaction is an important research topic in human factors engineering for intelligent manufacturing. Natural human-computer interaction conforms to users' habitual cognition and can efficiently handle the exchange of imprecise information, thereby improving user experience and reducing cognitive load. By analysing the information interaction process, users' cognition of the interaction experience, and the principles of human-computer interaction, a cognitive-driven information transmission model for human-computer interaction is established. The paper surveys the main interaction modes in current human-computer interaction systems and discusses their application status, technical requirements, and open problems. It then examines methods for analysing and evaluating these interaction modes at three levels: subjective evaluation, physiological measurement, and mathematical evaluation; the aim is to improve the interpretation of imprecise information, achieve adaptive interaction, and guide the design and optimisation of human-computer interaction systems. Finally, based on the current state of human-computer interaction in intelligent environments, research hotspots, open problems, and development trends are identified.

    Updating of user requirements of elderly and disabled drivers and travellers

    The user requirements have been reassessed in the light of the results from the collaborative evaluations with other Transport Telematics Projects, as well as data and expertise gathered from the literature and from other experts in the field. The user requirements identified also form the fundamental basis for the development of different parts of the TELSCAN project. User requirements cover, of course, a multitude of different aspects, and to demonstrate how they have been integrated into the project’s output, they have been grouped into the following categories:
    • System function requirements
    • Interface requirements
    • Information requirements
    • Protocol requirements

    Explainable shared control in assistive robotics

    Shared control plays a pivotal role in designing assistive robots to complement human capabilities during everyday tasks. However, traditional shared control relies on users forming an accurate mental model of expected robot behaviour. Without this accurate mental image, users may encounter confusion or frustration whenever their actions do not elicit the intended system response, creating a misalignment between the respective internal models of the robot and the human. The Explainable Shared Control paradigm introduced in this thesis attempts to resolve such model misalignment by jointly considering assistance and transparency. There are two perspectives on transparency in Explainable Shared Control: the human's and the robot's. Augmented reality is presented as an integral component that addresses the human viewpoint by visually unveiling the robot's internal mechanisms. The robot perspective, in turn, requires an awareness of human "intent", so a clustering framework built around a deep generative model is developed for human intention inference. Both transparency constructs are implemented atop a real assistive robotic wheelchair and tested with human users. An augmented reality headset is incorporated into the robotic wheelchair, and different interface options are evaluated across two user studies to explore their influence on mental model accuracy. Experimental results indicate that this setup facilitates transparent assistance by improving recovery times from adverse events associated with model misalignment. As for human intention inference, the clustering framework is applied to a dataset collected from users operating the robotic wheelchair. Findings from this experiment demonstrate that the learnt clusters are interpretable and meaningful representations of human intent. This thesis serves as a first step in the interdisciplinary area of Explainable Shared Control. The contributions to shared control, augmented reality, and representation learning contained within this thesis are likely to help future research advance the proposed paradigm, and thus bolster the prevalence of assistive robots.
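    For the intention-inference side, the sketch below shows one simple way to cluster latent embeddings of driving data into intent classes. It assumes the embeddings already exist (e.g. from the encoder of a generative model) and uses an off-the-shelf Gaussian mixture in place of the thesis's deep generative clustering framework, which is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_intents(latent_codes: np.ndarray, n_intents: int = 4):
    """Group latent embeddings of joystick/trajectory windows into intent clusters.

    latent_codes -- array of shape (n_windows, latent_dim), e.g. produced by the
                    encoder of a model trained on wheelchair driving data
    n_intents    -- assumed number of distinct user intentions
    """
    gmm = GaussianMixture(n_components=n_intents, covariance_type="full", random_state=0)
    gmm.fit(latent_codes)
    labels = gmm.predict(latent_codes)         # hard cluster assignments
    beliefs = gmm.predict_proba(latent_codes)  # soft per-window intent beliefs
    return labels, beliefs

# Example with random stand-in embeddings; real codes would come from the trained encoder.
codes = np.random.default_rng(0).normal(size=(200, 8))
labels, beliefs = cluster_intents(codes)
print(labels[:10], beliefs.shape)
```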

    Tongue Control of Upper-Limb Exoskeletons For Individuals With Tetraplegia
