1,314 research outputs found

    ADAPT Project Publications Booklet


    Human-in-the-Loop Cyber Physical Systems: Modular Designs for Semi-Autonomous Wheelchair Navigation

    This project involves the design and development of a prototyping platform and open design framework for a semi-autonomous wheelchair, realizing a human-in-the-loop cyber-physical system as an assistive technology. The system is designed to assist physically locked-in individuals in navigating indoor environments through modular sensor, communication, and control designs. This enables the user to share control with the wheelchair and allows the system to operate semi-autonomously with the human in the loop. The Wheelchair Add-on Modules (WAMs) developed for this project are platform-independent and facilitate the development and application of semi-autonomous functionality.

    Cognitive assisted living ambient system: a survey

    The demographic shift towards an aging population is creating a significant impact and introducing drastic challenges for our society. We therefore need to find ways to help older people stay independent and to prevent social isolation in this population. Information and Communication Technologies (ICT) provide various solutions that help older adults improve their quality of life, stay healthier, and live independently for longer. Ambient Assisted Living (AAL) is a field that investigates innovative technologies to provide assistance, healthcare, and rehabilitation to impaired seniors. This paper provides a review of the research background and technologies of AAL.

    3D Perception Based Lifelong Navigation of Service Robots in Dynamic Environments

    Lifelong navigation of mobile robots is the ability to operate reliably over extended periods of time in dynamically changing environments. Historically, computational capacity and sensor capability have constrained the richness of the internal representation of the environment that a mobile robot could use for navigation tasks. With affordable contemporary sensing technology providing rich 3D information of the environment, and with increased computational power, we can make growing use of semantic environmental information in navigation-related tasks. A navigation system has many subsystems, such as perception, localization, and path planning, that must operate in real time while competing for computational resources. The main thesis of this work is that we can use 3D information from the environment to increase navigational robustness without making trade-offs in any of the real-time subsystems. To support these claims, this dissertation presents robust, real-world, 3D-perception-based navigation systems in the domains of indoor doorway detection and traversal, sidewalk-level outdoor navigation in urban environments, and global localization in large-scale indoor warehouse environments. The discussion of these systems includes methods of 3D point-cloud-based object detection that find objects of semantic interest for the given navigation tasks, as well as the use of 3D information in the navigation systems for purposes such as localization and dynamic obstacle avoidance. Experimental results for each of these applications demonstrate the effectiveness of the techniques for robust, long-term autonomous operation.
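    The doorway-detection idea above can be illustrated with a minimal sketch: take a thin horizontal slice of a 3D point cloud at roughly door height and look for wall gaps of doorway width. All names, thresholds, and the single-axis projection here are illustrative assumptions, not the dissertation's actual method.

```python
import numpy as np

def find_doorway_gaps(cloud, slice_z=1.0, tol=0.1, min_gap=0.7, max_gap=1.2):
    """Detect doorway-sized gaps in a horizontal slice of a 3D point cloud.

    cloud: (N, 3) array of points in the robot frame. Points near height
    slice_z are projected onto the x axis (assuming a wall roughly along x)
    and sorted; gaps between consecutive points whose width lies in
    [min_gap, max_gap] metres are reported as doorway candidates.
    """
    # Keep only points in a thin horizontal slab at roughly door height.
    slab = cloud[np.abs(cloud[:, 2] - slice_z) < tol]
    if slab.shape[0] < 2:
        return []
    xs = np.sort(slab[:, 0])
    gaps = np.diff(xs)
    candidates = []
    for i, g in enumerate(gaps):
        if min_gap <= g <= max_gap:
            centre = 0.5 * (xs[i] + xs[i + 1])
            candidates.append((centre, g))
    return candidates

# Synthetic wall along x with a ~0.9 m opening between x = 2.0 and x = 2.9.
wall = np.array([[x, 0.0, 1.0] for x in np.arange(0.0, 5.0, 0.05)
                 if not (2.0 < x < 2.9)])
print(find_doorway_gaps(wall))
```

    A real system would first estimate the wall plane and project onto it rather than assuming an axis-aligned wall, and would verify candidates across several height slices.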

    Semantic navigation applied to narrow passages in an intelligent wheelchair (Navegação semântica aplicada a passagens estreitas em cadeira de rodas inteligente)

    With the development of robotics technology, new opportunities arise for improving the quality of life of people with mobility impairments. The IntellWheels project was born with the goal of developing a hardware and software kit that can turn a motorized wheelchair into an autonomous Intelligent Wheelchair. This dissertation fits into that project on the topic of indoor navigation, with the goal of adding a semantic layer to the navigation framework. In robotics, semantics is the ability of a robot to understand its environment. In the case of an Intelligent Wheelchair this is especially important, considering it is a robot that carries a passenger. A solution was developed in which concepts of semantic navigation are used to tackle the problem of crossing narrow passages: an additional mapping layer automatically detects these passages and marks them on the map, and an extra planning layer takes in this semantic information and makes decisions that control the lower-level planners. To validate this solution, tests in both simulated and real-world environments were performed to verify that the individual components worked as intended. This thesis resulted in an improvement of the framework's indoor navigation capabilities and provides a base for further development in the area of semantic navigation of Intelligent Wheelchairs. Mestrado em Engenharia de Computadores e Telemática (Master's in Computer and Telematics Engineering)
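    The mapping-layer idea described above can be sketched as a pass over an occupancy grid that flags free cells whose clearance to the nearest wall is small on both sides, marking doorway-like passages. This brute-force version, and all its names and thresholds, are illustrative assumptions rather than IntellWheels' actual implementation.

```python
import numpy as np

def mark_narrow_passages(grid, cell_size=0.1, max_width=0.9):
    """Mark free cells lying in passages narrower than max_width metres.

    grid: 2D array, 1 = occupied, 0 = free. For each free cell the distance
    to the nearest occupied cell is computed (brute force, fine for small
    maps); a cell very close to a wall on both sides belongs to a narrow
    passage such as a doorway.
    """
    occ = np.argwhere(grid == 1)
    narrow = np.zeros(grid.shape, dtype=bool)
    for r, c in np.argwhere(grid == 0):
        d = np.min(np.hypot(occ[:, 0] - r, occ[:, 1] - c)) * cell_size
        # Passage width is roughly twice the clearance to the nearest wall.
        if 2 * d < max_width:
            narrow[r, c] = True
    return narrow

# Two rooms separated by a wall with a one-cell doorway.
grid = np.zeros((11, 11), dtype=int)
grid[5, :] = 1   # dividing wall
grid[5, 5] = 0   # doorway
passages = mark_narrow_passages(grid)
print(np.argwhere(passages))
```

    On larger maps a distance transform (e.g. `scipy.ndimage.distance_transform_edt`) would replace the brute-force loop; the flagged cells could then be grouped into passage regions and stored in the semantic map layer.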

    A non-holonomic, highly human-in-the-loop compatible, assistive mobile robotic platform guidance navigation and control strategy

    The provision of assistive mobile robotics for empowering and providing independence to the infirm, disabled, and elderly in society has been the subject of much research. The issue of providing navigation and control assistance to users, enabling them to drive their powered wheelchairs effectively, can be complex and wide-ranging: some users fatigue quickly and find that they are unable to operate the controls safely; others may have brain injury resulting in periodic hand tremors; quadriplegics may use a straw-like switch in the mouth to provide a digital control signal. Advances in autonomous robotics have led to the development of smart wheelchair systems that attempt to address these issues; however, according to research, the autonomous approach has not been successful, with users reporting that they want to be active drivers and not passengers. Recent methodologies have used collaborative or shared control, which aims to predict or anticipate the need for the system to take over control when some pre-decided threshold has been met, yet these approaches still take control away from the user. This removal of human supervision and control by an autonomous system makes responsibility for accidents seriously problematic. This thesis introduces a new human-in-the-loop control structure with real-time assistive levels. One of these levels offers improved dynamic modelling, and three of them offer unique and novel real-time solutions for collision avoidance, localisation and waypoint identification, and assistive trajectory generation. This architecture and these assistive functions always allow the user to remain fully in control of any motion of the powered wheelchair, as shown in a series of experiments.
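    The "user stays fully in control" principle can be sketched as a blending rule: the assistance only scales the user's commanded speed near obstacles and adds a gentle steering offset, never replacing the command. The function name, thresholds, and gains below are illustrative assumptions, not the thesis's control law.

```python
import math

def assist_command(user_v, user_w, obstacle_dist, obstacle_bearing,
                   d_slow=1.0, d_stop=0.3, gain=0.5):
    """Blend a joystick command with a mild collision-avoidance correction.

    user_v, user_w: linear (m/s) and angular (rad/s) velocity from the user.
    obstacle_dist, obstacle_bearing: nearest obstacle range (m) and bearing
    (rad, 0 = straight ahead) from the wheelchair's sensors.
    Linear speed is scaled down as the obstacle gets close, and a small
    steering offset pushes away from it; the user's command is never replaced.
    """
    if obstacle_dist >= d_slow or abs(obstacle_bearing) > math.pi / 2:
        return user_v, user_w        # obstacle far away or behind: no assist
    # Scale factor goes 1 -> 0 as distance drops from d_slow to d_stop.
    scale = max(0.0, (obstacle_dist - d_stop) / (d_slow - d_stop))
    v = user_v * scale
    # Steer away from the obstacle, more strongly the closer it is.
    w = user_w - gain * (1.0 - scale) * math.copysign(1.0, obstacle_bearing)
    return v, w

# Obstacle 0.65 m ahead-left: forward speed halved, slight rightward steer.
print(assist_command(0.8, 0.0, 0.65, 0.2))
```

    Because the correction is a continuous modulation of the user's own input, control authority is never handed to an autonomous planner, which is the property the thesis argues shared-control thresholding schemes lack.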

    Explainable shared control in assistive robotics

    Shared control plays a pivotal role in designing assistive robots that complement human capabilities during everyday tasks. However, traditional shared control relies on users forming an accurate mental model of expected robot behaviour. Without this accurate mental image, users may encounter confusion or frustration whenever their actions do not elicit the intended system response, creating a misalignment between the internal models of the robot and the human. The Explainable Shared Control paradigm introduced in this thesis attempts to resolve such model misalignment by jointly considering assistance and transparency. There are two perspectives on transparency in Explainable Shared Control: the human's and the robot's. Augmented reality is presented as an integral component that addresses the human viewpoint by visually unveiling the robot's internal mechanisms. The robot's perspective requires an awareness of human "intent", so a clustering framework composed of a deep generative model is developed for human intention inference. Both transparency constructs are implemented atop a real assistive robotic wheelchair and tested with human users. An augmented reality headset is incorporated into the robotic wheelchair, and different interface options are evaluated across two user studies to explore their influence on mental model accuracy. Experimental results indicate that this setup facilitates transparent assistance by improving recovery times from adverse events associated with model misalignment. As for human intention inference, the clustering framework is applied to a dataset collected from users operating the robotic wheelchair. Findings from this experiment demonstrate that the learnt clusters are interpretable and meaningful representations of human intent. This thesis serves as a first step in the interdisciplinary area of Explainable Shared Control. The contributions to shared control, augmented reality, and representation learning contained within this thesis are likely to help future research advance the proposed paradigm, and thus bolster the prevalence of assistive robots.
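    The intent-clustering idea can be illustrated far more simply than the thesis's deep generative model: plain k-means over hand-picked joystick features already shows how control traces group into candidate "intents". The feature choice and data below are hypothetical; this is a stand-in for the actual framework, not a reproduction of it.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: group feature vectors into k clusters.

    A deliberately simple stand-in for the thesis's deep generative
    clustering, used only to illustrate grouping control traces into
    candidate intents.
    """
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest centre, then update centres.
        dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

# Hypothetical joystick features: (mean forward speed, mean turn rate)
# over short windows of wheelchair operation.
samples = np.array([[0.9, 0.0], [0.8, 0.1], [0.85, -0.05],   # "go forward"
                    [0.1, 0.8], [0.05, 0.9], [0.0, 0.85]])   # "turn left"
labels, _ = kmeans(samples, k=2)
print(labels)
```

    In the thesis the representation itself is learnt rather than hand-picked, which is what makes the resulting clusters interpretable over raw, high-dimensional control data.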

    Towards Natural Human Control and Navigation of Autonomous Wheelchairs

    Approximately 2.2 million people in the United States depend on a wheelchair for mobility. Often, the wheelchair user can maneuver using a conventional joystick. However, visual impairment or conditions that restrict hand mobility, such as stroke, arthritis, limb injury, Parkinson's disease, cerebral palsy, or multiple sclerosis, can prevent users from operating traditional joystick controls. The resulting mobility limitations force these patients to rely on caretakers for everyday tasks, minimizing the independence of the wheelchair user. Modern speech recognition systems can be used to enhance user experiences with electronic devices. By expanding the motorized wheelchair control interface to include detection of spoken user commands, independence is given back to the mobility impaired. A speech recognition interface was developed for a smart wheelchair. By integrating navigation commands with a map of the wheelchair's surroundings, the interface becomes more natural and intuitive to use. Complex speech patterns are interpreted so that users can command the smart wheelchair to navigate to specified locations within the map. Pocketsphinx, a speech recognition toolkit, is used to interpret the vocal commands. A language model and dictionary were generated from a set of possible commands and locations supplied to the speech recognition interface. The commands fall into the categories of speed, directional, and destination commands: speed commands modify the relative speed of the wheelchair, directional commands modify its relative direction, and destination commands name a known location on the map to navigate to. The completion of the speech input processor and the connection between wheelchair components via the Robot Operating System make map navigation possible.
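    The three command categories above can be sketched as a small classifier applied to the recognizer's text output. The keyword sets and location names are illustrative assumptions; the project's actual grammar lives in its Pocketsphinx language model and dictionary, which are not reproduced here.

```python
# Illustrative vocabularies; the real grammar comes from the project's
# language model and dictionary, not these hand-written sets.
SPEED_WORDS = {"faster", "slower", "stop"}
DIRECTION_WORDS = {"forward", "back", "left", "right"}
KNOWN_LOCATIONS = {"kitchen", "bedroom", "office"}   # hypothetical map labels

def classify_command(utterance):
    """Map a recognised utterance to (category, argument).

    Categories mirror the three command types in the text: speed commands
    adjust relative speed, directional commands adjust relative heading,
    and destination commands name a mapped location to navigate to.
    """
    for w in utterance.lower().split():
        if w in SPEED_WORDS:
            return ("speed", w)
        if w in DIRECTION_WORDS:
            return ("direction", w)
        if w in KNOWN_LOCATIONS:
            return ("destination", w)
    return ("unknown", None)

print(classify_command("go a little faster"))      # -> ('speed', 'faster')
print(classify_command("take me to the kitchen"))  # -> ('destination', 'kitchen')
```

    In the actual system each classified command would be published as a ROS message: speed and direction commands as velocity adjustments, destination commands as navigation goals looked up on the map.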