
    Computer- and robot-assisted Medical Intervention

    Medical robotics includes assistive devices used by physicians to make their diagnostic or therapeutic practice easier and more efficient. This chapter focuses on such systems. It introduces the general field of Computer-Assisted Medical Interventions, its aims and its different components, and describes the place of robots in that context. The evolution of general design and control paradigms in the development of medical robots is presented, and issues specific to this application domain are discussed. A view of existing systems, ongoing developments and future trends is given, and a case study is detailed. Other types of robotic help in the medical environment (such as assisting a handicapped person, rehabilitating a patient, or replacing damaged or removed limbs or organs) are outside the scope of this chapter. Comment: Handbook of Automation, Shimon Nof (Ed.) (2009) 000-00

    NASA space station automation: AI-based technology review

    Research and development projects in automation for the Space Station are discussed. Artificial Intelligence (AI)-based automation technologies are planned to enhance crew safety by reducing the need for EVA, increase crew productivity by reducing routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

    PROSIS: An isoarchic structure for HMS control

    This paper presents a holonic and isoarchic approach to Flexible Manufacturing System (FMS) control. The approach is based on a flat holonic form, in which each holon models one entity of the FMS and all holons communicate at a single, unifying level. After a description of this model, called PROSIS, the interaction protocol and decision rules are presented. The objective is to increase FMS productivity and flexibility, particularly in terms of responsiveness, which is achieved through decentralized generation of production tasks. The reactive behaviour of the FMS control is illustrated with the example of a flexible turning cell facing a machine failure or an urgent batch order, and the resulting Gantt charts are shown.
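    To make the flat holonic idea more concrete, the sketch below shows, under stated assumptions, how a resource holon could apply a purely local decision rule when an urgent batch order arrives or a machine fails. The class names, task fields and priority rule are illustrative and are not taken from the PROSIS specification.

```python
# Minimal sketch of a flat holonic FMS (illustrative names, not the PROSIS protocol).
from dataclasses import dataclass, field
from typing import List


@dataclass(order=True)
class Task:
    priority: int                               # lower value = more urgent (urgent batches get 0)
    name: str = field(compare=False)
    duration_min: float = field(compare=False)


@dataclass
class ResourceHolon:
    """Models one physical entity of the FMS (e.g. a turning machine)."""
    name: str
    queue: List[Task] = field(default_factory=list)
    failed: bool = False

    def announce(self, task: Task) -> None:
        # Decision rule applied locally by each holon; no central scheduler is involved.
        if not self.failed:
            self.queue.append(task)
            self.queue.sort()                   # urgent orders jump ahead of routine ones

    def fail(self) -> List[Task]:
        # On breakdown, release pending tasks so peer holons can take them over.
        self.failed = True
        released, self.queue = self.queue, []
        return released


# Usage: an urgent batch arrives and a machine fails; work is re-allocated locally.
lathe_1, lathe_2 = ResourceHolon("lathe_1"), ResourceHolon("lathe_2")
lathe_1.announce(Task(5, "routine batch", 30.0))
lathe_1.announce(Task(0, "urgent batch", 10.0))     # reactive insertion at the queue head
for task in lathe_1.fail():                          # failure: tasks migrate to lathe_2
    lathe_2.announce(task)
print([t.name for t in lathe_2.queue])               # ['urgent batch', 'routine batch']
```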

    Towards the simulation of cooperative perception applications by leveraging distributed sensing infrastructures

    With the rapid development of Automated Vehicles (AVs), the boundaries of their functionalities are being pushed and new challenges are being imposed. In increasingly complex and dynamic environments, it is fundamental to rely on more powerful onboard sensors and, usually, AI. However, this approach has limitations. As AVs are increasingly integrated into several industries, expectations regarding their ability to cooperate grow, and vehicle-centric approaches to sensing and reasoning become hard to integrate. The proposed approach is to extend perception to the environment, i.e. outside the vehicle, by making the environment smarter through the deployment of wireless sensors and actuators. This greatly improves perception in dynamic and unpredictable scenarios, often at lower cost, since it relies mostly on low-cost sensors and embedded devices deployed at scale rather than on centralized sensing capabilities. Consequently, to support the development and deployment of such cooperative behaviour in a seamless way, co-simulation frameworks are required that encompass multiple perspectives of control and communication for the AVs, the wireless sensors and actuators, and the other actors in the environment. In this work, ROS2 and micro-ROS are used as the underlying technologies for integrating several simulation tools into a framework capable of supporting the development, testing and validation of such smart, cooperative environments. This was done by building upon an existing simulation framework known as AuNa, whose capabilities were extended to facilitate the simulation of cooperative scenarios by incorporating external sensors placed within the environment rather than relying only on vehicle-based sensors. Moreover, a cooperative perception approach was devised within this framework, showcasing its substantial potential and effectiveness. This enables the demonstration of multiple cooperation scenarios and also eases the deployment phase by relying on the same software architecture.
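    As a concrete illustration of the described architecture, the minimal ROS2 (rclpy) sketch below shows an infrastructure-side sensor node publishing detections that a vehicle-side node consumes for cooperative perception. The topic name, the node names and the use of geometry_msgs/PoseStamped are assumptions for illustration and do not reproduce the AuNa or micro-ROS code.

```python
# Minimal ROS2 sketch: an external (roadside) sensor feeds detections to a vehicle node.
# Topic and node names are hypothetical; only standard rclpy / geometry_msgs APIs are used.
import rclpy
from rclpy.executors import SingleThreadedExecutor
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class RoadsideSensorNode(Node):
    """Hypothetical infrastructure sensor that publishes object detections."""

    def __init__(self):
        super().__init__('roadside_sensor')
        self.pub = self.create_publisher(PoseStamped, '/infrastructure/detections', 10)
        self.timer = self.create_timer(0.1, self.publish_detection)   # 10 Hz

    def publish_detection(self):
        msg = PoseStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'map'
        # Placeholder detection: in a real deployment this comes from the sensor driver.
        msg.pose.position.x = 12.0
        msg.pose.position.y = 3.5
        self.pub.publish(msg)


class VehiclePerceptionNode(Node):
    """Hypothetical AV-side node that receives external detections for fusion."""

    def __init__(self):
        super().__init__('vehicle_perception')
        self.sub = self.create_subscription(
            PoseStamped, '/infrastructure/detections', self.on_external_detection, 10)

    def on_external_detection(self, msg: PoseStamped):
        # A real node would transform the detection into the vehicle frame and fuse it
        # with onboard sensor data; this sketch only logs it.
        self.get_logger().info(
            f'External detection at ({msg.pose.position.x:.1f}, {msg.pose.position.y:.1f})')


def main():
    rclpy.init()
    executor = SingleThreadedExecutor()
    executor.add_node(RoadsideSensorNode())
    executor.add_node(VehiclePerceptionNode())
    try:
        executor.spin()
    finally:
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```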

    Command and Control Systems for Search and Rescue Robots

    The novel application of unmanned systems in the domain of humanitarian Search and Rescue (SAR) operations has created a need to develop specific multi-Robot Command and Control (RC2) systems. This societal application of robotics requires human-robot interfaces for controlling a large fleet of heterogeneous robots deployed in multiple domains of operation (ground, aerial and marine). This chapter provides an overview of the Command, Control and Intelligence (C2I) system developed within the scope of Integrated Components for Assisted Rescue and Unmanned Search operations (ICARUS). The life cycle of the system begins with a description of the use cases and deployment scenarios defined in collaboration with SAR teams as end-users. This is followed by a description of the system design and architecture, the core technologies used in implementing the C2I, and the iterative integration phases with field deployments for evaluating and improving the system. The main subcomponents are a central Mission Planning and Coordination System (MPCS), field Robot Command and Control (RC2) subsystems with a portable force-feedback exoskeleton interface for robot-arm tele-manipulation, and field mobile devices. The distribution of these C2I subsystems and their communication links for unmanned SAR operations is described in detail. Field demonstrations of the C2I system with SAR personnel assisted by unmanned systems provide an outlook for bringing such systems into mainstream SAR operations in the future.
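    As a hedged illustration of the kind of tasking interface such a C2I system might expose, the sketch below models a mission task handed from a central MPCS to a field RC2 subsystem. Every field name, the domain values and the JSON envelope are hypothetical and are not drawn from the actual ICARUS interfaces.

```python
# Illustrative sketch of an MPCS-to-RC2 tasking message (hypothetical fields, not ICARUS code).
import json
from dataclasses import dataclass, asdict
from enum import Enum
from typing import List, Tuple


class Domain(str, Enum):
    GROUND = "ground"
    AERIAL = "aerial"
    MARINE = "marine"


@dataclass
class MissionTask:
    task_id: str
    robot_id: str
    domain: Domain
    waypoints: List[Tuple[float, float]]   # (latitude, longitude) pairs
    priority: int                          # 0 = highest, e.g. a confirmed victim location

    def to_message(self) -> str:
        """Serialise the task for transmission over the C2I communication link."""
        return json.dumps(asdict(self), default=str)


# Usage: the central MPCS tasks an aerial robot to survey two waypoints.
task = MissionTask("t-017", "uav-02", Domain.AERIAL,
                   waypoints=[(38.72, -9.14), (38.73, -9.15)], priority=1)
print(task.to_message())
```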

    NASA space station automation: AI-based technology review. Executive summary

    Research and development projects in automation technology for the Space Station are described. Artificial Intelligence (AI)-based technologies are planned to enhance crew safety by reducing the need for EVA, increase crew productivity by reducing routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics.