8,164 research outputs found

    Java-based MIDI interface for robot control

    Robots offer an excellent means of applying high-level technology to make a given manufacturing operation more profitable and competitive. Beyond manufacturing, robots are also being used in medicine and agriculture with great returns. Robotic applications now require multiple robots working in a coordinated manner, much like an orchestra. This led to the concept of a Robotic Instrument Digital Interfacing (RIDI) system for real-time coordinated control of robotic devices. It provides a novel runtime environment that supports the building and deployment of distributed robotic devices. The objective of this research is to design and develop a Graphical User Interface (GUI) that provides extensive support for monitoring the activity of multiple devices in real time. With the exception of code for device-specific interfaces, the prototype RIDI-GUI implementation will be coded in Java. This thesis details the various aspects involved in the design and implementation of the GUI software for two robot arms.
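
    The abstract notes that only the device-specific interface code sits outside the Java GUI. As a rough illustration of what such a boundary might look like, the C sketch below defines a hypothetical device-interface table that a RIDI-style runtime could poll; every name here (ridi_device_if, read_joint_state, the six-joint assumption) is invented for illustration and is not part of the actual RIDI code.

        /* Hypothetical device-specific interface a RIDI-style GUI might poll.
         * All names and the 6-joint assumption are illustrative only. */
        #include <stdio.h>

        #define NUM_JOINTS 6

        typedef struct {
            const char *name;                                   /* device label shown in the GUI */
            int (*read_joint_state)(double out[NUM_JOINTS]);    /* fill current joint angles     */
            int (*send_joint_command)(const double cmd[NUM_JOINTS]);
        } ridi_device_if;

        /* Stub for one robot arm; a real interface would talk to the arm controller. */
        static int arm1_read(double out[NUM_JOINTS]) {
            for (int i = 0; i < NUM_JOINTS; ++i) out[i] = 0.0;
            return 0;
        }
        static int arm1_send(const double cmd[NUM_JOINTS]) {
            (void)cmd;
            return 0;
        }

        static ridi_device_if arm1 = { "arm-1", arm1_read, arm1_send };

        int main(void) {
            double state[NUM_JOINTS];
            if (arm1.read_joint_state(state) == 0)
                printf("%s joint 0 = %.3f rad\n", arm1.name, state[0]);
            return 0;
        }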

    NASA space station automation: AI-based technology review

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics.

    Concurrent Path Planning with One or More Humanoid Robots

    A robotic system includes a controller and one or more robots each having a plurality of robotic joints. Each of the robotic joints is independently controllable to thereby execute a cooperative work task having at least one task execution fork, leading to multiple independent subtasks. The controller coordinates motion of the robot(s) during execution of the cooperative work task. The controller groups the robotic joints into task-specific robotic subsystems, and synchronizes motion of different subsystems during execution of the various subtasks of the cooperative work task. A method for executing the cooperative work task using the robotic system includes automatically grouping the robotic joints into task-specific subsystems, and assigning subtasks of the cooperative work task to the subsystems upon reaching a task execution fork. The method further includes coordinating execution of the subtasks after reaching the task execution fork.
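
    To make the fork-and-regroup idea concrete, here is a minimal C sketch, under assumed names and data layouts (joint_group, fork_task, the two-subtask split, and the 14-joint count are all hypothetical), of a controller that divides its joints into task-specific subsystems when a cooperative work task forks into independent subtasks.

        /* Minimal sketch of grouping joints into task-specific subsystems at a
         * task execution fork.  All names and sizes are illustrative assumptions. */
        #include <stdio.h>

        #define TOTAL_JOINTS 14   /* e.g. two 7-joint arms of a humanoid */

        typedef struct {
            int joints[TOTAL_JOINTS]; /* joint indices assigned to this subsystem */
            int count;
            const char *subtask;      /* name of the independent subtask */
        } joint_group;

        /* At a task execution fork, split the joint set into two subsystems,
         * one per independent subtask. */
        static void fork_task(joint_group *a, joint_group *b) {
            a->count = b->count = 0;
            a->subtask = "hold workpiece";
            b->subtask = "fasten bolt";
            for (int j = 0; j < TOTAL_JOINTS; ++j) {
                joint_group *g = (j < TOTAL_JOINTS / 2) ? a : b;
                g->joints[g->count++] = j;
            }
        }

        int main(void) {
            joint_group left, right;
            fork_task(&left, &right);
            printf("subsystem 1 (%s): %d joints\n", left.subtask, left.count);
            printf("subsystem 2 (%s): %d joints\n", right.subtask, right.count);
            return 0;
        }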

    A Multi-Robot Cooperation Framework for Sewing Personalized Stent Grafts

    This paper presents a multi-robot system for manufacturing personalized medical stent grafts. The proposed system adopts a modular design, which includes a (personalized) mandrel module, a bimanual sewing module, and a vision module. The mandrel module incorporates the personalized geometry of patients, while the bimanual sewing module adopts a learning-by-demonstration approach to transfer human hand-sewing skills to the robots. The human demonstrations were first observed by the vision module and then encoded using a statistical model to generate the reference motion trajectories. During autonomous robot sewing, the vision module coordinates the multi-robot collaboration. Experimental results show that the robots can adapt to generalized stent designs. The proposed system can also be used for other manipulation tasks, especially for flexible production of customized products and where bimanual or multi-robot cooperation is required.
    Comment: 10 pages, 12 figures, accepted by IEEE Transactions on Industrial Informatics. Keywords: modularity, medical device customization, multi-robot system, robot learning, visual servoing, robot sewing.
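
    As a very rough stand-in for "encoding demonstrations with a statistical model", the C sketch below computes a reference trajectory as the pointwise mean (with spread) over a few time-aligned demonstrations. The real system uses a richer model; the data, sizes, and the single sewing coordinate here are assumptions for illustration only.

        /* Toy stand-in for encoding demonstrations into a reference trajectory:
         * the pointwise mean and spread across aligned demonstrations.
         * Sizes and values are illustrative assumptions. */
        #include <math.h>
        #include <stdio.h>

        #define N_DEMOS 3
        #define N_STEPS 5   /* time-aligned samples per demonstration */

        int main(void) {
            /* One sewing-relevant coordinate (e.g. needle height) per time step. */
            double demos[N_DEMOS][N_STEPS] = {
                {0.00, 0.10, 0.25, 0.10, 0.00},
                {0.02, 0.12, 0.24, 0.09, 0.01},
                {0.01, 0.09, 0.26, 0.11, 0.00},
            };
            for (int t = 0; t < N_STEPS; ++t) {
                double mean = 0.0, var = 0.0;
                for (int d = 0; d < N_DEMOS; ++d) mean += demos[d][t];
                mean /= N_DEMOS;
                for (int d = 0; d < N_DEMOS; ++d) var += (demos[d][t] - mean) * (demos[d][t] - mean);
                var /= N_DEMOS;
                printf("step %d: reference %.3f (sd %.3f)\n", t, mean, sqrt(var));
            }
            return 0;
        }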

    NASA space station automation: AI-based technology review. Executive summary

    Research and Development projects in automation technology for the Space Station are described. Artificial Intelligence (AI) based technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics.

    Contactless medium scale industrial robot collaboration

    The growing cost of High-Value/Mix and Low-Volume (HMLV) industries such as aerospace rests heavily on industrial robots and on manual operations carried out by operators [1]. Robots excel at repeatability, but HMLV industries require changes with every single product. The human workforce, on the other hand, is good at variability and intelligence, but is expensive because its production rate is not comparable to that of robots and machines. Flexible systems have been introduced specifically for this type of industry; FLEXA is one of them. Still, collaboration between human and robot is needed to obtain a flexible and cost-effective solution [2]. A comprehensive survey has been conducted specifically on the issue of human-robot collaboration [3], which laid out many advantages of this approach, including flexibility, cost-effectiveness, and the use of the robot as an intelligent assistant. Several attempts have been made at human-robot collaboration for the HMLV industry; the work of Chen et al. is one of them.

    The KALI multi-arm robot programming and control environment

    The KALI distributed robot programming and control environment is described within the context of its use in the Jet Propulsion Laboratory (JPL) telerobot project. The purpose of KALI is to provide a flexible robot programming and control environment for coordinated multi-arm robots. Flexibility, both in hardware configuration and software, is desired so that it can be easily modified to test various concepts in robot programming and control, e.g., multi-arm control, force control, sensor integration, teleoperation, and shared control. In the programming environment, user programs written in the C programming language describe trajectories for multiple coordinated manipulators with the aid of KALI function libraries. A system of multiple coordinated manipulators is considered within the programming environment as one motion system. The user plans the trajectory of one controlled Cartesian frame associated with a motion system and describes the positions of the manipulators with respect to that frame. Smooth Cartesian trajectories are achieved through a blending of successive path segments. The manipulator and load dynamics are considered during trajectory generation so that given interface force limits are not exceeded.
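
    The programming model described above (one controlled Cartesian frame per motion system, with each manipulator described by an offset from that frame) can be sketched in C. The type and function names below (motion_system_t, move_frame, and so on) are invented for illustration and are not the actual KALI library calls; a real program would also blend successive path segments rather than jump to the target.

        /* Illustrative sketch of a KALI-style user program: a two-arm motion
         * system follows one controlled Cartesian frame, and each arm is
         * described by a fixed offset from that frame.  None of these names
         * come from the real KALI libraries. */
        #include <stdio.h>

        typedef struct { double x, y, z; } pose_t;   /* position only, for brevity */

        typedef struct {
            pose_t frame;          /* the one controlled Cartesian frame            */
            pose_t arm_offset[2];  /* each manipulator's pose relative to the frame */
        } motion_system_t;

        /* Move the controlled frame; both arms follow, keeping their offsets. */
        static void move_frame(motion_system_t *ms, pose_t target) {
            ms->frame = target;    /* a real system would blend path segments here */
            for (int i = 0; i < 2; ++i) {
                pose_t p = { target.x + ms->arm_offset[i].x,
                             target.y + ms->arm_offset[i].y,
                             target.z + ms->arm_offset[i].z };
                printf("arm %d commanded to (%.2f, %.2f, %.2f)\n", i, p.x, p.y, p.z);
            }
        }

        int main(void) {
            motion_system_t ms = {
                .frame      = {0.0, 0.0, 0.0},
                .arm_offset = { {-0.2, 0.0, 0.0}, {+0.2, 0.0, 0.0} }, /* arms 0.4 m apart */
            };
            move_frame(&ms, (pose_t){0.5, 0.0, 0.3});  /* both arms track the shared frame */
            return 0;
        }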

    TRACS: An Experimental Multiagent Robotic System

    TRACS (Two Robotic Arm Coordination System), developed at the GRASP Laboratory at the University of Pennsylvania, is an experimental system for studying dynamically coordinated control of multiple robotic manipulators. The system is used to investigate such issues as the design of controller architectures, development of real-time control and coordination programming environments, integration of sensory devices, and implementation of dynamic coordination algorithms. The system consists of two PUMA 250 robot arms and custom-made end effectors for manipulation and grasping. The controller is based on an IBM PC/AT for its simplicity in I/O interfacing, ease of real-time programming, and availability of low-cost supporting devices. The Intel 286 in the PC is aided by a high-speed AMD 29000-based floating-point processor board. They are pipelined in such a way that the AMD 29000 processor performs real-time computations and the Intel 286 carries out I/O operations. The system is capable of implementing dynamic coordinated control of the two manipulators at 200 Hz. TRACS utilizes a C library called MO to provide the real-time programming environment. An effort has been made to separate hardware-dependent code from hardware-independent code; as a result, MO is used in the laboratory to control different robots on different operating systems (MS-DOS and Unix) with minimal changes to hardware-dependent code such as reading encoders and setting joint torques. TRACS is built entirely from off-the-shelf hardware components. Further, the adoption of MS-DOS instead of Unix or Unix-based real-time operating systems keeps the real-time programming simple and minimizes interrupt latencies. The feasibility of the system is demonstrated by a series of experiments in grasping and manipulating common objects with the two manipulators.
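
    The hardware-dependent/hardware-independent split mentioned above can be illustrated with a small C sketch: the control law only sees a thin interface for reading encoders and setting joint torques, so only those two functions would change between robots. The names (hw_read_encoders, hw_set_torques), the toy proportional law, and the loop body are assumptions for illustration, not the real MO API.

        /* Sketch of separating hardware-dependent code (encoder reads, torque
         * writes) from hardware-independent control running at 200 Hz.
         * Names and the simple control law are illustrative assumptions, not MO. */
        #include <stdio.h>

        #define N_JOINTS 6
        #define RATE_HZ  200

        /* ---- hardware-dependent layer: only these change between robots ---- */
        static void hw_read_encoders(double q[N_JOINTS]) {
            for (int i = 0; i < N_JOINTS; ++i) q[i] = 0.0;   /* stub */
        }
        static void hw_set_torques(const double tau[N_JOINTS]) {
            (void)tau;                                       /* stub */
        }

        /* ---- hardware-independent layer: control law and servo loop ---- */
        static void control_step(const double q[N_JOINTS], const double q_des[N_JOINTS],
                                 double tau[N_JOINTS]) {
            const double kp = 50.0;                          /* toy proportional gain */
            for (int i = 0; i < N_JOINTS; ++i) tau[i] = kp * (q_des[i] - q[i]);
        }

        int main(void) {
            double q[N_JOINTS], tau[N_JOINTS];
            double q_des[N_JOINTS] = {0.1, 0.0, 0.0, 0.0, 0.0, 0.0};
            for (int tick = 0; tick < RATE_HZ; ++tick) {     /* one simulated second */
                hw_read_encoders(q);
                control_step(q, q_des, tau);
                hw_set_torques(tau);
            }
            printf("ran %d control cycles at %d Hz (simulated)\n", RATE_HZ, RATE_HZ);
            return 0;
        }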

    Development of an intelligent object for grasp and manipulation research

    Kõiva R, Haschke R, Ritter H. Development of an intelligent object for grasp and manipulation research. Presented at ICAR 2011, Tallinn, Estonia.
    In this paper we introduce a novel device, called iObject, which is equipped with tactile and motion-tracking sensors that allow for the evaluation of human and robot grasping and manipulation actions. Contact location and contact force, object acceleration in space (6D) and orientation relative to the earth (3D magnetometer) are measured and transmitted wirelessly over a Bluetooth connection. By allowing human-human, human-robot and robot-robot comparisons to be made, iObject is a versatile tool for studying manual interaction. To demonstrate the efficiency and flexibility of iObject for the study of bimanual interactions, we report on a physiological experiment and evaluate the main parameters of the considered dual-handed manipulation task.
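
    For readers wondering what such a sensor stream might contain, the C struct below sketches one plausible per-sample record (contact position and force, 6-axis inertial data, 3-axis magnetometer) as it might arrive over the Bluetooth link. The field layout, names, and units are assumptions for illustration, not the actual iObject wire format.

        /* Hypothetical per-sample record for an iObject-like device.
         * Field layout, names, and units are illustrative assumptions only. */
        #include <stdint.h>
        #include <stdio.h>

        typedef struct {
            uint32_t timestamp_ms;    /* time of the sample                          */
            float    contact_pos[3];  /* contact location on the object surface [m]  */
            float    contact_force;   /* normal contact force [N]                    */
            float    accel[3];        /* linear acceleration [m/s^2]                 */
            float    gyro[3];         /* angular rate [rad/s] (completes the 6D IMU) */
            float    mag[3];          /* 3D magnetometer, earth-relative orientation */
        } iobject_sample;

        int main(void) {
            iobject_sample s = {
                .timestamp_ms  = 1234,
                .contact_pos   = {0.01f, 0.02f, 0.00f},
                .contact_force = 2.5f,
                .accel         = {0.0f, 0.0f, 9.81f},
                .gyro          = {0.0f, 0.0f, 0.0f},
                .mag           = {0.2f, 0.0f, 0.4f},
            };
            printf("t=%u ms  force=%.1f N  a_z=%.2f m/s^2\n",
                   (unsigned)s.timestamp_ms, s.contact_force, s.accel[2]);
            return 0;
        }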