    Graphical programming and the use of simulation for space-based manipulators

    Robotic manipulators are difficult to program even without the special requirements of a zero-gravity environment. While attention should be paid to investigating the usefulness of industrial application programming methods for space manipulators, new methods with potential application to both environments need to be invented. These methods should allow various levels of autonomy and human-in-the-loop interaction, with simple, rapid switching among them. For all methods, simulation must be integrated to provide reliability and safety. Graphical programming of manipulators is a candidate for an effective robot programming method despite current limitations in input devices and displays. A research project in task-level robot programming has built an innovative interface to a state-of-the-art commercial simulation and robot programming platform. The prototype demonstrates simple augmented methods for graphical programming and simulation which may be of particular interest to those concerned with Space Station applications; its development has also raised important issues for the design of more sophisticated robot programming tools. Both aspects of the project are discussed.
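
    The emphasis on multiple autonomy levels with rapid switching, backed by an always-present simulation check, can be illustrated with a minimal sketch. The names and the joint-limit check below are hypothetical, not the project's actual interface.

        from enum import Enum, auto

        class ControlMode(Enum):
            AUTONOMOUS = auto()     # a task-level program drives the arm
            TELEOPERATED = auto()   # the human operator drives the arm

        class ManipulatorSession:
            """Routes commands from either source through a simulation safety check."""
            def __init__(self, check_in_simulation, execute):
                self.check_in_simulation = check_in_simulation  # e.g. a collision check
                self.execute = execute                          # send to the (simulated) arm
                self.mode = ControlMode.AUTONOMOUS

            def switch_mode(self, mode):
                self.mode = mode    # rapid switching, no task restart

            def step(self, planner_command, operator_command):
                command = (planner_command if self.mode is ControlMode.AUTONOMOUS
                           else operator_command)
                if self.check_in_simulation(command):
                    self.execute(command)

        # Toy usage: joint-angle commands vetted against a +/- 2.0 rad limit.
        session = ManipulatorSession(
            check_in_simulation=lambda q: all(abs(a) < 2.0 for a in q),
            execute=lambda q: print("executing joint command", q),
        )
        session.step(planner_command=(0.1, 0.4, -0.2), operator_command=(0.0, 0.0, 0.0))
        session.switch_mode(ControlMode.TELEOPERATED)
        session.step(planner_command=(0.1, 0.4, -0.2), operator_command=(1.5, 0.0, 0.3))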

    Robot graphic simulation testbed

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (a graphical simulation system) were improved and extended by taking advantage of advanced graphics workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high-resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and the proposed control systems. The automation testbed was designed to facilitate studies of Space Station automation concepts.
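
    As an illustration of the kind of per-frame check such a testbed makes feasible, the sketch below performs a coarse bounding-sphere collision test; ROBOSIM's actual interfaces are not shown, and all names and numbers are illustrative only.

        import math

        def spheres_collide(center_a, radius_a, center_b, radius_b):
            """Return True if two bounding spheres overlap."""
            return math.dist(center_a, center_b) < (radius_a + radius_b)

        # Example: a manipulator link versus a Space Station truss segment,
        # each approximated by a (center in metres, bounding radius) pair.
        link = ((1.0, 0.2, 0.5), 0.3)
        truss = ((1.2, 0.0, 0.6), 0.4)
        if spheres_collide(*link, *truss):
            print("potential collision -- refine with a solid-model intersection test")

    In practice such a cheap broad-phase test would only gate the more expensive solid-model intersection test that high-performance graphics hardware makes feasible.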

    Automated sequence and motion planning for robotic spatial extrusion of 3D trusses

    While robotic spatial extrusion has demonstrated a new and efficient means to fabricate 3D truss structures at architectural scale, a major challenge remains in automatically planning the extrusion sequence and robotic motion for trusses with unconstrained topologies. This paper presents the first attempt in the field to rigorously formulate the extrusion sequence and motion planning (SAMP) problem, using a CSP encoding. Furthermore, this research proposes a new hierarchical planning framework to solve extrusion SAMP problems, which usually have a long planning horizon and 3D configuration complexity. By decoupling sequence and motion planning, the planning framework is able to efficiently solve for the extrusion sequence, end-effector poses, joint configurations, and transition trajectories for spatial trusses with nonstandard topologies. This paper also presents the first detailed computation data revealing the runtime bottleneck in solving SAMP problems, which provides insight and a comparison baseline for future algorithmic development. Together with the algorithmic results, this paper presents an open-source, modularized software implementation called Choreo that is machine-agnostic. To demonstrate the power of this algorithmic framework, three case studies, including real fabrication and simulation results, are presented.
    Comment: 24 pages, 16 figures
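
    The decoupling described above can be shown in a much-simplified greedy sketch: the sequence layer picks the next printable element under CSP-style constraints, and only then is a motion query issued for that element. The names is_printable and plan_motion are hypothetical stand-ins for the constraint checks and the motion planner; this is not Choreo's implementation, which encodes the sequence search as a CSP rather than a greedy loop.

        def plan_extrusion(elements, is_printable, plan_motion):
            """Find an extrusion order, then a trajectory for each element.

            elements     : list of truss element ids
            is_printable : callable(element, printed_set) -> bool, the CSP-style
                           constraints (connectivity, stability, collision, ...)
            plan_motion  : callable(element, printed_set) -> trajectory or None
            """
            printed, order, trajectories = set(), [], []
            while len(printed) < len(elements):
                candidates = [e for e in elements
                              if e not in printed and is_printable(e, printed)]
                if not candidates:
                    return None              # sequence search dead end
                for element in candidates:   # decoupling: fix the element first,
                    trajectory = plan_motion(element, printed)  # then solve its motion
                    if trajectory is not None:
                        printed.add(element)
                        order.append(element)
                        trajectories.append(trajectory)
                        break
                else:
                    return None              # no candidate is reachable
            return order, trajectories

    The paper's framework additionally solves end-effector poses, joint configurations, and transition trajectories, which this sketch collapses into a single plan_motion call.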

    Multi-bot Easy Control Hierarchy

    The goal of our project is to create a software architecture that makes it possible to easily control a multi-robot system, as well as to seamlessly change control modes during operation. The different control schemes first include the ability to implement on-board and off-board controllers. Second, commands can specify actuator-level, vehicle-level, or fleet-level behavior. Finally, motion can be specified by giving a waypoint and time constraint, a velocity and heading, or a throttle and angle. Our code is abstracted so that any type of robot - ranging from ones that use a differential drive setup, to three-wheeled holonomic platforms, to quadcopters - can be added to the system simply by writing drivers that interface with the hardware used and by implementing math packages that do the required calculations. Our team has successfully demonstrated piloting a single robot while switching between waypoint navigation and a joystick controller. In addition, we have demonstrated the synchronized control of two robots using joystick control. Future work includes implementing more robust cluster control, including off-board functionality, and incorporating our architecture into different types of robots.
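
    The layered abstraction described above can be sketched with a hypothetical driver interface; the names are not taken from the project, and the real architecture also distinguishes on-board from off-board controllers and actuator-level commands, which are omitted here.

        from abc import ABC, abstractmethod

        class RobotDriver(ABC):
            """Hardware interface each robot type implements once."""

            @abstractmethod
            def goto_waypoint(self, x, y, deadline): ...   # waypoint + time constraint

            @abstractmethod
            def set_velocity(self, speed, heading): ...    # velocity + heading

            @abstractmethod
            def set_throttle(self, throttle, angle): ...   # throttle + angle (joystick)

        class DifferentialDriveDriver(RobotDriver):
            def goto_waypoint(self, x, y, deadline):
                print(f"diff-drive: waypoint ({x}, {y}) by t={deadline} s")
            def set_velocity(self, speed, heading):
                print(f"diff-drive: {speed} m/s at heading {heading} deg")
            def set_throttle(self, throttle, angle):
                print(f"diff-drive: throttle {throttle}, angle {angle}")

        class Fleet:
            """Fleet-level commands fan out to every vehicle-level driver."""
            def __init__(self, drivers):
                self.drivers = drivers
            def goto_waypoint(self, x, y, deadline):
                for driver in self.drivers:
                    driver.goto_waypoint(x, y, deadline)

        # Two robots of the same type; a quadcopter would subclass RobotDriver too.
        fleet = Fleet([DifferentialDriveDriver(), DifferentialDriveDriver()])
        fleet.goto_waypoint(2.0, 3.5, deadline=10.0)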