
    Flexible Composition of Robot Logic with Computer Vision Services

    Vision-based robotics is an ever-growing field within industrial automation. Demands for greater flexibility and higher quality motivate manufacturing companies to adopt these technologies for tasks such as material handling, assembly, and inspection. In addition to their direct use in the manufacturing setting, robots combined with vision systems serve as a highly flexible means of realizing prototyping test-beds in the R&D context. Traditionally, the problem areas of robotics and computer vision are attacked separately. An exception is the study of vision-based servo control, which focuses on the control-theoretic aspects of vision-based robot guidance under the assumption that robot joints can be controlled directly. The missing part is a systematic approach to implementing robotic applications with vision sensing given industrial robots constrained by their programming interfaces. This thesis targets the development process of vision-based robotic systems in an event-driven environment. It focuses on the design and composition of three functional components: (1) the robot control function, (2) the image acquisition function, and (3) the image processing function. The thesis approaches its goal by a combination of laboratory results, a case study of an industrial company (Kongsberg Automotive AS), and formalization of computational abstractions and architectural solutions. The image processing function is tackled with the application of reactive pipelines. The proposed system development method allows for a smooth transition from early-stage vision algorithm prototyping to the integration phase. The image acquisition function is exposed in a service-oriented manner with the help of a flexible set of concurrent computational primitives. To realize control of industrial robots, a distributed architecture is devised, which supports composability of communication-heavy robot logic, as well as flexible coupling of the robot control node with vision services.
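
    The coupling of the robot control node with vision services could, for instance, take the form of a simple request/reply exchange. The sketch below is illustrative only and rests on assumptions not stated in the abstract (a vision service answering pose queries over ZeroMQ REQ/REP); the endpoint, message format, and pose payload are hypothetical placeholders rather than the thesis's actual interfaces.

        # Illustrative coupling of robot logic to a vision service over ZeroMQ REQ/REP.
        # Endpoint, message format, and payload fields are hypothetical placeholders.
        import json
        import zmq

        VISION_ENDPOINT = "tcp://localhost:5555"   # hypothetical vision-service address

        def request_object_pose(timeout_ms=2000):
            """Ask the vision service where the target object is."""
            ctx = zmq.Context.instance()
            sock = ctx.socket(zmq.REQ)
            sock.setsockopt(zmq.RCVTIMEO, timeout_ms)
            sock.connect(VISION_ENDPOINT)
            try:
                sock.send_json({"command": "locate", "object": "part_A"})
                # Reply is expected as e.g. {"x": ..., "y": ..., "theta": ...}
                return sock.recv_json()
            finally:
                sock.close()

        if __name__ == "__main__":
            pose = request_object_pose()
            print("Detected pose:", json.dumps(pose))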

    Event-driven industrial robot control architecture for the Adept V+ platform

    Modern industrial robotic systems are highly interconnected. They operate in a distributed environment and communicate with sensors, computer vision systems, mechatronic devices, and computational components. On the fundamental level, communication and coordination between all parties in such a distributed system are characterized by discrete-event behavior. The latter is largely attributed to the specifics of communication over the network, which, in turn, facilitates asynchronous programming and explicit event handling. In addition, on the conceptual level, events are an important building block for realizing reactivity and coordination. Event-driven architecture has manifested its effectiveness for building loosely coupled systems based on publish-subscribe middleware, either general-purpose or robotics-oriented. Despite all the advances in middleware, industrial robots remain difficult to program in the context of distributed systems, to a large extent due to the limitations of the native robot platforms. This paper proposes an architecture for flexible event-based control of industrial robots based on the Adept V+ platform. The architecture is based on the robot controller providing a TCP/IP server and a collection of robot skills, and a high-level control module deployed to a dedicated computing device. The control module maintains bidirectional communication with the robot controller and publish/subscribe messaging with external systems. It is programmed in an asynchronous style using pyadept, a Python library based on Python coroutines, the AsyncIO event loop, and ZeroMQ middleware. The proposed solution facilitates the integration of Adept robots into distributed environments and the building of more flexible robotic solutions with event-based logic.
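
    A minimal sketch of the communication pattern described above, not the actual pyadept API: one coroutine exchanges newline-terminated commands with the robot controller over TCP, while the controller's replies are published as events to external systems over a ZeroMQ PUB socket. The address, port, and command strings are hypothetical placeholders.

        # Sketch of an AsyncIO + ZeroMQ control module, not the pyadept API.
        import asyncio
        import zmq
        import zmq.asyncio

        ROBOT_ADDR = ("192.168.0.10", 5000)   # hypothetical V+ TCP server address
        PUB_ENDPOINT = "tcp://*:5556"         # hypothetical event-publishing endpoint

        async def send_command(reader, writer, command: str) -> str:
            """Send one newline-terminated command and await the controller's reply."""
            writer.write((command + "\n").encode())
            await writer.drain()
            reply = await reader.readline()
            return reply.decode().strip()

        async def control_loop(pub_socket):
            reader, writer = await asyncio.open_connection(*ROBOT_ADDR)
            for command in ["movej 0 0 90 0 90 0", "grasp"]:   # placeholder skill calls
                reply = await send_command(reader, writer, command)
                # Publish the robot's reply as an event for other distributed nodes.
                await pub_socket.send_multipart([b"robot.status", reply.encode()])
            writer.close()
            await writer.wait_closed()

        async def main():
            ctx = zmq.asyncio.Context()
            pub_socket = ctx.socket(zmq.PUB)
            pub_socket.bind(PUB_ENDPOINT)
            try:
                await control_loop(pub_socket)
            finally:
                pub_socket.close()
                ctx.term()

        if __name__ == "__main__":
            asyncio.run(main())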

    EPypes: a framework for building event-driven data processing pipelines

    Many data processing systems are naturally modeled as pipelines, where data flows through a network of computational procedures. This representation is particularly suitable for computer vision algorithms, which in most cases possess complex logic and a large number of parameters to tune. In addition, online vision systems, such as those in the industrial automation context, have to communicate with other distributed nodes. When developing a vision system, one normally proceeds from ad hoc experimentation and prototyping to highly structured system integration. The early stages of this continuum are characterized by the challenges of developing a feasible algorithm, while the later stages deal with composing the vision function with other components in a networked environment. In between, one strives to manage the complexity of the developed system, as well as to preserve existing knowledge. To tackle these challenges, this paper presents EPypes, an architecture and Python-based software framework for developing vision algorithms in the form of computational graphs and integrating them with distributed systems based on publish-subscribe communication. EPypes facilitates flexibility of algorithm prototyping and provides a structured approach to managing algorithm logic and exposing the developed pipelines as part of online systems.
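
    The following is a minimal illustration of the pipeline idea, not the EPypes API itself: a vision algorithm expressed as an ordered chain of named processing steps with externally tunable parameters, so that the same chain can be run interactively during prototyping and later wrapped as an online service. The step functions are toy stand-ins for real image-processing procedures.

        # Toy pipeline abstraction illustrating the computational-graph idea; not EPypes.
        from typing import Any, Callable, Dict, List, Tuple

        Step = Tuple[str, Callable[..., Any]]

        class Pipeline:
            def __init__(self, steps: List[Step], params: Dict[str, Dict[str, Any]]):
                self.steps = steps          # ordered (name, callable) pairs
                self.params = params        # per-step keyword parameters to tune

            def run(self, data: Any) -> Dict[str, Any]:
                """Push data through the chain, keeping each intermediate result."""
                results = {}
                for name, func in self.steps:
                    data = func(data, **self.params.get(name, {}))
                    results[name] = data
                return results

        # Toy "vision" steps standing in for real image-processing procedures.
        def threshold(values, cutoff=0.5):
            return [v for v in values if v > cutoff]

        def count(values):
            return len(values)

        pipe = Pipeline(
            steps=[("threshold", threshold), ("count", count)],
            params={"threshold": {"cutoff": 0.7}},
        )
        print(pipe.run([0.2, 0.8, 0.9, 0.4]))   # {'threshold': [0.8, 0.9], 'count': 2}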

    Flexible Image Acquisition Service for Distributed Robotic Systems

    The widespread use of vision systems in robotics introduces a number of challenges related to the management of image acquisition and image processing tasks, as well as their coupling to the robot control function. With the proliferation of more distributed setups and flexible robotic architectures, the workflow of image acquisition needs to support a wider variety of communication styles and application scenarios. This paper presents FxIS, a flexible image acquisition service targeting distributed robotic systems with event-based communication. The principal idea of FxIS is the composition of a number of execution threads with a set of concurrent data structures, supporting acquisition from multiple cameras that is closely synchronized in time, both between the cameras and with the request timestamp.
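
    A minimal sketch of the stated principle, not the FxIS implementation: one background thread per camera keeps a small timestamped frame buffer, and a request carrying a timestamp picks, from every buffer, the frame whose capture time is closest to that timestamp. The grab_frame function is a placeholder for a real camera driver call.

        # Sketch of per-camera acquisition threads with timestamp-based frame selection.
        import threading
        import time
        from collections import deque

        BUFFER_SIZE = 30

        def grab_frame(camera_id):
            """Placeholder for an actual camera acquisition call."""
            time.sleep(0.03)                      # pretend roughly 30 fps
            return f"frame-from-camera-{camera_id}"

        class CameraStream:
            def __init__(self, camera_id):
                self.camera_id = camera_id
                self.buffer = deque(maxlen=BUFFER_SIZE)   # bounded timestamped buffer
                self.lock = threading.Lock()
                self.running = True
                threading.Thread(target=self._acquire, daemon=True).start()

            def _acquire(self):
                while self.running:
                    frame = grab_frame(self.camera_id)
                    with self.lock:
                        self.buffer.append((time.time(), frame))

            def closest_to(self, request_time):
                with self.lock:
                    snapshot = list(self.buffer)
                return min(snapshot, key=lambda item: abs(item[0] - request_time))

        # Serving a request: gather, from each camera, the frame nearest the timestamp.
        streams = [CameraStream(0), CameraStream(1)]
        time.sleep(0.2)                           # let the buffers fill a little
        request_time = time.time()
        images = {s.camera_id: s.closest_to(request_time) for s in streams}
        print(images)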
