
    Developing a control architecture for a vision based automatic pallet picking

    The goal of this project is to improve the control performance of an articulated-frame-steered autonomous machine. The current system uses a vision sensor to detect the target object and controls the machine's motion with the data obtained from it. The problem is that when the object is first detected at a long distance, the acquired data, especially the target's orientation, is unreliable. Relying on such data to steer the machine can disturb the behaviour of the whole system, and a logic is also needed for switching between the machine's states in order to control the different stages of operation. To solve this problem, a smooth switching logic is defined to control the machine. This switching logic synchronizes the operational coordinates with the visual servoing and plans the control of the robot's degrees of freedom at each step. MATLAB/Simulink is used to implement the idea, since Stateflow supports logic-based planning in a graphical form. The implemented method increases the accuracy of picking the object, gives the machine more room for any required manoeuvre, and minimizes the effect of environmental disturbances on the final result
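The smooth switching logic described above is, at its core, a finite state machine. A minimal sketch in Python follows; the state names, thresholds, and input signals are hypothetical assumptions, since the thesis implements the equivalent logic graphically in Simulink/Stateflow:

```python
# Illustrative finite-state switching logic for vision-based pallet
# picking. State names, thresholds, and input signals are hypothetical.

SEARCH, APPROACH, ALIGN, PICK = "SEARCH", "APPROACH", "ALIGN", "PICK"

def next_state(state, target_visible, distance_m, orientation_ok):
    """Return the next controller state given the current vision data."""
    if state == SEARCH:
        # At long range only the detected position is trusted; the
        # unreliable orientation estimate is deliberately ignored.
        return APPROACH if target_visible else SEARCH
    if state == APPROACH:
        # Switch to fine alignment once the target is close enough for
        # the camera's orientation estimate to be reliable.
        return ALIGN if target_visible and distance_m < 5.0 else APPROACH
    if state == ALIGN:
        return PICK if distance_m < 1.0 and orientation_ok else ALIGN
    return PICK  # terminal state

state = SEARCH
for obs in [(True, 12.0, False), (True, 4.0, False), (True, 0.8, True)]:
    state = next_state(state, *obs)
print(state)  # PICK
```

In the real system each state would also select which degrees of freedom the visual servoing controls; only the transitions are illustrated here.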

    Vision-based Global Path Planning and Trajectory Generation for Robotic Applications in Hazardous Environments

    The aim of this study is to find an efficient global path planning algorithm and trajectory generation method using a vision system. Path planning is part of the more generic navigation function of mobile robots and consists of establishing an obstacle-free path from the initial pose to the target pose in the robot workspace. In this thesis, special emphasis is placed on robotic applications in industrial and scientific infrastructure environments that are hazardous and inaccessible to humans, such as nuclear power plants, ITER, and the CERN LHC tunnel. Nuclear radiation can cause deadly damage to the human body, yet nuclear energy remains necessary to meet the great demand for energy resources. The research and development of automatic transfer robots and manipulators for nuclear environments are therefore regarded as a key technology by many countries. Robotic applications in radiation environments minimize the danger of radiation exposure to humans; however, the robots themselves are also vulnerable to radiation. Mobility and maneuverability in such environments are essential to task success, so an efficient obstacle-free path and trajectory generation method is needed to find a safe path with maximum bounded velocities. High-degree-of-freedom manipulators and maneuverable mobile robots with steerable wheels, such as non-holonomic omni-directional mobile robots, are well suited for inspection and maintenance tasks where the camera is the only source of visual feedback. In this thesis, a novel vision-based path planning method is presented that combines the artificial potential field, visual servoing concepts, and a CAD-based recognition method to deal with the problem of path and trajectory planning.
Unlike the majority of conventional trajectory planning methods, which treat the robot as a single point, the method considers the entire shape of the mobile robot by taking all of the robot's relevant points into account when avoiding obstacles. The vision-based algorithm generates synchronized trajectories for all of the wheels of an omni-directional mobile robot. It provides the robot's kinematic variables to plan maximum allowable velocities so that at least one of the actuators is always working at maximum velocity. The advantage of the generated synchronized trajectories is the avoidance of slippage and misalignment in translational and rotational movement. The proposed method is further developed into a new vision-based path coordination method for multiple mobile robots with independently steerable wheels, avoiding mutual collisions as well as stationary obstacles. The results of this research have been published as a new solution for path and trajectory generation in hazardous environments inaccessible to humans, where one camera is the only source of visual feedback
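The artificial-potential-field idea underlying such planners can be sketched for a single 2-D point: an attractive force pulls toward the goal and repulsive forces push away from obstacles within an influence radius. All gains, radii, and coordinates below are illustrative assumptions; the thesis itself extends the field to the robot's entire shape and to synchronized wheel trajectories:

```python
import math

# Minimal artificial-potential-field gradient step for a 2-D point robot.
# Gains (k_att, k_rep), influence radius rho0, and step size are made-up
# values for illustration only.

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, rho0=2.0, step=0.05):
    # Attractive force: proportional to the vector toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive force from each obstacle inside the influence radius rho0.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        rho = math.hypot(dx, dy)
        if 1e-9 < rho < rho0:
            mag = k_rep * (1.0 / rho - 1.0 / rho0) / rho**3
            fx += mag * dx
            fy += mag * dy
    # Take a fixed-length step along the normalized force direction.
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

pos, goal = (0.0, 0.0), (10.0, 0.0)
for _ in range(400):
    pos = apf_step(pos, goal, obstacles=[(5.0, 0.5)])
print("final distance to goal:", round(math.hypot(pos[0] - goal[0], pos[1] - goal[1]), 2))
```

The point slides around the offset obstacle and settles at the goal; real planners must additionally handle local minima, which a pure gradient method like this cannot escape.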

    Modelling, Simulation and Path Following of Heavy Wheeled Mobile Robots (Raskaiden pyörällisten mobiilirobottien mallinnus, simulointi ja radanseuranta)

    Autonomous vehicles have been studied at least since the 1950s, and during the last decade interest in this field has grown considerably. Path-following control is one of the main subjects in autonomous vehicles: its focus is on controlling the pose of the vehicle to match a desired pose provided by a specified path or trajectory. Usually the pose is represented in a two-dimensional world frame by x and y coordinates and a heading angle. The methods used in this thesis are modelling and simulation (M&S). M&S of physical systems is a well-recognized field of expertise in the engineering sciences. Rapid system prototyping, control design, or the study of an existing system by means of M&S allows observing, developing, and testing in a risk-free environment. In this thesis, the M&S methods enable fast and economical evaluation of the designed algorithms before prototype testing with the actual systems in real environments is considered. The objectives of the thesis are to implement dynamic robot models of two vehicles, design high-level controller structures for their actuators, implement a path-following controller, and study the behaviour of the robots during various autonomous path-following scenarios. The vehicles to be modelled are the Ponsse Caribou S10 and the Haulotte 16RTJ PRO; the study vehicles are owned by Tampere University of Technology. Results from closed-loop path-following control of the modelled robots showed accurate path following under well-behaved path curvatures, generally with a mean absolute lateral position error of less than 0.1 m; in the best simulation results, mean position errors were under 0.05 m. The implemented controllers proved effective over the whole velocity range of the Ponsse Caribou S10 forwarder.
The implemented high-level inverse kinematic controllers succeeded in synchronously commanding the robots' actuators. Owing to the formulation of the inverse kinematics, the path-following controller was able to output identical control signals independent of the steering structure of the vehicle, thus opening a path for future development of more advanced path-following control
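The notion of mean absolute lateral error used above can be illustrated with a toy path-following loop: a kinematic unicycle model tracks the straight path y = 0 using proportional feedback on lateral error and heading. The model and gains are illustrative assumptions, not the dynamic machine models or controllers used in the thesis:

```python
import math

# Toy path-following simulation: a unicycle-model robot tracks the path
# y = 0. The model, gains, and initial offset are illustrative only.

def simulate(y0=0.5, heading0=0.0, v=1.0, k_y=1.0, k_h=2.0, dt=0.01, steps=1000):
    x, y, h = 0.0, y0, heading0
    abs_errors = []
    for _ in range(steps):
        # Desired heading points back toward the path, capped at 45 degrees.
        h_des = -max(-math.pi / 4, min(math.pi / 4, k_y * y))
        omega = k_h * (h_des - h)        # proportional heading-rate command
        # Forward-Euler integration of the unicycle kinematics.
        x += v * math.cos(h) * dt
        y += v * math.sin(h) * dt
        h += omega * dt
        abs_errors.append(abs(y))
    return sum(abs_errors) / len(abs_errors)  # mean absolute lateral error

mae = simulate()
print("mean absolute lateral error [m]:", round(mae, 3))
```

Even this simple controller drives the lateral error to zero from a 0.5 m offset; the thesis's contribution lies in making such a controller's output independent of each machine's steering structure via inverse kinematics.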

    Use of Advanced Driver Assistance System Sensors for Human Detection and Work Machine Odometry

    This master's thesis covers two major topics: first, the use of Advanced Driver Assistance System (ADAS) sensors for human detection, and second, the use of ADAS sensors for odometry estimation of a mobile work machine. A solid-state lidar and an automotive radar are used as the ADAS sensors, and real-time Simulink models are created for both. The data is collected by connecting the sensors to the xPC Target via CAN communication and is later sent to the Robot Operating System (ROS) for visualization. The testing of the solid-state lidar and the automotive radar has been performed in different conditions and scenarios and is not limited to human detection; the detection of cars, machines, buildings, fences, and other objects has also been tested. The two major test cases were static and dynamic. In the static case, both sensors were mounted on a stationary rack and moving or stationary objects were detected. In the dynamic case, both sensors were mounted on the GIM mobile machine, which was driven around so that the sensors could detect objects in the environment. The results are promising, and it is concluded that the sensors can be used for human detection as well as for other applications. Furthermore, this research presents an algorithm for estimating the complete odometry/ego-motion of the mobile work machine using an automotive radar sensor. With this sensor and a gyroscope, the complete odometry of the GIM mobile machine is sought, comprising two components of linear velocity (forward and side slip) and one component of angular velocity. Kinematic equations are derived from the constraints of vehicle motion and stationary points in the environment.
The radial velocities and azimuth angles of the detected objects, provided by the automotive radar sensor, are the main inputs to these kinematic equations. A stationary environment is a prerequisite for accurate radar odometry estimation. Assuming the points detected by the automotive radar are stationary, the three unknown velocity components can in principle be calculated. With a single radar sensor, however, the resulting system of equations becomes singular; the literature suggests using multiple radar sensors, but in this research a vertical gyroscope is used instead to overcome the singularity. The GIM mobile machine, equipped with a single automotive radar sensor and a vertical gyroscope, is used for the experiments. The results have been compared with the algorithm presented in [32] as well as with the wheel odometry of the GIM mobile machine, and they have also been tested against a complete navigation solution (GNSS included) as a reference path
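The core of such a radar-odometry computation can be sketched as a small least-squares problem: each stationary detection constrains the sensor's velocity through its radial velocity and azimuth, and the gyroscope supplies the yaw rate that a single radar alone cannot recover. The sign conventions, mounting offset, and all numbers below are illustrative assumptions, not the thesis's actual formulation or data:

```python
import math

# Sketch of radar ego-motion estimation aided by a gyroscope, assuming
# every detection is a stationary point. For a sensor moving with
# velocity (sx, sy), a stationary target at azimuth theta yields the
# radial velocity vr = -(sx*cos(theta) + sy*sin(theta)).

def radar_odometry(detections, omega, lever_arm):
    """Least-squares estimate of vehicle (vx, vy) from one radar scan.

    detections: list of (theta, vr) pairs
    omega:      yaw rate from the vertical gyroscope [rad/s]
    lever_arm:  radar mounting offset (lx, ly) in the body frame [m]
    """
    # Build the 2x2 normal equations for the sensor-frame velocity.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for theta, vr in detections:
        c, s = math.cos(theta), math.sin(theta)
        a11 += c * c; a12 += c * s; a22 += s * s
        b1 += -vr * c; b2 += -vr * s
    det = a11 * a22 - a12 * a12
    sx = (b1 * a22 - b2 * a12) / det
    sy = (b2 * a11 - b1 * a12) / det
    # Sensor velocity = vehicle velocity + omega x lever_arm, so undo
    # the rotation-induced component at the mounting point.
    lx, ly = lever_arm
    return sx + omega * ly, sy - omega * lx

# Synthetic scan: vehicle at vx=2.0 m/s, vy=0.1 m/s, omega=0.2 rad/s,
# radar mounted 1 m ahead of the body origin (made-up values).
vx, vy, omega, (lx, ly) = 2.0, 0.1, 0.2, (1.0, 0.0)
sx_true, sy_true = vx - omega * ly, vy + omega * lx
scan = []
for deg in (-60, -20, 10, 45, 80):
    th = math.radians(deg)
    scan.append((th, -(sx_true * math.cos(th) + sy_true * math.sin(th))))

est = radar_odometry(scan, omega, (lx, ly))
print(round(est[0], 3), round(est[1], 3))  # 2.0 0.1
```

With only radar measurements, omega would be a third unknown and the equations become rank-deficient for a single sensor, which is exactly why the gyroscope (or a second radar) is needed.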

    Proceedings of the NASA Conference on Space Telerobotics, volume 3

    The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research

    Design and implementation of flexible microprocessor control for retrofitting to first generation robotic devices

    This Master of Science project concerns the design and development of a flexible microprocessor-based controller for a Versatran industrial robot. The software and hardware are designed in modules to enhance the flexibility of the controller, so that it can be used as the control unit for other forms of work-handling equipment. The hardware of the designed controller is based on the Texas Instruments single-board computer and interface printed circuit boards, although some specially designed interface hardware was required. The software is developed in two major categories: "real-time" modules and "operator communication" modules. The real-time modules control the hydraulic servo-valves, pneumatic actuators, and interlock switches, whilst the operator communication modules assist the operator in programming "handling sequences". The main advantages of the controller in its present form can be summarised as follows: (i) the down-time between program changes is significantly reduced; (ii) many more positions can be programmed in a handling sequence; (iii) greater control over axis dynamics can be achieved. The software and hardware structure adopted has sufficient flexibility to allow many future enhancements. For example, as part of a subsequent research project, additional facilities are being implemented: a hand-held teach pendant is being installed to further improve the ease with which handling sequences can be programmed; improved control algorithms that will facilitate contouring are being implemented; and communication software is being included so that the controller can access a commercially available local area network via a node

    Conference on Intelligent Robotics in Field, Factory, Service, and Space (CIRFFSS 1994), volume 1

    The AIAA/NASA Conference on Intelligent Robotics in Field, Factory, Service, and Space (CIRFFSS '94) was originally proposed because of the strong belief that America's problems of global economic competitiveness and job creation and preservation can partly be solved by the use of intelligent robotics, which are also required for human space exploration missions. Individual sessions addressed nuclear industry, agile manufacturing, security/building monitoring, on-orbit applications, vision and sensing technologies, situated control and low-level control, robotic systems architecture, environmental restoration and waste management, robotic remanufacturing, and healthcare applications

    Proceedings of the NASA Conference on Space Telerobotics, volume 2

    These proceedings contain papers presented at the NASA Conference on Space Telerobotics held in Pasadena, January 31 to February 2, 1989. The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research

    Fourth Annual Workshop on Space Operations Applications and Research (SOAR 90)

    The proceedings of the SOAR workshop are presented. The technical areas included are as follows: Automation and Robotics; Environmental Interactions; Human Factors; Intelligent Systems; and Life Sciences. NASA and Air Force programmatic overviews and panel sessions were also held in each technical area