
    Utilization of Bluetooth and the Accelerometer Sensor on an Android-Based Cellphone for Mobile Robot Motion Control

    The accelerometer is one of the many sensors embedded in an Android-based cellphone. It works by measuring gravitational acceleration: movements of the phone, such as swinging or rotating, are read by the accelerometer and used to change the orientation of the phone screen, and in Android games to control movement. Bluetooth, in turn, is a standard wireless communication technology for exchanging data between cellphones. The application built by the authors uses Bluetooth as a bridge between an Android-based cellphone and a mobile robot, with the accelerometer acting as the robot's motion controller in real time. The robot's motion is controlled by sending changes in the accelerometer values to the robot over serial communication. This application is expected to contribute to the development of mobile robot control.
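    As a hedged sketch of the control scheme described above (the exact axis mapping, dead zone, and wheel-speed scaling are illustrative assumptions, not taken from the paper), the tilt-to-command step for a differential-drive robot might look like this:

```python
def tilt_to_command(ax, ay, dead_zone=1.0, max_tilt=9.8):
    """Map accelerometer tilt (m/s^2) to wheel speeds in [-1, 1].

    ay (pitch) sets forward/backward speed, ax (roll) steers.
    Readings inside the dead zone are ignored so the robot stays
    still while the phone is held roughly level.
    """
    def scale(v):
        if abs(v) < dead_zone:
            return 0.0
        return max(-1.0, min(1.0, v / max_tilt))

    forward = scale(ay)
    turn = scale(ax)
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right
```

    On the phone side, each new `(left, right)` pair would then be serialized and written to the Bluetooth serial link whenever an accelerometer reading arrives.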

    Mapless LiDAR Navigation Control of Wheeled Mobile Robots Based on Deep Imitation Learning

    This paper addresses the mapless navigation control of wheeled mobile robots based on deep learning technology. The traditional navigation control framework relies on a global map of the environment, and its navigation performance depends on the quality of that map. In this paper, we propose a mapless Light Detection and Ranging (LiDAR) navigation control method for wheeled mobile robots based on deep imitation learning. The proposed method is data-driven and directly uses LiDAR readings and the relative target position for navigation control. A deep convolutional neural network (CNN) model is proposed to predict the robot's motion control commands without requiring a global map, achieving navigation in unknown environments. To collect the training dataset, we manually steered the mobile robot around obstacles and recorded the raw LiDAR data, the relative target position, and the corresponding motion control commands. We then applied a data augmentation method to the recorded samples to increase the number of training samples. In the network design, the proposed CNN model consists of a LiDAR CNN module that extracts LiDAR features and a motion prediction module that predicts the robot's motion behavior. During training, the model learns the mapping between the input sensor data and the desired motion behavior through end-to-end imitation learning. Experimental results show that the proposed method safely navigates the mobile robot in four unseen environments with an average success rate of 75%. The proposed mapless LiDAR navigation control system is therefore effective for robot navigation in unknown environments without a global map.
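    The abstract mentions augmenting the recorded samples but not how. One common augmentation for this kind of LiDAR imitation-learning dataset is left-right mirroring; the sketch below assumes that scheme and the sample layout described in its docstring, neither of which is confirmed by the paper:

```python
def mirror_sample(scan, target, command):
    """Left-right mirror one training sample: reverse the LiDAR scan,
    negate the lateral target offset, and negate the angular command.

    scan:    list of ranges ordered from rightmost to leftmost beam
    target:  (forward, lateral) relative target position
    command: (linear_v, angular_v) recorded motion command
    """
    fx, ly = target
    v, w = command
    return list(reversed(scan)), (fx, -ly), (v, -w)
```

    Doubling the dataset this way also balances left and right turns, which helps keep the learned policy from drifting to one side.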

    A teleoperation framework for mobile robots based on shared control

    Mobile robots can complete a task in cooperation with a human partner. In this paper, a hybrid shared control method for a mobile robot with omnidirectional wheels is proposed. A human partner uses a six-degrees-of-freedom haptic device and an electromyography (EMG) sensor to control the mobile robot. A hybrid shared control approach based on EMG and an artificial potential field is exploited to avoid obstacles, according to the repulsive and attractive forces, and to enhance the human's perception of the remote environment through force feedback from the mobile platform. This shared control method enables the human partner to teleoperate the mobile robot's motion and achieve obstacle avoidance simultaneously. Compared with conventional shared control methods, the proposed one provides force feedback based on muscle activation and helps the human partner update their control intention predictably. Experimental results demonstrate the enhanced performance of the mobile robot in comparison with methods from the literature.
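    The artificial potential field mentioned above is a standard construction: an attractive force pulling the robot toward the goal plus repulsive forces pushing it away from nearby obstacles. A minimal 2-D sketch (the gains `k_att`, `k_rep` and influence radius `d0` are illustrative values, not the paper's):

```python
import math

def apf_force(robot, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.0):
    """Resultant 2-D force of an artificial potential field:
    a spring-like attraction toward the goal plus a repulsion
    from each obstacle closer than the influence radius d0."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < d0:
            # Gradient of the usual repulsive potential:
            # magnitude k_rep * (1/d - 1/d0) / d^2, directed away
            # from the obstacle.
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * dx
            fy += mag * dy
    return fx, fy
```

    In the shared-control setting, this resultant force would be blended with the operator's haptic input and also rendered back to the operator as force feedback.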

    The off-line programming of a PC based industrial robot with sensory feedback (volume I of II)

    The need for sensor-based automatic motion planning and control of industrial robots in unstructured environments is extensive. Examples include in-factory transportation, household chores, military applications, and chemical, radioactive, and other settings dangerous to humans. Researchers are attempting to build systems capable of generating purposeful motion in highly uncertain, complex environments, using on-line information from robot sensors. An example of such a task would be moving a mobile robot or a manipulator arm from its starting position to a goal position in a scene with unknown, arbitrarily shaped obstacles. Carrying out such tasks requires, first, sensors and related …

    Comparative Study of Computer Vision Based Line Followers Using Raspberry Pi and Jetson Nano

    The line follower robot is a mobile robot that can navigate from one place to another by following a trajectory, generally a black or white line. Such robots can also assist humans in transportation and industrial automation. However, they face several challenges, including calibration, incompatibility with wavy surfaces, and light sensor placement under varying line widths. Robot vision uses image processing and computer vision technology to recognize objects and control the robot's motion. This study discusses the implementation of a vision-based line follower robot using a camera as the only sensor for capturing objects. A comparison of robot performance with two different CPU controllers, the Raspberry Pi and the Jetson Nano, is made. The image processing uses an edge detection method, which finds the border between two image areas and marks the different parts. This enables the robot to control its motion based on the object captured by the webcam. The results show that the accuracies of the robot using the Raspberry Pi and the Jetson Nano are 96% and 98%, respectively.
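    A minimal sketch of edge-based line detection, operating on a single grayscale image row (the thresholding scheme and normalization are assumptions for illustration; the paper's pipeline runs on full camera frames):

```python
def line_center(row, threshold=128):
    """Locate a dark line in one grayscale image row by finding its
    left and right edges (pixels darker than `threshold`) and
    returning the line's center column, or None if no line is seen."""
    dark = [i for i, px in enumerate(row) if px < threshold]
    if not dark:
        return None
    return (dark[0] + dark[-1]) / 2

def steering_error(row, threshold=128):
    """Signed offset of the line center from the image center,
    normalized to [-1, 1]; the motion controller steers so as to
    drive this error to zero."""
    c = line_center(row, threshold)
    if c is None:
        return None
    mid = (len(row) - 1) / 2
    return (c - mid) / mid
```

    A proportional controller fed by `steering_error` is the simplest way to close the loop; the Raspberry Pi vs. Jetson Nano comparison then largely comes down to how fast each board can run this per-frame processing.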

    Dynamic sensor planning with stereo for model identification on a mobile platform

    This paper presents an approach to sensor planning for simultaneous pose estimation and model identification of a moving object using a stereo camera mounted on a mobile base. Given a database of object models, we consider the problem of identifying an object known to belong to the database, and of deciding where to move next should the object not be easily identifiable from the initial viewpoint. No constraints are assumed on the motion of the object or of the robot itself, which is an improvement over previous methods. Sensor planning selects the control action that optimizes a cost metric based on information gain. Experimental results from an implementation on a two-wheeled nonholonomic robot are presented to illustrate and validate the method.
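    The information-gain criterion described above can be sketched as an expected entropy reduction over the belief about which database model is being observed (the discrete action set and the pre-computed predicted beliefs below are placeholders; the paper's cost metric and belief update are not reproduced here):

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete belief distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def best_action(belief, predicted_beliefs):
    """Pick the control action with the highest expected information
    gain, i.e. the largest drop in entropy of the belief over the
    candidate object models."""
    h0 = entropy(belief)
    gains = {a: h0 - entropy(b) for a, b in predicted_beliefs.items()}
    return max(gains, key=gains.get)
```

    Here `predicted_beliefs` maps each candidate viewpoint change to the belief the planner expects after observing from there; in practice those predictions come from the sensor model rather than being given directly.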

    A Novel Artificial Organic Controller with Hermite Optical Flow Feedback for Mobile Robot Navigation

    This chapter describes a novel nature-inspired, intelligent control system for mobile robot navigation, using a fuzzy-molecular inference (FMI) system as the control strategy and a single vision-based sensor device, that is, an image acquisition system, as feedback. In particular, the FMI system is proposed as a hybrid fuzzy inference system with an artificial hydrocarbon network structure as the defuzzifier, which handles uncertainty in the motion feedback and improves robot navigation in dynamic environments. Additionally, the robotic system uses information from the image acquisition device processed with a real-time Hermite optical flow approach. This organic, nature-inspired control strategy was compared with a conventional controller and validated on an educational robot platform, providing excellent results when navigating dynamic environments with a single, constrained perception device.
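    As a simplified illustration of the fuzzy-control idea (the actual FMI system replaces the defuzzifier with an artificial hydrocarbon network, which is not reproduced here; the membership functions, rule outputs, and the meaning of `offset` below are invented for the sketch):

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steer(offset):
    """Toy Mamdani-style rule base: fuzzify a normalized optical-flow
    offset (negative = motion on the left, positive = on the right),
    fire three rules, and defuzzify with a weighted average of the
    crisp steering outputs, steering away from the detected motion."""
    left = tri(offset, -2.0, -1.0, 0.0)
    center = tri(offset, -1.0, 0.0, 1.0)
    right = tri(offset, 0.0, 1.0, 2.0)
    rules = [(left, 1.0), (center, 0.0), (right, -1.0)]
    total = sum(w for w, _ in rules)
    if total == 0:
        return 0.0
    return sum(w * out for w, out in rules) / total
```

    The point of the FMI hybrid is that this last weighted-average step is where a conventional defuzzifier would be swapped for the artificial hydrocarbon network.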