
    The Robot Car


    Line-Follower and Obstacle-Detection Robot Car Using a BS2SX Basic Stamp Microcontroller

    This study aims to design a robot car that follows a line and avoids obstacles when it detects an object. The robot car's navigation system uses light-dependent resistor (LDR) and ultrasonic sensors. Navigation is controlled by BS2SX Basic Stamp microcontroller hardware, with the software written in the PBASIC programming language. The program is written in the Basic Stamp Editor installed on a PC and then downloaded to the EEPROM. The ultrasonic sensor is used to detect obstacles, while the LDR sensors detect the tracked line. The robot follows the marked line; if there is an obstacle ahead it steers around it, and it resumes its course once the ultrasonic sensor no longer detects anything in front of it. Using Pulse Width Modulation (PWM) signals from the microcontroller, the robot car can drive straight, turn, and avoid detected obstacles. ©2017 JNSMR UIN Walisongo. All rights reserved.
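
    The abstract does not include source code; the following is a minimal Python sketch of the control loop it describes (the original firmware is written in PBASIC for the BS2SX). The helper functions read_ldr, read_distance_cm, and set_motor_pwm, as well as the thresholds, are hypothetical placeholders, not the authors' API.

```python
# Illustrative control loop for a line follower with ultrasonic obstacle avoidance.
# The hardware helpers passed in (read_ldr, read_distance_cm, set_motor_pwm) are
# hypothetical stand-ins for the PBASIC I/O used on the BS2SX Basic Stamp.

OBSTACLE_CM = 15        # avoid if an object is closer than this (assumed value)
LINE_THRESHOLD = 512    # LDR reading above this means "on the dark line" (assumed)

def step(read_ldr, read_distance_cm, set_motor_pwm):
    """One control iteration: avoid obstacles first, otherwise steer along the line."""
    if read_distance_cm() < OBSTACLE_CM:
        set_motor_pwm(left=-0.4, right=0.4)       # pivot away from the obstacle
        return
    left_on = read_ldr("left") > LINE_THRESHOLD
    right_on = read_ldr("right") > LINE_THRESHOLD
    if left_on and right_on:
        set_motor_pwm(left=0.6, right=0.6)        # centred on the line: drive straight
    elif left_on:
        set_motor_pwm(left=0.3, right=0.6)        # drifting right: turn left
    elif right_on:
        set_motor_pwm(left=0.6, right=0.3)        # drifting left: turn right
    else:
        set_motor_pwm(left=0.2, right=0.5)        # line lost: search gently

if __name__ == "__main__":
    # Dry run with stub sensors/motors to show the decision logic.
    step(lambda side: 600, lambda: 50,
         lambda left, right: print(f"left PWM={left}, right PWM={right}"))
```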

    Development of PAN (personal area network) for Mobile Robot Using Bluetooth Transceiver

    In recent years, wireless applications using radio frequency (RF) have evolved rapidly in personal computing and communications devices. Bluetooth technology was created to replace the cables used on mobile devices; it is an open specification offering a simple, low-cost, low-power solution for integration into devices. The aim of this work was to provide a PAN (personal area network) for computer-based mobile robots that supports real-time control of four mobile robots from a host mobile robot. With an ad hoc topology, a mobile robot may request and establish a connection when it is within range and terminate the connection when it leaves the area. A system comprising both hardware and software is designed to enable the robots to participate in a multi-agent robotics system (MARS). The computer-based mobile robots run an operating system that enables wireless connections to be established via IP addresses.
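
    The abstract does not give implementation details; below is a minimal Python sketch of the kind of host-to-robot command link it describes, using ordinary TCP sockets over the IP addresses the abstract mentions. The bind address, port, handshake, and message format are illustrative assumptions, not the authors' protocol.

```python
import socket
import threading

HOST_ADDR = ("0.0.0.0", 5000)   # illustrative bind address/port, not from the paper
MAX_ROBOTS = 4                  # the abstract describes controlling four robots

def handle_robot(conn: socket.socket, addr) -> None:
    """Exchange simple text messages with one connected robot."""
    with conn:
        conn.sendall(b"HELLO\n")             # illustrative handshake
        for line in conn.makefile("rb"):     # robot reports status lines back
            print(f"{addr}: {line.decode().strip()}")

def run_host() -> None:
    """Accept connections from up to MAX_ROBOTS robots, one thread per robot."""
    with socket.create_server(HOST_ADDR) as server:
        threads = []
        for _ in range(MAX_ROBOTS):
            conn, addr = server.accept()     # robots join the PAN ad hoc
            t = threading.Thread(target=handle_robot, args=(conn, addr))
            t.start()
            threads.append(t)
        for t in threads:
            t.join()

if __name__ == "__main__":
    run_host()
```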

    Use of Mini Solar Panels for Battery Charging in the Mini Robot Warehouse

    Industrial businesses employ various robot cars to carry out goods-movement tasks, such as warehouse robot cars that use a line-follower sensor system. However, the substantial electricity consumption of these robot cars poses a challenge. In light of this issue, this research assesses the use of mini solar panels in robot cars equipped with 3.3 V Li-ion 18650 batteries. The primary objective is to calculate the electricity required to operate a line-following robot car within the designated sector. The approach involves designing the robot and solar panel systems and measuring input and output power, including charging measurements. According to the findings, the solar panel can charge at a maximum rate of 0.206358 W for 45 minutes, while the robot operates at a minimum rate of 0.26 W for 29 minutes, giving a charging-to-robot-performance efficiency of 43.5 percent.
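
    The abstract reports power levels and durations for both charging and operation; the sketch below shows the straightforward energy-balance arithmetic those figures support. It is illustrative only and does not claim to reproduce the authors' 43.5 percent figure, whose exact definition is not given in the abstract.

```python
# Energy-balance arithmetic from the figures quoted in the abstract.
# Illustrative only; the 43.5% efficiency figure is defined by the authors.

CHARGE_POWER_W = 0.206358   # maximum solar charging power (from the abstract)
CHARGE_TIME_MIN = 45        # charging duration (from the abstract)
RUN_POWER_W = 0.26          # minimum robot operating power (from the abstract)
RUN_TIME_MIN = 29           # operating duration (from the abstract)

def watt_hours(power_w: float, minutes: float) -> float:
    """Convert an average power over a duration into energy in watt-hours."""
    return power_w * minutes / 60.0

charged = watt_hours(CHARGE_POWER_W, CHARGE_TIME_MIN)   # ~0.155 Wh harvested
consumed = watt_hours(RUN_POWER_W, RUN_TIME_MIN)        # ~0.126 Wh consumed

print(f"energy charged:  {charged:.3f} Wh")
print(f"energy consumed: {consumed:.3f} Wh")
print(f"charged / consumed ratio: {charged / consumed:.2f}")
```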

    Design and Testing of a Water Spinach Seedling-Planting Robot Car Model (Perancangan dan Pengujian Model Mobil Robot Penanam Bibit Kangkung)

    Abstract. Water spinach (kangkung) seedlings are normally planted by dibbling, a process that takes a long time, so farmers often broadcast the seeds instead to shorten planting time; this risks crop failure because seeds are left unplanted. The purpose of this research was to build a robot car for planting water spinach seedlings. The method comprised measuring the average bed size, designing the robot car, assembling it, and testing it. The results include the design and design analysis, the assembly, the test results, and the planting results. The robot car was sized to match the average bed used by farmers, 1.6 × 1.4 × 0.5 m; the analysis showed a maximum load of 28.80 kg with a maximum displacement of 3.266 mm, so the chassis remains safe to use. Assembly covered the electronic and chassis components in accordance with the design. The assembled robot was then tested. The electronics test showed an output current of 3 A, a voltage of 12 V, and a resistance of 5.4 ohms; with a 7.4 Ah battery the robot can work for 4.5 hours, using a step-down converter to keep the current stable, which keeps the speed stable and prevents the robot car's components from being burned by voltage spikes. Because the robot car works autonomously, vision testing was also required to give the robot a sense of sight for its motion. In 4.5 hours the robot car can plant seedlings over a land area of 4.7 hectares (4,788 m²).
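
    As a quick sanity check on the reported coverage, the sketch below derives the implied coverage rate and average travel speed from the figures quoted in the abstract, taking the 1.6 m bed width as the effective working width. It is an illustrative back-of-the-envelope calculation, not a result from the paper.

```python
# Back-of-the-envelope coverage calculation from the abstract's figures.
# The 1.6 m width and the 4,788 m2 / 4.5 h coverage come from the abstract;
# everything derived below is illustrative only.

AREA_M2 = 4788.0        # planted area reported in the abstract
DURATION_H = 4.5        # operating time reported in the abstract
WORKING_WIDTH_M = 1.6   # bed width, assumed equal to the effective working width

coverage_rate_m2_per_h = AREA_M2 / DURATION_H                   # ~1064 m^2/h
travel_distance_m = AREA_M2 / WORKING_WIDTH_M                   # ~2993 m of rows
avg_speed_m_per_s = travel_distance_m / (DURATION_H * 3600.0)   # ~0.18 m/s

print(f"coverage rate: {coverage_rate_m2_per_h:.0f} m^2/h")
print(f"implied average speed: {avg_speed_m_per_s:.2f} m/s")
```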

    Towards Visual Ego-motion Learning in Robots

    Many model-based Visual Odometry (VO) algorithms have been proposed in the past decade, often restricted to the type of camera optics or the underlying motion manifold observed. We envision robots to be able to learn and perform these tasks, in a minimally supervised setting, as they gain more experience. To this end, we propose a fully trainable solution to visual ego-motion estimation for varied camera optics. We propose a visual ego-motion learning architecture that maps observed optical flow vectors to an ego-motion density estimate via a Mixture Density Network (MDN). By modeling the architecture as a Conditional Variational Autoencoder (C-VAE), our model is able to provide introspective reasoning and prediction for ego-motion-induced scene flow. Additionally, our proposed model is especially amenable to bootstrapped ego-motion learning in robots, where the supervision in ego-motion estimation for a particular camera sensor can be obtained from standard navigation-based sensor fusion strategies (GPS/INS and wheel-odometry fusion). Through experiments, we show the utility of our proposed approach in enabling the concept of self-supervised learning for visual ego-motion estimation in autonomous robots.
    Comment: Conference paper; submitted to IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2017, Vancouver, CA; 8 pages, 8 figures, 2 tables.
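
    The paper's full architecture (optical flow to an MDN ego-motion density, wrapped in a C-VAE) is not reproduced here; the following is a minimal NumPy sketch of the mixture-density-output idea only, mapping a flow feature vector to the parameters of a Gaussian mixture over 6-DoF ego-motion. The layer sizes, single linear layer, and component count are assumptions and make no claim to match the authors' model.

```python
import numpy as np

# Sketch of a mixture density output head: a feature vector derived from optical
# flow is mapped to the parameters (weights, means, scales) of a Gaussian mixture
# over ego-motion. All sizes below are illustrative assumptions.

FLOW_DIM = 64     # assumed dimension of the flow feature vector
EGO_DIM = 6       # 6-DoF ego-motion (translation + rotation)
N_COMPONENTS = 5  # assumed number of mixture components

rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.01, (FLOW_DIM, N_COMPONENTS * (1 + 2 * EGO_DIM)))
b = np.zeros(N_COMPONENTS * (1 + 2 * EGO_DIM))

def mdn_head(flow_feat: np.ndarray):
    """Map a flow feature vector to mixture weights, means, and std deviations."""
    out = flow_feat @ W + b
    logits = out[:N_COMPONENTS]
    means = out[N_COMPONENTS:N_COMPONENTS * (1 + EGO_DIM)].reshape(N_COMPONENTS, EGO_DIM)
    log_sigma = out[N_COMPONENTS * (1 + EGO_DIM):].reshape(N_COMPONENTS, EGO_DIM)
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                      # softmax -> mixture weights
    sigma = np.exp(log_sigma)           # positive scales
    return pi, means, sigma

def neg_log_likelihood(pi, means, sigma, ego: np.ndarray) -> float:
    """Negative log-likelihood of an observed ego-motion under the mixture."""
    diff = (ego - means) / sigma
    comp_log = -0.5 * np.sum(diff**2 + np.log(2 * np.pi * sigma**2), axis=1)
    return float(-np.log(np.sum(pi * np.exp(comp_log)) + 1e-12))

pi, mu, sd = mdn_head(rng.normal(size=FLOW_DIM))
print(neg_log_likelihood(pi, mu, sd, np.zeros(EGO_DIM)))
```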