44,661 research outputs found

    Development of an Optical System Based on Spectral Imaging Used for a Slug Control Robot

    The state-of-the-art technique to control slug pests in agriculture is the spreading of slug pellets. This method has downsides: slug pellets also harm beneficial organisms and often fail because their efficacy depends on the prevailing weather conditions. This study is part of a research project developing a pest control robot to monitor the field, detect slugs, and eliminate them. Robots represent a promising alternative to slug pellets: they work independently of weather conditions and can distinguish between pests and beneficial organisms. As a prerequisite, a robot must be able to reliably identify slugs irrespective of the surrounding conditions. In this context, the use of computer vision and image analysis methods is challenging, because slugs look very similar to the soil, particularly in color images. The goal of this study was therefore to develop an optical filter-based system that distinguishes between slugs and soil. To this end, the spectral characteristics of both slugs and soil were measured in the visible and visible near-infrared (VNIR) wavebands. Conspicuous maxima followed by conspicuous local minima were found in the reflection spectra of slugs in the near-infrared range from 850 nm to 990 nm. This enabled differentiation between slugs and soil, because soils showed a monotonic increase in relative reflection intensity over this wavelength range. The extrema determined in the reflection spectra of slugs were used to develop and set up a slug detector consisting of a monochromatic camera, a filter changer, and two narrow bandpass filters with nominal wavelengths of 925 nm and 975 nm. The optical system takes two photographs of the target area at night. By subtracting the pixel values of the two images, the slugs are highlighted and the soil is suppressed, owing to the differing reflection spectra of soils and slugs. In the resulting image, slug pixels were on average 12.4 times brighter than soil pixels, which enabled the detection of slugs by a threshold method.
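    The subtraction-and-threshold step described above can be sketched in a few lines. The Python/OpenCV snippet below is only an illustration of the idea: the file names, the assignment of 925 nm to the reflectance maximum and 975 nm to the local minimum, and the threshold value are assumptions, not details taken from the study.

```python
import numpy as np
import cv2  # OpenCV, used here only for image I/O and thresholding

# Hypothetical file names for the two narrow-band exposures of the same scene.
IMG_925_NM = "scene_925nm.png"   # assumed to lie near the slug reflectance maximum
IMG_975_NM = "scene_975nm.png"   # assumed to lie near the following local minimum

def detect_slugs(path_925, path_975, threshold=30):
    """Highlight slugs by subtracting the two band images and thresholding.

    Slug reflectance drops between the two bands, while soil reflectance rises
    monotonically, so the difference image is bright for slug pixels and near
    zero for soil. The threshold value is an assumption; the study only reports
    that slug pixels were on average 12.4 times brighter than soil pixels.
    """
    band_925 = cv2.imread(path_925, cv2.IMREAD_GRAYSCALE).astype(np.int16)
    band_975 = cv2.imread(path_975, cv2.IMREAD_GRAYSCALE).astype(np.int16)

    # Difference image: positive where the 925 nm response exceeds the 975 nm one.
    diff = np.clip(band_925 - band_975, 0, 255).astype(np.uint8)

    # A simple global threshold separates slug pixels from the suppressed soil.
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return diff, mask

if __name__ == "__main__":
    diff, mask = detect_slugs(IMG_925_NM, IMG_975_NM)
    print(f"slug pixels detected: {int(np.count_nonzero(mask))}")
```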

    A steady state tip control strategy for long reach robots

    The work presented in this thesis describes the development of a novel strategy for steady-state tip position control of a single-link flexible robot arm. Control is based upon a master/slave relationship. The arm trajectory is defined through a 'master' positioning head which moves a laser through a programmed path. Tip position is detected by an optical system which produces an error signal proportional to the displacement of the tip from the demanded laser spot position. The error signal and its derivative form the inputs to the arm 'slave' controller, enabling direct tip control with simultaneous correction for arm bending. Trajectory definition is not model-based, as it is defined optically through movement of the positioning head alone. A critical investigation of vacuum tube and solid-state sensing methods is undertaken, leading to the development of a photodiode quadrant-detector beam tracking system. The effect of varying the incident light parameters on beam tracker performance is examined, from which the optimum illumination characteristics are determined. Operational testing of the system on a dual-axis prototype robot using the purpose-built beam tracker has shown that successful steady-state tip control can be achieved through a PD-based slave controller. Errors of less than 0.05 mm and settling times of 0.2 s are obtained. These results compare favourably with those of model-based tip position correction strategies, where tracking errors of ±0.6 mm are recorded.
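    The slave controller described above acts on the optical tip error and its derivative. The short Python sketch below illustrates such a PD loop on a crude simulated plant; the gains, sample time and first-order arm response are illustrative assumptions, not values from the thesis.

```python
# Minimal PD tip-control sketch: the optically measured tip error and its
# derivative drive the actuator. All numerical values are assumptions.

KP, KD, DT = 8.0, 0.6, 0.001   # assumed proportional gain, derivative gain, period (s)

def pd_step(error, prev_error):
    """One PD update computed from the tip error signal and its previous sample."""
    derivative = (error - prev_error) / DT
    return KP * error + KD * derivative

if __name__ == "__main__":
    demand, tip = 1.0, 0.0        # demanded laser-spot position and current tip position
    prev_error = demand - tip
    for _ in range(5000):         # 5 s of simulated control
        error = demand - tip      # stands in for the quadrant-detector error signal
        u = pd_step(error, prev_error)
        tip += DT * u             # crude first-order plant response (assumption)
        prev_error = error
    print(f"steady-state tip error: {demand - tip:.4f}")
```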

    Vision-based interface applied to assistive robots

    This paper presents two vision-based interfaces that allow disabled people to command a mobile robot for personal assistance. The interfaces differ in the image processing algorithm used to detect and track two different body regions. The first interface detects and tracks movements of the user's head, which are transformed into linear and angular velocities to command the mobile robot. The second interface detects and tracks movements of the user's hand, which are transformed in the same way. The paper also presents the control laws for the robot. The experimental results demonstrate good performance and a reasonable balance between complexity and feasibility for real-time applications.
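    A mapping from tracked displacement to velocity commands of the kind described above might look like the following sketch. The gains, dead zone, saturation limits and sign conventions are assumptions for illustration, not the control laws given in the paper.

```python
import math

# Sketch: convert the tracked head (or hand) displacement in the image, in
# pixels, into linear and angular velocity commands for the mobile robot.
K_LINEAR = 0.004          # m/s per pixel of vertical displacement (assumed)
K_ANGULAR = 0.01          # rad/s per pixel of horizontal displacement (assumed)
DEAD_ZONE = 10            # pixels of displacement ignored to reject tracking jitter
V_MAX, W_MAX = 0.5, 1.0   # saturation limits on the commands (assumed)

def displacement_to_command(dx_px, dy_px):
    """Map tracked displacement (pixels) to (linear, angular) velocity commands."""
    def dead_zone(d):
        return 0.0 if abs(d) < DEAD_ZONE else d - math.copysign(DEAD_ZONE, d)

    # Assumed conventions: moving up in the image drives forward, moving left turns left.
    v = max(-V_MAX, min(V_MAX, K_LINEAR * dead_zone(-dy_px)))
    w = max(-W_MAX, min(W_MAX, K_ANGULAR * dead_zone(-dx_px)))
    return v, w

if __name__ == "__main__":
    print(displacement_to_command(dx_px=60, dy_px=-120))  # head moved up and to the right
```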

    Optical coherence tomography-based consensus definition for lamellar macular hole.

    Background: A consensus on an optical coherence tomography definition of lamellar macular hole (LMH) and similar conditions is needed. Methods: The panel reviewed relevant peer-reviewed literature to reach an accord on the LMH definition and to differentiate LMH from other similar conditions. Results: The panel reached a consensus on the definition of three clinical entities: LMH, epiretinal membrane (ERM) foveoschisis and macular pseudohole (MPH). The LMH definition is based on three mandatory criteria and three optional anatomical features. The three mandatory criteria are an irregular foveal contour, a foveal cavity with undermined edges and the apparent loss of foveal tissue. The optional anatomical features are epiretinal proliferation, a central foveal bump and disruption of the ellipsoid zone. The ERM foveoschisis definition is based on two mandatory criteria: the presence of an ERM and schisis at the level of Henle's fibre layer. Three optional anatomical features can also be present: microcystoid spaces in the inner nuclear layer (INL), increased retinal thickness and retinal wrinkling. The MPH definition is based on three mandatory criteria and two optional anatomical features. The mandatory criteria are a foveal-sparing ERM, a steepened foveal profile and increased central retinal thickness. The optional anatomical features are microcystoid spaces in the INL and a normal retinal thickness. Conclusions: The use of the proposed definitions may provide uniform language for clinicians and future research.
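    For readers who want to apply these criteria programmatically, the sketch below encodes the mandatory/optional structure of the three definitions as a small data structure. The criterion strings follow the abstract; the rule that a definition is met when all mandatory criteria are present is an assumption about how such consensus criteria are typically combined, not a statement from the paper.

```python
# Hypothetical encoding of the consensus criteria summarised above.
CRITERIA = {
    "LMH": {
        "mandatory": ["irregular foveal contour",
                      "foveal cavity with undermined edges",
                      "apparent loss of foveal tissue"],
        "optional": ["epiretinal proliferation",
                     "central foveal bump",
                     "ellipsoid zone disruption"],
    },
    "ERM foveoschisis": {
        "mandatory": ["epiretinal membrane",
                      "schisis at Henle's fibre layer"],
        "optional": ["microcystoid spaces in the INL",
                     "increased retinal thickness",
                     "retinal wrinkling"],
    },
    "MPH": {
        "mandatory": ["foveal-sparing epiretinal membrane",
                      "steepened foveal profile",
                      "increased central retinal thickness"],
        "optional": ["microcystoid spaces in the INL",
                     "normal retinal thickness"],
    },
}

def meets_definition(entity, findings):
    """Assumed rule: a definition is met when every mandatory criterion is present."""
    return all(criterion in findings for criterion in CRITERIA[entity]["mandatory"])

if __name__ == "__main__":
    scan_findings = {"irregular foveal contour", "foveal cavity with undermined edges",
                     "apparent loss of foveal tissue", "central foveal bump"}
    print(meets_definition("LMH", scan_findings))  # True
```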

    Shape sensing of miniature snake-like robots using optical fibers

    Snake-like continuum robots are increasingly used for minimally invasive surgery. Most robotic devices of this sort reported to date are controlled in an open-loop manner. Using shape sensing to provide closed-loop feedback would allow more accurate control of the robot's position and, hence, more precise surgery. Fiber Bragg gratings, magnetic sensors and optical reflectance sensors have all been reported for this purpose, but are often limited by their cost, size, stiffness or complexity of fabrication. To address this issue, we designed, manufactured and tested a prototype two-link robot with a built-in fiber-optic shape sensor that can deliver and control the position of a CO₂ laser fiber for soft tissue ablation. The shape sensing is based on optical reflectance, and the device (which has a 4 mm outer diameter) is fabricated using 3D printing. Here we present proof-of-concept results demonstrating successful shape sensing, i.e. measurement of the angular displacement of the upper link of the robot relative to the lower link, in real time with a mean measurement error of only 0.7°.
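    One simple way to realise reflectance-based angle measurement of this kind is to map the returned intensity to a bend angle through a calibration curve. The Python sketch below illustrates that idea; the calibration samples are invented for illustration and are not data from the paper.

```python
import numpy as np

# Hypothetical calibration: reflectance readings (arbitrary units) at known joint angles.
CAL_ANGLES_DEG = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
CAL_INTENSITY = np.array([0.82, 0.64, 0.50, 0.38, 0.27])

def angle_from_reflectance(intensity):
    """Interpolate the upper-link angle from a reflectance sample using the calibration."""
    # np.interp requires monotonically increasing x values, so sort by intensity first.
    order = np.argsort(CAL_INTENSITY)
    return float(np.interp(intensity, CAL_INTENSITY[order], CAL_ANGLES_DEG[order]))

if __name__ == "__main__":
    print(f"estimated angle: {angle_from_reflectance(0.45):.1f} deg")
```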