4 research outputs found

    Integrating Automatic Bio-manipulation Algorithm Theory with Software

    In life-science research, bio-manipulation faces challenges such as long operator training times, low success rates and efficiency, and poor consistency and repeatability of operations; automatic bio-manipulation can address these problems. We integrate several commercial devices and use the mapping matrix between the visual system and the robotic manipulator system to derive an error-correction equation, and we apply a local-recursion and global-optimization calibration method to reduce the positioning error of the target point. On this basis, we use a Kalman filter to establish the system's control equations and control the precise positioning of the manipulation probe. Finally, we implement the algorithm in software and integrate it with the driver modules of the other devices, yielding a flexible, user-friendly software system. Experimental results show that the algorithm and software system can automatically control the precise motion of the probe, with a motion accuracy within one pixel. Supported by the National Natural Science Foundation of China (50875222) and the Natural Science Foundation of Fujian Province (2009J01265).
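
    The abstract gives the algorithm only in outline. As a rough illustration of the two ideas it names, a vision-to-manipulator mapping matrix fitted from calibration point pairs and a Kalman filter that smooths the probe position reported by the vision system, a minimal Python sketch might look as follows; the function names, state model and noise values are hypothetical and are not taken from the paper.

        import numpy as np

        # Hypothetical affine mapping from image pixels (u, v) to stage coordinates
        # (x, y), estimated from corresponding calibration points by least squares.
        def fit_pixel_to_stage(pixels, stage):
            """pixels, stage: (N, 2) arrays of corresponding points."""
            A = np.hstack([pixels, np.ones((len(pixels), 1))])  # rows [u, v, 1]
            M, *_ = np.linalg.lstsq(A, stage, rcond=None)       # 3x2 affine matrix
            return M

        def pixel_to_stage(M, pixel):
            return np.append(pixel, 1.0) @ M

        # Minimal constant-velocity Kalman filter smoothing the probe-tip position
        # measured by the vision system (one axis shown for brevity).
        class ProbeKalman1D:
            def __init__(self, dt, q=1e-3, r=0.25):
                self.x = np.zeros(2)                        # state: [position, velocity]
                self.P = np.eye(2)
                self.F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
                self.H = np.array([[1.0, 0.0]])             # only position is observed
                self.Q = q * np.eye(2)                      # process noise
                self.R = np.array([[r]])                    # measurement noise (pixels^2)

            def step(self, z):
                # Predict
                self.x = self.F @ self.x
                self.P = self.F @ self.P @ self.F.T + self.Q
                # Update with the new pixel measurement z
                y = z - self.H @ self.x
                S = self.H @ self.P @ self.H.T + self.R
                K = self.P @ self.H.T @ np.linalg.inv(S)
                self.x = self.x + K @ y
                self.P = (np.eye(2) - K @ self.H) @ self.P
                return self.x[0]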

    Haptic technology for micro-robotic cell injection training systems — a review

    Currently, the micro-robotic cell injection procedure is performed manually by expert human bio-operators, and becoming proficient at the task requires lengthy and expensive dedicated training; effective specialized training systems for this procedure could therefore prove highly beneficial. This paper presents a comprehensive review of haptic technology relevant to cell injection training and discusses the feasibility of developing such training systems, providing researchers with an inclusive resource for applying, extending or advancing the presented approaches. A brief explanation of cell injection and the challenges associated with the procedure is given first. Important skills that the bio-operator must master to achieve successful injection, such as accuracy, trajectory, speed and applied force, are then discussed, followed by an overview of the various types of haptic feedback, devices and approaches, and by a discussion of approaches to cell modeling. The application of haptics to skills training across various fields and the evaluation of haptically-enabled virtual training systems are then reviewed. Finally, given the findings of the review, the paper concludes that a haptically-enabled virtual cell injection training system is feasible, and recommendations are made for developers of such systems.
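
    The review treats cell modeling and haptic force rendering at a conceptual level only. As a purely illustrative sketch, one common simplification in cell-injection simulators is to render the membrane reaction force as a nonlinear function of indentation depth up to a puncture threshold, after which the force drops sharply; the Python function below is a hypothetical example of such a piecewise model, with parameter values chosen for illustration rather than taken from the paper.

        def injection_force_uN(depth_um, k=0.8, exponent=1.5,
                               puncture_depth_um=12.0, residual_fraction=0.15):
            """Reaction force (micronewtons) along the injection axis for a given
            indentation depth (micrometres) of the cell membrane."""
            if depth_um <= 0.0:
                return 0.0
            if depth_um < puncture_depth_um:
                # Pre-puncture: the membrane stiffens as it deforms.
                return k * depth_um ** exponent
            # Post-puncture: only a small residual (friction/cytoplasm) force remains.
            return residual_fraction * k * puncture_depth_um ** exponent

        # In a training simulator this force would be scaled and streamed to the
        # haptic device at roughly 1 kHz, the usual rate for stable haptic rendering.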

    Design og styring av smarte robotsystemer for applikasjoner innen biovitenskap: biologisk prøvetaking og jordbærhøsting (Design and control of smart robotic systems for life-science applications: biological sampling and strawberry harvesting)

    This thesis aims to contribute knowledge supporting full automation in life-science applications, covering the design, development, control and integration of robotic systems for sample preparation and strawberry harvesting, and is divided into two parts. Part I describes the development of robotic systems for the preparation of fungal samples for Fourier transform infrared (FTIR) spectroscopy. The first step in this part developed a fully automated robot for homogenization of fungal samples using ultrasonication. The platform was built from a modified inexpensive 3D printer equipped with a camera to distinguish sample wells from blank wells. Machine vision was also used to quantify the fungal homogenization process through model fitting, which suggested that the relationship between homogeneity level and ultrasonication time is well described by exponential decay equations. Moreover, a feedback control strategy was proposed that uses the standard deviation of local homogeneity values to determine when to terminate ultrasonication. The second step extended the first into a fully automated robot covering the whole preparation process of fungal samples for FTIR spectroscopy by adding a newly designed centrifuge and liquid-handling module for sample washing, concentration and spotting. The new system uses machine vision with deep learning to identify the labware configuration, freeing users from entering labware information manually.
    Part II of the thesis deals with robotic strawberry harvesting and can be divided into three stages. i) The first stage designed a novel cable-driven gripper with sensing capabilities, which has a high tolerance to positional errors and reduces picking time thanks to an integrated storage container. The gripper's fingers form a closed space that opens to capture a fruit and closes to push the stem toward the cutting area. Equipped with internal sensors, the gripper can drive the robotic arm to correct positional errors introduced by the vision system, improving robustness. The gripper and a detection method based on color thresholding were integrated into a complete strawberry-harvesting system. ii) The second stage improved on the first, focusing on the challenges of unstructured environments by introducing a light-adaptive color-thresholding method for vision and a novel obstacle-separation algorithm for manipulation. At this stage, the fully integrated dual-manipulator strawberry-harvesting system was capable of picking strawberries continuously in polytunnels. The main scientific contribution of this stage is the obstacle-separation path-planning algorithm, which differs fundamentally from traditional path planning, where obstacles are avoided: the algorithm instead uses the gripper to push surrounding obstacles aside from an entry point, clearing the way for it to swallow the target strawberry. Improvements were also made to the gripper, the arm and the control. iii) The third stage improved the obstacle-separation method by introducing a zig-zag push in both the horizontal and upward directions and a novel dragging operation to separate upper obstacles from the target. The zig-zag push helps the gripper capture the target because the shaking motion it generates breaks the static contact forces between the target and the obstacles. The dragging operation addresses the issue of mis-capturing obstacles located above the target: the gripper drags the target to a place with fewer obstacles and then pushes back to move those obstacles aside for further detachment. The separation paths are determined by the number and distribution of obstacles, based on a downsampled point cloud of the region of interest.
    Norwegian University of Life Sciences
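
    The thesis summarizes the homogenization-monitoring step only qualitatively. A minimal Python sketch of the two ideas it mentions, fitting an exponential decay of a vision-based homogeneity measure against ultrasonication time and a stop rule based on the standard deviation of local homogeneity values, might look as follows; the function and parameter names are assumptions, not the author's.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, h_inf, a, tau):
            # Exponential decay of a homogeneity measure towards its asymptote h_inf.
            return h_inf + a * np.exp(-t / tau)

        def fit_homogenization_curve(times_s, homogeneity):
            """Fit the decay model to per-frame homogeneity measurements."""
            p0 = (homogeneity[-1], homogeneity[0] - homogeneity[-1],
                  max(times_s[-1] / 3.0, 1e-6))
            params, _ = curve_fit(decay, times_s, homogeneity, p0=p0, maxfev=10000)
            return params  # (h_inf, a, tau)

        def should_stop(local_homogeneity, std_threshold=0.02):
            """Hypothetical termination rule: stop sonicating once the spread of
            local homogeneity values across the well falls below a threshold."""
            return float(np.std(local_homogeneity)) < std_threshold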

    Virtual reality training for micro-robotic cell injection

    This research was carried out to fill a gap in existing knowledge on approaches to supplementing training for the micro-robotic cell injection procedure by utilising virtual reality and haptic technologies.