
    Study of micromanipulation using stereoscopic microscope

    In this paper, we describe a visual feedback system using a stereoscopic microscope that controls a micromanipulator so that a needle pierces a target to a desired depth. First, to achieve manipulation at a realistic rate, we propose a strategy for moving the needle head. Second, we develop an algorithm for predicting the tip position of the needle head inside the target. Before the needle pierces the target, the shape of the needle head is stored as a reference pattern. After the needle has pierced the target, the shape of the needle head inside the target is predicted using this reference pattern, and the tip position of the needle head can be detected. Experimental results indicate that the proposed system may be useful in micromanipulation tasks such as microinjection into seeds.
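
    A minimal sketch of the general idea, not the authors' implementation: a reference pattern of the needle head captured before piercing is matched against the current microscope frame by normalized cross-correlation, and the tip is read off at an assumed offset inside the matched pattern. File names and the tip offset are hypothetical.

```python
# Sketch: locate the stored needle-head reference pattern in the current frame
# via normalized cross-correlation (OpenCV). Paths and offsets are assumptions.
import cv2

reference = cv2.imread("needle_head_reference.png", cv2.IMREAD_GRAYSCALE)  # pattern saved before piercing
frame = cv2.imread("frame_after_piercing.png", cv2.IMREAD_GRAYSCALE)      # current microscope frame

# Match the reference pattern against the current frame.
result = cv2.matchTemplate(frame, reference, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

# Take the best match as the predicted needle-head location; the tip is
# assumed here to sit at a fixed offset within the reference pattern.
tip_offset = (reference.shape[1] // 2, reference.shape[0] - 1)  # hypothetical offset
tip_x = max_loc[0] + tip_offset[0]
tip_y = max_loc[1] + tip_offset[1]
print(f"predicted tip position: ({tip_x}, {tip_y}), score {max_val:.2f}")
```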

    Factors of Micromanipulation Accuracy and Learning

    Micromanipulation refers to manipulation under a microscope in order to perform delicate procedures. It is difficult for humans to manipulate objects accurately under a microscope due to tremor and imperfect perception, which limits performance. This project seeks to understand the factors affecting accuracy in micromanipulation and to propose learning strategies for improving accuracy. Psychomotor experiments were conducted using computer-controlled setups to determine how various feedback modalities and learning methods influence micromanipulation performance. In a first experiment, the static and motion accuracy of surgeons, medical students and non-medical students was compared under different magnification levels and grip force settings. A second experiment investigated whether the non-dominant hand, placed close to the target, can contribute to accurate pointing by the dominant hand. A third experiment tested a training strategy for micromanipulation using unstable dynamics to magnify motion error, a strategy previously shown to decrease deviation in large arm movements. Two virtual reality (VR) modules were then developed to train needle grasping and needle insertion, two primitive tasks in a microsurgical suturing procedure. The modules provided the trainee with a stereoscopic visual display and information on their grip, tool position and angles. Using the VR modules, a study examining the effects of visual cues for training tool orientation was conducted. Results from these studies suggest that it is possible to learn and improve accuracy in micromanipulation using appropriate sensorimotor feedback and training.

    Micromanipulation with Stereoscopic Imaging

    The neuronal organization of the hippocampus has been studied extensively. However, the synaptic connections between neurons of the hilus and neurons of the granule cell layer are still debated. In the present study, we used an automated micromanipulation technique combined with visual image processing to analyze the synaptic connections between hilar mossy cells or mossy fibers and granule cells or basket cells in the granule cell layer. The stereoscopic microscope image of the dentate gyrus in a coronal slice of adult rat brain was converted into a binary image. The area of the dentate gyrus was detected by genetic algorithms and matched, using translation and rotation matching, against a template image made from a microscopic photograph of a cresyl violet-stained coronal section of the rat brain. A glass microelectrode was then inserted into the apex of the dentate gyrus by the micromanipulation system, and the lipophilic fluorescent tracer DiI or DiD was injected into the hilus with a nanoinjector.
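
    An illustrative sketch of the matching step, under assumptions rather than the study's pipeline: the microscope image is binarized and a stained-section template is searched over translation and rotation for the best overlap. A brute-force rotation sweep stands in for the genetic-algorithm search described in the abstract; file names and search ranges are hypothetical.

```python
# Sketch: binarize the slice image, then search translation (template matching)
# and rotation (coarse sweep) of the stained-section template for the best fit.
import cv2
import numpy as np

image = cv2.imread("dentate_gyrus_slice.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("cresyl_violet_template.png", cv2.IMREAD_GRAYSCALE)

_, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

best = (-1.0, None, 0.0)  # (score, location, angle)
for angle in np.arange(-15.0, 15.1, 1.0):  # coarse rotation search (assumed range)
    center = (template.shape[1] / 2, template.shape[0] / 2)
    rot = cv2.getRotationMatrix2D(center, angle, 1.0)
    rotated = cv2.warpAffine(template, rot, (template.shape[1], template.shape[0]))
    result = cv2.matchTemplate(binary, rotated, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val > best[0]:
        best = (max_val, max_loc, angle)

print(f"best match score {best[0]:.2f} at {best[1]}, rotation {best[2]:.1f} deg")
```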

    Dynamic stereo microscopy for studying particle sedimentation

    We demonstrate a new method for measuring the sedimentation of a single colloidal bead using a combination of optical tweezers and a stereo microscope based on a spatial light modulator. We use the optical tweezers to raise a micron-sized silica bead to a fixed height and then release it to observe its 3D motion as it sediments under gravity. This experimental procedure provides two independent measurements of the bead diameter and a measure of Faxén's correction, i.e. how the motion changes due to the presence of the boundary.
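
    A back-of-the-envelope sketch, not the paper's analysis: the terminal settling velocity of a small sphere follows from balancing buoyant weight against Stokes drag, and the leading-order Faxén wall correction increases the drag as the bead approaches the boundary. All numerical values below are illustrative assumptions.

```python
# Sketch: wall-corrected Stokes settling velocity of a micron-sized silica bead.
import numpy as np

a = 1.0e-6            # bead radius (m), "micron-sized" (assumed)
rho_silica = 2200.0   # kg/m^3
rho_water = 1000.0    # kg/m^3
eta = 1.0e-3          # water viscosity (Pa s)
g = 9.81              # m/s^2

def settling_velocity(h):
    """Terminal velocity at height h above the wall (leading Faxén term only)."""
    drag = 6 * np.pi * eta * a / (1 - (9 * a) / (16 * h))  # wall-corrected Stokes drag
    weight = (4 / 3) * np.pi * a**3 * (rho_silica - rho_water) * g
    return weight / drag

for h in (2e-6, 5e-6, 20e-6):
    print(f"h = {h*1e6:4.0f} um  ->  v = {settling_velocity(h)*1e6:.2f} um/s")
```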

    Nanopipette with a lipid nanotube as nanochannel

    Proceedings of the 7th IEEE International Conference on Nanotechnology, August 2 - 5, 2007, Hong Kong

    Multi-particle three-dimensional coordinate estimation in real-time optical manipulation

    We have previously shown how stereoscopic images can be obtained in our three-dimensional optical micromanipulation system [J. S. Dam et al., Opt. Express 16, 7244 (2008)]. Here, we present an extension and application of this principle to automatically gather the three-dimensional coordinates of all trapped particles with a large tracking range and high reliability, without requiring user calibration. By deconvolving the red, green, and blue colour planes to correct for bleeding between them, we show that the system can be extended to also use green illumination in addition to blue and red. Applying the green colour as on-axis illumination yields redundant information for enhanced error correction, which is used to verify the gathered data, resulting in reliable coordinates as well as visually attractive images.
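
    An illustrative sketch of one way to correct colour-plane bleeding, not the authors' implementation: if a 3x3 channel-mixing matrix describing how each illumination colour leaks into the recorded R, G and B planes can be measured, inverting it recovers unmixed planes, one per illumination direction. The matrix values below are hypothetical.

```python
# Sketch: undo bleeding between colour planes with an inverted mixing matrix.
import numpy as np

# Hypothetical mixing matrix: rows = recorded R, G, B; columns = true R, G, B.
# Off-diagonal terms model how much each true channel leaks into the others.
M = np.array([
    [0.90, 0.08, 0.02],
    [0.05, 0.88, 0.07],
    [0.03, 0.06, 0.91],
])
M_inv = np.linalg.inv(M)

def unmix(frame_rgb):
    """frame_rgb: HxWx3 float array of recorded planes; returns corrected planes."""
    h, w, _ = frame_rgb.shape
    pixels = frame_rgb.reshape(-1, 3).T        # 3 x N pixel matrix
    corrected = M_inv @ pixels                 # undo the channel mixing
    return np.clip(corrected.T.reshape(h, w, 3), 0.0, 1.0)
```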

    Design and implementation of a vision system for microassembly workstation

    The rapid development of micro/nano technologies and the evolution of biotechnology have led to research on assembling micro components into complex microsystems and on manipulating cells, genes and similar biological components. To develop advanced inspection and handling systems and methods for manipulating and assembling micro products and micro components, robust micromanipulation and microassembly strategies can be implemented on a high-speed, repeatable, reliable, reconfigurable, robust and open-architecture microassembly workstation. Due to the high accuracy requirements and the specific mechanical and physical laws that govern the microscale world, micromanipulation and microassembly tasks require robust control strategies based on real-time sensory feedback. Vision, as a passive sensor combined with a stereoscopic optical microscope, can yield high-resolution views of micro objects and micro scenes. Visual data contains useful information for micromanipulation and microassembly tasks and can be processed using various image processing and computer vision algorithms. In this thesis, initial work on the design and implementation of a vision system for a microassembly workstation is introduced, considering both software and hardware issues. Emphasis is put on the implementation of computer vision algorithms and vision-based control techniques that build a strong basis for the vision part of the microassembly workstation. The main goal of designing such a vision system is to perform automated micromanipulation and microassembly tasks for a variety of applications. Experiments with teleoperated and semi-automated tasks, in which micro particles are manipulated manually or automatically using a microgripper or probe as the manipulation tool, show promising results.
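
    A minimal sketch of the kind of visual feedback such a system relies on, under assumptions rather than the thesis code: micro-particle centroids are detected in a microscope frame so their pixel coordinates can feed a vision-based controller. The threshold scheme and size limits are assumed.

```python
# Sketch: detect candidate micro-particle centroids in a microscope frame.
import cv2

frame = cv2.imread("microscene.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
centroids = []
for c in contours:
    area = cv2.contourArea(c)
    if 50 < area < 5000:                       # reject noise and clutter (assumed limits)
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))

print(f"{len(centroids)} candidate particles:", centroids)
```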

    Analysis of CAD Model-based Visual Tracking for Microassembly using a New Block Set for MATLAB/Simulink.

    Microassembly is an innovative alternative to the microfabrication of MOEMS, which is quite complex. It usually implies the use of microrobots controlled by an operator. The reliability of this approach has already been confirmed for micro-optical technologies. However, the characterization of assemblies has shown that the operator is the main source of inaccuracy in teleoperated microassembly. Therefore, there is great interest in automating the microassembly process. One of the constraints of automation at the microscale is the lack of high-precision sensors capable of providing full information about the object position. Thus, visual feedback represents a very promising approach for automating the microassembly process. The purpose of this paper is to characterize techniques for estimating object position from visual data, i.e. the visual tracking techniques from the ViSP library. These algorithms recover the 3D object pose using a single view of the scene and the CAD model of the object. The performance of the three main types of model-based tracker is analyzed and quantified: the edge-based, texture-based and hybrid trackers. The problems of visual tracking at the microscale are discussed. The control of the micromanipulation station used in the framework of our project is performed using a new Simulink block set. Experimental results are shown and demonstrate the possibility of obtaining repeatability below 1 micrometer.
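
    A rough sketch of the underlying idea, not the ViSP trackers themselves: given 3D model points (e.g. from the CAD model) and their detected 2D projections in a single view, the object pose can be recovered by solving a perspective-n-point problem, here with OpenCV's solvePnP. The point sets and camera intrinsics below are hypothetical placeholders.

```python
# Sketch: single-view pose estimation from CAD model points and image features.
import cv2
import numpy as np

object_points = np.array([                 # 3D points from the CAD model (mm), assumed
    [0, 0, 0], [10, 0, 0], [10, 10, 0], [0, 10, 0],
    [0, 0, 5], [10, 0, 5],
], dtype=np.float64)
image_points = np.array([                  # matching detected 2D features (px), assumed
    [320, 240], [420, 238], [422, 338], [318, 342],
    [325, 200], [425, 198],
], dtype=np.float64)
K = np.array([[2000, 0, 320], [0, 2000, 240], [0, 0, 1]], dtype=np.float64)  # intrinsics

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)             # rotation matrix of the object pose
    print("translation (mm):", tvec.ravel())
```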

    Image-Guided Robot-Assisted Techniques with Applications in Minimally Invasive Therapy and Cell Biology

    There are several situations where tasks can be performed better robotically than manually, among them situations (a) where high accuracy and robustness are required, (b) where difficult or hazardous working conditions exist, and (c) where very large or very small motions or forces are involved. Recent advances in technology have resulted in smaller robots with higher accuracy and reliability, and as a result robotics is finding more and more applications in biomedical engineering. Medical robotics and cell micromanipulation are two of these applications, involving interaction with delicate living organs at very different scales. The availability of a wide range of imaging modalities, from ultrasound and X-ray fluoroscopy to high-magnification optical microscopes, makes it possible to use imaging as a powerful means to guide and control robot manipulators. This thesis comprises three parts focusing on three applications of image-guided robotics in biomedical engineering. Vascular catheterization: a robotic system was developed to insert a catheter through the vasculature and guide it to a desired point via visual servoing. The system provides shared control with the operator to perform a task semi-automatically or through master-slave control, controls the catheter tip with high accuracy, reduces X-ray exposure to the clinicians and provides a more ergonomic situation for the cardiologists. Cardiac catheterization: a master-slave robotic system was developed for accurate control of a steerable catheter to touch and ablate faulty regions on the inner walls of a beating heart in order to treat arrhythmia. The system facilitates touching and making contact with a target point in a beating heart chamber through master-slave control with coordinated visual feedback. Live neuron micromanipulation: a microscope image-guided robotic system was developed to provide shared control over multiple micromanipulators to touch cell membranes in order to perform patch-clamp electrophysiology. Image-guided robot-assisted techniques with master-slave control were implemented in each case to provide shared control between a human operator and a robot. The results show increased accuracy and reduced operation time in all three cases.
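
    A toy sketch of image-based visual servoing in the spirit of the catheter guidance described above, not the thesis implementation: the tracked tip position in the image is driven toward a desired pixel target with a simple proportional law. All names, gains and coordinates are illustrative assumptions.

```python
# Sketch: proportional image-space servoing of a tracked tip toward a target.
import numpy as np

def servo_step(tip_px, target_px, gain=0.5):
    """Return a 2D image-space velocity command from the pixel error."""
    error = np.asarray(target_px, dtype=float) - np.asarray(tip_px, dtype=float)
    return gain * error            # would be mapped to actuator commands in practice

tip, target = np.array([250.0, 180.0]), np.array([300.0, 200.0])
for _ in range(5):
    tip = tip + servo_step(tip, target)    # assume the plant follows the command exactly
    print(tip)
```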

    An optically actuated surface scanning probe

    We demonstrate the use of an extended, optically trapped probe that is capable of imaging surface topography with nanometre precision whilst applying ultra-low, femtonewton-scale forces. This degree of precision and sensitivity is acquired through three distinct strategies. First, the probe itself is shaped in such a way as to soften the trap along the sensing axis and stiffen it in the transverse directions. Next, these characteristics are enhanced by selectively position-clamping independent motions of the probe. Finally, force clamping is used to refine the surface contact response. Detailed analyses are presented for each of these mechanisms. To test our sensor, we scan it laterally over a calibration sample consisting of a series of graduated steps, and demonstrate a height resolution of ∼ 11 nm. Using the equipartition theorem, we estimate that an average force of only ∼ 140 fN is exerted on the sample during the scan, making this technique ideal for the investigation of delicate biological samples.
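
    A minimal sketch of the equipartition estimate mentioned above, with illustrative numbers rather than the paper's data: the trap stiffness along the sensing axis follows from the variance of the probe's thermal position fluctuations, and the contact force then follows from an observed deflection.

```python
# Sketch: trap stiffness from equipartition, then force from a deflection.
import numpy as np

kB = 1.380649e-23          # Boltzmann constant (J/K)
T = 295.0                  # temperature (K), assumed room temperature

positions = np.random.normal(0.0, 20e-9, 10000)   # stand-in tracking data (m)
k_trap = kB * T / np.var(positions)               # equipartition: k = kB*T / <x^2>

deflection = 3e-9                                  # hypothetical mean deflection on contact (m)
force = k_trap * deflection
print(f"stiffness ~ {k_trap*1e6:.2f} pN/um, force ~ {force*1e15:.0f} fN")
```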