7 research outputs found

    Object modeling using a ToF camera under an uncertainty reduction approach

    Paper presented at ICRA 2010, held in Anchorage (Alaska), May 3-7, 2010.

    Time-of-Flight (ToF) cameras deliver 3D images at 25 fps, offering great potential for developing fast object modeling algorithms. Surprisingly, this potential has not been extensively exploited up to now. One reason is that, since the acquired depth images are noisy, most of the available registration algorithms are hardly applicable. A further difficulty is that the transformations between views are in general not accurately known, a circumstance that multi-view object modeling algorithms do not handle properly under noisy conditions. In this work, we take both uncertainty sources (in images and in camera poses) into account to generate spatially consistent 3D object models, fusing multiple views with a probabilistic approach. We propose a method to compute the covariance of the registration process, and apply an iterative state estimation method to build object models under noisy conditions.

    This work was supported by the projects 'Perception, action & cognition through learning of object-action complexes' (4915), 'CONSOLIDER-INGENIO 2010 Multimodal interaction in pattern recognition and computer vision' (V-00069), and 'Percepción y acción ante incertidumbre' (4803). It has also been partially supported by the Spanish Ministry of Science and Innovation under project DPI2008-06022, the MIPRCV Consolider Ingenio 2010 project, and the EU PACO PLUS project FP6-2004-IST-4-27657. S. Foix and G. Alenyà are supported by PhD and postdoctoral fellowships, respectively, from CSIC's JAE program.

    Peer Reviewed
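    The probabilistic fusion described above reduces, for a pair of views, to a covariance-weighted (inverse-covariance) update, the standard linear-Gaussian step behind iterative state estimation. A minimal sketch of fusing two noisy measurements of the same 3D point (function name and toy values are illustrative, not from the paper):

```python
import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    """Fuse two noisy estimates of the same 3D point by
    inverse-covariance weighting (a single Kalman-style update)."""
    K = P1 @ np.linalg.inv(P1 + P2)    # gain: trust the less noisy source more
    x = x1 + K @ (x2 - x1)             # fused mean
    P = (np.eye(len(x1)) - K) @ P1     # fused covariance, smaller than either input
    return x, P

# Two views of the same point, each view more certain along different axes
x1, P1 = np.array([0.10, 0.20, 0.95]), np.diag([1e-4, 1e-4, 4e-4])
x2, P2 = np.array([0.11, 0.19, 0.97]), np.diag([4e-4, 4e-4, 1e-4])
x, P = fuse_estimates(x1, P1, x2, P2)
```

    The fused covariance is never larger than either input's, which is the "uncertainty reduction" the title refers to.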

    Exploitation of time-of-flight (ToF) cameras

    This technical report reviews the state of the art in the field of ToF cameras, their advantages, their limitations, and their present-day applications, sometimes in combination with other sensors. Even though ToF cameras provide neither higher resolution nor larger ambiguity-free range compared to other range map estimation systems, advantages such as registered depth and intensity data at a high frame rate, compact design, low weight and reduced power consumption have motivated their use in numerous areas of research. In robotics, these areas range from mobile robot navigation and map building to vision-based human motion capture and gesture recognition, showing a particularly great potential in object modeling and recognition.

    Preprint
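    The ambiguity-free range mentioned above follows directly from the continuous-wave ToF measurement principle: depth is recovered from the phase shift of the modulated light, so distances wrap around every half modulation wavelength. A short sketch of the two standard formulas (function names are illustrative):

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def tof_depth(phase_rad, f_mod_hz):
    """Depth from the measured phase shift of a continuous-wave
    ToF camera: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def ambiguity_free_range(f_mod_hz):
    """Maximum unambiguous depth: c / (2 * f_mod)."""
    return C / (2 * f_mod_hz)

# At a typical 20 MHz modulation frequency the unambiguous
# range is c / (2 * 20e6) ~= 7.49 m; beyond that, depths wrap.
```

    Lowering the modulation frequency extends the unambiguous range at the cost of depth resolution, which is why the modulation frequency is a calibration parameter worth analyzing.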

    ToF cameras for active vision in robotics

    ToF cameras are now a mature technology that is being widely adopted to provide sensory input to robotic applications. Depending on the nature of the objects to be perceived and the viewing distance, we distinguish two groups of applications: those requiring capture of the whole scene and those centered on an object. It will be demonstrated that it is in this last group of applications, in which the robot has to locate and possibly manipulate an object, where the distinctive characteristics of ToF cameras can be better exploited. After presenting the physical sensor features and the calibration requirements of such cameras, we review some representative works, highlighting for each one which of the distinctive ToF characteristics have been most essential. Even if at low resolution, the acquisition of 3D images at frame rate is one of the most important features, as it enables quick background/foreground segmentation. A common use is in combination with classical color cameras. We present three developed applications, using a mobile robot and a robotic arm, to exemplify with real images some of the stated advantages.

    This work was supported by the EU project GARNICS FP7-247947, by the Spanish Ministry of Science and Innovation under project PAU+ DPI2011-27510, and by the Catalan Research Commission through SGR-00155.

    Peer Reviewed
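    The quick background/foreground segmentation that frame-rate depth enables can be as simple as thresholding range values while discarding invalid readings. A toy sketch under that assumption (not the applications' actual code):

```python
import numpy as np

def foreground_mask(depth, max_depth):
    """Mark as foreground the pixels closer than max_depth,
    ignoring invalid (zero) depth readings."""
    valid = depth > 0
    return valid & (depth < max_depth)

# 4x4 toy depth image (metres): an object at ~0.6 m in front of a 2 m wall
depth = np.full((4, 4), 2.0)
depth[1:3, 1:3] = 0.6
mask = foreground_mask(depth, max_depth=1.0)  # True only on the 2x2 object
```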

    ToF cameras for eye-in-hand robotics

    This work was supported by the Spanish Ministry of Science and Innovation under project PAU+ DPI2011-27510, by the EU Project IntellAct FP7-ICT2009-6-269959 and by the Catalan Research Commission through SGR-00155.

    Peer Reviewed

    Robot guidance using machine vision techniques in industrial environments: A comparative review

    In the factory of the future, most operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, and to complement the information provided by other sensors to improve their positioning accuracy. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in industry, and now for robot guidance. Choosing which type of vision system to use depends heavily on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future work.

    ToF Camera calibration: an automatic setting of its integration time and an experimental analysis of its modulation frequency

    Depth perception is essential in many manipulation, visual servoing and robot navigation tasks. Time-of-Flight (ToF) cameras generate range images that provide depth measurements in real time. However, the distance these cameras compute is strongly dependent on the integration time set for the sensor and on the modulation frequency used by their integrated lighting system. In this paper, a methodology for automatic setting of the integration time and an experimental analysis of ToF camera behavior when its modulation frequency is adjusted are presented.
    This method has been successfully tested on visual servoing algorithms with an 'eye-in-hand' architecture in which the sensory system consists of a ToF camera; in addition, the methodology can be applied to other workspaces and scenarios.

    This work was co-funded by the regional government of the Generalitat Valenciana, the University of Alicante and CICYT through projects GV2012/102, GRE10-16 and DPI2012-32390.

    Gil, P.; Kisler, T.; García, G.; Jara, C.; Corrales, J. (2013). Calibración de cámaras de tiempo de vuelo: Ajuste adaptativo del tiempo de integración y análisis de la frecuencia de modulación. Revista Iberoamericana de Automática e Informática industrial. 10(4):453-464. https://doi.org/10.1016/j.riai.2013.08.002
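    The adaptive integration-time idea can be pictured as a feedback loop that drives the mean signal amplitude toward a target value: too little amplitude means the exposure is too short, saturation means it is too long. The proportional rule below is a generic sketch under that assumption, not the update law of the paper:

```python
def adjust_integration_time(t_us, mean_amplitude, target=1000.0,
                            t_min=100.0, t_max=20000.0, gain=0.5):
    """One step of a proportional adjustment of the integration
    time (microseconds) toward a target mean signal amplitude.
    All parameter values here are illustrative."""
    if mean_amplitude <= 0:
        return t_max  # no return signal: expose as long as allowed
    error = (target - mean_amplitude) / target
    t_new = t_us * (1.0 + gain * error)
    return min(max(t_new, t_min), t_max)

# Amplitude below target -> lengthen exposure:
# adjust_integration_time(2000.0, 500.0) -> 2500.0 us
```

    Clamping to [t_min, t_max] keeps the loop stable when the scene suddenly becomes very bright or very dark between frames.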