28 research outputs found

    Robot guidance using machine vision techniques in industrial environments: A comparative review

    In the factory of the future, most operations will be performed by autonomous robots that need visual feedback to move around the workspace while avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, and to complement the information provided by other sensors in order to improve their positioning accuracy, among other tasks. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in industry and are now being applied to robot guidance. Choosing which type of vision system to use depends strongly on the parts that need to be located or measured. Thus, this paper presents a comparative review of different machine vision techniques for robot guidance. The work analyzes the accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future work.
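    As a point of reference for the ranging techniques listed above, the sketch below shows the rectified stereo-vision depth model (Z = f·B / d). It is illustrative only and not drawn from any of the reviewed systems; the focal length and baseline values are assumptions.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo model Z = f * B / d for a rectified camera pair.

    disparity_px    : per-pixel disparity map (pixels)
    focal_length_px : focal length in pixels
    baseline_m      : distance between the two camera centres in metres
    Returns a depth map in metres; zero or negative disparities map to inf.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_length_px * baseline_m / d[valid]
    return depth

# Illustrative values: a point with 20 px of disparity, f = 800 px, B = 0.12 m
# sits at roughly 4.8 m from the camera rig.
print(depth_from_disparity(np.array([20.0]), focal_length_px=800.0, baseline_m=0.12))
```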

    Automatic System to Detect Both Distraction and Drowsiness in Drivers Using Robust Visual Features

    According to the most recent studies published by the World Health Organization (WHO) in 2013, it is estimated that 1.25 million people die as a result of traffic crashes. Many of these crashes are caused by what is known as inattention, whose main contributing factors are distraction and drowsiness. Overall, inattention is estimated to cause between 25% and 75% of crashes and near-crashes. This is why the field is thoroughly studied by the research community: solutions to combat distraction and drowsiness in particular, and inattention in general, can be classified into three main categories, and computer vision has clearly become an effective, non-obtrusive tool for detecting both events. The aim of this paper is to propose, build and validate an architecture, designed to operate in vehicular environments, based on the analysis of visual features using computer vision and machine learning techniques to detect both distraction and drowsiness in drivers. Firstly, the modules and all their components were tested independently using several reference datasets. More specifically, the presence or absence of the driver is detected with an accuracy of 100%, 90.56% and 88.96% by using, respectively, a marker positioned on the headrest, the LBP operator and the CS-LBP operator. Regarding eye-closeness validation on the CEW dataset, accuracies of 93.39% and 91.84% are obtained using a new approach based on LBP (LBP_RO) and one based on CS-LBP (CS-LBP_RO). After several experiments to find the most suitable camera placement, the camera was positioned on the dashboard, increasing the accuracy of face-region detection from 86.88% to 96.46%. The tests in real-world settings were carried out over several days, covering very different daytime lighting conditions and involving 16 drivers, who performed several activities imitating different signs of sleepiness and distraction. Overall, considering all activities and all drivers, an average detection rate of 93.11% is obtained.

    The activities leading to this work were partially supported by the Foundation for the Promotion of Applied Scientific Research and Technology in Asturias (FICYT) and by the company SINERCO SL, through the project "Creación de algoritmos de visión artificial" (reference IE09-511). This work is part of the doctoral thesis of Alberto Fernández Villán.

    Published as: Fernández Villán, A.; Usamentiaga Fernández, R.; Casado Tejedor, R. (2017). Sistema Automático Para la Detección de Distracción y Somnolencia en Conductores por Medio de Características Visuales Robustas. Revista Iberoamericana de Automática e Informática Industrial, 14(3), 307-328. https://doi.org/10.1016/j.riai.2017.05.001
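    The eye-state module described above relies on LBP-style texture descriptors fed to a classifier. The sketch below is not the paper's LBP_RO or CS-LBP_RO variant; it is a minimal, generic illustration of the same idea, assuming grayscale eye patches, a basic 8-neighbour LBP histogram, and an RBF-kernel SVM (scikit-learn's SVC standing in for whatever classifier is used). All names, parameters and data are illustrative.

```python
import numpy as np
from sklearn.svm import SVC  # generic SVM classifier, not the paper's exact setup

def lbp_histogram(gray):
    """Basic 8-neighbour, radius-1 LBP over a grayscale eye patch,
    returned as a normalised 256-bin histogram (not the paper's LBP_RO variant)."""
    g = np.asarray(gray, dtype=np.int16)
    c = g[1:-1, 1:-1]                       # centre pixels (border excluded)
    # Neighbour offsets, clockwise from the top-left corner
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= ((neighbour >= c).astype(np.int16) << bit)
    hist, _ = np.histogram(code, bins=256, range=(0, 256))
    return hist / max(hist.sum(), 1)

def train_eye_state_classifier(patches, labels):
    """Hypothetical training step: eye patches labelled open (1) / closed (0),
    e.g. cropped from a dataset such as CEW mentioned in the abstract."""
    features = np.stack([lbp_histogram(p) for p in patches])
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(features, labels)
    return clf
```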

    Evaluation of Dust Deposition on Parabolic Trough Collectors in the Visible and Infrared Spectrum

    Solar energy is mostly harnessed in arid areas, where a high concentration of atmospheric dust represents a major environmental degradation factor. Gravitationally settled particles and other solid particles on the surface of photovoltaic panels or thermal collectors greatly reduce the absorbed solar energy. Therefore, frequent cleaning schedules are required, consuming large quantities of water in regions where precipitation is scarce. The efficiency of this cleaning maintenance is greatly improved when methods to estimate the degree of cleanness are introduced. This work focuses on the need to better detect the degradation caused by dust deposition, considering experimental data based on different air pollutants and analyzing the resulting thermal and visible signatures under different operating environments. Experiments are performed using six different types of pollutants applied to the surface of parabolic trough collectors while varying the pollutant density. The resulting reflectivity in the visible and infrared spectrum is calculated and compared. Results indicate that the pollutants can be distinguished, although the reflectivity depends strongly on the combination of the particle size of the pollutant and the applied amount, with a greater impact from pollutants with small particles.
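    As an illustration of how visible and infrared signatures can be compared, the sketch below computes a relative reflectivity (soiled-to-clean intensity ratio) from an intensity image with user-supplied region masks. This is a simplified stand-in, not the measurement procedure used in the paper; all names and values are assumptions.

```python
import numpy as np

def relative_reflectivity(image, dusty_mask, clean_mask):
    """Ratio of mean intensity in a dust-covered region to a clean reference
    region of the same collector surface; 1.0 means no measurable loss.
    `image` can be a visible-band or thermal (infrared) intensity map."""
    img = np.asarray(image, dtype=float)
    return img[dusty_mask].mean() / img[clean_mask].mean()

# Illustrative use with a synthetic 100x100 frame: left half "clean", right half
# attenuated by 20% to mimic dust deposition.
frame = np.ones((100, 100))
frame[:, 50:] *= 0.8
left = np.zeros_like(frame, dtype=bool)
left[:, :50] = True
print(relative_reflectivity(frame, dusty_mask=~left, clean_mask=left))  # ~0.8
```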