4 research outputs found

    Sensors and Technologies in Spain: State-of-the-Art

    The aim of this special issue was to provide a comprehensive view of state-of-the-art sensor technology in Spain. Real-world problems drive the emergence and development of new sensor technologies and, conversely, new sensors enable solutions to existing real-world problems. [...]

    On the Use of a Low-Cost Thermal Sensor to Improve Kinect People Detection in a Mobile Robot

    Detecting people is a key capability for robots that operate in populated environments. In this paper, we adopt a hierarchical approach that combines classifiers created using supervised learning in order to identify whether a person is in the view-scope of the robot or not. Our approach makes use of vision, depth and thermal sensors mounted on top of a mobile platform. The sensor suite combines the rich data source offered by a Kinect sensor, which provides vision and depth at low cost, with a thermopile array sensor. Experiments carried out with a mobile platform on a manufacturing shop floor and in a science museum show that combining cues drastically reduces the false positive rate obtained with any single cue. Our algorithm outperforms other well-known approaches, such as C4 and the histogram of oriented gradients (HOG).
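    The hierarchical combination described in the abstract can be pictured as a cascade: a candidate detection survives only if successive cues agree, which is why the combined false positive rate falls below that of any single cue. The sketch below is illustrative only, not the authors' implementation; the three detector stand-ins and their thresholds are assumptions.

    ```python
    # Illustrative cascade of binary person detectors. A candidate is accepted
    # only if every stage fires; cheap cues are placed first so that expensive
    # ones run on fewer candidates, and any stage can veto a false positive.

    def cascade_detect(candidate, detectors):
        """Return True only if every detector in the ordered cascade agrees."""
        for detect in detectors:
            if not detect(candidate):
                return False  # early exit: this stage vetoes the candidate
        return True

    # Toy stand-ins for the vision, depth and thermal cues (hypothetical scores).
    detect_vision = lambda c: c.get("vision_score", 0.0) > 0.5
    detect_depth = lambda c: c.get("depth_score", 0.0) > 0.5
    detect_thermal = lambda c: c.get("thermal_score", 0.0) > 0.5

    # A warm-looking but non-person candidate: vision and depth fire, thermal vetoes.
    candidate = {"vision_score": 0.9, "depth_score": 0.7, "thermal_score": 0.2}
    print(cascade_detect(candidate, [detect_vision, detect_depth, detect_thermal]))  # → False
    ```

    The AND-style combination trades some recall for precision: a person missed by one cue is lost, but a false alarm must fool all three sensors at once.
    
    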

    Robot guidance using machine vision techniques in industrial environments: A comparative review

    In the factory of the future, most operations will be carried out by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, and to complement the information provided by other sensors to improve their positioning accuracy. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in industry and now for robot guidance. The choice of vision system depends heavily on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future work.

    A Mobile Robot System for Ambient Intelligence

    Over the last years, Ambient Intelligence (AmI) has been pointed out as an alternative to current practices in home care. AmI supports the concept of Ambient Assisted Living (AAL), which aims to allow older people to remain independent in their own homes for longer. The integration of a mobile robot into a database-centric platform for Ambient Assisted Living is described in this thesis. The robot serves as a first-aid agent to respond to emergencies, such as a fall, detected by the intelligent environment. To accomplish that, the robot must 1) be able to receive tasks from the intelligent environment; 2) execute the task; 3) report the progress and the result of the task back to the intelligent environment. The system of the robot is built on top of the Robot Operating System, while the existing intelligent environment is built on a PostgreSQL database. To receive tasks from the intelligent environment, the robot maintains an active connection with the database and subscribes to specific tasks. A task, for example, is to find a person in the environment, which includes asking if the person is doing well. To find a person, a map-based approach and face recognition are used. The robot can interact with people in the environment using text-to-speech and speech recognition. The active connection with the database enables the robot to report back about the execution of a task and to receive new tasks or abort ongoing ones. In conclusion, together with an AAL system, mobile robots can support people living alone. The system has been implemented and successfully tested at Halmstad University on a Turtlebot 2. The code is available on GitHub.
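    The task loop described above — the robot receives a task from the database, executes it, and reports the result back — can be sketched as a claim-and-update cycle against a tasks table. This is a minimal sketch under stated assumptions, not the thesis code: the table schema, column names and `find_person` task kind are hypothetical, and SQLite stands in for the PostgreSQL database only to keep the example self-contained.

    ```python
    import sqlite3

    # Sketch of a database-centric task cycle: the environment inserts a
    # pending task, the robot claims it (status -> 'running'), executes it,
    # and reports back by updating the same row (status -> 'done'/'failed').

    def setup(conn):
        conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, kind TEXT, status TEXT)")
        # The intelligent environment would insert this after detecting a fall.
        conn.execute("INSERT INTO tasks (kind, status) VALUES ('find_person', 'pending')")

    def claim_next_task(conn):
        """Fetch one pending task and mark it as running; None if queue is empty."""
        row = conn.execute(
            "SELECT id, kind FROM tasks WHERE status = 'pending' LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        conn.execute("UPDATE tasks SET status = 'running' WHERE id = ?", (row[0],))
        return row

    def report_result(conn, task_id, ok):
        """Write the outcome back so the environment can observe it."""
        conn.execute(
            "UPDATE tasks SET status = ? WHERE id = ?",
            ("done" if ok else "failed", task_id),
        )

    conn = sqlite3.connect(":memory:")
    setup(conn)
    task = claim_next_task(conn)
    if task:
        # ... here the robot would navigate and run face recognition ...
        report_result(conn, task[0], ok=True)
    print(conn.execute("SELECT status FROM tasks").fetchone()[0])  # → done
    ```

    With PostgreSQL, the polling step could instead use its notification mechanism so the robot reacts to new tasks without busy-waiting; the claim/report structure stays the same.
    
    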