17,787 research outputs found

    Platform Relative Sensor Abstractions across Mobile Robots using Computer Vision and Sensor Integration

    Uniform sensor management and abstraction across different robot platforms is a difficult task due to the sheer diversity of sensing devices. However, because these sensors can be grouped into categories that in essence provide the same information, we can capture their similarities and create abstractions. An example would be distance data measured by an assortment of range sensors, or alternatively extracted from a camera using image processing. This paper describes how, using software components, it is possible to uniformly construct high-level abstractions of sensor information across various robots in a way that supports the portability of common code that uses these abstractions (e.g. obstacle avoidance, wall following). We demonstrate our abstractions on a number of robots using different configurations of range sensors and cameras.
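
    As a rough illustration of the idea, the sketch below (Python, with invented names such as RangeSensorAbstraction and get_distances rather than the paper's actual components) shows how sonar readings and camera-derived depth could be exposed through a single distance interface that portable code, such as an obstacle-avoidance check, can consume:

```python
from abc import ABC, abstractmethod
from typing import List


class RangeSensorAbstraction(ABC):
    """Common interface: every concrete sensor reports distances in metres."""

    @abstractmethod
    def get_distances(self) -> List[float]:
        ...


class SonarRing(RangeSensorAbstraction):
    def __init__(self, raw_readings_cm: List[float]):
        self._raw = raw_readings_cm

    def get_distances(self) -> List[float]:
        # Convert platform-specific centimetre readings to metres.
        return [r / 100.0 for r in self._raw]


class CameraRangeEstimator(RangeSensorAbstraction):
    def __init__(self, disparity_px: List[float], focal_px: float, baseline_m: float):
        self._disparity = disparity_px
        self._focal = focal_px
        self._baseline = baseline_m

    def get_distances(self) -> List[float]:
        # Classic stereo relation: depth = focal_length * baseline / disparity.
        return [self._focal * self._baseline / d for d in self._disparity if d > 0]


def obstacle_ahead(sensor: RangeSensorAbstraction, threshold_m: float = 0.5) -> bool:
    """Portable obstacle-avoidance check that works on any platform."""
    return any(d < threshold_m for d in sensor.get_distances())
```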

    Visual analysis of sensor logs in smart spaces: Activities vs. situations

    Models of human habits in smart spaces can be expressed using a multitude of representations whose readability influences the possibility of their being validated by human experts. Our research is focused on developing a visual analysis pipeline (service) that, starting from the sensor log of a smart space, graphically visualizes human habits. The basic assumption is to apply techniques borrowed from the area of business process automation and mining to a version of the sensor log preprocessed so that raw sensor measurements are translated into human actions. The proposed pipeline is employed to automatically extract models to be reused for ambient intelligence. In this paper, we present a user evaluation aimed at demonstrating the effectiveness of the approach by comparing it with a relevant state-of-the-art visual tool, namely SITUVIS.
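
    A minimal sketch of the preprocessing step described above, with invented sensor names, rule mappings, and log format purely for illustration: raw sensor measurements are translated into human-action labels of the kind a process-mining-style visualisation could consume.

```python
# Hypothetical raw smart-space sensor log (sensor names and values are invented).
RAW_LOG = [
    {"timestamp": "2023-05-01T07:02", "sensor": "bed_pressure", "value": 0},
    {"timestamp": "2023-05-01T07:05", "sensor": "kitchen_motion", "value": 1},
    {"timestamp": "2023-05-01T07:06", "sensor": "kettle_power", "value": 1},
]

# Illustrative rule table: (sensor, value) -> human-readable action.
EVENT_TO_ACTION = {
    ("bed_pressure", 0): "wake up",
    ("kitchen_motion", 1): "enter kitchen",
    ("kettle_power", 1): "prepare hot drink",
}


def to_action_log(raw_log):
    """Map each raw measurement to an action label, keeping the timestamp."""
    actions = []
    for event in raw_log:
        key = (event["sensor"], event["value"])
        if key in EVENT_TO_ACTION:
            actions.append({"timestamp": event["timestamp"],
                            "action": EVENT_TO_ACTION[key]})
    return actions


print(to_action_log(RAW_LOG))
```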

    Cognitive visual tracking and camera control

    Cognitive visual tracking is the process of observing and understanding the behaviour of a moving person. This paper presents an efficient solution to extract, in real time, high-level information from an observed scene and to generate the most appropriate commands for a set of pan-tilt-zoom (PTZ) cameras in a surveillance scenario. Such a high-level feedback control loop, which is the main novelty of our work, serves to reduce uncertainties in the observed scene and to maximize the amount of information extracted from it. It is implemented with a distributed camera system using SQL tables as virtual communication channels, and Situation Graph Trees for knowledge representation, inference and high-level camera control. A set of experiments in a surveillance scenario shows the effectiveness of our approach and its potential for real applications of cognitive vision.
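
    A rough sketch of the "SQL tables as virtual communication channels" idea, assuming a SQLite backend and an invented observations schema (not the system's actual tables): a tracker process publishes target positions, and a camera controller reads the latest one to derive a PTZ command.

```python
import sqlite3

# One shared database acts as the virtual communication channel between processes.
conn = sqlite3.connect("surveillance.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS observations (
           id INTEGER PRIMARY KEY AUTOINCREMENT,
           camera_id INTEGER,
           target_x REAL,
           target_y REAL,
           timestamp REAL
       )"""
)


def publish_observation(camera_id, x, y, t):
    """Tracker side: write a detected target position into the shared table."""
    conn.execute(
        "INSERT INTO observations (camera_id, target_x, target_y, timestamp) "
        "VALUES (?, ?, ?, ?)",
        (camera_id, x, y, t),
    )
    conn.commit()


def latest_observation(camera_id):
    """Controller side: read the newest observation to derive a PTZ command."""
    return conn.execute(
        "SELECT target_x, target_y FROM observations "
        "WHERE camera_id = ? ORDER BY id DESC LIMIT 1",
        (camera_id,),
    ).fetchone()
```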

    Applications of fuzzy logic to control and decision making

    Long range space missions will require high operational efficiency as well as autonomy to enhance the effectiveness of performance. Fuzzy logic technology has been shown to be powerful and robust in interpreting imprecise measurements and generating appropriate control decisions for many space operations. Several applications are underway that study the fuzzy logic approach to solving control and decision making problems. Fuzzy logic algorithms for relative motion and attitude control have been developed and demonstrated for proximity operations. Based on this experience, motion control algorithms that include obstacle avoidance were developed for a Mars Rover prototype for maneuvering during the sample collection process. A concept of an intelligent sensor system that can identify objects, track them continuously, and learn from its environment is under development to support traffic management and proximity operations around the Space Station Freedom. For safe and reliable operation of Lunar/Mars based crew quarters, high speed controllers with the ability to combine imprecise measurements from several sensors are required. A fuzzy logic approach that uses high speed fuzzy hardware chips is being studied.
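
    To make the fuzzy-logic idea concrete, here is a minimal sketch of a one-input rule base with triangular membership functions; the membership ranges, rules, and braking output are invented for illustration and are not taken from the cited applications.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def fuzzy_braking_command(range_m):
    """Map an imprecise range measurement to a braking level in [0, 1]."""
    # Fuzzify the input: degree to which the target is "close", "medium", "far".
    close = tri(range_m, -1.0, 0.0, 5.0)
    medium = tri(range_m, 2.0, 7.0, 12.0)
    far = tri(range_m, 9.0, 20.0, 31.0)

    # Rule base: close -> hard braking (1.0), medium -> moderate (0.5), far -> none (0.0).
    # Defuzzify with a weighted average of the rule outputs.
    weights = [close, medium, far]
    outputs = [1.0, 0.5, 0.0]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0


print(fuzzy_braking_command(3.0))  # partially "close" and "medium" -> moderate braking
```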

    Smart Traction Control Systems for Electric Vehicles Using Acoustic Road-type Estimation

    The application of traction control systems (TCS) to electric vehicles (EV) has great potential due to the easy implementation of torque control with direct-drive motors. However, the control system usually requires road-tire friction and slip-ratio values, which must be estimated. While the former cannot be obtained directly, estimating the latter requires accurate measurements of chassis and wheel velocity. In addition, existing TCS structures are often designed without considering the robustness and energy efficiency of torque control. In this work, both problems are addressed with a smart TCS design having an integrated acoustic road-type estimation (ARTE) unit. This unit enables road-type recognition, and this information is used to retrieve the correct look-up table between friction coefficient and slip ratio. The estimation of the friction coefficient helps the system update the necessary input torque. The ARTE unit utilizes machine learning, mapping the acoustic feature inputs to road type as output. In this study, three existing TCS for EVs are examined with and without the integrated ARTE unit. The results show significant performance improvement with ARTE, reducing the slip ratio by 75% while saving energy via a reduction of the applied torque and increasing the robustness of the TCS.
    Comment: Accepted to be published by IEEE Trans. on Intelligent Vehicles, 22 Jan 201
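
    A sketch of how the ARTE idea could fit together, under stated assumptions: a classifier maps acoustic features to a road type, the road type selects a friction/slip look-up entry, and the torque command is capped when measured slip exceeds the road-specific optimum. The feature vectors, table values, and classifier choice are illustrative, not the paper's.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative friction-coefficient / optimal-slip-ratio look-up per road type.
ROAD_TABLES = {
    "asphalt": {"mu": 0.9, "optimal_slip": 0.15},
    "gravel": {"mu": 0.6, "optimal_slip": 0.10},
    "snow": {"mu": 0.2, "optimal_slip": 0.05},
}

# Toy classifier mapping acoustic features (e.g. two band energies) to road type.
X_train = np.array([[0.9, 0.1], [0.8, 0.2], [0.5, 0.5], [0.4, 0.6], [0.1, 0.9], [0.2, 0.8]])
y_train = ["asphalt", "asphalt", "gravel", "gravel", "snow", "snow"]
arte = RandomForestClassifier(n_estimators=10, random_state=0).fit(X_train, y_train)


def traction_torque(requested_torque, wheel_speed, chassis_speed, acoustic_features):
    """Cap the requested torque when the measured slip exceeds the road-specific optimum."""
    road = arte.predict(np.array([acoustic_features]))[0]
    table = ROAD_TABLES[road]
    slip = (wheel_speed - chassis_speed) / max(wheel_speed, 1e-6)
    if slip > table["optimal_slip"]:
        # Scale torque down in proportion to the available friction and excess slip.
        return requested_torque * table["mu"] * table["optimal_slip"] / slip
    return requested_torque


print(traction_torque(200.0, 11.0, 10.0, [0.85, 0.15]))
```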

    Robot control based on qualitative representation of human trajectories

    A major challenge for future social robots is the high-level interpretation of human motion and the consequent generation of appropriate robot actions. This paper describes some fundamental steps towards the real-time implementation of a system that allows a mobile robot to transform quantitative information about human trajectories (i.e. coordinates and speed) into qualitative concepts, and from these to generate appropriate control commands. The problem is formulated using a simple version of the qualitative trajectory calculus, then solved using an inference engine based on fuzzy temporal logic and situation graph trees. Preliminary results are discussed and future directions of the current research are outlined.
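
    A tiny sketch of the quantitative-to-qualitative step, assuming a simple discretisation of relative distance and speed with invented thresholds and labels rather than the paper's actual qualitative trajectory calculus:

```python
import math


def qualitative_state(robot_xy, human_xy, human_speed):
    """Turn coordinates and speed into coarse qualitative concepts."""
    distance = math.dist(robot_xy, human_xy)

    # Discretise distance into qualitative proximity classes (thresholds are illustrative).
    if distance < 1.0:
        proximity = "close"
    elif distance < 3.0:
        proximity = "near"
    else:
        proximity = "far"

    # Discretise speed into qualitative motion classes.
    motion = "moving" if human_speed > 0.2 else "standing"
    return proximity, motion


def control_command(state):
    """Illustrative rule table mapping a qualitative state to a robot command."""
    rules = {
        ("close", "moving"): "stop",
        ("close", "standing"): "back off slowly",
        ("near", "moving"): "slow down",
    }
    return rules.get(state, "continue")


print(control_command(qualitative_state((0.0, 0.0), (0.8, 0.4), 0.5)))  # -> "stop"
```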