
    Ontological representation of time-of-flight camera data to support vision-based AmI

    Proceedings of: 4th International Workshop on Sensor Networks and Ambient Intelligence, 19-23 March 2012, Lugano (Switzerland). Recent advances in technologies for capturing video data have opened up a vast number of new application areas, among them the incorporation of Time-of-Flight (ToF) cameras into Ambient Intelligence (AmI) environments. Although the performance of tracking algorithms has improved quickly, the symbolic models used to represent the resulting knowledge have not yet been adapted to smart environments. This paper presents an extension of a previous system in the area of video-based AmI that incorporates ToF information to enhance scene interpretation. The framework is founded on an ontology-based model of the scene, which is extended to incorporate ToF data. The advantages and new features of the model are demonstrated in a Social Signal Processing (SSP) application. This work was supported in part by Projects CICYT TIN2011-28620-C02-01, CICYT TEC2011-28626-C02-02, CAM CONTEXTS (S2009/TIC-1485) and DPS2008-07029-C02-02.
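    The abstract above describes attaching ToF observations to an ontology-based scene model. As a purely illustrative aside (not the paper's actual vocabulary), the sketch below shows how one such observation could be recorded as RDF triples with Python's rdflib; the ami namespace and every class and property name (Person, ToFObservation, observes, depthMetres, timestamp) are hypothetical.

```python
# Minimal sketch (not the paper's actual ontology): attaching a ToF depth
# observation to a tracked person in an RDF scene model using rdflib.
# The AMI namespace and all class/property names below are hypothetical.
from rdflib import Graph, Namespace, Literal, RDF, XSD

AMI = Namespace("http://example.org/ami#")   # hypothetical scene ontology

g = Graph()
g.bind("ami", AMI)

person = AMI["person_01"]
obs = AMI["tof_obs_0001"]

g.add((person, RDF.type, AMI.Person))
g.add((obs, RDF.type, AMI.ToFObservation))
g.add((obs, AMI.observes, person))
g.add((obs, AMI.depthMetres, Literal(2.34, datatype=XSD.float)))
g.add((obs, AMI.timestamp, Literal("2012-03-19T10:15:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```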

    Image-guided ToF depth upsampling: a survey

    Recently, there has been remarkable growth of interest in the development and applications of time-of-flight (ToF) depth cameras. Despite the continual improvement of their characteristics, the practical applicability of ToF cameras is still limited by the low resolution and quality of depth measurements. This has motivated many researchers to combine ToF cameras with other sensors in order to enhance and upsample depth images. In this paper, we review approaches that couple ToF depth images with high-resolution optical images. Other classes of upsampling methods are also briefly discussed. Finally, we provide an overview of the performance evaluation tests presented in the related studies.
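    As a concrete illustration of this class of methods, the sketch below implements joint bilateral upsampling (Kopf et al., 2007), one representative image-guided approach: each high-resolution depth value is a weighted average of nearby low-resolution ToF samples, with spatial weights computed on the low-resolution grid and range weights computed from the high-resolution guide image. The scale relationship, kernel radius and sigma values are illustrative assumptions; the survey itself covers many other methods.

```python
# Sketch of joint bilateral upsampling: low-resolution ToF depth is
# interpolated on the high-resolution grid using spatial weights plus
# range weights from a high-resolution grayscale guide image in [0, 1].
# Unvectorized for clarity; parameter values are illustrative assumptions.
import numpy as np

def joint_bilateral_upsample(depth_lr, guide_hr, scale, radius=2,
                             sigma_s=1.0, sigma_r=0.1):
    """depth_lr: (h, w) low-res depth; guide_hr: (h*scale, w*scale) guide."""
    H, W = guide_hr.shape
    h, w = depth_lr.shape
    out = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            # Position of this high-res pixel on the low-res grid.
            yl, xl = y / scale, x / scale
            yc = min(int(round(yl)), h - 1)
            xc = min(int(round(xl)), w - 1)
            num, den = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    qy, qx = yc + dy, xc + dx
                    if not (0 <= qy < h and 0 <= qx < w):
                        continue
                    # Spatial weight in low-res coordinates.
                    ws = np.exp(-((qy - yl) ** 2 + (qx - xl) ** 2) / (2 * sigma_s ** 2))
                    # Range weight from the high-res guide image.
                    gy, gx = qy * scale, qx * scale
                    wr = np.exp(-((guide_hr[y, x] - guide_hr[gy, gx]) ** 2) / (2 * sigma_r ** 2))
                    num += ws * wr * depth_lr[qy, qx]
                    den += ws * wr
            out[y, x] = num / den if den > 0 else depth_lr[yc, xc]
    return out

# Example: upsample a 16x20 depth map by 4x using a 64x80 grayscale guide.
depth_lr = np.random.rand(16, 20)
guide_hr = np.random.rand(64, 80)
depth_hr = joint_bilateral_upsample(depth_lr, guide_hr, scale=4)
```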

    Assistive technology design and development for acceptable robotics companions for ageing years

    © 2013 Farshid Amirabdollahian et al., licensee Versita Sp. z o. o. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs license, which means that the text may be used for non-commercial purposes, provided credit is given to the author. A new stream of research and development responds to changes in life expectancy across the world. It includes technologies which enhance the well-being of individuals, specifically older people. The ACCOMPANY project focuses on home companion technologies and the issues surrounding technology development for assistive purposes. The project responds to some overlooked aspects of technology design, divided into areas such as empathic and social human-robot interaction, robot learning and memory visualisation, and monitoring of persons' activities at home. To bring these aspects together, a dedicated task ensures the technological integration of these approaches on an existing robotic platform, Care-O-Bot® 3, in the context of a smart-home environment utilising a multitude of sensor arrays. Formative and summative evaluation cycles are then used to assess the emerging prototype, identifying acceptable behaviours and roles for the robot (for example, butler or trainer) while also comparing user requirements to achieved progress. In a novel approach, the project considers ethical concerns: by highlighting principles such as autonomy, independence, enablement, safety and privacy, it provides a discussion medium in which user views on these principles, and the tensions between some of them (for example, between privacy or autonomy and safety), can be captured and considered in design cycles throughout project development.

    Employing a RGB-D Sensor for Real-Time Tracking of Humans across Multiple Re-Entries in a Smart Environment

    The term smart environment refers to physical spaces equipped with sensors feeding into adaptive algorithms that enable the environment to become sensitive and responsive to the presence and needs of its occupants. People with special needs, such as the elderly or disabled, stand to benefit most from such environments, as they offer sophisticated assistive functionalities supporting independent living and improved safety. In a smart environment, the key issue is to sense the location and identity of its users. In this paper, we tackle the problems of detecting and tracking humans in a realistic home environment by exploiting the complementary nature of (synchronized) color and depth images produced by a low-cost consumer-level RGB-D camera. Our system selectively feeds the complementary data emanating from the two vision sensors to different algorithmic modules which together implement three sequential components: (1) object labeling based on depth data clustering, (2) human re-entry identification based on comparing visual signatures extracted from the color (RGB) information, and (3) human tracking based on the fusion of both depth and RGB data. Experimental results show that this division of labor improves the system's efficiency and classification performance.
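    The three sequential components can be pictured as a pipeline. The sketch below is only a structural illustration with deliberately simplified stand-in logic (a depth-range threshold instead of the paper's depth clustering, a per-channel colour histogram compared by the Bhattacharyya coefficient instead of its visual signatures); function names, thresholds and the enrolled identity are hypothetical.

```python
# Structural sketch of the three stages named in the abstract; the
# per-stage logic here is a simplified stand-in, not the paper's method.
import numpy as np

def label_objects(depth, near=0.5, far=4.0):
    """(1) Object labeling: keep depth pixels inside a plausible person range."""
    return (depth > near) & (depth < far)          # boolean foreground mask

def colour_signature(rgb, mask, bins=16):
    """Normalised per-channel colour histogram over the masked region."""
    hists = [np.histogram(rgb[..., c][mask], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(np.float64)
    return h / (h.sum() + 1e-9)

def reidentify(signature, gallery, threshold=0.7):
    """(2) Re-entry identification: match against signatures of known users."""
    best_id, best_score = None, 0.0
    for person_id, ref in gallery.items():
        score = np.sum(np.sqrt(signature * ref))   # Bhattacharyya coefficient
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

def track_frame(rgb, depth, gallery):
    """(3) Tracking step fusing both modalities: locate, identify, report centroid."""
    mask = label_objects(depth)
    if not mask.any():
        return None, None
    sig = colour_signature(rgb, mask)
    person = reidentify(sig, gallery)
    ys, xs = np.nonzero(mask)
    return person, (xs.mean(), ys.mean(), depth[mask].mean())

# Toy frame: 240x320 RGB-D input and a single enrolled signature.
rgb = np.random.randint(0, 256, (240, 320, 3))
depth = np.random.uniform(0.3, 6.0, (240, 320))
gallery = {"alice": colour_signature(rgb, label_objects(depth))}
print(track_frame(rgb, depth, gallery))
```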

    Precision Agriculture for Crop and Livestock Farming—Brief Review

    In the last few decades, agriculture has played an important role in the worldwide economy. The need to produce more food for a rapidly growing population is putting pressure on crop and animal production and having a negative impact on the environment. On the other hand, smart farming technologies are becoming increasingly common in modern agriculture, helping to optimize agricultural and livestock production and to minimize waste and costs. Precision agriculture (PA) is a technology-enabled, data-driven approach to farm management that observes, measures, and analyzes the needs of individual fields and crops. Precision livestock farming (PLF), relying on the automatic monitoring of individual animals, is used for animal growth, milk production, and the detection of diseases, as well as to monitor animal behavior and the physical environment, among other uses. This study aims to briefly review recent scientific and technological trends in PA and their application in crop and livestock farming, serving as a simple research guide for researchers and farmers applying technology to agriculture. The development and operation of PA applications involve several steps and techniques that need to be investigated further to make the developed systems accurate and implementable in commercial environments.

    Infrared Sensors for Autonomous Vehicles

    The surge of interest in and development of autonomous vehicles is a continuing boost to the growth of electronic devices in the automotive industry. The sensing, processing, activation, feedback and control functions performed by the human brain have to be replaced with electronics. The task is proving to be exhilarating and daunting at the same time. The environment sensors, RADAR (RAdio Detection And Ranging), camera and LIDAR (Light Detection And Ranging), are enjoying a lot of attention, with increasingly greater range and resolution being demanded of the "eyes" and faster computation of the "brain". Even though all three and more sensors (ultrasonic, stereo camera, GPS, etc.) will be used together, this chapter focuses on the challenges facing camera and LIDAR. Anywhere from 2 to 8 cameras and 1 to 2 LIDARs are expected to be part of the sensor suite needed by autonomous vehicles, which have to function equally well by day and night. Near-infrared (800–1000 nm) devices are currently the emitters of choice in these sensors. Greater range, resolution and field of view pose many challenges to overcome with new electronic device innovations before we realize the safety and other benefits of autonomous vehicles.

    A review of smartphones based indoor positioning: challenges and applications

    The continual proliferation of mobile devices has encouraged much effort in using smartphones for indoor positioning. This article is dedicated to reviewing the most recent and interesting smartphone-based indoor navigation systems, ranging from electromagnetic to inertial to visible-light ones, with an emphasis on their unique challenges and potential real-world applications. A taxonomy of smartphone sensors will be introduced, which serves as the basis for categorising the different positioning systems under review. A set of criteria to be used for evaluation purposes will be devised. For each sensor category, the most recent, interesting and practical systems will be examined, with detailed discussion of the open research questions for academics and of the practicality for potential clients.

    Probabilistic three-dimensional object tracking based on adaptive depth segmentation

    Object tracking is one of the fundamental topics of computer vision, with diverse applications. The challenges arising in tracking, i.e., cluttered scenes, occlusion, complex motion, and illumination variations, have motivated the utilization of depth information from 3D sensors. However, current 3D trackers are not applicable to unconstrained environments without a priori knowledge. As an important object detection module in tracking, segmentation subdivides an image into its constituent regions. Nevertheless, the existing range segmentation methods in the literature are difficult to implement in real time due to their slow performance. In this thesis, a 3D object tracking method based on adaptive depth segmentation and particle filtering is presented. In this approach, the segmentation method, as the bottom-up process, is combined with the particle filter, as the top-down process, to achieve efficient tracking results under challenging circumstances. The experimental results demonstrate the efficiency, as well as the robustness, of the tracking algorithm utilizing real-world range information.
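    To make the top-down component concrete, the sketch below runs a generic bootstrap particle filter over an object's 3D centroid, using the centroid returned by a (here simulated) depth segmentation as the measurement. The random-walk motion model, Gaussian likelihood, particle count and noise levels are illustrative assumptions rather than the thesis' actual design.

```python
# Minimal sketch of the top-down component only: a bootstrap particle filter
# over an object's 3D centroid, weighted by a Gaussian likelihood around the
# centroid reported by the (bottom-up) depth segmentation. All parameters
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                                       # number of particles
particles = rng.normal([0.0, 0.0, 2.0], 0.1, size=(N, 3))     # (x, y, z) in metres
weights = np.full(N, 1.0 / N)

def pf_step(particles, weights, measurement, motion_std=0.05, meas_std=0.10):
    """One predict-update-resample cycle of the bootstrap filter."""
    # Predict: random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: Gaussian likelihood of the segmented-depth centroid.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights /= weights.sum()
    # Resample (systematic) to avoid weight degeneracy.
    positions = (rng.random() + np.arange(len(weights))) / len(weights)
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions),
                     len(weights) - 1)
    particles = particles[idx]
    weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# Feed a few simulated segmentation centroids drifting along x.
for t in range(5):
    z = np.array([0.02 * t, 0.0, 2.0]) + rng.normal(0.0, 0.02, 3)
    particles, weights = pf_step(particles, weights, z)
    print(t, np.average(particles, axis=0, weights=weights))
```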
