1,254 research outputs found

    A Survey on IT-Techniques for a Dynamic Emergency Management in Large Infrastructures

    This deliverable is a survey of the IT techniques relevant to the three use cases of the EMILI project. It describes the state of the art in four complementary IT areas: data cleansing, supervisory control and data acquisition (SCADA), wireless sensor networks, and complex event processing. Although the authors have tried to avoid overly technical language and to explain every concept they refer to, the deliverable may still seem rather technical to readers not yet familiar with the techniques it describes.
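    As a small illustration of the last of these areas, complex event processing, the sketch below raises a composite alert when two primitive sensor events co-occur within a time window. It is a minimal, generic example; the event types, zone names, and window length are assumptions and are not taken from the EMILI deliverable.

```python
# Minimal complex-event-processing sketch (illustrative only, not EMILI tooling):
# raise a composite "possible_fire" event when a high-temperature reading is
# followed by a smoke reading from the same zone within a short time window.
from collections import deque

WINDOW_SECONDS = 30.0  # assumed correlation window

def detect_possible_fire(events):
    """events: iterable of (timestamp, zone, kind) tuples, ordered by time."""
    recent_heat = {}   # zone -> deque of timestamps of recent high-temperature events
    alerts = []
    for ts, zone, kind in events:
        heat = recent_heat.setdefault(zone, deque())
        # Drop high-temperature events that fell out of the correlation window.
        while heat and ts - heat[0] > WINDOW_SECONDS:
            heat.popleft()
        if kind == "high_temperature":
            heat.append(ts)
        elif kind == "smoke" and heat:
            alerts.append((ts, zone, "possible_fire"))
    return alerts

if __name__ == "__main__":
    stream = [(0.0, "hall", "high_temperature"),
              (12.5, "hall", "smoke"),
              (80.0, "lab", "smoke")]          # no preceding heat -> no alert
    print(detect_possible_fire(stream))        # [(12.5, 'hall', 'possible_fire')]
```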

    When Things Matter: A Data-Centric View of the Internet of Things

    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. While IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers on managing IoT data, typically produced in dynamic and volatile environments, which is not only extremely large in scale and volume but also noisy and continuous. This article surveys the main techniques and state-of-the-art research efforts in IoT from a data-centric perspective, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
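    As a concrete illustration of the data stream processing theme, the sketch below computes tumbling-window averages over a sensor stream. It is a generic example under assumed field names and window size, not code from any of the surveyed systems.

```python
# Illustrative tumbling-window aggregation over an IoT sensor stream.
# The (timestamp, value) format and 60-second window are assumptions.
from statistics import mean

def tumbling_window_average(readings, window_seconds=60.0):
    """readings: iterable of (timestamp, value), ordered by timestamp.
    Yields (window_start, average_value) once each window closes."""
    window_start, values = None, []
    for ts, value in readings:
        if window_start is None:
            window_start = ts
        # Close (and emit) any windows that the current reading has passed.
        while ts - window_start >= window_seconds:
            if values:
                yield window_start, mean(values)
            window_start += window_seconds
            values = []
        values.append(value)
    if values:
        yield window_start, mean(values)

if __name__ == "__main__":
    stream = [(0, 21.0), (30, 21.4), (61, 22.0), (95, 35.0), (130, 22.1)]
    for start, avg in tumbling_window_average(stream):
        print(f"window starting at {start}s: mean {avg:.2f}")
```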

    DataHub: Collaborative Data Science & Dataset Version Management at Scale

    Relational databases have limited support for data collaboration, where teams collaboratively curate and analyze large datasets. Inspired by software version control systems such as git, we propose (a) a dataset version control system, giving users the ability to create, branch, merge, difference, and search large, divergent collections of datasets, and (b) a platform, DataHub, that lets users perform collaborative data analysis building on this version control system. We outline the challenges in providing dataset version control at scale.
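    The git-inspired operations named in the abstract (create, branch, merge, diff) can be pictured with a toy in-memory version graph over keyed records. This is an illustrative sketch only; the class name, data model, and union-merge policy are assumptions, not DataHub's actual design.

```python
# Toy dataset version graph: each version is a snapshot of records keyed by id.
class DatasetRepo:
    def __init__(self):
        self.versions = {}      # version id -> {record key: record}
        self.parents = {}       # version id -> tuple of parent version ids
        self.branches = {}      # branch name -> head version id
        self._next = 0

    def _commit(self, branch, records, parents):
        vid = self._next
        self._next += 1
        self.versions[vid] = dict(records)
        self.parents[vid] = parents
        self.branches[branch] = vid
        return vid

    def create(self, branch, records):
        return self._commit(branch, records, ())

    def branch(self, source, new_branch):
        # A branch is just a new head pointer to an existing version.
        self.branches[new_branch] = self.branches[source]

    def commit(self, branch, records):
        return self._commit(branch, records, (self.branches[branch],))

    def diff(self, v1, v2):
        a, b = self.versions[v1], self.versions[v2]
        return {"added": [k for k in b if k not in a],
                "removed": [k for k in a if k not in b],
                "changed": [k for k in a if k in b and a[k] != b[k]]}

    def merge(self, into, other):
        """Naive union merge: records from 'other' win on conflict."""
        merged = dict(self.versions[self.branches[into]])
        merged.update(self.versions[self.branches[other]])
        return self._commit(into, merged,
                            (self.branches[into], self.branches[other]))

if __name__ == "__main__":
    repo = DatasetRepo()
    v0 = repo.create("main", {"r1": {"city": "Oslo"}})
    repo.branch("main", "cleaning")
    v1 = repo.commit("cleaning", {"r1": {"city": "Oslo"}, "r2": {"city": "Bergen"}})
    print(repo.diff(v0, v1))    # r2 added, nothing removed or changed
    repo.merge("main", "cleaning")
```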

    SurfelWarp: Efficient Non-Volumetric Single View Dynamic Reconstruction

    We contribute a dense SLAM system that takes a live stream of depth images as input and reconstructs non-rigid deforming scenes in real time, without templates or prior models. In contrast to existing approaches, we do not maintain any volumetric data structures, such as truncated signed distance function (TSDF) fields or deformation fields, which are performance and memory intensive. Our system works with a flat point (surfel) based representation of geometry, which can be directly acquired from commodity depth sensors. Standard graphics pipelines and general-purpose GPU (GPGPU) computing are leveraged for all central operations: nearest neighbor maintenance, non-rigid deformation field estimation, and fusion of depth measurements. Our pipeline inherently avoids expensive volumetric operations such as marching cubes, volumetric fusion, and dense deformation field updates, leading to significantly improved performance. Furthermore, the explicit and flexible surfel-based geometry representation enables efficient handling of topology changes and tracking failures, which keeps our reconstructions consistent with updated depth observations. Our system allows robots to maintain a scene description with non-rigidly deformed objects, potentially enabling interaction with dynamic working environments. Comment: RSS 2018. The video and source code are available at https://sites.google.com/view/surfelwarp/hom
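    For readers unfamiliar with surfels, the sketch below shows a flat point-based map element and a confidence-weighted fusion of one associated depth observation. It is a generic surfel-fusion illustration, not the SurfelWarp pipeline, which additionally estimates a non-rigid deformation field on the GPU.

```python
# Generic surfel fusion sketch: fold an associated depth observation into a
# surfel by confidence-weighted averaging. Field names and weights are assumed.
from dataclasses import dataclass

@dataclass
class Surfel:
    position: tuple      # (x, y, z) in world coordinates
    normal: tuple        # surface normal
    radius: float        # extent of the disk the surfel represents
    confidence: float    # accumulated measurement weight

def fuse(surfel, obs_position, obs_normal, obs_radius, obs_weight=1.0):
    """Fold one associated depth observation into an existing surfel."""
    w0, w1 = surfel.confidence, obs_weight
    total = w0 + w1
    blend = lambda a, b: tuple((w0 * x + w1 * y) / total for x, y in zip(a, b))
    return Surfel(position=blend(surfel.position, obs_position),
                  normal=blend(surfel.normal, obs_normal),   # renormalise in a real system
                  radius=min(surfel.radius, obs_radius),
                  confidence=total)

if __name__ == "__main__":
    s = Surfel((0.0, 0.0, 1.00), (0.0, 0.0, 1.0), 0.01, 1.0)
    print(fuse(s, (0.0, 0.0, 1.02), (0.0, 0.0, 1.0), 0.008))
```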

    CRUC: Cold-start Recommendations Using Collaborative Filtering in Internet of Things

    The Internet of Things (IoT) aims at interconnecting everyday objects (both things and users) and then using this connection information to provide customized user services. However, IoT does not work in its initial stages without adequate acquisition of user preferences. This is due to the cold-start problem: a situation in which only a few users are interconnected. To this end, we propose the CRUC scheme (Cold-start Recommendations Using Collaborative Filtering in IoT), which involves formulation, filtering, and prediction steps. Extensive experiments over real cases and simulation have been performed to evaluate the performance of the CRUC scheme. Experimental results show that CRUC efficiently solves the cold-start problem in IoT. Comment: Elsevier ESEP 2011: 9-10 December 2011, Singapore, Elsevier Energy Procedia, http://www.elsevier.com/locate/procedia/, 201
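    The filtering and prediction steps mentioned above are in the spirit of standard collaborative filtering. The sketch below shows a generic user-based prediction with cosine similarity over sparse preferences; it does not reproduce the actual CRUC formulation, and the item names are invented for illustration.

```python
# Generic user-based collaborative filtering over sparse preference dictionaries.
import math

def cosine(a, b):
    """Cosine similarity between two {item: preference} dictionaries."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    num = sum(a[i] * b[i] for i in shared)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def predict(target, others, item):
    """Predict target's preference for item from similar users' preferences."""
    num = den = 0.0
    for other in others:
        if item in other:
            sim = cosine(target, other)
            num += sim * other[item]
            den += abs(sim)
    return num / den if den else None   # None: still too little data (cold start)

if __name__ == "__main__":
    me = {"lamp": 1.0, "thermostat": 0.5}
    neighbours = [{"lamp": 1.0, "thermostat": 0.4, "lock": 0.9},
                  {"lamp": 0.2, "lock": 0.1}]
    print(predict(me, neighbours, "lock"))
```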

    Understanding the indoor environment through mining sensory data - a case study

    A wireless sensor network (WSN) is a group of sensors linked by a wireless medium to perform distributed sensing tasks. WSNs have attracted wide interest from academia and industry alike due to their diversity of applications in buildings, including home automation, smart environments, and emergency services. The primary goal of a WSN is to collect the data sensed by its sensors. These data are characteristically heavily noisy and exhibit temporal and spatial correlation. As this paper demonstrates, extracting useful information from such data requires a range of analysis techniques. Data mining is a process in which a wide spectrum of data analysis methods is used; it is applied in this paper to analyse data collected from WSNs monitoring an indoor environment in a building. A case study demonstrates how data mining can be used to optimise the use of the office space in a building.
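    A minimal example of the kind of sensory-data mining described above: estimating how often each office is occupied from motion-sensor readings so that under-used rooms can be identified. The sensor tuple format, interval length, and room names are assumptions, not the paper's case study.

```python
# Estimate per-room occupancy rates from motion-sensor readings.
from collections import defaultdict

def occupancy_rates(readings, interval_seconds=900):
    """readings: iterable of (timestamp, room, motion_detected) tuples.
    Returns {room: fraction of 15-minute intervals with any motion}."""
    observed = defaultdict(set)      # room -> interval indices with any reading
    occupied = defaultdict(set)      # room -> interval indices with motion
    for ts, room, motion in readings:
        idx = int(ts // interval_seconds)
        observed[room].add(idx)
        if motion:
            occupied[room].add(idx)
    return {room: len(occupied[room]) / len(observed[room]) for room in observed}

if __name__ == "__main__":
    data = [(0, "A", True), (900, "A", False), (1800, "A", True),
            (0, "B", False), (900, "B", False), (1800, "B", False)]
    print(occupancy_rates(data))   # room A occupied ~2/3 of intervals, room B never
```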

    Spatial Data Quality in the IoT Era: Management and Exploitation

    Within the rapidly expanding Internet of Things (IoT), growing amounts of spatially referenced data are being generated. Due to the dynamic, decentralized, and heterogeneous nature of the IoT, spatial IoT data (SID) quality has attracted considerable attention in academia and industry. How to invent and use technologies for managing spatial data quality and for exploiting low-quality spatial data is a key challenge in the IoT. In this tutorial, we highlight the SID consumption requirements of applications and offer an overview of spatial data quality in the IoT setting. In addition, we review pertinent technologies for quality management and low-quality data exploitation, and we identify trends and future directions for quality-aware SID management and utilization. The tutorial aims not only to help researchers and practitioners better comprehend SID quality challenges and solutions, but also to offer insights that may enable innovative research and applications.
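    As one concrete example of the kind of spatial-data-quality check such a tutorial covers, the sketch below flags GPS fixes whose implied speed from the previous fix is physically implausible. The point format and speed threshold are assumptions.

```python
# Flag GPS fixes with implausible implied speed (a simple spatial quality check).
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    (lat1, lon1), (lat2, lon2) = p, q
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def flag_implausible(fixes, max_speed_mps=50.0):
    """fixes: list of (timestamp_s, lat, lon); returns indices of suspect fixes."""
    suspect = []
    for i in range(1, len(fixes)):
        (t0, *p0), (t1, *p1) = fixes[i - 1], fixes[i]
        dt = t1 - t0
        if dt > 0 and haversine_m(tuple(p0), tuple(p1)) / dt > max_speed_mps:
            suspect.append(i)
    return suspect

if __name__ == "__main__":
    track = [(0, 57.0, 9.9), (10, 57.0005, 9.9), (20, 57.3, 9.9)]  # last fix jumps ~33 km in 10 s
    print(flag_implausible(track))   # [2]
```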

    The design and development of multi-agent based RFID middleware system for data and devices management

    Thesis (D. Tech. (Electrical Engineering)) - Central University of Technology, Free State, 2012.

    Radio frequency identification (RFID) technology has emerged as a key technology for automatic identification and promises to revolutionize business processes. While RFID adoption is improving rapidly, reliable and widespread deployment of this technology still faces many significant challenges. The key deployment challenges include how to use the simple, unreliable raw data generated by RFID deployments to make business decisions, and how to manage a large number of deployed RFID devices. In this thesis, a multi-agent based RFID middleware that addresses some of these RFID data and device management challenges was developed. The middleware abstracts auto-identification applications from physical RFID device-specific details and provides necessary services such as device management, data cleaning, event generation, query capabilities, and event persistence. The use of software agent technology offers a more scalable and distributed system architecture for the proposed middleware.

    As part of the multi-agent system, an application-independent domain ontology for RFID devices was developed; this ontology can be used or extended by any application concerned with the RFID domain. To address the event processing tasks within the proposed middleware, a temporal RFID data model was developed that incorporates applications' temporal and spatial granules into the data model itself for efficient event processing. The data model extends conventional Entity-Relationship constructs by adding a time attribute. By maintaining the history of events and state changes, it captures the fundamental RFID application logic within the data model and thus supports efficient generation of application-level events as well as updating, querying, and analysis of both recent and historical events.

    As part of the RFID middleware, an adaptive sliding-window based data cleaning scheme for reducing missed readings from RFID data streams (called WSTD) was also developed. The WSTD scheme models the unreliability of RFID readings by viewing RFID streams as a statistical sample of tags in the physical world, and exploits techniques grounded in sampling theory to drive its cleaning processes. The scheme copes efficiently with both environmental variations and tag dynamics by automatically and continuously adapting its cleaning window size based on observed readings.
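    The sketch below illustrates the general adaptive sliding-window idea behind schemes such as WSTD: treat the reads in a window as a statistical sample, estimate the per-epoch read probability, and size the window so that a tag that is truly present is unlikely to be missed. It is a simplified stand-in and does not reproduce the thesis's exact WSTD update rules.

```python
# Adaptive sliding-window smoothing of a noisy RFID read stream (simplified).
import math

def required_window(p_read, confidence=0.95):
    """Smallest number of epochs w with P(at least one read) >= confidence,
    assuming independent epochs with per-epoch read probability p_read."""
    if p_read <= 0.0:
        return float("inf")
    if p_read >= 1.0:
        return 1
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_read))

def clean_stream(epoch_reads, confidence=0.95, min_window=2, max_window=20):
    """epoch_reads: list of booleans, True if the tag was read in that epoch.
    Yields (epoch, presumed_present) using a window sized from the read rate."""
    window = min_window
    for i, _ in enumerate(epoch_reads):
        recent = epoch_reads[max(0, i - window + 1): i + 1]
        p_read = sum(recent) / len(recent)            # sample estimate of read rate
        # Grow the window when reads are sparse, shrink it when they are reliable.
        window = max(min_window,
                     min(required_window(max(p_read, 0.05), confidence), max_window))
        yield i, any(recent)

if __name__ == "__main__":
    noisy = [True, False, True, True, False, False, True, False, False, False]
    print(list(clean_stream(noisy)))
```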