    When Things Matter: A Data-Centric View of the Internet of Things

    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. While IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers around managing IoT data, which is typically produced in dynamic and volatile environments and is not only extremely large in scale and volume but also noisy and continuous. This article surveys the main techniques and state-of-the-art research efforts in IoT from data-centric perspectives, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
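
    As a rough illustration of the data stream processing techniques this survey covers (the example is not drawn from the article itself), the sketch below smooths a noisy, continuous stream of sensor readings with a sliding-window average; the class name and the sample readings are hypothetical.

```python
from collections import deque

class SlidingWindowAverage:
    """Minimal sliding-window smoother for a continuous, noisy sensor stream."""

    def __init__(self, window_size: int = 5):
        self.window = deque(maxlen=window_size)  # keeps only the most recent readings

    def push(self, value: float) -> float:
        """Add one raw reading and return the current smoothed estimate."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

# Hypothetical noisy temperature readings arriving one by one; 35.0 is a spurious spike.
stream = [21.0, 21.4, 35.0, 21.2, 20.9, 21.1]
smoother = SlidingWindowAverage(window_size=4)
for reading in stream:
    print(f"raw={reading:5.1f}  smoothed={smoother.push(reading):5.2f}")
```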

    Capturing Data Uncertainty in High-Volume Stream Processing

    We present the design and development of a data stream system that captures data uncertainty from data collection to query processing to final result generation. Our system focuses on data that is naturally modeled as continuous random variables. For such data, our system employs an approach grounded in probability and statistical theory to capture data uncertainty and integrates this approach into high-volume stream processing. The first component of our system captures the uncertainty of raw data streams from sensing devices. Since such raw streams can be highly noisy and may not carry sufficient information for query processing, our system employs probabilistic models of the data generation process and stream-speed inference to transform raw data into a desired format with an uncertainty metric. The second component captures uncertainty as data propagates through query operators. To efficiently quantify the result uncertainty of a query operator, we explore a variety of techniques based on probability and statistical theory to compute the result distribution at stream speed. We are currently working with a group of scientists to evaluate our system using traces collected from the domains of (and eventually in the real systems for) hazardous weather monitoring and object tracking and monitoring.
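
    As a minimal sketch of the second component's idea, propagating uncertainty through a query operator, the code below assumes attributes modelled as independent Gaussian random variables, so a SUM aggregate is again Gaussian with summed means and variances. The actual system's models and operators are considerably richer; all names and readings here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GaussianValue:
    """A sensor attribute modelled as a Gaussian random variable."""
    mean: float
    var: float

def sum_operator(values):
    """Propagate uncertainty through a SUM aggregate.

    Assuming the inputs are independent Gaussians, the sum is again Gaussian
    with mean = sum of means and variance = sum of variances.
    """
    return GaussianValue(
        mean=sum(v.mean for v in values),
        var=sum(v.var for v in values),
    )

# Hypothetical rainfall readings (mm) with per-reading uncertainty.
readings = [GaussianValue(2.1, 0.04), GaussianValue(1.8, 0.09), GaussianValue(2.4, 0.01)]
total = sum_operator(readings)
print(f"total ~ N(mean={total.mean:.2f}, var={total.var:.2f})")
```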

    A framework for distributed managing uncertain data in RFID traceability networks

    The ability to track and trace individual items, especially through large-scale and distributed networks, is the key to realizing many important business applications such as supply chain management, asset tracking, and counterfeit detection. Networked RFID (radio frequency identification), which uses the Internet to connect otherwise isolated RFID systems and software, is an emerging technology to support traceability applications. Despite its promising benefits, there remain many challenges to be overcome before these benefits can be realized. One significant challenge centers around dealing with the uncertainty of raw RFID data. In this paper, we propose a novel framework to effectively manage the uncertainty of RFID data in large-scale traceability networks. The framework consists of a global object tracking model and a local RFID data cleaning model. In particular, we propose a Markov-based model for tracking objects globally and a particle-filter-based approach for processing noisy, low-level RFID data locally. Our implementation validates the proposed approach, and the experimental results show its effectiveness.
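
    The toy particle filter below illustrates the general flavour of cleaning noisy, low-level RFID reads (predict, weight by the observation, resample); it is not the paper's model, and the zone layout, probabilities, and reads are invented for illustration.

```python
import random

# Hypothetical setup: an object moves through a line of reader zones 0..N-1,
# advancing one zone per step with probability P_MOVE; a read reports the true
# zone with probability P_DETECT and some other zone otherwise.
N_ZONES = 10
P_MOVE = 0.8
P_DETECT = 0.7
N_PARTICLES = 500

def predict(particles):
    """Motion model: each particle advances one zone with probability P_MOVE."""
    return [min(p + 1, N_ZONES - 1) if random.random() < P_MOVE else p for p in particles]

def weight(particle, observed_zone):
    """Observation model: likelihood of the observed read given the particle's zone."""
    return P_DETECT if particle == observed_zone else (1 - P_DETECT) / (N_ZONES - 1)

def resample(particles, weights):
    """Draw a new particle set in proportion to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

particles = [0] * N_PARTICLES            # object known to start in zone 0
noisy_reads = [0, 1, 1, 5, 3, 4, 4, 5]   # hypothetical reads; '5' at step 4 is spurious

for obs in noisy_reads:
    particles = predict(particles)
    weights = [weight(p, obs) for p in particles]
    particles = resample(particles, weights)
    estimate = max(set(particles), key=particles.count)  # most likely zone
    print(f"read={obs}  estimated zone={estimate}")
```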

    When things matter: A survey on data-centric Internet of Things

    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, but several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers around managing IoT data, typically produced in dynamic and volatile environments, which is not only extremely large in scale and volume, but also noisy and continuous. This paper reviews the main techniques and state-of-the-art research efforts in IoT from data-centric perspectives, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
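
    As a rough illustration of complex event processing, one of the techniques reviewed here (not an example taken from the paper), the sketch below derives a composite 'overheat' event from a stream of primitive temperature readings; device identifiers, the threshold, and the run length are hypothetical.

```python
from typing import Iterable, Iterator, Tuple

def detect_overheat(events: Iterable[Tuple[str, float]], threshold: float = 30.0,
                    run_length: int = 3) -> Iterator[str]:
    """Emit a composite 'Overheat' event when run_length consecutive readings
    from the same device exceed the threshold (a toy CEP pattern)."""
    runs = {}  # device id -> current count of consecutive high readings
    for device, temp in events:
        if temp > threshold:
            runs[device] = runs.get(device, 0) + 1
            if runs[device] == run_length:
                yield f"Overheat({device})"
        else:
            runs[device] = 0

# Hypothetical event stream of (device id, temperature) pairs.
stream = [("d1", 29.5), ("d1", 31.0), ("d1", 31.2), ("d2", 25.0), ("d1", 32.1), ("d1", 28.0)]
for alert in detect_overheat(stream):
    print(alert)
```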

    Context Aware Computing for The Internet of Things: A Survey

    As we move towards the Internet of Things (IoT), the number of sensors deployed around the world is growing at a rapid pace. Market research has shown significant growth of sensor deployments over the past decade and has predicted a significant increase in the growth rate in the future. These sensors continuously generate enormous amounts of data. However, in order to add value to raw sensor data we need to understand it. Collection, modelling, reasoning, and distribution of context in relation to sensor data play a critical role in this challenge. Context-aware computing has proven to be successful in understanding sensor data. In this paper, we survey context awareness from an IoT perspective. We present the necessary background by introducing the IoT paradigm and context-aware fundamentals at the beginning. Then we provide an in-depth analysis of the context life cycle. We evaluate a subset of 50 projects, which represent the majority of research and commercial solutions proposed in the field of context-aware computing over the last decade (2001-2011), based on our own taxonomy. Finally, based on our evaluation, we highlight the lessons to be learnt from the past and some possible directions for future research. The survey addresses a broad range of techniques, methods, models, functionalities, systems, applications, and middleware solutions related to context awareness and IoT. Our goal is not only to analyse, compare and consolidate past research work but also to appreciate their findings and discuss their applicability towards the IoT.
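
    The sketch below walks through the context life cycle analysed in the survey (acquisition, modelling, reasoning, distribution) in a deliberately toy form; the stages, rules, and names are invented for illustration and are far simpler than real context-aware middleware.

```python
def acquire():
    """Acquisition: pull raw values from (simulated) sensors."""
    return {"temperature_c": 27.5, "motion": True}

def model(raw):
    """Modelling: annotate raw values as key-value context attributes."""
    return {"room.temperature": raw["temperature_c"], "room.occupied": raw["motion"]}

def reason(context):
    """Reasoning: derive higher-level context with a simple rule."""
    context["room.comfortable"] = context["room.occupied"] and context["room.temperature"] < 26
    return context

def distribute(context, subscribers):
    """Distribution: push derived context to interested consumers."""
    for callback in subscribers:
        callback(context)

subscribers = [lambda ctx: print("HVAC controller received:", ctx)]
distribute(reason(model(acquire())), subscribers)
```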

    Implicit Study of Techniques and Tools for Data Analysis of Complex Sensory Data

    The utility and contribution of Wireless Sensor Network (WSN) applications have been experienced by users for more than a decade. However, with the passage of time, it has been found that data generation in WSNs has grown massively. Small sensors with limited battery life and minimal computational capability cannot efficiently process such a massive stream of complex data. Although various types of mining techniques are practiced today, such tools and techniques cannot be used efficiently to analyze this complex and rapidly growing data. This paper therefore discusses the generation of large data volumes and the issues with existing research techniques by reviewing the literature and frequently used tools. The study finally outlines the significant research gap that calls for data-analytics tools capable of extracting knowledge from complex sensory data.

    Scalable Statistical Modeling and Query Processing over Large Scale Uncertain Databases

    The past decade has witnessed a large number of novel applications that generate imprecise, uncertain and incomplete data. Examples include monitoring infrastructures such as RFIDs, sensor networks and web-based applications such as information extraction, data integration, social networking and so on. In my dissertation, I addressed several challenges in managing such data and developed algorithms for efficiently executing queries over large volumes of such data. Specifically, I focused on the following challenges. First, for meaningful analysis of such data, we need the ability to remove noise and infer useful information from uncertain data. To address this challenge, I first developed a declarative system for applying dynamic probabilistic models to databases and data streams. The output of such probabilistic modeling is probabilistic data, i.e., data annotated with probabilities of correctness/existence. Often, the data also exhibits strong correlations. Although there is prior work in managing and querying such probabilistic data using probabilistic databases, those approaches largely assume independence and cannot handle probabilistic data with rich correlation structures. Hence, I built a probabilistic database system that can manage large-scale correlations and developed algorithms for efficient query evaluation. Our system allows users to provide uncertain data as input and to specify arbitrary correlations among the entries in the database. In the back end, we represent correlations as a forest of junction trees, an alternative representation for probabilistic graphical models (PGM). We execute queries over the probabilistic database by transforming them into message passing algorithms (inference) over the junction tree. However, traditional algorithms over junction trees typically require accessing the entire tree, even for small queries. Hence, I developed an index data structure over the junction tree called INDSEP that allows us to circumvent this process and thereby scalably evaluate inference queries, aggregation queries and SQL queries over the probabilistic database. Finally, query evaluation in probabilistic databases typically returns output tuples along with their probability values. However, the existing query evaluation model provides very little intuition to the users: for instance, a user might want to know "Why is this tuple in my result?", "Why does this output tuple have such high probability?", or "Which are the most influential input tuples for my query?" Hence, I designed a query evaluation model, and a suite of algorithms, that provide users with explanations for query results, and enable users to perform sensitivity analysis to better understand the query results.
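
    As a much-simplified sketch of query evaluation over uncertain data, the code below joins two relations whose tuples carry existence probabilities, assuming tuple independence; the system described above goes well beyond this (correlated tuples, junction trees, the INDSEP index), and the relations and values shown are hypothetical.

```python
from itertools import product

# Each tuple carries an existence probability.
sightings = [  # (tag_id, location, prob)
    ("tag42", "dock_A", 0.9),
    ("tag42", "dock_B", 0.3),
]
shipments = [  # (location, destination, prob)
    ("dock_A", "warehouse_1", 0.8),
    ("dock_B", "warehouse_2", 0.95),
]

def probabilistic_join(left, right):
    """Join on location; under tuple independence, the output probability is the product."""
    for (tag, loc1, p1), (loc2, dest, p2) in product(left, right):
        if loc1 == loc2:
            yield (tag, dest, p1 * p2)

for tag, dest, prob in probabilistic_join(sightings, shipments):
    print(f"{tag} headed to {dest} with probability {prob:.2f}")
```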

    Applications of wireless sensor networks in pharmaceutical industry

    Advances in wireless sensor networking have opened up new opportunities in healthcare systems. The future will see the integration of the abundance of existing specialized medical technology with pervasive, wireless networks. Radio frequency identification (RFID) and Wireless Sensor Networks (WSN) are the two key elements of pervasive computing and are considered interrelated technologies. Although RFID has been used in various areas, it lacks intelligence, that is, the ability to process information and respond to real-world events. Large-scale WSNs are being used to monitor environmental status in real time. RFID technology, if combined with other sensors, may enable a range of other applications that can exponentially increase visibility and monitoring. Combined with RFID, a general sensor can be upgraded to an intelligent wireless sensor (smart node) that integrates sensing, computation, and communication into a single small device, for example using Field Programmable Gate Arrays (FPGAs). With dazzling wireless technology now available, it is tempting for manufacturers to snatch up any wireless sensor that comes along as a means of optimizing processes and plant performance. This is especially true within the pharmaceutical industry, where vendors are plying industrial-strength wireless sensors for temperature, humidity and pressure, as well as sensitive process-monitoring wireless devices to support PAT applications. In this paper, we survey the existing wireless sensor and RFID-based technologies that target healthcare applications.
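
    As a rough illustration (not drawn from the paper), the sketch below combines an RFID identity with co-located temperature and humidity readings on a smart node and flags excursions of the kind a pharmaceutical cold-chain monitor might check; the thresholds and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SmartNodeReading:
    rfid_tag: str         # identity of the tagged item or pallet
    temperature_c: float  # ambient temperature at the node
    humidity_pct: float   # relative humidity at the node

def check_excursion(reading: SmartNodeReading,
                    temp_range=(2.0, 8.0), max_humidity=60.0):
    """Flag readings outside the assumed storage conditions."""
    alerts = []
    if not (temp_range[0] <= reading.temperature_c <= temp_range[1]):
        alerts.append(f"{reading.rfid_tag}: temperature {reading.temperature_c} C out of range")
    if reading.humidity_pct > max_humidity:
        alerts.append(f"{reading.rfid_tag}: humidity {reading.humidity_pct}% too high")
    return alerts

for r in [SmartNodeReading("PALLET-001", 5.4, 45.0), SmartNodeReading("PALLET-002", 9.1, 72.0)]:
    for alert in check_excursion(r):
        print(alert)
```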