36,291 research outputs found

    Information Agents for Pervasive Sensor Networks

    In this paper, we describe an information agent that resides on a mobile computer or personal digital assistant (PDA) and can autonomously acquire sensor readings from pervasive sensor networks (deciding when, and from which sensor, to acquire readings at any time). Moreover, it can perform a range of information processing tasks including modelling the accuracy of the sensor readings, predicting the value of missing sensor readings, and predicting how the monitored environmental parameters will evolve into the future. Our motivating scenario is the need to provide situational awareness support to first responders at the scene of a large-scale incident, and we describe how we use an iterative formulation of a multi-output Gaussian process to build a probabilistic model of the environmental parameters being measured by local sensors, and the correlations and delays that exist between them. We validate our approach using data collected from a network of weather sensors located on the south coast of England.
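
    As a rough illustration of the missing-reading prediction described above, the following sketch fits a single-output Gaussian process to a handful of synthetic temperature readings with scikit-learn and interpolates the gaps; this is not the paper's iterative multi-output formulation, and the timestamps, values, and kernel settings are assumptions made purely for illustration.

```python
# A minimal single-output sketch of GP-based interpolation of missing sensor
# readings, using scikit-learn rather than the paper's iterative multi-output
# formulation; all timestamps and temperature values below are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Observed readings: timestamps (hours) and, e.g., air temperature (deg C).
t_obs = np.array([0.0, 1.0, 2.0, 4.0, 5.0, 7.0, 8.0]).reshape(-1, 1)
y_obs = np.array([12.1, 12.4, 13.0, 14.2, 14.0, 12.8, 12.3])

# An RBF kernel captures smooth temporal correlation; WhiteKernel models sensor noise.
kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_obs, y_obs)

# Predict the missing readings (t = 3 h and 6 h) together with uncertainty estimates.
t_missing = np.array([[3.0], [6.0]])
mean, std = gp.predict(t_missing, return_std=True)
for t, m, s in zip(t_missing.ravel(), mean, std):
    print(f"t={t:.0f} h: predicted {m:.2f} +/- {2 * s:.2f} (approx. 95% interval)")
```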

    Towards Real-Time Information Processing of Sensor Network Data using Computationally Efficient Multi-output Gaussian Processes

    In this paper, we describe a novel, computationally efficient algorithm that facilitates the autonomous acquisition of readings from sensor networks (deciding when, and from which sensor, to acquire readings at any time), and which can, with minimal domain knowledge, perform a range of information processing tasks including modelling the accuracy of the sensor readings, predicting the value of missing sensor readings, and predicting how the monitored environmental variables will evolve into the future. Our motivating scenario is the need to provide situational awareness support to first responders at the scene of a large-scale incident, and to this end, we describe a novel iterative formulation of a multi-output Gaussian process that can build and exploit a probabilistic model of the environmental variables being measured (including the correlations and delays that exist between them). We validate our approach using data collected from a network of weather sensors located on the south coast of England.
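
    The "deciding when and which sensor to acquire readings from" step can be pictured with a simple uncertainty-sampling rule: query the sensor whose prediction at the current time is most uncertain. The sketch below, with synthetic histories and independent per-sensor GPs, is a simplified stand-in for the paper's value-of-information criterion, not its multi-output algorithm.

```python
# A minimal sketch of the "which sensor to read next" decision using plain
# uncertainty sampling: query the sensor whose GP prediction at the current
# time is most uncertain. Independent per-sensor GPs and the synthetic
# histories below are simplifying assumptions, not the paper's multi-output model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic history per sensor: (timestamps in hours, readings).
history = {
    "sensor_A": (np.array([0.0, 2.0, 4.0, 6.0]), np.array([12.0, 13.1, 14.0, 13.2])),
    "sensor_B": (np.array([0.0, 1.0, 2.0, 3.0]), np.array([11.8, 12.0, 12.5, 12.9])),
}
t_now = np.array([[7.0]])  # current decision time

def predictive_std(t_obs, y_obs, t_query):
    """Fit a GP to one sensor's history and return the predictive std at t_query."""
    kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(t_obs.reshape(-1, 1), y_obs)
    _, std = gp.predict(t_query, return_std=True)
    return std[0]

# Acquire the next reading from the sensor we are currently most uncertain about.
stds = {name: predictive_std(t, y, t_now) for name, (t, y) in history.items()}
next_sensor = max(stds, key=stds.get)
print(f"query {next_sensor} (predictive std {stds[next_sensor]:.3f})")
```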

    Decentralised Control of Adaptive Sampling in Wireless Sensor Networks

    The efficient allocation of the limited energy resources of a wireless sensor network in a way that maximises the information value of the data collected is a significant research challenge. Within this context, this paper concentrates on adaptive sampling as a means of focusing a sensor’s energy consumption on obtaining the most important data. Specifically, we develop a principled information metric based upon Fisher information and Gaussian process regression that allows the information content of a sensor’s observations to be expressed. We then use this metric to derive three novel decentralised control algorithms for information-based adaptive sampling which represent a trade-off between computational cost and optimality. These algorithms are evaluated in the context of a deployed sensor network in the domain of flood monitoring. The most computationally efficient of the three is shown to increase the value of information gathered by approximately 83%, 27%, and 8% per day compared to benchmarks that sample in a naive non-adaptive manner, in a uniform non-adaptive manner, and using a state-of-the-art adaptive sampling heuristic (USAC), respectively. Moreover, our algorithm collects information whose total value is approximately 75% of that of the optimal solution (which requires an exponential, and thus impractical, amount of time to compute).
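
    As a rough picture of budget-constrained adaptive sampling, the sketch below greedily schedules the time slots with the largest Gaussian-process predictive variance until a daily energy budget is exhausted. Plain variance reduction stands in here for the paper's Fisher-information-based metric, and the slot grid, budget, and kernel hyperparameters are assumed rather than learned.

```python
# A greedy sketch of budget-constrained adaptive sampling: under a budget of
# `budget` readings per day, repeatedly schedule the time slot with the largest
# GP predictive variance. Plain variance reduction stands in for the paper's
# Fisher-information metric; the slot grid, budget, and kernel are assumed.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

slots = np.linspace(0.0, 24.0, 49).reshape(-1, 1)  # candidate half-hourly sample times
budget = 6                                          # readings the energy budget allows per day
kernel = 1.0 * RBF(length_scale=3.0) + WhiteKernel(noise_level=0.05)

chosen_t, chosen_y = [], []
for _ in range(budget):
    if chosen_t:
        # Condition on the samples scheduled so far (fixed kernel, no hyperparameter refitting).
        gp = GaussianProcessRegressor(kernel=kernel, optimizer=None)
        gp.fit(np.array(chosen_t).reshape(-1, 1), np.array(chosen_y))
        _, std = gp.predict(slots, return_std=True)
    else:
        std = np.full(len(slots), np.inf)  # nothing scheduled yet: every slot equally uncertain
    t_next = float(slots[np.argmax(std), 0])
    chosen_t.append(t_next)
    chosen_y.append(0.0)  # placeholder value; with a fixed kernel the variance depends only on sample times

print("scheduled sample times (h):", sorted(round(t, 1) for t in chosen_t))
```

    In the decentralised setting of the paper, each node would run a rule of this kind locally; the sketch above ignores inter-node coordination entirely.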

    Big Data and Reliability Applications: The Complexity Dimension

    Big data features not only large volumes of data but also data with complicated structures. Complexity imposes unique challenges in big data analytics. Meeker and Hong (2014, Quality Engineering, pp. 102-116) provided an extensive discussion of the opportunities and challenges in big data and reliability, and described engineering systems that can generate big data that can be used in reliability analysis. Meeker and Hong (2014) focused on large-scale system operating and environment data (i.e., high-frequency multivariate time series data), and provided examples of how to link such data as covariates to traditional reliability responses such as time to failure, time to recurrence of events, and degradation measurements. This paper intends to extend that discussion by focusing on how to use data with complicated structures to do reliability analysis. Such data types include high-dimensional sensor data, functional curve data, and image streams. We first provide a review of recent developments in those directions, and then we discuss how analytical methods can be developed to tackle the challenging aspects that arise from the complexity feature of big data in reliability applications. The use of modern statistical methods such as variable selection, functional data analysis, scalar-on-image regression, spatio-temporal data models, and machine learning techniques will also be discussed.
    Comment: 28 pages, 7 figures
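
    One of the ingredients mentioned above, variable selection over high-dimensional sensor covariates, can be pictured with a small synthetic example: a lasso picks out the few covariates that actually drive a degradation measurement. The data, dimensions, and the choice of the lasso are illustrative assumptions, not methods or results from the paper.

```python
# A small synthetic illustration of variable selection over high-dimensional
# sensor covariates: a lasso recovers the few covariates that actually drive a
# degradation measurement. The data, dimensions, and choice of the lasso are
# illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_units, n_covariates = 200, 50              # units under test, sensor-derived covariates
X = rng.normal(size=(n_units, n_covariates))

true_coef = np.zeros(n_covariates)
true_coef[[3, 17, 42]] = [0.8, -0.5, 0.3]    # only three covariates actually influence wear
degradation = X @ true_coef + 0.1 * rng.normal(size=n_units)

model = LassoCV(cv=5).fit(X, degradation)
selected = np.flatnonzero(np.abs(model.coef_) > 1e-3)
print("covariates selected as influential:", selected)
```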

    Self-Calibration Methods for Uncontrolled Environments in Sensor Networks: A Reference Survey

    Growing progress in sensor technology has constantly expanded the number and range of low-cost, small, and portable sensors on the market, increasing the number and type of physical phenomena that can be measured with wirelessly connected sensors. Large-scale deployments of wireless sensor networks (WSN) involving hundreds or thousands of devices and limited budgets often constrain the choice of sensing hardware, which generally has reduced accuracy, precision, and reliability. Therefore, it is challenging to achieve good data quality and maintain error-free measurements during the whole system lifetime. Self-calibration or recalibration in ad hoc sensor networks to preserve data quality is essential, yet challenging, for several reasons, such as the existence of random noise and the absence of suitable general models. Calibration performed in the field, without accurate and controlled instrumentation, is said to be in an uncontrolled environment. This paper surveys current and fundamental self-calibration approaches and models for wireless sensor networks in uncontrolled environments.
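
    A minimal instance of the self-calibration problem this survey addresses is fitting a low-cost sensor's unknown gain and offset against a co-located reference (or network consensus) signal by least squares. The sketch below uses synthetic readings; real field calibration must also cope with drift, noise structure, and scarce references that this toy version ignores.

```python
# A minimal sketch of one self-calibration scheme covered by such surveys:
# estimate a low-cost sensor's unknown gain and offset by least squares against
# a co-located reference (or network consensus) signal. All readings below are
# synthetic; real field calibration must also handle drift and sparse references.
import numpy as np

rng = np.random.default_rng(1)
reference = rng.uniform(10.0, 35.0, size=100)              # trusted co-located / consensus values
raw = 1.15 * reference - 2.0 + rng.normal(0, 0.3, 100)     # miscalibrated sensor: unknown gain and offset

# Solve raw ~ gain * reference + offset in the least-squares sense.
A = np.column_stack([reference, np.ones_like(reference)])
(gain, offset), *_ = np.linalg.lstsq(A, raw, rcond=None)

calibrated = (raw - offset) / gain                          # invert the fitted response

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

print(f"estimated gain={gain:.3f}, offset={offset:.3f}")
print(f"RMSE before: {rmse(raw, reference):.2f}, after: {rmse(calibrated, reference):.2f}")
```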

    Oil Spill Detection Analyzing “Sentinel 2” Satellite Images: A Persian Gulf Case Study

    Oil spills near exploitation areas and oil-loading ports are often linked to governments' ambitions to gain a larger oil market share and to negligence during loading onto large tankers or ships. The present study investigates one oil spill event in the Al Khafji zone (between Kuwait and Saudi Arabia) using multi-sensor satellite images. Oil slicks over the Persian Gulf have been characterized with multi-sensor satellite images and then analyzed in order to detect and classify oil spills in this zone. Sentinel 2 images pinpointing oil spill zones have been selected; ENVI software has been used for analysing the satellite images and ADIOS (Automated Data Inquiry for Oil Spills) for oil weathering modelling. The results obtained for the Al Khafji zone show that the oil spill moves towards the coastline, its surface first increasing and then decreasing until it reaches the shore.
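
    As a toy picture of the thresholding step common to optical slick-mapping workflows, the sketch below flags pixels whose reflectance deviates strongly from the surrounding sea background in a synthetic scene. The specific bands, thresholds, and the ENVI/ADIOS processing chain actually used in the paper are not reproduced here.

```python
# A toy picture of the thresholding step common to optical slick-mapping
# workflows: flag pixels whose reflectance deviates strongly from the
# surrounding sea background. The scene is synthetic; the bands, thresholds,
# and the ENVI/ADIOS processing chain used in the paper are not reproduced.
import numpy as np

rng = np.random.default_rng(2)
reflectance = rng.normal(0.05, 0.005, size=(200, 200))   # open-water background
reflectance[80:120, 90:140] += 0.03                       # a slick-like anomaly patch

# Z-score each pixel against scene-wide sea statistics and threshold.
z = (reflectance - reflectance.mean()) / reflectance.std()
slick_mask = z > 2.5

pixel_area_m2 = 10 * 10                                   # Sentinel-2 10 m bands
area_km2 = slick_mask.sum() * pixel_area_m2 / 1e6
print(f"flagged pixels: {slick_mask.sum()}, approx. slick area: {area_km2:.2f} km^2")
```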