    Towards an Efficient, Scalable Stream Query Operator Framework for Representing and Analyzing Continuous Fields

    Advancements in sensor technology have made it less expensive to deploy massive numbers of sensors to observe continuous geographic phenomena at high sample rates and stream live sensor observations. This has raised new challenges, since sensor streams have pushed the limits of traditional geo-sensor data management technology. Data Stream Engines (DSEs) provide facilities for near real-time processing of streams; however, algorithms for representing and analyzing Spatio-Temporal (ST) phenomena are limited. This dissertation investigates near real-time representation and analysis of continuous ST phenomena, observed by large numbers of mobile, asynchronously sampling sensors, using a DSE, and proposes two novel stream query operator frameworks. First, the ST Interpolation Stream Query Operator Framework (STI-SQO framework) continuously transforms sensor streams into rasters using a novel set of stream query operators that perform ST-IDW interpolation. A key component of the STI-SQO framework is the 3D, main-memory-based ST Grid Index, which enables high-performance ST insertion and deletion of massive numbers of sensor observations through Isotropic Time Cell and Time Block-based partitioning. The ST Grid Index facilitates fast ST search for samples using ST shell-based neighborhood search templates, namely the Cylindrical Shell Template and the Nested Shell Template. Furthermore, the framework contains the stream-based ST-IDW algorithms ST Shell and ST ak-Shell for high-performance, parallel grid cell interpolation. Second, the proposed ST Predicate Stream Query Operator Framework (STP-SQO framework) efficiently evaluates value predicates over streams of ST continuous phenomena. The framework contains several stream-based predicate evaluation algorithms, including Region-Growing, Tile-based, and Phenomenon-Aware algorithms, that target predicate evaluation to regions with seed points and minimize the number of raster cells that are interpolated when evaluating value predicates. The performance of the proposed frameworks was assessed with regard to the prediction accuracy of output results and runtime. The STI-SQO framework achieved a processing throughput of 250,000 observations in 2.5 s with a Normalized Root Mean Square Error under 0.19 using a 500×500 grid. The STP-SQO framework processed over 250,000 observations in under 0.25 s for predicate results covering less than 40% of the observation area, and the Scan Line Region Growing algorithm was consistently the fastest algorithm tested.
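
    The ST-IDW interpolation at the heart of the STI-SQO framework generalizes inverse distance weighting with a temporal distance term. Below is a minimal sketch of that general idea, assuming a simple combined space-time distance with spatial and temporal cutoffs; the function, parameter names, and metric are illustrative and are not the dissertation's operators or its ST Grid Index.

```python
# Minimal sketch of space-time inverse distance weighting (ST-IDW).
# Illustrative assumptions: the combined space-time distance, the cutoff
# radii, and the power parameter are not taken from the dissertation.
import math

def st_idw(cell_xy, t_query, observations, power=2.0,
           max_radius=500.0, max_age=60.0, time_scale=10.0):
    """Estimate the value at grid cell `cell_xy` and time `t_query`
    from streaming observations given as (x, y, t, value) tuples."""
    num = den = 0.0
    for x, y, t, value in observations:
        age = t_query - t
        if age < 0 or age > max_age:
            continue                      # outside the temporal window
        d_space = math.hypot(x - cell_xy[0], y - cell_xy[1])
        if d_space > max_radius:
            continue                      # outside the spatial neighborhood
        # time_scale converts seconds into distance units so that the
        # spatial and temporal components are comparable.
        d_st = math.hypot(d_space, time_scale * age)
        if d_st == 0.0:
            return value                  # exact hit: return the sample itself
        w = 1.0 / d_st ** power
        num += w * value
        den += w
    return num / den if den > 0.0 else None

# Example: three mobile-sensor readings, one grid cell, query time t = 12 s.
obs = [(10.0, 20.0, 10.0, 21.5), (40.0, 25.0, 11.0, 22.1), (15.0, 60.0, 8.0, 20.9)]
print(st_idw((20.0, 30.0), 12.0, obs))
```

    In the framework itself, the shell-based neighborhood search templates serve to limit which observations are considered for each cell, which is what makes this per-cell computation feasible at streaming rates.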

    CAREER: Data Management for Ad-Hoc Geosensor Networks

    This project explores data management methods for geosensor networks, i.e., large collections of very small, battery-driven sensor nodes deployed in the geographic environment that measure the temporal and spatial variations of physical quantities such as temperature or ozone levels. An important task of such geosensor networks is to collect, analyze, and estimate information about continuous phenomena under observation, such as a toxic cloud close to a chemical plant, in real time and in an energy-efficient way. The main thrust of this project is the integration of spatial data analysis techniques with in-network data query execution in sensor networks. The project investigates novel algorithms such as incremental, in-network kriging, which redefines a traditional, highly computationally intensive spatial data estimation method for distributed, collaborative, and incremental processing between tiny, energy- and bandwidth-constrained sensor nodes. This work includes the modeling of location and sensing characteristics of sensor devices with regard to observed phenomena, the support of temporal-spatial estimation queries, and a focus on in-network data aggregation algorithms for complex spatial estimation queries. Combining high-level data query interfaces with advanced spatial analysis methods will allow domain scientists to use sensor networks effectively in environmental observation. The project has a broad impact on the community, involving undergraduate and graduate students in spatial database research at the University of Maine, as well as being a key component of a current IGERT program in the areas of sensor materials, sensor devices and sensor. More information about this project, along with publications, simulation software, and empirical studies, is available on the project's web site (http://www.spatial.maine.edu/~nittel/career/).
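
    As a rough illustration of the in-network aggregation pattern the project builds on, the sketch below shows how partial sums for a simple distance-weighted estimate can be merged hop by hop toward the sink, so raw samples never have to travel the whole network. This is an assumed simplification for illustration; it is not the project's incremental in-network kriging algorithm, and the node coordinates and readings are made up.

```python
# Sketch of incremental in-network aggregation (NOT the project's kriging):
# each node contributes partial inverse-distance-weighted sums, relays merge
# the partial aggregates, and only the sink finishes the estimate.
import math

def local_partial(node_xy, reading, query_xy, power=2.0):
    """Partial IDW sums contributed by a single sensor node."""
    d = math.dist(node_xy, query_xy)
    w = 1.0 / d ** power if d > 0 else float("inf")
    return {"wsum": w * reading, "w": w}

def merge(agg_a, agg_b):
    """Combine two partial aggregates (executed in-network at a relay)."""
    return {"wsum": agg_a["wsum"] + agg_b["wsum"], "w": agg_a["w"] + agg_b["w"]}

def finish(agg):
    """Executed at the sink: turn the merged sums into the estimate."""
    return agg["wsum"] / agg["w"]

# Example routing tree: two leaves report to a relay, the relay to the sink.
q = (50.0, 50.0)                              # query location
leaf_a = local_partial((10.0, 10.0), 18.2, q)
leaf_b = local_partial((90.0, 40.0), 19.1, q)
relay = merge(local_partial((60.0, 60.0), 18.8, q), merge(leaf_a, leaf_b))
print(finish(relay))                          # sink-side estimate at q
```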

    AN INTEGRATED APPROACH FOR POLLUTION MONITORING: SMART ACQUIREMENT AND SMART INFORMATION

    Air quality is a factor of primary importance for quality of life. An increase in the percentage of pollutants in the air can cause serious problems for human and environmental health. For this reason, it is essential to monitor pollutant values in order to prevent the consequences of excessive concentrations, to reduce pollution production, or to avoid contact with the highest pollutant concentrations through the available tools. Some recently developed tools for monitoring and sharing data in an effective system make it possible to manage the information in a smart way, in order to improve knowledge of the problem and, consequently, to take preventive measures in favour of urban air quality and human health. In this paper, the authors describe an innovative solution that combines geomatics sensors (GNSS) and pollutant measurement sensors into a low-cost sensor for the acquisition of dynamic pollutant data using a bicycle-based mobile platform. The acquired data can be analysed to evaluate the local distribution of pollutant density and shared through web platforms that use standard protocols for effective smart use.
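
    A basic building block of such a platform is pairing each pollutant reading with a GNSS fix so that samples become georeferenced before they are analysed or published. The sketch below shows one plausible way to do this by nearest timestamp; the field names, tolerance, and output layout are assumptions, not the system described in the paper.

```python
# Hypothetical sketch: join pollutant readings to the nearest GNSS fix in
# time to produce georeferenced samples. Field names and the 2-second
# tolerance are illustrative assumptions, not the paper's design.
def georeference(gnss_fixes, readings, max_dt=2.0):
    """gnss_fixes: [(t, lat, lon)]; readings: [(t, pm25)] -> joined samples."""
    fixes = sorted(gnss_fixes)
    samples = []
    for t, pm25 in sorted(readings):
        nearest = min(fixes, key=lambda fix: abs(fix[0] - t))
        if abs(nearest[0] - t) <= max_dt:      # drop readings with no close fix
            samples.append({"time": t, "lat": nearest[1],
                            "lon": nearest[2], "pm25": pm25})
    return samples

print(georeference([(0.0, 45.062, 7.662), (5.0, 45.063, 7.664)],
                   [(1.2, 14.0), (4.8, 17.5)]))
```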

    4Sensing - decentralized processing for participatory sensing data

    Work presented within the scope of the Master's degree in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering. Participatory sensing is a new application paradigm, stemming from both technical and social drives, which is currently gaining momentum as a research domain. It leverages the growing adoption of mobile phones equipped with sensors, such as camera, GPS, and accelerometer, enabling users to collect and aggregate data covering a wide area without incurring the costs associated with a large-scale sensor network. Related research in participatory sensing usually proposes an architecture based on a centralized back-end. Centralized solutions raise a set of issues. On the one hand, there are the implications of having a centralized repository hosting privacy-sensitive information. On the other hand, this centralized model has financial costs that can discourage grassroots initiatives. This dissertation focuses on the data management aspects of a decentralized infrastructure for the support of participatory sensing applications, leveraging the body of work on participatory sensing and related areas, such as wireless and internet-wide sensor networks, peer-to-peer data management, and stream processing. It proposes a framework covering a common set of data management requirements - from data acquisition, to processing, storage and querying - with the goal of lowering the barrier for the development and deployment of applications. Alternative architectural approaches - RTree, QTree and NTree - are proposed and evaluated experimentally in the context of a case-study application - SpeedSense - supporting the monitoring and prediction of traffic conditions, through the collection of speed and location samples in an urban setting, using GPS-equipped mobile phones.
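
    To make the decentralized processing concrete, the sketch below shows the kind of mergeable per-road-segment aggregate that the SpeedSense case study implies: each node folds its local (segment, speed) samples into running sums and exchanges aggregates with peers rather than raw samples. This is an illustrative assumption about the data model; it does not implement the RTree, QTree, or NTree architectures evaluated in the dissertation.

```python
# Illustrative mergeable aggregate for decentralized traffic monitoring.
# Not 4Sensing's RTree/QTree/NTree: it only shows per-segment running
# sums that nodes can exchange instead of raw GPS samples.
from collections import defaultdict

class SegmentSpeedAggregate:
    def __init__(self):
        # segment_id -> (sample_count, sum_of_speeds_kmh)
        self.stats = defaultdict(lambda: (0, 0.0))

    def add_sample(self, segment_id, speed_kmh):
        n, s = self.stats[segment_id]
        self.stats[segment_id] = (n + 1, s + speed_kmh)

    def merge(self, other):
        """Fold in an aggregate received from another node."""
        for seg, (n, s) in other.stats.items():
            n0, s0 = self.stats[seg]
            self.stats[seg] = (n0 + n, s0 + s)

    def mean_speed(self, segment_id):
        n, s = self.stats[segment_id]
        return s / n if n else None

a, b = SegmentSpeedAggregate(), SegmentSpeedAggregate()
a.add_sample("seg-42", 35.0); a.add_sample("seg-42", 29.0)
b.add_sample("seg-42", 18.0)
a.merge(b)
print(a.mean_speed("seg-42"))   # ~27.3 km/h across both nodes' samples
```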

    REAL-TIME EARTHQUAKE MONITORING WITH SPATIO-TEMPORAL FIELDS

    Spatiotemporal Wireless Sensor Network Field Approximation with Multilayer Perceptron Artificial Neural Network Models

    As sensors become increasingly compact and dependable in natural environments, spatially distributed heterogeneous sensor network systems steadily become more pervasive. However, any environmental monitoring system must account for potential data loss due to a variety of natural and technological causes. Modeling a natural spatial region can be problematic due to spatial nonstationarities in environmental variables, as particular regions may be subject to specific influences at different spatial scales. Relationships between processes within these regions are often ephemeral, so models designed to represent them cannot remain static. Integrating temporal factors into such a model engenders further complexity. This dissertation evaluates the use of multilayer perceptron (MLP) neural network models in the context of sensor networks as a possible solution to many of these problems, given their data-driven nature, their representational flexibility, and their straightforward fitting process. The relative importance of parameters is determined via an adaptive backpropagation training process, which converges to a best-fit model for sensing platforms to validate collected data or approximate missing readings. As conditions evolve over time such that the model can no longer adapt to changes, new models are trained to replace the old. We demonstrate accuracy results for the MLP generally on par with those of spatial kriging, but able to integrate additional physical and temporal parameters, enabling its application to any region with a collection of available data streams. Potential uses of this model might be not only to approximate missing data in the sensor field, but also to flag potentially incorrect, unusual, or atypical data returned by the sensor network. Given the potential for spatial heterogeneity in a monitored phenomenon, this dissertation further explores the benefits of partitioning a space and applying individual MLP models to these partitions. A system of neural models using both spatial and temporal parameters can be envisioned such that a spatiotemporal space partitioned by k-means is modeled by k neural models, with internal weightings varying individually according to the dominant processes within the assigned region of each. Evaluated on simulated and real data on surface currents of the Gulf of Maine, partitioned models show significantly improved results over single global models.
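
    The partitioned-model idea can be sketched as follows: cluster the spatio-temporal inputs with k-means, fit one small MLP per cluster, and route each query point to the model of its partition. The code below is a hedged illustration using scikit-learn on synthetic data; the features, hyperparameters, and library choice are assumptions, not the dissertation's configuration.

```python
# Sketch of k-means partitioning with one MLP regressor per partition.
# Synthetic data and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(600, 3))              # columns: x, y, t
y = np.sin(4.0 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]   # synthetic field values

k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

models = []
for c in range(k):
    mask = km.labels_ == c                             # samples in partition c
    mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
    models.append(mlp.fit(X[mask], y[mask]))           # one MLP per partition

def predict(points):
    """Route each query point to the MLP of its k-means partition."""
    labels = km.predict(points)
    return np.array([models[c].predict(p.reshape(1, -1))[0]
                     for c, p in zip(labels, points)])

print(predict(X[:3]))   # compare against y[:3]
```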

    Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)

    Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate the risk-prone areas through routing models. The most important input to debris flow routing models is the topographic data, usually in the form of Digital Elevation Models (DEMs). The quality of DEMs depends on the accuracy, density, and spatial distribution of the sampled points; on the characteristics of the surface; and on the applied gridding methodology. Therefore, the choice of the interpolation method affects the realistic representation of the channel and fan morphology, and thus potentially the debris flow routing modeling outcomes. In this paper, we initially investigate the performance of common interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor, Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging) in building DEMs of the complex topography of a debris flow channel located in the Venetian Dolomites (North-eastern Italian Alps), using small-footprint full-waveform Light Detection And Ranging (LiDAR) data. The investigation is carried out through a combination of statistical analysis of vertical accuracy, algorithm robustness, and spatial clustering of vertical errors, together with a multi-criteria shape reliability assessment. After that, we examine the influence of the tested interpolation algorithms on the performance of a Geographic Information System (GIS)-based cell model for simulating stony debris flow routing. In detail, we investigate both the correlation between the uncertainty in DEM heights resulting from the gridding procedure and that of the corresponding simulated erosion/deposition depths, and the effect of the interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested interpolation methods highlights that the ANUDEM and ordinary kriging algorithms are not suitable for building DEMs with complex topography. Conversely, linear triangulation, the natural neighbor algorithm, and the thin-plate spline plus tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
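
    The vertical-accuracy part of such a comparison can be illustrated with a simple hold-out test: withhold a subset of the elevation points, rebuild the surface from the rest with each interpolator, and compute the RMSE on the withheld points. The sketch below does this with the interpolators available in scipy (nearest, linear, cubic) on synthetic terrain; it does not reproduce ANUDEM, kriging, or the spline variants assessed in the paper.

```python
# Hold-out RMSE comparison of gridding methods, using only the
# interpolators shipped with scipy; synthetic terrain for illustration.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 100.0, size=(2000, 2))            # sampled (x, y) points
z = np.sin(pts[:, 0] / 15.0) * 10.0 + 0.05 * pts[:, 1]   # synthetic elevations

test = rng.choice(len(pts), size=200, replace=False)     # withheld check points
train = np.setdiff1d(np.arange(len(pts)), test)

for method in ("nearest", "linear", "cubic"):
    z_hat = griddata(pts[train], z[train], pts[test], method=method)
    ok = ~np.isnan(z_hat)                                 # linear/cubic may leave NaNs near the hull
    rmse = np.sqrt(np.mean((z_hat[ok] - z[test][ok]) ** 2))
    print(f"{method:>8s}  RMSE = {rmse:.3f} m")
```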

    Data science applications to connected vehicles: Key barriers to overcome

    Connected vehicles will generate huge amounts of pervasive, real-time data at very high frequencies. This poses new challenges for data science. How to analyse these data and how to address short-term and long-term storage are some of the key barriers to overcome. JRC.C.6 - Economics of Climate Change, Energy and Transport