Provenance Management over Linked Data Streams
Provenance describes how results are produced, from data sources, curation, recovery, and intermediate processing through to the final results. Provenance has been applied to solve many problems, in particular to understand how errors propagate in large-scale environments such as the Internet of Things and Smart Cities. In such environments, operations on data are often performed by multiple uncoordinated parties, each potentially introducing or propagating errors. These errors introduce uncertainty into the overall data analytics process, which is further amplified when many data sources are combined and errors propagate across multiple parties. The ability to properly identify how such errors influence the results is crucial to assessing the quality of the results. This problem becomes even more challenging in the case of Linked Data Streams, where data is dynamic and often incomplete. In this paper, we introduce methods to compute provenance over Linked Data Streams. More specifically, we propose provenance management techniques to compute the provenance of continuous queries executed over Linked Data streams. Unlike traditional provenance management techniques, which are applied to static data, we focus on the dynamicity and heterogeneity of Linked Data streams. Specifically, in this paper we describe: i) means to deliver a dynamic provenance trace of the results to the user, ii) a system capable of executing queries over dynamic Linked Data and computing the provenance of these queries, and iii) an empirical evaluation of our approach using real-world datasets.
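As a loose illustration of the kind of provenance propagation this abstract describes (a sketch, not the paper's actual system), each stream element below carries the set of source identifiers it was derived from, and a windowed join propagates provenance as the union of its inputs' annotations. All class and function names are hypothetical.

```python
class AnnotatedTriple:
    """A stream element (RDF-style triple) annotated with its provenance:
    the set of data sources it was derived from."""
    def __init__(self, subj, pred, obj, sources):
        self.spo = (subj, pred, obj)
        self.sources = frozenset(sources)

def windowed_join(left_window, right_window, key):
    """Join two windows on position `key` of the triple, propagating
    provenance: a result's annotation is the union of its inputs'."""
    results = []
    for l in left_window:
        for r in right_window:
            if l.spo[key] == r.spo[key]:
                results.append((l.spo, r.spo, l.sources | r.sources))
    return results
```

Under this model, tracing how an error at one source influences a query result reduces to checking whether that source appears in the result's annotation.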
Data Provenance and Management in Radio Astronomy: A Stream Computing Approach
New approaches for data provenance and data management (DPDM) are required
for mega-science projects like the Square Kilometre Array, which are characterized by
extremely large data volumes and intense data rates and therefore demand
innovative and highly efficient computational paradigms. In this context, we
explore a stream-computing approach with the emphasis on the use of
accelerators. In particular, we make use of a new generation of high
performance stream-based parallelization middleware known as InfoSphere
Streams. Its viability for managing and ensuring interoperability and integrity
of signal processing data pipelines is demonstrated in radio astronomy. IBM
InfoSphere Streams embraces the stream-computing paradigm. It is a shift from
conventional data mining techniques (involving analysis of existing data from
databases) towards real-time analytic processing. We discuss using InfoSphere
Streams for effective DPDM in radio astronomy and propose a way in which
InfoSphere Streams can be utilized for large antenna arrays. We present a
case-study: the InfoSphere Streams implementation of an autocorrelating
spectrometer, and using this example we discuss the advantages of the
stream-computing approach and the utilization of hardware accelerators.
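The autocorrelating spectrometer case study rests on the Wiener–Khinchin relation: the power spectrum of a signal is the Fourier transform of its autocorrelation. A minimal pure-Python sketch of that relation (not the InfoSphere Streams implementation, which would express these stages as streaming operators):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(N^2), for illustration only)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def circular_autocorr(x):
    """Circular autocorrelation of a real signal at every lag."""
    N = len(x)
    return [sum(x[n] * x[(n + lag) % N] for n in range(N)) for lag in range(N)]

def power_spectrum_via_autocorr(x):
    # Wiener-Khinchin: the DFT of the circular autocorrelation of a real
    # signal equals |X[k]|^2, the power spectrum.
    return [c.real for c in dft(circular_autocorr(x))]
```

In a streaming deployment the autocorrelation stage is where hardware accelerators pay off, since it is the compute-intensive inner loop over incoming samples.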
Estimating Fire Weather Indices via Semantic Reasoning over Wireless Sensor Network Data Streams
Wildfires are frequent, devastating events in Australia that regularly cause
significant loss of life and widespread property damage. Fire weather indices
are a widely-adopted method for measuring fire danger and they play a
significant role in issuing bushfire warnings and in anticipating demand for
bushfire management resources. Existing systems that calculate fire weather
indices are limited due to low spatial and temporal resolution. Localized
wireless sensor networks, on the other hand, gather continuous sensor data
measuring variables such as air temperature, relative humidity, rainfall and
wind speed at high resolutions. However, using wireless sensor networks to
estimate fire weather indices is a challenge due to data quality issues, lack
of standard data formats and lack of agreement on thresholds and methods for
calculating fire weather indices. Within the scope of this paper, we propose a
standardized approach to calculating Fire Weather Indices (a.k.a. fire danger
ratings) and overcome a number of the challenges by applying Semantic Web
Technologies to the processing of data streams from a wireless sensor network
deployed in the Springbrook region of South East Queensland. This paper
describes the underlying ontologies, the semantic reasoning and the Semantic
Fire Weather Index (SFWI) system that we have developed to enable domain
experts to specify and adapt rules for calculating Fire Weather Indices. We
also describe the Web-based mapping interface that we have developed, that
enables users to improve their understanding of how fire weather indices vary
over time within a particular region. Finally, we discuss our evaluation results,
which indicate that the proposed system outperforms state-of-the-art techniques
in terms of accuracy, precision and query performance.
Comment: 20 pages, 12 figures
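For context, one widely adopted Australian fire danger rating is McArthur's Mark 5 Forest Fire Danger Index (FFDI), commonly computed with the curve fit published by Noble et al. (1980). The sketch below shows that standard formulation; it is illustrative and not necessarily the exact rule set the SFWI system lets domain experts specify.

```python
import math

def ffdi(temp_c, rel_humidity_pct, wind_kmh, drought_factor):
    """McArthur Mark 5 Forest Fire Danger Index (Noble et al. 1980 fit):
    FFDI = 2 * exp(-0.450 + 0.987*ln(DF) - 0.0345*RH + 0.0338*T + 0.0234*V)
    where DF is the drought factor (0-10), RH relative humidity (%),
    T air temperature (deg C), V wind speed (km/h)."""
    return 2.0 * math.exp(-0.450
                          + 0.987 * math.log(drought_factor)
                          - 0.0345 * rel_humidity_pct
                          + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)
```

The formula's sensitivity to each weather variable is exactly why the paper argues for high-resolution sensor streams: coarse spatial or temporal sampling of temperature, humidity and wind can shift the computed rating substantially.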
Collaborative Reuse of Streaming Dataflows in IoT Applications
Distributed Stream Processing Systems (DSPS) like Apache Storm and Spark
Streaming enable composition of continuous dataflows that execute persistently
over data streams. They are used by Internet of Things (IoT) applications to
analyze sensor data from Smart City cyber-infrastructure, and make active
utility management decisions. As the ecosystem of such IoT applications that
leverage shared urban sensor streams continues to grow, applications will
perform duplicate pre-processing and analytics tasks. This offers the
opportunity to collaboratively reuse the outputs of overlapping dataflows,
thereby improving resource efficiency. In this paper, we propose
\emph{dataflow reuse algorithms} that, given a submitted dataflow, identify
the intersection of reusable tasks and streams from a collection of running
dataflows to form a \emph{merged dataflow}. We also propose similar algorithms
to unmerge dataflows when they are removed. We implement these
algorithms for the popular Apache Storm DSPS, and validate their performance
and resource savings for 35 synthetic dataflows based on public OPMW workflows
with diverse arrival and departure distributions, and on 21 real IoT dataflows
from RIoTBench.
Comment: To appear in IEEE eScience Conference 201
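A minimal sketch of the reuse idea (an assumption about the general approach, not the paper's actual algorithms): two tasks can be shared when their operator and entire upstream chain match, which can be checked by comparing recursively computed task signatures. Dataflows here are hypothetical dicts mapping a task name to its operator and input tasks.

```python
def task_signature(dataflow, task):
    """Canonical signature of a task: its operator plus the (sorted)
    signatures of its upstream tasks. Two tasks with equal signatures
    compute the same stream and are candidates for sharing."""
    op, inputs = dataflow[task]
    return (op, tuple(sorted(task_signature(dataflow, t) for t in inputs)))

def merge(running, submitted):
    """Map each task of a submitted dataflow to a reusable task in the
    running dataflows (or None if it must be deployed fresh)."""
    by_sig = {task_signature(running, t): t for t in running}
    return {t: by_sig.get(task_signature(submitted, t)) for t in submitted}
```

Unmerging would walk the same signatures in reverse: a shared task is torn down only when the last dataflow referencing it is removed, e.g. via a reference count per signature.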
Towards structured sharing of raw and derived neuroimaging data across existing resources
Data sharing efforts increasingly contribute to the acceleration of
scientific discovery. Neuroimaging data is accumulating in distributed
domain-specific databases and there is currently no integrated access mechanism
nor an accepted format for the critically important meta-data that is necessary
for making use of the combined, available neuroimaging data. In this
manuscript, we present work from the Derived Data Working Group, an open-access
group sponsored by the Biomedical Informatics Research Network (BIRN) and the
International Neuroinformatics Coordinating Facility (INCF) focused on practical
tools for distributed access to neuroimaging data. The working group develops
models and tools facilitating the structured interchange of neuroimaging
meta-data and is making progress towards a unified set of tools for such data
and meta-data exchange. We report on the key components required for integrated
access to raw and derived neuroimaging data as well as associated meta-data and
provenance across neuroimaging resources. The components include (1) a
structured terminology that provides semantic context to data, (2) a formal
data model for neuroimaging with robust tracking of data provenance, (3) a web
service-based application programming interface (API) that provides a
consistent mechanism to access and query the data model, and (4) a provenance
library that can be used for the extraction of provenance data by image
analysts and imaging software developers. We believe that the framework and set
of tools outlined in this manuscript have great potential for solving many of
the issues the neuroimaging community faces when sharing raw and derived
neuroimaging data across the various existing database systems for the purpose
of accelerating scientific discovery.
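As an illustration of component (4), a provenance library might record, in W3C PROV-like terms, which activities used and generated which entities, so that any derived image can be traced back to its raw inputs. The sketch below uses entirely hypothetical names and a deliberately simplified record structure.

```python
# Hypothetical provenance record for one derived image, loosely following
# W3C PROV vocabulary (entity / activity / used / wasGeneratedBy).
prov = {
    "entity": {
        "raw_scan":   {"type": "nifti", "subject": "sub-01"},
        "brain_mask": {"type": "nifti", "subject": "sub-01"},
    },
    "activity": {
        "skull_strip": {"software": "example-tool", "version": "1.0"},
    },
    "used":           [("skull_strip", "raw_scan")],    # activity used entity
    "wasGeneratedBy": [("brain_mask", "skull_strip")],  # entity from activity
}

def lineage(prov, entity):
    """Trace all upstream entities a derived entity depends on."""
    generated_by = dict(prov["wasGeneratedBy"])
    used = {}
    for act, ent in prov["used"]:
        used.setdefault(act, []).append(ent)
    upstream, stack = set(), [entity]
    while stack:
        act = generated_by.get(stack.pop())
        for parent in used.get(act, []):
            if parent not in upstream:
                upstream.add(parent)
                stack.append(parent)
    return upstream
```

A query API like the one described in component (3) could expose exactly this kind of traversal over the shared data model, letting analysts ask which raw scans and software versions stand behind any published result.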