578 research outputs found

    When things matter: A survey on data-centric Internet of Things

    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, but several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers around managing IoT data, typically produced in dynamic and volatile environments, which is not only extremely large in scale and volume but also noisy and continuous. This paper reviews the main techniques and state-of-the-art research efforts in IoT from a data-centric perspective, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
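    The survey's emphasis on noisy, continuous stream data can be illustrated with a minimal sketch (not from the paper itself): a sliding-window smoother that drops readings deviating strongly from recent values, a crude stand-in for the stream-processing techniques it reviews.

```python
from collections import deque
from statistics import mean, stdev

def smooth_stream(readings, window=5, z_thresh=2.0):
    """Yield running window means, skipping readings that deviate
    strongly from the current window (a crude noise filter)."""
    buf = deque(maxlen=window)
    for r in readings:
        if len(buf) >= 3:
            m, s = mean(buf), stdev(buf)
            if s > 0 and abs(r - m) / s > z_thresh:
                continue  # treat as noise and drop the reading
        buf.append(r)
        yield mean(buf)

# A noisy temperature stream with one obvious outlier (55.0)
stream = [21.0, 21.2, 20.9, 55.0, 21.1, 21.3]
print(list(smooth_stream(stream)))  # the outlier is filtered out
```

    Real deployments would of course use a dedicated stream-processing engine rather than an in-process generator, but the windowed, incremental nature of the computation is the same.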

    Knowledge-Driven Harmonization of Sensor Observations: Exploiting Linked Open Data for IoT Data Streams

    The rise of the Internet of Things leads to an unprecedented number of continuous sensor observations that are available as IoT data streams. Harmonization of such observations is a labor-intensive task due to heterogeneity in format, syntax, and semantics. We aim to reduce the effort for such harmonization tasks by employing a knowledge-driven approach. To this end, we pursue the idea of exploiting the large body of formalized public knowledge represented as statements in Linked Open Data.
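    A toy illustration of the harmonization problem the abstract describes (all field names, units, and mapping rules below are invented for illustration, not the paper's actual approach): observations arriving under different vocabularies and units are mapped to one canonical form, with the mapping table standing in for the formalized knowledge the authors draw from Linked Open Data.

```python
# Hypothetical raw observations from three sources with differing
# field names and units
raw = [
    {"temp_f": 71.6, "ts": "2023-01-01T00:00:00Z"},
    {"temperature_c": 22.0, "time": "2023-01-01T00:01:00Z"},
    {"t_kelvin": 295.15, "timestamp": "2023-01-01T00:02:00Z"},
]

# Mapping rules: source field name -> (canonical name, unit converter)
RULES = {
    "temp_f": ("temperature_c", lambda v: (v - 32) * 5 / 9),
    "temperature_c": ("temperature_c", lambda v: v),
    "t_kelvin": ("temperature_c", lambda v: v - 273.15),
    "ts": ("time", lambda v: v),
    "time": ("time", lambda v: v),
    "timestamp": ("time", lambda v: v),
}

def harmonize(obs):
    """Rewrite one observation into the canonical vocabulary."""
    out = {}
    for k, v in obs.items():
        name, conv = RULES[k]
        out[name] = conv(v)
    return out

for o in raw:
    print(harmonize(o))  # every record now uses the same schema
```

    The knowledge-driven twist in the paper is that such rules need not be hand-written; they can be derived from public, machine-readable statements.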

    Web-oriented Event Processing

    How can the Web be made situation-aware? Event processing is a suitable technology for gaining the necessary real-time results. The Web, however, has many users and many application domains. We therefore developed multi-schema-friendly data models that allow the re-use and mixing of data from diverse users and application domains. Furthermore, we describe protocols to exchange events on the Web, as well as algorithms to execute the language and to calculate access rights.
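    The notion of a multi-schema-friendly event model can be sketched roughly as follows (the namespace prefixes and field names are illustrative assumptions, not the thesis's actual vocabulary): one event mixes properties from several vocabularies, and each consumer projects out only the vocabulary it understands.

```python
import json

# One event mixing two namespaced vocabularies in a single payload
event = {
    "sensor:reading": 21.5,
    "sensor:unit": "celsius",
    "geo:lat": 49.01,
    "geo:lon": 8.41,
}

def project(event, prefix):
    """Let a consumer extract only the vocabulary it understands."""
    p = prefix + ":"
    return {k[len(p):]: v for k, v in event.items() if k.startswith(p)}

wire = json.dumps(event)          # exchanged over the Web as JSON
received = json.loads(wire)
print(project(received, "geo"))   # a mapping consumer sees only geo:*
```

    Because projection ignores unknown prefixes, producers from new application domains can extend events without breaking existing consumers.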

    Integrating building and urban semantics to empower smart water solutions

    Current urban water research involves intelligent sensing, systems integration, proactive users, and data-driven management through advanced analytics. The convergence of building information modeling with the smart water field provides an opportunity to transcend existing operational barriers. Such research would pave the way for demand-side management, active consumers, and demand-optimized networks, through interoperability and a system-of-systems approach. This paper presents a semantic knowledge management service and domain ontology which support a novel cloud-edge solution, by unifying domestic socio-technical water systems with clean and waste networks at an urban scale, to deliver value-added services for consumers and network operators. The web service integrates state-of-the-art sensing, data analytics, and middleware components. We propose a domain ontology that describes smart homes, smart metering, telemetry, and geographic information systems, alongside social concepts. This integrates previously isolated systems as well as supply- and demand-side interventions, to improve system performance. A use case of demand-optimized management is introduced, and smart home application interoperability is demonstrated, before the performance of the semantic web service is presented and compared to alternatives. Our findings suggest that semantic web technologies and IoT can merge to bring together large data models with dynamic data streams, to support powerful applications in the operational phase of built environment systems.
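    To make the ontology's unifying role concrete, here is a minimal in-memory triple-store sketch linking a smart-home concept to an urban network concept (all URIs and terms are invented for illustration, not the paper's actual ontology).

```python
# A tiny set of subject-predicate-object triples spanning the home
# and city scales that the paper's ontology connects
triples = {
    ("home:Meter1", "rdf:type", "water:SmartMeter"),
    ("home:Meter1", "water:measures", "home:Dwelling42"),
    ("home:Dwelling42", "water:connectedTo", "city:CleanNetworkA"),
    ("city:CleanNetworkA", "rdf:type", "water:CleanWaterNetwork"),
}

def match(s=None, p=None, o=None):
    """Pattern query over the triples: None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Which urban network does the metered dwelling draw from?
dwelling = match("home:Meter1", "water:measures")[0][2]
print(match(dwelling, "water:connectedTo"))
```

    A production system would hold such statements in an RDF store queried with SPARQL; the point here is only that one graph can answer questions that cross the building/urban boundary.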

    From Heterogeneous Sensor Networks to Integrated Software Services: Design and Implementation of a Semantic Architecture for the Internet of Things at ARCES@UNIBO

    The Internet of Things (IoT) is growing fast, both in the number of connected devices and in the complexity of deployments and applications. Several research studies analyzing the economic impact of the IoT worldwide identify interoperability as one of the main boosting factors for its growth, thanks to the possibility of unlocking novel commercial opportunities derived from the integration of heterogeneous systems which are currently not interconnected. At present, however, interoperability constitutes a relevant practical issue in any IoT deployment composed of sensor platforms mapped onto different wireless technologies, network protocols, or data formats. The paper addresses this issue and investigates how to achieve effective data interoperability and data reuse in complex IoT deployments, where multiple users/applications need to consume sensor data produced by heterogeneous sensor networks. We propose a generic three-tier IoT architecture which decouples the sensor data producers from the sensor data consumers, thanks to the intermediation of a semantic broker that is in charge of translating the sensor data into a shared ontology and of providing publish-subscribe facilities to the producers/consumers. We then describe the real-world implementation of this architecture devised at the Advanced Research Center on Electronic Systems (ARCES) of the University of Bologna. The actual system collects the data produced by three different sensor networks, integrates them through a SPARQL Event Processing Architecture (SEPA), and supports two front-end applications for data access, i.e., a web dashboard and an Amazon Alexa voice service.
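    A minimal sketch of the broker's decoupling role (the class, translation rules, and payload formats below are invented for illustration; the actual system uses SEPA and a shared ontology rather than Python callbacks): producers publish in their native formats, the broker translates to a shared representation, and consumers subscribe without knowing the sources.

```python
class SemanticBroker:
    """Toy broker: translates heterogeneous producer payloads into a
    shared vocabulary and notifies topic subscribers."""

    def __init__(self, translators):
        self.translators = translators   # source name -> translator
        self.subscribers = {}            # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, source, payload):
        unified = self.translators[source](payload)
        for cb in self.subscribers.get(unified["topic"], []):
            cb(unified)

broker = SemanticBroker({
    # Two sensor networks with different native formats and units
    "zigbee_net": lambda p: {"topic": "temperature", "value": p["tC"]},
    "lora_net":   lambda p: {"topic": "temperature",
                             "value": (p["tF"] - 32) * 5 / 9},
})

received = []
broker.subscribe("temperature", received.append)
broker.publish("zigbee_net", {"tC": 22.0})
broker.publish("lora_net", {"tF": 71.6})
print([round(e["value"], 1) for e in received])
```

    The consumer never sees Fahrenheit or network-specific field names, which is exactly the decoupling the three-tier architecture provides.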

    Data semantic enrichment for complex event processing over IoT Data Streams

    This thesis generalizes techniques for processing IoT data streams, semantically enriching data with contextual information, and performing complex event processing in IoT applications. A case study on ECG anomaly detection and signal classification was conducted to validate the knowledge foundation.
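    A toy complex-event rule over a heart-rate stream hints at the kind of processing the thesis addresses (the thresholds and run length are illustrative assumptions, not clinical values or the thesis's actual method): a complex event fires only when a pattern holds across several primitive events, not on any single reading.

```python
def detect_anomalies(stream, low=60, high=100, run=3):
    """Yield the index at which `run` consecutive out-of-range
    heart-rate values occur (a toy complex-event rule)."""
    count = 0
    for i, hr in enumerate(stream):
        if hr < low or hr > high:
            count += 1
            if count == run:
                yield i
        else:
            count = 0  # pattern broken; start over

hr_stream = [72, 75, 120, 130, 125, 80, 40]
print(list(detect_anomalies(hr_stream)))  # single dips do not fire
```

    Semantic enrichment would attach context (patient, sensor placement, activity) to each reading so that rules like this one can be made context-aware.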

    Connected Information Management

    Society is currently inundated with more information than ever, making efficient management a necessity. Alas, most current information management suffers from several levels of disconnectedness: applications partition data into segregated islands; small notes don’t fit into traditional application categories; navigating the data is different for each kind of data; data is either available on a certain computer or only online, but rarely both. Connected information management (CoIM) is an approach to information management that avoids these forms of disconnectedness. The core idea of CoIM is to keep all information in a central repository, with generic means for organization such as tagging. The heterogeneity of data is taken into account by offering specialized editors. The central repository eliminates the islands of application-specific data and is formally grounded by a CoIM model. The foundation for structured data is an RDF repository. The RDF editing meta-model (REMM) enables form-based editing of this data, similar to database applications such as MS Access. Further kinds of data are supported by extending RDF, as follows. Wiki text is stored as RDF and can both contain structured text and be combined with structured data. Files are also supported by the CoIM model and are kept externally. Notes can be quickly captured and annotated with meta-data. Generic means for organization and navigation apply to all kinds of data. Ubiquitous availability of data is ensured via two CoIM implementations, the web application HYENA/Web and the desktop application HYENA/Eclipse. All data can be synchronized between these applications. The applications were used to validate the CoIM ideas.
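    The claim that generic organization applies to all kinds of data can be sketched in a few lines (the item shapes and tags below are invented for illustration, not the thesis's data model): heterogeneous items live in one repository, and tag-based navigation treats them uniformly.

```python
# One repository holding heterogeneous items: a note, a file
# reference, and a structured record, all tagged the same way
repository = [
    {"kind": "note", "text": "call plumber", "tags": {"todo", "home"}},
    {"kind": "file", "path": "report.pdf", "tags": {"work"}},
    {"kind": "record", "fields": {"name": "Alice"},
     "tags": {"work", "contacts"}},
]

def by_tag(tag):
    """Generic navigation: works identically for every kind of item."""
    return [item for item in repository if tag in item["tags"]]

print([item["kind"] for item in by_tag("work")])
```

    In CoIM proper the repository is RDF-based and each kind of item gets a specialized editor, but retrieval cuts across all kinds, which is what eliminates the application-specific islands.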

    Storing and querying evolving knowledge graphs on the web
