
    Disaster Monitoring with Wikipedia and Online Social Networking Sites: Structured Data and Linked Data Fragments to the Rescue?

    In this paper, we present the first results of our ongoing early-stage research on a real-time disaster detection and monitoring tool. Based on Wikipedia, it is language-agnostic and leverages user-generated multimedia content shared on online social networking sites to help disaster responders prioritize their efforts. We make the tool and its source code publicly available as we make progress on it. Furthermore, we strive to publish detected disasters and accompanying multimedia content following the Linked Data principles to facilitate their wide consumption and redistribution, and the evaluation of their usefulness.
    Comment: Accepted for publication at the AAAI Spring Symposium 2015: Structured Data for Humanitarian Technologies: Perfect fit or Overkill? #SD4HumTech1
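    The abstract does not spell out how Wikipedia is monitored. A minimal sketch of one possible approach, assuming the public MediaWiki recentchanges API and a hypothetical keyword list (the actual tool is language-agnostic and does not rely on keywords), might look like this:

    import time
    import requests

    # Hypothetical keyword list, used here only for illustration.
    DISASTER_KEYWORDS = {"earthquake", "flood", "hurricane", "wildfire", "tsunami"}

    API = "https://en.wikipedia.org/w/api.php"

    def recent_disaster_edits(limit=100):
        """Fetch recent edits and keep those whose titles mention a disaster keyword."""
        params = {
            "action": "query",
            "list": "recentchanges",
            "rcprop": "title|timestamp|user",
            "rclimit": limit,
            "format": "json",
        }
        changes = requests.get(API, params=params, timeout=10).json()["query"]["recentchanges"]
        return [c for c in changes if any(k in c["title"].lower() for k in DISASTER_KEYWORDS)]

    while True:
        for change in recent_disaster_edits():
            print(change["timestamp"], change["title"])  # candidate disaster-related activity
        time.sleep(60)  # poll once per minute

    A production system would instead watch edit activity across many language editions and correlate it with social media content, as the paper describes.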

    Your JSON is not my JSON: a case for more fine-grained content negotiation

    Information resources can be expressed in different representations along many dimensions such as format, language, and time. Through content negotiation, HTTP clients and servers can agree on which representation is most appropriate for a given piece of data. For instance, interactive clients typically indicate they prefer HTML, whereas automated clients would ask for JSON or RDF. However, labels such as “JSON” and “RDF” are insufficient to negotiate between the rich variety of possibilities offered by today’s languages and data models. This position paper argues that, despite widespread misuse, content negotiation remains the way forward. However, we need to extend it with more granular options in order to serve different current and future Web clients sustainably.
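    One existing mechanism that points in this direction is the profile parameter on media types such as application/ld+json, which lets a client ask for a specific shape of JSON-LD rather than a bare “JSON”. A minimal sketch, assuming a hypothetical endpoint at example.org:

    import requests

    # Hypothetical endpoint; the point is the Accept header, which asks for a
    # compacted JSON-LD profile and falls back to plain JSON if it is unavailable.
    url = "https://example.org/dataset/people"

    headers = {
        "Accept": (
            'application/ld+json;profile="http://www.w3.org/ns/json-ld#compacted", '
            "application/json;q=0.5"
        )
    }

    response = requests.get(url, headers=headers, timeout=10)
    print(response.headers.get("Content-Type"))  # which representation the server chose
    print(response.text[:200])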

    Continuous client-side query evaluation over dynamic Linked Data

    Existing solutions to query dynamic Linked Data sources extend the SPARQL language and require continuous server-side processing for each query. Traditional SPARQL endpoints already accept highly expressive queries, so extending these endpoints for time-sensitive queries increases the server cost even further. To make continuous querying over dynamic Linked Data more affordable, we extend the low-cost Triple Pattern Fragments (TPF) interface with support for time-sensitive queries. In this paper, we introduce the TPF Query Streamer, which allows clients to evaluate SPARQL queries with continuously updating results. Our experiments indicate that this extension significantly lowers the server complexity, at the expense of an increase in the execution time per query. We show that by moving the complexity of continuously evaluating queries over dynamic Linked Data to the clients, at the cost of increased bandwidth usage, the load on the server is significantly reduced. Our results show that this solution makes real-time querying more scalable for a large number of concurrent clients when compared to the alternatives.
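    The TPF Query Streamer itself is not shown in the abstract; the sketch below only illustrates the underlying idea that a client repeatedly fetches a cheap triple pattern fragment and re-evaluates results locally. It assumes the public DBpedia TPF interface and hard-codes the subject/predicate/object parameter names, whereas a real client discovers the query template from the Hydra controls in each response, and the actual paper adds time annotations rather than naive polling.

    import time
    import requests
    from rdflib import Graph

    # Assumed public TPF endpoint; availability and dataset version may vary.
    FRAGMENT = "https://fragments.dbpedia.org/2016-04/en"
    PATTERN = {"predicate": "http://www.w3.org/2000/01/rdf-schema#label",
               "object": '"Belgium"@en'}

    def fetch_triples():
        """Fetch one fragment page for the triple pattern and parse it client-side."""
        response = requests.get(FRAGMENT, params=PATTERN,
                                headers={"Accept": "text/turtle"}, timeout=10)
        graph = Graph()
        graph.parse(data=response.text, format="turtle")
        # Note: the fragment also contains Hydra metadata triples, which a real
        # client would separate from the actual data triples.
        return set(graph)

    previous = set()
    while True:
        current = fetch_triples()
        for triple in current - previous:   # newly appeared triples since the last poll
            print("added:", triple)
        previous = current
        time.sleep(30)  # re-evaluate periodically; the server only serves cheap fragments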