1,232 research outputs found

    Criteria for the identification of ineffective open data portals: pretender open data portals

    Open data are considered an essential resource for governments, businesses, and citizens. In that context, open data portals have the potential to create enormous economic growth. Open data portals should allow the reuse of open data, ensure the efficiency of data transmission, and enable professional initiatives based on data reuse. However, some portals are ineffective because they do not allow the reuse of their data. The objective of this work is to define and identify open data portals that do not offer the possibility of professional reuse of their data; we refer to them as “pretender open data portals”. The following research questions are considered herein: What minimum criteria must an open data portal satisfy to enable professional reuse of open data? How can portals that do not meet these criteria be identified? And what problems might these portals present, and how might they be solved? The results of an analysis of two samples of open data portals in Spain reveal that 63.8% and 56.1% of the portals analyzed in 2019 and 2021, respectively, can be considered pretender open data portals. The existence of pretender open data portals can have negative economic and social impacts, such as wasting public resources and projecting a negative image of the government’s open data policies. One important challenge is to find coordination mechanisms for developing open data portals that can create economic and social value through the professional reuse of their data. Analyzing best practices of open data portals can also deepen the understanding of the impact of open data reuse beyond the professional standpoint.
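    As a rough illustration of how minimum-reuse criteria might be checked programmatically, the sketch below probes a portal's dataset listing for machine-readable formats and explicit licenses. The criteria, the CKAN-style API path, and the example URL are assumptions for illustration only, not the checks defined in the paper.

```python
import requests

# Hypothetical minimum-reuse criteria (illustrative only, not the paper's list):
# at least one dataset, machine-readable formats, and an explicit license.
MACHINE_READABLE = {"CSV", "JSON", "XML", "GEOJSON", "XLSX", "RDF"}

def check_portal(base_url: str, timeout: int = 30) -> dict:
    """Probe a CKAN-style portal and report which reuse criteria it appears to meet."""
    resp = requests.get(f"{base_url.rstrip('/')}/api/3/action/package_search",
                        params={"rows": 100}, timeout=timeout)
    resp.raise_for_status()
    datasets = resp.json()["result"]["results"]

    has_machine_readable = any(
        (res.get("format") or "").upper() in MACHINE_READABLE
        for ds in datasets for res in ds.get("resources", [])
    )
    has_license = any(ds.get("license_id") for ds in datasets)

    return {
        "datasets_found": len(datasets),
        "machine_readable_formats": has_machine_readable,
        "explicit_license": has_license,
        # A portal failing any criterion would be a candidate "pretender" portal.
        "pretender_candidate": not (datasets and has_machine_readable and has_license),
    }

if __name__ == "__main__":
    # Hypothetical portal URL; a CKAN-compatible API is assumed.
    print(check_portal("https://example-open-data-portal.org"))
```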

    Towards Cleaning-up Open Data Portals: A Metadata Reconciliation Approach

    This paper presents an approach for metadata reconciliation, curation and linking for Open Governmental Data Portals (ODPs). ODPs have lately become the standard solution for governments willing to make their public data available to society. Portal managers use several types of metadata to organize the datasets, one of the most important being tags. However, the tagging process is subject to many problems, such as synonyms, ambiguity and incoherence, among others. As our empirical analysis of ODPs shows, these issues are currently prevalent in most ODPs and effectively hinder the reuse of Open Data. In order to address these problems, we develop and implement an approach for tag reconciliation in Open Data Portals, encompassing local actions related to individual portals and global actions for adding a semantic metadata layer above individual portals. The local part aims to enhance the quality of tags in a single portal, and the global part is meant to interlink ODPs by establishing relations between tags.
    Comment: 8 pages, 10 figures; under revision for ICSC201
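    As a minimal sketch of what the "local" part of such a tag reconciliation could look like, the snippet below normalizes raw tags, folds synonyms via a lookup table, and merges near-duplicates with fuzzy matching. The synonym table and the matching strategy are assumptions; the paper's actual method is not reproduced here.

```python
from collections import defaultdict
from difflib import get_close_matches

# Illustrative synonym table; a real reconciliation would derive this from
# shared vocabularies or the global semantic layer across portals.
SYNONYMS = {"mobility": "transport", "transportation": "transport"}

def normalize(tag: str) -> str:
    """Local cleaning: case folding plus whitespace/separator harmonization."""
    return tag.strip().lower().replace("_", " ").replace("-", " ")

def reconcile_tags(raw_tags: list[str], cutoff: float = 0.85) -> dict[str, list[str]]:
    """Group raw tags under a canonical form, merging synonyms and near-duplicates."""
    groups: dict[str, list[str]] = defaultdict(list)
    for raw in raw_tags:
        tag = SYNONYMS.get(normalize(raw), normalize(raw))
        # Fold near-duplicates (e.g. typos) into an already-seen canonical tag.
        match = get_close_matches(tag, list(groups), n=1, cutoff=cutoff)
        groups[match[0] if match else tag].append(raw)
    return dict(groups)

print(reconcile_tags(["Transport", "transportation", "transprot", "Mobility", "budget"]))
```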

    Quality of metadata in open data portals

    During the last decade, numerous governmental, educational and cultural institutions have launched Open Data initiatives that have facilitated access to large volumes of datasets on the web. The main way to disseminate this availability of data has been the deployment of Open Data catalogs exposing metadata of these datasets, which are easily indexed by web search engines. Open Source platforms have greatly facilitated the work of institutions involved in Open Data initiatives, making the setup of Open Data portals an almost trivial task. However, few approaches have analyzed how precisely metadata describes the associated datasets. Taking into account the existing approaches for analyzing the quality of metadata in the Open Data context and other related domains, this work contributes to the state of the art by extending an ISO 19157 based method for checking the quality of geographic metadata to the context of Open Data metadata. Focusing on metadata models compliant with the Data Catalog Vocabulary (DCAT) proposed by the W3C, the proposed extended method has been applied to the evaluation of the Open Data catalog of the Spanish Government. The results have also been compared with those obtained by the Metadata Quality Assessment methodology proposed at the European Data Portal.
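    To illustrate one common metadata quality dimension (completeness over DCAT-style properties), the snippet below scores catalog records by how many recommended fields they fill in. The field list and the simple averaging are assumptions for illustration, not the ISO 19157 based measures applied in the paper.

```python
# DCAT-style properties checked for completeness; illustrative, not the paper's measure set.
RECOMMENDED_FIELDS = ["title", "description", "publisher", "license",
                      "theme", "keyword", "issued", "modified", "distribution"]

def completeness(record: dict) -> float:
    """Fraction of recommended fields that are present and non-empty in one record."""
    filled = sum(1 for field in RECOMMENDED_FIELDS if record.get(field))
    return filled / len(RECOMMENDED_FIELDS)

def catalog_completeness(records: list[dict]) -> float:
    """Average completeness over all records in a catalog."""
    return sum(completeness(r) for r in records) / len(records) if records else 0.0

# Hypothetical record, not taken from any real catalog.
sample = [{"title": "Air quality 2021", "description": "Hourly NO2 readings",
           "license": "CC-BY-4.0", "keyword": ["air", "environment"]}]
print(f"catalog completeness: {catalog_completeness(sample):.2f}")
```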

    Enabling Spatio-Temporal Search in Open Data

    Intuitively, most datasets found in Open Data are organised by spatio-temporal scope, that is, single datasets provide data for a certain region, valid for a certain time period. For many use cases (such as data journalism and fact checking) a predominant need is to narrow the relevant datasets down to a particular period or region. Therefore, we argue that spatio-temporal search is a crucial need for Open Data portals and across Open Data portals, yet - to the best of our knowledge - no working solution exists. We argue that - just as for regular Web search - knowledge graphs can be helpful to significantly improve search: in fact, the ingredients for a public knowledge graph of geographic entities as well as time periods and events already exist on the Web of Data, although they have not yet been integrated and applied - in a principled manner - to the use case of Open Data search. In the present paper we aim at doing just that: we (i) present a scalable approach to construct a spatio-temporal knowledge graph that hierarchically structures geographical as well as temporal entities, (ii) annotate a large corpus of tabular datasets from open data portals, and (iii) enable structured, spatio-temporal search over Open Data catalogs through our spatio-temporal knowledge graph, both via a search interface and via a SPARQL endpoint, available at data.wu.ac.at/odgraphsearch/.
    Series: Working Papers on Information Systems, Information Business and Operation
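    To give a flavour of what a structured spatio-temporal query over annotated datasets could look like, here is a hedged sketch that posts a SPARQL query for datasets whose temporal scope overlaps 2015. The endpoint path, the exact vocabulary (dct:spatial, dct:temporal, dcat:startDate, dcat:endDate) and the graph shape are assumptions; the schema actually exposed under data.wu.ac.at/odgraphsearch/ may differ.

```python
import requests

# Hypothetical query shape; the real knowledge graph may use different predicates.
QUERY = """
PREFIX dct:  <http://purl.org/dc/terms/>
PREFIX dcat: <http://www.w3.org/ns/dcat#>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>

SELECT ?dataset ?region ?start ?end WHERE {
  ?dataset dct:spatial  ?region ;
           dct:temporal ?interval .
  ?interval dcat:startDate ?start ;
            dcat:endDate   ?end .
  FILTER (?start <= "2015-12-31"^^xsd:date && ?end >= "2015-01-01"^^xsd:date)
}
LIMIT 20
"""

# Endpoint path is assumed; the paper only states the service lives under
# data.wu.ac.at/odgraphsearch/.
ENDPOINT = "https://data.wu.ac.at/odgraphsearch/sparql"

resp = requests.get(ENDPOINT, params={"query": QUERY},
                    headers={"Accept": "application/sparql-results+json"}, timeout=60)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["dataset"]["value"], row["start"]["value"], row["end"]["value"])
```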

    A rubric driven evaluation of open data portals and their data in transportation

    In recent years, the open data movement has been gaining momentum in the transportation industry, with multiple state Departments of Transportation (DOTs) launching their own repositories of datasets. The quality of data, ease of use and availability of metadata vary from source to source. There is an imminent need to assess the quality of open data portals to give agencies a yardstick to measure their performance and draw inspiration from higher-ranking portals. We propose a data portal evaluation rubric (DPER) which can serve this purpose. DPER is designed to capture the essence of the National Open Data Policy. The DPER was used to evaluate 43 data portals at the state and national level which provide transportation datasets. DPER evaluates the quality of the portal, the openness of data, and the relevance of its content to the transportation sector. The portal of the State of New York scores the highest due to its user-friendly interface with interactive visualization tools, relevant data content, detailed data information and useful API references for application developers.
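    A toy sketch of how a rubric-driven score such as DPER could be aggregated is shown below, using invented criteria, weights and ratings. The real DPER items, scale and scores come from the paper and are not reproduced here.

```python
# Invented rubric criteria and weights for illustration; the real DPER items differ.
RUBRIC_WEIGHTS = {
    "portal_usability": 0.25,        # interface, search, visualization tools
    "data_openness": 0.35,           # formats, licenses, API access
    "transportation_relevance": 0.25,
    "metadata_detail": 0.15,
}

def dper_score(ratings: dict[str, float]) -> float:
    """Weighted average of per-criterion ratings, each on an assumed 0-5 scale."""
    return sum(RUBRIC_WEIGHTS[c] * ratings[c] for c in RUBRIC_WEIGHTS)

# Illustrative ratings only, not the evaluations reported in the paper.
portals = {
    "New York": {"portal_usability": 5, "data_openness": 5,
                 "transportation_relevance": 4, "metadata_detail": 5},
    "Example State": {"portal_usability": 3, "data_openness": 2,
                      "transportation_relevance": 4, "metadata_detail": 2},
}

for name, ratings in sorted(portals.items(), key=lambda kv: -dper_score(kv[1])):
    print(f"{name}: {dper_score(ratings):.2f} / 5")
```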

    Automatic Publication of Open Data from OGC Services: the Use Case of TRAFAIR Project

    This work proposes a workflow for the publication of Open Spatial Data. The main contribution is the automatic generation of metadata extracted from OGC spatial services providing access to feature types and coverages. In addition, this work adopts the GeoDCAT-AP metadata profile for the description of datasets because it allows for an appropriate crosswalk between the annotation requirements in the spatial domain and the metadata models accepted in general Open Data portals. The feasibility of the proposed workflow has been tested within the framework of the TRAFAIR project to publish air quality monitoring and forecasting data.
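    A minimal sketch of the metadata-extraction step, assuming a WFS 2.0 endpoint: it reads the service's GetCapabilities document and pulls out, for each feature type, the fields a GeoDCAT-AP record would typically need (identifier, title, abstract, bounding box). The field mapping and the example URL are assumptions; the exact TRAFAIR workflow is not reproduced here.

```python
import requests
import xml.etree.ElementTree as ET

NS = {"wfs": "http://www.opengis.net/wfs/2.0", "ows": "http://www.opengis.net/ows/1.1"}

def harvest_wfs_metadata(service_url: str) -> list[dict]:
    """Extract per-feature-type metadata from a WFS 2.0 GetCapabilities response."""
    resp = requests.get(service_url,
                        params={"service": "WFS", "request": "GetCapabilities",
                                "version": "2.0.0"}, timeout=60)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)

    records = []
    for ft in root.findall(".//wfs:FeatureTypeList/wfs:FeatureType", NS):
        bbox = ft.find("ows:WGS84BoundingBox", NS)
        records.append({
            # These fields map naturally onto GeoDCAT-AP / DCAT properties
            # (dct:identifier, dct:title, dct:description, dct:spatial).
            "identifier": ft.findtext("wfs:Name", default="", namespaces=NS),
            "title": ft.findtext("wfs:Title", default="", namespaces=NS),
            "description": ft.findtext("wfs:Abstract", default="", namespaces=NS),
            "bbox_wgs84": {
                "lower": bbox.findtext("ows:LowerCorner", default="", namespaces=NS),
                "upper": bbox.findtext("ows:UpperCorner", default="", namespaces=NS),
            } if bbox is not None else None,
        })
    return records

# Hypothetical endpoint; the actual TRAFAIR services are not referenced here.
print(harvest_wfs_metadata("https://example.org/geoserver/wfs"))
```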

    Tailored Digitization for Rural Development

    The widespread use of many digital technologies along the food supply chain might have negative effects on rural development and on small and medium farms. One conclusion of this paper is that, in order for rural areas to exploit all the benefits of digitization while avoiding the associated risks, there should be more agricultural extension services for farmers and more open data portals and platforms. The aim is to develop technologies specifically tailored to the economic, natural and social environment of rural areas, and thereby to promote their modernization without giving up their cultural heritage.