
    Data Ingredients: smart disclosure and open government data as complementary tools to meet policy objectives. The case of energy efficiency.

    Open government data are considered a key asset for eGovernment. One could argue that governments can also influence other types of data disclosure, as potential ingredients of innovative services. To discuss this assumption, we took the example of the U.S. 'Green Button' initiative – based on the disclosure of energy consumption data to each user – and analysed 36 energy-oriented digital services reusing these and other data, in order to highlight their set of inputs. We find that apps suggesting more efficient consumption behaviour to a user also benefit from average retail electricity cost/price information; that energy efficiency 'scoring' apps also need, at least, structured and updated information on building performance; and that value-added services deriving insights from consumption data frequently rely on average energy consumption information. More generally, most of the surveyed services combine consumption data, open government data, and corporate data. When setting sector-specific agendas grounded on data disclosure, public agencies should therefore consider contributing to making all three layers of information available. No widely acknowledged initiatives of energy consumption data disclosure to users are currently being implemented in the EU. Moreover, browsing EU data portals and the websites of public agencies, we find that other key data ingredients are not supplied (or, at least, not as open data), leaving room for possible improvements in this arena.

    Is there such a thing as free government data?

    The recently amended European Public Sector Information (PSI) Directive rests on the assumption that government data is a valuable input for the knowledge economy. As a default principle, the directive sets marginal costs as an upper bound for charging for PSI. This article discusses the terms under which the 2013 consultation on the implementation of the PSI Directive addresses the calculation criteria for marginal costs, which are complex to define, especially for internet-based services. We find that the consultation's allowed answers indirectly lead the respondent to reason in terms of the average incremental cost of allowing re-use, instead of the marginal cost of reproduction, provision and dissemination. Moreover, marginal-cost pricing (or zero pricing) is expected to lead to economically efficient results, while aiming at recouping the average incremental cost of allowing re-use may lead to excessive fees.
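    The distinction between the two pricing notions discussed in the abstract can be made concrete with a small, purely illustrative calculation (all figures below are invented, not taken from the article):

    ```python
    # Illustrative-only arithmetic: marginal cost of reproduction/dissemination
    # versus average incremental cost of allowing re-use. All figures invented.

    fixed_cost_reuse_infrastructure = 100_000.0  # portal, APIs, anonymisation (one-off)
    marginal_cost_per_dataset = 0.50             # bandwidth/handling per download
    downloads = 20_000

    # Marginal-cost pricing: the fee reflects only the per-copy cost.
    marginal_price = marginal_cost_per_dataset

    # Average incremental cost: fixed re-use costs are spread over downloads,
    # yielding a strictly higher fee than marginal-cost pricing.
    average_incremental_price = (
        fixed_cost_reuse_infrastructure / downloads + marginal_cost_per_dataset
    )

    print(marginal_price)             # 0.5
    print(average_incremental_price)  # 5.5
    ```

    Under these invented numbers, a respondent reasoning in average-incremental-cost terms would justify a fee an order of magnitude above the marginal cost, which is the kind of excessive charging the article warns about.
    
    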

    Collaborative Open Data versioning: a pragmatic approach using Linked Data

    Most Open Government Data initiatives are centralised and unidirectional (i.e., they release data dumps in CSV or PDF format). Hence, for non-trivial applications, reusers make copies of the government datasets to curate their local data copy. This situation is not optimal, as it leads to duplication of effort and reduces the possibility of sharing improvements. To improve the usefulness of publishing open data, several authors recommended using standard formats and data versioning. Here we focus on publishing versioned linked open data (i.e., in RDF format), because it allows one party to annotate data released independently by another party, thus reducing the need to duplicate entire datasets. After describing a pipeline to open up data from legacy databases in RDF format, we argue that RDF is suitable for implementing a scalable feedback channel, and we investigate what steps are needed to implement a distributed RDF versioning system in production.
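    The annotation idea the abstract describes can be sketched in a deliberately simplified form with plain triples (all URIs and labels below are hypothetical; a real implementation would use an RDF library such as rdflib):

    ```python
    # Sketch: instead of copying a government dataset, a reuser publishes only
    # its own triples, keyed to the publisher's URIs. Triples are modelled here
    # as (subject, predicate, object) tuples; all identifiers are invented.

    # Upstream dataset, released by the agency (never modified or copied).
    government = {
        ("http://data.example.gov/record/42", "rdfs:label", "Turin Cty Hall"),  # note the typo
    }

    # Reuser's annotation graph: a correction keyed to the same subject URI.
    annotations = {
        ("http://data.example.gov/record/42", "ex:correctedLabel", "Turin City Hall"),
    }

    # Consumers merge the two graphs at query time; no dataset duplication needed.
    merged = government | annotations

    def statements_about(graph, subject):
        """Collect all (predicate, object) pairs asserted about a subject."""
        return {(p, o) for s, p, o in graph if s == subject}

    print(statements_about(merged, "http://data.example.gov/record/42"))
    ```

    Because the annotation set references the publisher's URIs rather than copying its data, each new upstream release can be merged with the same annotations, which is the duplication-avoiding property the paper attributes to RDF.
    
    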

    An Exploratory Empirical Assessment of Italian Open Government Data Quality, with an Eye to Enabling Linked Open Data

    Context: The diffusion of Linked Data and Open Data in recent years has kept a very fast pace. However, evidence from practitioners shows that disclosing data without proper quality control may jeopardize dataset reuse in terms of apps, linking, and other transformations. Objective: Our goals are to understand the practical problems experienced by open data users in using and integrating such data, and to build a set of concrete metrics to assess the quality of disclosed data and better support the transition towards linked open data. Method: We focus on Open Government Data (OGD), collecting problems experienced by developers and mapping them to a data quality model available in the literature. We then derived a set of metrics and applied them to evaluate a few samples of Italian OGD. Result: We present empirical evidence concerning the common quality problems experienced by open data users when using and integrating datasets. The measurement effort revealed a few acquired good practices and common weaknesses, and a set of discriminant factors among datasets. Conclusion: The study represents the first empirical attempt to evaluate the quality of open datasets at an operational level. Our long-term goal is to support the transition towards Linked Open Government Data (LOGD) with a quality improvement process in the wake of current practices in Software Quality.

    Open Data Quality Measurement Framework: Definition and Application to Open Government Data

    The diffusion of Open Government Data (OGD) in recent years has kept a very fast pace. However, evidence from practitioners shows that disclosing data without proper quality control may jeopardize dataset reuse and negatively affect civic participation. Current approaches to the problem in the literature lack a comprehensive theoretical framework. Moreover, most of the evaluations concentrate on open data platforms rather than on datasets. In this work, we address these two limitations and set up a framework of indicators to measure the quality of Open Government Data along a series of data quality dimensions, at the most granular level of measurement. We validated the evaluation framework by applying it to compare two cases of Italian OGD datasets: an internationally recognized good example of OGD, with centralized disclosure and extensive data quality controls, and samples of OGD from decentralized data disclosure (at the municipality level), with no possibility of extensive quality controls as in the former case, hence with supposedly lower quality. Starting from measurements based on the quality framework, we were able to verify the difference in quality: the measures showed a few common acquired good practices and weaknesses, and a set of discriminating factors that pertain to the type of datasets and the overall approach. On the basis of this evaluation, we also provide technical and policy guidelines to overcome the weaknesses observed in the decentralized release policy, addressing specific quality aspects.

    Open Government Data: A Focus on Key Economic and Organizational Drivers

    Grounding the analysis in multidisciplinary literature on the topic, the existing EU legislation and relevant examples, this working paper aims at highlighting some key economic and organizational aspects of the "Open Government Data" paradigm and its drivers and implications within and outside Public Administrations. The discussion adopts an "Internet Science" perspective, taking into account as enabling factors the digital environment itself, as well as specific models and tools. More "traditional" and mature markets grounded in Public Sector Information are also considered, in order to indirectly detect the main differences with respect to the aforementioned paradigm.

    Privacy evaluation: what empirical research on users’ valuation of personal data tells us

    The EU General Data Protection Regulation is supposed to introduce several innovations, including the right to data portability for data subjects. In this article, we review recent literature documenting experiments that assess users' valuation of personal data, with the purpose of providing policy-oriented remarks. In particular, we take into account contextual aspects, conflicts between declared and revealed preferences, and the suggestion that personal data is conceivable not as a single good but as a bundle, also discussing potential shortcomings and pitfalls in the surveyed experiments. Data portability is supposed to increase consumer empowerment; still, several technological preconditions need to apply to make this right actually enforceable.

    NSC104970

    According to EU Directive 2003/98, public sector bodies can currently charge the cost of collection, production, reproduction and dissemination, together with a reasonable return on investment. If the upper limit for charging were lowered to the marginal costs of reproduction and dissemination of documents, with the possibility of a limited number of exhaustively spelled-out exceptions, what could these exceptions be? And who would decide in practice on the exceptions: Member States or local public sector bodies? Accordingly, the analysis presented here focuses specifically on a hypothetical regime which provides that charging is subject to an upper limit, identified with the marginal costs of reproduction and dissemination of documents, and which admits that this default may be overridden by specific exceptions. The underlying assumption is that the current rules concerning charges are amended, so that recovering the costs of "collection" and "production" of the documents, as well as "a reasonable return on investment" made in view of collection, production, reproduction and dissemination, through charges made by public sector bodies would in the future be admitted only in specific, exceptional cases. While the present discussion mainly deals with identifying the various options available under the new regime as far as exceptions are concerned, and with the governance level at which decisions on these exceptions should be taken, the scrutiny extends to the rationale itself of this hypothetical new regime, to the extent necessary to clarify the available options.