
    Open Data, Grey Data, and Stewardship: Universities at the Privacy Frontier

    As universities recognize the inherent value in the data they collect and hold, they encounter unforeseen challenges in stewarding those data in ways that balance accountability, transparency, and protection of privacy, academic freedom, and intellectual property. Two parallel developments in academic data collection are converging: (1) open access requirements, whereby researchers must provide access to their data as a condition of obtaining grant funding or publishing results in journals; and (2) the vast accumulation of 'grey data' about individuals in their daily activities of research, teaching, learning, services, and administration. The boundaries between research and grey data are blurring, making it more difficult to assess the risks and responsibilities associated with any data collection. Many sets of data, both research and grey, fall outside privacy regulations such as HIPAA, FERPA, and PII. Universities are exploiting these data for research, learning analytics, faculty evaluation, strategic decisions, and other sensitive matters. Commercial entities are besieging universities with requests for access to data or for partnerships to mine them. The privacy frontier facing research universities spans open access practices, uses and misuses of data, public records requests, cyber risk, and curating data for privacy protection. This paper explores the competing values inherent in data stewardship and makes recommendations for practice, drawing on the pioneering work of the University of California in privacy and information security, data governance, and cyber risk. Comment: Final published version, Sept 30, 201

    Citation and peer review of data: moving towards formal data publication

    This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we provide suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience and semantics, along with a recommended human-readable citation syntax.
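The abstract above does not reproduce the paper's recommended citation syntax, but the general shape of a human-readable dataset citation can be sketched. The field order, field names, and all example values below are illustrative assumptions, not the syntax the paper actually recommends:

```python
# Illustrative sketch only: one plausible human-readable dataset citation
# format. Field order and example values are assumptions, not the syntax
# recommended in the paper.

def cite_dataset(creators, year, title, version, publisher, identifier):
    """Assemble a dataset citation with an explicit version number and a
    persistent identifier, the two elements that distinguish data
    citations from ordinary article citations."""
    authors = "; ".join(creators)
    return (f"{authors} ({year}). {title}, version {version}. "
            f"{publisher}. {identifier}")

# Hypothetical example record (names, title, and DOI are invented).
print(cite_dataset(
    creators=["Doe, J.", "Roe, R."],
    year=2011,
    title="Example surface temperature dataset",
    version="2.0",
    publisher="Example Data Centre",
    identifier="doi:10.5285/EXAMPLE",
))
```

Including the version and a resolvable identifier is what makes the cited dataset recoverable at the granularity the analysis actually used.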

    BBSRC Data Sharing Policy

    BBSRC recognizes the importance of contributing to the growing international efforts in data sharing. BBSRC is committed to getting the best value for the funds we invest and believes that making research data more readily available will reinforce open scientific inquiry and stimulate new investigations and analyses. BBSRC supports the view that data sharing should be led by the scientific community and driven by scientific need. It should also be cost effective, and the data shared should be of the highest quality. Members of the community are expected and encouraged to practice and promote data sharing, determine standards and best practice, and create a scientific culture in which data sharing is embedded.

    Broken chocolate: biomarkers as a method for delivering cocoa supply chain visibility

    Purpose: This paper examines the potential of “biomarkers” to provide immutable identification for food products (chocolate), providing traceability and visibility in the supply chain from retail product back to farm. Design/methodology/approach: This research uses qualitative data collection, including fieldwork at cocoa farms and chocolate manufacturers in Ecuador and the Netherlands and semi-structured interviews with industry professionals to identify challenges and create a supply chain map from cocoa plant to retailer, validated by area experts. A library of biomarkers is created using DNA collected from fieldwork and the International Cocoa Quarantine Centre, holders of cocoa varieties from known locations around the world. Matching sample biomarkers with those in the library enables identification of origins of cocoa used in a product, even when it comes from multiple different sources and has been processed. Findings: Supply chain mapping and interviews identify areas of the cocoa supply chain that lack the visibility required for management to guarantee sustainability and quality. A decoupling point, where smaller farms/traders’ goods are combined to create larger economic units, obscures product origins and limits visibility. These factors underpin a potential boundary condition to institutional theory in the industry’s fatalism to environmental and human abuses in the face of rising institutional pressures. Biomarkers reliably identify product origin, including specific farms and (fermentation) processing locations, providing visibility and facilitating control and trust when purchasing cocoa. Research limitations/implications: The biomarker “meta-barcoding” of cocoa beans used in chocolate manufacturing accurately identifies the farm, production facility or cooperative, where a cocoa product came from. A controlled data set of biomarkers of registered locations is required for audit to link chocolate products to origin. 
Practical implications: Where biomarkers can be produced from organic products, they offer a method for closing visibility gaps, enabling responsible sourcing. Labels (QR codes, barcodes, etc.) can be swapped and products tampered with, but biological markers reduce reliance on physical tags, diminishing the potential for fraud. Biomarkers identify product composition, pinpointing specific farm(s) of origin for cocoa in chocolate, allowing targeted audits of suppliers and identifying if cocoa of unknown origin is present. Labour and environmental abuses exist in many supply chains, and enabling upstream visibility may help firms address these challenges. Social implications: By describing a method for firms in cocoa supply chains to scientifically track their cocoa back to the farm level, the research shows that organizations can conduct social audits for child labour and environmental abuses at specific farms proven to be in their supply chains. This provides a method for delivering supply chain visibility (SCV) for firms serious about tackling such problems. Originality/value: This paper provides one of the very first examples of biomarkers for agricultural SCV. An in-depth study of stakeholders from the cocoa and chocolate industry elucidates problematic areas in cocoa supply chains. Biomarkers provide a unique biological product identifier. Biomarkers can support efforts to address environmental and social sustainability issues such as child labour, modern slavery and deforestation by providing visibility into previously hidden areas of the supply chain.
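The core operation the abstract describes, matching a sample's biomarkers against a reference library of known-origin profiles, can be sketched in miniature. This is not the authors' meta-barcoding pipeline; the k-mer decomposition, Jaccard scoring, origin names, and sequences below are all illustrative assumptions:

```python
# Minimal sketch (not the paper's method): match a cocoa sample's DNA
# marker sequence against a library of known-origin reference sequences.
# Origins, sequences, and the k-mer/Jaccard scoring are hypothetical.

def kmers(seq, k=4):
    """Decompose a DNA sequence into its set of overlapping k-mers."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(sample_seq, library, k=4):
    """Return the library origin whose k-mer profile is most similar to
    the sample, scored by Jaccard similarity (shared / total k-mers)."""
    sample = kmers(sample_seq, k)

    def score(entry):
        ref = kmers(entry[1], k)
        return len(sample & ref) / len(sample | ref)

    origin, _ = max(library.items(), key=score)
    return origin

# Hypothetical reference library: origin -> representative marker sequence.
library = {
    "farm_A_ecuador": "ACGTACGTTGCA",
    "farm_B_ecuador": "TTGACCGGTACA",
}

print(best_match("ACGTACGTTGCC", library))  # closest to farm_A's profile
```

A real system would work with many markers per sample and a curated, audited library of registered locations, which is exactly the controlled data set the abstract says is required.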

    Do you see what I mean?

    Visualizers, like logicians, have long been concerned with meaning. Generalizing from MacEachren's overview of cartography, visualizers have to think about how people extract meaning from pictures (psychophysics), what people understand from a picture (cognition), how pictures are imbued with meaning (semiotics), and how in some cases that meaning arises within a social and/or cultural context. If we think of the communication acts carried out in the visualization process, further levels of meaning are suggested. Visualization begins when someone has data that they wish to explore and interpret; the data are encoded as input to a visualization system, which may in its turn interact with other systems to produce a representation. This is communicated back to the user(s), who have to assess this against their goals and knowledge, possibly leading to further cycles of activity. Each phase of this process involves communication between two parties. For this to succeed, those parties must share a common language with an agreed meaning. We offer the following three steps, in increasing order of formality: terminology (jargon), taxonomy (vocabulary), and ontology. Our argument in this article is that it's time to begin synthesizing the fragments and views into a level 3 model, an ontology of visualization. We also address why this should happen, what is already in place, how such an ontology might be constructed, and why now.