
    OGC GeoScience DWG: a forum to enhance and organize geoscience data standardization

    International audience

    Practices, Challenges, and Prospects of Big Data Curation: a Case Study in Geoscience

    Open and persistent access to past, present, and future scientific data is fundamental for transparent and reproducible data-driven research. The scientific community now faces both challenges and opportunities arising from increasingly complex disciplinary data systems. Concerted efforts from domain experts, information professionals, and Internet technology experts are essential to ensure the accessibility and interoperability of big data. Here we review current practices in building and managing big data within the context of large data infrastructure, using geoscience cyberinfrastructure such as the Interdisciplinary Earth Data Alliance (IEDA) and EarthCube as a case study. Geoscience is a data-rich discipline with a rapidly expanding body of sophisticated and diverse digital data sets. Having begun to embrace the digital age, the community has applied big data and data-mining tools to new types of research. We also identify current challenges, key elements, and prospects for constructing a more robust and future-proof big data infrastructure for research and publication, as well as the roles, qualifications, and opportunities for librarians and information professionals in the data era.

    SwissEnvEO: A FAIR National Environmental Data Repository for Earth Observation Open Science

    Environmental scientific research is increasingly data-driven and dependent on high-performance computing infrastructures to process ever-larger volumes of diverse data sets. Consequently, there is growing recognition of the need to share data, methods, algorithms, and infrastructure to make scientific research more effective, efficient, open, transparent, reproducible, accessible, and usable by different users. However, Earth Observation (EO) Open Science is still undervalued, and several challenges remain before the vision of transforming EO data into actionable knowledge can be achieved by lowering the entry barrier to large-scale Big Earth Data analysis and derived information products. Currently, FAIR-compliant digital repositories cannot fully satisfy the needs of EO users, while Spatial Data Infrastructures (SDI) are not fully FAIR-compliant and have difficulty handling Big Earth Data. In response to these issues, and to strengthen Open and Reproducible EO science, this paper presents SwissEnvEO, a Spatial Data Infrastructure complemented with digital repository capabilities to facilitate the publication of ready-to-use information products, at national scale, derived from satellite EO data available in an EO Data Cube, in full compliance with FAIR principles.
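
    A minimal sketch of the workflow described above, assuming an EO Data Cube exported locally as NetCDF and processed with xarray; the file names, the variable name ("ndvi"), and the metadata fields are illustrative assumptions, not SwissEnvEO's actual interfaces.

    import xarray as xr

    # Hypothetical national EO data cube export; path and variable name are assumptions.
    cube = xr.open_dataset("swiss_eo_cube.nc")

    # Aggregate per-scene NDVI to an annual mean composite, one example of a
    # "ready to use" information product derived from the cube.
    annual_ndvi = cube["ndvi"].groupby("time.year").mean("time")

    # Attach minimal descriptive metadata before publication to a repository,
    # in the spirit of the FAIR principles (fields are illustrative).
    annual_ndvi.attrs.update({
        "title": "Annual mean NDVI composite (illustrative)",
        "license": "CC-BY-4.0",
        "source": "national EO Data Cube (hypothetical export)",
    })
    annual_ndvi.to_netcdf("annual_ndvi_composite.nc")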

    Exploring digital preservation requirements: a case study from the National Geoscience Data Centre (NGDC)

    Purpose: This case study is based on MSc dissertation research undertaken at Northumbria University. The aim was to explore digital preservation requirements within the wider NGDC organisational framework, in preparation for developing a preservation policy and integrating associated preservation workflows throughout the existing research data management processes. Design/methodology/approach: This mixed-methods case study used quantitative and qualitative data to explore the preservation requirements, with triangulation to strengthen the design validity. Corporate and wider scientific priorities were identified through the literature and a stakeholder survey. Organisational preparedness was investigated through staff interviews. Findings: Stakeholders expect data to be reliable, reusable, and available in preferred formats. To ensure digital continuity, the creation of high-quality metadata is critical, and data depositors need data management training to achieve this. Recommendations include completing a risk assessment, creating a digital asset register, and establishing a technology watch to mitigate risks. Research limitations/implications: The main constraint in this study is the lack of generalisability of the results. As the NGDC is a unique organisation, it may not be possible to generalise the organisational findings, although those relating to research data management may be transferable. Originality/value: This research examines the specific nature of geoscience data retention requirements and looks at existing NGDC procedures in terms of enhancing digital continuity, providing new knowledge on the preservation requirements for a number of national datasets.

    Geospatial Standards and the Knowledge Generation Lifecycle

    Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations, and analysis. This paper provides an introduction to the field of informatics and to the knowledge generation lifecycle in the context of the geosciences. In addition, we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.

    Report of the user requirements and web based access for eResearch workshops

    The User Requirements and Web Based Access for eResearch Workshop, organized jointly by NeSC and NCeSS, was held on 19 May 2006. The aim was to identify lessons learned from e-Science projects that would contribute to our capacity to make Grid infrastructures and tools usable and accessible for diverse user communities. Its focus was on providing an opportunity for a pragmatic discussion between e-Science end users and tool builders in order to understand usability challenges, technological options, community-specific content and needs, and methodologies for design and development. We invited members of six UK e-Science projects and one US project, trying as far as possible to pair a user and a developer from each project in order to discuss their contrasting perspectives and experiences. Three breakout group sessions covered the topics of user-developer relations, commodification, and functionality. There was also extensive post-meeting discussion, which is summarized here. Additional information on the workshop, including the agenda, participant list, and talk slides, can be found online at http://www.nesc.ac.uk/esi/events/685/. Reference: NeSC report UKeS-2006-07, available from http://www.nesc.ac.uk/technical_papers/UKeS-2006-07.pd

    A service-oriented middleware for integrated management of crowdsourced and sensor data streams in disaster management

    The increasing number of sensors used in diverse applications produces massive amounts of continuous, unbounded, rapid data and requires the management of distinct protocols, interfaces, and intermittent connections. As traditional sensor networks are error-prone and difficult to maintain, the study highlights the emerging role of “citizens as sensors” as a complementary data source to increase public awareness. To this end, an interoperable, reusable middleware for managing spatial, temporal, and thematic data using Sensor Web Enablement initiative services and a processing engine was designed, implemented, and deployed. The study found that this approach provides effective sensor data-stream access, publication, and filtering in dynamic scenarios such as disaster management, and that it enables the integration of batch and stream management. Interoperability and analytics testing of a flood citizen observatory also showed that even variable data, such as those provided by the crowd, can be integrated with sensor data streams. Our approach thus offers a means to improve near-real-time applications.
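
    As a rough sketch of the stream-integration idea described above, the example below merges a time-ordered sensor stream with a time-ordered stream of citizen reports and applies a simple spatial and thematic filter; the Observation fields, bounding box, and phenomenon name are assumptions made for illustration, not the paper's middleware API.

    from dataclasses import dataclass
    from typing import Iterable, Iterator
    import heapq

    @dataclass
    class Observation:
        timestamp: float   # seconds since epoch
        lat: float
        lon: float
        phenomenon: str    # e.g. "water_level"
        value: float
        source: str        # "sensor" or "citizen"

    def merge_streams(*streams: Iterable[Observation]) -> Iterator[Observation]:
        """Merge already time-ordered streams into a single time-ordered stream."""
        return heapq.merge(*streams, key=lambda o: o.timestamp)

    def spatial_thematic_filter(observations: Iterable[Observation],
                                bbox: tuple,
                                phenomenon: str) -> Iterator[Observation]:
        """Keep observations of one phenomenon that fall inside a bounding box."""
        min_lat, min_lon, max_lat, max_lon = bbox
        for obs in observations:
            if (obs.phenomenon == phenomenon
                    and min_lat <= obs.lat <= max_lat
                    and min_lon <= obs.lon <= max_lon):
                yield obs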

    Open data and interoperability standards: opportunities for animal welfare in extensive livestock systems

    Extensive livestock farming constitutes a sizeable portion of agriculture, not only in relation to land use but also in its contribution to feeding a growing human population. In addition to meat, it supplies other economically valuable commodities such as wool and hides. The livestock industries are adopting technologies under the banner of Precision Livestock Farming (PLF) to help meet higher production and efficiency targets and to manage the multiple challenges impacting the industries, such as climate change, environmental concerns, globalisation of markets, increasing rules of governance, and societal scrutiny, especially in relation to animal welfare. PLF is particularly dependent on the acquisition and management of data and metadata and on the interoperability standards that allow data discovery and federation. A review of interoperability standards and PLF adoption in extensive livestock farming systems identified a lack of domain-specific standards and raised questions about the amount and quality of public data that has the potential to inform livestock farming. A systematic review of public datasets was then developed, including an assessment based on the principles that data must be findable, accessible, interoperable, and reusable (FAIR). Custom software scripts were used to conduct a dataset search to determine the quantity and quality of domain-specific datasets; the search yielded 419 unique Australian datasets directly related to extensive livestock farming. A FAIR assessment of these datasets using a set of non-domain-specific, general metrics showed a moderate level of compliance. The results suggest that domain-specific FAIR metrics may need to be developed to provide a more accurate data quality assessment, but also that the level of interoperability and reusability is not particularly high, which has implications if public data is to be included in decision support tools. To test the usefulness of available public datasets in informing decision support in relation to livestock welfare, a case study was designed in which farm animal welfare elements were extracted from Australian welfare standards to guide a dataset search. It was found that, with few exceptions, these elements could be supported with public data, although there were gaps in temporal and spatial coverage. The development of a geospatial animal welfare portal including these datasets further explored and confirmed the potential for using public data to enhance livestock welfare. Doctor of Philosophy
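
    To make the FAIR-assessment step concrete, here is a minimal sketch that scores a harvested dataset record against four generic, non-domain-specific checks; the record fields and the pass/fail criteria are illustrative assumptions, not the thesis's actual scripts or metrics.

    def fair_score(record: dict) -> dict:
        """Score one harvested dataset record against generic FAIR checks."""
        checks = {
            # Findable: a persistent identifier and keywords are recorded
            "findable": bool(record.get("identifier")) and bool(record.get("keywords")),
            # Accessible: a landing page or download URL is recorded
            "accessible": bool(record.get("landing_page") or record.get("download_url")),
            # Interoperable: data are offered in an open, machine-readable format
            "interoperable": record.get("format", "").lower() in {"csv", "json", "geojson", "netcdf"},
            # Reusable: an explicit licence is attached
            "reusable": bool(record.get("license")),
        }
        checks["score"] = sum(checks.values()) / 4
        return checks

    # Hypothetical record, as might be harvested from a public data catalogue.
    example_record = {
        "identifier": "example-dataset-id",
        "keywords": ["livestock", "welfare"],
        "landing_page": "https://example.org/dataset",
        "format": "CSV",
        "license": "CC-BY-4.0",
    }
    print(fair_score(example_record))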

    2011 Strategic roadmap for Australian research infrastructure

    The 2011 Roadmap articulates the priority research infrastructure areas of national scale (capability areas) needed to further develop Australia’s research capacity and improve innovation and research outcomes over the next five to ten years. The capability areas have been identified through considered analysis of input provided by stakeholders, in conjunction with specialist advice from Expert Working Groups. It is intended that the Strategic Framework will provide a high-level policy framework, including principles to guide the development of policy advice and the design of programs related to the funding of research infrastructure by the Australian Government. Roadmapping has been identified in the Strategic Framework Discussion Paper as the most appropriate prioritisation mechanism for national, collaborative research infrastructure. The strategic identification of capability areas through a consultative roadmapping process was also validated in the report of the 2010 NCRIS Evaluation. The 2011 Roadmap is primarily concerned with medium- to large-scale research infrastructure; however, any landmark infrastructure requirements (typically involving an investment in excess of $100 million over five years from the Australian Government) identified in this process will be noted. NRIC has also developed a ‘Process to identify and prioritise Australian Government landmark research infrastructure investments’, which is currently under consideration by the government as part of broader deliberations relating to research infrastructure. NRIC will have strategic oversight of the development of the 2011 Roadmap as part of its overall policy view of research infrastructure.