
    A multi-decade record of high quality fCO2 data in version 3 of the Surface Ocean CO2 Atlas (SOCAT)

    The Surface Ocean CO2 Atlas (SOCAT) is a synthesis of quality-controlled fCO2 (fugacity of carbon dioxide) values for the global surface oceans and coastal seas with regular updates. Version 3 of SOCAT has 14.7 million fCO2 values from 3646 data sets covering the years 1957 to 2014. This latest version has an additional 4.6 million fCO2 values relative to version 2 and extends the record from 2011 to 2014. Version 3 also significantly increases the data availability for 2005 to 2013. SOCAT has an average of approximately 1.2 million surface water fCO2 values per year for the years 2006 to 2012. Quality and documentation of the data have improved. A new feature is the data set quality control (QC) flag of E for data from alternative sensors and platforms. The accuracy of surface water fCO2 has been defined for all data set QC flags. Automated range checking has been carried out for all data sets during their upload into SOCAT. The upgrade of the interactive Data Set Viewer (previously known as the Cruise Data Viewer) allows better interrogation of the SOCAT data collection and rapid creation of high-quality figures for scientific presentations. Automated data upload has been launched for version 4 and will enable more frequent SOCAT releases in the future. High-profile scientific applications of SOCAT include quantification of the ocean sink for atmospheric carbon dioxide and its long-term variation, detection of ocean acidification, as well as evaluation of coupled-climate and ocean-only biogeochemical models. Users of SOCAT data products are urged to acknowledge the contribution of data providers, as stated in the SOCAT Fair Data Use Statement. This ESSD (Earth System Science Data) “living data” publication documents the methods and data sets used for the assembly of this new version of the SOCAT data collection and compares these with those used for earlier versions of the data collection (Pfeil et al., 2013; Sabine et al., 2013; Bakker et al., 2014).
Individual data set files, included in the synthesis product, can be downloaded here: doi:10.1594/PANGAEA.849770. The gridded products are available here: doi:10.3334/CDIAC/OTG.SOCAT_V3_GRID.
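The abstract mentions automated range checking applied to every data set on upload. A minimal sketch of such a check is shown below; the plausible-range limits used here are illustrative assumptions for the example, not the thresholds SOCAT actually applies.

```python
# Illustrative range check for surface-water fCO2 values.
# The limits are assumed for this sketch, not SOCAT's real criteria.
PLAUSIBLE_FCO2_RANGE = (0.0, 1000.0)  # ”atm; assumed plausible window

def range_check(fco2_values, limits=PLAUSIBLE_FCO2_RANGE):
    """Split fCO2 values into accepted values and flagged outliers
    that fall outside the plausible range."""
    lo, hi = limits
    accepted, flagged = [], []
    for v in fco2_values:
        (accepted if lo <= v <= hi else flagged).append(v)
    return accepted, flagged

accepted, flagged = range_check([310.2, 385.7, -12.0, 2045.9, 402.3])
# accepted → [310.2, 385.7, 402.3]; flagged → [-12.0, 2045.9]
```

In a real upload pipeline the flagged values would be reported back to the data provider rather than silently dropped.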

    Ocean data product integration through innovation-the next level of data interoperability

    In the next decade the pressures on ocean systems and the communities that rely on them will increase along with impacts from the multiple stressors of climate change and human activities. Our ability to manage and sustain our oceans will depend on the data we collect and the information and knowledge derived from it. Much of the uptake of this knowledge will be outside the ocean domain, for example by policy makers, local Governments, custodians, and other organizations, so it is imperative that we democratize or open the access and use of ocean data. This paper looks at how technologies, scoped by standards, best practice and communities of practice, can be deployed to change the way that ocean data is accessed, utilized, augmented and transformed into information and knowledge. The current portal-download model which requires the user to know what data exists, where it is stored, in what format and with what processing, limits the uptake and use of ocean data. Using examples from a range of disciplines, a web services model of data and information flows is presented. A framework is described, including the systems, processes and human components, which delivers a radical rethink about the delivery of knowledge from ocean data. A series of statements describe parts of the future vision along with recommendations about how this may be achieved. The paper recommends the development of virtual test-beds for end-to-end development of new data workflows and knowledge pathways. This supports the continued development, rationalization and uptake of standards, creates a platform around which a community of practice can be developed, promotes cross discipline engagement from ocean science through to ocean policy, allows for the commercial sector, including the informatics sector, to partner in delivering outcomes and provides a focus to leverage long term sustained funding. The next 10 years will be “make or break” for many ocean systems. 
The decadal challenge is to develop the governance and co-operative mechanisms to harness emerging information technology to deliver on the goal of generating the information and knowledge required to sustain oceans into the future.
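The contrast the paper draws between the portal-download model and a web-services model can be sketched as follows. The "service" here is an in-process stand-in for a remote API, and all names, fields, and parameters are illustrative, not a real ocean-data service.

```python
# Toy data set standing in for a remote ocean-observation archive.
OBSERVATIONS = [
    {"lat": -42.1, "lon": 147.3, "sst": 14.2},
    {"lat": -41.8, "lon": 146.9, "sst": 13.9},
    {"lat": 10.5, "lon": -45.0, "sst": 27.4},
]

def portal_download():
    """Portal-download model: the user must fetch the whole data set
    (and must already know where it is and how it is formatted),
    then filter it locally."""
    return list(OBSERVATIONS)

def subset_service(min_lat, max_lat):
    """Web-services model: the request carries the query, and only
    the matching subset crosses the wire."""
    return [o for o in OBSERVATIONS if min_lat <= o["lat"] <= max_lat]

tasman_obs = subset_service(-45.0, -40.0)  # 2 of the 3 records
```

The point of the second function is that discovery, subsetting, and format handling move to the service side, which is what lowers the barrier for users outside the ocean domain.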

    Best practice data standards for discrete chemical oceanographic observations

    © The Author(s), 2022. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Jiang, L.-Q., Pierrot, D., Wanninkhof, R., Feely, R. A., Tilbrook, B., Alin, S., Barbero, L., Byrne, R. H., Carter, B. R., Dickson, A. G., Gattuso, J.-P., Greeley, D., Hoppema, M., Humphreys, M. P., Karstensen, J., Lange, N., Lauvset, S. K., Lewis, E. R., Olsen, A., PĂ©rez, F. F., Sabine, C., Sharp, J. D., Tanhua, T., Trull, T. W., Velo, A., Allegra, A. J., Barker, P., Burger, E., Cai, W.-J., Chen, C.-T. A., Cross, J., Garcia, H., Hernandez-Ayon, J. M., Hu, X., Kozyr, A., Langdon, C., Lee, K., Salisbury, J., Wang, Z. A., & Xue, L. Best practice data standards for discrete chemical oceanographic observations. Frontiers in Marine Science, 8, (2022): 705638, https://doi.org/10.3389/fmars.2021.705638.

    Effective data management plays a key role in oceanographic research as cruise-based data, collected from different laboratories and expeditions, are commonly compiled to investigate regional to global oceanographic processes. Here we describe new and updated best practice data standards for discrete chemical oceanographic observations, specifically those dealing with column header abbreviations, quality control flags, missing value indicators, and standardized calculation of certain properties. These data standards have been developed with the goals of improving the current practices of the scientific community and promoting their international usage. These guidelines are intended to standardize data files for data sharing and submission into permanent archives. They will facilitate future quality control and synthesis efforts and lead to better data interpretation. In turn, this will promote research in ocean biogeochemistry, such as studies of carbon cycling and ocean acidification, on regional to global scales. These best practice standards are not mandatory.
Agencies, institutes, universities, or research vessels can continue using different data standards if it is important for them to maintain historical consistency. However, it is hoped that they will be adopted as widely as possible to facilitate consistency and to achieve the goals stated above.

Funding for L-QJ and AK was from NOAA Ocean Acidification Program (OAP, Project ID: 21047) and NOAA National Centers for Environmental Information (NCEI) through NOAA grant NA19NES4320002 [Cooperative Institute for Satellite Earth System Studies (CISESS)] at the University of Maryland/ESSIC. BT was in part supported by Australia’s Integrated Marine Observing System (IMOS), enabled through the National Collaborative Research Infrastructure Strategy (NCRIS). AD was supported in part by the United States National Science Foundation. AV and FP were supported by BOCATS2 Project (PID2019-104279GB-C21/AEI/10.13039/501100011033) funded by the Spanish Research Agency and contributing to WATER:iOS CSIC interdisciplinary thematic platform. MH was partly funded by the European Union’s Horizon 2020 Research and Innovation Program under grant agreement N°821001 (SO-CHIC).
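The abstract names quality-control flags and missing-value indicators as two of the standardized elements. A minimal sketch, assuming WOCE-style flag values (2 acceptable, 3 questionable, 4 bad, 9 missing) and -999 as the missing-value indicator; the column name and data values are made up for illustration, and the paper's own tables give the recommended header abbreviations.

```python
# WOCE-style QC flags and missing-value handling, sketched for a
# discrete bottle-data column. Flag meanings and the -999 indicator
# are assumptions for this example; check the paper for the
# recommended conventions.
MISSING_VALUE = -999
QC_FLAGS = {2: "acceptable", 3: "questionable", 4: "bad", 9: "missing"}

def qc_filter(values, flags, keep=(2,)):
    """Return only measurements whose QC flag is in `keep`,
    also dropping missing-value placeholders."""
    return [v for v, f in zip(values, flags)
            if f in keep and v != MISSING_VALUE]

dic = [2050.1, 2061.7, -999, 2049.3]  # e.g. DIC in ”mol/kg (made-up values)
flg = [2, 3, 9, 2]
good = qc_filter(dic, flg)  # → [2050.1, 2049.3]
```

Standardizing the flag vocabulary in this way is what lets files from different laboratories be merged without per-cruise special cases.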

    Ocean FAIR Data Services

    Well-founded data management systems are of vital importance for ocean observing systems as they ensure that essential data are not only collected but also retained and made accessible for analysis and application by current and future users. Effective data management requires collaboration across activities including observations, metadata and data assembly, quality assurance and control (QA/QC), and data publication that enables local and interoperable discovery and access and secures archiving that guarantees long-term preservation. To achieve this, data should be findable, accessible, interoperable, and reusable (FAIR). Here, we outline how these principles apply to ocean data and illustrate them with a few examples. In recent decades, ocean data managers, in close collaboration with international organizations, have played an active role in the improvement of environmental data standardization, accessibility, and interoperability through different projects, enhancing access to observation data at all stages of the data life cycle and fostering the development of integrated services targeted to research, regulatory, and operational users. As ocean observing systems evolve and an increasing number of autonomous platforms and sensors are deployed, the volume and variety of data increase dramatically. For instance, there are more than 70 data catalogs that contain metadata records for the polar oceans, a situation that makes comprehensive data discovery beyond the capacity of most researchers. To better serve research, operational, and commercial users, more efficient turnaround of quality data in known formats and made available through Web services is necessary. In particular, automation of data workflows will be critical to reduce friction throughout the data value chain. 
Adhering to the FAIR principles with free, timely, and unrestricted access to ocean observation data is beneficial for the originators, has obvious benefits for users, and is an essential foundation for the development of new services made possible with big data technologies.
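The FAIR principles the paper applies to ocean data can be made concrete with a small metadata check. This is an illustrative sketch, not a formal FAIR assessment; the required field names are assumptions chosen for the example.

```python
# Toy FAIR-oriented completeness check on a data set metadata record.
# The mapping of fields to principles is illustrative only.
REQUIRED = {
    "findable": ["identifier", "title"],
    "accessible": ["access_url"],
    "interoperable": ["format"],
    "reusable": ["license"],
}

def fair_gaps(record):
    """Return the FAIR principles whose required fields are absent
    or empty in the metadata record."""
    return [principle for principle, fields in REQUIRED.items()
            if any(not record.get(f) for f in fields)]

record = {"identifier": "doi:10.xxxx/example",  # hypothetical DOI
          "title": "CTD casts, example cruise",
          "access_url": "https://example.org/data",
          "format": "netCDF"}
gaps = fair_gaps(record)  # → ["reusable"], since no license is given
```

Automating checks like this at submission time is one way the "automation of data workflows" the abstract calls for can reduce friction in the data value chain.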

    Evolving and sustaining ocean best practices and standards for the next decade

    The oceans play a key role in global issues such as climate change, food security, and human health. Given their vast dimensions and internal complexity, efficient monitoring and predicting of the planet’s ocean must be a collaborative effort of both regional and global scale. A first and foremost requirement for such collaborative ocean observing is the need to follow well-defined and reproducible methods across activities: from strategies for structuring observing systems, sensor deployment and usage, and the generation of data and information products, to ethical and governance aspects when executing ocean observing. To meet the urgent, planet-wide challenges we face, methods across all aspects of ocean observing should be broadly adopted by the ocean community and, where appropriate, should evolve into “Ocean Best Practices.” While many groups have created best practices, they are scattered across the Web or buried in local repositories and many have yet to be digitized. To reduce this fragmentation, we introduce a new open access, permanent, digital repository of best practices documentation (oceanbestpractices.org) that is part of the Ocean Best Practices System (OBPS). The new OBPS provides an opportunity space for the centralized and coordinated improvement of ocean observing methods. The OBPS repository employs user-friendly software to significantly improve discovery and access to methods. The software includes advanced semantic technologies for search capabilities to enhance repository operations. In addition to the repository, the OBPS also includes a peer reviewed journal research topic, a forum for community discussion and a training activity for use of best practices. Together, these components serve to realize a core objective of the OBPS, which is to enable the ocean community to create superior methods for every activity in ocean observing from research to operations to applications that are agreed upon and broadly adopted across communities. 
Using selected ocean observing examples, we show how the OBPS supports this objective. This paper lays out a future vision of ocean best practices and how OBPS will contribute to improving ocean observing in the decade to come.
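The discovery problem OBPS addresses, scattered methods that users cannot find, can be illustrated with a toy keyword search over a document collection. The real OBPS repository uses advanced semantic search technologies; this plain term match is only a stand-in, and the documents and keywords below are invented.

```python
# Toy keyword search over a best-practices collection (illustrative
# stand-in for OBPS's semantic search; documents are made up).
DOCS = [
    {"title": "Glider deployment checklist",
     "keywords": {"glider", "deployment", "observing"}},
    {"title": "Nutrient analysis SOP",
     "keywords": {"nutrients", "laboratory", "analysis"}},
]

def search(term):
    """Return titles of documents tagged with the given keyword."""
    return [d["title"] for d in DOCS if term in d["keywords"]]

hits = search("glider")  # → ["Glider deployment checklist"]
```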

    Do people living with HIV experience greater age advancement than their HIV-negative counterparts?

    Objectives: Despite successful antiretroviral therapy, people living with HIV (PLWH) may show signs of premature/accentuated aging. We compared established biomarkers of aging in PLWH, appropriately chosen HIV-negative individuals, and blood donors, and explored factors associated with biological age advancement. Design: Cross-sectional analysis of 134 PLWH on suppressive antiretroviral therapy, 79 lifestyle-comparable HIV-negative controls aged 45 years or older from the Co-morBidity in Relation to AIDS (COBRA) cohort, and 35 age-matched blood donors. Methods: Biological age was estimated using a validated algorithm based on 10 biomarkers. Associations between ‘age advancement’ (biological minus chronological age) and HIV status/parameters, lifestyle, cytomegalovirus (CMV), hepatitis B (HBV) and hepatitis C virus (HCV) infections were investigated using linear regression. Results: The average (95% CI) age advancement was greater in both HIV-positive [13.2 (11.6–14.9) years] and HIV-negative [5.5 (3.8–7.2) years] COBRA participants compared with blood donors [−7.0 (−9.9 to −4.1) years; both P’s < 0.001], and also in HIV-positive compared with HIV-negative participants (P < 0.001). Chronic HBV, higher anti-CMV IgG titer and CD8+ T-cell count were each associated with increased age advancement, independently of HIV status/group. Among HIV-positive participants, age advancement was increased by 3.5 (0.1–6.8) years among those with nadir CD4+ T-cell count less than 200 cells/”l and by 0.1 (0.06–0.2) years for each additional month of exposure to saquinavir.
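The study's core quantity, age advancement, is simply biological age minus chronological age. The sketch below shows that computation; the validated 10-biomarker estimator of biological age is not reproduced here, and the ages used are made-up illustrative values.

```python
# Age advancement as defined in the study: biological minus
# chronological age. The biological ages below are invented for
# illustration; the real estimate comes from a 10-biomarker algorithm.
def age_advancement(biological_age, chronological_age):
    """Positive values mean the biomarkers suggest an older biological
    state than the calendar age; negative values a younger one."""
    return biological_age - chronological_age

pairs = [(65.0, 52.0), (50.0, 55.0)]  # (biological, chronological)
advancements = [age_advancement(b, c) for b, c in pairs]  # [13.0, -5.0]
mean_advancement = sum(advancements) / len(advancements)  # 4.0 years
```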