
    SeaDataNet – Pan-European infrastructure for marine and ocean data management: unified access to distributed data sets

    Data availability is of vital importance for marine and oceanographic research, but most European data are fragmented, not always validated and not easily accessible. In the countries bordering the European seas, more than 1000 scientific laboratories from governmental organisations and private industry collect data using various sensors on board research vessels, submarines, fixed and drifting platforms, aeroplanes and satellites, measuring physical, geophysical, geological, biological and chemical parameters, as well as biological species and other variables. SeaDataNet is an Integrated Research Infrastructure Initiative (I3) (2006–2011) in the EU FP6 framework programme. It is developing an efficient, distributed pan-European marine data management infrastructure for these large and diverse data sets. It interconnects the existing professional data centres of 35 countries that are active in data collection, and provides integrated databases of standardised quality online. This article describes the architecture and features of the SeaDataNet infrastructure; in particular, it describes how interoperability is achieved between all the contributing data centres. Finally, it highlights the on-going developments and challenges.
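
    As a conceptual illustration of the "unified access to distributed data sets" idea described above, the Python sketch below fans one standardised query out to several data centres and merges the replies into a single result list. The endpoint URLs and the query stub are hypothetical stand-ins, not the actual SeaDataNet interface.

        from concurrent.futures import ThreadPoolExecutor

        # Hypothetical endpoints; the real infrastructure interconnects the
        # professional data centres of 35 countries.
        CENTRES = ["https://centre-a.example/api", "https://centre-b.example/api"]

        def query_centre(url, params):
            """Stub for one centre's search endpoint; a real client would
            issue an HTTP request against a standardised interface here."""
            return [{"centre": url, "dataset": "CTD profiles", "query": params}]

        def unified_search(params):
            """Fan the same query out to every centre and merge the replies."""
            with ThreadPoolExecutor() as pool:
                batches = list(pool.map(lambda u: query_centre(u, params), CENTRES))
            return [record for batch in batches for record in batch]

        print(unified_search({"parameter": "TEMP", "region": "North Sea"}))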

    From SeaDataNet to SeaDataCloud: historical data collections and new data products

    Temperature and salinity historical data collections covering the period 1900-2013/2014 were created for each European marginal sea (Arctic Sea, Baltic Sea, Black Sea, North Sea, North Atlantic Ocean, and Mediterranean Sea) within the framework of the SeaDataNet2 Project; they are available as ODV collections through a web catalogue (https://www.seadatanet.org/Products/Aggregated-datasets). Two versions have been published, representing snapshots of the SeaDataNet database content at two different times: V1.1 (January 2014) and V2 (March 2015). A Quality Control Strategy (QCS) was developed and continuously refined to improve the quality of the database content and create the best data products. The QCS consists of four main phases: 1) data harvesting from the data infrastructure; 2) file and parameter aggregation; 3) secondary quality check analysis; 4) correction of data anomalies. The approach is iterative, facilitating upgrades of the database content and allowing versioning of the data products, as sketched below. Regional temperature and salinity monthly climatologies have been produced from the V1.1 historical data collections and are also available (https://www.seadatanet.org/Products/Climatologies). Within the new SeaDataCloud Project, the release of updated historical data collections and new climatologies is planned. SeaDataCloud novelties are the introduction of decadal climatologies at various resolutions, the development of climatologies for the Global Ocean, and a task dedicated to new data products, such as Mixed Layer Depth climatologies, Ocean Heat Content estimates, and coastal climatologies from HF radar data. All SeaDataCloud products are available through a dedicated web catalogue together with their respective Digital Object Identifier (DOI) and Product Information Document (PIDoc) containing all specifications about each product's generation, quality assessment and technical details to facilitate user uptake. The presentation will briefly review the existing SeaDataNet products and introduce the SeaDataCloud product plan, but the main focus will be on the first release (February 2018) of the SeaDataCloud temperature and salinity historical data collections, spanning the period 1900-2017, their characteristics in terms of space-time data distribution, and their usability.
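
    A minimal Python sketch of the four QCS phases, assuming a simple record layout. All function names, the range test and the flag value are illustrative, not the project's actual tooling.

        def harvest(infrastructure):
            """Phase 1: pull profiles from the data infrastructure (stubbed)."""
            return list(infrastructure)

        def aggregate(profiles):
            """Phase 2: aggregate files and harmonise parameters (stubbed)."""
            return sorted(profiles, key=lambda p: (p["region"], p["year"]))

        def secondary_qc(profiles):
            """Phase 3: secondary quality check, e.g. a broad range test."""
            return [p for p in profiles if not 0.0 <= p["salinity"] <= 42.0]

        def correct(profiles, anomalies):
            """Phase 4: flag anomalous records for correction or exclusion."""
            for p in anomalies:
                p["qc_flag"] = 4  # "bad value" in common oceanographic flag schemes
            return profiles

        def qcs_release(infrastructure, version):
            """One QCS iteration yields one versioned snapshot, e.g. V1.1 or V2."""
            profiles = aggregate(harvest(infrastructure))
            return {"version": version,
                    "data": correct(profiles, secondary_qc(profiles))}

        print(qcs_release([{"region": "Baltic", "year": 1950, "salinity": 55.0}],
                          "V1.1"))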

    The BASE-platform project: Deriving the bathymetry from combined satellite data

    The project »BAthymetry SErvice platform« (BASE-platform) addresses the lack of available up-to-date, high-resolution bathymetry data in many areas of the world. With the increasing number of earth observation satellites, e.g. through the ongoing deployment of ESA's Sentinel fleet, remote sensing data of the oceans are widely available. Three sources of satellite information are combined in BASE-platform: optical, synthetic aperture radar (SAR) and altimetry data. BASE-platform's ambition is to use these data to create bathymetric maps and supply them to end users via a bathymetry data portal, where data will be available off-the-shelf as well as on demand. Adequate metadata will be provided along with the bathymetry to ensure usability for the end user.
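
    The abstract does not specify the fusion algorithm, so the Python sketch below illustrates one plausible approach: an inverse-variance weighted merge of the three depth grids. The function, variable names and weighting scheme are assumptions for illustration, not the BASE-platform method.

        import numpy as np

        def fuse_depths(optical, sar, altimetry, sigma):
            """Inverse-variance weighted combination of three depth grids.
            Inputs are 2-D arrays in metres, NaN where a sensor has no
            retrieval; `sigma` maps source name to an assumed error std dev."""
            grids = {"optical": optical, "sar": sar, "altimetry": altimetry}
            num = np.zeros_like(optical, dtype=float)
            den = np.zeros_like(optical, dtype=float)
            for name, grid in grids.items():
                weight = 1.0 / sigma[name] ** 2   # trust low-error sources more
                valid = ~np.isnan(grid)
                num[valid] += weight * grid[valid]
                den[valid] += weight
            with np.errstate(invalid="ignore"):
                return num / den                  # NaN where no source has data

        opt = np.array([[10.0, np.nan]])
        sar = np.array([[12.0, 8.0]])
        alt = np.array([[np.nan, 9.0]])
        print(fuse_depths(opt, sar, alt,
                          {"optical": 0.5, "sar": 1.0, "altimetry": 2.0}))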

    SeaDataCloud Data Products for the European marginal seas and the Global Ocean

    Data products based on in situ temperature and salinity observations from the SeaDataNet infrastructure have been released within the framework of the SeaDataCloud (SDC) project. Data from different providers are integrated and harmonized through standardized quality assurance and quality control methodologies applied at various stages of the data value chain. Data ingested into SeaDataNet are first validated by the data providers, who assign corresponding quality flags; in addition, a Quality Assurance Strategy has been implemented and progressively refined to guarantee the consistency of the database content and high-quality derived products. Two versions of aggregated datasets for the European marginal seas have been published and used to compute regional high-resolution climatologies. External datasets, the World Ocean Database from NOAA and the CORA dataset from the Copernicus Marine Service in situ Thematic Assembly Center, have been integrated with the SDC data collections to maximize data coverage and minimize the mapping error. The products are available through the SDC catalogue, accompanied by Product Information Documents containing the specifications about each product's generation, characteristics and usability. Digital Object Identifiers are assigned to the products and their documentation to foster transparency of the production chain, acknowledging all actors involved, from data providers to information producers.
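
    A hedged Python sketch of the integration step described above: merging the SDC collection with the external WOD and CORA datasets while keeping only flagged-good records and preferring SDC where observations coincide. The record layout, flag values and de-duplication key are assumptions, not the project's actual procedure.

        GOOD_FLAGS = {1, 2}  # 1 = "good", 2 = "probably good" in common schemes

        def integrate(sdc, wod, cora):
            """Merge three profile collections, keeping provider-validated
            records and preferring SDC where observations coincide."""
            merged = {}
            # External sources first, so SDC records overwrite duplicates below.
            for source in (wod, cora, sdc):
                for rec in source:
                    if rec["qc_flag"] not in GOOD_FLAGS:
                        continue
                    key = (round(rec["lat"], 3), round(rec["lon"], 3), rec["time"])
                    merged[key] = rec
            return list(merged.values())

        sdc = [{"lat": 54.0, "lon": 10.0, "time": "1980-07",
                "temp": 16.1, "qc_flag": 1}]
        wod = [{"lat": 54.0, "lon": 10.0, "time": "1980-07",
                "temp": 16.3, "qc_flag": 1}]
        print(integrate(sdc, wod, []))  # the SDC record wins at the shared point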

    Targeting Cattle-Borne Zoonoses and Cattle Pathogens Using a Novel Trypanosomatid-Based Delivery System

    Trypanosomatid parasites are notorious for the human diseases they cause throughout Africa and South America. However, non-pathogenic trypanosomatids are also found worldwide, infecting a wide range of hosts. One example is Trypanosoma (Megatrypanum) theileri, a ubiquitous, globally distributed protozoan commensal of bovids. Exploiting knowledge of pathogenic trypanosomatids, we have developed Trypanosoma theileri as a novel vehicle to deliver vaccine antigens and other proteins to cattle. Conditions for the growth and transfection of T. theileri have been optimised, and expressed heterologous proteins have been targeted for secretion or for specific localisation at the cell interior or surface using trafficking signals from Trypanosoma brucei. In cattle, the engineered vehicle could establish itself in the context of a pre-existing natural T. theileri population, was maintained long-term, and generated specific immune responses to an expressed Babesia antigen at protective levels. Building on several decades of basic research into trypanosomatid pathogens, Trypanosoma theileri offers significant potential to target multiple infections, including major cattle-borne zoonoses such as Escherichia coli, Salmonella spp., Brucella abortus and Mycobacterium spp. It also has the potential to deliver therapeutics to cattle, including the lytic factor that protects humans from cattle trypanosomiasis. This could alleviate poverty by protecting indigenous African cattle from African trypanosomiasis.

    Ocean FAIR Data Services

    Well-founded data management systems are of vital importance for ocean observing systems, as they ensure that essential data are not only collected but also retained and made accessible for analysis and application by current and future users. Effective data management requires collaboration across activities including observations, metadata and data assembly, quality assurance and control (QA/QC), and data publication that enables local and interoperable discovery and access and secures archiving that guarantees long-term preservation. To achieve this, data should be findable, accessible, interoperable, and reusable (FAIR). Here, we outline how these principles apply to ocean data and illustrate them with a few examples. In recent decades, ocean data managers, in close collaboration with international organizations, have played an active role in improving environmental data standardization, accessibility, and interoperability through different projects, enhancing access to observation data at all stages of the data life cycle and fostering the development of integrated services targeted to research, regulatory, and operational users. As ocean observing systems evolve and an increasing number of autonomous platforms and sensors are deployed, the volume and variety of data increase dramatically. For instance, there are more than 70 data catalogs that contain metadata records for the polar oceans, a situation that puts comprehensive data discovery beyond the capacity of most researchers. To better serve research, operational, and commercial users, a more efficient turnaround of quality data, in known formats and made available through Web services, is necessary. In particular, automation of data workflows will be critical to reduce friction throughout the data value chain. Adhering to the FAIR principles with free, timely, and unrestricted access to ocean observation data is beneficial for the originators, has obvious benefits for users, and is an essential foundation for the development of new services made possible with big data technologies.
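
    As a small illustration of applying the FAIR principles to a metadata record, the Python sketch below checks one record against a minimal field list per facet. The field names are assumed conventions for illustration, not a prescribed ocean-data schema.

        REQUIRED = {
            "findable":      ("identifier", "title", "keywords"),
            "accessible":    ("access_url", "protocol"),
            "interoperable": ("format", "vocabulary"),
            "reusable":      ("license", "provenance"),
        }

        def fair_report(record):
            """Report which FAIR facets a metadata record satisfies."""
            return {facet: all(record.get(field) for field in fields)
                    for facet, fields in REQUIRED.items()}

        # Example record; every value here is illustrative.
        record = {"identifier": "doi:10.1234/example", "title": "CTD profiles",
                  "keywords": ["temperature", "salinity"],
                  "access_url": "https://example.org/data", "protocol": "HTTPS",
                  "format": "NetCDF", "vocabulary": "CF standard names",
                  "license": "CC-BY", "provenance": "originating data centre"}
        print(fair_report(record))  # all four facets evaluate to True here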