
    Turbulence and mixing by internal waves in the Celtic Sea determined from ocean glider microstructure measurements

    We present a new series of data from a 9-day deployment of an ocean microstructure glider (OMG) in the Celtic Sea during the summer of 2012. The OMG has been specially adapted to measure shear microstructure and coincident density structure, from which we derive the dissipation rate of turbulent kinetic energy (ε) and diapycnal diffusion rates (K). The methods employed to provide trustworthy turbulent parameters are described, and data from 766 profiles of ε, temperature, salinity and density structure are presented. Surface and bottom boundary layers are, as expected, controlled by wind and tidal forcing. Interior dynamics are dominated by a highly variable internal wave field with peak vertical displacements in excess of 50 m, equivalent to over a third of the water depth. Following a relatively quiescent period, internal wave energy, represented by the available potential energy (APE), increases dramatically close to the spring tide flow. Rather than following the assumed spring-neap cycle, however, APE is divided into two distinct peak periods lasting only one or two days. Pycnocline ε also increases close to the spring tide period and, like APE, is distinguishable as two distinct energetic periods; the timing of these periods, however, is not consistent with APE. Pycnocline mixing associated with the observed ε is shown to be responsible for the majority of the observed reduction in bottom boundary layer density, suggesting that diapycnal exchange is a key mechanism in controlling or limiting exchange between the continental shelf and the deep ocean. Results confirm pycnocline turbulence to be highly variable and difficult to predict; however, its log-normal distribution suggests that the natural variability could be reproduced if the mean state can be accurately simulated.
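    A common way to turn coincident dissipation and stratification measurements into a diapycnal diffusivity is the Osborn relation K = Γ ε / N² with an assumed mixing efficiency Γ ≈ 0.2; the abstract does not state the exact method used, so the sketch below is only illustrative. The profile values, variable names, and the log-normal summary are assumptions for illustration, not the authors' processing code.

```python
import numpy as np

GAMMA = 0.2  # assumed mixing efficiency (Osborn relation); not stated in the abstract

def diapycnal_diffusivity(eps, n2):
    """Osborn-style estimate K = Gamma * eps / N^2 for one profile.

    eps : dissipation rate of turbulent kinetic energy (W kg^-1)
    n2  : buoyancy frequency squared (s^-2), from the coincident density profile
    """
    return GAMMA * eps / n2

# Illustrative pycnocline values (orders of magnitude only, not glider data)
eps = np.array([1e-9, 5e-9, 2e-8, 1e-7])   # W kg^-1
n2 = np.array([4e-5, 4e-5, 3e-5, 3e-5])    # s^-2
K = diapycnal_diffusivity(eps, n2)          # m^2 s^-1

# The abstract notes that pycnocline epsilon is approximately log-normal,
# so its natural variability is usually summarised via log10(eps).
log_eps = np.log10(eps)
print("K (m^2/s):", K)
print("mean and std of log10(eps):", log_eps.mean(), log_eps.std())
```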

    Hidden Potential in Predicting Wintertime Temperature Anomalies in the Northern Hemisphere

    Variability of the North Atlantic Oscillation (NAO) drives wintertime temperature anomalies in the Northern Hemisphere. Dynamical seasonal prediction systems can skilfully predict the winter NAO. However, prediction of the NAO-dependent air temperature anomalies remains elusive, partly due to the low variability of the predicted NAO. Here, we demonstrate a hidden potential of a multi-model ensemble of operational seasonal prediction systems for predicting wintertime temperature by increasing the variability of the predicted NAO. We identify and subsample those ensemble members that are close to an NAO index statistically estimated from initial autumn conditions. In our novel multi-model approach, the correlation prediction skill for wintertime Central Europe temperature is improved from 0.25 to 0.66, accompanied by an increased winter NAO prediction skill of 0.9. Thereby, temperature anomalies can be skilfully predicted for the upcoming winter over a large part of the Northern Hemisphere through the increased variability and skill of the predicted NAO.
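    The subsampling step described here amounts to keeping only those ensemble members whose predicted winter NAO index lies close to a first-guess index estimated statistically from autumn conditions, and compositing temperature from that subset. The sketch below illustrates the idea with synthetic numbers; the ensemble size, tolerance, and the way the first-guess index is obtained are assumptions, not the published method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multi-model ensemble: standardised winter NAO index per member
nao_members = rng.normal(loc=0.1, scale=1.0, size=120)

# First-guess NAO index estimated statistically from autumn initial conditions
# (e.g. a regression on autumn predictors); here just a placeholder value.
nao_first_guess = 0.8

# Keep the members closest to the first-guess index (tolerance is illustrative)
tolerance = 0.5
selected = nao_members[np.abs(nao_members - nao_first_guess) <= tolerance]

# Temperature anomalies would then be composited from the selected members only,
# which increases the effective variability and skill of the predicted NAO signal.
print(f"kept {selected.size} of {nao_members.size} members,"
      f" mean NAO of subsample = {selected.mean():.2f}")
```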

    Palaeo-sea-level and palaeo-ice-sheet databases: Problems, strategies, and perspectives

    Sea-level and ice-sheet databases have driven numerous advances in understanding the Earth system. We describe the challenges and offer strategies that can be adopted to build self-consistent and standardised databases of the geological and geochemical information used to archive palaeo-sea-levels and palaeo-ice-sheets. There are three phases in the development of a database: (i) measurement, (ii) interpretation, and (iii) database creation. Measurement should include the objective description of the position and age of a sample, a description of associated geological features, and quantification of uncertainties. Interpretation of the sample may have a subjective component, but it should always include uncertainties and alternative or contrasting interpretations, with any exclusion of existing interpretations requiring a full justification. During the creation of a database, an approach based on accessibility, transparency, trust, availability, continuity, completeness, and communication of content (ATTAC3) must be adopted. It is essential to consider the community that creates and benefits from a database. We conclude that funding agencies should not only consider the creation of original data in specific research-question-oriented projects, but also allow part of the funding to be used for IT-related and database-creation tasks, which are essential to guarantee accessibility and maintenance of the collected data.
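    As one way to picture the measurement/interpretation split the abstract describes, a database record could keep the objective observation (position, elevation, age, each with uncertainties) separate from one or more interpretations. The field names and types below are hypothetical, not a schema proposed in the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Measurement:
    """Phase (i): objective description of a sample, with quantified uncertainties."""
    latitude: float
    longitude: float
    elevation_m: float             # relative to a stated datum
    elevation_uncertainty_m: float
    age_ka: float                  # thousands of years before present
    age_uncertainty_ka: float
    geological_context: str        # associated geological features

@dataclass
class Interpretation:
    """Phase (ii): possibly subjective, always with uncertainty and alternatives."""
    relative_sea_level_m: float
    uncertainty_m: float
    rationale: str
    is_preferred: bool = False     # excluding other interpretations needs full justification

@dataclass
class DatabaseEntry:
    """Phase (iii): one self-consistent, standardised record in the database."""
    sample_id: str
    measurement: Measurement
    interpretations: List[Interpretation] = field(default_factory=list)
```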

    Automated Quality Evaluation for a More Effective Data Peer Review

    A peer-review scheme comparable to that used in traditional scientific journals is a major element missing in bringing publications of raw data up to standards equivalent to those of traditional publications. This paper introduces a quality evaluation process designed to analyse the technical quality as well as the content of a dataset. The process is based on quality tests, the results of which are evaluated with the help of expert knowledge, so that the quality is ultimately expressed as a single value. Further, the paper includes an application and a critical discussion of the potential for success, the possible introduction of the process into data centres, and practical implications of the scheme.
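    One way to read the "single value" outcome is as a weighted aggregation of individual quality-test results, with the relative weights supplied by an expert. The tests, weights, and scoring scale below are invented for illustration and are not the scheme defined in the paper.

```python
# Hypothetical aggregation of quality-test results into one overall score.
# Each test returns a score in [0, 1]; an expert assigns the relative weights.

test_scores = {
    "metadata_completeness": 0.9,
    "value_range_check": 1.0,
    "missing_value_fraction": 0.7,
    "format_conformance": 0.8,
}

expert_weights = {
    "metadata_completeness": 0.3,
    "value_range_check": 0.3,
    "missing_value_fraction": 0.2,
    "format_conformance": 0.2,
}

quality = sum(test_scores[name] * expert_weights[name] for name in test_scores)
print(f"overall dataset quality: {quality:.2f}")  # a single summary value
```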