Surface Turbulent Fluxes From the MOSAiC Campaign Predicted by Machine Learning
Reliable boundary-layer turbulence parametrizations for polar conditions are needed to reduce uncertainty in projections of the Arctic sea ice melting rate and its potential global repercussions. Surface turbulent fluxes of sensible and latent heat are typically represented in weather and climate models using bulk formulae based on Monin-Obukhov Similarity Theory, sometimes finely tuned to high-stability conditions and the potential presence of sea ice. In this study, we test the performance of new machine-learning (ML) flux parametrizations, using an advanced polar-specific bulk algorithm as a baseline. Neural networks, trained on observations from previous Arctic campaigns, are used to predict surface turbulent fluxes measured over sea ice as part of the recent MOSAiC expedition. The ML parametrizations outperform the bulk algorithm at the MOSAiC sites, with RMSE reductions of up to 70%. We provide a plug-in Fortran implementation of the neural networks for use in models.
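As an illustration of the kind of neural-network flux parametrization described above, the sketch below fits a small multilayer perceptron to bulk-style predictors. The predictor set, the synthetic training data, and the network size are assumptions for demonstration only; they are unrelated to the MOSAiC observations or the paper's Fortran implementation.

```python
# Minimal sketch of a neural-network flux parametrization (not the paper's model).
# Predictors and data are hypothetical placeholders, for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical bulk-style predictors: 10 m wind speed, air-surface temperature
# difference, air-surface specific-humidity difference.
n = 2000
X = np.column_stack([
    rng.uniform(1, 15, n),        # wind speed [m/s]
    rng.uniform(-10, 2, n),       # delta T [K]
    rng.uniform(-2e-3, 5e-4, n),  # delta q [kg/kg]
])

# Synthetic "observed" sensible heat flux: a noisy bulk-like relation,
# standing in for eddy-covariance measurements.
y = -1.2 * X[:, 0] * X[:, 1] + rng.normal(0.0, 2.0, n)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
model.fit(X[:1500], y[:1500])

pred = model.predict(X[1500:])
rmse = np.sqrt(np.mean((pred - y[1500:]) ** 2))
print(f"held-out RMSE: {rmse:.2f} W m^-2")
```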
Upper limb disease evolution in exon 53 skipping eligible patients with Duchenne muscular dystrophy
Objective:
To characterise the natural progression of upper limb disease over 3 years in ambulatory and non-ambulatory patients with Duchenne muscular dystrophy (DMD), using functional assessments and quantitative magnetic resonance imaging (MRI), and to exploratively identify prognostic factors.
Methods:
Forty boys with DMD (22 non-ambulatory and 18 ambulatory) carrying dystrophin deletions that make them eligible for exon 53-skipping therapy were included. Clinical assessments, including Brooke score, motor function measure (MFM), hand grip and key pinch strength, and upper limb distal coordination and endurance (MoviPlate), were performed every 6 months; quantitative MRI of fat fraction (FF) and lean muscle cross-sectional area of the flexor and extensor muscles was performed yearly.
Results:
In the whole population, there were strong nonlinear correlations between outcome measures. In non-ambulatory patients, annual changes over the course of 3 years were detected with high sensitivity (standardized response mean, |SRM| ≥ 0.8) for quantitative MRI-based FF, hand grip and key pinch strength, and MFM. Boys who presented with a FF below 27% were able to bring a glass to their mouth and retained this ability over the following 3 years. Ambulatory patients with grip strength >35% of the predicted value and FF <10% retained ambulation 3 years later.
Interpretation:
We demonstrate that continuous decline in upper limb strength, function, and MRI-measured muscle structure can be reliably measured in ambulatory and non-ambulatory boys with DMD, with high SRM values and strong correlations between outcomes. Our results suggest that a combination of grip strength and FF can be used to predict important motor milestones.
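For reference, the standardized response mean (SRM) quoted above is the mean of the change scores divided by their standard deviation; a minimal sketch with hypothetical paired measurements:

```python
# Standardized response mean (SRM) = mean(change) / std(change).
# Values below are hypothetical paired measurements, for illustration only.
import numpy as np

baseline = np.array([42.0, 38.5, 47.1, 51.3, 44.8, 39.9])   # e.g. fat fraction [%] at year 0
follow_up = np.array([49.2, 44.0, 55.6, 58.1, 50.3, 47.5])  # same patients one year later

change = follow_up - baseline
srm = change.mean() / change.std(ddof=1)
print(f"SRM = {srm:.2f}")  # |SRM| >= 0.8 is conventionally read as high responsiveness
```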
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest the possibility for significant changes in both large-scale aspects of circulation, as well as improvements in small-scale processes and extremes. However, such high resolution global simulations at climate time scales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centers and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other MIPs. Increases in High Performance Computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility to extend to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulation. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.
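For orientation, the experimental design described above can be restated as a configuration sketch. The groupings and periods below paraphrase the abstract plus assumptions about the tier split; they are not the official CMIP6 experiment identifiers.

```python
# Illustrative summary of the HighResMIP design described above.
# Tier contents and periods are paraphrased/assumed, not official CMIP6 labels.
highresmip_sketch = {
    "resolutions": {
        "standard": "CMIP6 DECK-class resolution",
        "enhanced": {"atmosphere": "<= 50 km", "ocean": "0.25 degree"},
    },
    "tiers": {
        1: {"type": "atmosphere-only", "period": (1950, 2014)},   # assumed split
        2: {"type": "coupled", "period": (1950, 2050)},
        3: {"type": "targeted / future atmosphere-only runs", "period": (2015, 2050)},
    },
    "optional_extension_to": 2100,
}

for tier, spec in highresmip_sketch["tiers"].items():
    print(f"Tier {tier}: {spec['type']}, {spec['period'][0]}-{spec['period'][1]}")
```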
The Arctic predictability and prediction on seasonal-to-interannual timescales (APPOSITE) data set version 1
Recent decades have seen significant developments in seasonal-to-interannual timescale climate prediction capabilities. However, until recently the potential of such systems to predict Arctic climate had not been assessed. This paper describes a multi-model predictability experiment which was run as part of the Arctic Predictability and Prediction On Seasonal to Inter-annual Timescales (APPOSITE) project. The main goal of APPOSITE was to quantify the timescales on which Arctic climate is predictable. In order to achieve this, a coordinated set of idealised initial-value predictability experiments, with seven general circulation models, was conducted. This was the first model intercomparison project designed to quantify the predictability of Arctic climate on seasonal to inter-annual timescales. Here we present a description of the archived data set (which is available at the British Atmospheric Data Centre) and an update of the project's results. Although designed to address Arctic predictability, this data set could also be used to assess the predictability of other regions and modes of climate variability on these timescales, such as the El Niño Southern Oscillation.
This work was supported by the Natural Environment Research Council (grant NE/I029447/1). Helge Goessling was supported by a fellowship of the German Research Foundation (DFG grant GO 2464/1-1). Data storage and processing capacity was kindly provided by the British Atmospheric Data Centre (BADC). Thanks to Yanjun Jiao (CCCma) for his assistance with the CanCM4 simulations and to Bill Merryfield for his comments on a draft of the paper.
Spatio-temporal evolution of global surface temperature distributions
Climate is known for being characterised by strong non-linearity and chaotic behaviour. Nevertheless, few studies in climate science adopt statistical methods specifically designed for non-stationary or non-linear systems. Here we show how the use of statistical methods from Information Theory can describe the non-stationary behaviour of climate fields, unveiling spatial and temporal patterns that may otherwise be difficult to recognize. We study the maximum temperature at two meters above ground using the NCEP CDAS1 daily reanalysis data, with a spatial resolution of 2.5° by 2.5° and covering the time period from 1 January 1948 to 30 November 2018. The spatial and temporal evolution of the temperature time series is retrieved using the Fisher Information Measure, which quantifies the information in a signal, and the Shannon Entropy Power, which is a measure of its uncertainty, or unpredictability. The results describe the temporal behaviour of the analysed variable. Our findings suggest that tropical and temperate zones are now characterized by higher levels of entropy. Finally, the Fisher-Shannon Complexity is introduced and applied to study the evolution of the daily maximum surface temperature distributions.
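The information-theoretic quantities named above have standard one-dimensional estimators; a minimal sketch, assuming a Gaussian kernel-density estimate of a single temperature series (the series here is a synthetic stand-in, not NCEP CDAS1 data):

```python
# Minimal sketch of the Fisher-Shannon quantities for a 1-D sample, using a
# Gaussian KDE; the input is synthetic, standing in for a daily maximum-
# temperature series at one grid point.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
sample = rng.normal(25.0, 4.0, size=5000)   # placeholder temperature series [deg C]

kde = gaussian_kde(sample)
x = np.linspace(sample.min() - 5.0, sample.max() + 5.0, 2001)
f = np.clip(kde(x), 1e-300, None)           # density estimate; clip to avoid log(0)
dx = x[1] - x[0]

# Shannon entropy (nats) and Shannon Entropy Power: N = exp(2H) / (2*pi*e)
H = -np.sum(f * np.log(f)) * dx
N = np.exp(2.0 * H) / (2.0 * np.pi * np.e)

# Fisher Information Measure: I = integral of (f'(x))^2 / f(x) dx
df = np.gradient(f, x)
I = np.sum(df**2 / f) * dx

# Fisher-Shannon Complexity: C = N * I (>= 1, with equality for a Gaussian)
C = N * I
print(f"entropy power N = {N:.2f}, Fisher information I = {I:.4f}, complexity C = {C:.2f}")
```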
Seasonal to interannual Arctic sea-ice predictability in current GCMs
We establish the first inter-model comparison of seasonal to interannual predictability of present-day Arctic climate by performing coordinated sets of idealized ensemble predictions with four state-of-the-art global climate models. For Arctic sea-ice extent and volume, there is potential predictive skill for lead times of up to three years, and potential prediction errors have similar growth rates and magnitudes across the models. Spatial patterns of potential prediction errors differ substantially between the models, but some features are robust. Sea-ice concentration errors are largest in the marginal ice zone, and in winter they are almost zero away from the ice edge. Sea-ice thickness errors are amplified along the coasts of the Arctic Ocean, an effect that is dominated by sea-ice advection. These results give an upper bound on the ability of current global climate models to predict important aspects of Arctic climate.
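A minimal sketch of how a potential prediction error of this kind can be diagnosed in a perfect-model ensemble, treating one member as the "truth" and the remaining members as forecasts; the array shapes and values below are placeholders, not output from the four models.

```python
# Perfect-model potential prediction error: RMSE of ensemble members against
# one member treated as the "truth", averaged over start dates and members.
import numpy as np

rng = np.random.default_rng(2)
n_start, n_member, n_lead = 6, 8, 36                       # start dates, members, lead months
sie = rng.normal(10.0, 1.0, (n_start, n_member, n_lead))   # e.g. sea-ice extent [10^6 km^2]

truth = sie[:, 0, :]                 # member 0 plays the role of the "truth"
forecasts = sie[:, 1:, :]            # remaining members are the forecasts

err = forecasts - truth[:, None, :]
potential_rmse = np.sqrt(np.mean(err**2, axis=(0, 1)))     # one value per lead time

print(potential_rmse[:6])            # error growth over the first six lead months
```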
Initialization shock in decadal hindcasts due to errors in wind stress over the tropical Pacific
Low prediction skill in the tropical Pacific is a common problem in decadal prediction systems, especially for lead years 2-5, where skill in many systems is lower than in uninitialized experiments. On the other hand, the tropical Pacific is of almost worldwide climate relevance through its teleconnections with other tropical and extratropical regions, and is also of importance for global mean temperature. Understanding the causes of the reduced prediction skill is thus of major interest for decadal climate predictions. We look into the problem of reduced prediction skill by analyzing the Max Planck Institute Earth System Model (MPI-ESM) decadal hindcasts for the fifth phase of the Coupled Model Intercomparison Project and performing a sensitivity experiment in which hindcasts are initialized from a model run forced only by surface wind stress. In both systems, sea surface temperature variability in the tropical Pacific is successfully initialized, but most skill is lost at lead years 2-5. Utilizing the sensitivity experiment enables us to pin down the reason for the reduced prediction skill in MPI-ESM to errors in the wind stress used for the initialization. A spurious trend in the wind stress forcing displaces the equatorial thermocline in MPI-ESM unrealistically. When the climate model is then switched into its forecast mode, the recovery process triggers artificial El Niño and La Niña events at the surface. Our results demonstrate the importance of realistic wind stress products for the initialization of decadal predictions.
The extreme European summer 2012
The European summer of 2012 was marked by strongly contrasting rainfall anomalies, which led to flooding in northern Europe and droughts and wildfires in southern Europe. This season was not an isolated event, but rather the latest in a string of summers characterized by a southward-shifted Atlantic storm track, as described by the negative phase of the summer North Atlantic Oscillation (SNAO). The degree of decadal variability in these features suggests a role for forcing from outside the dynamical atmosphere, and preliminary numerical experiments suggest that the global SST and low Arctic sea ice extent anomalies are likely to have played a role, with warm North Atlantic SSTs a particular contributing factor. The direct effects of changes in radiative forcing from greenhouse gas and aerosol forcing are not included in these experiments, but both anthropogenic forcing and natural variability may have influenced the SST and sea ice changes.
Explaining Extreme Events of 2012 from a Climate Perspective
Attribution of extreme events is a challenging science and one that is currently undergoing considerable evolution. In this paper are 19 analyses by 18 different research groups, often using quite different methodologies, of 12 extreme events that occurred in 2012. In addition to investigating the causes of these extreme events, the multiple analyses of four of the events (the high temperatures in the United States, the record low levels of Arctic sea ice, and the heavy rain in northern Europe and eastern Australia) provide an opportunity to compare and contrast the strengths and weaknesses of the various methodologies. The differences also provide insights into the structural uncertainty of event attribution, that is, the uncertainty that arises directly from the differences in analysis methodology. In these cases, there was considerable agreement between the different assessments of the same event. However, different events had very different causes. Approximately half the analyses found some evidence that anthropogenically caused climate change was a contributing factor to the extreme event examined, though the effects of natural fluctuations of weather and climate on the evolution of many of the extreme events played key roles as well.
Mechanisms of decadal variability in the Labrador Sea and the wider North Atlantic in a high-resolution climate model
A necessary step before assessing the performance of decadal predictions is the evaluation of the processes that bring memory to the climate system, both in climate models and in observations. These mechanisms are particularly relevant in the North Atlantic, where the ocean circulation, related to both the Subpolar Gyre and the Atlantic Meridional Overturning Circulation (AMOC), is thought to be important for driving significant heat content anomalies. Recently, a rapid decline in observed densities in the deep Labrador Sea has pointed to an ongoing slowdown of the AMOC strength taking place since the mid-1990s, a decline also hinted at by in-situ observations from the RAPID array.
This study explores the use of Labrador Sea densities as a precursor of ocean circulation changes by analysing a 300-year-long simulation with the state-of-the-art coupled model HadGEM3-GC2. The major drivers of Labrador Sea density variability are investigated and are characterised by three major contributions. The first is the integrated effect of local surface heat fluxes, mainly driven by year-to-year changes in the North Atlantic Oscillation, which accounts for 62% of the total variance. Additionally, two multidecadal-to-centennial contributions from the Greenland-Scotland Ridge outflows are quantified: one associated with freshwater exports via the East Greenland Current, and the other with density changes in the Denmark Strait Overflow. Finally, evidence is shown that decadal trends in Labrador Sea densities are followed by important atmospheric impacts. In particular, a negative winter NAO response appears to follow positive Labrador Sea density trends, providing a phase-reversal mechanism.
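As a sketch of how a variance fraction like the 62% quoted above could be diagnosed, the following regresses a density series on a time-integrated surface heat-flux index and reports the explained variance; the index construction and the data are assumptions for illustration, not the HadGEM3-GC2 analysis itself.

```python
# Fraction of Labrador Sea density variance explained by a time-integrated
# surface heat-flux index (simple linear regression R^2). Synthetic data only.
import numpy as np

rng = np.random.default_rng(3)
n_years = 300
heat_flux = rng.normal(0.0, 1.0, n_years)            # annual-mean surface heat-flux anomaly
integrated_flux = np.cumsum(heat_flux)               # crude "integrated effect" index
integrated_flux -= integrated_flux.mean()

density = 0.8 * integrated_flux + rng.normal(0.0, 5.0, n_years)  # synthetic density anomaly

slope = np.dot(integrated_flux, density) / np.dot(integrated_flux, integrated_flux)
residual = density - slope * integrated_flux
explained = 1.0 - residual.var() / density.var()
print(f"variance explained: {100 * explained:.0f}%")
```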