Perception of breakfast ingestion enhances high intensity cycling performance
PURPOSE: To examine the effect on short duration, high intensity cycling time trial performance when a semi-solid breakfast containing carbohydrate, or a taste and texture matched placebo, is ingested 90 minutes pre-exercise compared to a water control. METHODS: Thirteen well trained cyclists (25 ± 8 years, 71.1 ± 5.9 kg, 1.76 ± 0.04 m, Wmax 383 ± 46 W, VO2peak 4.42 ± 0.53 L·min⁻¹) performed three experimental trials examining breakfast ingestion 90 minutes before a 10 minute steady state cycle (60% Wmax) and a ~20 minute time trial (to complete a workload target of 376 ± 36 kJ). Subjects consumed either water (WAT), a semi-solid carbohydrate breakfast (2 g carbohydrate·kg⁻¹ body mass; CHO) or a taste and texture matched placebo (PLA). Blood lactate and glucose concentrations were measured periodically throughout the rest and exercise periods. RESULTS: The time trial was completed quicker in CHO (1120 ± 69 s; P=0.006) and PLA (1112 ± 50 s; P=0.030) compared to WAT (1146 ± 74 s). Ingestion of carbohydrate increased blood glucose concentration throughout the rest period in CHO (peak at 30 minutes of rest: 7.37 ± 1.10 mmol·L⁻¹; P<0.0001) before it dropped below baseline levels after the steady state cycling.
CONCLUSION: A short duration cycling time trial was completed quicker when subjects perceived that they had consumed breakfast (PLA or CHO) 90 minutes prior to the start of exercise. The improvement in performance is likely attributable to a psychological rather than a physiological effect.
A reassessment of temperature variations and trends from global reanalyses and monthly surface climatological datasets
The ERA-Interim and JRA-55 reanalyses of synoptic data and several conventional analyses of monthly climatological data provide similar estimates of global-mean surface warming since 1979. They broadly agree on the character of interannual variability and the extremity of the 2015/2016 warm spell to which a strong El Niño and low Arctic sea-ice cover contribute. Nevertheless global and regional averages differ on various time-scales due to differences in data coverage and sea-surface temperature analyses; averages from those conventional datasets that infill where they lack direct observations agree better with the averages from the reanalyses. The latest warm event is less extreme when viewed in terms of atmospheric energy, which gives more weight to variability in the Tropics, where the thermal signal has greater vertical penetration and latent energy is a larger factor.
Surface warming from 1998 to 2012 is larger than indicated by earlier versions of the conventional datasets used to characterize what the Fifth Assessment Report of the Intergovernmental Panel on Climate Change termed a hiatus in global warming. None of the datasets exhibit net warming over the Antarctic since 1979.
Centennial trends from the conventional datasets, HadCRUT4 on the one hand and GISTEMP and NOAAGlobalTemp on the other, differ mainly because their sea-surface temperatures differ. Infilling of values where direct observations are lacking is more questionable for the data-sparse earlier decades. Change since the eighteenth century is inevitably more uncertain than change over and after a modern baseline period. The latter is arguably best estimated separately for taking stock of actions to limit climate change, exploiting reanalyses and using satellite data to refine the conventional approach. Nevertheless, early in 2016 the global temperature appears to have first touched or briefly breached a level 1.5 °C above that early in the Industrial Revolution, having touched the 1.0 °C level in 1998 during a previous El Niño.
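The coverage sensitivity discussed above comes down to how a global mean is formed from an incompletely observed grid: leaving data voids out changes the effective area weighting, whereas infilling does not. A minimal Python sketch (the function name and the toy grid are illustrative, not from any of the datasets named here):

```python
import numpy as np

def global_mean(anom, lats):
    """Area-weighted global mean of a lat x lon anomaly grid.

    Grid cells with missing data (NaN) are excluded, so sparse
    coverage changes the effective weighting -- one reason the
    conventional datasets and the reanalyses can disagree.
    """
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones(anom.shape)
    mask = ~np.isnan(anom)
    return np.nansum(anom * w * mask) / np.sum(w * mask)

# Toy example: 3 latitude bands, 4 longitudes
lats = np.array([-60.0, 0.0, 60.0])
anom = np.full((3, 4), 1.0)
anom[0, 0] = np.nan          # a data void, as in the sparse early decades
print(round(global_mean(anom, lats), 6))  # -> 1.0 (uniform field)
```

For a uniform field the void is harmless, but for a field with real latitudinal structure the same void biases the mean toward the observed regions, which is why infilled datasets tend to agree better with the reanalyses.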
Erroneous arctic temperature trends in the ERA-40 reanalysis: A closer look
Reanalyses can be useful tools for examining climate variability and change; however, they must be used cautiously because of time-varying biases that can induce artificial trends. This study explicitly documents a discontinuity in the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-40) that leads to significantly exaggerated warming in the Arctic mid- to lower troposphere, and demonstrates that the continuing use of ERA-40 to study Arctic temperature trends is problematic. The discontinuity occurs in 1997 in response to refined processing of satellite radiances prior to their assimilation into the reanalysis model. It is clearly apparent in comparisons of ERA-40 output against satellite-derived air temperatures, in situ observations, and alternative reanalyses. Decadal or multidecadal Arctic temperature trends calculated over periods that include 1997 are highly inaccurate, particularly below 600 hPa.
It is shown that ERA-40 is poorly suited to studying Arctic temperature trends and their vertical profile, and conclusions based upon it must be viewed with extreme caution. Consequently, its future use for this purpose is discouraged. In the context of the wider scientific debate on the suitability of reanalyses for trend analyses, the results show that a series of alternative reanalyses are in broad-scale agreement with observations. Thus, the authors encourage their discerning use instead of ERA-40 for examining Arctic climate change, while also reaffirming the importance of verifying reanalyses with observations whenever possible.
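A discontinuity of the kind described here is often exposed by differencing the reanalysis against an independent reference and testing for a step change in the mean at the suspected break. A minimal sketch, with an entirely synthetic series standing in for an ERA-40-minus-reference difference (the 0.5 K jump and the break index are invented for illustration):

```python
import numpy as np

def step_change(diff_series, break_idx):
    """Mean shift across a candidate breakpoint in a difference series
    (e.g. reanalysis minus an independent reference).  A large shift
    relative to the series' noise flags an assimilation-induced
    discontinuity like the one ERA-40 exhibits in 1997."""
    before = np.mean(diff_series[:break_idx])
    after = np.mean(diff_series[break_idx:])
    return after - before

# Synthetic illustration: a 0.5 K spurious jump at index 18 ("1997")
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.1, 40)
series[18:] += 0.5
print(round(step_change(series, 18), 2))
```

Any trend fitted across such a break absorbs the jump as spurious warming, which is why trends over periods that include 1997 are unreliable in ERA-40.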
The ESA climate change initiative: Satellite data records for essential climate variables
ESA's Climate Change Initiative is reprocessing and reassessing over 40 years of multi-sensor satellite records to generate consistent, traceable, long-term datasets of "essential climate variables" for the climate modeling and research communities.
Stratospheric temperature changes during the satellite era
Satellite-based layer average stratospheric temperature (T) climate data records (CDRs) now span more than three decades and so can elucidate climate variability associated with processes on multiple time scales. We intercompare and analyze available published T CDRs covering at least two decades, with a focus on Stratospheric Sounding Unit (SSU) and Microwave Sounding Unit (MSU) CDRs. Recent research has reduced but not eliminated discrepancies between SSU CDRs developed by NOAA and the UK Meteorological Office. The MSU CDRs from NOAA and Remote Sensing Systems are in closer agreement than the CDR from the University of Alabama in Huntsville. The latter has a previously unreported inhomogeneity in 2005, revealed by an abrupt increase in the magnitude and spatial variability of T anomaly differences between CDRs. Although time-varying biases remain in both SSU and MSU CDRs, multiple linear regression analyses reveal consistent solar, El Niño–Southern Oscillation (ENSO), quasi-biennial oscillation, aerosol, and piecewise-linear trend signals. Together, these predictors explain 80 to 90% of the variance in the near-global-average T CDRs. The most important predictor variables (in terms of percent explained variance in near-global-average T) for lower stratospheric T measured by MSU are aerosols, solar variability, and ENSO. Trends explain the largest percentage of variance in observations from all three SSU channels. In MSU and SSU CDRs, piecewise-linear trends, with a 1995 break point, indicate cooling during 1979–1994 but no trend during 1995–2013 for MSU and during 1995–2005 for SSU. These observational findings provide a basis for evaluating climate model simulations of stratospheric temperature during the past 35 years.
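The piecewise-linear trend term in such a regression can be written as a hinge predictor that lets the slope change at the 1995 break point. A simplified sketch using ordinary least squares on synthetic data (a real analysis would add the solar, ENSO, QBO and aerosol predictors as further columns of the design matrix):

```python
import numpy as np

def piecewise_trend_design(years, break_year=1995.0):
    """Design matrix for a piecewise-linear trend with a single break:
    intercept, pre-break slope, and a hinge term that allows the slope
    to change after break_year."""
    t = years - years[0]
    hinge = np.clip(years - break_year, 0.0, None)
    return np.column_stack([np.ones_like(t), t, hinge])

# Synthetic series: cooling of -0.05 K/yr until 1995, flat afterwards
years = np.arange(1979.0, 2014.0)
temp = -0.05 * (years - 1979.0) + 0.05 * np.clip(years - 1995.0, 0.0, None)

X = piecewise_trend_design(years)
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print(np.round(coef, 3))  # pre-break slope ~ -0.05; hinge cancels it post-1995
```

The post-break slope is the sum of the second and third coefficients, so a hinge coefficient that offsets the pre-break slope reproduces the "cooling then no trend" behaviour reported for the MSU and SSU records.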
Agreement in late twentieth century Southern Hemisphere stratospheric temperature trends in observations and CCMVal-2, CMIP3 and CMIP5 models
We present a comparison of temperature trends using different satellite and radiosonde observations and climate model (GCM) and chemistry-climate model (CCM) output, focusing on the role of photochemical ozone depletion in the Antarctic lower stratosphere during the second half of the twentieth century. Ozone-induced stratospheric cooling peaks during November at an altitude of approximately 100 hPa in radiosonde observations, with 1969-1998 trends in the range -3.8 to -4.7 K/dec. This stratospheric cooling trend is more than 50% greater than the previously estimated value of -2.4 K/dec [Thompson and Solomon, 2002], which had suggested that the CCMs were overestimating the stratospheric cooling and that the less complex GCMs forced by prescribed ozone matched observations better. Corresponding ensemble mean model trends are -3.8 K/dec for the CCMs, -3.5 K/dec for the CMIP5 GCMs, and -2.7 K/dec for the CMIP3 GCMs. Accounting for various sources of uncertainty – including sampling uncertainty, measurement error, model spread, and trend confidence intervals – the observations and the CCM and GCM ensembles are consistent in this new analysis. This consistency does not extend to every individual model in the GCM and CCM ensembles, and some models do not show significant ozone-induced cooling. Nonetheless, analysis of the joint ozone and temperature trends in the CCMs suggests that the modeled cooling/ozone-depletion relationship is within the range of observations. Overall, this study emphasizes the need to use a wide range of observations for model validation, as well as sufficient accounting of uncertainty in both models and measurements.
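Comparing observed and modeled trends in this way requires the trend confidence intervals alongside the point estimates. A minimal sketch of a least-squares trend in K/decade with a normal-approximation ~95% interval (the input series is synthetic, not radiosonde data, and a careful analysis would also account for autocorrelation):

```python
import numpy as np

def trend_per_decade(years, temps):
    """Least-squares trend in K/decade plus a ~95% half-width from the
    standard error of the slope -- the per-dataset uncertainty that is
    combined with model spread when testing ensemble consistency."""
    t = years - np.mean(years)
    slope = np.sum(t * (temps - np.mean(temps))) / np.sum(t**2)
    resid = temps - (np.mean(temps) + slope * t)
    se = np.sqrt(np.sum(resid**2) / (len(t) - 2) / np.sum(t**2))
    return 10.0 * slope, 10.0 * 1.96 * se

# Synthetic noise-free series with a -3.8 K/dec cooling, as in the abstract
years = np.arange(1969.0, 1999.0)
temps = -0.38 * (years - 1969.0)
tr, ci = trend_per_decade(years, temps)
print(round(tr, 2), round(ci, 2))
```

Two trend estimates are then judged consistent when their intervals overlap, which is the sense in which the observations and the CCM and GCM ensembles agree here.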
Homogenization of scatterometer wind retrievals
Surface winds (10 m equivalent neutral wind velocity) from scatterometer missions from 1992 to the present require homogenization to meet the requirements for oceanic and atmospheric climate data records. Differences between winds retrieved from different scatterometer measurements arise mainly from the calibration/validation procedures used for each scatterometer and from differences in measurement physics. In this study, we focus on the calibration/validation component of the European Remote Sensing Satellite (ERS)-1 and ERS-2 wind speed biases. The ERS-1 and ERS-2 data, referred to as WNF products, are from the Institut Français de Recherche pour l'Exploitation de la Mer (IFREMER). In addition to the WNF data, the newly calibrated ERS-2 products provided by the European Space Agency (ESA), referred to as ASPS2.0 products, are also used. Our approach utilizes collocated satellite-buoy data. Expected values of the normalized radar cross section (NRCS) are calculated from buoy winds for each antenna beam using the Cmod5.n geophysical model function. Differences between expected and measured NRCS are examined as functions of variables such as backscatter coefficient and incidence angle range. The difference between the expected and measured NRCS is then used to build empirical models that correct the biases in the ERS-1 and ERS-2 WNF NRCS calibrations. Finally, ERS-1 and ERS-2 wind retrievals are reprocessed using the corrected NRCS and Cmod5.n. These corrected early ERS-1/2 winds are analysed along with later scatterometer data (QuikSCAT and ASCAT-A) for their deviations from in situ buoy winds during the 1992–2011 period. Scatterometer data homogeneity is also investigated at global scales using collocated scatterometer retrievals and atmospheric reanalysis winds from the ERA-Interim and CFSR models.
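The empirical correction step described here amounts to binning the measured-minus-expected NRCS difference by a retrieval variable and modeling the per-bin bias. A hypothetical sketch (the binning scheme and the synthetic collocations are illustrative only; the Cmod5.n geophysical model function itself is not implemented):

```python
import numpy as np

def empirical_bias(measured_nrcs, expected_nrcs, incidence, n_bins=5):
    """Per-bin mean of the (measured - expected) NRCS difference, binned
    by incidence angle.  expected_nrcs would come from buoy winds passed
    through a geophysical model function such as Cmod5.n; the per-bin
    means form a simple empirical calibration-correction model."""
    diff = measured_nrcs - expected_nrcs
    edges = np.linspace(incidence.min(), incidence.max(), n_bins + 1)
    idx = np.clip(np.digitize(incidence, edges) - 1, 0, n_bins - 1)
    bias = np.array([diff[idx == b].mean() for b in range(n_bins)])
    return bias, edges

# Synthetic collocations with a constant +0.2 dB calibration offset
rng = np.random.default_rng(1)
inc = rng.uniform(18.0, 57.0, 500)        # incidence angles, degrees
expected = rng.normal(-15.0, 3.0, 500)    # NRCS from buoy winds, dB
measured = expected + 0.2                 # instrument reads 0.2 dB high
bias, _ = empirical_bias(measured, expected, inc)
print(np.round(bias, 2))
```

Subtracting the binned bias from the measured NRCS before re-running the wind retrieval is the homogenization step; in the real study the correction also depends on antenna beam and backscatter level.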