20 research outputs found

    Global, regional, and national age-sex-specific mortality and life expectancy, 1950–2017: a systematic analysis for the Global Burden of Disease Study 2017

    BACKGROUND: Assessments of age-specific mortality and life expectancy have been done by the UN Population Division, Department of Economics and Social Affairs (UNPOP), the United States Census Bureau, WHO, and as part of previous iterations of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD). Previous iterations of the GBD used population estimates from UNPOP, which were not derived in a way that was internally consistent with the estimates of the numbers of deaths in the GBD. The present iteration of the GBD, GBD 2017, improves on previous assessments and provides timely estimates of the mortality experience of populations globally. METHODS: The GBD uses all available data to produce estimates of mortality rates between 1950 and 2017 for 23 age groups, both sexes, and 918 locations, including 195 countries and territories and subnational locations for 16 countries. Data used include vital registration systems, sample registration systems, household surveys (complete birth histories, summary birth histories, sibling histories), censuses (summary birth histories, household deaths), and Demographic Surveillance Sites. In total, this analysis used 8259 data sources. Estimates of the probability of death between birth and the age of 5 years and between ages 15 and 60 years are generated and then input into a model life table system to produce complete life tables for all locations and years. Fatal discontinuities and mortality due to HIV/AIDS are analysed separately and then incorporated into the estimation. We analyse the relationship between age-specific mortality and development status using the Socio-demographic Index, a composite measure based on fertility under the age of 25 years, education, and income. There are four main methodological improvements in GBD 2017 compared with GBD 2016: 622 additional data sources have been incorporated; new estimates of population, generated by the GBD study, are used; statistical methods used in different components of the analysis have been further standardised and improved; and the analysis has been extended backwards in time by two decades to start in 1950. FINDINGS: Globally, 18·7% (95% uncertainty interval 18·4–19·0) of deaths were registered in 1950 and that proportion has been steadily increasing since, with 58·8% (58·2–59·3) of all deaths being registered in 2015. At the global level, between 1950 and 2017, life expectancy increased from 48·1 years (46·5–49·6) to 70·5 years (70·1–70·8) for men and from 52·9 years (51·7–54·0) to 75·6 years (75·3–75·9) for women. Despite this overall progress, there remains substantial variation in life expectancy at birth in 2017, which ranges from 49·1 years (46·5–51·7) for men in the Central African Republic to 87·6 years (86·9–88·1) among women in Singapore. The greatest progress across age groups was for children younger than 5 years; under-5 mortality dropped from 216·0 deaths (196·3–238·1) per 1000 livebirths in 1950 to 38·9 deaths (35·6–42·83) per 1000 livebirths in 2017, with huge reductions across countries. Nevertheless, there were still 5·4 million (5·2–5·6) deaths among children younger than 5 years in the world in 2017. Progress has been less pronounced and more variable for adults, especially for adult males, who had stagnant or increasing mortality rates in several countries. 
The gap between male and female life expectancy between 1950 and 2017, while relatively stable at the global level, shows distinctive patterns across super-regions and has consistently been the largest in central Europe, eastern Europe, and central Asia, and smallest in south Asia. Performance was also variable across countries and time in observed mortality rates compared with those expected on the basis of development. INTERPRETATION: This analysis of age-sex-specific mortality shows that there are remarkably complex patterns in population mortality across countries. The findings of this study highlight global successes, such as the large decline in under-5 mortality, which reflects significant local, national, and global commitment and investment over several decades. However, they also bring attention to mortality patterns that are a cause for concern, particularly among adult men and, to a lesser extent, women, whose mortality rates have stagnated in many countries over the time period of this study, and in some cases are increasing.
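The abstract above feeds estimated probabilities of death into a model life table system to obtain complete life tables. As a point of reference, the sketch below shows the generic period life-table arithmetic (age-specific mortality rates converted to probabilities of death and person-years lived, then to life expectancy at birth). It is a textbook construction with made-up rates; the half-interval assumption for deaths and the age grouping are illustrative, and this is not the GBD relational model life table system itself.

```python
# Minimal period life table: age-specific mortality rates (m_x) -> life expectancy.
# Generic demographic bookkeeping, not the GBD relational model life table system.

def e0_from_mx(ages, mx, radix=100_000):
    """ages: start of each age interval; mx: death rate within each interval."""
    widths = [ages[i + 1] - ages[i] for i in range(len(ages) - 1)] + [float("inf")]
    lx, person_years = [radix], []
    for ni, mi in zip(widths, mx):
        if ni == float("inf"):                     # open-ended last interval
            dx = lx[-1]                            # everyone remaining dies here
            person_years.append(lx[-1] / mi)       # mean time lived = 1 / m
        else:
            ax = ni / 2.0                          # deaths assumed mid-interval (illustrative)
            qx = ni * mi / (1.0 + (ni - ax) * mi)  # rate -> probability of death
            dx = lx[-1] * qx
            person_years.append(lx[-1] * ni - dx * (ni - ax))
        lx.append(lx[-1] - dx)
    return sum(person_years) / radix               # life expectancy at birth, e0

# Illustrative (made-up) rates for age groups 0-1, 1-5, 5-15, ..., 85+
ages = [0, 1, 5, 15, 25, 35, 45, 55, 65, 75, 85]
mx = [0.030, 0.003, 0.001, 0.002, 0.003, 0.005, 0.010, 0.020, 0.050, 0.100, 0.200]
print(f"e0 = {e0_from_mx(ages, mx):.1f} years")
```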

    Global age-sex-specific fertility, mortality, healthy life expectancy (HALE), and population estimates in 204 countries and territories, 1950–2019: a comprehensive demographic analysis for the Global Burden of Disease Study 2019

    Background: Accurate and up-to-date assessment of demographic metrics is crucial for understanding a wide range of social, economic, and public health issues that affect populations worldwide. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 produced updated and comprehensive demographic assessments of the key indicators of fertility, mortality, migration, and population for 204 countries and territories and selected subnational locations from 1950 to 2019. Methods: 8078 country-years of vital registration and sample registration data, 938 surveys, 349 censuses, and 238 other sources were identified and used to estimate age-specific fertility. Spatiotemporal Gaussian process regression (ST-GPR) was used to generate age-specific fertility rates for 5-year age groups between ages 15 and 49 years. With extensions to age groups 10–14 and 50–54 years, the total fertility rate (TFR) was then aggregated using the estimated age-specific fertility between ages 10 and 54 years. 7417 sources were used for under-5 mortality estimation and 7355 for adult mortality. ST-GPR was used to synthesise data sources after correction for known biases. Adult mortality was measured as the probability of death between ages 15 and 60 years based on vital registration, sample registration, and sibling histories, and was also estimated using ST-GPR. HIV-free life tables were then estimated using estimates of under-5 and adult mortality rates using a relational model life table system created for GBD, which closely tracks observed age-specific mortality rates from complete vital registration when available. Independent estimates of HIV-specific mortality generated by an epidemiological analysis of HIV prevalence surveys and antenatal clinic serosurveillance and other sources were incorporated into the estimates in countries with large epidemics. Annual and single-year age estimates of net migration and population for each country and territory were generated using a Bayesian hierarchical cohort component model that analysed estimated age-specific fertility and mortality rates along with 1250 censuses and 747 population registry years. We classified location-years into seven categories on the basis of the natural rate of increase in population (calculated by subtracting the crude death rate from the crude birth rate) and the net migration rate. We computed healthy life expectancy (HALE) using years lived with disability (YLDs) per capita, life tables, and standard demographic methods. Uncertainty was propagated throughout the demographic estimation process, including fertility, mortality, and population, with 1000 draw-level estimates produced for each metric. Findings: The global TFR decreased from 2·72 (95% uncertainty interval [UI] 2·66–2·79) in 2000 to 2·31 (2·17–2·46) in 2019. Global annual livebirths increased from 134·5 million (131·5–137·8) in 2000 to a peak of 139·6 million (133·0–146·9) in 2016. Global livebirths then declined to 135·3 million (127·2–144·1) in 2019. Of the 204 countries and territories included in this study, in 2019, 102 had a TFR lower than 2·1, which is considered a good approximation of replacement-level fertility. All countries in sub-Saharan Africa had TFRs above replacement level in 2019 and accounted for 27·1% (95% UI 26·4–27·8) of global livebirths. Global life expectancy at birth increased from 67·2 years (95% UI 66·8–67·6) in 2000 to 73·5 years (72·8–74·3) in 2019. 
The total number of deaths increased from 50·7 million (49·5–51·9) in 2000 to 56·5 million (53·7–59·2) in 2019. Under-5 deaths declined from 9·6 million (9·1–10·3) in 2000 to 5·0 million (4·3–6·0) in 2019. Global population increased by 25·7%, from 6·2 billion (6·0–6·3) in 2000 to 7·7 billion (7·5–8·0) in 2019. In 2019, 34 countries had negative natural rates of increase; in 17 of these, the population declined because immigration was not sufficient to counteract the negative rate of decline. Globally, HALE increased from 58·6 years (56·1–60·8) in 2000 to 63·5 years (60·8–66·1) in 2019. HALE increased in 202 of 204 countries and territories between 2000 and 2019. Interpretation: Over the past 20 years, fertility rates have been dropping steadily and life expectancy has been increasing, with few exceptions. Much of this change follows historical patterns linking social and economic determinants, such as those captured by the GBD Socio-demographic Index, with demographic outcomes. More recently, several countries have experienced a combination of low fertility and stagnating improvement in mortality rates, pushing more populations into the late stages of the demographic transition. Tracking demographic change and the emergence of new patterns will be essential for global health monitoring. Funding: Bill & Melinda Gates Foundation. © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
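One concrete piece of the demographic bookkeeping described above is the aggregation of the total fertility rate from age-specific fertility rates over ages 10–54, and the classification of location-years by the natural rate of increase (crude birth rate minus crude death rate). The sketch below shows both calculations with made-up inputs; it illustrates the formulas only, not the ST-GPR estimation itself.

```python
# Total fertility rate (TFR) as the width-weighted sum of age-specific fertility
# rates (ASFR, births per woman per year). The ASFR values below are made up.

asfr = {                       # 5-year age groups spanning ages 10-54
    "10-14": 0.002, "15-19": 0.040, "20-24": 0.110, "25-29": 0.120,
    "30-34": 0.090, "35-39": 0.050, "40-44": 0.015, "45-49": 0.003,
    "50-54": 0.0005,
}

def total_fertility_rate(asfr_by_group, interval_width=5):
    # Children a woman would bear if she survived the reproductive ages and
    # experienced the period rates at every age.
    return interval_width * sum(asfr_by_group.values())

def natural_rate_of_increase(crude_birth_rate, crude_death_rate):
    # Both per 1000 population; together with the net migration rate, this is
    # the quantity used to classify location-years in the abstract above.
    return crude_birth_rate - crude_death_rate

print(f"TFR = {total_fertility_rate(asfr):.2f} births per woman")
print(f"Natural rate of increase = {natural_rate_of_increase(18.0, 7.5):.1f} per 1000")
```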

    Global burden of 87 risk factors in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019

    Background: Rigorous analysis of levels and trends in exposure to leading risk factors and quantification of their effect on human health are important to identify where public health is making progress and in which cases current efforts are inadequate. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 provides a standardised and comprehensive assessment of the magnitude of risk factor exposure, relative risk, and attributable burden of disease. Methods: GBD 2019 estimated attributable mortality, years of life lost (YLLs), years of life lived with disability (YLDs), and disability-adjusted life-years (DALYs) for 87 risk factors and combinations of risk factors, at the global level, regionally, and for 204 countries and territories. GBD uses a hierarchical list of risk factors so that specific risk factors (eg, sodium intake), and related aggregates (eg, diet quality), are both evaluated. This method has six analytical steps. (1) We included 560 risk–outcome pairs that met criteria for convincing or probable evidence on the basis of research studies. 12 risk–outcome pairs included in GBD 2017 no longer met inclusion criteria and 47 risk–outcome pairs for risks already included in GBD 2017 were added based on new evidence. (2) Relative risks were estimated as a function of exposure based on published systematic reviews, 81 systematic reviews done for GBD 2019, and meta-regression. (3) Levels of exposure in each age-sex-location-year included in the study were estimated based on all available data sources using spatiotemporal Gaussian process regression, DisMod-MR 2.1, a Bayesian meta-regression method, or alternative methods. (4) We determined, from published trials or cohort studies, the level of exposure associated with minimum risk, called the theoretical minimum risk exposure level. (5) Attributable deaths, YLLs, YLDs, and DALYs were computed by multiplying population attributable fractions (PAFs) by the relevant outcome quantity for each age-sex-location-year. (6) PAFs and attributable burden for combinations of risk factors were estimated taking into account mediation of different risk factors through other risk factors. Across all six analytical steps, 30 652 distinct data sources were used in the analysis. Uncertainty in each step of the analysis was propagated into the final estimates of attributable burden. Exposure levels for dichotomous, polytomous, and continuous risk factors were summarised with use of the summary exposure value to facilitate comparisons over time, across location, and across risks. Because the entire time series from 1990 to 2019 has been re-estimated with use of consistent data and methods, these results supersede previously published GBD estimates of attributable burden. Findings: The largest declines in risk exposure from 2010 to 2019 were among a set of risks that are strongly linked to social and economic development, including household air pollution; unsafe water, sanitation, and handwashing; and child growth failure. Global declines also occurred for tobacco smoking and lead exposure. The largest increases in risk exposure were for ambient particulate matter pollution, drug use, high fasting plasma glucose, and high body-mass index. 
In 2019, the leading Level 2 risk factor globally for attributable deaths was high systolic blood pressure, which accounted for 10·8 million (95% uncertainty interval [UI] 9·51–12·1) deaths (19·2% [16·9–21·3] of all deaths in 2019), followed by tobacco (smoked, second-hand, and chewing), which accounted for 8·71 million (8·12–9·31) deaths (15·4% [14·6–16·2] of all deaths in 2019). The leading Level 2 risk factor for attributable DALYs globally in 2019 was child and maternal malnutrition, which largely affects health in the youngest age groups and accounted for 295 million (253–350) DALYs (11·6% [10·3–13·1] of all global DALYs that year). The risk factor burden varied considerably in 2019 between age groups and locations. Among children aged 0–9 years, the three leading detailed risk factors for attributable DALYs were all related to malnutrition. Iron deficiency was the leading risk factor for those aged 10–24 years, alcohol use for those aged 25–49 years, and high systolic blood pressure for those aged 50–74 years and 75 years and older. Interpretation: Overall, the record for reducing exposure to harmful risks over the past three decades is poor. Success with reducing smoking and lead exposure through regulatory policy might point the way for a stronger role for public policy on other risks in addition to continued efforts to provide information on risk factor harm to the general public. Funding: Bill & Melinda Gates Foundation. © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
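Step (5) of the method multiplies population attributable fractions by the relevant outcome quantity. The sketch below shows a simplified, categorical version of that step: exposure prevalences and relative risks expressed relative to the theoretical minimum risk exposure level (TMREL) give a PAF, which then scales the observed burden. The GBD calculation integrates over continuous exposure distributions and accounts for mediation; this is only the textbook categorical form with invented numbers.

```python
# Categorical population attributable fraction (PAF) and attributable burden.
# Simplified illustration; GBD integrates over continuous exposure distributions.

def paf_categorical(prevalence, relative_risk):
    """prevalence[i] and relative_risk[i] for each exposure category, with RR
    expressed relative to the theoretical minimum risk exposure level (TMREL)."""
    observed = sum(p * rr for p, rr in zip(prevalence, relative_risk))
    counterfactual = 1.0   # whole population at TMREL (RR = 1 by definition)
    return (observed - counterfactual) / observed

# Invented example: three exposure categories (TMREL, moderate, high)
prevalence    = [0.50, 0.30, 0.20]
relative_risk = [1.00, 1.40, 2.10]

paf = paf_categorical(prevalence, relative_risk)
deaths = 250_000                        # observed outcome deaths (made up)
attributable_deaths = paf * deaths      # step (5): PAF x outcome quantity
print(f"PAF = {paf:.3f}, attributable deaths = {attributable_deaths:,.0f}")
```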

    DISTRIBUTION OF PRESEASON TRAINING INTENSITY IN MASTERS-AGED COMPETITIVE CROSS COUNTRY SKIERS

    DISTRIBUTION OF PRESEASON TRAINING INTENSITY IN MASTERS-AGED COMPETITIVE CROSS COUNTRY SKIERS E.C. Ranta, T.K. Vetrone, and D.P. Heil, FACSM Montana State University, Bozeman, MT INTRODUCTION: The distribution of preseason training intensity amongst competitive Masters-aged cross country (XC) skiers has been largely overlooked in the research literature. To develop effective training programs for these athletes, current training practices must first be understood. Thus, the purpose of this study was to classify the weekly training practices of competitive Masters-aged XC skiers during the 2013 preseason (August-October) using telemetry-based heart rate monitors (HRM). METHODS: Fifty-seven competitive XC skiers, all 40+ years of age, volunteered to wear an HRM (5-sec recording interval) and use a training log to record all bouts of exercise over 14 consecutive days of typical preseason training. Subjects who donned the HRM for less than 70% of their logged training time were excluded from the study due to insufficient data collection. Heart rate data were downloaded to a computer and summarized by both absolute training time (T, mins) and relative time (P, %) spent within six HR zones (Z) calculated as a percentage of age-predicted maximum HR (APMHR), from Z5 (HR ≥90% of APMHR) and Z4 (beginning at 80% of APMHR) down through successively lower intensity bands to Z0. Summary T and P values were compared between men and women and between age groups (≤55 vs. >55 yrs.) using two-sample T-tests (0.05 alpha). RESULTS: Only 41 of the 57 subjects recorded adequate data for evaluation. These 22 men (Mean±SD: 57±8 yrs.; 23.5±1.8 kg/m2) and 19 women (56±7 yrs.; 21.0±1.1 kg/m2) reported significantly less T in Z5 when compared to Z0-Z4 (Mean±SE: 35.9±5.3 vs. 76.0±12.6, 116.5±10.9, 136.6±8.0, 125.8±8.8, 88.6±8.5 mins/wk for Z0-Z4, respectively) as well as significantly less P within Z5 when compared to Z1-Z4 (6.8±1.1% vs. 19.5±1.3%, 23.5±1.0%, 22.2±1.4%, 15.9±1.4% for Z1-Z4, respectively). Additionally, the male subjects recorded significantly (P<0.01) more exercise bouts per week on the training logs and HRM than the female subjects (10.4±0.75 vs. 7.9±0.5 and 8.6±0.7 vs. 6.3±0.3 bouts/wk, M vs. F, respectively). There were no significant differences between age groups. CONCLUSION: The relatively even distribution of Z1-Z4 training time, with significantly less Z5 time, is contrary to the preseason polarized training regimen followed by elite athletes. For Masters skiers to achieve a more optimal training distribution, akin to competitive elite athletes, emphasis should be placed on increasing the P spent in Z1, Z2, and Z5 while simultaneously decreasing the P spent in Z3 and Z4.
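A minimal sketch of the heart-rate summary this abstract describes: 5-second HRM samples are assigned to intensity zones defined as percentages of age-predicted maximum HR (APMHR), and time per zone is accumulated. The zone cut-points, the 220 − age APMHR formula, and the sample data below are assumptions for illustration only, not necessarily the definitions used in the study.

```python
# Summarise 5-sec heart-rate samples into time per intensity zone (Z0-Z5).
# Zone cut-points and the 220 - age APMHR formula are illustrative assumptions.
import random

SAMPLE_SEC = 5  # HRM recording interval used in the study

def time_in_zones(hr_samples, age, cutpoints=(0.50, 0.60, 0.70, 0.80, 0.90)):
    """Return minutes spent in zones Z0..Z5, where Zk starts at cutpoints[k-1]
    of APMHR and Z0 is everything below cutpoints[0]."""
    apmhr = 220 - age                             # assumed APMHR formula
    minutes = [0.0] * (len(cutpoints) + 1)
    for hr in hr_samples:
        rel = hr / apmhr
        zone = sum(rel >= c for c in cutpoints)   # number of thresholds reached
        minutes[zone] += SAMPLE_SEC / 60.0
    return minutes

# Made-up 30-minute bout for a 57-year-old subject
random.seed(0)
bout = [random.randint(100, 165) for _ in range(30 * 60 // SAMPLE_SEC)]
for z, t in enumerate(time_in_zones(bout, age=57)):
    print(f"Z{z}: {t:5.1f} min")
```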

    HEART RATE RESPONSES PRIOR TO AND DURING A 15-KM SKATE SKI RACE: A PILOT STUDY

    E.C. Ranta, T. Vetrone, & D.P. Heil Montana State University, Bozeman, MT The unique population of Masters-level cross country ski racers has been largely overlooked in prior ski research. In order to accurately develop coaching techniques for Masters skiers, current training practices and race strategies must first be understood. Collecting heart rate (HR) response data, through the use of telemetry-based heart rate monitor (HRM) systems, is one method of characterizing these athletes. PURPOSE: This study tested the feasibility of utilizing HRMs as a means to collect HR data on multiple Masters-level cross country ski racers competing simultaneously. A secondary purpose was to explore correlations between warm-up (WU) and race HR responses. METHODS: Five men and two women volunteered to participate in the study. Two subjects were dropped from the results due to incomplete data collection and imprecise HRM recordings. The remaining four men (M±SD: 43±7 yrs; 71.8±5.3 kg; 179.1±2.5 cm; 8±4.3 yrs race experience) and one woman (38 yrs; 67.1 kg; 172.7 cm; 1 yr race experience) wore telemetry-based HRM systems (set at 5-sec sample intervals) during a 15-km skate ski race, in addition to the 45-min WU period immediately preceding the race. Subjects were instructed to warm up and race as they would normally. Participants also filled out an online questionnaire for self-reporting of demographic, training, and racing history. HR data were downloaded to a computer and summarized for both the WU and race periods. Race HR was defined as two minutes past the initial onset of a steady-state HR through the last highest recorded HR value. Summary HR values were then combined with race performance times and compared using Pearson’s correlation at an alpha of 0.10. RESULTS: Average race and WU HR were 168±4 BPM and 124±16 BPM, respectively, while WU HR as a percentage of race HR was 74%±7%. Race time correlated significantly with WU time spent at or above average race HR (r = 0.971; P = 0.006), while average race HR correlated significantly with average WU HR (r = 0.948; P = 0.014) as well as WU time spent within the subject’s range of race HR (r = 0.930; P = 0.022). CONCLUSION: Telemetry-based HRM systems are a feasible option for characterizing the WU and race HR response patterns of Masters-level skiers despite the mass start format and extremely cold air temperatures (-22˚C). Additionally, WU HR patterns correlated well with self-selected race pacing strategies within this particular population. Although future studies would benefit from increasing subject recruitment and reducing HRM data collection errors, similar methodological strategies are practical for further examining the training and racing practices of Masters-level cross country skiers.
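The correlation analysis described here is a straightforward Pearson's r between a warm-up summary metric and a race outcome, tested at an alpha of 0.10. The sketch below shows that comparison with scipy; the variable names and values are invented placeholders, not the study's data.

```python
# Pearson correlation between a warm-up (WU) summary metric and race time,
# evaluated at alpha = 0.10. Values below are invented placeholders.
from scipy.stats import pearsonr

race_time_min     = [42.1, 45.8, 47.3, 50.2, 55.6]   # finish times for 5 skiers
wu_min_at_race_hr = [1.8, 3.1, 4.2, 5.0, 6.5]        # WU minutes at/above average race HR

r, p = pearsonr(wu_min_at_race_hr, race_time_min)
alpha = 0.10
print(f"r = {r:.3f}, P = {p:.3f}, significant at alpha 0.10: {p < alpha}")
```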

    CHARACTERIZING PRE-SEASON TRAINING HABITS OF COMPETITIVE MASTERS-AGED CROSS-COUNTRY SKIERS

    CHARACTERIZING PRE-SEASON TRAINING HABITS OF COMPETITIVE MASTERS-AGED CROSS-COUNTRY SKIERS T.K. Vetrone, E.C. Ranta, & D.P. Heil, FACSM. Movement Science / Human Performance Lab, Montana State University, Bozeman, MT Introduction: The training habits of Masters-aged cross-country skiers (i.e., those 40+ years of age) have been overlooked in the research literature. The primary purpose of this study was to characterize the pre-season training habits of Masters-aged cross-country skiers using self-report training logs in order to develop more specific guidelines to improve future training practices. Methods: Masters cross-country skiers (24 men: (Mean±SD) 57 ± 8 yrs., 40-73 yrs.; 19 women: 55 ± 7 yrs., 40-69 yrs.) were recruited from the Pacific Northwest region. Over one 14-day data collection period (September-October, 2013), subjects were instructed to self-record all activity bouts in a spreadsheet-based training log. Activity bouts were classified into one of four categories by degree of cross-country ski training specificity: C1 = activities of daily living (ADL) and lowest specificity; C2 = low specificity (e.g., yoga); C3 = moderate specificity (e.g., running, cycling); C4 = highest specificity (e.g., roller skiing). Total time within each activity bout was recorded into self-assessed intensity zones: Z0 = minimal intensity; Z1 = low to moderate intensity; Z2 = high and race pace intensity; Z3 = above race pace intensity. Total time recorded in each category and training zone was evaluated using a two-factor RM ANOVA and Scheffé’s post-hoc test (0.05 alpha). Results: Total time self-reported within the logs was 12.4±2.1 hrs/wk. Total time (T) within C3 activities (TC3 = 8.5±1.1 hrs/wk) was significantly higher (P<0.001) than in all other categories (1.5±0.4, 0.9±0.4, 0.6±0.4 hrs/wk for TC1, TC2, and TC4, respectively). In addition, total time (T) in Z1 (TZ1 = 6.3±0.9 hrs/wk) was significantly higher (P<0.001) than the other zone values (2.6±0.7, 2.4±0.4, 1.1±0.2 hrs/wk for TZ0, TZ2, and TZ3, respectively). Conclusion: These skiers tended to self-select low-moderate intensity training (Zone 1) that was predominantly low impact and moderate in specificity (Category 3) to cross-country skiing. These preferences may reflect an avoidance of traditional high-impact training (e.g., running) as well as high-risk activities (e.g., roller skiing), while the Zone 1 preference may simply reflect a dominance of base training. These results indicate that the observed training habits differ from what is known to be optimal for elite and junior skiers (i.e., polarized training models) and may in fact be the optimal training for avoiding high-risk or high-impact activities.
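A hedged sketch of the statistical step named above: weekly training-log hours in long format (one row per subject, category, and zone) fed to a two-factor repeated-measures ANOVA. The subject count, effect pattern, and hours below are randomly generated placeholders, statsmodels' AnovaRM is used as an off-the-shelf RM ANOVA, and the Scheffé post-hoc test used in the study is not shown.

```python
# Two-factor repeated-measures ANOVA on weekly training hours, with activity
# category (C1-C4) and intensity zone (Z0-Z3) as within-subject factors.
# Data are randomly generated placeholders; post-hoc testing is omitted.
import random
import pandas as pd
from statsmodels.stats.anova import AnovaRM

random.seed(1)
rows = []
for subject in range(1, 11):                       # 10 hypothetical skiers
    for category in ["C1", "C2", "C3", "C4"]:
        for zone in ["Z0", "Z1", "Z2", "Z3"]:
            # Assumed pattern: most hours in moderate-specificity, Zone 1 training
            base = 3.0 if (category == "C3" and zone == "Z1") else 0.8
            rows.append({"subject": subject, "category": category, "zone": zone,
                         "hours_per_week": max(0.0, random.gauss(base, 0.3))})

df = pd.DataFrame(rows)
result = AnovaRM(df, depvar="hours_per_week", subject="subject",
                 within=["category", "zone"]).fit()
print(result)
```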

    PREDICTING THE ENERGY COST OF STEEP UPHILL TREADMILL WALKING: A CROSS-VALIDATION

    PREDICTING THE ENERGY COST OF STEEP UPHILL TREADMILL WALKING: A CROSS-VALIDATION E. Davila1, E.C. Ranta1, L.M. Whalen1, D. Weishar2, A. Blake2, D.E. Lankford2, & D.P. Heil, FACSM1. 1Montana State University, Bozeman, MT; 2Brigham Young University, Rexburg, ID INTRODUCTION: There are a growing number of commercially available electronic monitoring devices that claim to predict the energy cost of exercise as a function of one or more predictive metrics. Devices relying upon global positioning satellite (GPS) data, for example, can predict the energy cost of walking and running outdoors by determining real-time changes in travel speed and surface incline. These data can then be combined with a laboratory-derived prediction algorithm, but the algorithm must include both steep uphill and downhill inclines to remain ecologically valid. A well-known formula by Minetti et al. (JAP 2002), considered valid for inclines between -45% and +45%, would seem well suited but does not appear to have been cross-validated in the literature. The purpose of this study was to cross-validate the original Minetti formula for predicting relative energy cost (CW, J/kg/m) between -5% and +30% for treadmill walking using a broad range of healthy adults. METHODS: 31 recreationally active adults (18 men: (Mean±SD) 28 ± 8 yrs, 20-45 yrs, and 23.0±3.3, 21.0-34.4 kg/m2; 13 women: 29 ± 3 yrs, 25-35 yrs, and 23.1±2.8, 19.1-29.8 kg/m2) were recruited to walk on a treadmill at 53.6 m/min (2.0 MPH) at one of four lower inclines (-5%, -2.5%, 0%, +8%) and one of four steeper inclines (+15%, +19%, +22.5%, +30%) across four separate lab visits, for 20 mins at each incline. Steady-state oxygen consumption (VO2), recorded via standard indirect calorimetry procedures, was averaged from the end of each 20-min test. Net VO2 (exercise VO2 – resting VO2) was then converted to CW using the measured non-protein RER, body mass, and treadmill speed. Predicted CW, as determined from the original Minetti formula, was compared to measured CW using a two-factor repeated measures ANOVA, including a comparison by gender, at each incline (0.05 alpha). RESULTS: Mean CW was statistically similar between genders for both measured and predicted values. In contrast, measured CW was significantly lower (P<0.001) than predicted CW at the lower inclines (≤ +15%), statistically similar at the +19% incline, and significantly higher at the steepest inclines (+22.5% and +30%; P<0.001). CONCLUSIONS: The Minetti formula for predicting CW for steep downhill and uphill walking was derived using well-trained men accustomed to mountain running. The present study, in contrast, used a more diverse adult population (i.e., recreationally active, men and women, wide range of BMI), which suggests that the Minetti formula may lack broad generalizability across the range of inclines tested (-5% to +30%) for populations dissimilar to the original validation sample.
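The comparison described above needs two pieces of arithmetic: a predicted walking cost from the Minetti et al. (2002) gradient polynomial, and a measured cost derived from steady-state net VO2, RER, and treadmill speed. The sketch below shows both, assuming the commonly cited coefficients for the Minetti walking equation and a standard non-protein RER caloric equivalent (3.815 + 1.232 × RER kcal per litre of O2); both are assumptions to verify against the original sources, and the example inputs are made up.

```python
# Predicted vs. measured relative energy cost of gradient walking (CW, J/kg/m).
# The Minetti coefficients and the RER caloric-equivalent line are commonly
# cited values, assumed here for illustration; verify against the originals.

def minetti_cw(incline):
    """Walking cost (J/kg/m) for a fractional incline (0.15 = +15%), using the
    fifth-order polynomial commonly attributed to Minetti et al. (JAP 2002)."""
    i = incline
    return (280.5 * i**5 - 58.7 * i**4 - 76.8 * i**3
            + 51.9 * i**2 + 19.6 * i + 2.5)

def measured_cw(net_vo2_ml_kg_min, rer, speed_m_min):
    """Convert steady-state net VO2 (mL/kg/min) to CW (J/kg/m) at a given speed."""
    kcal_per_l_o2 = 3.815 + 1.232 * rer           # assumed non-protein RER equivalent
    j_per_kg_min = (net_vo2_ml_kg_min / 1000.0) * kcal_per_l_o2 * 4186.0
    return j_per_kg_min / speed_m_min

# Illustrative comparison at +15% and 53.6 m/min with a made-up net VO2 and RER
incline, speed = 0.15, 53.6
print(f"predicted CW: {minetti_cw(incline):.2f} J/kg/m")
print(f"measured  CW: {measured_cw(15.0, rer=0.88, speed_m_min=speed):.2f} J/kg/m")
```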

    A strategy for research projects to impact standards and regulatory bodies: The approach of the EU-funded project MiWaveS

    This paper proposes a strategy, described via a process, that can help collaborative research projects impact standards and engage with regulatory bodies. The process is being used successfully by a running EU-funded project, which is taken as a concrete example of how to properly link innovation and pre-development activities to standards. The proposed methodology has general validity and can be adopted by any other research project that works on new enabling technologies far ahead of market launch, aims at the broadest possible deployment of such technologies, and seeks to effectively impact both the research and industry ecosystems. © 2015 IEEE