    Physical Fitness of Police Cadets: Baseline Characteristics and Changes During a 16-Week Academy

    Police academies traditionally emphasize the importance of being physically fit. The purpose of this research was to determine cadet baseline physical fitness characteristics and assess the effectiveness of a 16-week training program. Sixty-eight cadets (61 men, 7 women) volunteered to have baseline physical fitness characteristics assessed, and 55 cadets (49 men, 6 women) completed further testing at weeks 8 and 16. The testing comprised hand grip (strength), arm crank (upper-body power), 30-second Wingate (lower-body power), sum of skinfolds and percentage body fat (body composition), 40-yard dash (sprint speed), 1 repetition maximum bench press (strength), T-test (agility), and sit-and-reach (flexibility). In addition, cadets completed standardized state testing (push-ups, sit-ups, vertical jump, and half-mile shuttle run). The training program consisted of 1-hour sessions, 3 days per week, including aerobic, plyometric, body weight, and resistance exercise. Significant changes were found in agility (p < 0.01), upper-body and lower-body peak power (p ≤ 0.05), sit-ups (p < 0.01), and push-ups (p ≤ 0.05) across the first 8 weeks, and in agility (p ≤ 0.05), lower-body peak power (p ≤ 0.05), sit-ups (p < 0.01), push-ups (p ≤ 0.05), and the half-mile shuttle run (p < 0.01) across the full 16 weeks. However, none of the variables showed significant change across the second half of the program (weeks 8-16). A number of individual parameters of physical fitness improved in the first 8 weeks, whereas none showed significant improvement in the second 8 weeks. This suggests that modifications could be made to increase the overall effectiveness of cadet physical training, specifically after the 8-week mark.

    Intertester reliability of brachial artery flow-mediated vasodilation using upper and lower arm occlusion in healthy subjects

    The assessment of endothelial function as brachial artery flow-mediated vasodilatation is a widely used technique that determines the effect of risk factor intervention and may have the potential to predict the clinical benefit of antiatherogenic therapy. Previous studies suggest that flow-mediated dilation is greater using the upper-arm occlusion technique, but no data are available comparing intertester reliability between technicians. This study was undertaken to compare the amount of hyperemia between upper- and lower-arm occlusion techniques and to determine reproducibility between testers. Nineteen healthy adults, ages 25 to 50, were included in the study. Brachial artery vasodilatation was measured 1 and 3 minutes after cuff deflation, compared with baseline, and expressed as a percent change. There was a tester effect on the percent change in diameter across all measurements. The results of this study reveal inconsistencies between testers when using a blood pressure cuff to induce hyperemia for the assessment of endothelial function through brachial artery flow-mediated vasodilation. Nevertheless, upper-arm blood pressure cuff occlusion resulted in significantly greater hyperemia and vasodilatation than lower-arm occlusion, even though measurements differed between testers.
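    The "percent change" described above follows the standard flow-mediated dilation convention; a minimal formulation, with symbols introduced here for illustration rather than taken from the paper:

        \[
        \mathrm{FMD}\,(\%) = \frac{D_{\text{post}} - D_{\text{baseline}}}{D_{\text{baseline}}} \times 100
        \]

    where \(D_{\text{baseline}}\) is the resting brachial artery diameter and \(D_{\text{post}}\) is the diameter measured 1 or 3 minutes after cuff deflation.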

    Effects of Six Weeks of Balance and Strength Training on Measures of Dynamic Balance of Older Adults

    Purpose: Reliable tools for measuring outcomes of service-learning (SL) are scarce. This study aimed to develop and test a service-learning assessment tool to measure students’ perceived self-efficacy in program-planning-related competencies (SL-SEPP) and an overall SL impact scale. Methods: Students in a core Master of Public Health (MPH) course on program planning participated in the study (n=44). Course-based SL projects were incorporated into the learning process. Data from the baseline survey were used to assess the reliability of the 12-item SL-SEPP, and data from the posttest survey were used to assess the 5-item overall SL impact scale at the end of the course. Results: Data showed satisfactory reliability scores, with Cronbach's alpha of .87 for the SL-SEPP and .84 for the overall impact scale. Even with this relatively small sample size, preliminary analyses showed that the SL-SEPP was sensitive enough to detect meaningful changes in self-efficacy scores after the course. Conclusion: This study provides needed pilot data supporting the reliability of the SL-SEPP tool. The study has implications for researchers and educators seeking to apply or adapt this tool to assess student self-efficacy outcomes on program planning competencies.
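    The Cronbach's alpha values reported above can be computed directly from item-level responses; a minimal Python sketch, assuming responses are arranged as a respondents-by-items array (the simulated data and variable names are illustrative, not from the study):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents, n_items) array of scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]                           # number of items
            item_vars = items.var(axis=0, ddof=1)        # variance of each item
            total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Illustrative use: 44 respondents, 12 correlated Likert-style items
        rng = np.random.default_rng(0)
        trait = rng.normal(size=(44, 1))                          # latent construct
        responses = trait + rng.normal(scale=0.8, size=(44, 12))  # 12 noisy items
        print(round(cronbach_alpha(responses), 2))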

    Unpredictable shoe midsole perturbations provide an instability stimulus to train ankle posture and motion during forward and lateral gym lunges

    Unstable footwear may enhance training effects on the lower-limb musculature and sensorimotor system during dynamic gym movements. This study compared the instability of an unstable shoe with irregular midsole deformations (IM) and a control shoe (CS) during forward and lateral lunges. Seventeen female gym class participants completed two sets of ten forward and lateral lunges in CS and IM. Ground reaction forces, lower-limb kinematics and ankle muscle activations were recorded. Variables around initial ground contact, toe-off, and the descending and ascending lunge phases were compared statistically (p < .05). Responses to IM compared with CS were similar across lunge directions. The IM induced instability by increasing the vertical loading rate (p < .001, p = .009) and the variability of frontal ankle motion during the descending (p = .001, p < .001) and ascending phases (p = .150, p = .003) in forward and lateral lunges, respectively. At initial ground contact, ankle adjustments enhanced postural stability in IM. Across muscles, there were no activation increases, although the results indicate that peroneus longus activation increased in IM during the ascending phase. As expected, IM provided a more demanding training stimulus during lunge exercises and has the potential to reduce ankle injuries by training ankle positioning for unpredictable instability.
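    The abstract does not spell out how the vertical loading rate and frontal-ankle-motion variability were derived, so the following Python sketch shows one common way such measures are computed from force and kinematic data; the definitions and variable names here are assumptions for illustration only:

        import numpy as np

        def vertical_loading_rate(fz, fs):
            """Peak slope (N/s) of the vertical ground reaction force trace fz,
            sampled at fs Hz, for a single foot contact."""
            return float(np.max(np.gradient(np.asarray(fz, dtype=float), 1.0 / fs)))

        def frontal_motion_variability(angles):
            """Mean between-repetition standard deviation of the frontal-plane
            ankle angle; angles has shape (n_repetitions, n_time_points),
            time-normalized to the lunge phase."""
            return float(np.mean(np.std(angles, axis=0, ddof=1)))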

    Risk factors for heat illness among British soldiers in the hot Collective Training Environment.

    BACKGROUND: Heat illness is a preventable disorder in military populations. Measures that protect vulnerable individuals and contribute to effective Immediate Treatment may reduce the impact of heat illness, but depend upon adequate understanding and awareness among Commanders and their troops. OBJECTIVE: To assess risk factors for heat illness in British soldiers deployed to the hot Collective Training Environment (CTE) and to explore awareness of Immediate Treatment responses. METHODS: An anonymous questionnaire was distributed to British soldiers deployed in the hot CTEs of Kenya and Canada. Responses were analysed to determine the prevalence of individual (Intrinsic) and Command-practice (Extrinsic) risk factors for heat illness and the self-reported awareness of key Immediate Treatment priorities (recognition, first aid and casualty evacuation). RESULTS: The prevalence of Intrinsic risk factors was relatively low in comparison with Extrinsic risk factors. The majority of respondents were aware of key Immediate Treatment responses. The most frequently reported factors in each domain were increased risk by body composition scoring, inadequate time for heat acclimatisation, and insufficient briefing about casualty evacuation. CONCLUSIONS: Novel data on the distribution and scale of risk factors for heat illness are presented. A collective approach to risk reduction through the accumulation of 'marginal gains' is proposed for the UK military. This should focus on limiting Intrinsic risk factors before deployment, reducing Extrinsic factors during training and promoting timely Immediate Treatment responses within the hot CTE.

    Excess Post-Exercise Oxygen Consumption (EPOC) Following Multiple Effort Sprint and Moderate Aerobic Exercise

    The purpose of this study was to investigate the effects of 30-second all-out sprint interval exercise (SIE) vs. moderate aerobic exercise (MA) on excess post-exercise oxygen consumption (EPOC). Six recreationally trained males (age = 23.3 ± 1.4 yrs, weight = 81.8 ± 9.9 kg, height = 180.8 ± 6.3 cm) completed a sprint interval exercise session consisting of three repeated 30-second Wingate cycling tests separated by four minutes (duration ~11 minutes) as well as a moderate aerobic exercise session consisting of 30 minutes of cycling at 60% heart rate reserve (HRR), in a random counterbalanced design. Baseline oxygen consumption (VO2) was determined as the average VO2 from the final five minutes of a 30-minute supine rest period prior to each trial. Following each protocol, VO2 was measured for 30 minutes or until baseline measures were reached. EPOC was determined by subtracting baseline VO2 from post-exercise VO2 measurements. Energy expenditure (kJ) was determined by multiplying the energy equivalent of oxygen (kJ per liter) by the average VO2 during recovery. EPOC values were significantly higher in SIE (7.5 ± 1.3 L) than MA (1.8 ± 0.7 L). SIE produced a higher recovery caloric expenditure (156.9 kJ) compared to MA (41.0 kJ) and remained significantly elevated (p = .024) over resting levels during the entire recovery period (30 minutes) compared to MA (6 minutes, p = .003). The energy required to recover from three repeated maximal-effort 30-second Wingate cycling tests was greater than that required to recover from 30 minutes of moderate aerobic exercise. Future studies should examine the chronic effects of a maximal-effort sprint training protocol on cardiovascular fitness and body composition.
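    A short Python sketch of the EPOC and recovery energy calculations described above; the VO2 values are simulated, and the oxygen energy equivalent of roughly 20.9 kJ per liter, as well as the choice to base recovery energy on the excess oxygen volume, are assumptions for illustration rather than details taken from the study:

        import numpy as np

        KJ_PER_L_O2 = 20.9  # assumed energy equivalent of oxygen (~5 kcal per liter)

        def epoc_and_recovery_energy(baseline_vo2, recovery_vo2, minutes_per_sample=1.0):
            """Return EPOC (liters of O2 above baseline) and the corresponding
            recovery energy expenditure (kJ) from post-exercise VO2 readings in L/min."""
            excess = np.clip(np.asarray(recovery_vo2) - baseline_vo2, 0.0, None)
            epoc_liters = float(excess.sum() * minutes_per_sample)  # integrate over time
            return epoc_liters, epoc_liters * KJ_PER_L_O2

        # Illustrative 30-minute recovery with one reading per minute (values assumed)
        baseline = 0.30                                       # L/min at supine rest
        recovery = baseline + 0.9 * np.exp(-np.arange(30) / 8.0)
        print(epoc_and_recovery_energy(baseline, recovery))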

    RICORS2040: The need for collaborative research in chronic kidney disease

    Chronic kidney disease (CKD) is a silent and poorly known killer. The current concept of CKD is relatively young and uptake by the public, physicians and health authorities is not widespread. Physicians still confuse CKD with chronic kidney insufficiency or failure. For the wider public and health authorities, CKD evokes kidney replacement therapy (KRT). In Spain, the prevalence of KRT is 0.13%. Thus health authorities may consider CKD a non-issue: very few persons eventually need KRT and, for those in whom kidneys fail, the problem is 'solved' by dialysis or kidney transplantation. However, KRT is the tip of the iceberg in the burden of CKD. The main burden of CKD is accelerated ageing and premature death. The cut-off points for kidney function and kidney damage indexes that define CKD also mark an increased risk for all-cause premature death. CKD is the most prevalent risk factor for lethal coronavirus disease 2019 (COVID-19) and the factor that most increases the risk of death in COVID-19, after old age. Men and women undergoing KRT still have an annual mortality that is 10- to 100-fold higher than similar-age peers, and life expectancy is shortened by ~40 years for young persons on dialysis and by 15 years for young persons with a functioning kidney graft. CKD is expected to become the fifth greatest global cause of death by 2040 and the second greatest cause of death in Spain before the end of the century, a time when one in four Spaniards will have CKD. However, by 2022, CKD will become the only top-15 global predicted cause of death that is not supported by a dedicated, well-funded Centres for Biomedical Research (CIBER) network structure in Spain. Realizing the underestimation of the CKD burden of disease by health authorities, the Decade of the Kidney initiative for 2020-2030 was launched by the American Association of Kidney Patients and the European Kidney Health Alliance. Leading Spanish kidney researchers, grouped in the kidney collaborative research network Red de Investigación Renal, have now applied to the Redes de Investigación Cooperativa Orientadas a Resultados en Salud (RICORS) call for collaborative research in Spain, with the support of the Spanish Society of Nephrology, Federación Nacional de Asociaciones para la Lucha Contra las Enfermedades del Riñón and ONT: RICORS2040 aims to prevent the dire predictions for the global 2040 burden of CKD from coming true.

    Association between loop diuretic dose changes and outcomes in chronic heart failure: observations from the ESC-EORP Heart Failure Long-Term Registry

    Aims. Guidelines recommend down-titration of loop diuretics (LD) once euvolaemia is achieved. In outpatients with heart failure (HF), we investigated LD dose changes in daily cardiology practice, agreement with guideline recommendations, predictors of successful LD down-titration and the association between dose changes and outcomes. Methods and results. We included 8130 HF patients from the ESC-EORP Heart Failure Long-Term Registry. Among patients whose dose was decreased, a successful decrease was defined as a decrease not followed by death, HF hospitalization, New York Heart Association class deterioration, or a subsequent increase in LD dose. Mean age was 66 ± 13 years, 71% were men, 62% had HF with reduced ejection fraction, 19% HF with mid-range ejection fraction and 19% HF with preserved ejection fraction. Median [interquartile range (IQR)] LD dose was 40 (25–80) mg. LD dose was increased in 16%, decreased in 8.3% and unchanged in 76%. Median (IQR) follow-up was 372 (363–419) days. Diuretic dose increase (vs. no change) was associated with HF death [hazard ratio (HR) 1.53, 95% confidence interval (CI) 1.12–2.08; P = 0.008] and nominally with cardiovascular death (HR 1.25, 95% CI 0.96–1.63; P = 0.103). Decrease of diuretic dose (vs. no change) was associated with nominally lower HF (HR 0.59, 95% CI 0.33–1.07; P = 0.083) and cardiovascular mortality (HR 0.62, 95% CI 0.38–1.00; P = 0.052). Among patients who had LD dose decreased, systolic blood pressure [odds ratio (OR) 1.11 per 10 mmHg increase, 95% CI 1.01–1.22; P = 0.032] and absence of (i) sleep apnoea (OR 0.24, 95% CI 0.09–0.69; P = 0.008), (ii) peripheral congestion (OR 0.48, 95% CI 0.29–0.80; P = 0.005), and (iii) moderate/severe mitral regurgitation (OR 0.57, 95% CI 0.37–0.87; P = 0.008) were independently associated with successful decrease. Conclusion. Diuretic dose was unchanged in 76% and decreased in 8.3% of outpatients with chronic HF. LD dose increase was associated with worse outcomes, while the LD dose decrease group showed a trend toward better outcomes compared with the no-change group. Higher systolic blood pressure and absence of (i) sleep apnoea, (ii) peripheral congestion, and (iii) moderate/severe mitral regurgitation were independently associated with successful dose decrease.
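    Adjusted odds ratios like those reported for successful down-titration are typically obtained from a multivariable logistic regression; a minimal Python sketch with simulated data and hypothetical variable names (statsmodels assumed; this is not the registry's analysis code):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        df = pd.DataFrame({
            "successful_decrease": rng.integers(0, 2, n),
            "sbp_per_10mmhg": rng.normal(13.0, 1.5, n),   # systolic BP in 10-mmHg units
            "sleep_apnoea": rng.integers(0, 2, n),
            "peripheral_congestion": rng.integers(0, 2, n),
            "mitral_regurgitation": rng.integers(0, 2, n),
        })

        X = sm.add_constant(df.drop(columns="successful_decrease"))
        fit = sm.Logit(df["successful_decrease"], X).fit(disp=False)

        # Adjusted odds ratios with 95% confidence intervals
        table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
        table.columns = ["OR", "2.5%", "97.5%"]
        print(table)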

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: A systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding: Bill & Melinda Gates Foundation.
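    The attributable-burden step described in the Methods (observed exposure compared against a theoretical-minimum-risk counterfactual) is conventionally expressed through a population attributable fraction; the formulation below is the standard comparative risk assessment form, given as background rather than as the exact GBD 2015 estimator:

        \[
        \mathrm{PAF} = \frac{\sum_{i} P_i\, RR_i - \sum_{i} P_i^{*}\, RR_i}{\sum_{i} P_i\, RR_i},
        \qquad
        \text{attributable DALYs} = \mathrm{PAF} \times \text{total DALYs for the outcome},
        \]

    where \(P_i\) is the observed proportion of the population at exposure level \(i\), \(P_i^{*}\) is the counterfactual (theoretical minimum risk) proportion, and \(RR_i\) is the relative risk at that exposure level.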