67 research outputs found
Aerobic Fitness Level Affects Cardiovascular and Salivary Alpha Amylase Responses to Acute Psychosocial Stress
BACKGROUND: Good physical fitness appears to help individuals buffer the potentially harmful impact of psychosocial stress on somatic and mental health. The aim of the present study was to investigate the role of physical fitness level in the autonomic nervous system (ANS; i.e. heart rate and salivary alpha amylase) response to acute psychosocial stress, while controlling for established factors that influence individual stress reactions.
METHODS: The Trier Social Stress Test for Groups (TSST-G) was conducted with 302 male recruits during their first week of Swiss Army basic training. Heart rate was measured continuously, and salivary alpha amylase was measured twice, before and after the stress intervention. In the same week, all volunteers completed a physical fitness test and questionnaires on lifestyle factors and personality traits. A multiple linear regression analysis was conducted to predict ANS responses to acute psychosocial stress from physical fitness test performance, controlling for personality traits, behavioural factors, and socioeconomic data.
RESULTS: Multiple linear regression revealed three variables predicting 15% of the variance in the heart rate response (area under the individual heart rate response curve during the TSST-G) and four variables predicting 12% of the variance in the salivary alpha amylase response (salivary alpha amylase level immediately after the TSST-G) to acute psychosocial stress. Strong performance in the progressive endurance run (high maximal oxygen consumption) was a significant predictor of the ANS response in both models: a low area under the heart rate response curve during the TSST-G as well as a low salivary alpha amylase level after the TSST-G. Furthermore, high muscle power, non-smoking, high extraversion, and low agreeableness each predicted a favourable ANS response in one of the two dependent variables.
CONCLUSIONS: Good physical fitness, especially good aerobic endurance capacity, is an important protective factor against health-threatening reactions to acute psychosocial stress.
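To make the heart-rate outcome above concrete, the following is a minimal, hypothetical Python sketch of how an area under an individual heart rate response curve can be computed with the trapezoidal rule. The sampling rate, duration, baseline value, and baseline correction are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np

# Hypothetical example: heart rate sampled once per second during a stress task.
time_s = np.arange(0, 600)                        # 10 min of stress exposure, 1 Hz
heart_rate_bpm = 80 + 20 * np.exp(-time_s / 300)  # synthetic response curve

baseline_bpm = 70.0  # assumed pre-stress resting value (illustrative)

# Area under the curve relative to baseline (trapezoidal rule), a common way
# to summarise a stress response in a single number.
auc_above_baseline = np.trapz(heart_rate_bpm - baseline_bpm, time_s)
print(f"AUC above baseline: {auc_above_baseline:.0f} bpm*s")
```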
Transformational Leadership, Achievement Motivation, and Perceived Stress in Basic Military Training: A Longitudinal Study of Swiss Armed Forces
In Switzerland, military service is a civic obligation for all adult male citizens, and thus leadership style can be particularly challenging. The present study investigated the impact of superiors’ leadership styles on recruits’ achievement motivation, organizational citizenship behavior (OCB), and perceived stress during Basic Military Training (BMT). To this end, a total of 525 male recruits (mean age: 20.3 years) were assessed both cross-sectionally and longitudinally. At the start of BMT (baseline), at week 7, and at week 11, participants completed a series of self-rating questionnaires covering demographic information, achievement motivation, OCB, perceived stress, and their superiors’ leadership styles (transformational, transactional and laissez-faire). Longitudinally, scores for achievement motivation and OCB showed no significant difference between baseline and week 11. In a group comparison, the group experiencing higher transformational leadership (from week 7 to week 11) had the highest scores for achievement motivation and OCB and the lowest scores for perceived stress, all at week 11. Exploratively, achievement motivation and OCB at baseline were associated with transformational and transactional leadership at week 7 and week 11. Perceived stress at baseline correlated only with transformational leadership, not with transactional leadership, at both week 7 and week 11. A transformational leadership style fostered achievement motivation and OCB in Swiss military recruits and protected them from stress, both cross-sectionally and longitudinally.
Evaluation of pulse rate measurement with a wrist-worn device during different tasks and physical activity
The purpose of this study was to evaluate the wrist-worn device Mio FUSE, which estimates heart rate (HR) based on photoplethysmography, 1) in a large study group during a standardised activity, 2) in a small group during a variety of activities, and 3) to investigate factors affecting HR accuracy in a real-world setting. First, 53 male participants (20 ±1 years; 1.79 ±0.07 m; 76.1 ±10.5 kg) completed a 35-km march wearing the Equivital EQ-02 as the criterion measure. Second, 5 participants (of whom 3 were female; 29 ±5 years; 1.74 ±0.07 m; 67.8 ±11.1 kg) independently performed 25 activities, categorised as sitting passive, sitting active, standing, cyclic and anti-cyclic activities, with the Polar H7 as the criterion device. Equivalence testing and Bland-Altman analyses were undertaken to assess accuracy against the criterion devices. Third, confounders affecting HR accuracy were investigated using multiple backwards regression analyses. The Mio FUSE was equivalent to the respective criterion measures, with only small systematic biases of -3.5 bpm (-2.6%) and -1.7 bpm (-1.3%) and limits of agreement of ±10.1 bpm and ±10.8 bpm during the 35-km march and during the different activities, respectively. Confounding factors negatively affecting the accuracy of the Mio FUSE included larger wrist size and intensified arm and/or wrist movement. The wrist-worn Mio FUSE can be recommended for estimating overall HR accurately across different types of activities in healthy adults. However, during sporting activities involving intensified arm and/or wrist movement, or for detailed continuous analysis, a chest strap is preferred over the Mio FUSE to optimise HR estimation accuracy.
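As an illustration of the agreement statistics reported above, here is a minimal Python sketch of a Bland-Altman style calculation of systematic bias and 95% limits of agreement. The paired readings and the helper name bland_altman are hypothetical and do not reproduce the study's preprocessing.

```python
import numpy as np

def bland_altman(device_hr, criterion_hr):
    """Systematic bias and 95% limits of agreement between paired HR readings.

    Hypothetical helper; epoch length and artefact handling from the study
    are not reproduced here.
    """
    device_hr = np.asarray(device_hr, dtype=float)
    criterion_hr = np.asarray(criterion_hr, dtype=float)
    diff = device_hr - criterion_hr        # negative bias = device reads low
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)   # half-width of the limits of agreement
    return bias, (bias - half_width, bias + half_width)

# Example with made-up paired readings (bpm)
bias, (lower, upper) = bland_altman([98, 120, 135], [101, 122, 140])
print(f"bias = {bias:.1f} bpm, LoA = [{lower:.1f}, {upper:.1f}] bpm")
```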
Validation of ambulatory monitoring devices to measure energy expenditure and heart rate in a military setting
Objectives: To investigate the validity of different devices and algorithms used in military organizations worldwide to assess physical activity energy expenditure (PAEE) and heart rate (HR) among soldiers. Design: Device validation study. Methods: Twenty-three male participants serving their mandatory military service completed, first, nine different military-specific activities indoors and, second, a normal military routine outdoors. Participants simultaneously wore an ActiHeart, Everion, MetaMax 3B, Garmin Fenix 3, Hidalgo EQ02, and PADIS 2.0 system. The PAEE and HR data of each system were compared to the criterion measures MetaMax 3B and Hidalgo EQ02, respectively. Results: Overall, the recorded systematic errors in PAEE estimation ranged from 0.1 (±1.8) kcal·min⁻¹ to -1.7 (±1.8) kcal·min⁻¹ for the PADIS 2.0 system and the Hidalgo EQ02 running the Royal Dutch Army algorithm, respectively, and in the HR assessment ranged from -0.1 (±2.1) b·min⁻¹ to 0.8 (±3.0) b·min⁻¹ for the PADIS 2.0 and ActiHeart systems, respectively. The mean absolute percentage error (MAPE) in PAEE estimation ranged from 29.9% to 75.1%, with only the Everion system showing an overall MAPE
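The following is a minimal, hypothetical Python sketch of the mean absolute percentage error (MAPE) statistic quoted above, computed against a criterion measurement. The example values and the helper name mape are illustrative only; the study's minute-by-minute aggregation is not reproduced.

```python
import numpy as np

def mape(estimated, criterion):
    """Mean absolute percentage error of estimated vs. criterion values."""
    estimated = np.asarray(estimated, dtype=float)
    criterion = np.asarray(criterion, dtype=float)
    return float(np.mean(np.abs((estimated - criterion) / criterion)) * 100)

# Example: PAEE estimates (kcal/min) from a device vs. a criterion system
print(f"MAPE = {mape([4.0, 6.5, 9.0], [5.0, 7.0, 12.0]):.1f} %")
```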
Children’s and adolescents’ rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity
Animal-source foods (ASF) provide nutrition for children’s and adolescents’ physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world’s child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
Incident type 2 diabetes attributable to suboptimal diet in 184 countries
The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, suboptimal intake of these dietary factors was estimated to be attributable to 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases, representing 70.3% (68.8–71.8%) of new cases globally. Largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)); and lowest proportional burdens were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and higher versus lower educated individuals, except in high-income countries, central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally
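For orientation, here is a minimal Python sketch of Levin's population attributable fraction formula, the simplest form of the "attributable proportion" idea quoted above. The study itself used a comparative risk assessment model over full intake distributions with uncertainty propagation, which this sketch does not reproduce; the prevalence and relative-risk values are made up.

```python
def paf(exposure_prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1)."""
    excess = exposure_prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# e.g. 60% of adults with insufficient whole-grain intake and RR = 1.5 (made-up numbers)
print(f"PAF = {paf(0.60, 1.5):.1%}")
```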
Correction to: Risk-reducing salpingo-oophorectomy, natural menopause, and breast cancer risk: an international prospective cohort of BRCA1 and BRCA2 mutation carriers.
After publication of the original article [1], we were notified that the columns in Table 2 were erroneously displayed.
A Solve-RD ClinVar-based reanalysis of 1522 index cases from ERN-ITHACA reveals common pitfalls and misinterpretations in exome sequencing
Purpose
Within the Solve-RD project (https://solve-rd.eu/), the European Reference Network for Intellectual disability, TeleHealth, Autism and Congenital Anomalies aimed to investigate whether a reanalysis of exomes from unsolved cases based on ClinVar annotations could establish additional diagnoses. We present the results of the “ClinVar low-hanging fruit” reanalysis, reasons for the failure of previous analyses, and lessons learned.
Methods
Data from the first 3576 exomes (1522 probands and 2054 relatives) collected from the European Reference Network for Intellectual disability, TeleHealth, Autism and Congenital Anomalies were reanalyzed by the Solve-RD consortium by evaluating the presence of single-nucleotide variants and small insertions and deletions already reported as (likely) pathogenic in ClinVar. Variants were filtered according to frequency, genotype, and mode of inheritance, and then reinterpreted.
Results
We identified causal variants in 59 cases (3.9%), 50 of them also identified by other approaches and 9 leading to new diagnoses, highlighting interpretation challenges: variants in genes not known to be involved in human disease at the time of the first analysis, misleading genotypes, or variants undetected by local pipelines (variants in off-target regions, low quality filters, low allelic balance, or high frequency).
Conclusion
The “ClinVar low-hanging fruit” analysis represents an effective, fast, and easy approach to recovering causal variants from exome sequencing data, thereby contributing to reducing the diagnostic deadlock.
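To illustrate the kind of filter the “ClinVar low-hanging fruit” reanalysis describes, below is a minimal, hypothetical Python sketch that retains variants already classified as (likely) pathogenic in ClinVar and applies simple frequency and genotype filters before manual reinterpretation. Field names, thresholds and the function candidate_variants are assumptions, not the Solve-RD pipeline's actual parameters.

```python
# Illustrative ClinVar classifications treated as reportable.
PATHOGENIC = {"Pathogenic", "Likely_pathogenic", "Pathogenic/Likely_pathogenic"}

def candidate_variants(variants, max_af=0.01):
    """Yield variants worth manual reinterpretation (hypothetical filter)."""
    for v in variants:
        if v["clinvar_significance"] not in PATHOGENIC:
            continue                                   # not flagged in ClinVar
        if v["population_af"] > max_af:
            continue                                   # too common for rare disease
        if v["inheritance"] == "AR" and v["genotype"] == "het":
            continue                                   # recessive gene, single het hit
        yield v

# Tiny usage example with a made-up variant record.
example = [{"clinvar_significance": "Pathogenic", "population_af": 0.0001,
            "inheritance": "AD", "genotype": "het"}]
print(list(candidate_variants(example)))
```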