
    Cutoff value for predicting success in triathlon mixed team relay

    Introduction: The Mixed-Team-Relay (MTR) triathlon is an original race format present on the international scene since 2009, which became an Olympic event at the Tokyo 2020 Games. The aim of this study was to define the probabilities of reaching a victory, a podium, or a finalist rank in a relay triathlon, according to the position of any of the four relayers (Women/Men/Women/Men) during each of the four segments (legs) of the race. Methods: All MTR results from the World Series, Continental Championships and World Championships from 2009 to 2021, and from the Tokyo 2020 Olympics, were collected. We calculated the set of probability frequencies of reaching a given final state, according to any transient state during the race. All results were compared using Cramér's V. Results: The frequency of winning is similar at the end of Leg 1 for TOP1 (first position) and TOP2-3 (second and third positions). A difference in the winning-associated frequencies is first observed after the Bike stage of Leg 2, where 47% of TOP1 athletes go on to win, vs 13% of the TOP2-3. Discussion: This difference increases continually until the end of the race. Legs 2 and 3 are preponderant in the outcome of the race: the position obtained by each triathlete, especially in swimming and cycling, greatly influences the final performance of the team. Leg 1 allows the team to maintain contact with the head of the race, while Leg 4 sets in stone the position obtained by the rest of the team.
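    The study's association measure, Cramér's V, is computed from a chi-square statistic over a contingency table of transient state versus final outcome. A minimal illustrative sketch in Python; the table layout, counts and labels below are hypothetical placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = position after the Leg 2 bike stage
# (TOP1, TOP2-3), columns = final outcome (win, podium, finalist, other).
# Counts are illustrative only.
table = np.array([
    [47, 30, 15,  8],   # TOP1 after Leg 2 bike
    [13, 32, 28, 27],   # TOP2-3 after Leg 2 bike
])

chi2, p, dof, _ = chi2_contingency(table)
n = table.sum()
k = min(table.shape) - 1             # smaller table dimension minus one
cramers_v = np.sqrt(chi2 / (n * k))  # Cramér's V, bounded between 0 and 1

print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramér's V = {cramers_v:.2f}")
```

    Values near 0 indicate little association between the intermediate position and the final result; values near 1 indicate that the intermediate position largely determines the outcome.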

    How Much Rugby is Too Much? A Seven-Season Prospective Cohort Study of Match Exposure and Injury Risk in Professional Rugby Union Players.

    INTRODUCTION: Numerous studies have documented the incidence and nature of injuries in professional rugby union, but few have identified specific risk factors for injury in this population using appropriate statistical methods. In particular, little is known about the role of previous short-term or longer-term match exposure in current injury risk in this setting. OBJECTIVES: Our objective was to investigate the influence that match exposure has upon injury risk in rugby union. METHOD: We conducted a seven-season (2006/7-2012/13) prospective cohort study of time-loss injuries in 1253 English Premiership professional players. Players' 12-month match exposure (number of matches a player was involved in for ≥20 min in the preceding 12 months) and 1-month match exposure (number of full-game equivalent [FGE] matches in the preceding 30 days) were assessed as risk factors for injury using a nested frailty model and magnitude-based inferences. RESULTS: The 12-month match exposure was associated with injury risk in a non-linear fashion; players who had been involved in fewer than ≈15 or more than ≈35 matches over the preceding 12-month period were more susceptible to injury. Monthly match exposure was linearly associated with injury risk (hazard ratio [HR]: 1.14 per 2 standard deviation [3.2 FGE] increase, 90% confidence interval [CI] 1.08-1.20; likely harmful), although this effect was substantially attenuated for players in the upper quartile of 12-month match exposure (>28 matches). CONCLUSION: A player's accumulated (12-month) and recent (1-month) match exposure substantially influences their current injury risk. Careful attention should be paid to planning the workloads and monitoring the responses of players involved in: (1) a high (>≈35) number of matches in the previous year, (2) a low (<≈15) number of matches in the previous year, and (3) a low-to-moderate number of matches in the previous year but who have played intensively in the recent past. These findings make a major contribution to evidence-based policy decisions regarding match workload limits in professional rugby union.
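    Because a Cox-type model is linear on the log-hazard scale, the reported hazard ratio per 2-SD (3.2 FGE) increase in monthly exposure can be rescaled to a per-match figure. This is a sketch of that arithmetic only, using the values reported in the abstract; the nested frailty model itself is not reproduced here:

```python
import math

# Reported effect: HR = 1.14 per 2-SD increase in 1-month exposure,
# where 2 SD corresponds to 3.2 full-game-equivalent (FGE) matches.
hr_per_2sd = 1.14
two_sd_fge = 3.2

# Rescale the log-hazard coefficient to a per-FGE-match basis.
beta_per_fge = math.log(hr_per_2sd) / two_sd_fge
hr_per_fge = math.exp(beta_per_fge)

# The 90% CI endpoints (1.08-1.20 per 2 SD) rescale the same way.
ci_per_fge = [math.exp(math.log(x) / two_sd_fge) for x in (1.08, 1.20)]

print(f"HR per additional FGE match: {hr_per_fge:.3f} "
      f"(90% CI {ci_per_fge[0]:.3f}-{ci_per_fge[1]:.3f})")
```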

    COL5A1 gene variants previously associated with reduced soft tissue injury risk are associated with elite athlete status in rugby.

    BACKGROUND: Two common single nucleotide polymorphisms (SNPs) within the COL5A1 gene (rs12722 C/T and rs3196378 C/A) have previously been associated with tendon and ligament pathologies. Given the high incidence of tendon and ligament injuries in elite rugby athletes, we hypothesised that both SNPs would be associated with career success. RESULTS: In 1105 participants (RugbyGene project), comprising 460 elite rugby union (RU) athletes, 88 elite rugby league athletes and 565 non-athlete controls, DNA was collected and genotyped for the COL5A1 rs12722 and rs3196378 variants using real-time PCR. For rs12722, the injury-protective CC genotype and C allele were more common in all athletes (21% and 47%, respectively) and in RU athletes (22% and 48%) than in controls (16% and 41%; P ≤ 0.01). For rs3196378, the CC genotype and C allele were overrepresented in all athletes (23% and 48%) and RU athletes (24% and 49%) compared with controls (16% and 41%; P ≤ 0.02). The CC genotype in particular was overrepresented in the back and centres (24%) compared with controls, with more than twice the odds (OR = 2.25, P = 0.006) of possessing the injury-protective CC genotype. Furthermore, when considering both SNPs simultaneously, the CC-CC SNP-SNP combination and the C-C inferred allele combination were more frequent in all the athlete groups (≥18% and ≥43%) than in controls (13% and 40%; P = 0.01). However, no genotype differences were identified for either SNP when RU playing positions were compared directly with each other. CONCLUSION: It appears that the C alleles, CC genotypes and the resulting combinations of both rs12722 and rs3196378 are beneficial for rugby athletes in achieving elite status, and carriage of these variants may impart an inherited resistance against soft tissue injury despite exposure to the high-risk environment of elite rugby. These data have implications for the management of inter-individual differences in injury risk amongst elite athletes.
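    The genotype comparisons reduce to odds ratios on 2x2 tables of genotype by group. A hedged sketch using hypothetical counts chosen only to echo the reported percentages for rs12722 (roughly 21% CC in athletes vs 16% in controls); these are not the RugbyGene data:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = group (athletes, controls),
# columns = rs12722 genotype (CC, CT/TT). Counts are illustrative only.
table = [
    [115, 433],   # athletes:  CC, non-CC  (~21% CC)
    [ 90, 475],   # controls:  CC, non-CC  (~16% CC)
]

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, Fisher exact p = {p_value:.4f}")
```

    An odds ratio above 1 here would indicate that the CC genotype is more frequent among athletes than controls, mirroring the comparison reported in the abstract.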

    Charge d’entraînement : de la définition du concept aux méthodes de quantification [Training load: From concept definition to quantification methods]

    Objective: As the concept of training load grows in popularity, inconsistencies in the definitions and quantification methods used in the literature have become more and more apparent. To limit such disagreements, the principles underlying the concept were studied in order to propose an operational definition of training load, which was subsequently used to analyze the accuracy of quantification methods. News: Training load may be defined as the value describing the dose of effort induced by the combination of the exercise variables. Training load metrics should not be skewed by excess weighting of exercise volume, intensity, or density. Early methods based on the product of intensity, volume, and density do not take into account the non-linear nature of increases in their components, and volume is overexpressed in their training load calculations. Conversely, reaching fatigue may accurately reflect the combined effects of all exercise variables by signaling the maximal psychophysiological stress and, in consequence, the maximal attainable training load. Prospects and projects: Fatigue-based quantification methods require a better understanding and knowledge of exercise maximums. New technologies for athlete monitoring might help to identify and record such maximums. Conclusion: The present paper argues that, to compare the effects of exercises and to accumulate the exercises of a program into a total training load, the dose should be expressed relative to exercise maximums, which leads to fatigue-based quantification methods.
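    One way to make the contrast between product-based and maximum-relative (fatigue-based) quantification concrete is sketched below. The functions, the choice of RPE as the intensity measure and the maximum values are illustrative assumptions, not the paper's method:

```python
# Product-based load (session-RPE style): intensity x volume.
# Volume tends to dominate the score because it enters linearly in larger units.
def load_product(rpe, duration_min):
    return rpe * duration_min

# Maximum-relative load: express each variable as a fraction of an assumed maximum,
# so that no single variable is overexpressed by its units.
def load_relative(rpe, duration_min, rpe_max, max_duration_at_rpe):
    return (rpe / rpe_max) * (duration_min / max_duration_at_rpe)

# Illustrative comparison: RPE 5 for 90 min vs RPE 9 for 30 min.
print(load_product(5, 90), load_product(9, 30))       # 450 vs 270: longer session scores higher
print(round(load_relative(5, 90, 10, 240), 2),        # 0.19
      round(load_relative(9, 30, 10, 40), 2))         # 0.68: harder, shorter session scores higher
```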

    Impact of fruit and vegetable vouchers and dietary advice on fruit and vegetable intake in a low-income population

    Background/Objectives: Lower-income subgroups consume fewer servings of fruit and vegetables (FVs) than their more advantaged counterparts. To overcome financial barriers, FV voucher delivery has been proposed. Subjects/Methods: In a 12-month trial, 302 low-income adults aged 18-60 years (defined by evaluation of deprivation and inequalities in health examination centres, a specific deprivation score) were randomized into two groups: dietary advice alone ('advice'), or dietary advice plus FV vouchers ('FV vouchers'; 10-40 euros/month) exchangeable for fresh fruits and vegetables. Self-reported data on FV consumption and socioeconomic status were collected at baseline and at 3, 9 and 12 months. Anthropometric and blood pressure measurements were taken at the same time points, and blood samples were obtained for determination of vitamin levels. Descriptive analyses, multiple linear regression and logistic regression were performed to evaluate the impact of the FV vouchers. Results: Between baseline and the 3-month follow-up, mean FV consumption increased significantly in both the 'advice' (0.62 +/- 1.29 times/day, P=0.0004) and 'FV vouchers' groups (0.74 +/- 1.90, P=0.002), with no difference between groups. Subjects in the FV vouchers group had a significantly decreased risk of low FV consumption (<1 time/day) compared with those in the advice group (P=0.008). No change was noted in vitamin levels (vitamin C and beta-carotene). The high number of participants lost to follow-up did not permit analysis at 9 or 12 months. Conclusion: In this low-income population, FV voucher delivery decreased the proportion of low FV consumers at 3 months. Longer-term studies are needed to assess the impact on nutritional status. European Journal of Clinical Nutrition (2012) 66, 369-375; doi: 10.1038/ejcn.2011.173; published online 12 October 2011.
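    The group comparison of the risk of low FV consumption (<1 time/day) corresponds to a logistic regression of a binary outcome on treatment group. A minimal sketch on simulated data; the variable names, probabilities and effect size are illustrative, not the trial's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated data: 302 participants randomized 1:1 to 'advice' (0) or 'FV vouchers' (1).
n = 302
voucher = rng.integers(0, 2, n)
p_low = np.where(voucher == 1, 0.20, 0.35)   # illustrative probabilities of low FV intake
low_fv = rng.binomial(1, p_low)              # 1 = consumes FV < 1 time/day

df = pd.DataFrame({"low_fv": low_fv, "voucher": voucher})

# Logistic regression of low FV consumption on treatment group.
model = smf.logit("low_fv ~ voucher", data=df).fit(disp=False)
odds_ratio = np.exp(model.params["voucher"])
print(model.summary())
print(f"OR (vouchers vs advice) = {odds_ratio:.2f}")
```

    An odds ratio below 1 for the voucher term would correspond to the reported reduction in the proportion of low FV consumers in the voucher group.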