Bone strength, load tolerance and injury risk in elite Australian football
A paucity of research exists to characterise and investigate lower-body musculoskeletal characteristics and morphological adaptations in elite Australian Footballers with the aim of improving screening, monitoring and load management practices. Given the high prevalence of lower-body skeletal injuries in Australian Football, and the ability to measure, modify and train muscle and bone strength and their derivatives, this project served to extend scientific understanding of musculoskeletal morphology and bone strength characteristics in elite-level field-based team sport athletes through a series of research studies using Dual-energy X-ray Absorptiometry (DXA) and peripheral Quantitative Computed Tomography (pQCT). In particular, studies one and two provided normative and comparative lower-body musculoskeletal profiles of elite Australian Footballers, stratified by training age (exposure), limb function (asymmetry) and injury incidence (stress fracture), while study three quantified the morphological changes and magnitude of adaptation and maladaptation experienced by Australian Footballers following an in-season and off-season annual phase. The general conclusion provided by the collective studies of this thesis promotes the importance of bone structure and geometry as potent contributors to skeletal robustness and bone strength. Athletes with higher levels of training exposure and greater physical resilience exhibited higher tibial mass and cortical density, with thicker cortical walls and larger muscle and bone cross-sectional areas. Asymmetrical adaptations arising from differential loading patterns between limbs throughout an in-season and off-season generate vastly different unilateral load tolerance capabilities when extrapolated over time. The high-impact gravitational loads experienced by the support limb appear to optimise the development of robust skeletal properties specific to bone structure and geometry, which may serve as a loading model to prophylactically enhance bilateral musculoskeletal strength and resilience.
Study one provided a set of normative and comparative lower-body musculoskeletal values to describe and compare muscle and bone morphology between less experienced and more experienced athletes (training age), and differential loading patterns between the kicking and support limbs (limb function). Fifty-five athletes were stratified into less experienced (≤ 3 years; n = 27) and more experienced (> 3 years; n = 28) groups in accordance with their training age. All athletes underwent whole-body DXA scans and lower-body pQCT tibial scans of the kicking and support limbs. More experienced players exhibited greater tibial mass, trabecular vBMD, cortical vBMD and total vBMD (p < 0.009; d ≥ 0.79); greater cortical thickness and cortical area (p < 0.001; d ≥ 0.92); and larger stress-strain indices and absolute fracture loads (p ≤ 0.018; d ≥ 0.57) than less experienced players. More experienced players also exhibited greater muscle mass and muscle cross-sectional area (p ≤ 0.016; d ≥ 0.68). Differences were also observed between limbs, with greater material (tibial mass and cortical vBMD), structural (trabecular area, cortical area, total area, periosteal area and cortical thickness) and strength (stress-strain index and absolute fracture load) characteristics evident in the support leg compared with the kicking leg of more experienced players (d ≥ 0.20), and significantly higher asymmetries in tibial mass and cross-sectional area evident in more experienced players than less experienced players as a product of limb function over time. The findings of this study illustrate that training exposure and continued participation in Australian Football produced greater lower-body material, structural and strength adaptations, with chronic exposure to asymmetrical loading patterns developing differential morphological changes between the kicking and support limbs. Indeed, routine high-impact, gravitational load afforded to the support limb preferentially improves bone structure and geometry (cross-sectional area and thickness) as potent contributors to bone strength and skeletal fatigue resistance.
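The between-group effect sizes and between-limb asymmetries reported above can be expressed with simple formulas. Below is a minimal Python sketch of one plausible implementation; the thesis does not publish its analysis code, so the pooled-SD form of Cohen's d, the mean-referenced asymmetry convention, and all example values are assumptions for illustration only.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d using a pooled standard deviation (assumed form of the
    reported effect sizes, e.g. d >= 0.79 for the vBMD comparisons)."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    pooled_sd = np.sqrt(((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1))
                        / (a.size + b.size - 2))
    return (a.mean() - b.mean()) / pooled_sd

def limb_asymmetry_pct(support, kicking):
    """Between-limb asymmetry (%), referenced to the mean of both limbs.
    Positive values favour the support limb (assumed convention)."""
    support, kicking = np.asarray(support, float), np.asarray(kicking, float)
    return (support - kicking) / ((support + kicking) / 2) * 100

# Hypothetical tibial-mass values (g), not data from the thesis.
more_experienced = np.array([430, 445, 452, 460, 438])
less_experienced = np.array([400, 412, 405, 398, 420])
print(round(cohens_d(more_experienced, less_experienced), 2))
print(np.round(limb_asymmetry_pct(more_experienced, less_experienced), 1))
```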
Study two provided a retrospective and comparative set of lower-body musculoskeletal data to describe and compare muscle and bone morphology between injured and non-injured Australian Football athletes, in addition to injured and non-injured limbs within injured players, in order to identify musculoskeletal characteristics which may predispose athletes to stress fractures or highlight skeletal fragility. Fifty-five athletes were stratified into injured (n = 13) and non-injured (n = 42) groups. All athletes underwent whole-body DXA scans and lower-body pQCT tibial scans across both limbs. Injured players exhibited lower tibial mass (p ≤ 0.019; d ≥ 0.68), cortical vBMD (d ≥ 0.38) and marrow vBMD (d ≥ 0.21); smaller cortical area and periosteal area (p ≤ 0.039; d ≥ 0.63); smaller trabecular area, marrow area, total area, endocortical area and cortical thickness (d ≥ 0.22); and lower stress-strain indices, absolute fracture loads and relative fracture loads (support leg: p ≤ 0.043; d ≥ 0.70; kicking leg: d ≥ 0.48) than non-injured players. Injured players also exhibited lower muscle cross-sectional area and muscle mass (p ≤ 0.034; d ≥ 0.79), yet higher muscle density (d ≥ 0.28), than non-injured players. Differences between injured and non-injured limbs within injured players were also observed, with lower material (tibial mass and total vBMD), structural (cortical area and cortical thickness) and strength (stress-strain index and relative fracture load) characteristics in the injured limb compared with the non-injured limb (d = 0.20 – 0.70). Muscle density was lower in the injured limb (d = 0.54). The findings of this study illustrate a general inferiority and global musculoskeletal weakness in injured players, with non-injured players ~10-12% stronger across both limbs. Injured players were skeletally slender, with smaller muscle and bone cross-sectional areas and thinner cortices. Similarly, injured limbs of injured players also exhibited smaller structural proportions, highlighting the importance of cortical area and cortical thickness as key structural and geometric skeletal properties with potent contributions to bone strength and resilience.
Study three provided a seasonal investigation into lower-body musculoskeletal adaptations over the course of a ~26-week in-season and ~10-week off-season period in Australian Football. Forty athletes (n = 40) and twenty-two athletes (n = 22) were recruited to quantify morphological changes in muscle and bone following the in-season and off-season periods, respectively. All athletes underwent whole-body DXA scans and lower-body pQCT tibial scans of the kicking and support limbs at the commencement and conclusion of each period. Australian Football athletes exhibited increases in trabecular vBMD, total vBMD and cortical thickness in the kicking leg, and increased cortical vBMD, total vBMD, trabecular area, total area, periosteal area and cortical thickness with reduced endocortical area in the support leg following the in-season period. Percent changes between limbs were significantly different for trabecular vBMD, cortical vBMD, total vBMD and trabecular area (p ≤ 0.049; d ≥ 0.46), despite similar increments in bone strength (~44 – 50 N), demonstrating asymmetrical morphological responses to differential loading patterns in-season. Conversely, Australian Football athletes exhibited material decreases in tibial mass, trabecular vBMD, cortical vBMD and total vBMD in both limbs over the off-season, of similar yet opposite magnitude to the benefits accrued during the in-season, in addition to reduced muscle area, highlighting a general musculoskeletal de-training effect. Structural adaptations were mostly maintained or increased for both limbs over the off-season, with bone strength completely reversed in the kicking leg yet wholly preserved in the support leg, a lasting adaptation from regular high-impact, gravitational loading specific to the support leg. The findings of this study illustrate the osteogenic potential of a ~26-week in-season and the de-training potential of a ~10-week off-season. Specifically, the kicking and support limbs continued to show asymmetrical morphological adaptations to differential in-season and off-season loading and de-loading patterns.
A kinanthropometric analysis of accurate and inaccurate kickers: Implications for kicking accuracy in Australian football
A paucity of research exists investigating the potential relationship between the technical and temporal strategy of accurate and inaccurate kickers in response to physical parameters modifiable by athletic conditioning. While recent studies have produced improvements in performance when kicking for distance following structured resistance training interventions, no studies have examined the influence of such interventions on the enhancement of kicking accuracy. It was therefore the purpose of this thesis to extend scientific understanding of those mechanisms which might underpin accurate kicking performances through examining kinanthropometric, strength and muscularity profiles of accurate and inaccurate kickers in Australian Football using a series of research studies. In particular, studies one and two established valid and reliable measurement protocols, while studies three, four and five quantified whole-body composition, anthropometrics, segmental masses of the lower limbs, unilateral and bilateral lower-body strength, and lower limb kinematics during the drop punt. Study one established a standardised and reliable body positioning and scan analysis model using Dual Energy X-ray Absorptiometry (DEXA) to accurately identify and assess appendicular segmental mass components (upper arm, forearm, hand, thigh, shank and foot segments), producing very high intra-tester reliability (CV ≤ 2.6%; ICC ≥ 0.941) and very high inter-tester reliability (CV ≤ 2.4%; ICC ≥ 0.961). This methodological determination of intralimb and interlimb quantities of lean, fat and total mass could be used by strength and conditioning practitioners to monitor the efficacy of training interventions; track athletes during long-term athletic development programs; or identify potential deficiencies acquired throughout injury onset and during rehabilitation. Study two assessed a portable isometric lower-body strength testing device, successfully demonstrating its ability to derive valid and reliable representations of maximal isometric force (peak force) under bilateral and unilateral conditions (CV ≤ 4.7%; ICC ≥ 0.961). This device was unable to reliably determine rate of force development across either bilateral or unilateral conditions (CV: 14.5% – 45.5%; ICC: 0.360 – 0.943), and required an extra second of contraction time to achieve peak force (p < 0.001). The portable apparatus may provide a more sport-specific assessment of maximal strength in sports where balance is an important component, such as the support leg during the kicking motion. Using the methodological approach established in study one, study three was a descriptive study which assessed the lower limb segmental profile of accurate and inaccurate kickers. A noticeable difference in leg mass characteristics was evident, with accurate kickers possessing significantly greater quantities of relative lean mass (p ≤ 0.004; r = 0.426 to 0.698), significantly lower quantities of relative fat mass (p ≤ 0.024; r = -0.431 to -0.585), and significantly higher lean-to-fat mass ratios (p ≤ 0.009; r = 0.482 to 0.622) across all segments within both kicking and support limbs. To examine how these lower limb characteristics might adjust biomechanical strategy, study four used the methodological approach from study one in conjunction with three-dimensional kinematic data. No relationship was found between foot velocity and kicking accuracy (r = -0.035 to -0.083).
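The reliability statistics quoted for studies one and two (CV and ICC) summarise agreement across repeated analyses or trials. The following sketch shows one common way such test-retest reliability can be computed; the abstract does not state which ICC model was used, so ICC(3,1) (two-way mixed, consistency) and the example values are assumptions.

```python
import numpy as np

def test_retest_reliability(x):
    """Within-subject CV (%) and ICC(3,1) for an n-subjects x k-trials matrix.
    ICC(3,1) is assumed here; the thesis may have used a different ICC model."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape

    # Coefficient of variation: per-subject SD over per-subject mean, averaged.
    cv = float(np.mean(x.std(axis=1, ddof=1) / x.mean(axis=1)) * 100)

    # Two-way ANOVA mean squares for the ICC.
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between trials/raters
    ss_total = np.sum((x - grand) ** 2)
    ms_rows = ss_rows / (n - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    icc_3_1 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
    return cv, float(icc_3_1)

# Hypothetical repeated DEXA thigh lean-mass analyses (kg), two passes per athlete.
scans = np.array([[7.8, 7.9], [8.4, 8.3], [9.1, 9.0], [7.2, 7.3], [8.8, 8.8]])
print(test_retest_reliability(scans))
```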
Instead, it was the co-contribution of leg mass and foot velocity which discriminated between accurate and inaccurate kickers. A significant and strong correlation was also found between relative lean mass and kicking accuracy (p ≤ 0.001; r = 0.631). Greater relative lean mass within accurate kickers may heighten limb control due to reduced volitional effort and lower relative muscular impulses required to generate limb velocity. Study five, the final study of the thesis, assessed lower limb strength and muscularity using methodologies presented in studies one and two. Study five successfully demonstrated a positive relationship between relative bilateral strength and support-leg unilateral strength with kicking accuracy outcomes (r = 0.379 to 0.401). A significant negative relationship was established between strength imbalances and kicking accuracy (p = 0.002; r = 0.516), supported by the significant positive relationship between the limb symmetry index for lean mass quantities and kicking accuracy outcomes (p = 0.003 to 0.029; r = 0.312 to 0.402). This highlighted the potential benefit of greater limb symmetry for strength and muscularity between kicking and support limbs within Australian Footballers, with particular emphasis placed toward support leg strength. The general conclusion provided by the thesis promotes the importance and positive influence of relative lean mass and lower-body strength on kicking accuracy during the drop punt. The findings provide a valid rationale for strength and conditioning professionals and skill acquisition coaches to properly consider an athlete's strength, muscularity and body mass profiles when attempting to improve kicking performance. Given the cross-sectional nature of the thesis, longitudinal resistance training studies should be attempted in future, to establish interventions which may heighten athletic conditioning and technical proficiency in football sports, with an express aim to improve drop punt kicking accuracy.
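The limb symmetry index referenced in study five relates the kicking and support limbs on a 0-100% scale and is then correlated with accuracy outcomes. A small sketch follows, assuming the common "smaller over larger" convention and wholly hypothetical lean-mass and accuracy data, since the thesis's exact formula and data are not reproduced here.

```python
import numpy as np
from scipy import stats

def limb_symmetry_index(kicking, support):
    """Limb symmetry index (%) for lean mass: 100% = perfect symmetry.
    The 'smaller over larger' convention is an assumption, not the thesis's
    documented formula."""
    kicking, support = np.asarray(kicking, float), np.asarray(support, float)
    return np.minimum(kicking, support) / np.maximum(kicking, support) * 100

# Hypothetical per-athlete data for illustration only.
kick_lean = np.array([8.1, 7.6, 8.9, 7.2, 8.4, 7.9])   # kg
supp_lean = np.array([8.3, 7.9, 8.8, 7.9, 8.5, 8.6])   # kg
accuracy  = np.array([72, 60, 81, 55, 75, 58])          # arbitrary accuracy score

lsi = limb_symmetry_index(kick_lean, supp_lean)
r, p = stats.pearsonr(lsi, accuracy)
print(np.round(lsi, 1), round(r, 2), round(p, 3))
```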
The potential role of genetic markers in talent identification and athlete assessment in elite sport
In elite sporting codes, the identification and promotion of future athletes into specialized talent pathways is heavily reliant upon objective physical, technical, and tactical characteristics, in addition to subjective coach assessments. Despite the availability of a plethora of assessments, the dependence on subjective forms of identification remains commonplace in most sporting codes. More recently, genetic markers, including several single nucleotide polymorphisms (SNPs), have been correlated with enhanced aerobic capacity, strength, and an overall increase in athletic ability. In this review, we discuss the effects of a number of candidate genes on athletic performance, across single-skilled and multifaceted sporting codes, and propose additional markers for the identification of motor skill acquisition and learning. While displaying some inconsistencies, both the ACE and ACTN3 polymorphisms appear to be more prevalent in strength and endurance sporting teams, and have been found to correlate with physical assessments. More recently, a number of polymorphisms reportedly correlating with athlete performance have gained attention; however, inconsistent research designs and varying sports make it difficult to ascertain the relevance to the wider sporting population. In elucidating the role of genetic markers in athleticism, existing talent identification protocols may be significantly improved, ultimately enabling targeted resourcing in junior talent pathways.
Impact of Diet and Quality Grade on Shelf Life of Beef Steaks
Steers were fed a diet containing dry rolled corn, steam flaked corn, dry rolled corn with 30% dried distillers grains, or steam flaked corn with 30% dried distillers grains. Strip loins from upper 2/3 Choice and Select-grade carcasses were obtained to evaluate the effects of diet and quality grade on shelf life characteristics. Strip loins were aged for 2, 9, 16, or 23 days. Results suggest that steaks from cattle fed steam flaked corn (with or without dried distillers grains) and from cattle fed dried distillers grains (regardless of corn type) had higher levels of many unsaturated fatty acids, more discoloration, and greater lipid oxidation compared to the dry rolled corn treatments or the no dried distillers grains treatments, respectively. Feeding of dry rolled corn or diets without dried distillers grains maintained red color better during retail display. Choice-grade steaks had significantly higher levels of unsaturated fatty acids, such as 18:2 and total polyunsaturated fatty acids, than Select-grade steaks but did not differ in color stability or oxidation. These data indicate the longest shelf life will occur when cattle are fed diets containing dry rolled corn (versus steam flaked corn) or without dried distillers grains (versus with dried distillers grains), and that both steam flaked corn and distillers grains have a negative impact on shelf life. Quality grade did not affect color stability.
Movement economy in soccer: Current data and limitations
Soccer is an intermittent team sport, where performance is determined by a myriad of psychological, technical, tactical, and physical factors. Among the physical factors, endurance appears to play a key role in counteracting the fatigue-related reduction in running performance observed during soccer matches. One physiological determinant of endurance is movement economy, which represents the aerobic energy cost of exercising at a given submaximal velocity. While the role of movement economy has been extensively examined in endurance athletes, it has received little attention in soccer players, but may be an important factor given the prolonged demands of match play. For this reason, the current review discusses the nature, impact, and trainability of movement economy specific to soccer players. A summary of current knowledge and limitations of movement economy in soccer is provided, with an insight into future research directions, to make this important parameter more valuable when assessing and training soccer players' running performance.
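Movement economy, as defined above, is often reported as an oxygen cost per unit distance at a submaximal speed. The snippet below is a minimal sketch of that calculation; the exact expression varies across studies, and the function name and example values are assumptions for illustration only.

```python
def oxygen_cost_per_km(vo2_ml_kg_min: float, velocity_km_h: float) -> float:
    """Running economy expressed as oxygen cost per kilometre (ml/kg/km):
    submaximal VO2 divided by velocity in km per minute."""
    return vo2_ml_kg_min / (velocity_km_h / 60.0)

# Example: 35 ml/kg/min at 10 km/h -> 210 ml/kg/km.
print(oxygen_cost_per_km(35.0, 10.0))
```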
Testosterone replacement for male military personnel - A potential countermeasure to reduce injury and improve performance under extreme conditions
Tactical operators, inclusive of soldiers in the military, are reliant upon their physiological and psychological state in often volatile and extreme life or death situations that require correct decisions and precise actions to ensure operational success with minimal collateral damage. Accordingly, the development of physical and mental resilience is a hallmark of prophylactic and remedial programs designed to ensure military personnel are combat ready, thus optimising their capacity to perform at expert levels while reducing their risk of injury or the severity of injury sustained.
The potential therapeutic effects of creatine supplementation on body composition and muscle function in cancer
Low muscle mass in individuals with cancer has a profound impact on quality of life and independence and is associated with greater treatment toxicity and poorer prognosis. Exercise interventions are regularly being investigated as a means to ameliorate treatment-related adverse effects, and nutritional/supplementation strategies to augment adaptations to exercise are highly valuable. Creatine (Cr) is a naturally occurring substance in the human body that plays a critical role in energy provision during muscle contraction. Given the beneficial effects of Cr supplementation on lean body mass, strength, and physical function in a variety of clinical populations, it may hold therapeutic potential for individuals with cancer who are at heightened risk of muscle loss. Here, we provide an overview of Cr physiology, summarize the evidence on the use of Cr supplementation in various aging/clinical populations, explore mechanisms of action, and provide perspectives on the potential therapeutic role of Cr in the exercise oncology setting.
Exercise medicine for advanced prostate cancer
Purpose of review:
Exercise is a provocative medicine, known for its preventive, complementary and rehabilitative role in the management of cancer. Impressively, exercise is also emerging as a synergistic and targeted medicine to enhance symptom control, modulate tumour biology and delay disease progression, with the potential to increase overall survival. Given the complex clinical presentation of advanced prostate cancer patients and their omnipresent comorbidities, this review describes the current and potential role of exercise medicine in advanced prostate cancer.
Recent findings:
Exercise has been shown to be safe, feasible and effective for advanced prostate cancer patients, inclusive of patients with bone metastases, a population previously excluded due to patient and clinician fear of adverse events. Preclinical data provide insight into the ability of exercise to modulate cancer-specific outcomes, and suggest that exercise may synergistically increase the potency of chemotherapy and radiotherapy and may endogenously and/or mechanically suppress tumour formation, growth and invasion in visceral and skeletal tissue. Epidemiological studies have also shown an association between physical activity and increased survival.
Summary:
Exercise oncology is rapidly evolving, with impressive possibilities that may directly improve patient outcomes in advanced prostate cancer. Research must focus on translating preclinical trials into human clinical trials and on investigating the direct effect of exercise on overall survival.
Avoiding the pitfalls of the DSM-5: A primer for health professionals
The Diagnostic and Statistical Manual of Mental Disorders-5 (DSM-5) is the leading system guiding the diagnosis of mental disorders. Accurate diagnoses of mental disorders are fundamental to guiding treatment and supportive care and have potential impacts on resources available to individuals. However, the DSM has the allure of 'tick box' diagnosis rather than biopsychosocial formulation and treatment planning, as well as multiple limitations impacting validity. Further, even with accurate diagnosis, there are strong concerns related to the reliability and validity of DSM diagnoses for clinical practice and research efforts. Understanding these limitations can help reduce errors and sub-optimal clinical decisions, treatment and supportive care service provision. The purpose of this primer is to assist health professionals in avoiding pitfalls by presenting five key considerations applicable to the DSM: (1) Binary Categories, (2) Comorbidity, (3) Within-Disorder Symptom Heterogeneity, (4) Physical Symptoms, and (5) Distress and Impairment Criteria. Fig. 1 provides a graphical summary of the five considerations.