
    The effects of an eight over Cricket bowling spell upon pace bowling biomechanics and performance within different delivery lengths

    Pace bowlers must often perform extended bowling spells with maximal ball release speed (BRS) while targeting different delivery lengths when playing a multi-day match. This study investigated the effect of an eight over spell upon pace bowling biomechanics and performance at different delivery lengths. Nine male bowlers (age = 18.8 ± 1.7 years) completed an eight over spell while targeting different lengths (short: 7–10 m, good: 4–7 m, full: 0–4 m from the batter’s stumps) in a randomized order. Trunk, knee and shoulder kinematics and ground reaction forces at front foot contact (FFC), as well as run-up velocity and BRS, were measured. Paired sample t-tests (p ≤ 0.01), Hedges’ g effect sizes, and statistical parametric mapping were used to assess differences between mean variables from the first and last three overs. No significant differences (p = 0.05–0.98) were found in any discrete or continuous variables, with the magnitude of difference being trivial-to-medium (g = 0.00–0.73) across all variables. Results suggest pace bowlers can sustain BRS through a single eight over spell while tolerating the repeated high whole-body biomechanical loads, as indicated by maintained kinematics (technique) at the assessed joints during FFC. Practically, the findings are advantageous for bowling performance and support current bowling load monitoring practices.
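The first-versus-last-three-overs comparison above reports Hedges' g effect sizes. A minimal sketch of that calculation follows; the ball release speeds here are hypothetical, not the study's data.

```python
import math

def hedges_g(x, y):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    d = (mx - my) / pooled_sd
    correction = 1 - 3 / (4 * (nx + ny) - 9)  # small-sample correction factor
    return d * correction

# Hypothetical ball release speeds (m/s): first vs last three overs
first_overs = [33.1, 32.8, 33.4, 33.0, 32.9]
last_overs = [32.9, 32.7, 33.2, 33.1, 32.8]
print(round(hedges_g(first_overs, last_overs), 2))
```

With trivial-to-medium thresholds, a g near 0.4 as in this toy sample would sit in the "small" band the abstract describes.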

    Change of Direction and Agility Tests

    The ability to change direction is a highly valued athletic quality in sport and has been measured extensively. Despite the importance and magnitude of research on change of direction (COD) and agility, the validity of the performance measures used to assess these abilities has faced limited scrutiny. A critical evaluation of our current measures of COD and agility is presented. Furthermore, a summary of recommendations to enhance the validity of COD and agility assessment is provided in the ultimate effort to improve our understanding of this crucial athletic quality. A video abstract describing this article can be found in Supplemental Digital Content 1 (see Video, http://links.lww.com/SCJ/A217).

    The relationship between inertial measurement unit derived 'force signatures' and ground reaction forces during cricket fast bowling

    This study assessed the reliability and validity of segment measured accelerations in comparison to front foot contact (FFC) ground reaction force (GRF) during the delivery stride for cricket pace bowlers. Eleven recreational bowlers completed a 30-delivery bowling spell. Trunk- and tibia-mounted inertial measurement units (IMUs) were used to measure accelerations, converted to force, for comparisons to force plate GRF discrete measures. These measures included peak force, impulse and the continuous force–time curve in the vertical and braking (horizontal) planes. Reliability and validity were determined by intra-class correlation coefficients (ICC), coefficient of variation (CV), Bland–Altman plots, paired sample t-tests, Pearson’s correlation and one-dimensional (1D) statistical parametric mapping (SPM). All ICC (0.90–0.98) and CV (4.23–7.41%) were acceptable, except for tibia-mounted IMU braking peak force (CV = 12.44%) and impulse (CV = 18.17%) and trunk vertical impulse (CV = 17.93%). Bland–Altman plots revealed wide limits of agreement between discrete IMU force signatures and force plate GRF. The 1D SPM outlined numerous significant (p < 0.01) differences between trunk- and tibia-located IMU-derived measures and force plate GRF traces in vertical and braking (horizontal) planes. The trunk- and tibia-mounted IMUs appeared to not represent the GRF experienced during pace bowling FFC when compared to a gold-standard force plate. © 2018 Informa UK Limited, trading as Taylor & Francis Group.
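The Bland–Altman agreement analysis used above reduces to a bias (mean difference between methods) and 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical peak-force values, not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical peak vertical forces (body weights): IMU-derived vs force plate
imu = [5.1, 4.8, 5.6, 5.0, 4.7, 5.3]
plate = [5.4, 5.0, 5.9, 5.1, 5.0, 5.5]
bias, lower, upper = bland_altman(imu, plate)
print(round(bias, 2), round(lower, 2), round(upper, 2))
```

Wide limits of agreement, as reported in the abstract, mean individual-delivery IMU estimates can differ substantially from the force plate even when the mean bias looks small.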

    Not as simple as it seems: Front foot contact kinetics, muscle function and ball release speed in cricket pace bowlers.

    This study investigated the relationship between front foot contact (FFC) ground reaction forces (GRF) during the delivery stride, lower-limb strength, eccentric dexterity and power, and ball release speed (BRS) among pace bowlers. Thirteen high-level male pace bowlers performed double and single leg drop landings; isometric mid-thigh pull; countermovement jump; and pace bowling (two-over bowling spell measuring BRS and FFC GRF). The relationship between assessed variables and BRS was determined via frequentist and Bayesian multiple linear regression. The model including peak braking force was the most probable given the data (Bayes Factor = 1.713) but provided only weak evidence in comparison to the null model. The results of frequentist and Bayesian modelling were comparable, with peak braking force explaining 23.3% of the variance in BRS (F = 4.64, p = 0.054). Results indicate pace bowlers with greater peak braking GRF during FFC generally elicit higher BRS. However, the weak relationship between peak braking force and BRS, and the lack of a linear relationship between BRS and other variables, highlight the complexities and inter-individual variability inherent to pace bowling at a high level. A more individual-focused analysis revealed varied strategies within pace bowlers to deliver the outcome (e.g., BRS), and these should be considered in future study designs.

    A Multicenter, Randomized, Placebo‐Controlled Trial of Atorvastatin for the Primary Prevention of Cardiovascular Events in Patients With Rheumatoid Arthritis

    Objective: Rheumatoid arthritis (RA) is associated with increased cardiovascular event (CVE) risk. The impact of statins in RA is not established. We assessed whether atorvastatin is superior to placebo for the primary prevention of CVEs in RA patients. Methods: A randomized, double‐blind, placebo‐controlled trial was designed to detect a 32% CVE risk reduction based on an estimated 1.6% per annum event rate with 80% power at P < 0.05. Patients aged >50 years or with a disease duration of >10 years who did not have clinical atherosclerosis, diabetes, or myopathy received atorvastatin 40 mg daily or matching placebo. The primary end point was a composite of cardiovascular death, myocardial infarction, stroke, transient ischemic attack, or any arterial revascularization. Secondary and tertiary end points included plasma lipids and safety. Results: A total of 3,002 patients (mean age 61 years; 74% female) were followed up for a median of 2.51 years (interquartile range [IQR] 1.90, 3.49 years) (7,827 patient‐years). The study was terminated early due to a lower than expected event rate (0.70% per annum). Of the 1,504 patients receiving atorvastatin, 24 (1.6%) experienced a primary end point, compared with 36 (2.4%) of the 1,498 receiving placebo (hazard ratio [HR] 0.66 [95% confidence interval (95% CI) 0.39, 1.11]; P = 0.115 and adjusted HR 0.60 [95% CI 0.32, 1.15]; P = 0.127). At trial end, patients receiving atorvastatin had a mean ± SD low‐density lipoprotein (LDL) cholesterol level 0.77 ± 0.04 mmoles/liter lower than those receiving placebo (P < 0.0001). C‐reactive protein level was also significantly lower in the atorvastatin group than the placebo group (median 2.59 mg/liter [IQR 0.94, 6.08] versus 3.60 mg/liter [IQR 1.47, 7.49]; P < 0.0001). CVE risk reduction per mmole/liter reduction in LDL cholesterol was 42% (95% CI −14%, 70%). The rates of adverse events in the atorvastatin group (n = 298 [19.8%]) and placebo group (n = 292 [19.5%]) were similar.
Conclusion: Atorvastatin 40 mg daily is safe and results in a significantly greater reduction of LDL cholesterol level than placebo in patients with RA. The 34% CVE risk reduction is consistent with the Cholesterol Treatment Trialists’ Collaboration meta‐analysis of statin effects in other populations.
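The arithmetic linking the reported figures can be checked directly: a hazard ratio of 0.66 implies a 34% relative risk reduction, and scaling that HR to a per-1-mmol/L LDL reduction (here assuming the common log-linear scaling HR^(1/Δ), which the abstract does not state explicitly) lands near the reported 42%:

```python
def risk_reduction_pct(hazard_ratio):
    """Relative risk reduction (%) implied by a hazard ratio."""
    return (1 - hazard_ratio) * 100

def per_mmol_reduction(hr, ldl_delta_mmol):
    """Scale an observed HR to per-1-mmol/L LDL lowering, assuming a
    log-linear dose-response: HR ** (1 / delta)."""
    return (1 - hr ** (1 / ldl_delta_mmol)) * 100

print(round(risk_reduction_pct(0.66)))          # HR 0.66 from the trial
print(round(per_mmol_reduction(0.66, 0.77), 1)) # observed LDL delta 0.77 mmol/L
```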

    Root and collar rot pathogens associated with yield decline of processing tomatoes in Victoria, Australia

    © 2020 Sophia Callaghan. The processing tomato industry in Victoria, Australia, has experienced a yield decline over the last decade, resulting in losses estimated at 10% per annum. The decline was attributed to the necrosis of lateral and feeder rootlets and the collar region resulting in plant stunting and a reduction in fruit production. Therefore, the hypothesis underlying this study was that the decline is caused by the cumulative effects of damage by a complex of soil-borne root and collar rot pathogens. Surveys of processing tomato crops were undertaken over three consecutive growing seasons between 2016 and 2019 to investigate the pathogens, symptoms and diseases associated with yield decline. Soil-borne fungal and oomycete pathogens were the focus but bacterial pathogens, viruses, nematodes and phytoplasmas were also noted. Systematic isolation from diseased roots and the collar region of plants putatively infected by fungal and oomycete pathogens was undertaken. Identification of isolates was based on cultural morphology, ITS sequencing and in some cases commercial qPCR testing. Fusarium oxysporum and Pythium spp. were the most abundant putative pathogens associated with plants exhibiting poor growth. Other putative pathogenic fungi and oomycetes which were less commonly encountered included Alternaria spp., Colletotrichum coccodes, Fusarium solani, Phytophthora nicotianae, Phytophthora cajani, Plectosphaerella spp., Rhizoctonia solani, Sclerotinia minor and S. sclerotiorum. A novel Fusarium collar and root rot disease of processing tomatoes was discovered during the surveys. The disease was characterised by chocolate-brown streaking in the internal collar and tap root tissue, as well as lateral root rot of stunted tomato plants. Morphological characterisation and multi-locus phylogenetics (ITS, ef1a and Pgx4) were used to identify the causal pathogen as Fusarium oxysporum.
The disease was initially thought to resemble Fusarium Crown and Root Rot (FCRR) caused by Fusarium oxysporum f. sp. radicis-lycopersici (Forl), a disease which has not been reported in Australia. However, subsequent pathogenicity and physiological assessments of isolates suggested the disease was caused by a novel Fusarium pathogen. Consequently, this disease was named chocolate streak disease (CSD) to differentiate it from FCRR. Pythium was the second most abundant organism isolated during the surveys. As Pythium is a large genus consisting of species beneficial, neutral and detrimental to plant growth, further investigation was required to understand the impact of Pythium spp. on processing tomato growth and yield. Eleven species of Pythium were identified based on cultural characteristics and phylogenetic analysis using ITS, Cox-1 and Cox-2 gene sequences. None of these Pythium species had been reported previously from processing or table tomatoes in Australia. In addition, this is the first report of P. carolinianum, P. heterothallicum, P. recalcitrans and a new Pythium sp. from field-grown tomato crops globally. Pythium dissotocum was the most abundant and widespread species. Pythium ultimum, P. aphanidermatum and P. irregulare were the most aggressive towards both seedlings and mature plants, causing pre- and post-germination damping-off, severe root rot and stunting. Collectively, the evidence provided by this study supports the hypothesis that a complex of root and collar rot pathogens, particularly F. oxysporum and Pythium spp., are contributing to the 10% yield loss in Victorian processing tomatoes.

    Change of direction deficit: A more isolated measure of change of direction performance than total 505 time

    Nimphius, S, Callaghan, SJ, Spiteri, T, and Lockie, RG. Change of direction deficit: A more isolated measure of change of direction performance than total 505 time. J Strength Cond Res 30 (11): 3024-3032, 2016 - Most change of direction (COD) tests use total time to evaluate COD performance. This makes it difficult to identify COD ability because the majority of time is a function of linear running. The COD deficit has been proposed as a practical measure to isolate COD ability independent of sprint speed. This study evaluated relationships between sprint time, 505 time, and COD deficit, and whether the COD deficit identified a different and more isolated measure of COD ability compared with 505 time. Seventeen cricketers performed the 505 for both left and right sides and 30-m sprint tests (with 10-m split time). The COD deficit for both sides was calculated as the difference between average 505 and 10-m time. Correlations were calculated between all variables (p ≤ 0.05). To compare 505 time and COD deficit, z-scores were calculated; the difference in these scores was evaluated for each subject. The COD deficit correlated to 505 time (r = 0.74–0.81) but not sprint time (r = -0.11 to 0.10). In contrast, 505 time did correlate with sprint time (r = 0.52–0.70). Five of 17 subjects were classified differently for COD ability when comparing standardized scores for 505 time vs. COD deficit. Most subjects (88–94%) had a meaningful difference between 505 time and COD deficit. Using 505 time to determine COD ability may result in a large amount of replication to linear speed assessments. The COD deficit may be a practical tool to better isolate and identify an athlete's ability to change direction. © 2016 National Strength and Conditioning Association.
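The COD deficit and z-score comparison described above can be sketched as follows; the squad times are hypothetical, not the study's data.

```python
import statistics

def cod_deficit(avg_505_time, ten_m_time):
    """COD deficit = average 505 time minus 10-m sprint time (seconds)."""
    return avg_505_time - ten_m_time

def z_scores(values):
    """Standardize values against the squad mean and SD."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

# Hypothetical squad data: average 505 times and 10-m split times (s)
avg_505 = [2.30, 2.45, 2.28, 2.52, 2.35]
ten_m = [1.72, 1.70, 1.80, 1.78, 1.68]
deficits = [cod_deficit(a, t) for a, t in zip(avg_505, ten_m)]

# Rankings by raw 505 time vs COD deficit can diverge for the same athlete,
# which is the paper's argument for using the deficit
z_505 = z_scores(avg_505)
z_deficit = z_scores(deficits)
for athlete, (z1, z2) in enumerate(zip(z_505, z_deficit), start=1):
    print(f"athlete {athlete}: z(505) = {z1:+.2f}, z(deficit) = {z2:+.2f}")
```

In this toy squad, an athlete can sit below the mean on 505 time yet above the mean on COD deficit, mirroring the reclassification the study reports for 5 of 17 subjects.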