120 research outputs found

    Appearance of microvascular obstruction on high resolution first-pass perfusion, early and late gadolinium enhancement CMR in patients with acute myocardial infarction

    Background: The presence and extent of microvascular obstruction (MO) after acute myocardial infarction can be measured by first-pass gadolinium-enhanced perfusion cardiovascular magnetic resonance (CMR) or, after gadolinium injection, with early or late gadolinium enhancement (EGE/LGE) imaging. The volume of MO measured by these three methods may differ because contrast agent diffusion into the MO reduces its apparent extent over time. Theoretically, first-pass perfusion CMR should be the most accurate method to measure MO, but this technique has been limited by lower spatial resolution than EGE and LGE as well as incomplete cardiac coverage. These limitations of perfusion CMR can be overcome using spatio-temporal undersampling methods. The purpose of this study was to compare the extent of MO by high-resolution first-pass k-t SENSE accelerated perfusion, EGE and LGE.
    Methods: 34 patients with acute ST elevation myocardial infarction, treated successfully with primary percutaneous coronary intervention (PPCI), underwent CMR within 72 hours of admission. k-t SENSE accelerated first-pass perfusion MR (7-fold acceleration, spatial resolution 1.5 mm × 1.5 mm × 10 mm, 8 slices acquired over 2 RR intervals, 0.1 mmol/kg Gd-DTPA), EGE (1–4 minutes after injection with a fixed TI of 440 ms) and LGE images (10–12 minutes after injection, TI determined by a Look-Locker scout) were acquired. MO volume was determined for each technique by manual planimetry and summation-of-discs methodology.
    Results: k-t SENSE first-pass perfusion detected more cases of MO than EGE and LGE (22 vs. 20 vs. 14, respectively). The extent of MO imaged by first-pass perfusion (median mass 4.7 g, IQR 6.7) was greater than by EGE (median mass 2.3 g, IQR 7.1, p = 0.002) and LGE (median mass 0.2 g, IQR 2.4, p = 0.0003). The correlation coefficient between MO mass measured by first-pass perfusion and EGE was 0.91 (p < 0.001).
    Conclusion: The extent of MO following acute myocardial infarction appears larger on high-resolution first-pass perfusion CMR than on EGE and LGE. Given the inevitable time delay between gadolinium administration and acquisition of either EGE or LGE images, high-resolution first-pass perfusion imaging may be the most accurate method to quantify MO.
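    The summation-of-discs quantification named in the Methods can be sketched as follows. This is an illustrative sketch only: the helper function, the slice areas and the assumed myocardial density of 1.05 g/mL are not taken from the study.

```python
# Hypothetical sketch of summation-of-discs quantification of microvascular
# obstruction (MO): the planimetered MO area on each short-axis slice is
# multiplied by slice thickness to give a disc volume, the discs are summed,
# and volume is converted to mass using an assumed myocardial density.
# All numbers below are illustrative, not from the study.

MYOCARDIAL_DENSITY_G_PER_ML = 1.05  # commonly assumed tissue density

def mo_mass_grams(slice_areas_cm2, slice_thickness_cm):
    """Summation of discs: sum(area_i * thickness) * density."""
    volume_ml = sum(a * slice_thickness_cm for a in slice_areas_cm2)  # cm^3 == mL
    return volume_ml * MYOCARDIAL_DENSITY_G_PER_ML

# Example: MO planimetered on 4 of 8 short-axis slices, each 10 mm (1 cm) thick
areas = [1.2, 1.8, 1.1, 0.4]  # cm^2 per slice (illustrative)
print(f"MO mass: {mo_mass_grams(areas, 1.0):.2f} g")
```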

    Not as simple as it seems: Front foot contact kinetics, muscle function and ball release speed in cricket pace bowlers.

    Get PDF
    This study investigated the relationship between front foot contact (FFC) ground reaction forces (GRF) during the delivery stride, lower-limb strength, eccentric dexterity and power, and ball release speed (BRS) among pace bowlers. Thirteen high-level male pace bowlers performed double- and single-leg drop landings; an isometric mid-thigh pull; a countermovement jump; and pace bowling (a two-over bowling spell measuring BRS and FFC GRF). The relationship between the assessed variables and BRS was determined via frequentist and Bayesian multiple linear regression. The model including peak braking force was the most probable given the data (Bayes factor = 1.713) but provided only anecdotal evidence in comparison to the null model. The results of frequentist and Bayesian modelling were comparable, with peak braking force explaining 23.3% of the variance in BRS (F = 4.64, p = 0.054). Results indicate that pace bowlers with greater peak braking GRF during FFC generally elicit higher BRS. However, the weak relationship between peak braking force and BRS, and the lack of a linear relationship between BRS and the other variables, highlight the complexities and inter-individual variability inherent to pace bowling at a high level. A more individual-focused analysis revealed varied strategies within pace bowlers to deliver the outcome (e.g., BRS), which should be considered in future study designs.
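    The frequentist side of the analysis, fitting BRS against peak braking force and reporting the variance explained, can be sketched as below. The data, sample values and seed are invented for illustration; only the ordinary-least-squares/R² machinery reflects the abstract.

```python
import numpy as np

# Illustrative sketch (not the study's data): fit ball release speed (BRS)
# against peak braking force with ordinary least squares and report R^2,
# the quantity the abstract gives as 23.3% of variance explained.
rng = np.random.default_rng(0)
braking_force = rng.uniform(4.0, 8.0, 13)                   # 13 bowlers (made up)
brs = 30.0 + 0.8 * braking_force + rng.normal(0, 1.2, 13)   # m/s (made up)

# design matrix with intercept column, then least-squares fit
X = np.column_stack([np.ones_like(braking_force), braking_force])
beta, *_ = np.linalg.lstsq(X, brs, rcond=None)              # [intercept, slope]
pred = X @ beta
r2 = 1 - np.sum((brs - pred) ** 2) / np.sum((brs - brs.mean()) ** 2)
print(f"slope = {beta[1]:.2f}, R^2 = {r2:.3f}")
```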

    Impact of harness attachment point on kinetics and kinematics during sled towing.

    Get PDF
    Resisted sprint training is performed in a horizontal direction and involves similar muscles, velocities and ranges of motion (ROM) to those of normal sprinting. Generally, sleds are attached to the athlete via a lead (3 m) and harness; the most common attachment points are the shoulder or waist. At present, it is not known how the different harness attachment points impact the kinematics and kinetics associated with sled towing (ST). The aim of the current investigation was to examine the kinetics and kinematics of shoulder and waist harness attachment points in relation to the acceleration phase of ST. Fourteen trained males completed normal and ST trials, with sled load set to produce a 10% reduction in sprint velocity. Sagittal plane kinematics of the trunk, hip, knee and ankle were measured, together with stance phase kinetics (third foot-strike). Kinetic and kinematic parameters were compared between harness attachments using one-way repeated measures analysis of variance. The results indicated that various kinetic differences were present between the normal and ST conditions: significantly greater net horizontal mean force, net horizontal impulse, propulsive mean force and propulsive impulse were measured (p < 0.05). Interestingly, the waist harness also led to greater net horizontal impulse when compared to the shoulder attachment (p < 0.001). In kinematic terms, ST conditions significantly increased peak flexion at the hip, knee and ankle joints compared to the normal trials (p < 0.05). Results highlighted that the shoulder harness had a greater impact on trunk and knee joint kinematics than the waist harness (p < 0.05). In summary, waist harnesses appear to be the more suitable attachment point for the acceleration phase of sprinting: sled towing with this attachment resulted in fewer kinematic alterations and greater net horizontal impulse when compared to the shoulder harness. Future research is necessary to explore the long-term adaptations to these acute changes.
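    Net horizontal impulse, the key kinetic outcome above, is the time integral of the anteroposterior GRF over stance. A minimal sketch of that calculation, with invented force samples and sampling rate:

```python
# Hypothetical sketch: net horizontal impulse from force-platform data is the
# trapezoidal integral of the anteroposterior GRF over the stance phase.
# Sign convention assumed here: negative = braking, positive = propulsion.

def net_horizontal_impulse(force_n, dt_s):
    """Trapezoidal integral of horizontal GRF samples (N) over time step dt (s)."""
    return sum((force_n[i] + force_n[i + 1]) * 0.5 * dt_s
               for i in range(len(force_n) - 1))

# Illustrative samples at 1 kHz (dt = 0.001 s), not study data
grf = [-50.0, -20.0, 40.0, 120.0, 90.0, 30.0]
print(f"net horizontal impulse: {net_horizontal_impulse(grf, 0.001):.3f} N*s")
```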

    An Analysis of Collegiate Club-Sport Female Lacrosse Players: Sport-Specific Field Test Performance and the Influence of Lacrosse Stick Carrying

    International Journal of Exercise Science 11(4): 269-280, 2018. Lacrosse is a field-based, intermittent sport that requires players to use a stick with a shaft and mesh pocket to manipulate the ball. However, there has been limited analysis of the characteristics of collegiate club-sport players, and of whether stick carry influences the sprinting speed of lacrosse players. Accordingly, this study investigated the field test characteristics of collegiate club-sport female lacrosse players, and the effects of stick carry on linear and change-of-direction speed. Nine players (seven field players, two goalkeepers) volunteered for this study and completed: vertical jump and standing broad jump; 30-meter (m) sprint (0-5, 0-10, and 0-30 m intervals) and modified T-test without and with a stick; and the Yo-Yo Intermittent Recovery Test. Magnitude-based inference analyses via effect sizes (d) compared the field players and goalkeepers. Data were pooled for the 30-m sprint and modified T-test to examine the effect of stick carry via paired samples t-tests (p < 0.05) and effect sizes. Field players performed better in most field tests (d = 0.93-2.45), although goalkeepers generated greater vertical jump power (d = 2.01). With regard to the effects of stick carry, the 0-5 m sprint interval was significantly faster without a stick than with one (p = 0.02), but this difference had a small effect (d = 0.25). There were no differences in the other sprint intervals or the modified T-test (p = 0.08-0.39; d = 0.06-0.19). When contextualized with comparisons to other female collegiate athletes, the results indicated limitations in training exposure for collegiate club-sport lacrosse players. Furthermore, stick carry generally did not affect speed.
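    The effect sizes (d) reported throughout the abstract can be computed as Cohen's d, the mean difference scaled by a pooled standard deviation. A minimal sketch with invented sprint times; the function and data are illustrative, not the study's:

```python
import math

# Illustrative sketch of Cohen's d, as used to compare sprint performance
# with and without a lacrosse stick. All times below are made up.

def cohens_d(group_a, group_b):
    """d = (mean_a - mean_b) / pooled SD (sample SDs, equal weighting)."""
    ma = sum(group_a) / len(group_a)
    mb = sum(group_b) / len(group_b)
    va = sum((x - ma) ** 2 for x in group_a) / (len(group_a) - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (len(group_b) - 1)
    pooled_sd = math.sqrt((va + vb) / 2)
    return (ma - mb) / pooled_sd

with_stick = [1.13, 1.16, 1.11, 1.18]   # 0-5 m times in s (illustrative)
no_stick = [1.10, 1.12, 1.08, 1.15]
print(f"d = {cohens_d(with_stick, no_stick):.2f}")  # positive: slower with stick
```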

    An investigation of the mechanics and sticking region of a one-repetition maximum close-grip bench press versus the traditional bench press

    The close-grip bench press (CGBP) is a variation of the traditional bench press (TBP) that uses a narrower grip (~95% of biacromial distance (BAD)) and has potential application for athletes performing explosive arm actions from positions where the hands are held close to the torso. Limited research has investigated CGBP mechanics compared to the TBP. Twenty-seven resistance-trained individuals completed a one-repetition maximum TBP and CGBP. The TBP was performed with the preferred grip; the CGBP with a grip width of 95% BAD. A linear position transducer measured lift distance and duration; peak and mean power, velocity, and force; distance and time when peak power occurred; and work. Pre-sticking region (PrSR), sticking region, and post-sticking region distance and duration for each lift were measured. A repeated measures ANOVA was used to derive differences between TBP and CGBP mechanics (p < 0.01); effect sizes (d) were also calculated. A greater load was lifted in the TBP, thus mean force was greater (d = 0.16–0.17). Peak power and velocity were higher in the CGBP, which had a longer PrSR distance (d = 0.49–1.32). The CGBP could emphasize power for athletes who initiate explosive upper-body actions with the hands positioned close to the torso.
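    One common way to delimit the sticking region from a transducer's velocity trace is from the first local velocity peak to the subsequent local minimum, with the pre- and post-sticking regions on either side. The sketch below assumes that definition; the boundary-detection logic and the velocity samples are illustrative, not taken from the study.

```python
# Hypothetical sketch: partition a concentric bench-press velocity trace into
# pre-sticking, sticking and post-sticking regions. Assumed definition: the
# pre-sticking region runs to the first velocity peak, the sticking region to
# the next velocity minimum, and the post-sticking region covers the rest.

def split_regions(v):
    """Return (peak_idx, trough_idx) bounding the sticking region."""
    peak = next(i for i in range(1, len(v) - 1)
                if v[i] >= v[i - 1] and v[i] > v[i + 1])
    trough = next(i for i in range(peak + 1, len(v) - 1)
                  if v[i] <= v[i - 1] and v[i] < v[i + 1])
    return peak, trough

# Illustrative barbell velocities (m/s) sampled over the ascent
v = [0.05, 0.20, 0.35, 0.30, 0.18, 0.12, 0.16, 0.25, 0.30, 0.10]
peak, trough = split_regions(v)
print(f"sticking region: samples {peak}..{trough}")
```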

    A prospective observational study to assess the diagnostic accuracy of clinical decision rules for children presenting to emergency departments after head injuries (protocol): The Australasian Paediatric Head Injury Rules Study (APHIRST)

    Background: Head injuries in children are responsible for a large number of emergency department visits. Failure to identify a clinically significant intracranial injury in a timely fashion may result in long term neurodisability and death. Whilst cranial computed tomography (CT) provides rapid and definitive identification of intracranial injuries, it is resource intensive and associated with radiation induced cancer. Evidence based head injury clinical decision rules have been derived to aid physicians in identifying patients at risk of having a clinically significant intracranial injury. Three rules have been identified as being of high quality and accuracy: the Canadian Assessment of Tomography for Childhood Head Injury (CATCH) from Canada, the Children's Head Injury Algorithm for the Prediction of Important Clinical Events (CHALICE) from the UK, and the prediction rule for the identification of children at very low risk of clinically important traumatic brain injury developed by the Pediatric Emergency Care Applied Research Network (PECARN) from the USA. This study aims to prospectively validate and compare the performance accuracy of these three clinical decision rules when applied outside the derivation setting.
    Methods/design: This study is a prospective observational study of children aged 0 to less than 18 years presenting to 10 emergency departments within the Paediatric Research in Emergency Departments International Collaborative (PREDICT) research network in Australia and New Zealand after head injuries of any severity. Predictor variables identified in the CATCH, CHALICE and PECARN clinical decision rules will be collected. Patients will be managed as per the treating clinicians at the participating hospitals. All patients not undergoing cranial CT will receive a follow up call 14 to 90 days after the injury. Outcome data collected will include results of cranial CTs (if performed) and details of admission, intubation, neurosurgery and death. The performance accuracy of each of the rules will be assessed using rule specific outcomes and inclusion and exclusion criteria.
    Discussion: This study will allow the simultaneous comparative application and validation of three major paediatric head injury clinical decision rules outside their derivation setting.
    Trial registration: The study is registered with the Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12614000463673 (registered 2 May 2014). © 2014 Babl et al.; licensee BioMed Central Ltd

    Arrhythmia and Death Following Percutaneous Revascularization in Ischemic Left Ventricular Dysfunction: Prespecified Analyses From the REVIVED-BCIS2 Trial.

    BACKGROUND: Ventricular arrhythmia is an important cause of mortality in patients with ischemic left ventricular dysfunction. Revascularization with coronary artery bypass graft or percutaneous coronary intervention is often recommended for these patients before implantation of a cardiac defibrillator because it is assumed that this may reduce the incidence of fatal and potentially fatal ventricular arrhythmias, although this premise has not been evaluated in a randomized trial to date. METHODS: Patients with severe left ventricular dysfunction, extensive coronary disease, and viable myocardium were randomly assigned to receive either percutaneous coronary intervention (PCI) plus optimal medical and device therapy (OMT) or OMT alone. The composite primary outcome was all-cause death or aborted sudden death (defined as an appropriate implantable cardioverter defibrillator therapy or a resuscitated cardiac arrest) at a minimum of 24 months, analyzed as time to first event on an intention-to-treat basis. Secondary outcomes included cardiovascular death or aborted sudden death, appropriate implantable cardioverter defibrillator (ICD) therapy or sustained ventricular arrhythmia, and number of appropriate ICD therapies. RESULTS: Between August 28, 2013, and March 19, 2020, 700 patients were enrolled across 40 centers in the United Kingdom. A total of 347 patients were assigned to the PCI+OMT group and 353 to the OMT alone group. The mean age of participants was 69 years; 88% were male; 56% had hypertension; 41% had diabetes; and 53% had a clinical history of myocardial infarction. The median left ventricular ejection fraction was 28%; 53.1% had an implantable defibrillator inserted before randomization or during follow-up. All-cause death or aborted sudden death occurred in 144 patients (41.6%) in the PCI group and 142 patients (40.2%) in the OMT group (hazard ratio, 1.03 [95% CI, 0.82-1.30]; P=0.80). 
There was no between-group difference in the occurrence of any of the secondary outcomes. CONCLUSIONS: PCI was not associated with a reduction in all-cause mortality or aborted sudden death. In patients with ischemic cardiomyopathy, PCI is not beneficial solely for the purpose of reducing potentially fatal ventricular arrhythmias. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT01920048

    Farmers’ perceptions of climate change: identifying types

    Ambitious targets to reduce greenhouse gas (GHG) emissions from agriculture have been set by both national governments and their respective livestock sectors. We hypothesize that farmer self-identity influences their assessment of climate change and their willingness to implement measures which address the issue. Perceptions of climate change were determined from 286 beef/sheep farmers and evaluated using principal component analysis (PCA). The analysis elicits two components which evaluate identity (productivism and environmental responsibility), and two components which evaluate behavioral capacity to adopt mitigation and adaptation measures (awareness and risk perception). Subsequent cluster analyses reveal four farmer types based on the PCA scores. ‘The Productivist’ and ‘The Countryside Steward’ portray low levels of awareness of climate change, but differ in their motivation to adopt pro-environmental behavior. Conversely, both ‘The Environmentalist’ and ‘The Dejected’ score higher in their awareness of the issue. In addition, ‘The Dejected’ holds a high sense of perceived risk; however, their awareness is not conflated with an explicit understanding of agricultural GHG sources. With the exception of ‘The Environmentalist’, there is an evident disconnect between perceptions of agricultural emission sources and their contribution towards GHG emissions amongst all types. If such linkages are not conceptualized, it is unlikely that behavioral capacities will be realized. Effective communication channels which encourage action should target farmers based on the groupings depicted. Therefore, understanding farmer types through the constructs used in this study can facilitate effective and tailored policy development and implementation.
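    The two-stage approach in the abstract, dimension reduction with PCA followed by clustering on the component scores, can be sketched as below. Everything here is an assumption for illustration: the survey data are random, the component count (4) matches the abstract, and a naive k-means with k = 4 stands in for whatever clustering method the study used.

```python
import numpy as np

# Illustrative sketch of PCA followed by clustering on component scores.
# Survey responses are random stand-ins; 286 matches the abstract's sample size.
rng = np.random.default_rng(1)
responses = rng.normal(size=(286, 12))        # 286 farmers x 12 items (invented)

# PCA via SVD of the standardized data; keep the first 4 component scores
Z = (responses - responses.mean(0)) / responses.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:4].T

# naive k-means (k = 4 "farmer types") on the PCA scores
k = 4
centers = scores[:k].copy()                   # simple initialization
for _ in range(50):
    labels = np.argmin(((scores[:, None, :] - centers[None, :, :]) ** 2).sum(-1),
                       axis=1)
    centers = np.stack([scores[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
print(f"cluster sizes: {np.bincount(labels, minlength=k)}")
```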

    A New Direction to Athletic Performance: Understanding the Acute and Longitudinal Responses to Backward Running

    Backward running (BR) is a form of locomotion that occurs in short bursts during many overground field and court sports. It has also traditionally been used in clinical settings as a method to rehabilitate lower body injuries. Comparisons between BR and forward running (FR) have led to the discovery that both may be generated by the same neural circuitry. Comparisons of the acute responses between BR and FR reveal that BR is characterised by a smaller ratio of braking to propulsive forces, increased step frequency, decreased step length, increased muscle activity and a reliance on isometric and concentric muscle actions. These biomechanical differences have been critical in informing recent scientific explorations which have discovered that BR can be used as a method for reducing injury and improving a variety of physical attributes deemed advantageous to sports performance. This includes improved lower body strength and power, decreased injury prevalence and improvements in change of direction performance following BR training. The current findings from research help improve our understanding of BR biomechanics and provide evidence which supports BR as a useful method to improve athlete performance. However, further acute and longitudinal research is needed to better understand the utility of BR in athletic performance programs.