9 research outputs found

    Comparison of techniques for decomposing surface EMG signals into motor unit action potential trains

    Advancements in surface electromyography (sEMG) have led to discrepancies in the techniques used for signal decomposition, specifically in the capabilities of well-established recording systems and in the methods used to identify motor unit (MU) action potentials and their respective firing behaviors. PURPOSE: To examine differences in MU identification and validation procedures, and in firing behaviors, between a four-channel (4-ch) sensor and a sixty-four-channel (64-ch) high-density sEMG array. METHODS: Following 2 maximal voluntary contractions (MVC), ten lower-body resistance-trained males (23 ± 3 yrs; 178.64 ± 5.82 cm; 177.8 ± 17.37 kg) performed 10 sec submaximal isometric ramp contractions of the knee extension exercise at 10%, 20%, and 50% MVC. During testing, sEMG was recorded from the vastus lateralis using both the 4-ch and 64-ch sensors. Signals were separately decomposed into their constituent MU action potential trains and were further validated for subsequent analysis of firing behaviors. The slope and y-intercept were calculated for the recruitment threshold versus mean firing rate (RT/MFR) relationships. A 2-way mixed factorial ANOVA (sensor [4-ch vs 64-ch] × contraction intensity [10% vs 20% vs 50%]) was used to examine mean differences in MU yield during all contractions. For validated MUs, the RT/MFR relationships were compared between sensors at each intensity, and a paired-samples t-test was used to compare differences in RTs. RESULTS: There was a significant interaction between sensor and intensity, as well as a main effect for intensity, with follow-up analysis revealing a significant difference between MUs validated at 10% and 50% MVC (p < 0.05).
There was a significant difference in slopes at 10% and 50% MVC, and in y-intercepts at 20% MVC, for the RT/MFR relationships (p < 0.10), and the RTs of validated MUs were significantly different (p < 0.05) between sensors at each intensity. CONCLUSION: MUs validated using the 4-ch sensor were yielded in greater numbers during higher contraction intensities versus the 64-ch sensor. The inability of the 64-ch sensor to yield a greater number of MUs at 50% MVC may have been due to the subjectivity of the manual editing procedures. However, both validation procedures eliminated a high number of decomposed MUs.
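The RT/MFR slope and y-intercept described above come from an ordinary least-squares fit of mean firing rate on recruitment threshold. A minimal stdlib-only sketch of that calculation follows; the MU values are hypothetical illustrations, not data from the study:

```python
def rt_mfr_regression(rt, mfr):
    """Ordinary least-squares fit of mean firing rate on recruitment
    threshold; returns (slope, y_intercept) of the RT/MFR relationship."""
    n = len(rt)
    mean_rt = sum(rt) / n
    mean_mfr = sum(mfr) / n
    # Covariance and variance sums around the means
    sxy = sum((x - mean_rt) * (y - mean_mfr) for x, y in zip(rt, mfr))
    sxx = sum((x - mean_rt) ** 2 for x in rt)
    slope = sxy / sxx
    intercept = mean_mfr - slope * mean_rt
    return slope, intercept

# Hypothetical validated MUs from one ramp contraction
rt = [5.0, 12.0, 21.0, 33.0, 44.0]   # recruitment threshold, %MVC
mfr = [22.0, 19.5, 16.0, 12.5, 9.0]  # mean firing rate, pulses/s
slope, intercept = rt_mfr_regression(rt, mfr)
# slope is negative: higher-threshold MUs fire at lower mean rates
```

Comparing these per-participant slopes and intercepts between sensors is then a matter of paired tests on the fitted coefficients.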

    Bilateral Comparison of Elbow Ulnar Collateral Ligament Thickness in Division I Baseball Athletes

    It has been established that the mechanics of throwing a baseball contribute to ulnar collateral ligament (UCL) injury because of the repetitive valgus stress on the anterior band of the UCL, which is the primary stabilizer against valgus stress at the elbow. This repetitive stress has been shown to cause adaptations in tensile properties leading to damage or failure of the UCL. Previous research has shown that ultrasound imaging may be able to detect changes in the UCL before the onset of symptoms in pitchers. PURPOSE: This study was performed to compare bilateral UCL thickness in Division I baseball players. METHODS: 22 Division I baseball players from the same university participated in this study. The players were separated into two groups based on position: pitchers (n = 11; 190.9 ± 7.8 cm; 92.3 ± 11.2 kg) and position players (n = 11; 184.6 ± 7.0 cm; 85.9 ± 10.8 kg). Ultrasound images were captured bilaterally on the players’ elbows using B-mode ultrasonography (LOGIQ, GE Healthcare). Players were tested seated in a reclined position with the arm resting on a plinth, the shoulder abducted to 90°, the elbow extended to 150°, and the forearm supinated. The midpoint thickness of the UCL was measured in millimeters using ImageJ (National Institutes of Health). The difference between throwing-arm and non-throwing-arm UCL thickness was calculated as ‘UCL Dif’. A two-way mixed-factor ANOVA (position [pitcher vs position player] × arm [throwing arm vs non-throwing arm]) was used to analyze UCL thickness. RESULTS: There was no position × arm interaction (p = 0.735); however, there were significant main effects for position (p = 0.003) and arm (p ≀ 0.001). When collapsed across positions, UCL thickness was significantly greater in the throwing arm compared to the non-throwing arm (0.621 ± 0.008 mm vs. 0.581 ± 0.008 mm). When collapsed across arms, pitchers had significantly greater UCL thickness compared to position players (0.618 ± 0.008 mm vs. 0.518 ± 0.008 mm). CONCLUSION: UCL thickness in the pitchers’ throwing elbows was greater compared to position players. The greater UCL thickness in the throwing arm appears to be an adaptive change to the repetitive stress of pitching. We believe the known relationships between UCL thickness, player position, and arm dominance are best explained by positional demands and throwing style.

    Grip Strength Symmetries in Division I College Baseball Pitchers and Hitters

    Integrating strength and conditioning coaches and programs for baseball athletes has yielded positive performance outcomes for both hitting and throwing. Among a variety of baseball-specific testing batteries, grip strength has been shown to correlate significantly with increased swing and throwing velocity. However, no investigations have examined grip strength asymmetries in hitters and pitchers. PURPOSE: The purpose of this study was to examine differences between right- and left-arm grip strength of baseball pitchers and hitters. METHODS: Division I collegiate baseball players (n = 45; height 183.52 ± 11.77 cm; weight 85.96 ± 17.73 kg) performed dominant and non-dominant maximal grip strength tests at position-specific arm and forearm orientations using the Jamar Hydraulic Hand Dynamometer. Hitters (n = 22) performed grip strength assessments at 90° elbow flexion with a neutral forearm orientation (NDN). Pitchers (n = 23) performed grip strength assessments at 90° elbow flexion with a pronated forearm orientation (NDP). Three attempts were permitted to exert maximal force, recorded in kilograms (kg); the highest exerted force was used for analysis. An independent-samples t-test (p < .05) was employed to assess dominant versus non-dominant grip strength differences. RESULTS: The results indicated no significant differences between pitchers’ dominant NDP (57.39 ± 7.49 kg) and non-dominant NDP (56.0 ± 7.63 kg), t(44) = .624, p = .966. Likewise, hitters presented no significant difference between dominant NDN (60.68 ± 10.15 kg) and non-dominant NDN (55.27 ± 11.31 kg), t(42) = 1.669, p = .264. CONCLUSION: Contrary to common belief, these results suggest baseball players do not present significant grip strength asymmetries. While baseball skills (i.e., throwing, hitting) require adequate grip strength to produce favorable performance outcomes, these single-arm/hand movements do not place baseball players in a concerning state of asymmetrical grip strength. Utilization of both hands during hitting provides a reasonable explanation for the hitters’ results. For pitchers, glove-arm movement and skill, along with typical strength and conditioning, may contribute to improvements in non-dominant grip strength. Furthermore, these results suggest equivalent bilateral strength may be a necessity for collegiate baseball players.
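The dominant vs. non-dominant comparison above used an independent-samples t-test. A stdlib-only sketch of the pooled-variance form of that statistic follows; the grip values are hypothetical, not the study's data:

```python
import math

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic and degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Sample variances (n - 1 denominator)
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # Pooled variance, weighted by each group's degrees of freedom
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical maximal grip strength (kg): dominant vs. non-dominant hand
dom = [58.0, 61.0, 55.0, 60.0, 57.0]
nondom = [56.0, 59.0, 54.0, 58.0, 55.0]
t, df = independent_t(dom, nondom)
```

The resulting t is then compared against the critical value for the reported degrees of freedom at the chosen alpha (here p < .05).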

    A Comparison of Techniques for Decomposing Surface Electromyography Signals During High-Intensity Contractions: A Preliminary Analysis

    Advancements in surface electromyography (sEMG) have led to discrepancies in the identification of high-threshold motor units (MUs) following signal decomposition. PURPOSE: To examine differences in MU firing behaviors recorded from two separate sEMG sensors following their respective decomposition analyses. METHODS: Following 2 maximal voluntary contractions (MVC), ten lower-body resistance-trained males (23 ± 3 yrs; 178.64 ± 5.82 cm; 177.8 ± 17.37 kg) performed a 10 sec submaximal (50%) isometric ramp contraction of the knee extension exercise. Signals were recorded from the vastus lateralis and separately decomposed into their constituent MU action potential trains, then further validated for subsequent analysis of firing behaviors. The slope and y-intercept were calculated for the recruitment threshold versus mean firing rate (RT/MFR) relationship. Two separate paired-samples t-tests were used to compare differences in regression coefficients for the RT/MFR relationships between sensors, and differences in the RTs of validated MUs during 50% MVC. RESULTS: There were significant differences between the two sensors in the RT/MFR coefficients (p < 0.05), as well as in the respective RTs of the identified MUs. CONCLUSION: We are uncertain as to what led to the differences between the two sEMG systems (decomposition algorithms, validation techniques, application area, etc.). It is feasible that the substantial difference in yield (i.e., number of validated MUs), possibly due to different validation criteria, affected the outcomes. Thus, further studies should examine the effects of the manual editing validation process on the end results.

    The Effect of Perceptually Regulated Recovery on Lift Quality of Females during Different Resistance Training Conditions

    Previous investigations have identified perceptually regulated recovery as a comparable, and occasionally optimal, method of regulating rest time between bouts of exercise. Additionally, substantial variances in mean self-selected recovery time between males and females have recently been reported during resistance exercise intended to improve muscular strength. These findings suggest females recover more quickly than males, and the current standardized intrasession rest recommendations may require adjustment to account for sex-specific responses. PURPOSE: The goal of the current investigation was to examine female lift quality responses to perceptually regulated intrasession recovery during various resistance training protocols. METHODS: Participants (n = 10 females; 7 control, 7 experimental) completed nine resistance exercise sessions. Session one involved one-repetition maximum (1RM) testing for the squat (SQ) and deadlift (DL). Participants performed eight subsequent working sessions using intensity, set, repetition, and intrasession recovery schemes for the DL and SQ targeting the four primary resistance training goals: hypertrophy (HP), strength (ST), endurance (ED), and power (PW). A minimum of 48 hours of rest was required between sessions. The control group used standardized rest intervals between sets. The experimental group used the Perceived Recovery Status (PRS) scale to guide recovery. Rating of perceived exertion (RPE), using the OMNI RPE scale for resistance training, was recorded after each set. Lift quality (LQ), defined as repetitions completed, was recorded for each set. RESULTS: One-sample t-tests indicated significant differences (p = 0.02; p = 0.05). Mean experimental group times for the PW SQ and DL sessions (105.73 ± 18.13 s and 100.73 ± 26.80 s, respectively) were significantly lower than the standardized time (120 s) (p = 0.04; p = 0.03). Mean experimental group time for the HP SQ session (46 ± 17.57 s) was significantly lower than the standardized time (60 s) (p = 0.04). All other PRS and time comparisons were not significantly different. A repeated-measures ANOVA indicated no significant main effect in LQ across all sessions, nor in group RPE for the HP, ST, and PW sessions. A significant main effect (p = .046) was identified between group-reported RPE during the ME session. CONCLUSION: These results suggest that perceptually regulated recovery yields lift performance comparable to standardized recovery. The control group reported significantly higher perceived recovery during ME sessions than the experimental group, and experimental group rest times were significantly lower during HP sessions, suggesting the current standardized rest time recommendations for ME and HP training may necessitate an alteration (a decrease for females) to account for sex-specific responses. Interestingly, the control group reported significantly higher perceived exertion when placed under time-restricted recovery (standardized recovery) during ME training. The researchers suggest future investigations examine the practical resistance training efficacy of the PRS scale for females.
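The comparisons of self-selected rest against a fixed standard above reduce to a one-sample t-test against a reference value. A stdlib-only sketch follows; the rest times and the 120 s reference mirror the power-session standard but are hypothetical illustrations:

```python
import math

def one_sample_t(sample, mu):
    """One-sample t statistic of a sample mean against a fixed reference mu."""
    n = len(sample)
    mean = sum(sample) / n
    # Sample variance (n - 1 denominator)
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return (mean - mu) / math.sqrt(var / n)

# Hypothetical PRS-guided rest times (s) vs. a 120 s standardized interval
rest = [104.0, 98.0, 112.0, 95.0, 101.0]
t = one_sample_t(rest, 120.0)
# Negative t indicates self-selected rest shorter than the standard
```

A negative t exceeding the critical value for n - 1 degrees of freedom would correspond to the significantly shorter experimental-group rest times reported above.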

    IKK2/NFkB signaling controls lung resident CD8+ T cell memory during influenza infection

    Abstract CD8+ tissue-resident memory T (TRM) cells are especially suited to controlling pathogen spread at mucosal sites. However, their maintenance in the lung is short-lived. TCR-dependent NFkB signaling is crucial for T cell memory, but how and when NFkB signaling modulates tissue-resident and circulating T cell memory during the immune response is unknown. Here, we find that enhancing NFkB signaling in T cells once memory to influenza is established increases pro-survival Bcl-2 and CD122 levels, thus boosting lung CD8+ TRM maintenance. By contrast, enhancing NFkB signals during the contraction phase of the response leads to a defect in CD8+ TRM differentiation without impairing recirculating memory subsets. Specifically, inducible activation of NFkB via constitutively active IKK2 or TNF interferes with TGFÎČ signaling, resulting in defects in the lung CD8+ TRM imprinting molecules CD69, CD103, Runx3, and Eomes. Conversely, inhibiting NFkB signals not only recovers but improves the transcriptional signature and generation of lung CD8+ TRM. Thus, NFkB signaling is a critical regulator of tissue-resident memory, whose levels can be tuned at specific times during infection to boost lung CD8+ TRM.

    Effects of a pre-workout supplement on hyperemia following leg extension resistance exercise to failure with different resistance loads

    Abstract Background: We sought to determine if a pre-workout supplement (PWS) containing multiple ingredients thought to enhance blood flow increases the hyperemia associated with resistance training compared to placebo (PBO). Given the potential interaction with training loads/time under tension, we evaluated the hyperemic response at two different loads to failure. Methods: Thirty males participated in this double-blinded study. At visit 1, participants were randomly assigned to consume PWS (Recklessℱ) or PBO (maltodextrin and glycine) and, 45 min thereafter, performed four sets of leg extensions to failure at 30% or 80% of their 1-RM. One week later (visit 2), participants consumed the same supplement as before but exercised at the alternate load. Heart rate (HR), blood pressure (BP), femoral artery blood flow, and plasma nitrate/nitrite (NOx) were assessed at baseline (BL), 45 min post-PWS/PBO consumption (PRE), and 5 min following the last set of leg extensions (POST). Vastus lateralis near-infrared spectroscopy (NIRS) was employed during leg extension exercise. Repeated-measures ANOVAs were performed with time, supplement, and load as independent variables, with Bonferroni correction applied for multiple post-hoc comparisons. Data are reported as mean ± SD. Results: Significantly more repetitions were performed with the 30% training load compared to 80% (p < 0.05). NIRS-derived minimum oxygenated hemoglobin (O2Hb) was lower in the 80% load condition compared to 30% for all rest intervals between sets of exercise (p < 0.0167). HR and BP did not vary as a function of supplement or load. Femoral artery blood flow at POST was higher independent of exercise load and treatment. However, a time × supplement × load interaction was observed, revealing greater femoral artery blood flow with PWS compared to PBO at POST in the 80% (+56.8%; p = 0.006) but not the 30% load condition (+12.7%; p = 0.476). Plasma NOx was ~3-fold higher with PWS compared to PBO at PRE and POST (p < 0.001). Conclusions: Compared to PBO, the PWS consumed herein augmented hyperemia following multiple sets to failure at 80% of 1-RM, but not 30%. This specificity may be a product of interaction with local perturbations (e.g., reduced tissue oxygenation levels [minimum O2Hb] in the 80% load condition) and/or muscle fiber recruitment.