The Effects of Summer Training on Neuromuscular Performance in Semi-Professional Soccer Players
While training load (TL) and heat exposure have been shown to independently influence neuromuscular performance, their combined effects have not been fully explored. PURPOSE: To investigate the effects of summer soccer training on neuromuscular performance in semi-professional male soccer players. METHODS: Twenty-one semi-professional male soccer players (age: 21.4 ± 1.9 years; mass: 77.3 ± 7.0 kg; height: 179.2 ± 6.4 cm) visited the laboratory on two occasions separated by three weeks of training. During each visit, ultrasound images were acquired to determine the muscle cross-sectional area (CSA) of the dominant rectus femoris (RF) and vastus lateralis (VL). Maximal voluntary contractions (MVCs) of the dominant leg extensors and flexors were performed to calculate peak torque (PT) and rate of torque development (RTD). Muscle excitation of the RF, VL, and biceps femoris was assessed from electromyographic root mean square (EMGRMS) values during each MVC. Internal and external TL metrics were collected via GPS-enabled accelerometers during all practices and matches. Linear regression models were used to assess the association between cumulative TL and changes in neuromuscular performance across the three weeks of summer training. RESULTS: Lower cumulative maximal accelerations (ACCMAX) and decelerations (DECMAX) were associated with higher PT in the RF during extension (β = -285, p = 0.007; β = -272, p = 0.026, respectively). However, greater cumulative ACCMAX and DECMAX were associated with higher PT in the VL during extension (β = 175, p = 0.016; β = 162, p = 0.012, respectively). Additionally, greater cumulative total distance covered (TD) was associated with lower PT in the VL during flexion (β = -0.09, p = 0.004). Greater cumulative ACCMAX and DECMAX were associated with higher EMGRMS in the VL during flexion (β = 119, p = 0.026; β = 130, p = 0.029, respectively). No significant relationships were observed between the other TL measures and RF or VL CSA or RTD. CONCLUSION: Changes in muscle excitation and force production characteristics of the RF and VL were observed after a three-week training period during the summer. This suggests that three weeks of accumulated TL, in combination with heat exposure, may influence the injury risk and performance of semi-professional soccer athletes.
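A minimal sketch, in Python, of the kind of single-predictor regression described above (not the authors' code); the DataFrame and column names are hypothetical.

```python
# Hypothetical sketch: regress the 3-week change in a neuromuscular outcome
# (e.g. RF extension peak torque) on one cumulative TL metric (e.g. ACCMAX).
import pandas as pd
import statsmodels.api as sm

def fit_tl_model(players: pd.DataFrame, load_col: str, outcome_col: str):
    """Return the slope (beta) and p-value for a cumulative-TL predictor."""
    X = sm.add_constant(players[[load_col]])   # intercept + single TL predictor
    model = sm.OLS(players[outcome_col], X).fit()
    return model.params[load_col], model.pvalues[load_col]

# Example call with made-up column names:
# beta, p = fit_tl_model(players, "cum_acc_max", "delta_pt_rf_extension")
```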
Relationship Between Muscle Cross-Sectional Area and Counter-Movement Jump Performance in Semi-Professional Athletes
Muscle cross-sectional area (CSA) has been identified as a major predictor of force production capabilities in athletes, while the countermovement jump (CMJ) can be used to assess power production translated into jump height. However, there is a lack of literature examining the relationship between CSA and CMJ performance in athletes at different timepoints throughout a competitive season. PURPOSE: To examine the relationship between changes in lower limb muscle CSA and CMJ performance in semi-professional soccer players over the course of a season. METHODS: Twenty-three male players (mean ± standard deviation; age: 21.3 ± 2 years; height: 178.4 ± 6.7 cm; mass: 77.2 ± 7.9 kg) visited the laboratory during the pre-, mid-, and post-season, for a total of three visits over the course of the soccer season. During each visit, body mass (BM) was measured first; then two ultrasound images of the dominant rectus femoris (RF) and vastus lateralis (VL) were acquired. Additionally, each player performed two CMJs, with the highest jump height used for data analysis. Percentage changes from baseline in BM (%BM), CSA (%CSA), and CMJ height (%CMJ) were calculated for each player. Regression analyses were conducted to predict %CMJ from %CSA, with and without controlling for %BM. RESULTS: Lower VL %CSA predicted higher %CMJ without controlling for %BM (r² = 0.276, p = .007) and when %BM was controlled (r² = 0.294, p = .022). However, RF %CSA did not predict %CMJ with or without controlling for %BM (p > .05). CONCLUSION: In semi-professional soccer players, lower VL CSA predicted higher CMJ performance. Because the explained variance changed little when %BM was controlled, %BM was not the predominant variable in this relationship; other factors, such as training load and recovery status, may underlie the association.
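A minimal sketch, in Python, of the with/without-covariate comparison described above (not the authors' analysis); the column names are hypothetical.

```python
# Hypothetical sketch: predict %CMJ from VL %CSA, alone and controlling for %BM.
import pandas as pd
import statsmodels.formula.api as smf

def cmj_models(df: pd.DataFrame):
    """df is assumed to have columns pct_cmj, pct_csa_vl, pct_bm (one row per player)."""
    uncontrolled = smf.ols("pct_cmj ~ pct_csa_vl", data=df).fit()
    controlled = smf.ols("pct_cmj ~ pct_csa_vl + pct_bm", data=df).fit()
    # If r-squared barely changes when pct_bm is added, %BM is not driving the relationship.
    return uncontrolled.rsquared, controlled.rsquared
```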
Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries
Abstract
Background
Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres.
Methods
This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries.
Results
In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable to more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. In phase 4, the top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia.
Conclusion
This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.
Risk Assessment Predicts Most of the Salmonellosis Risk in Raw Chicken Parts is Concentrated in Those Few Products with High Levels of High-Virulence Serotypes of Salmonella
Salmonella prevalence in U.S. raw poultry products has declined since the adoption of prevalence-based Salmonella performance standards, but human illnesses have not decreased proportionally. We used Quantitative Microbial Risk Assessment (QMRA) to evaluate the public health risks of raw chicken parts contaminated with different levels of all Salmonella and of specific high- and low-virulence serotypes. Lognormal Salmonella level distributions were fitted to the 2012 USDA-FSIS Baseline parts survey and 2023 USDA-FSIS HACCP verification sampling data. Three dose-response (DR) approaches were used: (i) a single DR model for all serotypes, (ii) a DR model with reduced Salmonella Kentucky ST152 virulence, and (iii) multiple serotype-specific DR models. All scenarios found risk concentrated in the few products with high Salmonella levels. Using a single DR model with Baseline data (μ = −3.19, σ = 1.29 Log CFU/g), 68% and 37% of illnesses were attributed to the 0.7% and 0.06% of products with >1 and >10 CFU/g Salmonella, respectively. Using distributions from 2023 HACCP data (μ = −5.53, σ = 2.45), 99.8% and 99.0% of illnesses were attributed to the 1.3% and 0.4% of products with >1 and >10 CFU/g Salmonella, respectively. Scenarios with serotype-specific DR models showed risk concentrated even more strongly at higher levels: Baseline data showed 92% and 67%, and HACCP data showed >99.99% and 99.96%, of illnesses attributed to products with >1 and >10 CFU/g Salmonella, respectively. Regarding serotypes, using Baseline or HACCP input data, 0.002% and 0.1% of illnesses were attributed to the 0.2% and 0.4% of products with >1 CFU/g of Kentucky ST152, respectively, while 69% and 83% of illnesses were attributed to the 0.3% and 0.6% of products with >1 CFU/g of Enteritidis, Infantis, or Typhimurium, respectively. Therefore, the public health risk in chicken parts is concentrated in the finished products with high Salmonella levels, and specifically high levels of high-virulence serotypes. Low-virulence serotypes like Kentucky contribute few human cases.
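A back-of-the-envelope check, in Python, of the tail fractions quoted above, assuming the fitted lognormal distributions describe log10(CFU/g) as normally distributed with the stated μ and σ (a sketch, not the study's QMRA code).

```python
# Sketch: fraction of products above a level threshold, assuming
# log10(Salmonella CFU/g) ~ Normal(mu, sigma) with the parameters quoted above.
import math
from scipy.stats import norm

def frac_above(threshold_cfu_per_g: float, mu: float, sigma: float) -> float:
    """Fraction of products whose Salmonella level exceeds the threshold."""
    return norm.sf(math.log10(threshold_cfu_per_g), loc=mu, scale=sigma)

# 2012 Baseline parts survey (mu = -3.19, sigma = 1.29 log10 CFU/g):
print(frac_above(1, -3.19, 1.29))    # ~0.0067 -> ~0.7% of products >1 CFU/g
print(frac_above(10, -3.19, 1.29))   # ~0.0006 -> ~0.06% of products >10 CFU/g

# 2023 HACCP verification data (mu = -5.53, sigma = 2.45):
print(frac_above(1, -5.53, 2.45))    # ~0.012 -> ~1.2%, close to the ~1.3% quoted
print(frac_above(10, -5.53, 2.45))   # ~0.004 -> ~0.4% of products >10 CFU/g
```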
Optimal Ranges of Training Load and Recovery Status Prior to the Game to Maximize Game-day Performance in Soccer
Identifying ideal ranges of training load (TL) and recovery status before soccer games is essential to maximize game-day performance. PURPOSE: To determine how cumulative TL and subjective fatigue affect performance during the game. METHODS: Twenty-one male semi-professional soccer players (mean ± standard deviation; age: 22 ± 2 years; mass: 77.3 ± 6.9 kg) wore a player tracking device to monitor TL during each practice and game. A 7-day accumulation of high-speed running and sprinting (HSR+Sprint, ≥12.30 mph), total distance (TD), low acceleration (LowACC: 0.50–1.99 m·s⁻²), and high acceleration (HighACC: 2.00–50.00 m·s⁻²) was calculated. Also, 7-day average fatigue and pre-game fatigue were monitored using a 0 to 10 Likert scale. Game-day performance was defined as the percent change in average HR and velocity between the 1st and 2nd halves. Standard least squares regression analysis was performed to determine whether TL metrics and fatigue predicted game-day performance. Then, a predictive modeling decision tree was used to establish optimal ranges for each variable and the corresponding probability of positive game-day performance. RESULTS: TD, LowACC, HighACC, HSR+Sprint, and game-day fatigue predicted game-day performance in HR (p < 0.05), with TD within its identified optimal range associated with maintaining performance in the 2nd half compared to the 1st half. Similarly, achieving LowACC between 0.8 and 1.6 miles and HighACC between 0.1 and 0.2 miles had a 74% and 82% probability, respectively, of maintaining performance in the 2nd half. Additionally, HSR+Sprint between 0.1 and 0.9 miles in practice had a 77% probability, and game-day fatigue between 2 and 4 (small to somewhat fatigued) had an 80% probability, of maintaining performance. For velocity, TD, LowACC, HighACC, and 7-day average fatigue predicted game-day performance (p < 0.05) in the 2nd half. CONCLUSION: To achieve better game-day performance, players need to stay within the designated ranges of TL, specifically for TD, LowACC, HighACC, and HSR+Sprint. Also, maintaining low levels of fatigue can lead to better game-day performance. Based on these findings, coaches can create individualized training and recovery plans for their players during practice to optimize performance in the game.
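A minimal sketch, in Python, of how a shallow decision tree can yield TL ranges and the probability of maintaining 2nd-half performance (not the study's model); the column names are hypothetical.

```python
# Hypothetical sketch: split 7-day TL/fatigue metrics into ranges and report the
# leaf-level probability that game-day performance was maintained.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

def fit_range_tree(df: pd.DataFrame, features: list):
    """df has the 7-day TL/fatigue columns in `features` plus a binary
    'maintained' column (1 = performance maintained in the 2nd half)."""
    tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=5)
    tree.fit(df[features], df["maintained"])
    # The printed split thresholds are candidate "optimal range" boundaries;
    # each leaf's class fraction is the probability of a positive game day.
    print(export_text(tree, feature_names=features))
    return tree

# Example call with made-up column names:
# fit_range_tree(games, ["td_7d", "low_acc_7d", "high_acc_7d", "hsr_sprint_7d"])
```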
Hurdle Approach to Simulate Corn Wet Milling Inactivation of Undesirable Microorganisms: A Pilot Scale Microbial Challenge Study Using Salmonella Surrogate Enterococcus faecium
Corn wet milling (CWM) and corn starch flash drying processing conditions reduce undesirable microorganisms such as Salmonella. Finished products are historically safe, with intrinsic properties such as low water activity inhibiting microbial growth. Corn processors could use the quantified reductions of the Salmonella surrogate Enterococcus faecium (E. faecium) reported in this study to update their food safety plans. Industry-relevant CWM conditions were recreated at pilot or lab scale for three unit operations: (1) steeping in sulfur dioxide (SO2) at low (750 ppm SO2, 20 h, 43.3 °C), medium (1,500 ppm SO2, 30 h, 48.9 °C), and high (2,200 ppm SO2, 40 h, 53.3 °C) treatment conditions; (2) hydrogen peroxide (H2O2) treatment tested at bench scale with a factorial design of pH (3.5, 4.0, and 4.5), H2O2 concentration (0.05%, 0.10%, and 0.15% (w/w)), and temperature (32, 38, and 46 °C) for 3 and 6 h; and (3) flash drying at four temperatures (149, 177, 204, and 232 °C) with two inoculation methods. E. faecium was reduced during each of these unit operations. By the end of each steeping treatment, E. faecium was consistently below the limit of quantitation (LOQ), corresponding to a >6.5 log CFU/mL reduction in steep water and a >3.7 log CFU/g reduction in ground corn. The peroxide step produced reductions ranging from 0.03 log CFU/mL in the control group (0% H2O2 added) to >6 log CFU/mL in the high-intensity treatment of corn starch slurry. Flash drying produced reductions ranging from 1.7 to 2.7 log CFU/g. There was also no biologically meaningful change (<1 log CFU/g reduction) in E. faecium counts during an 8-week survival study of the dried final product. This hurdle-approach study shows that existing CWM conditions are effective for Salmonella surrogate reduction through processing into finished starch and provides quantified E. faecium reductions for use in food safety plans.
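A minimal illustration, in Python, of how log reductions like those reported above are computed and how counts below the LOQ yield censored ">x log" values; the numbers in the example call are hypothetical, not the study's data.

```python
# Sketch: log10 reduction between an inoculated count and a post-treatment count.
import math

def log_reduction(initial_cfu: float, final_cfu: float, loq_cfu: float) -> str:
    """Report the log10 reduction; censor at the LOQ when no survivors are quantifiable."""
    if final_cfu < loq_cfu:                     # below limit of quantitation
        return f">{math.log10(initial_cfu / loq_cfu):.1f} log reduction"
    return f"{math.log10(initial_cfu / final_cfu):.1f} log reduction"

# Hypothetical example: an 8.5e7 CFU/mL inoculum with a 25 CFU/mL LOQ and no
# quantifiable survivors reports as a censored ">6.5 log reduction".
print(log_reduction(8.5e7, 0, 25))
```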
Free-response self-description as a predictor of success and failure in adolescent exchange students
Determining the Geographic Origin of Animal Samples
While identifying the species from an unknown sample is a first, and often crucial, step, it may also be important to identify the region of origin of the sample. This becomes an important issue when a species is protected in one region but not in another, or when wild animals are caught and sold as captive bred. The movement of an animal can also be important in cases such as those of stolen animals, or even for conservation purposes. One way to determine the origin of a sample is by comparing the ratios of different isotopes using methods such as inductively coupled plasma mass spectrometry (ICP-MS) and isotope ratio mass spectrometry (IRMS). This chapter gives a very basic understanding of the principles of isotopes, ICP-MS, and IRMS and their uses in tracing the movements of species. It is written for readers with little to no understanding of isotopes or their use in forensic wildlife crime and who come from a mainly biological background. In addition, case examples using these techniques are presented.
Development of carbon-11 labelled PET tracers: radiochemical and technological challenges in a historic perspective
The development of positron emission tomography (PET) from an exclusive and expensive research tool at major research institutes to a clinically useful modality found at most major hospitals around the world is largely due to radiochemistry and synthesis technology achievements by a few pioneering researchers who started their PET careers 40 to 50 years ago. In particular, the introduction of [C-11]methyl iodide was a quantum leap in the history of PET tracer development, enabling the smooth labelling of a multitude of useful tracers. A more recent and still challenging methodological improvement is transition-metal-mediated C-11 carbonylation, which has a large synthetic potential that has not yet been realized in the clinical setting. This mini-review focuses on the history of carbon-11 radiochemistry and related technology developments and the role these played in PET tracer development, with particular emphasis on the radiolabelling of endogenous compounds. A few examples are presented of how radiolabelled endogenous substances have provided fundamental information on in vivo biochemistry using the concept of position-specific labelling in different positions of the same molecule.