
    Neuromuscular, endocrine, and perceptual fatigue responses during different length between-match microcycles in professional rugby league players

    The purpose of this study was to examine the changes in neuromuscular, perceptual and hormonal measures following professional rugby league matches during different length between-match microcycles. Methods: Twelve professional rugby league players from the same team were assessed for changes in countermovement jump (CMJ) performance (flight time and relative power), perceptual responses (fatigue, well-being and muscle soreness) and salivary hormone (testosterone [T] and cortisol [C]) levels during 5, 7 and 9 d between-match training microcycles. All training was prescribed by the club coaches and was monitored using the session-RPE method. Results: A lower mean daily training load was completed in the 5 d microcycle compared with the 7 and 9 d microcycles. CMJ flight time and relative power, perception of fatigue, overall well-being and muscle soreness were significantly reduced in the 48 h following the match in each microcycle (P < .05). Most CMJ variables returned to near-baseline values following 4 d in each microcycle. Countermovement jump relative power was lower in the 7 d microcycle than in the 9 d microcycle (P < .05). Perceived fatigue was increased at 48 h in the 7 and 9 d microcycles (P < .05) but had returned to baseline in the 5 d microcycle. Salivary T and C did not change in response to the match. Discussion: Neuromuscular performance and perception of fatigue are reduced for at least 48 h following a rugby league match but can recover to baseline levels within 4 d. These findings show that, with appropriate training, it is possible to recover neuromuscular and perceptual measures within 4 d after a rugby league match. © Human Kinetics, Inc.
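The session-RPE method used to monitor training in this study multiplies an athlete's rating of perceived exertion by session duration. A minimal sketch, with illustrative session values that are not taken from the study:

```python
# Session-RPE training load: load (AU) = rating of perceived exertion
# (CR-10 scale) x session duration (min). Session values are hypothetical.

def srpe_load(rpe, duration_min):
    """Training load for one session, in arbitrary units (AU)."""
    return rpe * duration_min

def mean_daily_load(sessions, days):
    """Mean daily load across a between-match microcycle."""
    total = sum(srpe_load(rpe, dur) for rpe, dur in sessions)
    return total / days

# Hypothetical 5-day microcycle: (RPE, minutes) for each session
five_day_sessions = [(7, 60), (5, 45), (6, 70)]
mean_5d = mean_daily_load(five_day_sessions, days=5)
```

Comparing `mean_daily_load` across 5, 7 and 9 d microcycles mirrors the between-microcycle load comparison reported above.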

    Quantifying Training and Game Demands of a National Basketball Association Season.

    Purpose: There are currently no data describing combined practice and game load demands throughout a National Basketball Association (NBA) season. The primary objective of this study was to integrate external load data garnered from all on-court activity throughout an NBA season, according to different activity and player characteristics. Methods: Data from 14 professional male basketball players (mean ± SD; age, 27.3 ± 4.8 years; height, 201.0 ± 7.2 cm; body mass, 104.9 ± 10.6 kg) playing for the same club during the 2017-2018 NBA season were retrospectively analyzed. Game and training data were integrated to create a consolidated external load measure, which was termed integrated load. Players were categorized by years of NBA experience (1-2 y, 3-5 y, 6-9 y, and 10+ y), position (frontcourt and backcourt), and playing rotation status (starter, rotation, and bench). Results: Total weekly duration was significantly different (p < 0.001) between years of NBA playing experience, with duration highest in 3-5 year players, compared with 6-9 (d = 0.46) and 10+ (d = 0.78) year players. Starters experienced the highest integrated load, compared with bench (d = 0.77) players. There were no significant differences in integrated load or duration between positions. Conclusion: This is the first study to describe the seasonal training loads of NBA players for an entire season and shows that most training load is accumulated in non-game activities. This study highlights the need for integrated and unobtrusive training load monitoring, with engagement of all stakeholders to develop well-informed individualized training prescription to optimize preparation of NBA players.

    Measuring Physical Demands in Basketball: An Explorative Systematic Review of Practices.

    BACKGROUND: Measuring the physical work and resultant acute psychobiological responses of basketball can help to better understand and inform physical preparation models and improve overall athlete health and performance. Recent advancements in training load monitoring solutions have coincided with increases in the literature describing the physical demands of basketball, but there are currently no reviews that summarize all the available basketball research. Additionally, a thorough appraisal of the load monitoring methodologies and measures used in basketball is lacking in the current literature. This type of critical analysis would allow for consistent comparison between studies to better understand physical demands across the sport. OBJECTIVES: The objective of this systematic review was to assess and critically evaluate the methods and technologies used for monitoring physical demands in competitive basketball athletes. We used the term 'training load' to encompass the physical demands of both training and game activities, with the latter assumed to provide a training stimulus as well. This review aimed to critique methodological inconsistencies, establish operational definitions specific to the sport, and make recommendations for basketball training load monitoring practice and reporting within the literature. METHODS: A systematic review of the literature was performed using EBSCO, PubMed, SCOPUS, and Web of Science to identify studies through March 2020. Electronic databases were searched using terms related to basketball and training load. Records were included if they used a competitive basketball population and incorporated a measure of training load. This systematic review was registered with the International Prospective Register of Systematic Reviews (PROSPERO Registration # CRD42019123603), and approved under the National Basketball Association (NBA) Health Related Research Policy.
RESULTS: Electronic and manual searches identified 122 papers that met the inclusion criteria. These studies reported the physical demands of basketball during training (n = 56), competition (n = 36), and both training and competition (n = 30). Physical demands were quantified with a measure of internal training load (n = 52), external training load (n = 29), or both internal and external measures (n = 41). These studies examined males (n = 76), females (n = 34), both male and female (n = 9), and a combination of youth (i.e. under 18 years, n = 37), adults (i.e. 18 years or older, n = 77), and both adults and youth (n = 4). Inconsistencies related to the reporting of competition level, methodology for recording duration, participant inclusion criteria, and validity of measurement systems were identified as key factors relating to the reporting of physical demands in basketball and summarized for each study. CONCLUSIONS: This review comprehensively evaluated the current body of literature related to training load monitoring in basketball. Within this literature, there is a clear lack of alignment in applied practices and methodological framework, and with only small data sets and short study periods available at this time, it is not possible to draw definitive conclusions about the true physical demands of basketball. A detailed understanding of modern technologies in basketball is also lacking, and we provide specific guidelines for defining and applying duration measurement methodologies, vetting the validity and reliability of measurement tools, and classifying competition level in basketball to address some of the identified knowledge gaps. Creating alignment in best-practice basketball research methodology, terminology and reporting may lead to a more robust understanding of the physical demands associated with the sport, thereby allowing for exploration of other research areas (e.g. injury, performance), and improved understanding and decision making in applying these methods directly with basketball athletes.

    Understanding 'monitoring' data-the association between measured stressors and athlete responses within a holistic basketball performance framework.

    This study examined associations between cumulative training load, travel demands and recovery days with athlete-reported outcome measures (AROMs) and countermovement jump (CMJ) performance in professional basketball. Retrospective analysis was performed on data collected from 23 players (mean ± SD: age = 24.7 ± 2.5 years, height = 198.3 ± 7.6 cm, body mass = 98.1 ± 9.0 kg, wingspan = 206.8 ± 8.4 cm) from 2018-2020 in the National Basketball Association G-League. Linear mixed models were used to describe variation in AROMs and CMJ data in relation to cumulative training load (previous 3 and 10 days), hours travelled (previous 3 and 10 days), days away from the team's home city, recovery days (i.e., no travel/minimal on-court activity) and individual factors (e.g., age, fatigue, soreness). Cumulative 3-day training load had negative associations with fatigue, soreness, and sleep, while increased recovery days were associated with improved soreness scores. Increases in hours travelled and days spent away from home over 10 days were associated with increased sleep quality and duration. Cumulative training load over 3 and 10 days, hours travelled and days away from home city were all associated with changes in CMJ performance during the eccentric phase. The interaction of on-court and travel-related stressors combined with individual factors is complex, meaning that multiple athlete response measures are needed to understand fatigue and recovery cycles. Our findings support the utility of the response measures presented (i.e., CMJ and AROMs), but this is not an exhaustive battery and practitioners should consider what measures may best inform training periodization within the context of their environment/sport.
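The cumulative-load predictors described above are trailing-window sums of daily load. A small sketch of how such predictors can be derived from a daily load series; the daily values are hypothetical:

```python
# Trailing-window cumulative load over the previous 3 and 10 days, in the
# spirit of the predictors above. Daily loads (AU) are hypothetical.

def cumulative_load(daily_loads, window):
    """Sum of the current day and the (window - 1) preceding days."""
    out = []
    for i in range(len(daily_loads)):
        start = max(0, i - window + 1)
        out.append(sum(daily_loads[start:i + 1]))
    return out

daily = [300, 0, 450, 500, 0, 600, 350, 400, 0, 500, 550, 300]
cum_3d = cumulative_load(daily, window=3)
cum_10d = cumulative_load(daily, window=10)
```

Each day's `cum_3d` and `cum_10d` values would then enter the mixed model as fixed-effect predictors of that day's AROMs or CMJ output.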

    Blood volumes following preseason heat versus altitude: A case study of australian footballers

    © 2020 Human Kinetics, Inc. Purpose: There is debate as to which environmental intervention produces the most benefit for team sport athletes, particularly comparing heat and altitude. This quasi-experimental study aimed to compare blood volume (BV) responses with heat and altitude training camps in Australian footballers. Methods: The BV of 7 professional Australian footballers (91.8 [10.5] kg, 191.8 [10.1] cm) was measured throughout 3 consecutive spring/summer preseasons. During each preseason, players participated in altitude (year 1 and year 2) and heat (year 3) environmental training camps. Year 1 and year 2 altitude camps were in November/December in the United States, whereas the year 3 heat camp was in February/March in Australia after a full exposure to summer heat. BV, red cell volume, and plasma volume (PV) were measured at least 3 times during each preseason. Results: Red cell volume increased substantially following altitude in both year 1 (d = 0.67) and year 2 (d = 1.03), before returning to baseline 4 weeks postaltitude. Immediately following altitude, concurrent decreases in PV were observed during year 1 (d = -0.40) and year 2 (d = -0.98). With spring/summer training in year 3, BV and PV were substantially higher in January than temporally matched postaltitude measurements during year 1 (BV: d = -0.93, PV: d = -1.07) and year 2 (BV: d = -1.99, PV: d = -2.25), with year 3 total BV, red cell volume, and PV not changing further despite the 6-day heat intervention. Conclusions: We found greater BV after training throughout spring/summer conditions, compared with interrupting spring/summer exposure to train at altitude in the cold, with no additional benefits observed from a heat camp following spring/summer training.
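The effect sizes (d) reported above are standardized mean differences. One common form, Cohen's d with a pooled standard deviation, can be sketched as follows; the authors may have computed d differently, so treat this as a generic illustration rather than the study's exact method:

```python
import statistics

# Pooled-SD Cohen's d: the difference in group means divided by the
# pooled standard deviation of the two groups.

def cohens_d(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    mean_a, mean_b = statistics.fmean(group_a), statistics.fmean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)) ** 0.5
    return (mean_b - mean_a) / pooled_sd
```

By common convention, |d| around 0.2 is small, 0.5 moderate, and 0.8 large, which is why the red-cell-volume changes above (d = 0.67 and 1.03) read as substantial.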

    Collision activity during training increases total energy expenditure measured via doubly labelled water

    Purpose: Collision sports are characterised by frequent high intensity collisions that induce substantial muscle damage, potentially increasing the energetic cost of recovery. Therefore, this study investigated the energetic cost of collision-based activity for the first time across any sport. Methods: Using a randomised crossover design, six professional young male rugby league players completed two different five-day pre-season training microcycles. Players completed either a collision (COLL; 20 competitive one-on-one collisions) or non-collision (nCOLL; matched for kinematic demands, excluding collisions) training session on the first day of each microcycle, exactly seven days apart. All remaining training sessions were matched and did not involve any collision-based activity. Total energy expenditure was measured using doubly labelled water, the literature gold standard. Results: Collisions resulted in a very likely higher (4.96 ± 0.97 MJ; ES = 0.30 ± 0.07; p = 0.0021) total energy expenditure across the five-day COLL training microcycle (95.07 ± 16.66 MJ) compared with the nCOLL training microcycle (90.34 ± 16.97 MJ). The COLL training session also resulted in a very likely higher (200 ± 102 AU; ES = 1.43 ± 0.74; p = 0.007) session rating of perceived exertion and a very likely greater (-14.6 ± 3.3%; ES = -1.60 ± 0.51; p = 0.002) decrease in wellbeing 24 h later. Conclusions: A single collision training session considerably increased total energy expenditure. This may explain the large energy expenditures of collision sport athletes, which appear to exceed kinematic training and match demands. These findings suggest fuelling professional collision-sport athletes appropriately for the "muscle damage caused" alongside the kinematic "work required". Key words: Nutrition, Recovery, Contact, Rugby

    Activity patterns of free-ranging koalas (Phascolarctos cinereus) revealed by accelerometry

    An understanding of koala activity patterns is important for measuring the behavioral response of this species to environmental change, but to date has been limited by the logistical challenges of traditional field methodologies. We addressed this knowledge gap by using tri-axial accelerometer data loggers attached to VHF radio collars to examine activity patterns of adult male and female koalas in a high-density population at Cape Otway, Victoria, Australia. Data were obtained from 27 adult koalas over two 7-d periods during the breeding season: 12 in the early-breeding season in November 2010, and 15 in the late-breeding season in January 2011. Multiple 15 minute observation blocks on each animal were used for validation of activity patterns determined from the accelerometer data loggers. Accelerometry was effective in distinguishing between inactive (sleeping, resting) and active (grooming, feeding and moving) behaviors. Koalas were more active during the early-breeding season with a higher index of movement (overall dynamic body acceleration [ODBA]) for both males and females. Koalas showed a distinct temporal pattern of behavior, with most activity occurring from mid-afternoon to early morning. Accelerometry has potential for examining fine-scale behavior of a wide range of arboreal and terrestrial species
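ODBA, the movement index used above, is typically computed by subtracting a smoothed estimate of the static (gravitational) component from each accelerometer axis and summing the absolute dynamic remainders. A minimal sketch; the smoothing window is an assumption, and the study's exact processing may differ:

```python
# ODBA from tri-axial accelerometer data: estimate the static component of
# each axis with a running mean, subtract it to obtain the dynamic
# component, then sum absolute dynamic values across the three axes.

def running_mean(x, window):
    """Centred running mean; the window shrinks at the edges of the record."""
    half = window // 2
    means = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        means.append(sum(seg) / len(seg))
    return means

def odba(ax, ay, az, window=5):
    """Overall dynamic body acceleration per sample (same units as input)."""
    sx, sy, sz = (running_mean(a, window) for a in (ax, ay, az))
    return [abs(ax[i] - sx[i]) + abs(ay[i] - sy[i]) + abs(az[i] - sz[i])
            for i in range(len(ax))]
```

A perfectly still animal yields ODBA of zero at every sample, while grooming, feeding and moving produce non-zero values, which is what allows inactive and active behaviors to be distinguished.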

    A Multi-Variant, Viral Dynamic Model of Genotype 1 HCV to Assess the in vivo Evolution of Protease-Inhibitor Resistant Variants

    Variants resistant to compounds specifically targeting HCV are observed in clinical trials. A multi-variant viral dynamic model was developed to quantify the evolution and in vivo fitness of variants in subjects dosed with monotherapy of an HCV protease inhibitor, telaprevir. Variant fitness was estimated using a model in which variants were selected by competition for shared limited replication space. Fitness was represented in the absence of telaprevir by different variant production rate constants and in the presence of telaprevir by additional antiviral blockage by telaprevir. Model parameters, including rate constants for viral production, clearance, and effective telaprevir concentration, were estimated from 1) plasma HCV RNA levels of subjects before, during, and after dosing, 2) post-dosing prevalence of plasma variants from subjects, and 3) sensitivity of variants to telaprevir in the HCV replicon. The model provided a good fit to plasma HCV RNA levels observed both during and after telaprevir dosing, as well as to variant prevalence observed after telaprevir dosing. After an initial sharp decline in HCV RNA levels during dosing with telaprevir, HCV RNA levels increased in some subjects. The model predicted this increase to be caused by pre-existing variants with sufficient fitness to expand once available replication space increased due to rapid clearance of wild-type (WT) virus. The average replicative fitness estimates in the absence of telaprevir ranged from 1% to 68% of WT fitness. Compared to the relative fitness method, the in vivo estimates from the viral dynamic model corresponded more closely to in vitro replicon data, as well as to qualitative behaviors observed in both on-dosing and long-term post-dosing clinical data. The modeling fitness estimates were robust in sensitivity analyses in which the restoration dynamics of replication space and assumptions of HCV mutation rates were varied
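The core mechanism above, variants competing for shared limited replication space under variant-specific production rates and drug blockage, can be illustrated with a deliberately simplified two-variant model. This is a sketch in the spirit of the paper's model, not its actual equations or parameter estimates; all values are illustrative:

```python
# Two variants (wild-type and resistant) compete for replication space K.
# Each variant i grows at (1 - eps_i) * p_i * (1 - occupied) and is cleared
# at rate c. Strong blockage of WT (eps_wt near 1) frees replication space,
# letting a fitter resistant variant expand, as described in the abstract.

def simulate(v_wt, v_mut, p_wt=10.0, p_mut=4.0, eps_wt=0.999, eps_mut=0.5,
             c=1.0, K=1e7, dt=0.01, days=14):
    """Forward-Euler integration of the two viral loads over a dosing period."""
    for _ in range(int(days / dt)):
        occupied = (v_wt + v_mut) / K  # fraction of replication space in use
        dv_wt = (1 - eps_wt) * p_wt * (1 - occupied) * v_wt - c * v_wt
        dv_mut = (1 - eps_mut) * p_mut * (1 - occupied) * v_mut - c * v_mut
        v_wt += dv_wt * dt
        v_mut += dv_mut * dt
    return v_wt, v_mut

# WT collapses under dosing while the resistant variant expands into the
# freed replication space, mimicking the on-treatment rebound described above.
wt_end, mut_end = simulate(v_wt=1e6, v_mut=1e2)
```

In the full model, these rate constants are estimated by fitting plasma HCV RNA levels and variant prevalence data rather than being assigned by hand.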

    Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative method using a sphingosine-1-phosphate (S1P) receptor as a model. Because of a lack of suitable binding studies, it has been difficult to study S1P receptor internalisation. Using an N-terminal HisG-tag, S1P1 receptors on the cell membrane can be visualised via immunocytochemistry with a specific anti-HisG antibody. S1P-induced internalisation was concentration dependent and was quantified using a microplate reader, detecting either absorbance, a fluorescent or luminescent signal, depending on the antibodies used. Among those, the fluorescence detection method was the most convenient to use. The relative ease of this method makes it suitable to measure a large number of data points, e.g. to compare the potency and efficacy of receptor ligands.
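A plate-reader internalisation assay like the one above ultimately reduces to comparing the surface-receptor signal of agonist-treated wells with vehicle controls. A possible quantification step, with the function name and background handling as illustrative assumptions rather than the paper's exact analysis:

```python
# Percent internalisation from plate-reader signals: the fraction of
# surface receptors remaining after agonist treatment, relative to a
# vehicle control, subtracted from 1. Background subtraction is assumed.

def percent_internalised(treated_signal, control_signal, background=0.0):
    """Percent of surface receptors internalised after agonist exposure."""
    surface_fraction = (treated_signal - background) / (control_signal - background)
    return (1.0 - surface_fraction) * 100.0
```

Computing this across an agonist concentration series gives the concentration-dependent internalisation curves from which potency and efficacy of ligands can be compared.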

    Is Body Fat a Predictor of Race Time in Female Long-Distance Inline Skaters?

    Purpose: The aim of this study was to evaluate predictor variables of race time in female ultra-endurance inline skaters in the longest inline race in Europe. Methods: We investigated the association between anthropometric and training characteristics and race time for 16 female ultra-endurance inline skaters at the longest inline marathon in Europe, the 'Inline One-eleven' over 111 km in Switzerland, using bi- and multivariate analysis. Results: The mean (SD) race time was 289.7 (54.6) min. The bivariate analysis showed that body height (r = 0.61), leg length (r = 0.61), number of weekly inline skating training sessions (r = -0.51) and duration of each training unit (r = 0.61) were significantly correlated with race time. Stepwise multiple regression revealed that body height, duration of each training unit, and age were the best variables to predict race time. Conclusion: Race time in ultra-endurance inline races such as the 'Inline One-eleven' over 111 km might be predicted by the following equation (r² = 0.65): Race time (min) = -691.62 + 521.71 × body height (m) + 0.58 × duration of each training unit (min) + 1.78 × age (yr) for female ultra-endurance inline skaters.
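The published regression can be restated directly as a prediction function; the example inputs below are hypothetical, not a skater from the study:

```python
# Race-time prediction equation from the abstract above (r^2 = 0.65).
# Inputs: body height in metres, mean duration of each training unit in
# minutes, and age in years; output is predicted race time in minutes.

def predicted_race_time_min(height_m, training_unit_min, age_yr):
    return (-691.62 + 521.71 * height_m
            + 0.58 * training_unit_min + 1.78 * age_yr)

# e.g. a hypothetical 1.70 m, 35-year-old skater with 120 min training units
estimate = predicted_race_time_min(1.70, 120, 35)
```

Note the model explains only 65% of the variance in race time, so predictions for individual skaters carry substantial uncertainty.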