
    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.

    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).

    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.

    CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Physiological responses to ball-drills in regional level male basketball players.

    Abstract The main aim of this study was to assess the physiological responses of male basketball players during usual basketball ball-drills. Fourteen male basketball players (age 18.9 ± 2.3 years) performed the following full-court (28 × 15 m) basketball ball-drills (3 × 4-min with 3-min passive rest): five-a-side (5v5), three-a-side (3v3) and two-a-side (2v2). A main effect (P < 0.0001) for ball-drill modes was evident for all variables (5v5 < 3v3 < 2v2). Mean V̇O₂ during the 5v5, 3v3 and 2v2 was 39.0 ± 7.2, 42.0 ± 7.5 and 45.0 ± 6.5 ml·kg⁻¹·min⁻¹ (69 ± 11, 74 ± 12 and 79 ± 11% of V̇O₂peak) respectively (5v5 = 3v3 < 2v2, P < 0.001). Mean blood-lactate concentrations for 5v5, 3v3 and 2v2 were 4.2 ± 1.8, 6.2 ± 2.3 and 7.8 ± 1.2 mmol·l⁻¹ respectively (5v5 < 3v3 < 2v2, P < 0.01). During the 5v5, 3v3 and 2v2, mean heart rate (HR) was 84.0 ± 9.2, 88.0 ± 8.4 and 92.0 ± 5.6% of the individual peak respectively (5v5 < 3v3 < 2v2; P < 0.001). No significant differences were found between the regression-line slope (P = 0.86) and intercept (P = 0.45) of the HR–V̇O₂ relationships of the multistage maximal fitness test (r² from 0.80 to 0.96, P < 0.001) and ball-drill (r² from 0.70 to 0.95, P < 0.001) conditions. Reducing the number of players on the same playing court increased the physiological demands. The 2v2 condition provided responses in the range of those reported to improve aerobic and anaerobic fitness. The aerobic demands of ball-drills can be accurately assessed using heart-rate monitoring in basketball.
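    The closing claim of this abstract rests on a standard approach: fit an individual HR–V̇O₂ regression from a graded (multistage) test, then use heart rate recorded during drills to estimate oxygen uptake. A minimal sketch of that estimation follows; all numbers are illustrative, not the study's data.

    ```python
    # Hedged sketch: estimating drill VO2 from heart rate via an individual
    # HR-VO2 regression, as described in the abstract. Data are invented.

    def linear_fit(xs, ys):
        """Ordinary least-squares slope and intercept for y = slope*x + intercept."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        sxx = sum((x - mean_x) ** 2 for x in xs)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        slope = sxy / sxx
        return slope, mean_y - slope * mean_x

    # Illustrative HR (beats/min) and VO2 (ml/kg/min) pairs from one player's
    # multistage maximal fitness test.
    hr_test = [110, 130, 150, 170, 190]
    vo2_test = [20.0, 28.0, 36.0, 44.0, 52.0]

    slope, intercept = linear_fit(hr_test, vo2_test)

    # Mean HR recorded during a drill (hypothetical value).
    drill_hr = 160
    estimated_vo2 = slope * drill_hr + intercept
    print(f"Estimated drill VO2: {estimated_vo2:.1f} ml/kg/min")
    ```

    The abstract's finding that test and drill regressions share slope and intercept is what licenses applying the test-derived line to drill heart rates.
    
    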

    Time–motion analysis and physiological data of elite under‐19‐year‐old basketball players during competition

    The physical demands of modern basketball were assessed by investigating 38 elite under‐19‐year‐old basketball players during competition. Computerised time–motion analyses were performed on 18 players of various positions. Heart rate was recorded continuously for all subjects. Blood was sampled before the start of each match, at half time and at full time to determine lactate concentration. Players spent 8.8% (1%), 5.3% (0.8%) and 2.1% (0.3%) of live time in high “specific movements”, sprinting and jumping, respectively. Centres spent significantly lower live time competing in high‐intensity activities than guards (14.7% (1%) v 17.1% (1.2%); p<0.01) and forwards (16.6% (0.8%); p<0.05). The mean (SD) heart rate during total time was 171 (4) beats/min, with a significant difference (p<0.01) between guards and centres. Mean (SD) plasma lactate concentration was 5.49 (1.24) mmol/l, with concentrations at half time (6.05 (1.27) mmol/l) being significantly (p<0.001) higher than those at full time (4.94 (1.46) mmol/l). The changes to the rules of basketball have slightly increased the cardiac efforts involved during competition. The game intensity may differ according to the playing position, being greatest in guards

    Training Mode’s Influence on the Relationships between Training-Load Models During Basketball Conditioning

    Purpose: To compare perceptual and physiological training load responses during various basketball training modes. Methods: Eight semi-professional male basketball players (age: 26.3 ± 6.7 years; height: 188.1 ± 6.2 cm; body mass: 92.0 ± 13.8 kg) were monitored across a 10-week period in the preparatory phase of the training plan. Player session ratings of perceived exertion (sRPE) and heart rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and two HR-based models, the training impulse (TRIMP) and summated-heart-rate-zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model. Results: Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP: r = 0.53, P < 0.05; sRPE-SHRZ: r = 0.75, P < 0.05) and tactical/game-play conditioning (sRPE-TRIMP: r = 0.60, P < 0.05; sRPE-SHRZ: r = 0.63, P < 0.05) than during specific conditioning (sRPE-TRIMP: r = 0.38, P < 0.05; sRPE-SHRZ: r = 0.52, P < 0.05). Further, the sRPE model detected greater increases (126-429 AU) in training load than the TRIMP (15-65 AU) and SHRZ (27-170 AU) models when transitioning between training modes. Conclusions: While the training load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.
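    The three load models compared in this abstract have standard formulations in the training-load literature. The abstract does not give the study's exact implementations, so the sketch below uses the commonly cited forms: Foster's session-RPE (CR-10 rating × duration), Banister's TRIMP (male exponential weighting), and Edwards' summated HR zones; all session values are invented for illustration.

    ```python
    # Hedged sketch of the three training-load models named in the abstract.
    # Formulas are the commonly published versions, not necessarily the
    # study's exact implementations; session values below are hypothetical.
    import math

    def srpe(rpe_cr10, duration_min):
        """Session-RPE load (AU): CR-10 rating x session duration in minutes."""
        return rpe_cr10 * duration_min

    def banister_trimp(duration_min, hr_ex, hr_rest, hr_max):
        """Banister's TRIMP (AU), male weighting factor 0.64 * e^(1.92x)."""
        x = (hr_ex - hr_rest) / (hr_max - hr_rest)  # HR reserve fraction
        return duration_min * x * 0.64 * math.exp(1.92 * x)

    def shrz(minutes_in_zones):
        """Edwards' summated-HR-zones (AU): minutes in zones 1-5 x zone weight."""
        return sum(w * t for w, t in enumerate(minutes_in_zones, start=1))

    session_load = srpe(7, 60)                    # RPE 7, 60-min session
    trimp_load = banister_trimp(60, 160, 60, 190) # mean HR 160, rest 60, max 190
    shrz_load = shrz([5, 10, 20, 15, 10])         # minutes spent in zones 1..5
    ```

    Because sRPE folds perceived effort into a single rating while the HR models integrate only cardiac strain, the three can diverge for the same session, which is consistent with the weaker correlations the study reports during specific conditioning drills.
    
    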