
    The influence of 8 weeks of endurance training on blood lactate threshold for male handball players

    The blood lactate response to exercise is characterized by a triphasic nature. The balance between glycolytic lactate production and its metabolic clearance determines the flux of lactate in the muscles. High-intensity exercise (>80% of VO2max) has been observed to elevate glycolysis, with the gradual accumulation of lactate to steadily higher levels, which consequently leads directly to fatigue. The purpose of this study was to investigate the influence of endurance training on lactate accumulation and whether this influence depends on training intensity. Sixteen participants took part in this study (mean age 19.18 ± 1.0 years). Participants were asked to run at four different paces (9 km/h, 10.8 km/h, 12.6 km/h, and 14.4 km/h) for 60 minutes. After each exercise bout, blood lactate was immediately measured using Accusport and Polar devices. Statistically significant differences were seen in heart rate (HR), blood lactate (BL), and maximum lactate steady state (MLSS) after training (72.64 vs. 61.88, 158.22 vs. 133.19, and 13.02 vs. 8.41, respectively; p < .01). Our results suggest that endurance training for 8 weeks can improve the lactate threshold and blood lactate accumulation.
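    Pre/post comparisons such as the MLSS values above (13.02 vs. 8.41) are typically tested with a paired t statistic. A minimal sketch using only the Python standard library; the pre/post blood-lactate values below are hypothetical stand-ins for the study's data:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic: mean of within-subject differences
    divided by its standard error."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD of the differences
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical pre/post MLSS-like values (mmol/L), one pair per runner
pre = [13.5, 12.8, 13.1, 12.6, 13.4, 12.9, 13.0, 12.7]
post = [8.6, 8.2, 8.5, 8.1, 8.7, 8.3, 8.4, 8.6]
t = paired_t(pre, post)
```

The resulting t is compared against a t distribution with n − 1 degrees of freedom to obtain the p-value reported in the abstract.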

    Role of Ulva lactuca

    Seaweeds are potentially excellent sources of highly bioactive materials that could represent useful leads in the alleviation of salinity stress. The effects of presoaking wheat grains in a water extract of Ulva lactuca on growth, some enzymatic activities, and the protein pattern of salinized plants were investigated in this study. Algal presoaking of grains demonstrated a highly significant enhancement in the percentage of seed germination and in growth parameters. The activity of superoxide dismutase (SOD) and catalase (CAT) increased with increasing algal extract concentration, while the activity of ascorbate peroxidase (APX) and glutathione reductase (GR) decreased at algal extract concentrations above 1% (w/v). The protein pattern of wheat seedlings showed 12 newly formed bands as a result of algal extract treatments compared with the control. The bioactive components in U. lactuca extract, such as ascorbic acid, betaine, glutathione, and proline, could potentially participate in the alleviation of salinity stress. Therefore, algal presoaking proved to be an effective technique to improve the growth of wheat seedlings under salt stress conditions.

    Left Main Coronary Artery Revascularization in Patients with Impaired Renal Function: Percutaneous Coronary Intervention versus Coronary Artery Bypass Grafting

    Introduction: The evidence about the optimal revascularization strategy in patients with left main coronary artery (LMCA) disease and impaired renal function is limited. Thus, we aimed to compare the outcomes of LMCA disease revascularization (percutaneous coronary intervention [PCI] vs. coronary artery bypass grafting [CABG]) in patients with and without impaired renal function. Methods: This retrospective cohort study included 2,138 patients recruited from 14 centers between 2015 and 2019. We compared patients with impaired renal function who had PCI (n = 316) to those who had CABG (n = 121) and compared patients with normal renal function who had PCI (n = 906) to those who had CABG (n = 795). The study outcomes were in-hospital and follow-up major adverse cardiovascular and cerebrovascular events (MACCE). Results: Multivariable logistic regression analysis showed that the risk of in-hospital MACCE was significantly higher with CABG compared to PCI in patients with impaired renal function (odds ratio [OR]: 8.13 [95% CI: 4.19–15.76], p < 0.001) and normal renal function (OR: 2.59 [95% CI: 1.79–3.73], p < 0.001). There were no differences in follow-up MACCE between CABG and PCI in patients with impaired renal function (HR: 1.14 [95% CI: 0.71–1.81], p = 0.585) or normal renal function (HR: 1.12 [0.90–1.39], p = 0.312). Conclusions: PCI could have an advantage over CABG in revascularization of LMCA disease in patients with impaired renal function regarding in-hospital MACCE. Follow-up MACCE was comparable between PCI and CABG in patients with impaired and normal renal function.
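    The study's odds ratios come from multivariable logistic regression, but the relationship between an OR and its 95% confidence interval is easiest to see in the unadjusted (Woolf, log-based) form. A minimal sketch; the 2×2 counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf 95% CI for a 2x2 table:
    a events / b non-events in group 1, c events / d non-events in group 2."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: 30/121 events after CABG vs. 15/316 after PCI
or_, lower, upper = odds_ratio_ci(30, 91, 15, 301)
```

Multivariable adjustment shifts the point estimate, but the CI is still constructed on the log-odds scale in the same way.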

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in the provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate the numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was $95 004 million using approach 1 and $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
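    The premature-death component of a human capital estimate reduces to excess deaths (scaled from the per-100 000 rate) times an economic value per death. A minimal sketch with hypothetical inputs; the actual analysis used country-level incidence and income data:

```python
def premature_death_burden(cases, excess_mortality_per_100k, value_per_death_usd):
    """Premature-death component of a human capital estimate:
    convert a per-100 000 excess-mortality rate into deaths,
    then multiply by the economic value assigned to each death."""
    excess_deaths = cases * excess_mortality_per_100k / 100_000
    return excess_deaths * value_per_death_usd

# Hypothetical: 1 million untreated cases, approach-1 rate,
# $50 000 of lost productivity per premature death (illustrative only)
burden_usd = premature_death_burden(1_000_000, 4185, 50_000)
```

The full societal estimate would add the absenteeism component and sum across countries.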

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
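    The "bootstrapped simulation" mentioned in the Methods resamples patients with replacement to put a confidence interval on a quantity such as the mortality risk difference between groups. A minimal percentile-bootstrap sketch on synthetic 0/1 outcomes; this omits the study's multivariable adjustment and uses made-up data:

```python
import random

def bootstrap_risk_difference(group_a, group_b, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% CI for the difference in event
    proportion (group_a minus group_b); each group is a list of
    0/1 outcomes. Resamples each group with replacement."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(group_a) for _ in group_a]
        rb = [rng.choice(group_b) for _ in group_b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

# Synthetic cohorts: 20% vs. 5% mortality
group_a = [1] * 20 + [0] * 80
group_b = [1] * 5 + [0] * 95
lo, hi = bootstrap_risk_difference(group_a, group_b)
```

The percentile method makes no normality assumption, which suits the skewed event rates seen in small low-HDI subgroups.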

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Pegylated Interferon-α Modulates Liver Concentrations of Activin-A and Its Related Proteins in Normal Wistar Rat

    Aims. To measure the expression of activin βA-subunit, activin IIA and IIB receptors, Smad4, Smad7, and follistatin in the liver, and the liver and serum concentrations of mature activin-A and follistatin, in normal rats following treatment with pegylated interferon-α (Peg-INF-α) and ribavirin (RBV). Materials and Methods. 40 male Wistar rats were divided equally into 4 groups: "control," "Peg-only" receiving 4 injections of Peg-INF-α (6 µg/rat/week), "RBV-only" receiving ribavirin (4 mg/rat/day) orally, and "Peg & RBV" receiving both drugs. The expression of candidate molecules in the liver was measured by immunohistochemistry and quantitative PCR. The concentrations of mature proteins in serum and liver homogenate samples were measured using ELISA. Results. Peg-INF-α ± RBV altered the expression of all candidate molecules in the liver at the gene and protein levels (P < 0.05) and decreased activin-A and increased follistatin in serum and liver homogenates compared with the other groups (P < 0.05). There were also significant correlations between serum and liver activin-A and follistatin. Conclusion. Peg-INF-α modulates the hepatic production of activin-A and follistatin, which can be detected in serum. Further studies are needed to explore the role of Peg-INF-α on the production of activins and follistatin by the liver and immune cells.