
    A Comparative Study of Modified and Unmodified Algae (Pediastrum boryanum) for Removal of Lead, Cadmium and Copper in Contaminated Water

    The presence of heavy metals in water is of concern due to the risk of toxicity, so their removal is necessary for the safety of consumers. Methods applied for the removal of heavy metals include adsorption, membrane filtration and co-precipitation; studies have shown adsorption to be a highly effective technique. However, most adsorbents are expensive or require extensive processing before use, hence the need to explore possible sources of inexpensive adsorbents. This work investigated the use of an algal biomass (Pediastrum boryanum), in both raw and modified forms, as an adsorbent for the removal of lead, cadmium and copper from wastewater. The samples were characterized by FTIR, which confirmed successful modification with tetramethylethylenediamine (TMEDA). Sorption parameters were optimized and the material was finally applied to real water samples. Sorption was best at lower pH values (4.2-6.8). Sorption kinetics were fast: more than 90% of the metals were removed from solution within 30 minutes. The adsorption of copper fitted the Langmuir isotherm, indicating a monolayer binding mechanism, whereas cadmium and lead fitted best to the Freundlich isotherm. Sorption of lead and cadmium followed pseudo-second-order kinetics, consistent with a multisite interaction, whereas copper followed pseudo-first-order kinetics, indicating single-site adsorption. The adsorption capacity did not improve upon modification, but the stability of the material improved and the secondary pollution of leaching colour was alleviated. This implies that the modified material is suitable for the removal of metals from water.
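
The Langmuir/Freundlich comparison described in this abstract is commonly done by fitting the linearized forms of the two isotherms. The sketch below illustrates that procedure; the equilibrium data are hypothetical (the abstract does not report the raw measurements), and the fitted constants are purely illustrative.

```python
import math

# Hypothetical equilibrium data (assumed for illustration):
# Ce = equilibrium concentration (mg/L), qe = amount adsorbed (mg/g).
Ce = [1.0, 2.5, 5.0, 10.0, 20.0, 40.0]
qe = [3.0, 6.0, 9.1, 12.4, 14.5, 16.0]

def linfit(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Linearized Langmuir: Ce/qe = Ce/qmax + 1/(KL*qmax)
slope, intercept = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax = 1 / slope            # monolayer capacity (mg/g)
KL = slope / intercept      # Langmuir constant (L/mg)
print(f"Langmuir: qmax={qmax:.1f} mg/g, KL={KL:.3f} L/mg")

# Linearized Freundlich: log qe = log KF + (1/n) * log Ce
slope_f, intercept_f = linfit([math.log10(c) for c in Ce],
                              [math.log10(q) for q in qe])
KF, n = 10 ** intercept_f, 1 / slope_f
print(f"Freundlich: KF={KF:.2f}, n={n:.2f}")
```

Comparing the goodness of fit of the two linearizations (e.g. by R²) is how a monolayer (Langmuir) versus heterogeneous-surface (Freundlich) mechanism would typically be distinguished.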

    Parasite co-infections and their impact on survival of indigenous cattle

    In natural populations, individuals may be infected with multiple distinct pathogens at a time. These pathogens may act independently or interact with each other and the host through various mechanisms, with varying outcomes for host health and survival. To study the effects of pathogens and their interactions on host survival, we followed 548 zebu cattle during their first year of life, determining their infection and clinical status every 5 weeks. Using a combination of clinical signs observed before death, laboratory diagnostic test results, gross lesions on post-mortem examination, histopathology results and survival analysis techniques, the cause-specific aetiology of each death was determined, together with the effect of co-infections on the observed mortality patterns. East Coast fever (ECF), caused by the protozoan parasite Theileria parva, and haemonchosis were the most important diseases associated with calf mortality, together accounting for over half (52%) of all deaths due to infectious diseases. Co-infection with Trypanosoma species increased the hazard of ECF death 6-fold (95% CI: 1.4-25). In addition, the hazard of ECF death was increased in the presence of strongyle eggs, and this effect was burden dependent: an increase of 1,000 strongyle eggs per gram of faeces was associated with a 1.5-fold (95% CI: 1.4-1.6) increase in the hazard of ECF mortality. Deaths due to haemonchosis were also burden dependent, with a 70% increase in the hazard of death for every increase of 1,000 strongyle eggs per gram. These findings have important implications for disease control strategies, suggesting the need to consider co-infections in epidemiological studies rather than focusing on single pathogens, and the benefits of an integrated approach to helminth and East Coast fever control.
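
The burden-dependent hazard ratios quoted above come from a proportional-hazards framework, where hazards scale multiplicatively with the covariate. The sketch below only illustrates that interpretation using the point estimates reported in the abstract (1.5 per 1,000 epg for ECF, a 70% increase per 1,000 epg for haemonchosis); the worked multipliers are illustrative, not results from the study.

```python
import math

# Hazard ratios reported in the abstract, per 1,000 strongyle eggs per gram (epg)
HR_ECF_PER_1000_EPG = 1.5    # ECF mortality
HR_HAEM_PER_1000_EPG = 1.7   # haemonchosis mortality (70% increase)

def hazard_multiplier(hr_per_unit: float, units: float) -> float:
    """In a Cox model, hazards compound multiplicatively:
    HR(k units) = HR(1 unit)**k = exp(beta * k), where hr_per_unit = exp(beta)."""
    beta = math.log(hr_per_unit)
    return math.exp(beta * units)

# Illustrative: a calf carrying 3,000 epg versus 0 epg
print(round(hazard_multiplier(HR_ECF_PER_1000_EPG, 3), 3))   # 3.375 (= 1.5**3)
print(round(hazard_multiplier(HR_HAEM_PER_1000_EPG, 2), 3))  # 2.89 (= 1.7**2)
```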

    Optimization of the SARS-CoV-2 ARTIC network V4 primers and whole genome sequencing protocol

    Introduction: The ARTIC Network's primer set and amplicon-based protocol is one of the most widely used SARS-CoV-2 sequencing protocols. An update to the V3 primer set was released on 18th June 2021 to address amplicon drop-off observed in the Delta variant of concern. Here, we report an in-house optimization of a modified version of the ARTIC Network V4 protocol that improves SARS-CoV-2 genome recovery in instances where the original V4 pooling strategy was characterized by amplicon drop-offs. Methods: We utilized a matched set of 43 clinical samples and serially diluted positive controls that were amplified with the ARTIC V3, V4 and optimized V4 primers and sequenced on a GridION from Oxford Nanopore Technologies. Results: We observed a 0.5% to 46% increase in genome recovery in 67% of the samples when using the original V4 pooling strategy compared to the V3 primers. Amplicon drop-offs at primer positions 23 and 90 were observed for all variants and positive controls. When using the optimized protocol, we observed a 60% improvement in genome recovery across all samples and an increase in the average depth of amplicons 23 and 90. Consequently, ≥95% of the genome was recovered in 72% (n = 31) of the samples. However, only 60–70% of the genome could be recovered in some samples, and there was no significant (p > 0.05) correlation between Ct value and genome recovery. Conclusion: Utilizing the ARTIC V4 primers while increasing the primer concentrations for amplicons with drop-offs or low average read depth greatly improves genome recovery of the Alpha, Beta, Delta, Eta and non-VOC/non-VOI SARS-CoV-2 variants.
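
The "percent genome recovery" figures above are typically computed as the fraction of reference positions with read depth above a masking threshold. The sketch below shows that calculation; the 20x threshold and the toy depth profile are assumptions for illustration, not values from this study.

```python
# Sketch: percent genome recovery from a per-position read-depth profile.
# Positions below the depth threshold would be masked as 'N' in the consensus.
GENOME_LENGTH = 29_903   # SARS-CoV-2 reference genome length (NC_045512.2)
MIN_DEPTH = 20           # assumed masking threshold for illustration

def genome_recovery(depths, min_depth=MIN_DEPTH):
    """Percentage of genome positions with coverage at or above min_depth."""
    covered = sum(1 for d in depths if d >= min_depth)
    return 100.0 * covered / len(depths)

# Toy profile: an amplicon drop-out appears as a run of near-zero depth
depths = [150] * 25_000 + [2] * 1_500 + [80] * 3_403
print(f"{genome_recovery(depths):.1f}% of positions at >= {MIN_DEPTH}x")
```

Boosting the primer concentration for a dropped-out amplicon raises the depth of that low-coverage run, which is what lifts the recovery percentage in the optimized protocol.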

    Design and descriptive epidemiology of the Infectious Diseases of East African Livestock (IDEAL) project, a longitudinal calf cohort study in western Kenya

    BACKGROUND: There is a widely recognised lack of baseline epidemiological data on the dynamics and impacts of infectious cattle diseases in east Africa. The Infectious Diseases of East African Livestock (IDEAL) project is an epidemiological study of cattle health in western Kenya, with the aim of providing baseline epidemiological data and investigating the impact of different infections on key responses such as growth, mortality and morbidity, the additive and/or multiplicative effects of co-infections, and the influence of management and genetic factors. A longitudinal cohort study of newborn calves was conducted in western Kenya between 2007 and 2009. Calves were randomly selected from all those reported in a two-stage clustered sampling strategy and were recruited between 3 and 7 days old. A team of veterinarians and animal health assistants carried out 5-weekly clinical visits and postmortem examinations. Blood and tissue samples were collected at all visits and screened, using a range of laboratory-based diagnostic methods, for over 100 different pathogens or infectious exposures. RESULTS: The study followed the 548 calves over the first 51 weeks of life or until death, with additional visits when calves were reported clinically ill. The cohort experienced a high all-cause mortality rate of 16%, with at least 13% of deaths due to infectious diseases. Only 307 (6%) of routine visits were classified as clinical episodes, with a further 216 reported by farmers; 54% of calves reached one year without a reported clinical episode. Mortality was mainly due to East Coast fever, haemonchosis and heartwater. Over 50 pathogens were detected in this population, with exposure to a further 6 viruses and bacteria. CONCLUSION: The IDEAL study has demonstrated that it is possible to mount population-based longitudinal animal studies. The results quantify, for the first time in an animal population, the high diversity of pathogens a population may have to deal with and the levels of co-infection with key pathogens such as Theileria parva. This study highlights the need to develop new systems-based approaches to studying pathogens in their natural settings, to understand the impacts of co-infections on clinical outcomes and to develop relevant evidence-based interventions.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
    CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Measuring training effectiveness of laboratory biosafety program offered at African Center for Integrated Laboratory Training in 22 President’s Emergency Plan for AIDS Relief supported countries (2008–2014)

    Abstract Introduction The African Center for Integrated Laboratory Training (ACILT) in Johannesburg, South Africa offered a laboratory biosafety program to improve laboratory biosafety practices in 22 President’s Emergency Plan for AIDS Relief (PEPFAR) supported countries. This manuscript evaluates the transference of newly gained knowledge and skills to the participants’ places of employment for HIV and TB diagnostic laboratory programs. It also serves as a follow-on to a previously published manuscript that measured training effectiveness for all courses offered at ACILT. Methods ACILT offered 20 Laboratory Biosafety and Infrastructure courses (2008–2014), also referred to as the biosafety course, comprising 14 core laboratory safety elements, to 402 participants from 22 countries. In 2015, participants received 22 e-questions divided into four categories: (1) Safety Policies, (2) Management’s Engagement, (3) Safety Programs and (4) Assessments of Safety Practices, to determine retrospectively the effectiveness of biosafety training at their place of employment 6 months before and after attending the course. We used the Kirkpatrick model to assess the transference of knowledge and skills and the factors obstructing it. Results Twenty percent (81/402) of the participants completed the e-questionnaire. The overall percentage of positive responses, indicating implementation of new safety practices, increased from 50% to 84%. Improvement occurred in all four categories after attending the course, with the greatest increases in Safety Policies (67–94%) and Safety Programs (43–91%). Creating a safety committee, allocating resources, and establishing a facility safety policy were important drivers for implementing and maintaining laboratory safety practices. In addition, accredited laboratories and countries with national safety regulations or policies showed a higher percentage of improvement. The most frequently reported challenges were inadequate funding and lack of management enforcement. Conclusions PEPFAR’s and other partners’ investments in training institutions such as ACILT were effective in building sustainable country ownership to strengthen biosafety practices, and were leveraged to combat zoonotic diseases and COVID-19. Although support continues at the national/regional level, a standardized, coordinated and continent-wide sustainable approach to offering a biosafety program like ACILT’s is missing. Continuous offerings of biosafety programs similar to ACILT’s could contribute to sustainable strengthening of laboratory biosafety, quality management systems (QMS) and pandemic preparedness.

    Active Tuberculosis Is Associated with Worse Clinical Outcomes in HIV-Infected African Patients on Antiretroviral Therapy

    <div><h3>Objective</h3><p>This cohort study utilized data from a large HIV treatment program in western Kenya to describe the impact of active tuberculosis (TB) on clinical outcomes among African patients on antiretroviral therapy (ART).</p> <h3>Design</h3><p>We included all patients initiating ART between March 2004 and November 2007. Clinical (signs and symptoms), radiological (chest radiographs) and laboratory (mycobacterial smears, culture and tissue histology) criteria were used to record the diagnosis of TB disease in the program’s electronic medical record system.</p> <h3>Methods</h3><p>We assessed the impact of TB disease on mortality, loss to follow-up (LTFU) and incident AIDS-defining events (ADEs) using Cox models, and CD4 cell and weight responses to ART using non-linear mixed models.</p> <h3>Results</h3><p>We studied 21,242 patients initiating ART, 5,186 (24%) of them with TB; 62% were female, and the median age was 37 years. There were proportionately more men in the active TB group (46%) than in the non-TB group (35%). Adjusting for baseline HIV-disease severity, TB patients were more likely to die (hazard ratio [HR] = 1.32, 95% CI: 1.18–1.47) or have incident ADEs (HR = 1.31, 95% CI: 1.19–1.45). They had lower median CD4 cell counts (77 versus 109 cells/µl), lower weight (52.5 versus 55.0 kg) and higher ADE risk at baseline (CD4-adjusted odds ratio = 1.55, 95% CI: 1.31–1.85). ART adherence was similarly good in both groups. Adjusting for gender and baseline CD4 cell count, TB patients experienced a virtually identical rise in CD4 counts after ART initiation as those without TB. However, the overall CD4 count at one year was lower among patients with TB (251 versus 269 cells/µl).</p> <h3>Conclusions</h3><p>Clinically detected TB disease is associated with greater mortality and morbidity despite a salutary response to ART. The data suggest that identifying HIV patients co-infected with TB earlier in the HIV-disease trajectory may not fully address TB-related morbidity and mortality.</p></div>

    Progress in scale up of HIV viral load testing in select sub-Saharan African countries 2016–2018

    Introduction We assessed progress in the scale-up of HIV viral load (VL) testing across seven sub-Saharan African (SSA) countries and discuss challenges and strategies for improving VL coverage among patients on antiretroviral therapy (ART). Methods A retrospective review of VL testing was conducted in Côte d’Ivoire, Kenya, Lesotho, Malawi, Namibia, Tanzania, and Uganda from January 2016 through June 2018. Data collected included the cumulative number of ART patients, the number of patients with ≥1 VL test result within the preceding 12 months, the percentage of VL test results indicating viral suppression, and the mean turnaround time for VL testing. Results Between 2016 and 2018, the proportion of people living with HIV (PLHIV) on ART increased in all seven countries (range 5.7%–50.2%). During the same period, the cumulative number of patients with one or more VL tests increased from 22,996 to 917,980. Overall, viral suppression rates exceeded 85% in all countries except Côte d’Ivoire, at 78% by June 2018. Reported turnaround times for VL test results improved in 5 of the 7 countries, by between 5.4 and 27.5 days. Conclusions These data demonstrate that remarkable progress has been made in the scale-up of HIV VL testing in the seven SSA countries.

    Results from the piece-wise linear model for CD4 response after ART initiation.

    <p>1. Confidence intervals were derived based on the delta method of approximation, using estimates of the variance of the parameters produced by the GEE model. Note that the overall differences in CD4 counts between the TB and non-TB groups are not evident among the three CD4 groupings, suggesting that differences in CD4 response are a function of higher rates of baseline immunosuppression among TB patients rather than an independent TB-associated effect.</p>