
    Testing the assumptions of an indicator of unmet need for obstetric surgery in Ghana: A cross-sectional study of linked hospital and population-based delivery data.

    BACKGROUND: The Unmet Obstetric Need (UON) indicator has been widely used to estimate unmet need for life-saving surgery at birth; however, its assumptions have not been verified. The objective of this study was to test two UON assumptions: (a) absolute maternal indications (AMIs) require surgery for survival and (b) 1%-2% of deliveries develop AMIs, implying that rates of surgery for AMIs below this threshold indicate excess mortality from these complications. METHODS: We used linked hospital and population-based data in central Ghana. Among hospital deliveries, we calculated the percentage of deliveries with AMIs that received surgery, and mortality among women with AMIs who did not. At the population level, we assessed whether the percentage of deliveries with surgery for AMIs was inversely associated with mortality from these complications, stratified by education. RESULTS: A total of 380 of 387 (98%) hospital deliveries with recorded AMIs received surgery; an additional eight women with no AMI diagnosis died of AMI-related causes. Among the 50 148 deliveries in the population, surgeries for AMIs increased from 0.6% among women with no education to 1.9% among women with post-secondary education (P < .001). However, there was no association between AMI-related mortality and education (P = .546). Estimated AMI prevalence was 0.84% (95% CI: 0.76%-0.92%), below the assumed 1% minimum threshold. DISCUSSION: Obstetric providers consider AMIs absolute indications for surgery. However, low rates of surgery for AMIs among less educated women were not associated with higher mortality. The UON indicator should be used with caution when estimating unmet need for life-saving obstetric surgery; innovative approaches are needed to identify unmet need in the context of rising cesarean rates.
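    As a quick illustration of the arithmetic behind the prevalence estimate above, the sketch below computes a proportion with a normal-approximation 95% CI. This is not the authors' code; the case count is a hypothetical value chosen only to roughly reproduce the reported 0.84% point estimate.

    ```python
    # Minimal sketch (not the study's code): prevalence of absolute maternal
    # indications (AMIs) with a normal-approximation 95% CI.
    import math

    def prevalence_ci(cases: int, deliveries: int, z: float = 1.96):
        """Return (prevalence, lower, upper) as proportions."""
        p = cases / deliveries
        se = math.sqrt(p * (1 - p) / deliveries)
        return p, p - z * se, p + z * se

    # Hypothetical case count; 50 148 deliveries is the population size reported
    p, lo, hi = prevalence_ci(cases=420, deliveries=50_148)
    print(f"AMI prevalence: {p:.2%} (95% CI: {lo:.2%}-{hi:.2%})")
    ```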

    Implementing effective community-based surveillance in research studies of maternal, newborn and infant outcomes in low resource settings

    BACKGROUND: Globally adopted health and development milestones have encouraged not only improvements in the health and wellbeing of women and infants worldwide, but also a better understanding of the epidemiology of key outcomes and the development of effective interventions in these vulnerable groups. Monitoring maternal and child health outcomes for milestone tracking requires the collection of good-quality data over the long term, which can be particularly challenging in poorly resourced settings. Despite the wealth of general advice on conducting field trials, there is a lack of specific guidance on designing and implementing studies of mothers and infants. Additional considerations are required when establishing surveillance systems to capture real-time information at scale on pregnancies, pregnancy outcomes, and maternal and infant health outcomes. MAIN BODY: Based on two decades of collaborative research experience between the Kintampo Health Research Centre in Ghana and the London School of Hygiene and Tropical Medicine, we propose a checklist of key items to consider when designing and implementing systems for pregnancy surveillance and the identification and classification of maternal and infant outcomes in research studies. These are summarised under four key headings: understanding your population; planning data collection cycles; enhancing routine surveillance with additional data collection methods; and designing data collection and management systems that are adaptable in real time. CONCLUSION: High-quality population-based research studies in low-resource communities are essential to ensure continued improvement in health metrics and a reduction in inequalities in maternal and infant outcomes. We hope that the lessons learnt described in this paper will help researchers when planning and implementing their studies.

    Impact of adversity on early childhood growth & development in rural India: Findings from the early life stress sub-study of the SPRING cluster randomised controlled trial (SPRING-ELS)

    INTRODUCTION: Early childhood development is key to achieving the Sustainable Development Goals and can be negatively influenced by many different adversities, including violence in the home, neglect, abuse and parental ill-health. We set out to quantify the extent to which multiple adversities are associated with impaired early childhood growth and development. METHODS: This was a substudy of the SPRING cluster randomised controlled trial covering the whole population of 120 villages in rural India. We assessed all children born from 18 June 2015 for adversities in the first year of life and summed these to make a total cumulative adversity score and four subscale scores. We assessed the association of each of these with weight-for-age z-score, length-for-age z-score, and the motor, cognitive and language developmental scales of the Bayley Scales of Infant Development III assessed at 18 months. RESULTS: We enrolled 1726 children soon after birth and assessed 1273 of these at both 12 and 18 months of age. There were consistent, strongly negative relationships between all measures of childhood adversity and all five growth and development outcome measures at 18 months of age. For the Bayley motor scale, each additional adversity was associated with a 1.1-point decrease (95% CI -1.3, -0.9); for the cognitive scale the decrease was 0.8 points (95% CI -1.0, -0.6); and for the language scale it was 1.4 points (95% CI -1.9, -1.1). Similarly, for growth, each additional adversity was associated with a -0.09 change in weight-for-age z-score (-0.11, -0.06) and a -0.12 change in height-for-age z-score (-0.14, -0.09). DISCUSSION: Our results are the first from a large population-based study in a low/middle-income country to show that each additional adversity across multiple domains increases risk to child growth and development at a very early age. There is an urgent need to act to improve these outcomes for young children in LMICs, and these findings suggest that early childhood programmes should prioritise early childhood adversity because of its impact on developmental inequities from the very start.
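    To make the adversity-score analysis concrete, here is a minimal sketch (not the trial's analysis code) that sums hypothetical binary adversity indicators into a cumulative score and recovers the per-adversity change in a simulated Bayley-III motor score with ordinary least squares. All data are simulated; only the 1.1-point effect size is taken from the abstract.

    ```python
    # Minimal sketch (assumed, not SPRING-ELS code): cumulative adversity score
    # and its estimated association with a developmental outcome via simple OLS.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1273  # children assessed at both 12 and 18 months, per the abstract

    # Hypothetical binary indicators (e.g. violence, neglect, parental ill-health)
    adversities = rng.integers(0, 2, size=(n, 6))
    score = adversities.sum(axis=1)  # cumulative adversity score, 0-6

    # Simulated outcome: ~1.1-point drop per adversity, as reported for the motor scale
    motor = 100 - 1.1 * score + rng.normal(0, 10, n)

    slope, intercept = np.polyfit(score, motor, 1)
    print(f"Estimated change per additional adversity: {slope:.2f} points")
    ```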

    Vaccination timing of low-birth-weight infants in rural Ghana: a population-based, prospective cohort study

    Objective: To investigate delays in first and third dose diphtheria–tetanus–pertussis (DTP1 and DTP3) vaccination in low-birth-weight infants in Ghana, and the associated determinants. Methods: We used data from a large, population-based vitamin A trial in 2010–2013, with 22 955 enrolled infants. We measured vaccination rates and maternal and infant characteristics and compared three categories of low-birth-weight infants (2.0–2.4 kg; 1.5–1.9 kg; and <1.5 kg) with infants weighing ≥2.5 kg. Poisson regression was used to calculate vaccination rate ratios for DTP1 at 10, 14 and 18 weeks after birth, and for DTP3 at 18, 22 and 24 weeks (equivalent to 1, 2 and 3 months after the respective vaccination due dates of 6 and 14 weeks). Findings: Compared with non-low-birth-weight infants (n=18 979), those with low birth weight (n=3382) had an almost 40% lower DTP1 vaccination rate at age 10 weeks (adjusted rate ratio, aRR: 0.58; 95% confidence interval, CI: 0.43–0.77) and at age 18 weeks (aRR: 0.63; 95% CI: 0.50–0.80). Infants weighing 1.5–1.9 kg (n=386) had vaccination rates approximately 25% lower than infants weighing ≥2.5 kg at these time points. Similar results were observed for DTP3. Lower maternal age, lower educational attainment and longer distance to the nearest health facility were associated with lower DTP1 and DTP3 vaccination rates. Conclusion: Low-birth-weight infants are a high-risk group for delayed vaccination in Ghana. Efforts to improve the vaccination of these infants are warranted, alongside further research to understand the reasons for the delays.
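    The rate-ratio estimation described above can be sketched as a Poisson regression with a log person-time offset. The sketch below is an assumption about the general approach, not the study's code, and uses simulated data with an effect size of roughly the reported 40% lower rate.

    ```python
    # Minimal sketch (assumed approach): vaccination rate ratio for low- vs.
    # normal-birth-weight infants via Poisson regression with an offset.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2000
    low_bw = rng.integers(0, 2, n)                 # 1 = low birth weight
    weeks = rng.uniform(1, 10, n)                  # simulated follow-up time
    rate = 0.3 * np.where(low_bw == 1, 0.6, 1.0)   # ~40% lower rate if low BW
    vaccinated = rng.poisson(rate * weeks)         # simulated event counts

    X = sm.add_constant(low_bw.astype(float))
    model = sm.GLM(vaccinated, X, family=sm.families.Poisson(),
                   offset=np.log(weeks)).fit()
    print(f"Rate ratio (low birth weight): {np.exp(model.params[1]):.2f}")
    ```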

    Whole genome sequencing of Plasmodium falciparum from dried blood spots using selective whole genome amplification

    BACKGROUND: Translating genomic technologies into healthcare applications for the malaria parasite Plasmodium falciparum has been limited by the technical and logistical difficulties of obtaining high-quality clinical samples from the field. Sampling by dried blood spot (DBS) finger-pricks can be performed safely and efficiently, with minimal resource and storage requirements compared with venous blood (VB). Here, the use of selective whole genome amplification (sWGA) to sequence the P. falciparum genome from clinical DBS samples was evaluated, and the results compared with current methods that use leucodepleted VB. METHODS: Parasite DNA with high (>95%) human DNA contamination was selectively amplified by Phi29 polymerase using short oligonucleotide probes of 8-12 mers as primers. These primers were selected on the basis of their differential frequency of binding to the desired (P. falciparum) and contaminating (human) genomes. RESULTS: Using the sWGA method, clinical samples from 156 malaria patients were sequenced, including 120 paired samples for a head-to-head comparison of DBS and leucodepleted VB. Greater than 18-fold enrichment of P. falciparum DNA was achieved from DBS extracts. The parasitaemia threshold to achieve >5× coverage for 50% of the genome was 0.03% (40 parasites per 200 white blood cells). Over 99% SNP concordance between VB and DBS samples was achieved after excluding missing calls. CONCLUSION: The sWGA methods described here provide a reliable and scalable way of generating P. falciparum genome sequence data from DBS samples. The current data indicate that it will be possible to obtain good-quality sequence for most, if not all, drug resistance loci from the majority of symptomatic malaria patients. This technique overcomes a major limiting factor in P. falciparum genome sequencing from field samples and paves the way for large-scale epidemiological applications.
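    A minimal sketch of the primer-selection idea described in the methods: rank candidate k-mers by how much more frequently they occur in the target genome than in the background genome. The toy sequences and pseudocount are illustrative assumptions, not the published sWGA pipeline.

    ```python
    # Minimal sketch (not the published pipeline): rank short k-mers as candidate
    # sWGA primers by target-vs-background frequency ratio.
    from collections import Counter

    def kmer_counts(seq: str, k: int) -> Counter:
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def rank_primers(target: str, background: str, k: int = 8, top: int = 5):
        t, b = kmer_counts(target, k), kmer_counts(background, k)
        # Pseudocount of 1 avoids division by zero for k-mers absent in background
        ratio = {kmer: n / (b.get(kmer, 0) + 1) for kmer, n in t.items()}
        return sorted(ratio.items(), key=lambda kv: kv[1], reverse=True)[:top]

    # Toy sequences; real use would load whole-genome FASTA files
    target = "ATATATTTTAATATATTTAAATATTTATA" * 10   # AT-rich, like P. falciparum
    background = "GCGCGCATGCGCGGCATCGCGGCGCATGC" * 10
    for kmer, r in rank_primers(target, background):
        print(kmer, round(r, 1))
    ```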

    Machine learning approaches classify clinical malaria outcomes based on haematological parameters

    Background: Malaria is still a major global health burden, with more than 3.2 billion people in 91 countries remaining at risk of the disease. Accurately distinguishing malaria from other diseases, especially uncomplicated malaria (UM) from non-malarial infections (nMI), remains a challenge. Furthermore, the success of rapid diagnostic tests (RDTs) is threatened by Pfhrp2/3 deletions and decreased sensitivity at low parasitaemia. Analysis of haematological indices can be used to support the identification of possible malaria cases for further diagnosis, especially in travellers returning from endemic areas. As a new application for precision medicine, we aimed to evaluate machine learning (ML) approaches that can accurately classify nMI, UM, and severe malaria (SM) using haematological parameters. Methods: We obtained haematological data from 2,207 participants collected in Ghana: nMI (n = 978), SM (n = 526), and UM (n = 703). Six different ML approaches were tested to select the best approach. An artificial neural network (ANN) with three hidden layers was used for multi-classification of UM, SM, and nMI. Binary classifiers were developed to further identify the parameters that can distinguish UM or SM from nMI. Local interpretable model-agnostic explanations (LIME) were used to explain the binary classifiers. Results: The multi-classification model had greater than 85% training and testing accuracy in distinguishing clinical malaria from nMI. To distinguish UM from nMI, our approach identified platelet counts, red blood cell (RBC) counts, and lymphocyte counts and percentages as the top classifiers of UM, with 0.801 test accuracy (AUC = 0.866 and F1 score = 0.747). To distinguish SM from nMI, the classifier had a test accuracy of 0.96 (AUC = 0.983 and F1 score = 0.944), with mean platelet volume and mean cell volume being the unique classifiers of SM. Random forest was used to confirm the classifications, and it showed that platelet and RBC counts were the major classifiers of UM, regardless of possible confounders such as patient age and sampling location. Conclusion: The study provides proof-of-concept methods that classify UM and SM from nMI, showing that the ML approach is a feasible tool for clinical decision support. In the future, ML approaches could be incorporated into clinical decision-support algorithms for the diagnosis of acute febrile illness and for monitoring response to acute SM treatment, particularly in endemic settings.
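    As a hedged illustration of the binary-classification step, the sketch below trains a random-forest classifier (the model the authors used for confirmation) on simulated haematological features. Feature names follow the abstract; the data, effect directions and hyperparameters are assumptions.

    ```python
    # Minimal sketch (not the authors' models): random-forest binary classifier
    # separating uncomplicated malaria (UM) from non-malarial infection (nMI).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, roc_auc_score

    rng = np.random.default_rng(2)
    n = 1000
    y = rng.integers(0, 2, n)  # 1 = UM, 0 = nMI
    # Simulated features: lower platelet and RBC counts in UM, per the abstract
    platelets = rng.normal(250 - 80 * y, 50)
    rbc = rng.normal(4.8 - 0.5 * y, 0.5)
    lymphocytes = rng.normal(2.5 - 0.4 * y, 0.6)
    X = np.column_stack([platelets, rbc, lymphocytes])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
    print("feature importances:", clf.feature_importances_)
    ```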

    The Accuracy and Perception of Test-Based Management of Malaria at Private Licensed Chemical Shops in the Middle Belt of Ghana.

    The sale of artemisinin-based combination therapy (ACT) by private licensed chemical shops (LCS) without testing is contrary to current policy recommendations. This study assessed the accuracy and perception of test-based management of malaria using malaria rapid diagnostic test (mRDT) kits at private LCS in two predominantly rural areas in the middle part of Ghana. Clients presenting at LCS with fever or other signs and symptoms suggestive of malaria, in the absence of signs of severe malaria, were tested with mRDT by trained attendants and treated based on the national malaria treatment guidelines. Using structured questionnaires, exit interviews were conducted within 48 hours, and a follow-up interview was conducted on day 7 (±3 days). Focus group discussions and in-depth interviews were also conducted to assess stakeholders' perceptions of the use of mRDT at LCS. About 79.0% (N = 1,797) of clients presented with a fever. Sixty-six percent (947/1,426) of febrile clients had a positive mRDT result. Eighty-six percent (815/947) of clients with uncomplicated malaria were treated with the recommended ACT. About 97.8% (790/808) of clients with uncomplicated malaria treated with ACT were reported to be well by day 7. However, referral of those with negative mRDT results was very low (4.1%, 27/662). A high proportion of clients with a positive mRDT result received the recommended malaria treatment. Test-based management of malaria by LCS attendants was found to be feasible and acceptable to community members and other stakeholders. Successful implementation will, however, require effective referral, supervision and quality control systems.

    Targeted Next Generation Sequencing for malaria research in Africa: Current status and outlook

    Targeted Next Generation Sequencing (TNGS) is an efficient and economical Next Generation Sequencing (NGS) platform and the preferred choice when specific genomic regions are of interest. So far, only institutions located in middle- and high-income countries have developed and implemented the technology; however, the efficiency and cost savings relative to more traditional sequencing methodologies (e.g. Sanger sequencing) make the approach potentially well suited to resource-constrained regions as well. In April 2018, scientists from the Plasmodium Diversity Network Africa (PDNA) and collaborators met during the 7th Pan-African Multilateral Initiative on Malaria (MIM) conference held in Dakar, Senegal to explore the feasibility of applying TNGS to genetic studies and malaria surveillance in Africa. The group reviewed the current experience with TNGS platforms in sub-Saharan Africa (SSA) and identified potential roles the technology might play in accelerating malaria research, scientific discovery and improved public health in SSA. Research funding, infrastructure and human resources were highlighted as challenges that will have to be mitigated to enable African scientists to drive the implementation of TNGS in SSA. The current roles of important stakeholders, and strategies to strengthen existing networks to effectively harness this powerful technology for malaria research of public health importance, were discussed.

    In silico characterisation of putative Plasmodium falciparum vaccine candidates in African malaria populations.

    Genetic diversity of surface-exposed and stage-specific Plasmodium falciparum immunogenic proteins poses a major roadblock to developing an effective malaria vaccine with broad and long-lasting immunity. We conducted a prospective genetic analysis of candidate antigens (msp1, ama1, rh5, eba175, glurp, celtos, csp, lsa3, Pfsea, trap, conserved chrom3, hyp9, hyp10, phistb, surfin8.2, and surfin14.1) for malaria vaccine development on 2375 P. falciparum sequences from 16 African countries. We found signatures of balancing selection, inferred from positive values of Tajima's D, for all antigens across all populations except glurp. This could be a result of immune selection on these antigens, as positive Tajima's D values mapped to regions with putative immune epitopes. A less diverse phistb antigen was characterised, with a transmembrane domain, glycophosphatidyl anchors between the N- and C-termini, and surface epitopes that could be targets of immune recognition. This study demonstrates the value of population genetic and immunoinformatic analysis for identifying and characterising new putative vaccine candidates, towards improving strain-transcending immunity and vaccine efficacy across all endemic populations.
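    Tajima's D, the statistic used above to infer balancing selection, can be computed from an alignment as follows. This is the standard formula rather than the paper's pipeline, applied to a toy alignment; positive values indicate an excess of intermediate-frequency variants consistent with balancing selection.

    ```python
    # Minimal sketch (standard formula, not the paper's code): Tajima's D from
    # equal-length aligned sequences.
    from itertools import combinations
    import math

    def tajimas_d(seqs):
        n, L = len(seqs), len(seqs[0])
        # Segregating sites: positions with more than one allele
        S = sum(len({s[i] for s in seqs}) > 1 for i in range(L))
        if S == 0:
            return 0.0  # D is undefined without variation; 0.0 by convention here
        # Mean pairwise differences (pi)
        pairs = list(combinations(seqs, 2))
        pi = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs) / len(pairs)
        a1 = sum(1 / i for i in range(1, n))
        a2 = sum(1 / i**2 for i in range(1, n))
        b1 = (n + 1) / (3 * (n - 1))
        b2 = 2 * (n**2 + n + 3) / (9 * n * (n - 1))
        c1 = b1 - 1 / a1
        c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
        e1, e2 = c1 / a1, c2 / (a1**2 + a2)
        return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

    seqs = ["ATGCA", "ATGTA", "TTGCA", "ATGCA", "TTGTA"]  # toy alignment
    print(f"Tajima's D: {tajimas_d(seqs):.3f}")
    ```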

    Indoor residual spraying with a non-pyrethroid insecticide reduces the reservoir of Plasmodium falciparum in a high-transmission area in northern Ghana

    High-malaria-burden countries in sub-Saharan Africa are shifting from malaria control towards elimination. Hence, there is a need to gain a contemporary understanding of how indoor residual spraying (IRS) with non-pyrethroid insecticides, when combined with long-lasting insecticidal nets (LLINs) impregnated with pyrethroid insecticides, contributes to the efforts of National Malaria Control Programmes to interrupt transmission and reduce the reservoir of Plasmodium falciparum infections across all ages. Using an interrupted time-series study design, four age-stratified malariometric surveys, each of ~2,000 participants, were undertaken pre- and post-IRS in Bongo District, Ghana. Following the application of three rounds of IRS, P. falciparum transmission intensity declined, as measured by a >90% reduction in the monthly entomological inoculation rate. This decline was accompanied by reductions in parasitological parameters, with participants of all ages being significantly less likely to harbour P. falciparum infections at the end of the wet season post-IRS (aOR = 0.22 [95% CI: 0.19–0.26], p-value < 0.001). In addition, multiplicity of infection (MOIvar) was measured using a parasite fingerprinting tool designed to capture within-host genome diversity. At the end of the wet season post-IRS, the prevalence of multi-genome infections declined from 75.6% to 54.1%. This study demonstrates that in areas characterised by high seasonal malaria transmission, IRS in combination with LLINs can significantly reduce the reservoir of P. falciparum infection. Nonetheless, despite this success, 41.6% of the population, especially older children and adolescents, still harboured multi-genome infections. Given the persistence of this diverse reservoir across all ages, these data highlight the importance of sustaining vector control in combination with targeted chemotherapy to move high-transmission settings towards pre-elimination. This study also points to the benefits of molecular surveillance to ensure that incremental achievements are not lost and that the goals advocated in the WHO's High Burden to High Impact strategy are realised.
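    A minimal sketch of how an adjusted odds ratio like the one reported above could be estimated: logistic regression of infection status on a pre/post-IRS indicator, adjusting for age. The data, the simplified linear age term and the variable names are assumptions, not the survey analysis.

    ```python
    # Minimal sketch (illustrative, not the survey analysis): adjusted odds ratio
    # of P. falciparum infection post- vs. pre-IRS via logistic regression.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 4000
    post_irs = rng.integers(0, 2, n)      # 1 = surveyed after the IRS rounds
    age_group = rng.integers(0, 4, n)     # simplified ordinal age groups
    # Simulated true OR ~0.22 for post-IRS, matching the reported aOR
    logit = 0.5 + np.log(0.22) * post_irs + 0.1 * age_group
    infected = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([post_irs, age_group]).astype(float))
    fit = sm.Logit(infected, X).fit(disp=False)
    print(f"Adjusted OR (post-IRS): {np.exp(fit.params[1]):.2f}")
    ```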