
    Doctors’ practice and attitudes towards red blood cell transfusion at Mthatha Regional Hospital, Eastern Cape, South Africa: A mixed methods study

    Background: Unnecessary blood transfusion exposes recipients to potential harms. Aim: The aim of this study was to describe blood transfusion practice and explore doctors’ attitudes towards transfusion. Setting: A hospital providing level 1 and 2 services. Methods: A mixed-methods study design was used. In the cross-sectional descriptive component, a sample was taken from patients transfused over a 2-month period. Blood use was categorised as for medical anaemia or haemorrhage, and as appropriate or not. The qualitative component comprised a purposeful sample for focus group and individual semi-structured interviews. Results: Of 239 patients sampled, 62% were transfused for medical anaemia and 38% for haemorrhage. In the medical anaemia group, compliance with age-appropriate transfusion thresholds was 69%. In medical anaemia and haemorrhage, 114 (77%) and 85 (93.4%) of recipients had orders for ≥ 2 red blood cell (RBC) units, respectively. In adults ≥ 18 years old with medical anaemia, 47.1% of orders would have resulted in a haemoglobin (Hb) > 8 g/dL. Six doctors participated in the focus group and eleven in individual interviews. There was a lack of awareness of institutional transfusion guidelines, disagreement on appropriate RBC transfusion thresholds, and comments that more than one RBC unit should always be transfused. Factors informing decisions to transfuse included advice from senior colleagues, relieving symptoms of anaemia and high product costs. Conclusion: Most orders were for two or more units. In medical anaemia, doctors’ compliance with RBC transfusion thresholds was reasonable; however, almost half of the orders would have resulted in overtransfusion. The attitudes of the doctors sampled suggest that their transfusion practice is influenced more by institutional values than by formal guidelines.
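The overtransfusion finding rests on simple arithmetic: in an average-sized adult, one RBC unit is commonly assumed to raise haemoglobin by roughly 1 g/dL, so a standing order for two units can carry a patient well past a restrictive threshold. A minimal sketch of that check, assuming the ~1 g/dL-per-unit rule of thumb (the function names and values here are illustrative, not taken from the study):

```python
def expected_post_transfusion_hb(pre_hb_g_dl, units, rise_per_unit=1.0):
    """Estimate post-transfusion haemoglobin (g/dL).

    Uses the common approximation that one RBC unit raises Hb by
    roughly 1 g/dL in an average-sized adult; the true increment
    varies with body mass, ongoing bleeding and unit volume.
    """
    return pre_hb_g_dl + units * rise_per_unit


def order_overshoots(pre_hb_g_dl, units, threshold=8.0):
    """Flag an order whose expected result exceeds a restrictive
    threshold (8 g/dL here, matching the cut-off used above)."""
    return expected_post_transfusion_hb(pre_hb_g_dl, units) > threshold


# A two-unit order for a patient at Hb 6.5 g/dL is expected to
# reach about 8.5 g/dL, i.e. above the 8 g/dL cut-off:
print(order_overshoots(6.5, 2))  # True
```

Under this approximation, a single-unit-then-reassess policy would avoid many of the overshoots the study attributes to routine two-unit orders.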

    Evaluation of the alignment of policies and practices for state-sponsored educational initiatives for sustainable health workforce solutions in selected Southern African countries: A protocol, multimethods study

    Introduction Health systems across the world are facing challenges with shortages and maldistribution of skilled health professionals. Return-of-service (ROS) initiatives are government-funded strategies used to educate health professionals by contracting beneficiaries to undertake government work on a year-for-year basis after their qualification. It is envisaged that once they have served their contract, they will be attracted to serve in the same area or government establishment beyond the duration of their obligatory period. Little is known about the processes that led to the development and implementation of ROS policies. Furthermore, there is no systematic evaluation of the strategies that demonstrates their utility. This research aims to evaluate ROS initiatives and explore their efficacy and sustainability in five Southern African countries. Methods and analysis This study will be conducted in South Africa, Eswatini, Lesotho, Botswana and Namibia in a phased, multimethods approach comprising policy reviews and quantitative and qualitative research. First, a review will be conducted to explore current ROS schemes. Second, a quantitative retrospective cohort study of ROS scheme recipients for the period 2000-2010 will be undertaken. Information will be sourced from multiple provincial or national information systems and/or databases. Third, we will conduct semistructured group or individual interviews with senior health, education, ROS managing agency (where appropriate) and finance managers and/or policy makers in each country to determine managers' perceptions, challenges and the costs and benefits of these schemes. Fourth, we will interview or conduct group discussions with health professional regulatory bodies to assess their willingness to collaborate with ROS initiative funders.
Ethics and dissemination Ethics approval for this study was obtained through the Human Research Ethics Committees of the University of New South Wales (HC200519), Australia; South Africa and Lesotho (065/2020); Eswatini (SHR302/2020); Namibia (SK001); and Botswana (HPDME 13/18/1). Relevant findings will be shared through presentations to participating governments, publications in peer-reviewed journals and presentations at relevant conferences.

    Performance-based incentives and community health workers’ outputs, a systematic review

    Objective To review the evidence on the impact on measurable outcomes of performance-based incentives for community health workers (CHWs) in low- and middle-income countries. Methods We conducted a systematic review of intervention studies published before November 2020 that evaluated the impact of financial and non-financial performance-based incentives for CHWs. Outcomes included patient health indicators; quality, utilization or delivery of health-care services; and CHW motivation or satisfaction. We assessed risk of bias for all included studies using the Cochrane tool. We based our narrative synthesis on a framework for measuring the performance of CHW programmes, comprising inputs, processes, performance outputs and health outcomes. Findings Two reviewers screened 2811 records; we included 12 studies, 11 of which were randomized controlled trials and one a non-randomized trial. We found that non-financial, publicly displayed recognition of CHWs’ efforts was effective in improving service delivery outcomes. While large financial incentives were more effective than small ones in bringing about improved performance, they often resulted in the reallocation of effort away from other, non-incentivized tasks. We found no studies that tested a combined package of financial and non-financial incentives. The rationale for the design of performance-based incentives, or an explanation of how incentives interacted with contextual factors, was rarely reported. Conclusion Financial performance-based incentives alone can improve CHW service delivery outcomes, but at the risk of unincentivized tasks being neglected. As calls to professionalize CHW programmes gain momentum, research that explores the interactions among different forms of incentives, context and sustainability is needed.


    How many mosquito nets are needed to achieve universal coverage? Recommendations for the quantification and allocation of long-lasting insecticidal nets for mass campaigns

    Background: Long-lasting insecticidal nets are an effective tool for malaria prevention, and "universal coverage" with such nets is increasingly the goal of national malaria control programmes. However, national-level campaigns in several countries have run out of nets in the course of distribution, indicating a problem in the method used to estimate the quantity needed. Presentation of the hypothesis: A major reason for the shortfall is the mismatch between the quantification factor used to plan procurement and the allocation algorithm used at community level, in particular the effect of needing to add an additional net to households with an odd number of inhabitants. To solve this problem, a revised quantification factor is suggested. Testing the hypothesis: Based on data from a broad range of household surveys across Africa, the effect of odd-numbered households on the number of nets distributed is estimated for two frequently used allocation methods. The impact of these algorithms on the proportion of households reaching a person-to-net ratio of 2:1, a frequently used marker of universal coverage, is then calculated. Implications: To avoid stock-outs of nets during national coverage campaigns, it is recommended to use a quantification factor of 1.78 people per net, with an additional allocation factor to account for other common problems at the community level, resulting in a final recommended ratio of 1.60 people per net. It is also recommended that community-level allocation procedures be aligned with procurement estimates to reduce shortages of nets during campaign distributions. These analyses should enable programme managers to make evidence-based decisions and support a more efficient and effective use of long-lasting insecticidal net (LLIN) distribution campaign resources.
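The mismatch described in this abstract can be made concrete: household-level allocation typically gives each household one net per two members, rounded up, so every odd-sized household consumes half a net more than a naive population-divided-by-two procurement estimate assumes. A minimal sketch under that round-up allocation rule (the household sizes below are hypothetical, chosen only to illustrate the effect):

```python
import math

def nets_for_household(members):
    """One net per two people, rounded up, so a household with an
    odd number of members still receives an extra net (the round-up
    allocation algorithm discussed in the abstract)."""
    return math.ceil(members / 2)

def nets_needed(household_sizes):
    """Total nets required under household-level allocation."""
    return sum(nets_for_household(n) for n in household_sizes)

# Hypothetical village: the naive population/2 estimate under-counts
# whenever households have odd sizes.
households = [1, 2, 3, 3, 4, 5, 5, 7]
population = sum(households)          # 30 people
naive = math.ceil(population / 2)     # 15 nets (factor of 2.0 people per net)
actual = nets_needed(households)      # 18 nets under round-up allocation
print(round(population / actual, 2))  # 1.67 - effective people-per-net factor
```

In this toy example the effective factor falls to about 1.67 people per net, illustrating why a procurement factor well below 2.0 (the abstract recommends 1.78, with a further buffer to 1.60) is needed to avoid stock-outs.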

    Prozone in malaria rapid diagnostics tests: how many cases are missed?

    Background: Prozone means false-negative or false-low results in antigen-antibody reactions, due to an excess of either antigen or antibody. The present study prospectively assessed its frequency for malaria rapid diagnostic tests (RDTs) and Plasmodium falciparum samples in an endemic field setting. Methods: From January to April 2010, blood samples with P. falciparum high parasitaemia (≥ 4% of red blood cells infected) were obtained from patients presenting at the Provincial Hospital of Tete (Mozambique). Samples were tested undiluted and 10-fold diluted in saline with a panel of RDTs, and results were scored for line intensity (no line visible, faint, weak, medium and strong). Prozone was defined as a sample which showed no visible test line or a faint or weak test line when tested undiluted, and a visible test line of higher intensity when tested 10-fold diluted, as observed by two blinded observers and upon duplicate testing. Results: A total of 873/7,543 (11.6%) samples showed P. falciparum; 92 (10.5%) had high parasitaemia and 76 were available for prozone testing. Neither of the two Pf-pLDH RDTs, but all six HRP-2 RDTs, showed prozone, at frequencies between 6.7% and 38.2%. Negative and faint HRP-2 lines accounted for four (3.8%) and 15 (14.4%) of the 104 prozone results in two RDT brands. For the most affected brand, the proportions of prozone with no visible or faint HRP-2 lines were 10.9% (CI: 5.34-19.08), 1.2% (CI: 0.55-2.10) and 0.1% (CI: 0.06-0.24) among samples with high parasitaemia, all positive samples and all submitted samples, respectively. Prozone occurred mainly, but not exclusively, among young children. Conclusion: Prozone occurs at different frequencies and intensities in HRP-2 RDTs and may decrease diagnostic accuracy in the most affected RDTs.

    Transcriptional Profiling of Plasmodium falciparum Parasites from Patients with Severe Malaria Identifies Distinct Low vs. High Parasitemic Clusters

    Background: In the past decade, estimates of malaria infections have dropped from 500 million to 225 million per year; likewise, mortality rates have dropped from 3 million to 791,000 per year. However, approximately 90% of these deaths continue to occur in sub-Saharan Africa, and 85% involve children less than 5 years of age. Malaria mortality in children generally results from one or more of the following clinical syndromes: severe anemia, acidosis, and cerebral malaria (CM). Although much is known about the clinical and pathological manifestations of CM, insights into the biology of the malaria parasite, specifically transcription during this manifestation of severe infection, are lacking. Methods and Findings: We collected peripheral blood from children meeting the clinical case definition of cerebral malaria from a cohort in Malawi, examined the patients for the presence or absence of malaria retinopathy, and performed whole-genome transcriptional profiling for Plasmodium falciparum using a custom-designed Affymetrix array. We identified two distinct physiological states that showed a highly significant association with the level of parasitemia. We compared both groups of Malawi expression profiles with our previously acquired ex vivo expression profiles of parasites derived from infected patients with mild disease; a large collection of in vitro Plasmodium falciparum life cycle gene expression profiles; and an extensively annotated compendium of expression data from Saccharomyces cerevisiae. The high-parasitemia patient group demonstrated a unique biology, with elevated expression of Hrd1, a member of the endoplasmic reticulum-associated protein degradation system. Conclusions: The presence of a unique high-parasitemia state may be indicative of the parasite biology of the clinically recognized hyperparasitemic severe disease syndrome.

    High Prevalence of Malaria in Zambezia, Mozambique: The Protective Effect of IRS versus Increased Risks Due to Pig-Keeping and House Construction

    BACKGROUND: African countries are scaling up malaria interventions, especially insecticide-treated nets (ITNs) and indoor residual spraying (IRS), for which ambitious coverage targets have been set. In spite of these efforts, infection prevalence remains high in many parts of the continent. This study investigated risk factors for malaria infection in children using three malaria indicator surveys from Zambezia province, Mozambique. The impact of IRS and ITNs, the effects of keeping farm animals and of the construction material of house roofs, and other potential risk factors associated with malaria infection in children were assessed. METHODS: Cross-sectional community-based surveys were conducted in October of 2006, 2007 and 2008. A total of 8338 children (ages 1-15 years) from 2748 households were included in the study. All children were screened for malaria by rapid diagnostic tests. Caregiver interviews were used to assess household demographic and wealth characteristics and ITN and IRS coverage. Associations between malaria infection, vector control interventions and potential risk factors were assessed. RESULTS: Overall, the prevalence of malaria infection was 47.8% (95%CI: 38.7%-57.1%) in children 1-15 years of age; less than a quarter of children (23.1%, 95%CI: 19.1%-27.6%) were sleeping under an ITN and almost two thirds were living in IRS-treated houses (coverage 65.4%, 95%CI: 51.5%-77.0%). Protective factors that were independently associated with malaria infection were: sleeping in an IRS house without sleeping under an ITN (Odds Ratio (OR) = 0.6; 95%CI: 0.4-0.9); additional protection due to sleeping under an ITN in an IRS-treated house (OR = 0.5; 95%CI: 0.3-0.7) versus sleeping in an unsprayed house without an ITN; and parental education (primary/secondary: OR = 0.6; 95%CI: 0.5-0.7) versus parents with no education.
Increased risk of infection was associated with: current fever (OR = 1.2; 95%CI: 1.0-1.5) versus no fever; pig keeping (OR = 3.2; 95%CI: 2.1-4.9) versus not keeping pigs; living in houses with a grass roof (OR = 1.7; 95%CI: 1.3-2.4) versus other roofing materials; and larger household size (8-15 people: OR = 1.6; 95%CI: 1.3-2.1) versus small households (1-4 persons). CONCLUSION: Malaria infection among children under 15 years of age in Zambezia remained high, but conventional malaria vector control methods, in particular IRS, provided effective means of protection. Household ownership of farm animals, particularly pigs, and living in houses with a grass roof were independently associated with increased risk of infection, even after allowing for household wealth. To reduce the burden of malaria, national control programmes need to ensure high coverage of effective IRS and promote the use of ITNs, particularly in households with elevated risks of infection, such as those keeping farm animals and those with grass roofs.