
    Accuracy of Malaria Rapid Diagnostic Tests in Community Studies and their Impact on Treatment of Malaria in an Area with Declining Malaria Burden in North-Eastern Tanzania.

    Despite some problems related to the accuracy and applicability of malaria rapid diagnostic tests (RDTs), they are currently the best option in areas with limited laboratory services for improving case management through parasitological diagnosis and reducing over-treatment. This study was conducted in areas with a declining malaria burden to assess: 1) the accuracy of RDTs when used in different community settings; 2) the impact of using RDTs on anti-malarial dispensing by community-owned resource persons (CORPs); and 3) the adherence of CORPs to treatment guidelines when providing treatment based on RDT results. Data were obtained from: 1) a longitudinal study of passive case detection of fevers using CORPs in six villages in Korogwe; and 2) cross-sectional surveys (CSS) in six villages of Korogwe and Muheza districts, north-eastern Tanzania. Performance of RDTs was compared with microscopy as a gold standard, and factors affecting their accuracy were explored using a multivariate logistic regression model. Overall sensitivity and specificity of RDTs in the longitudinal study (of 23,793 febrile cases; 18,154 with both microscopy and RDT results) were 88.6% and 88.2%, respectively. In the CSS, the sensitivity was significantly lower (63.4%; χ2=367.7, p<0.001), while the specificity was significantly higher (94.3%; χ2=143.1, p<0.001) compared to the longitudinal study. Among determinants of RDT sensitivity in both studies, a parasite density of <200 asexual parasites/μl was significantly associated with a high risk of false negative RDTs (OR≥16.60, p<0.001), while the risk of a false negative test was significantly lower among cases with fever (axillary temperature ≥37.5°C) (OR≤0.63, p≤0.027). The risk of a false positive RDT (as a determinant of specificity) was significantly higher in febrile cases compared to afebrile cases (OR≥2.40, p<0.001). Using RDTs reduced anti-malarial dispensing from 98.9% to 32.1% in cases aged ≥5 years.
Although RDTs had low sensitivity and specificity, which varied widely depending on fever status and parasite density, their use significantly reduced over-treatment with anti-malarials. Thus, with declining malaria prevalence, RDTs will potentially identify the majority of febrile cases with parasites and lead to improved management of malaria and non-malaria fevers.
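The accuracy measures reported above follow directly from a 2x2 comparison of RDT results against microscopy. A minimal sketch, using illustrative counts chosen to roughly reproduce the reported 88.6%/88.2% (the abstract does not give the raw table):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from 2x2 confusion counts,
    with microscopy taken as the gold standard."""
    sensitivity = tp / (tp + fn)  # RDT-positive among microscopy-positives
    specificity = tn / (tn + fp)  # RDT-negative among microscopy-negatives
    return sensitivity, specificity

# Illustrative counts only, not the study's data
sens, spec = sensitivity_specificity(tp=886, fn=114, tn=882, fp=118)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

The same two numbers, computed separately for the longitudinal study and the CSS, are what the chi-squared comparisons in the abstract contrast.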

    Designing a sustainable strategy for malaria control?

    Malaria in the 21st century is showing signs of decline over much of its distribution, including several countries in Africa where this was previously not thought feasible. Yet, for the most part, the strategies to attack the infection are similar to those of the 1950s. Three major journals have recently drawn attention to the situation, stressing the importance of research, describing the successes and defining the semantics of control. But there is a need to stress the importance of local sustainability, and to consider urgently how individual endemic countries can plan and implement the programmes that are currently financed, for the most part, by donor institutions. In the immediate term, research should focus on a data-driven approach to control. This will entail new thinking on the role of local infrastructure and on training local scientists in local universities in epidemiology and field malariology so that expanded control programmes can become operational. Donor agencies should encourage and facilitate the development of career opportunities for such personnel so that local expertise is available to contribute appropriately.

    Using rapid diagnostic tests as source of malaria parasite DNA for molecular analyses in the era of declining malaria prevalence

    BACKGROUND: Malaria prevalence has recently declined markedly in many parts of Tanzania and other sub-Saharan African countries due to the scaling-up of control interventions, including more efficient treatment regimens (e.g. artemisinin-based combination therapy) and insecticide-treated bed nets. Although continued molecular surveillance of malaria parasites is important for the early identification of emerging anti-malarial drug resistance, it is becoming increasingly difficult to obtain parasite samples from ongoing studies, such as routine drug efficacy trials. To explore other sources of parasite DNA, this study examined whether sufficient DNA could be extracted from malaria rapid diagnostic tests (RDTs), used and collected as part of routine case management services in health facilities, thus forming the basis for molecular analyses, surveillance and quality control (QC) testing of RDTs. METHODS: One hyper-parasitaemic blood sample (131,260 asexual parasites/μl) was serially diluted in triplicate with whole blood and blotted on RDTs. DNA was extracted from the RDT dilution series, either immediately or after storage for one month at room temperature. The extracted DNA was amplified using a nested PCR method for Plasmodium species detection. Additionally, 165 archived RDTs obtained from ongoing malaria studies were analysed to determine the amplification success and test the applicability of RDTs for QC testing. RESULTS: DNA was successfully extracted and amplified from the three sets of RDT dilution series, and the minimum detection limit of PCR was <1 asexual parasite/μl. DNA was also successfully amplified from: 1) 70/71 (98.6%) archived positive RDTs (positive by both RDT and microscopy); 2) 52/63 (82.5%) false negative RDTs (negative by RDT but positive by microscopy); and 3) 4/24 (16.7%) false positive RDTs (positive by RDT but negative by microscopy). Finally, all 7 (100%) negative RDTs (negative by both RDT and microscopy) were also negative by PCR.
CONCLUSION: This study showed that DNA extracted from archived RDTs can be successfully amplified by PCR and used for the detection of malaria parasites. Since Tanzania is planning to introduce RDTs in all health facilities (and possibly also at community level), the availability of archived RDTs will provide an alternative source of DNA for genetic studies, such as continued surveillance of parasite resistance to anti-malarial drugs. The DNA obtained from RDTs can also be used for QC testing by detecting malaria parasites using PCR in places without facilities for microscopy.
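The serial-dilution design above implies a geometric sequence of parasite densities down to the PCR detection limit. A minimal sketch of that arithmetic, starting from the study's 131,260 parasites/μl sample; the two-fold dilution factor is an assumption, since the abstract does not state it:

```python
def dilution_series(start_density, factor=2, floor=1.0):
    """Densities of successive dilutions of a blood sample,
    stopping once the density falls below `floor` parasites/ul."""
    densities = []
    d = float(start_density)
    while d >= floor:
        densities.append(d)
        d /= factor
    return densities

series = dilution_series(131_260)
print(len(series), round(series[-1], 3))
```

Under a two-fold scheme, 17 dilutions are needed before the density drops below 1 parasite/μl, consistent with the abstract's claim that the detection limit was below that level.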

    Declining Burden of Malaria Over two Decades in a Rural Community of Muheza District, North-Eastern Tanzania.

    The recently reported declining burden of malaria in some African countries has been attributed to the scaling-up of different interventions, although in some areas these changes started before the implementation of major interventions. This study assessed the long-term trends of malaria burden over 20 years (1992-2012) in Magoda and over 15 years in Mpapayu village of Muheza district, north-eastern Tanzania, in relation to different interventions as well as changing national malaria control policies. Repeated cross-sectional surveys recruited individuals aged 0-19 years from the two villages, whereby blood smears were collected for the detection of malaria parasites by microscopy. Prevalence of Plasmodium falciparum infections and other indices of malaria burden (prevalence of anaemia, splenomegaly and gametocytes) were compared across the years and between the study villages. Major interventions deployed, including a mobile clinic, bed nets and other research activities, and changes in national malaria control policies were also noted. In Magoda, the prevalence of P. falciparum infections initially decreased between 1992 and 1996 (from 83.5 to 62.0%), stabilized between 1996 and 1997, and further declined to 34.4% in 2004. A temporary increase between 2004 and 2008 was followed by a progressive decline to 7.2% in 2012, a more than 10-fold decrease since 1992. In Mpapayu (from 1998), the highest prevalence was 81.5% in 1999, and it decreased to 25% in 2004. After a slight increase in 2008, a steady decline followed, reaching <5% from 2011 onwards. Bed net usage was high in both villages from 1999 to 2004 (≥88%) but decreased between 2008 and 2012 (range, 28%-68%). After adjusting for the effects of bed nets, age, fever and year of study, the risk of P. falciparum infections decreased significantly by ≥97% in both villages between 1999 and 2012 (p<0.001).
The prevalence of splenomegaly (>40% to <1%) and of gametocytes (23% to <1%) also decreased in both villages. DISCUSSION AND CONCLUSIONS: A remarkable decline in the burden of malaria occurred between 1992 and 2012, and the initial decline (1992-2004) was most likely due to the deployment of interventions, such as bed nets, and better services through research activities. Apart from changes of drug policies, the steady decline observed from 2008 occurred when bed net coverage was low, suggesting that other factors contributed to the most recent pattern. These results suggest that continued monitoring is required to determine the causes of the changing malaria epidemiology, and also to track progress towards maintaining low malaria transmission and reaching related millennium development goals.

    Predictors of Antibiotics Co-prescription with Antimalarials for Patients Presenting with Fever in Rural Tanzania.

    Successful implementation of malaria treatment policy depends on prescription practices for patients with malaria. This paper describes prescription patterns and assesses factors associated with the co-prescription of antibiotics and artemether-lumefantrine (AL) for patients presenting with fever in rural Tanzania. From June 2009 to September 2011, a cohort event monitoring program was conducted among all patients treated at 8 selected health facilities in the Ifakara and Rufiji Health and Demographic Surveillance Systems (HDSS). It included all patients presenting with fever and prescribed AL. Logistic regression was used to model predictors of the outcome variable, co-prescription of AL and antibiotics at a single clinical visit. A cohort of 11,648 patients was recruited and followed up, with 92% presenting with fever. Presumptive treatment was used in 56% of patients treated with AL. On average, 2.4 (range 1-7) drugs were prescribed per encounter, indicating co-prescription of AL with other drugs. Children under five had lower odds of AL and antibiotic co-prescription (OR = 0.63, 95% CI: 0.46-0.85) than those aged over five years. Patients testing negative had higher odds (OR = 2.22, 95% CI: 1.65-2.97) of AL and antibiotic co-prescription. Patients receiving treatment from dispensaries had higher odds (OR = 1.45, 95% CI: 0.84-2.30) of AL and antibiotic co-prescription than those served in health centres, although the difference was not statistically significant. Although malaria is declining, the lack of laboratories and mRDTs in most health facilities in rural areas means clinicians still treat malaria presumptively, which leads them to prescribe more drugs to cover all possibilities.
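Estimates such as OR = 2.22 (95% CI: 1.65-2.97) come from logistic regression, but for a single binary predictor the same quantity can be computed directly from a 2x2 table. A minimal sketch using the Woolf (log) method; the counts are illustrative, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf log method) for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only
or_, lo, hi = odds_ratio_ci(120, 180, 60, 200)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1 (as for test-negative patients above) indicates statistical significance; one that spans 1 (as for dispensaries, 0.84-2.30) does not.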

    Access to Artemisinin-Based Anti-Malarial Treatment and its Related Factors in Rural Tanzania.

    Artemisinin-based combination treatment (ACT) has been widely adopted as one of the main malaria control strategies. However, its promise to save thousands of lives in sub-Saharan Africa depends on how effective the use of ACT is within the routine health system. The INESS platform evaluated the effective coverage of ACT in several African countries. Timely access, within 24 hours, to an authorized ACT outlet is one of the determinants of effective coverage, and was assessed for artemether-lumefantrine (Alu) in two district health systems in rural Tanzania. From October 2009 to June 2011, we conducted continuous rolling household surveys in the Kilombero-Ulanga and Rufiji Health and Demographic Surveillance Sites (HDSS). Surveys were linked to the routine HDSS update rounds. Members of randomly pre-selected households that had experienced a fever episode in the previous two weeks were eligible for a structured interview. Data on individual treatment seeking, access to treatment, timing, source of treatment and household costs per episode were collected. Data are presented on timely access from a total of 2,112 interviews in relation to demographics, seasonality and socio-economic status. In Kilombero-Ulanga, 41.8% (CI: 36.6-45.1) and in Rufiji 36.8% (CI: 33.7-40.1) of fever cases had access to an authorized ACT provider within 24 hours of fever onset. In neither HDSS site were age, sex, socio-economic status or seasonality of malaria found to be significantly correlated with timely access. Timely access to authorized ACT providers remains below 50% despite interventions intended to improve access, such as social marketing and the accreditation of private dispensing outlets. Access remains a major bottleneck for prompt diagnosis and treatment, and new, more innovative interventions are needed to raise the effective coverage of malaria treatment in Tanzania.
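Survey proportions like 41.8% (CI: 36.6-45.1) carry a confidence interval derived from the binomial sampling error. A minimal sketch of one common choice, the Wilson score interval; the denominator below is illustrative, since the abstract reports 2,112 interviews overall but not the per-site counts:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Illustrative: 418 of 1,000 fever cases with timely access
lo, hi = wilson_ci(418, 1000)
print(f"{lo:.1%} - {hi:.1%}")
```

The interval narrows as n grows, which is why HDSS-scale surveys can report proportions to within a few percentage points.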

    Change in Composition of the Anopheles Gambiae Complex and its Possible Implications for the Transmission of Malaria and Lymphatic Filariasis in North-Eastern Tanzania.

    A dramatic decline in the incidence of malaria due to Plasmodium falciparum infection in coastal East Africa has recently been reported to be paralleled (or even preceded) by an equally dramatic decline in malaria vector density, despite the absence of organized vector control. As part of investigations into possible causes of the change in vector population density, the present study analysed the Anopheles gambiae s.l. sibling species composition in north-eastern Tanzania. The study was in two parts. The first compared the current species composition of freshly caught An. gambiae s.l. from three villages to the composition reported from studies carried out two to four decades ago in the same villages. The second took advantage of a sample of archived, dried An. gambiae s.l. specimens collected regularly from a fourth study village since 2005. Both fresh and archived dried specimens were identified to sibling species of the An. gambiae s.l. complex by PCR. The same specimens were also examined for Plasmodium falciparum and Wuchereria bancrofti infection by PCR. As in earlier studies, An. gambiae s.s., Anopheles merus and Anopheles arabiensis were identified as the sibling species found in the area. However, both study parts indicated a marked change in sibling species composition over time. From being by far the most abundant in the past, An. gambiae s.s. was now the most rare, whereas An. arabiensis had changed from being the most rare to the most common. P. falciparum infection was rarely detected in the examined specimens (and only in An. arabiensis), whereas W. bancrofti infection was prevalent and detected in all three sibling species. The study indicates that a major shift in An. gambiae s.l. sibling species composition has taken place in the study area in recent years. Combined with the earlier reported decline in overall malaria vector density, the study suggests that this decline has been most marked for An. gambiae s.s. and least for An. arabiensis, leading to the current predominance of the latter. Due to differences in the biology and vectorial capacity of the An. gambiae s.l. sibling species, the change in species composition will have important implications for the epidemiology and control of malaria and lymphatic filariasis in the study area.

    Epidemiology of Coxiella burnetii infection in Africa: a OneHealth systematic review

    Background: Q fever is a common cause of febrile illness and community-acquired pneumonia in resource-limited settings. Coxiella burnetii, the causative pathogen, is transmitted among varied host species, but the epidemiology of the organism in Africa is poorly understood. We conducted a systematic review of C. burnetii epidemiology in Africa from a "One Health" perspective to synthesize the published data and identify knowledge gaps. Methods/Principal Findings: We searched nine databases to identify articles relevant to four key aspects of C. burnetii epidemiology in human and animal populations in Africa: infection prevalence; disease incidence; transmission risk factors; and infection control efforts. We identified 929 unique articles, 100 of which remained after full-text review. Of these, 41 articles describing 51 studies qualified for data extraction. Animal seroprevalence studies revealed infection by C. burnetii (≤13%) among cattle, except for studies in Western and Middle Africa (18-55%). Small ruminant seroprevalence ranged from 11-33%. Human seroprevalence was <8%, with the exception of studies among children and in Egypt (10-32%). Close contact with camels and rural residence were associated with increased seropositivity among humans. C. burnetii infection has been associated with livestock abortion. In human cohort studies, Q fever accounted for 2-9% of febrile illness hospitalizations and 1-3% of infective endocarditis cases. We found no studies of disease incidence estimates or disease control efforts. Conclusions/Significance: C. burnetii infection is detected in humans and in a wide range of animal species across Africa, but seroprevalence varies widely by species and location. Risk factors underlying this variability are poorly understood, as is the role of C. burnetii in livestock abortion. Q fever consistently accounts for a notable proportion of undifferentiated human febrile illness and infective endocarditis in cohort studies, but incidence estimates are lacking. C. burnetii presents a real yet underappreciated threat to human and animal health throughout Africa.

    Clinical malaria case definition and malaria attributable fraction in the highlands of western Kenya

    BACKGROUND: In African highland areas where the endemicity of malaria varies greatly according to altitude and topography, parasitaemia accompanied by fever may not be sufficient to define an episode of clinical malaria in endemic areas. To evaluate the effectiveness of malaria interventions, age-specific case definitions of clinical malaria need to be determined. This study quantified cases of clinical malaria through active case surveillance in a highland area of Kenya and defined clinical malaria for different age groups. METHODS: A cohort of over 1,800 participants from all age groups was selected randomly from over 350 houses in 10 villages stratified by topography and followed for two and a half years. Participants were visited every two weeks and screened for clinical malaria, defined as an individual with malaria-related symptoms (fever [axillary temperature ≥37.5°C], chills, severe malaise, headache or vomiting) at the time of examination or 1-2 days prior to the examination, in the presence of a Plasmodium falciparum-positive blood smear. Individuals in the same cohort were screened for asymptomatic malaria infection during the low and high malaria transmission seasons. Parasite densities and temperature were used to define clinical malaria by age in the population. The proportion of fevers attributable to malaria was calculated using logistic regression models. RESULTS: The incidence of clinical malaria was highest in the valley-bottom population (5.0 cases per 1,000 population per year) compared to the mid-hill (2.2 cases per 1,000 population per year) and up-hill (1.1 cases per 1,000 population per year) populations. Determination of sensitivity and specificity at different cut-off parasite densities showed that, in children less than five years of age, 500 parasites per μl of blood could be used to define malaria-attributable fever cases for this age group. In children between the ages of 5-14, a parasite density of 1,000 parasites per μl of blood could be used to define malaria-attributable fever cases. For individuals older than 14 years, the cut-off parasite density was 3,000 parasites per μl of blood. CONCLUSION: Clinical malaria case definitions are affected by age and endemicity, which need to be taken into consideration during the evaluation of interventions.
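Selecting an optimum parasite-density cut-off from sensitivity and specificity, as done for the age-specific thresholds (500, 1,000 and 3,000 parasites/μl), can be sketched as maximizing Youden's J (sensitivity + specificity - 1) over candidate cut-offs. A minimal illustration with made-up densities and fever labels, not the study's data:

```python
def best_cutoff(cases, cutoffs):
    """cases: list of (parasite_density, is_malaria_attributable_fever).
    Returns the cutoff maximizing Youden's J and the J value itself."""
    best, best_j = None, -1.0
    for c in cutoffs:
        tp = sum(1 for d, y in cases if y and d >= c)
        fn = sum(1 for d, y in cases if y and d < c)
        tn = sum(1 for d, y in cases if not y and d < c)
        fp = sum(1 for d, y in cases if not y and d >= c)
        j = tp / (tp + fn) + tn / (tn + fp) - 1  # Youden's J
        if j > best_j:
            best, best_j = c, j
    return best, best_j

# Illustrative data: (parasites/ul, malaria-attributable fever?)
data = [(80, False), (300, False), (600, True), (2_500, True), (9_000, True)]
cutoff, j = best_cutoff(data, [100, 500, 1_000, 3_000])
print(cutoff, round(j, 2))
```

Run per age group, this procedure yields different optimum cut-offs where parasite-density distributions differ, which is the rationale for the age-specific definitions above.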

    Engaging diverse communities participating in clinical trials: case examples from across Africa

    Background: In the advent of increasing international collaborative research involving participants drawn from populations with diverse cultural backgrounds, community engagement becomes critical for the smooth conduct of research. The African Malaria Network Trust (AMANET) is a pan-African non-governmental organization that sponsors and technically supports malaria vaccine trials in various African countries. Case description: AMANET sponsored phase Ib or IIb clinical trials of several malaria vaccine candidates in various African countries. In Burkina Faso, Mali and Tanzania, trials of the merozoite surface protein 3, in its long synthetic peptide configuration (MSP3 LSP), were conducted. In Mali, the apical membrane antigen 1 (AMA1) was tested, while a hybrid of glutamate-rich protein (GLURP) and MSP3 (GMZ2) was tested in Gabon. AMANET recognizes the importance of engaging with the communities from which trial participants are drawn; hence, community engagement was given priority in all project activities conducted in the various countries. Discussion and evaluation: Existing local social systems were used to engage the communities from which clinical trial participants were drawn. This article focuses on the community engagement activities employed at various AMANET-supported clinical trial sites in different countries, highlighting subtle differences in the approaches used. The paper also gives some general pros and cons of community engagement. Conclusions: Community engagement enables two-way sharing of accurate information and ideas between researchers and researched communities, which helps to create an environment conducive to smooth research activities with an enhanced sense of research ownership by the communities.