372 research outputs found
Workload Indicators of Staffing Need method in determining optimal staffing levels at Moi Teaching and Referral Hospital
Background: There is an increasing demand for quality healthcare in the face of limited resources. With health personnel consuming up to three quarters of recurrent budgets, there is a need to ascertain that the workforce of any health facility is at the optimal level needed to produce the desired output.
Objective: To highlight the experience and findings of an attempt at establishing the optimal staffing levels for a tertiary health institution using the Workload Indicators of Staffing Need (WISN) method popularised by the World Health Organisation (WHO), Geneva, Switzerland.
Design: A descriptive study that captures the activities of a taskforce appointed to establish optimal staffing levels.
Setting: Moi Teaching and Referral Hospital (MTRH), Eldoret, a tertiary hospital in the Rift Valley province of Kenya, from September 2005 to May 2006.
Main outcome measures: The cadres of workers, working schedules, main activities, time taken to accomplish the activities, available working hours, category and individual allowances, annual workloads from the previous year's statistics and optimal departmental establishment of workers.
Results: There was initial resentment of the exercise because of the notion that it was aimed at retrenching workers. The team was given autonomy by the hospital management to establish the optimal staffing levels objectively. Very few departments were optimally established, with the majority either under- or over-staffed. There were intradepartmental discrepancies in the optimal levels of individual cadres even in departments that had the right total workforce.
Conclusion: The WISN method is an objective way of establishing staffing levels, but it requires a dedicated team with adequate expertise to make the raw data meaningful for calculations. East African Medical Journal Vol. 85 (5) 2008: pp. 232-23
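The core WISN arithmetic the taskforce applied can be sketched briefly. The figures below are hypothetical round numbers for illustration, not MTRH data; only the two-step formula (standard workload, then required staff) follows the WHO method.

```python
# Minimal sketch of the WISN calculation (WHO method).
# All numbers below are hypothetical, not MTRH data.

def wisn_required_staff(annual_workload, available_hours, activity_time_hours):
    """Required staff = annual workload / standard workload, where
    standard workload = available working hours per staff member per year
    divided by the time one unit of the activity takes."""
    standard_workload = available_hours / activity_time_hours
    return annual_workload / standard_workload

# Example: 24,000 deliveries/year, 1,600 available working hours per
# midwife per year, 0.5 hours per delivery (all figures invented).
required = wisn_required_staff(24000, 1600, 0.5)
print(round(required, 1))  # 7.5
```

Comparing this required figure against the actual establishment per cadre is what reveals the under- and over-staffing the taskforce reported.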
An investigation of the disparity in estimates of microfilaraemia and antigenaemia in lymphatic filariasis surveys
The diagnosis of lymphatic filariasis (LF) is typically based either on microfilaraemia assessed by microscopy or on filarial antigenaemia detected with an immunochromatographic test. While it is known that estimates of antigenaemia are generally higher than estimates of microfilaraemia, the extent of the difference is not known. This dataset was produced as part of a literature review of surveys that estimate both microfilaraemia and antigenaemia.
Sources of variability in the measurement of Ascaris lumbricoides infection intensity by Kato-Katz and qPCR
Background: Understanding and quantifying the sources and implications of error in the measurement of helminth egg intensity using Kato-Katz (KK) and the newly emerging “gold standard” quantitative polymerase chain reaction (qPCR) technique is necessary for the appropriate design of epidemiological studies, including impact assessments for deworming programs. Methods: Repeated measurements of Ascaris lumbricoides infection intensity were made from samples collected in western Kenya using the qPCR and KK techniques. These data were combined with data on post-treatment worm expulsions. Random effects regression models were used to quantify the variability associated with different technical and biological factors for qPCR and KK diagnosis. The relative precision of these methods was compared, as was the precision of multiple qPCR replicates. Results: For both KK and qPCR, intensity measurements were largely determined by the identity of the stool donor. Stool donor explained 92.4% of variability in qPCR measurements and 54.5% of observed measurement variance for KK. An additional 39.1% of variance in KK measurements was attributable to having expelled adult A. lumbricoides worms following anthelmintic treatment. For qPCR, the remaining 7.6% of variability was explained by the efficiency of the DNA extraction (2.4%), plate-to-plate variability (0.2%) and other residual factors (5%). Differences in replicate measurements by qPCR were comparatively small. In addition to KK variability based on stool donor infection levels, the slide reader was highly statistically significant, although it only explained 1.4% of the total variation. In a comparison of qPCR and KK variance to mean ratios under ideal conditions, the coefficient of variation was on average 3.6 times larger for KK, highlighting the increased precision of qPCR.
Conclusions: Person-to-person differences explain the majority of variability in egg intensity measurements by qPCR and KK, with very little additional variability explained by the technical factors associated with the practical implementation of these techniques. qPCR provides approximately 3.6 times more precision in estimating A. lumbricoides egg intensity than KK, and could potentially be made more cost-effective by testing each sample only once without diminishing the power of a study to assess population-level intensity and prevalence.
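The kind of variance partitioning reported above (share of variance attributable to the stool donor versus residual technical factors) can be illustrated with a minimal one-way random-effects calculation. The data and the simple method-of-moments estimator below are invented for illustration; the study's actual models were richer (multiple crossed technical factors).

```python
import statistics

# Illustrative partition of measurement variance into between-donor and
# residual (within-donor) components from replicated readings, as in a
# one-way random-effects model. All readings are invented.

# e.g. log-transformed intensity readings: 2 replicates per donor
readings = {
    "donor_a": [5.1, 5.3],
    "donor_b": [2.0, 2.2],
    "donor_c": [7.9, 8.1],
}

k = 2  # replicates per donor
donor_means = [statistics.mean(v) for v in readings.values()]
# within-donor (residual) variance: pooled variance of replicates
within = statistics.mean(statistics.variance(v) for v in readings.values())
# between-donor variance by method of moments
between = statistics.variance(donor_means) - within / k
share = between / (between + within)
print(f"donor identity explains {share:.1%} of variance")
```

With replicates this tight, almost all variance is attributed to donor identity, mirroring the qualitative finding that person-to-person differences dominate.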
Determinants of success in national programs to Eliminate Lymphatic Filariasis: A perspective identifying essential elements and research needs
The Global Programme to Eliminate Lymphatic Filariasis (GPELF) was launched in 2000. To understand why some national programs have been more successful than others, a panel of individuals with expertise in LF elimination efforts met to assess available data from programs in 8 countries. The goal was to identify: 1) the factors determining success for national LF elimination programs (defined as the rapid, sustained reduction in microfilaremia/antigenemia after repeated mass drug administration [MDA]); and 2) the priorities for operational research to enhance LF elimination efforts.
Of more than 40 factors identified, the most prominent were 1) initial level of LF endemicity; 2) effectiveness of vector mosquitoes; 3) MDA drug regimen; and 4) population compliance.
Research important for facilitating program success was identified as either biologic (i.e., [1] quantifying differences in vectorial capacity; [2] identifying seasonal variations affecting LF transmission) or programmatic (i.e., [1] identifying quantitative thresholds, especially the population compliance levels necessary for success, and the antigenemia or microfilaremia prevalence at which MDA programs can stop with minimal risk of resumption of transmission; [2] defining optimal drug distribution strategies and timing; [3] identifying those individuals who are "persistently noncompliant" during MDAs, the reasons for this non-compliance and approaches to overcoming it).
While addressing these challenges is important, many key determinants of program success are already clearly understood; operationalizing these as soon as possible will greatly increase the potential for national program success.
Spatial distribution of podoconiosis in relation to environmental factors in Ethiopia: a historical review
BACKGROUND
An up-to-date and reliable map of podoconiosis is needed to design geographically targeted and cost-effective interventions in Ethiopia. Identifying the ecological correlates of the distribution of podoconiosis is the first step towards producing distribution and risk maps. The objective of this study was to investigate the spatial distribution and ecological correlates of podoconiosis using historical and contemporary survey data.
METHODS
Data on the observed prevalence of podoconiosis were abstracted from published and unpublished literature into a standardized database, according to strict inclusion and exclusion criteria. In total, 10 studies conducted between 1969 and 2012 were included, and data were available for 401,674 individuals older than 15 years of age from 229 locations. A range of high resolution environmental factors were investigated to determine their association with podoconiosis prevalence, using logistic regression.
RESULTS
The prevalence of podoconiosis in Ethiopia was estimated at 3.4% (95% CI 3.3%-3.4%) with marked regional variation. We identified significant associations between mean annual Land Surface Temperature (LST), mean annual precipitation, topography of the land and fine soil texture and high prevalence of podoconiosis. The derived maps indicate both widespread occurrence of podoconiosis and a marked variability in prevalence, with prevalence typically highest at altitudes >1500 m above sea level (masl), with >1500 mm annual rainfall and mean annual LST of 19-21°C. No (or very little) podoconiosis occurred at lower altitudes or where mean annual LST exceeded 24°C.
CONCLUSION
Podoconiosis remains a public health problem in Ethiopia over considerable areas of the country, but exhibits marked geographical variation associated in part with key environmental factors. This is work in progress and the results presented here will be refined in future work.
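As a rough illustration of the kind of binomial logistic model used to relate location-level prevalence to an environmental covariate, here is a self-contained sketch on synthetic data. Everything in it is an assumption for illustration: a single hypothetical "scaled rainfall" covariate, invented true coefficients, and plain gradient ascent rather than the study's actual fitting procedure.

```python
import math
import random

# Sketch: binomial logistic regression of prevalence on one covariate,
# fitted by gradient ascent on the log-likelihood. Synthetic data only.

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic survey locations: covariate x (e.g. scaled rainfall);
# true model p = sigmoid(-3 + 2x), both coefficients invented.
data = []
for _ in range(200):
    x = random.uniform(0.0, 2.0)
    n = 100  # people examined at the location
    p = sigmoid(-3.0 + 2.0 * x)
    cases = sum(random.random() < p for _ in range(n))
    data.append((x, n, cases))

# Fit intercept b0 and slope b1 by gradient ascent on the
# binomial log-likelihood: grad = sum((cases - n*p_hat) * [1, x]).
b0, b1 = 0.0, 0.0
lr = 0.001
for _ in range(2000):
    g0 = g1 = 0.0
    for x, n, cases in data:
        err = cases - n * sigmoid(b0 + b1 * x)
        g0 += err
        g1 += err * x
    b0 += lr * g0 / len(data)
    b1 += lr * g1 / len(data)

print(f"fitted intercept {b0:.2f}, slope {b1:.2f}")
```

The fitted slope recovers the positive covariate effect, which is the same logic by which the study links rainfall, LST and soil texture to podoconiosis prevalence.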
Multi-parallel qPCR provides increased sensitivity and diagnostic breadth for gastrointestinal parasites of humans: field-based inferences on the impact of mass deworming
BACKGROUND: Although chronic morbidity in humans from soil transmitted helminth (STH) infections can be reduced by anthelmintic treatment, inconsistent diagnostic tools make it difficult to reliably measure the impact of deworming programs and often miss light helminth infections. METHODS: Cryopreserved stool samples from 796 people (aged 2-81 years) in four villages in Bungoma County, western Kenya, were assessed using multi-parallel qPCR for 8 parasites and compared to point-of-contact assessments of the same stools by the 2-stool 2-slide Kato-Katz (KK) method. All subjects were treated with albendazole and all Ascaris lumbricoides expelled post-treatment were collected. Three months later, samples from 633 of these people were re-assessed by both qPCR and KK, re-treated with albendazole and the expelled worms collected. RESULTS: Baseline prevalence by qPCR (n = 796) was 17% for A. lumbricoides, 18% for Necator americanus, 41% for Giardia lamblia and 15% for Entamoeba histolytica. The prevalence was <1% for Trichuris trichiura, Ancylostoma duodenale, Strongyloides stercoralis and Cryptosporidium parvum. The sensitivity of qPCR was 98% for A. lumbricoides and N. americanus, whereas KK sensitivity was 70% and 32%, respectively. Furthermore, qPCR detected infections with T. trichiura and S. stercoralis that were missed by KK, and infections with G. lamblia and E. histolytica that cannot be detected by KK. Infection intensities measured by qPCR and by KK were correlated for A. lumbricoides (r = 0.83, p < 0.0001) and N. americanus (r = 0.55, p < 0.0001). The number of A. lumbricoides worms expelled was correlated (p < 0.0001) with both the KK (r = 0.63) and qPCR intensity measurements (r = 0.60). CONCLUSIONS: KK may be an inadequate tool for stool-based surveillance in areas where hookworm or Strongyloides are common or where intensity of helminth infection is low after repeated rounds of chemotherapy.
Because deworming programs need to distinguish between populations where parasitic infection is controlled and those where further treatment is required, multi-parallel qPCR (or similar high throughput molecular diagnostics) may provide new and important diagnostic information.
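The sensitivity comparison above reduces to simple two-by-two arithmetic against a reference standard. The counts in this sketch are invented round numbers chosen to echo the reported 98% versus 70% A. lumbricoides figures; they are not the study's actual cell counts.

```python
# Sketch of how diagnostic sensitivity is computed when each test is
# compared against a reference standard. Counts below are hypothetical.

def sensitivity(true_pos, false_neg):
    """Sensitivity = TP / (TP + FN): the fraction of reference-positive
    samples that the test detects."""
    return true_pos / (true_pos + false_neg)

# Hypothetical example: 100 reference-positive samples for one parasite;
# qPCR detects 98 of them, Kato-Katz detects 70.
print(f"qPCR sensitivity: {sensitivity(98, 2):.0%}")  # 98%
print(f"KK sensitivity:   {sensitivity(70, 30):.0%}")  # 70%
```

The gap between the two figures is what drives the conclusion that KK under-detects light infections in low-intensity settings.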
Epidemiology of Coxiella burnetii infection in Africa: a OneHealth systematic review
Background:
Q fever is a common cause of febrile illness and community-acquired pneumonia in resource-limited settings. Coxiella burnetii, the causative pathogen, is transmitted among varied host species, but the epidemiology of the organism in Africa is poorly understood. We conducted a systematic review of C. burnetii epidemiology in Africa from a “One Health” perspective to synthesize the published data and identify knowledge gaps.
Methods/Principal Findings:
We searched nine databases to identify articles relevant to four key aspects of C. burnetii epidemiology in human and animal populations in Africa: infection prevalence; disease incidence; transmission risk factors; and infection control efforts. We identified 929 unique articles, 100 of which remained after full-text review. Of these, 41 articles describing 51 studies qualified for data extraction. Animal seroprevalence studies revealed C. burnetii seroprevalence of ≤13% among cattle, except for studies in Western and Middle Africa (18–55%). Small ruminant seroprevalence ranged from 11–33%. Human seroprevalence was <8%, with the exception of studies among children and in Egypt (10–32%). Close contact with camels and rural residence were associated with increased seropositivity among humans. C. burnetii infection has been associated with livestock abortion. In human cohort studies, Q fever accounted for 2–9% of febrile illness hospitalizations and 1–3% of infective endocarditis cases. We found no studies of disease incidence estimates or disease control efforts.
Conclusions/Significance:
C. burnetii infection is detected in humans and in a wide range of animal species across Africa, but seroprevalence varies widely by species and location. Risk factors underlying this variability are poorly understood as is the role of C. burnetii in livestock abortion. Q fever consistently accounts for a notable proportion of undifferentiated human febrile illness and infective endocarditis in cohort studies, but incidence estimates are lacking. C. burnetii presents a real yet underappreciated threat to human and animal health throughout Africa.
Field Epidemiology and Laboratory Training Programs in sub-Saharan Africa from 2004 to 2010: need, the process, and prospects
As of 2010 sub-Saharan Africa had approximately 865 million inhabitants living with numerous public health challenges. Several public health initiatives [e.g., the United States (US) President’s Emergency Plan for AIDS Relief and the US President’s Malaria Initiative] have been very successful at reducing mortality from priority diseases. A competently trained public health workforce that can operate multi-disease surveillance and response systems is necessary to build upon and sustain these successes and to address other public health problems. Sub-Saharan Africa appears to have weathered the recent global economic downturn remarkably well and its increasing middle class may soon demand stronger public health systems to protect communities. The Epidemic Intelligence Service (EIS) program of the US Centers for Disease Control and Prevention (CDC) has been the backbone of public health surveillance and response in the US during its 60 years of existence. EIS has been adapted internationally to create the Field Epidemiology Training Program (FETP) in several countries. In the 1990s CDC and the Rockefeller Foundation collaborated with the Uganda and Zimbabwe ministries of health and local universities to create 2-year Public Health Schools Without Walls (PHSWOWs) which were based on the FETP model. In 2004 the FETP model was further adapted to create the Field Epidemiology and Laboratory Training Program (FELTP) in Kenya to conduct joint competency-based training for field epidemiologists and public health laboratory scientists, providing a master’s degree to participants upon completion. The FELTP model has been implemented in several additional countries in sub-Saharan Africa. By the end of 2010 these 10 FELTPs and two PHSWOWs covered 613 million of the 865 million people in sub-Saharan Africa and had enrolled 743 public health professionals.
We describe the process that we used to develop 10 FELTPs covering 15 countries in sub-Saharan Africa from 2004 to 2010 as a strategy to develop a locally trained public health workforce that can operate multi-disease surveillance and response systems.
Key words: Field epidemiology, laboratory management, multi-disease surveillance and response systems, public health workforce capacity building.
- …