
    The public health significance of latrines discharging to groundwater used for drinking

    Faecal contamination of groundwater from pit latrines is widely perceived as a major threat to the safety of drinking water for several billion people in rural and peri-urban areas worldwide. On the floodplains of the Ganges-Brahmaputra-Meghna delta in Bangladesh, we constructed latrines and monitored piezometer nests monthly for two years. We detected faecal coliforms (FC) in 3.3-23.3% of samples at four sites. We differentiate a near-field, characterised by high concentrations and frequent, persistent and contiguous contamination in all directions, and a far-field characterised by rare, impersistent, discontinuous low-level detections in variable directions. Far-field FC concentrations at the four sites exceeded 0 and 10 cfu/100 ml in 2.4-9.6% and 0.2-2.3% of sampling events respectively. The lesser contamination of in-situ groundwater compared with water at the point-of-collection from domestic wells, which itself is less contaminated than at the point-of-consumption, demonstrates the importance of recontamination in the well-pump system. We present a conceptual model comprising four sub-pathways: the latrine-aquifer interface (near-field); groundwater flowing from latrine to well (far-field); the well-pump system; and post-collection handling and storage. Applying a hypothetical dose-response model suggests that 1-2% of the diarrhoeal disease burden from drinking water is derived from the aquifer, 29% from the well-pump system, and 70% from post-collection handling. The important implications are (i) that leakage from pit latrines is a minor contributor to faecal contamination of drinking water in alluvial-deltaic terrains; (ii) that fears of increased groundwater pollution should not constrain expanding latrine coverage; and (iii) that more attention should be given to reducing contamination around the well-head.
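
    The burden apportionment above rests on a dose-response calculation. As a minimal sketch, assuming a simple exponential dose-response form and entirely hypothetical exposure doses (the study's actual model and parameters are not reproduced here; the values below were chosen only to yield shares of the same order as those reported):

```python
import math

# Hypothetical exponential dose-response: P(infection) = 1 - exp(-r * dose).
# Both r and the per-pathway doses are illustrative placeholders, not the
# study's fitted values.
R = 0.005  # hypothetical dose-response constant per organism ingested

# Hypothetical mean pathogen doses (organisms/day) for each sub-pathway
# of the conceptual model.
doses = {
    "aquifer (near- and far-field)": 0.05,
    "well-pump system": 1.5,
    "post-collection handling/storage": 3.5,
}

def daily_risk(dose: float, r: float = R) -> float:
    """Daily infection probability under the exponential model."""
    return 1.0 - math.exp(-r * dose)

total_risk = sum(daily_risk(d) for d in doses.values())
for pathway, dose in doses.items():
    print(f"{pathway}: {daily_risk(dose) / total_risk:.1%} of modelled risk")
```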

    Field trial of an automated batch chlorinator system at shared water points in an urban community of Dhaka, Bangladesh

    Point-of-use water treatment with chlorine is underutilized in low-income households. The Zimba, an automated batch chlorinator, requires no electricity or moving parts, and can be installed at shared water points with intermittent flow. We conducted a small-scale trial to assess the acceptability and quality of Zimba-treated municipal water. Fieldworkers collected stored drinking water over a 10-week period from control (n = 24 households) and treatment (n = 30 households) compounds to assess levels of free chlorine and E. coli contamination. Overall, 80% of stored drinking water samples had a safe chlorine residual among treatment households, compared to 29% among control households (P < 0.001). Concentrations of E. coli were lower (mean difference = 0.4 log colony-forming units/100 mL, P = 0.004) in treatment compared to control households. Fifty-three percent of mothers (n = 17) thought the Zimba was easy to use and 76% were satisfied with the taste. The majority of mothers mentioned that collecting water from the Zimba took more time and created a long queue at the handpump. The Zimba successfully chlorinated household stored drinking water; however, further technology development is required to address user preferences. The Zimba may be a good option for point-of-collection water treatment in areas where queuing for water is uncommon.
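
    To make the reported log-scale comparison concrete, a short sketch of how a 0.4-log difference in E. coli counts is computed (the counts below are invented examples, not trial data):

```python
import math

# Hypothetical E. coli counts (CFU/100 mL) in stored drinking water;
# these example values are not the trial's data.
control_cfu = 250.0
treatment_cfu = 100.0

# Counts are compared on a log10 scale, matching the abstract's
# "mean difference = 0.4 log colony-forming units/100 mL".
log_difference = math.log10(control_cfu) - math.log10(treatment_cfu)
print(f"log10 reduction: {log_difference:.1f} log CFU/100 mL")  # ~0.4
```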

    A framework for monitoring the safety of water services: from measurements to security

    The Sustainable Development Goals (SDGs) introduced monitoring of drinking water quality to the international development agenda. At present, Escherichia coli is the primary measure by which we evaluate the safety of drinking water from an infectious disease perspective. Here, we propose and apply a framework to reflect on the purposes of and approaches to monitoring drinking water safety. To deliver SDG 6.1, universal access to safe drinking water, a new approach to monitoring is needed. Currently, we rely heavily on single measures of E. coli contamination to meet a normative definition of safety. Achieving and sustaining universal access to safe drinking water will require monitoring that can inform decision making on whether services are managed to ensure safety and security of access.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
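
    As an illustration of the adjusted-odds-ratio analysis described above, a minimal sketch fitting a multivariable logistic regression on synthetic data (the covariates, effect sizes and statsmodels workflow are assumptions for demonstration; the study's model adjusted for many more patient and disease factors and also used bootstrapped simulation):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Synthetic covariates: checklist use (0/1) and one illustrative risk
# factor standing in for the study's patient/disease adjustments.
checklist = rng.integers(0, 2, n)
risk_factor = rng.normal(0.0, 1.0, n)

# Simulate 30-day mortality with a protective checklist effect
# (true OR = exp(-0.5) ~ 0.61, close to the reported OR of 0.60).
linpred = -3.0 - 0.5 * checklist + 0.8 * risk_factor
death = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

# Multivariable logistic regression; exponentiated coefficients are
# adjusted odds ratios.
X = sm.add_constant(np.column_stack([checklist, risk_factor]))
fit = sm.Logit(death, X).fit(disp=False)
print("adjusted OR for checklist use:", round(float(np.exp(fit.params[1])), 2))
```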

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
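
    A quick arithmetic check of why the risk adjustment matters: back-calculating approximate colostomy counts from the percentages and denominators above gives a crude low- versus high-HDI odds ratio noticeably larger than the adjusted OR of 3·20 (the counts are reconstructed from the abstract, so treat them as approximate):

```python
# End colostomy counts by HDI tertile, back-calculated from the abstract
# (52.2% of 113 low-HDI patients, 18.9% of 1268 high-HDI patients).
low_yes, low_n = 59, 113
high_yes, high_n = 240, 1268

# Crude (unadjusted) odds ratio, low- vs high-HDI settings.
odds_low = low_yes / (low_n - low_yes)
odds_high = high_yes / (high_n - high_yes)
print(f"crude OR: {odds_low / odds_high:.2f}")  # ~4.7, vs adjusted OR 3.20
```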

    A novel and simple mixture as point-of-use water treatment agent to produce safe drinking water

    Background: People in rural Bangladesh have a poor understanding of the link between use of contaminated surface water and disease. An inexpensive point-of-use water treatment agent was developed to purify surface water. Methods: Surface water was collected from various sources in Bangladesh from February 2007 to January 2008. Microbiological and physicochemical parameters of raw and treated surface water were analysed. Water was treated with a mixture of alum potash, bleaching powder and lime, or with each agent individually. Results: Raw water was contaminated with bacteria, the counts for total coliforms, faecal coliforms and faecal streptococci being 26 431, 14 548 and 240 colony-forming units (cfu) 100 ml(-1), respectively. These counts fell to 0 cfu 100 ml(-1) after treatment with the mixture. The count of artificially introduced Vibrio cholerae was also reduced to 0 cfu 100 ml(-1) after treatment. Treatment of raw water altered the pH from 6.90 to 6.87, turbidity from 21.61 to 3.55 nephelometric turbidity units (NTU), residual chlorine from 0 to 0.09 mg litre(-1), conductivity from 124.03 to 229.96 µS cm(-1), and total dissolved solids from 59.40 to 199.25 mg litre(-1). All these post-treatment values were within the range recommended by the WHO as acceptable for drinking water. Conclusion: The mixture of alum potash, bleaching powder and lime described can be safely used to disinfect contaminated surface water to make it suitable for drinking and other household purposes in Bangladesh.
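
    A small sketch of the kind of check the abstract describes, comparing the post-treatment values against acceptability ranges (the threshold ranges below are rough illustrative stand-ins, not verbatim WHO guideline values):

```python
# Post-treatment values reported in the abstract.
treated = {
    "pH": 6.87,
    "turbidity_NTU": 3.55,
    "residual_chlorine_mg_per_L": 0.09,
    "total_dissolved_solids_mg_per_L": 199.25,
}

# Approximate (min, max) acceptability ranges; illustrative stand-ins
# only, not authoritative WHO limits.
limits = {
    "pH": (6.5, 8.5),
    "turbidity_NTU": (0.0, 5.0),
    "residual_chlorine_mg_per_L": (0.0, 5.0),
    "total_dissolved_solids_mg_per_L": (0.0, 1000.0),
}

for param, value in treated.items():
    lo, hi = limits[param]
    status = "within" if lo <= value <= hi else "outside"
    print(f"{param}: {value} ({status} {lo}-{hi})")
```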

    Outbreak of Mass Sociogenic Illness in a School Feeding Program in Northwest Bangladesh, 2010

    BACKGROUND: In 2010, an acute illness outbreak was reported in school students eating high-energy biscuits supplied by the school feeding programme in northwest Bangladesh. We investigated this outbreak to describe the illness in terms of person, place and time, develop the timeline of events, and determine the cause and community perceptions regarding the outbreak. METHODS: We defined case-patients as students from affected schools reporting any two symptoms among abdominal pain, heartburn, bitter taste, and headache after eating biscuits on the day of illness. We conducted in-depth interviews and group discussions with students, teachers, parents and community members to explore symptoms, exposures, and community perceptions. We conducted a questionnaire survey among case-patients to determine their symptoms and ascertain food items eaten in the 12 hours before illness onset, and carried out microbiological and environmental investigations. RESULTS: Among 142 students seeking hospital care, 44 students from four schools qualified as case-patients. Of these, we surveyed 30, who had a mean age of 9 years; 70% (21/30) were female. Predominant symptoms included abdominal pain (93%), heartburn (90%), and bitter taste (57%). All students recovered within a few hours. No pathogenic Vibrio cholerae, Shigella or Salmonella spp. were isolated from collected stool samples. We found no rancid biscuits in schools or storage sites. The female index case perceived the unusually darker packet label as a "devil's deed" that made the biscuits poisonous. Many students, parents and community members reported concerns about rumours of students dying from biscuit poisoning. CONCLUSIONS: Rapid onset followed by rapid recovery of symptoms, female preponderance, and inconsistent physical, microbiological and environmental findings suggested mass sociogenic illness rather than a foodborne or toxic cause. Rumours of student deaths apparently heightened community anxiety and propagated this outbreak. Sharing investigation results and reassuring students and parents through health communication campaigns could limit similar future outbreaks and help retain beneficiaries' trust in nutrition supplementation initiatives.
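
    The case definition in the methods is mechanical enough to express directly. A minimal sketch (the function and symptom encoding are hypothetical illustrations of the stated definition, not the investigators' tooling):

```python
# Symptoms named in the case definition.
SYMPTOMS = {"abdominal pain", "heartburn", "bitter taste", "headache"}

def is_case(reported: set[str], ate_biscuits_on_day_of_illness: bool) -> bool:
    """At least two of the listed symptoms after eating biscuits on the
    day of illness, per the abstract's case definition."""
    return ate_biscuits_on_day_of_illness and len(reported & SYMPTOMS) >= 2

print(is_case({"abdominal pain", "headache"}, True))  # True
print(is_case({"heartburn"}, True))                   # False
```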

    Hygiene intervention reduces contamination of weaning food in Bangladesh

    Objective This study was conducted to measure the impact of a hygiene intervention on the contamination of weaning food in Bangladesh. Methods Sixty households were selected: 30 study and 30 control households. Samples of weaning food were collected from all 60 households at baseline and examined for faecal coliforms (FC), faecal streptococci (FS) and Clostridium perfringens (CP) following standard procedures. After cooking, food samples were collected on three occasions before feeding. Following Hazard Analysis Critical Control Point (HACCP) procedures, critical control points were determined. The mothers in the 30 study households were then trained for 4 weeks in how to attain the control point conditions, after which food samples were again collected and analysed. Results At baseline, weaning foods from study and control households were heavily contaminated with FC and FS. The FC and FS counts were 1.84 log10 and 1.92 log10 colony-forming units (cfu)/g, respectively, in the study households, and 0.86 log10 and 1.33 log10 cfu/g, respectively, in the control households in the first feeding. After the intervention, the FC and FS counts in study households had dropped to 0.10 log10 and 0.09 log10 cfu/g, respectively, a statistically significant reduction (P < 0.001). Monitoring the sustainability of the behaviour change after 3 months showed that the mothers were maintaining food hygiene. Conclusions A hygiene intervention following the HACCP approach significantly reduced weaning food contamination. Awareness building among mothers about weaning food hygiene could be an important intervention for preventing weaning food-related diarrhoea in Bangladesh.
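
    To make the reported log-scale drop concrete, a short calculation converting the abstract's faecal coliform figures into a fold reduction:

```python
# Baseline and post-intervention FC counts (log10 cfu/g) in study
# households, as reported in the abstract.
baseline_fc = 1.84
post_fc = 0.10

reduction_log = baseline_fc - post_fc
fold = 10 ** reduction_log
print(f"FC reduction: {reduction_log:.2f} log10 (~{fold:.0f}-fold)")  # ~55-fold
```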