
    Antidiarrhoeal Activity of Leaf Extract of Moringa Oleifera In Experimentally Induced Diarrhoea In Rats

    To evaluate the antidiarrhoeal activity of the hydroalcoholic extract of Moringa oleifera leaves, the extract was tested in rodent models of diarrhoea: castor oil- and magnesium sulfate-induced diarrhoea, enteropooling induced by the administration of castor oil and PGE2, and the charcoal meal test of gastrointestinal motility. Acute toxicity and phytochemical constituents were also evaluated using standardized methods. The results of the present study indicate that the hydroalcoholic extract of Moringa oleifera leaves provided significant protection against diarrhoea induced experimentally by castor oil and magnesium sulfate, as evidenced by a decrease in the frequency and weight of stools after 4 hours with respect to control. The extract also prevented the enteropooling induced by castor oil and PGE2 at all doses tested. Acute toxicity studies indicated that the extract is safe up to 2500 mg/kg. Although the antidiarrhoeal activity was not ascribed to any particular phytochemical, general tests indicated the presence of flavonoids and tannins, which have been reported to produce antidiarrhoeal activity. These results show that Moringa oleifera leaves possess antidiarrhoeal properties, mediated through inhibition of hypersecretion and gastrointestinal motility, that substantiate their use in the treatment of diarrhoea in traditional medicine or folklore use. Keywords: enteropooling, hypersecretion, Moringa oleifera

    Use of traditional cooking fuels and the risk of young adult cataract in rural Bangladesh: a hospital-based case-control study

    Background: This study aimed to investigate the independent relationship between the use of various traditional biomass cooking fuels and the occurrence of cataract in young adults in rural Bangladesh. Methods: A hospital-based, age- and sex-matched case-control study incorporating two control groups was conducted. Cases were cataract patients aged 18 to 49 years, diagnosed on the basis of any opacity of the crystalline lens or its capsule and visual acuity poorer than 6/18 on the LogMAR visual acuity chart in either eye, or who had a pseudophakic lens as a result of cataract surgery within the previous 5 years. Non-eye-disease (NE) controls were selected from patients from ENT or Orthopaedics departments, and non-cataract eye-disease (NC) controls from the Ophthalmology department. Data pertaining to history of exposure to various cooking fuels and to established risk factors for cataract were obtained by face-to-face interview and analyzed using conditional logistic regression. Results: Clean fuels were used by only 4% of subjects. A majority of males (64-80% depending on group) had never cooked, while the rest had used biomass cooking fuels, mainly wood/dry leaves, with only 6 having used rice straw and/or cow dung. All females of each group had used wood/dry leaves for cooking, and close to half had also used rice straw and/or cow dung. Among females, after controlling for family history of cataract and education and combining the two control groups, case status was shown to be significantly related to lifetime exposure to rice straw, fitted as a trend variable coded as never, ≤ median of all exposed, > median of all exposed (OR = 1.52, 95% CI 1.04-2.22), but not to lifetime exposure to wood/dry leaves. Case status among females showed an inverse association with ever use of cow dung as a cooking fuel (OR 0.43, 95% CI 0.22-0.81). Conclusions: In this population, where cooking is almost exclusively done using biomass fuels, cases of young adult cataract among females were more likely to have had an increased lifetime exposure to cooking with rice straw fuel and not to have cooked using cow dung fuel. These apparent associations could be the result of uncontrolled confounding, for instance by wealth; their nature therefore needs to be further investigated.

    Ambulatory Multi-Drug Resistant Tuberculosis Treatment Outcomes in a Cohort of HIV-Infected Patients in a Slum Setting in Mumbai, India

    Background: India carries one quarter of the global burden of multi-drug resistant TB (MDR-TB) and has an estimated 2.5 million people living with HIV. Despite this reality, provision of treatment for MDR-TB is extremely limited, particularly for HIV-infected individuals. Médecins Sans Frontières (MSF) has been treating HIV-infected MDR-TB patients in Mumbai since May 2007. This is the first report of treatment outcomes among HIV-infected MDR-TB patients in India. Methods: HIV-infected patients with suspected MDR-TB were referred to the MSF-clinic by public Antiretroviral Therapy (ART) Centers or by a network of community non-governmental organizations. Patients were initiated on either empiric or individualized second-line TB-treatment as per WHO recommendations. MDR-TB treatment was given on an ambulatory basis and under directly observed therapy using a decentralized network of providers. Patients not already receiving ART were started on treatment within two months of initiating MDR-TB treatment. Results: Between May 2007 and May 2011, 71 HIV-infected patients were suspected to have MDR-TB, and 58 were initiated on treatment. MDR-TB was confirmed in 45 (78%), of which 18 (40%) were resistant to ofloxacin. Final treatment outcomes were available for 23 patients; 11 (48%) were successfully treated, 4 (17%) died, 6 (26%) defaulted, and 2 (9%) failed treatment. Overall, among 58 patients on treatment, 13 (22%) were successfully treated, 13 (22%) died, 7 (12%) defaulted, two (3%) failed treatment, and 23 (40%) were alive and still on treatment at the end of the observation period. Twenty-six patients (45%) experienced moderate to severe adverse events, requiring modification of the regimen in 12 (20%). Overall, 20 (28%) of the 71 patients with MDR-TB died, including 7 not initiated on treatment. 
Conclusions: Despite high fluoroquinolone resistance and extensive prior second-line treatment, encouraging results are being achieved in an ambulatory MDR-TB program in a slum setting in India. Rapid scale-up of both ART and second-line treatment for MDR-TB is needed to ensure survival of co-infected patients and mitigate this growing epidemic.

    Big data-driven fuzzy cognitive map for prioritising IT service procurement in the public sector

    The prevalence of big data is starting to spread across the public and private sectors; however, an impediment to its widespread adoption is the lack of appropriate big data analytics (BDA) and of the skills needed to exploit the full potential of big data availability. In this paper, we propose a novel BDA to address this gap, using a fuzzy cognitive map (FCM) approach to enhance decision-making and thus prioritise IT service procurement in the public sector. This is achieved through the development of decision models that capture the strengths of both data analytics and the established intuitive qualitative approach. By taking advantage of both data analytics and FCM, the proposed approach combines the strength of data-driven decision-making with intuitive model-driven decision modelling. The approach is then validated on a decision-making case concerning IT service procurement in the public sector, a fundamental step in supplying IT infrastructure for the public in a regional government of the Russian Federation. The analysis results for the given decision-making problem were evaluated by decision makers and e-government experts to confirm the applicability of the proposed BDA, demonstrating the value of this approach in contributing towards robust public decision-making regarding IT service procurement. EU FP7 project Policy Compass (Project No. 612133)
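    The iterative update rule at the heart of a fuzzy cognitive map is standard, even though the abstract does not spell it out. The minimal sketch below (not the authors' implementation; the three concepts, the weight values, and the function names are hypothetical) shows how concept activations are propagated through a signed influence matrix and squashed with a sigmoid until they settle at a fixed point:

    ```python
    import numpy as np

    def fcm_step(state, weights):
        """One FCM update: each concept's new activation is the sigmoid of its
        current value plus the weighted influence of all other concepts."""
        return 1.0 / (1.0 + np.exp(-(state + weights.T @ state)))

    def fcm_run(state, weights, iters=100, tol=1e-5):
        """Iterate the map until the concept activations converge."""
        for _ in range(iters):
            new_state = fcm_step(state, weights)
            if np.max(np.abs(new_state - state)) < tol:
                return new_state
            state = new_state
        return state

    # Hypothetical 3-concept map: cost, service quality, procurement priority.
    # weights[i][j] is the causal influence of concept i on concept j.
    W = np.array([
        [0.0, -0.4, -0.6],   # higher cost depresses quality and priority
        [0.0,  0.0,  0.7],   # higher quality raises priority
        [0.0,  0.0,  0.0],   # priority is an output concept
    ])
    priority_vector = fcm_run(np.array([0.8, 0.5, 0.5]), W)
    ```

    In a data-driven FCM of the kind the paper proposes, the weight matrix would be learned or tuned from data rather than fixed by hand as above; the converged activations are then compared across candidate procurement options to rank them.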

    Discrimination between two different grades of human glioma based on blood vessel infrared spectral imaging

    Gliomas are brain tumours classified into four grades of increasing malignancy from I to IV. The development and progression of malignant glioma depend largely on tumour vascularization. Owing to their tissue heterogeneity, glioma cases can be difficult to classify into a specific grade using the gold standard of histological observation, hence the need for a quantitative and reliable analytical method for accurately grading the disease. Previous work focused specifically on the study of vascularization by Fourier transform infrared (FTIR) spectroscopy, proving this method to be a way forward for detecting biochemical changes in tumour tissue that are not detectable by visual techniques. In this project, we employed FTIR imaging using a focal plane array (FPA) detector and a globar source to analyse large areas of glioma tumour tissue sections via molecular fingerprinting, with the aim of helping to define markers of tumour grade. Unsupervised multivariate analysis (hierarchical cluster analysis and principal component analysis) of blood vessel spectral data, retrieved from the FPA images, revealed the fine structure of the borderline between two areas identified by a pathologist as grades III and IV. Spectroscopic indicators were found capable of discriminating different areas in the tumour tissue and are proposed as biomolecular markers for potential future use in grading gliomas. Graphical Abstract: Infrared imaging of glioma blood vessels provides a means to revise the pathologists' line of demarcation separating grade III (GIII) from grade IV (GIV) parts.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23 95% CI 0.11-0.44) and SSI (OR 0.21 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    Unravelling the relationship between animal growth and immune response during micro-parasitic infections

    Background: Host genetic potential for growth and disease resistance, as well as nutrition, are known to affect the responses of individuals challenged with micro-parasites, but their interactive effects are difficult to predict from experimental studies alone. Methodology/Principal Findings: Here, a mathematical model is proposed to explore the hypothesis that a host's response to pathogen challenge largely depends on the interaction between the host's genetic capacities for growth or disease resistance and the nutritional environment. As might be expected, the model predicts that if nutritional availability is high, hosts with higher growth capacities will also grow faster under micro-parasitic challenge, and more resistant animals will exhibit a more effective immune response; growth capacity has little effect on immune response, and resistance capacity has little effect on achieved growth. However, the influence of host genetics on phenotypic performance changes drastically if nutrient availability is scarce. In this case, achieved growth and immune response depend simultaneously on the capacities for both growth and disease resistance: a higher growth capacity (achieved e.g. through genetic selection) would be detrimental to the animal's ability to cope with pathogens, and greater resistance may reduce growth in the short term. Significance: Our model can thus explain contradictory outcomes of genetic selection observed in experimental studies and provides the necessary biological background for understanding the influence of selection and/or changes in the nutritional environment on phenotypic growth and immune response. © 2009 Doeschl-Wilson et al.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone