
    Public health assessment of Kenyan ASGM communities using multi-element biomonitoring, dietary and environmental evaluation

    The Kakamega gold belt's natural geological enrichment and artisanal and small-scale gold mining (ASGM) have resulted in food and environmental pollution, human exposure, and subsequent risks to health. This study aimed to characterise exposure pathways and risks among ASGM communities. Human hair, nails, urine, water, and staple food crops were collected and analysed from 144 ASGM miners and 25 members of the surrounding ASGM-associated communities. Exposure to potentially harmful elements (PHEs) occurred predominantly via drinking water from mine shafts, springs, and shallow wells (As>Pb>Cr>Al), with up to 366 µg L−1 arsenic measured in shaft waters consumed by miners. Additional exposure occurred via consumption of locally grown crops (As>Ni>Pb>Cr>Cd>Hg>Al), inhalation of Hg vapour and dust, and direct dermal contact with Hg. Urinary elemental concentrations for both ASGM workers and the wider ASGM communities were in nearly all cases above bioequivalents and reference upper thresholds for As, Cr, Hg, Ni, Pb and Sb, with median concentrations of 12.3, 0.4, 1.6, 5.1, 0.7 and 0.15 µg L−1, respectively. Urinary As concentrations showed a strong positive correlation (0.958) with As in drinking water. This study highlights the importance of a multidisciplinary approach integrating environmental, dietary, and public health investigations, to better characterise the hazards and risks associated with ASGM and to better understand the trade-offs between public health and environmental sustainability in ASGM activities. Further research is needed, and the study results have been shared with public health and environmental authorities to inform mitigation efforts.
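The correlation between urinary arsenic and drinking-water arsenic reported above is a Pearson coefficient on paired measurements. A minimal sketch of that calculation, using hypothetical paired values (not the study's data), might look like this:

```python
import numpy as np

# Hypothetical paired measurements (µg/L) for illustration only --
# arsenic in each participant's drinking-water source and in their urine.
water_as = np.array([2.0, 15.0, 40.0, 120.0, 366.0])
urine_as = np.array([1.1, 6.0, 14.0, 45.0, 130.0])

# Pearson correlation coefficient, read off the 2x2 correlation matrix.
r = np.corrcoef(water_as, urine_as)[0, 1]
print(round(r, 3))
```

With roughly proportional values like these, r approaches 1, mirroring the strong water-to-urine relationship the study describes.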

    Evaluation of Promising Malting Barley Varieties Using Agronomic and Quality Traits in Kenya

    Abstract A study to select promising malting barley varieties was conducted at the University of Eldoret and Mau Narok i

    EFFECT OF RIDGING AND INTERCROPPING ON SORGHUM PRODUCTIVITY IN ARID AND SEMI-ARID LANDS OF EASTERN KENYA

    Soil moisture deficit is a key constraint to sorghum (Sorghum bicolor) productivity in arid and semi-arid lands globally. The objective of this study was to determine the effect of ridging and sorghum-bean intercropping (additive system) on soil moisture conservation and sorghum productivity. Sorghum (Gadam) was grown either as a sole crop or intercropped with two bean (Phaseolus vulgaris L.) varieties (KAT X56 and KAT B1), under two types of ridging (open ridges and tied ridges) and a control without ridges, for two years. The study was set up in a split-plot arrangement, in a randomised complete block design, at the Kenya Agricultural and Livestock Research Organization, Kiboko, in 2019 and 2020. There was no significant interaction between ridging and intercropping. Soil moisture content increased by 11-26% due to ridging, and decreased by 11% and 7% due to sorghum-KAT B1 and sorghum-KAT X56 intercropping, respectively. The higher moisture content under ridging was attributed to the formation of basin-like structures, which increased water harvesting and infiltration compared with the no-ridge plots, where surface run-off was predominant. The highest moisture content was attained under sole bean, followed by sole sorghum and then the sorghum/bean intercrop. The decrease in moisture content in sorghum/bean intercrops relative to their respective sole crops was attributed to higher crop density, which reduced crop spacing and triggered competition for available soil moisture. The highest sorghum grain and sorghum equivalent yields were obtained in the ridged plots. Intercropping decreased sorghum grain yield, but increased sorghum equivalent yield (SEY) and Land Equivalent Ratio (LER). 
    The results show that both ridging and intercropping are suitable for higher water use efficiency and land productivity in the ASALs of Kenya.
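The LER and SEY reported above follow standard definitions: LER sums each intercrop yield relative to its sole-crop yield, and SEY converts the companion-crop yield into sorghum terms via the price ratio. A short sketch with hypothetical yields and prices (not the study's data) illustrates both:

```python
# Illustrative LER and SEY for an additive sorghum/bean intercrop.
# All yields (t/ha) and prices are hypothetical, for demonstration only.

def land_equivalent_ratio(inter_sorghum, sole_sorghum, inter_bean, sole_bean):
    """LER = sum of each intercrop yield relative to its sole-crop yield.
    LER > 1 means the intercrop uses land more efficiently than sole crops."""
    return inter_sorghum / sole_sorghum + inter_bean / sole_bean

def sorghum_equivalent_yield(inter_sorghum, inter_bean, price_bean, price_sorghum):
    """SEY expresses the bean yield in sorghum terms via the price ratio."""
    return inter_sorghum + inter_bean * (price_bean / price_sorghum)

ler = land_equivalent_ratio(1.5, 2.0, 0.6, 1.0)       # 0.75 + 0.60 = 1.35
sey = sorghum_equivalent_yield(1.5, 0.6, 90.0, 30.0)  # 1.5 + 0.6 * 3 = 3.3
print(ler, sey)
```

An LER above 1, as in this example, is how intercropping can raise land productivity even while the sorghum grain yield itself falls.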

    Reliability of dried blood spot (DBS) cards in antibody measurement: A systematic review

    Background: Increasingly, vaccine efficacy studies are being recommended in low- and middle-income countries (LMICs), yet facilities are often unavailable to take and store infant blood samples correctly. Dried blood spots (DBS) are useful for collecting blood from infants for diagnostic purposes, especially in low-income settings, as the amount of blood required is minuscule and no refrigeration is required. Little is known about their utility for antibody studies in children. This systematic review investigates the correlation of antibody concentrations against infectious diseases in DBS with those in serum or plasma samples, to inform their use in vaccine clinical trials. Methods and findings: Following PRISMA guidelines, we searched MEDLINE, Embase and the Cochrane Library for relevant studies published between January 1990 and October 2020, with no language restriction, investigating the correlation between antibody concentrations in DBS and serum or plasma samples and the effect of storage temperature on DBS diagnostic performance. We included 40 studies in this systematic review. Antibody concentrations in DBS and serum/plasma samples showed a good pooled correlation (r² = 0.86; range 0.43 to 1.00). Ten studies described a decline in antibody after 28 days at room temperature compared with optimal storage at -20°C, at which antibodies were stable for up to 200 days. There were only five studies of anti-bacterial antibodies. Conclusions: There is a good correlation between antibody concentrations in DBS and serum/plasma samples, supporting the wider use of DBS in vaccine and sero-epidemiological studies, but data on anti-bacterial antibodies are limited. Correct storage of DBS is critical and may be a consideration for longer-term storage.
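A pooled correlation across studies is commonly obtained with the Fisher z-transform: each r is transformed, averaged with inverse-variance weights, and back-transformed. The review's exact pooling method is not stated here, so this is a sketch of the standard approach with hypothetical per-study values:

```python
import math

# Hypothetical (r, n) pairs for three studies -- not the review's data.
studies = [(0.92, 50), (0.85, 30), (0.95, 80)]

# Fisher z-transform each r, weight by n - 3 (inverse variance of z),
# average, then back-transform with tanh to recover the pooled r.
num = sum((n - 3) * math.atanh(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
pooled_r = math.tanh(num / den)
print(round(pooled_r, 3))
```

Because larger studies get more weight, the pooled estimate here sits closest to the r of the biggest study.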

    Using Dried Blood Spots for a Sero-Surveillance Study of Maternally Derived Antibody against Group B Streptococcus.

    Vaccination during pregnancy could protect women and their infants from invasive Group B Streptococcus (GBS) disease. To determine whether neonatal dried blood spots (DBS) can be used to measure the amount of maternally derived antibody that protects infants against invasive GBS disease, a retrospective case-control study was conducted in England between 1 April 2014 and 30 April 2015. The DBS of cases with invasive GBS disease (n = 61) were matched with healthy controls (n = 125). Haematocrit, DBS storage temperature, freeze-thaw cycle, and paired serum/DBS studies were set up to optimise the antibody assessment. Samples were analysed using a multiplex immunoassay, and the results were assessed using parametric and nonparametric tests. Antibody concentrations were stable at haematocrits of up to 50% but declined at 75%. Antibody in DBS stored at room temperature was stable for three months, compared with storage at -20 °C from collection, and degraded rapidly thereafter. Total IgG levels measured in DBS and paired serum showed a good correlation (r² = 0.99). However, owing to suboptimal storage conditions, no difference was found in GBS IgG levels between DBS samples from cases and controls. We have demonstrated a proof of concept that assays utilising DBS to assess GBS serotype-specific antibodies in infants are viable. This method could be used to facilitate future large sero-correlate studies, but DBS samples must be stored at -20 °C for long-term preservation of antibody.

    Simple scoring system to predict in-hospital mortality after surgery for infective endocarditis

    BACKGROUND: Nonspecific scoring systems are used to predict the risk of death after surgery in patients with infective endocarditis (IE). The purpose of the present study was to analyse the risk factors for in-hospital death complicating surgery for IE and to create a mortality risk score based on the results of this analysis. METHODS AND RESULTS: Outcomes of 361 consecutive patients (mean age, 59.1±15.4 years) who had undergone surgery for IE in 8 European cardiac surgery centres were recorded prospectively, and a risk factor analysis (multivariable logistic regression) for in-hospital death was performed. The discriminatory power of the new predictive scoring system was assessed with receiver operating characteristic (ROC) curve analysis, and score validation procedures were carried out. Fifty-six (15.5%) patients died after surgery. BMI >27 kg/m² (odds ratio [OR], 1.79; P=0.049), low estimated glomerular filtration rate, systolic pulmonary artery pressure >55 mm Hg (OR, 1.78; P=0.032), and critical state (OR, 2.37; P=0.017) were independent predictors of in-hospital death. A scoring system was devised to predict in-hospital death after surgery for IE (area under the ROC curve, 0.780; 95% CI, 0.734-0.822). The score performed better than 5 of the 6 scoring systems for in-hospital death after cardiac surgery that were considered. CONCLUSIONS: A simple scoring system based on risk factors for in-hospital death was specifically created to predict mortality risk after surgery in patients with IE.
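The area under the ROC curve used to assess the score's discrimination has a simple probabilistic reading: the chance that a randomly chosen death scores higher than a randomly chosen survivor. A minimal sketch of that computation, with hypothetical point totals rather than the published score, is:

```python
# Hypothetical additive-score totals for patients who died in hospital
# versus those who survived -- for illustration only.

def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a random
    positive (death) outscores a random negative (survivor), ties = 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

died     = [5, 7, 6, 8, 4]   # score points for in-hospital deaths
survived = [1, 2, 3, 2, 4]   # score points for survivors

auc = roc_auc(died, survived)
print(round(auc, 3))
```

An AUC of 0.5 means no discrimination and 1.0 perfect separation; the study's 0.780 sits in the range generally regarded as acceptable-to-good discrimination.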