38 research outputs found

    Child Survival, Poverty and Policy Options from DHS Surveys in Kenya: 1993-2003

This paper analyses multidimensional aspects of child poverty in Kenya. We carry out poverty and inequality comparisons for child survival and use a parametric survival model to explain childhood mortality using DHS data. The poverty comparisons show that children with the lowest probability of survival come from households with the lowest level of assets, and that poverty orderings for child survival by assets are robust to the choice of both the poverty line and the measure of wellbeing. Inequality analysis suggests that there is less inequality in mortality among children facing the highest mortality risk than among better-off children. The survival model results show that child and maternal characteristics and household assets are important correlates of childhood mortality. The results further show that health care services are crucial for child survival. Policy simulations suggest that there is potential for some progress in reducing mortality, but that the ERS and MDG targets cannot be achieved. Keywords: child survival, multidimensional poverty, inequality, stochastic dominance, childhood mortality, asset index, Kenya
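The parametric survival analysis named in this abstract can be illustrated with a minimal, hypothetical sketch: a Weibull accelerated-failure-time regression of age at death (in months) on an asset index and maternal covariates. The data, variable names and the choice of the lifelines library are assumptions for illustration only, not the paper's actual model or dataset.

```python
# Illustrative sketch only: a parametric (Weibull) survival regression for
# child mortality with hypothetical household-asset and maternal covariates.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "asset_index": rng.normal(0.0, 1.0, n),             # household asset score (hypothetical)
    "maternal_education_yrs": rng.integers(0, 13, n),   # years of schooling (hypothetical)
    "facility_delivery": rng.integers(0, 2, n),         # 1 = delivered at a health facility
})
# Synthetic survival times in months, loosely tied to the covariates, censored at 60 months
scale = np.exp(3.0 + 0.3 * df["asset_index"] + 0.05 * df["maternal_education_yrs"])
raw_time = rng.weibull(1.2, n) * scale
df["age_months"] = np.minimum(raw_time, 60.0)
df["died"] = (raw_time < 60.0).astype(int)              # 0 = right-censored at 60 months

aft = WeibullAFTFitter()
aft.fit(df, duration_col="age_months", event_col="died")
aft.print_summary()   # covariate coefficients indicate how each factor shifts survival time
```

The stochastic-dominance poverty comparisons reported in the abstract would be carried out separately on the survival outcomes; this sketch covers only the regression step.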

    Negotiating Constraints to Sport Participation of University Soccer Players

The purpose of this study was to determine the strategies used by male university soccer players to negotiate constraints to sport participation. Selected socio-demographic factors (year of study, parental socio-economic status (SES), family involvement in soccer and birth rank) were correlated with the strategies used to negotiate constraints to sport participation. Data were collected through questionnaires from university soccer players (n = 242) participating in a national university soccer championship. Pearson's product-moment correlation coefficient was used to test hypotheses relating the selected socio-demographic factors to the constraint-negotiation strategies. Findings indicated that the majority of players were either first- or second-born and that soccer was popular in their universities. The main strategies for negotiating constraints were time management and interpersonal coordination. The selected socio-demographic factors had only weak associations with the strategies used to negotiate constraints. The findings have implications for sport administrators in universities, and future researchers should evaluate the association between participation motivation, constraints and constraint-negotiation strategies of university athletes. Keywords: Constraints, negotiation, soccer, university
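As a rough illustration of the correlation analysis named above (Pearson's product-moment correlation between a socio-demographic factor and a negotiation-strategy score), the sketch below uses synthetic data; the variable names, scales and effect sizes are invented and only the sample size follows the abstract.

```python
# Minimal sketch of a Pearson product-moment correlation test; data are hypothetical.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
year_of_study = rng.integers(1, 5, 242)               # e.g. 1st-4th year (n = 242 players)
time_management_score = rng.normal(3.5, 0.8, 242)     # Likert-scale mean score, hypothetical

r, p_value = pearsonr(year_of_study, time_management_score)
print(f"r = {r:.2f}, p = {p_value:.3f}")   # a weak r would be consistent with the reported findings
```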

    Changing use of surgical antibiotic prophylaxis in Thika Hospital, Kenya: a quality improvement intervention with an interrupted time series design.

INTRODUCTION: In low-income countries, surgical site infection (SSI) is a common form of hospital-acquired infection. Antibiotic prophylaxis is an effective method of preventing these infections if given immediately before the start of surgery. Although several studies in Africa have compared pre-operative versus post-operative prophylaxis, there are no studies describing the implementation of policies to improve prescribing of surgical antibiotic prophylaxis in African hospitals. METHODS: We conducted SSI surveillance at a typical government hospital in Kenya over a 16-month period between August 2010 and December 2011, using standard definitions of SSI and of the extent of contamination of surgical wounds. As an intervention, we developed a hospital policy that advised pre-operative antibiotic prophylaxis and discouraged extended post-operative antibiotic use. We measured process, outcome and balancing effects of this intervention using an interrupted time series design. RESULTS: From a starting point of near-exclusive post-operative antibiotic use, after policy introduction in February 2011 there was rapid adoption of pre-operative antibiotic prophylaxis (60% of operations at 1 week; 98% at 6 weeks) and a substantial decrease in the use of post-operative antibiotics (40% of operations at 1 week; 10% at 6 weeks) in Clean and Clean-Contaminated surgery. There was no immediate step-change in the risk of SSI, but overall there appeared to be a moderate reduction in the risk of superficial SSI across all levels of wound contamination. There were marked reductions in the costs associated with antibiotic use, in the number of intravenous injections performed and in the nursing time spent administering them. CONCLUSION: Implementation of a locally developed policy on surgical antibiotic prophylaxis is an achievable quality improvement target for hospitals in low-income countries and can lead to substantial benefits for individual patients and the institution.
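An interrupted time series evaluation of this kind is commonly analysed as a segmented regression with terms for the pre-existing trend, an immediate level change at policy introduction and a post-policy slope change. The sketch below is a hedged illustration of that design; the weekly series, the change point and the column names are hypothetical placeholders, not the study's data.

```python
# Illustrative segmented regression for an interrupted time series:
# pre-existing trend + level change at policy introduction + post-policy slope change.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
weeks = np.arange(1, 73)                              # ~16 months of weekly observations
policy = (weeks >= 27).astype(int)                    # policy introduced at week 27 (assumed)
time_since_policy = np.where(policy == 1, weeks - 27, 0)

# Synthetic outcome: % of operations receiving pre-operative prophylaxis each week
preop_pct = 5 + policy * 55 + time_since_policy * 0.5 + rng.normal(0, 5, weeks.size)
df = pd.DataFrame({"preop_pct": preop_pct, "week": weeks,
                   "policy": policy, "time_since_policy": time_since_policy})

# 'policy' captures the immediate step change; 'time_since_policy' the change in slope
its_model = smf.ols("preop_pct ~ week + policy + time_since_policy", data=df).fit()
print(its_model.summary())
```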

    Modelling the Protective Efficacy of Alternative Delivery Schedules for Intermittent Preventive Treatment of Malaria in Infants and Children

BACKGROUND: Intermittent preventive treatment in infants (IPTi) with sulfadoxine-pyrimethamine (SP) is recommended by WHO where malaria incidence in infancy is high and SP resistance is low. The current delivery strategy is via routine Expanded Program on Immunisation contacts during infancy (EPI-IPTi). However, improvements to this approach may be possible where malaria transmission is seasonal, or where the malaria burden lies mainly outside infancy. METHODS AND FINDINGS: A mathematical model was developed to estimate the protective efficacy (PE) of IPT against clinical malaria in children aged 2-24 months, using entomological and epidemiological data from an EPI-IPTi trial in Navrongo, Ghana, to parameterise the model. The protection achieved by seasonally targeted IPT in infants (sIPTi), by seasonal IPT in children (sIPTc) and by case management with long-acting artemisinin combination therapies (LA-ACTs) was predicted for Navrongo and for sites with different transmission intensity and seasonality. In Navrongo, the predicted PE of sIPTi was 26% by 24 months of age, compared with 16% for EPI-IPTi. sIPTc given to all children under 2 years would provide a PE of 52% by 24 months of age. Seasonally targeted IPT retained its advantages across a range of transmission patterns. Under certain circumstances, LA-ACTs for case management may provide protection similar to that of EPI-IPTi. However, EPI-IPTi or sIPT combined with LA-ACTs would be substantially more protective than either strategy used alone. CONCLUSION: Delivery of IPT to infants via the EPI is sub-optimal because individuals are not protected by IPT at the time of highest malaria risk, and because older children are not protected. Alternatives to EPI-based delivery are needed where transmission varies seasonally or the malaria burden extends beyond infancy. Long-acting ACTs may also produce important reductions in malaria incidence. However, delivery systems must be developed to ensure that both forms of chemoprevention reach the individuals who are most exposed to malaria.
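The PE figures quoted above follow the usual definition, PE = 1 - (incidence under the intervention / incidence under the comparison strategy); the paper's transmission model is far more detailed than this. A minimal sketch, with made-up incidence rates chosen only so the output takes the same form as the reported 26%:

```python
# Minimal sketch of the protective efficacy (PE) summary a model of this kind produces:
# PE = 1 - (incidence with the intervention / incidence without it).
def protective_efficacy(incidence_intervention: float, incidence_comparison: float) -> float:
    """Return PE as a fraction, e.g. 0.26 means 26% fewer clinical episodes."""
    return 1.0 - incidence_intervention / incidence_comparison

# Hypothetical clinical-episode rates per child by 24 months of age
print(f"PE = {protective_efficacy(1.48, 2.00):.0%}")   # prints 'PE = 26%' (illustrative numbers only)
```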

    Patterns of anti-malarial drug treatment among pregnant women in Uganda

BACKGROUND: Prompt use of an effective anti-malarial drug is essential for controlling malaria and its adverse effects in pregnancy. The World Health Organization recommends an artemisinin-based combination therapy as the first-line treatment for uncomplicated malaria in the second and third trimesters of pregnancy. The study objective was to determine the degree to which presumed episodes of uncomplicated symptomatic malaria in pregnancy were treated with a recommended anti-malarial regimen in a region of Uganda. METHODS: Using a population-based random sample, we interviewed women living in Jinja, Uganda who had been pregnant in the past year. RESULTS: Malaria during the index pregnancy was self-reported by 67% (n = 334) of the 500 participants. Among the 637 self-reported episodes of malaria, an anti-malarial drug was used for treatment in 85% of episodes. Use of a currently recommended treatment in the first trimester was uncommon (5.6%). A contraindicated anti-malarial drug (sulphadoxine-pyrimethamine and/or artemether-lumefantrine) was involved in 70% of first-trimester episodes. Recommended anti-malarials were used according to the guidelines in only 30.1% of all second- and third-trimester episodes. CONCLUSIONS: Self-reported malaria was extremely common in this population, and adherence to treatment guidelines for the management of malaria in pregnancy was poor. Use of artemether-lumefantrine in combination with non-recommended anti-malarials was common practice. Overuse of anti-malarial drugs, especially those that are no longer recommended, undermines malaria control efforts by fueling the spread of drug resistance and delaying appropriate treatment of non-malarial febrile illnesses. Improved diagnostic capacity is essential to improving the management of malaria-like symptoms during pregnancy and the appropriate use of currently available anti-malarials.

    Children’s and adolescents’ rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity

Animal-source foods (ASF) provide nutrition for children's and adolescents' physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world's child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
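A heavily simplified sketch of a Bayesian hierarchical intake model, in the spirit of (but not equivalent to) the analysis above: country-level mean ASF servings per day are partially pooled within regions. PyMC is assumed as the modelling library, and all data, group structures and priors are synthetic placeholders rather than the Global Dietary Database model.

```python
# Simplified hierarchical-model sketch: country mean intake partially pooled within regions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
n_regions, n_countries, obs_per_country = 4, 20, 30
region_of_country = rng.integers(0, n_regions, n_countries)
country_of_obs = np.repeat(np.arange(n_countries), obs_per_country)
observed_intake = rng.normal(1.9, 0.6, country_of_obs.size)      # servings/day, synthetic

with pm.Model():
    region_mu = pm.Normal("region_mu", mu=2.0, sigma=1.0, shape=n_regions)
    country_sd = pm.HalfNormal("country_sd", sigma=0.5)
    country_mu = pm.Normal("country_mu", mu=region_mu[region_of_country],
                           sigma=country_sd, shape=n_countries)
    obs_sd = pm.HalfNormal("obs_sd", sigma=0.5)
    pm.Normal("intake", mu=country_mu[country_of_obs], sigma=obs_sd,
              observed=observed_intake)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)  # posterior draws
```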

    Incident type 2 diabetes attributable to suboptimal diet in 184 countries

The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to the direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, an estimated 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases, representing 70.3% (68.8–71.8%) of new cases globally, were attributable to suboptimal intake of these dietary factors. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and in Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D burdens were generally larger among urban than rural residents and among more versus less educated individuals, except in high-income countries and in central and eastern Europe and central Asia, where burdens were larger among rural residents and less educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
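Attributable-burden estimates of this kind rest on population attributable fraction (PAF) style calculations. The sketch below shows Levin's classic PAF formula with hypothetical exposure prevalence, relative risk and case counts; the paper's actual comparative risk assessment is considerably more detailed (for example, it models graded intake distributions and body weight mediation).

```python
# Sketch of the population attributable fraction (PAF) logic behind an
# attributable-burden estimate; all numbers here are hypothetical placeholders.
def paf(exposure_prevalence: float, relative_risk: float) -> float:
    """Levin's formula: fraction of incident cases attributable to the exposure."""
    excess = exposure_prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. 60% of a population below optimal whole-grain intake with RR = 1.4 for T2D
fraction = paf(0.60, 1.4)
attributable = fraction * 1_000_000        # out of 1,000,000 hypothetical incident cases
print(f"PAF = {fraction:.1%}; attributable cases = {attributable:,.0f}")
```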

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

INTRODUCTION Investment in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing in Africa over the past year has led to a massive increase in the number of sequences generated to track the pandemic on the continent, which to date exceeds 100,000. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing, and that this is coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated continual updating of primer sets; as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta co-circulating in distinct spatial patterns during the second wave, and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta and Omicron variants and subvariants, both between Africa and the rest of the world and within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance the limitations imposed by low testing proportions. We also highlight the early-warning capacity that genomic surveillance in Africa has had for the rest of the world through the detection of new lineages and variants, most recently the characterization of various Omicron subvariants. CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent, but also because it can serve as a platform to address the many emerging and re-emerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

Food system value-chain adaptability - can new opportunities increase food security and food safety in Kibera? Linking aquaculture to urban food systems: Workshop Report - Discussing new opportunities with the aquaculture value-chains between Nyeri and Kibera

New business opportunities for aquaculture farmers in Nyeri and fish vendors in Kibera, with potential for real impact on food security in Kibera