
    Cross-sectional study of the burden of vector-borne and soil-transmitted polyparasitism in rural communities of Coast Province, Kenya.

    BACKGROUND: In coastal Kenya, infection of human populations by a variety of parasites often results in co-infection, or polyparasitism. These parasitic infections, separately and in conjunction, are a major cause of chronic clinical and sub-clinical human disease and exert a long-term toll on the economic welfare of affected populations. Risk factors for these infections are often shared and overlap in space, resulting in interrelated patterns of transmission that need to be considered at different spatial scales. Integration of novel quantitative tools and qualitative approaches is needed to analyze transmission dynamics and design effective interventions. METHODOLOGY: Our study focused on detecting spatial and demographic patterns of single- and co-infection in six villages in coastal Kenya. Individual- and household-level data were acquired using cross-sectional, socio-economic, and entomological surveys. Generalized additive models (GAMs and GAMMs) were applied to determine risk factors for infection and co-infection. Spatial analysis techniques were used to detect local clusters of single and multiple infections. PRINCIPAL FINDINGS: Of the 5,713 tested individuals, more than 50% were infected with at least one parasite and nearly 20% showed co-infections. Infections with Schistosoma haematobium (26.0%) and hookworm (21.4%) were most common, as was co-infection by both (6.3%). Single and co-infections shared similar environmental and socio-demographic risk factors. The prevalence of single and multiple infections was heterogeneous among and within communities. Clusters of single and co-infections were detected in each village, often overlapped spatially, and were associated with lower socioeconomic status (SES) and household crowding. CONCLUSION: Parasitic infections and co-infections are widespread in coastal Kenya, and their distributions are heterogeneous across landscapes but interrelated. We highlighted how shared risk factors are associated with a high prevalence of single infections and can result in spatial clustering of co-infections. Spatial heterogeneity and synergistic risk factors for polyparasitism need to be considered when designing surveillance and intervention strategies.
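
    To illustrate the modelling approach named above, the sketch below fits a logistic GAM for infection risk using the pygam library. This is not the authors' code: the predictors, simulated data, and library choice are all the editor's assumptions.

    ```python
    # A minimal sketch (not the authors' code) of a logistic GAM for
    # infection risk. Predictors and data are invented for illustration.
    import numpy as np
    from pygam import LogisticGAM, s, f

    rng = np.random.default_rng(0)
    n = 500
    age = rng.uniform(1, 80, n)        # years
    crowding = rng.uniform(1, 10, n)   # persons per room
    ses = rng.integers(0, 3, n)        # SES tertile, coded 0-2 (factor)

    # Simulated outcome: risk rises with crowding, falls with SES.
    logit = -2 + 0.3 * crowding - 0.5 * ses + 0.02 * age
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([age, crowding, ses])
    # Smooth terms for age and crowding, a factor term for SES tertile.
    gam = LogisticGAM(s(0) + s(1) + f(2)).fit(X, y)
    gam.summary()  # partial effects and approximate significance
    ```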

    Measuring fitness of Kenyan children with polyparasitic infections using the 20-meter shuttle run test as a morbidity metric.

    BACKGROUND: To date, there has been no standardized approach to the assessment of aerobic fitness among children who harbor parasites. In quantifying the disability associated with individual or multiple chronic infections, accurate measures of physical fitness are important metrics. This is because exercise intolerance, as seen with anemia and many other chronic disorders, reflects the body's inability to maintain an adequate oxygen supply (VO₂ max) to the motor tissues, which is frequently linked to reduced quality of life in terms of physical and job performance. The objective of our study was to examine the associations between polyparasitism, anemia, and reduced fitness in a high-risk Kenyan population using a novel implementation of the 20-meter shuttle run test (20mSRT), a well-standardized, low-technology physical fitness test. METHODOLOGY/PRINCIPAL FINDINGS: Four villages in coastal Kenya were surveyed during 2009-2010. Children aged 5-18 years were tested for infection with Schistosoma haematobium (Sh), malaria, filaria, and geohelminths by standard methods. After anthropometric and hemoglobin testing, fitness was assessed with the 20mSRT. The 20mSRT proved easy to perform, requiring only minimal staff training. Parasitology revealed a high prevalence of single and multiple parasitic infections in all villages, with Sh being the most common (25-62%). Anemia prevalence was 45-58%. Using multiply-adjusted linear modeling that accounted for household clustering, decreased aerobic capacity was significantly associated with anemia, stunting, and wasting, with some gender differences. CONCLUSIONS/SIGNIFICANCE: The 20mSRT, which correlates well with VO₂ max, is a highly feasible fitness test for low-resource settings. Our results indicate impaired fitness is common in areas endemic for parasites, where, at least in part, low fitness scores are likely to result from the anemia and stunting associated with chronic infection. The 20mSRT should be used as a common metric to quantify physical fitness and compare sub-clinical disability across many different disorders and community settings.
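
    Converting 20mSRT performance into an estimate of aerobic capacity is commonly done with a published prediction equation. The sketch below uses the Léger et al. (1988) children's equation; the abstract does not state which equation this study applied, so the choice is illustrative only.

    ```python
    # Sketch of a common 20mSRT -> VO2 max conversion; the Leger et al.
    # (1988) equation is a published formula, not necessarily the study's.
    def shuttle_speed_kmh(level: int) -> float:
        """Running speed at a given 20mSRT level (8.5 km/h at level 1,
        +0.5 km/h per level, per the standard Leger protocol)."""
        return 8.0 + 0.5 * level

    def vo2max_leger_children(level: int, age_years: float) -> float:
        """Predicted VO2 max (ml/kg/min) for children aged ~8-19 years."""
        x = shuttle_speed_kmh(level)
        return 31.025 + 3.238 * x - 3.248 * age_years + 0.1536 * age_years * x

    # Example: a 12-year-old reaching level 6 (11.0 km/h) -> ~47.9 ml/kg/min.
    print(round(vo2max_leger_children(6, 12.0), 1))
    ```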

    Oral human papillomavirus (HPV) infection in men who have sex with men: prevalence and lack of anogenital concordance.

    The objective was to estimate the prevalence of detectable oral human papillomavirus (HPV) DNA in HIV-negative men who have sex with men (MSM) attending a sexual health clinic in London, and its concordance with anogenital HPV infection. Such data are important to improve our understanding of the epidemiology of oral HPV and the potential use of vaccines to prevent oropharyngeal cancers.
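
    As a purely hypothetical illustration of how per-participant type-specific concordance between sites might be computed, the helper below compares sets of detected HPV types; the function and the overlap measure are the editor's assumptions, not the study's method.

    ```python
    # Hypothetical per-participant type concordance between two sites.
    def type_concordance(oral: set[str], anogenital: set[str]) -> float:
        """Fraction of all detected HPV types found at both sites
        (Jaccard-style overlap; 0.0 when nothing is detected)."""
        detected = oral | anogenital
        return len(oral & anogenital) / len(detected) if detected else 0.0

    print(type_concordance({"HPV16"}, {"HPV16", "HPV6"}))  # 0.5
    ```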

    Human papillomavirus epidemiology in men who have sex with men: implications for a vaccine programme at sexual health clinics in the UK

    Men who have sex with men (MSM) are at increased risk of human papillomavirus (HPV) infection and related disease. There are two licensed HPV vaccines against the high-risk HPV types HPV16/18, one of which, the quadrivalent vaccine, additionally targets the low-risk types HPV6/11. MSM will not benefit from the UK’s school-based HPV vaccine programme targeting girls. Sexual health clinics (SHCs) are the most feasible setting for vaccinating MSM. This thesis aimed to inform the policy decision on whether to vaccinate MSM attending SHCs in the UK by estimating underlying epidemiological parameters: HPV exposure in the MSM population attending SHCs, expected vaccine coverage, and the effect of HPV16 vaccination on anal cancer incidence. A cross-sectional survey of 522 MSM was conducted at an SHC. Specimens (anal and external genital swabs, urine, oral rinse, and serum) were tested for HPV, and demographic, behavioural, and clinical information was collected (the HPV-MSM-MMC study). A static deterministic cohort model of HPV16 infection and anal cancer in SHC-attending MSM was developed. A substantial burden of HPV infection in MSM could be prevented at SHCs: a third of HPV-MSM-MMC participants were infected with ≥1 quadrivalent-vaccine HPV type, yet none with all four; all therefore had the potential to benefit, at least partially, from vaccination. A further third had evidence of prior exposure (seropositivity or a history of anogenital warts) to quadrivalent-vaccine types, and a final third had no evidence of exposure. A targeted HPV vaccine programme at SHCs would result in ≥50% coverage of the UK’s MSM population. Vaccination at SHCs would efficiently interrupt HPV transmission because SHC attenders represent MSM at high risk of HPV infection. Vaccination against HPV16 was predicted to substantially reduce anal cancer incidence, even without the effect of herd immunity. This thesis provides strong evidence for the effectiveness of an HPV vaccination programme targeting MSM attending SHCs.
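
    A minimal sketch of a static deterministic cohort model in the spirit described above follows. It is not a reconstruction of the thesis model: the rates are invented, the vaccine effect is averaged over the cohort rather than stratified, and, as in a static model, the force of infection is a fixed input with no herd effects.

    ```python
    # Minimal static deterministic cohort sketch (not the thesis model).
    def cohort_infections(years, incidence, clearance, coverage, efficacy):
        """Infected fraction of a closed cohort, year by year.
        Static model: force of infection is an external constant, scaled
        by average vaccine protection; no herd effects."""
        susceptible, infected = 1.0, 0.0
        history = []
        for _ in range(years):
            foi = incidence * (1 - efficacy * coverage)  # effective FOI
            new_inf = foi * susceptible
            cleared = clearance * infected
            susceptible += cleared - new_inf
            infected += new_inf - cleared
            history.append(infected)
        return history

    # Example: 10% annual incidence, 30% annual clearance,
    # 50% coverage, 95% efficacy.
    print(cohort_infections(10, 0.10, 0.30, 0.50, 0.95)[-1])
    ```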

    Improving the effectiveness and efficiency of outpatient services: A scoping review of interventions at the primary-secondary care interface

    Objectives: Variation in patterns of referral from primary care can lead to inappropriate overuse or underuse of specialist resources. Our aim was to review the literature on strategies involving primary care that are designed to improve the effectiveness and efficiency of outpatient services. Methods: A scoping review to update a review published in 2006. We conducted a systematic literature search and qualitative evidence synthesis of studies across five intervention domains: transfer of services from hospital to primary care; relocation of hospital services to primary care; joint working between primary care practitioners and specialists; interventions to change the referral behaviour of primary care practitioners; and interventions to change patient behaviour. Results: The 183 studies published since 2005, taken with the findings of the previous review, suggest that transfer of services from secondary to primary care and strategies aimed at changing the referral behaviour of primary care clinicians can be effective in reducing outpatient referrals and in increasing the appropriateness of referrals. Availability of specialist advice to primary care practitioners by email or phone and use of store-and-forward telemedicine also show potential for reducing outpatient referrals and hence reducing costs. There was little evidence of a beneficial effect of relocation of specialists to primary care, or of joint primary/secondary care management of patients, on outpatient referrals. Across all intervention categories there was little evidence available on cost-effectiveness. Conclusions: There are a number of promising interventions which may improve the effectiveness and efficiency of outpatient services, including making it easier for primary care clinicians and specialists to discuss patients by email or phone. There remain substantial gaps in the evidence, particularly on cost-effectiveness, and new interventions should continue to be evaluated as they are implemented more widely. A move for specialists to work in the community is unlikely to be cost-effective without enhancing primary care clinicians’ skills through education or joint consultations with complex patients. Funded by the National Institute for Health Research (NIHR).

    Factors contributing to high performance of sows in free farrowing systems

    Background: Pressure to abolish farrowing crates is increasing, and producers are faced with decisions about which alternative system to adopt. For sow welfare, well-designed free farrowing systems without close confinement are considered optimal, but producers have concerns about increased piglet mortality, particularly crushing by the sow. Reporting accurate performance figures from commercial farms newly operating such systems could inform the transition process. This study investigated performance on three commercial farms operating four different zero-confinement systems, three of which were newly installed. A total of 3212 litters from 2920 sows were followed from farrowing to weaning over a three-year period, with key performance indicators (KPIs) recorded. Mixed models (LMMs, GLMMs) determined the influence of different factors (e.g. farrowing system, sow parity, management aspects) and litter characteristics on performance, including levels and causes of piglet mortality. Results: Piglet mortality was significantly influenced by farm/system. Live-born mortality ranged from 10.3 to 20.6%, with stillbirths ranging from 2.5 to 5.9%. A larger litter size and higher parity resulted in higher levels of mortality regardless of system. In all systems, crushing was the main cause of piglet mortality (59%), but 31% of sows did not crush any piglets, whilst 26% crushed only one piglet and the remaining sows (43%) crushed two or more piglets. System significantly influenced crushing as a percentage of all deaths, with the system with the smallest spatial footprint (m²) recording the highest levels of crushing. Time from the start of the study influenced mortality, with significant reductions in crushing mortality (by ~4%) over the course of the three-year study. There was a highly significant effect on piglet mortality of the length of time (days) between moving sows into the farrowing accommodation and farrowing (P < 0.001): the less time between moving in and farrowing, the higher the piglet mortality, with total mortality increasing by ~3% for every five fewer days. System effects remained highly significant after adjusting for parity, litter size, and days pre-farrowing. Conclusion: These results from commercial farms demonstrate that even sows that have not been specifically selected for free farrowing are able, in many cases, to perform well in these zero-confinement systems, but that a period of adaptation is to be expected for overall farm performance. There are performance differences between the farms/systems which can be attributed to individual farm/system characteristics (e.g. pen design and management, staff expertise, pig genotypes). Higher-parity sows and those producing very large litters present a greater piglet-mortality challenge in these free farrowing systems (just as they do in crate systems). Management significantly influences performance, and ensuring sows have plenty of time to acclimatise between moving into the farrowing accommodation and giving birth is a critical aspect of improving piglet survival.
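
    A hedged sketch of the kind of mixed model the abstract describes, using statsmodels: the column names and input file are hypothetical, and a linear mixed model (LMM) stands in for the binomial GLMMs the study also fitted.

    ```python
    # Sketch only: hypothetical columns, LMM stand-in for the study's GLMMs.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("litters.csv")  # hypothetical file: one row per litter

    # Random intercept for farm/system; fixed effects follow the abstract:
    # parity, litter size, and days between moving in and farrowing.
    model = smf.mixedlm(
        "total_mortality ~ parity + litter_size + days_prefarrowing",
        data=df,
        groups=df["farm"],
    )
    result = model.fit()
    print(result.summary())
    ```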

    A universal protocol to generate consensus level genome sequences for foot-and-mouth disease virus and other positive-sense polyadenylated RNA viruses using the Illumina MiSeq

    BACKGROUND: Next-Generation Sequencing (NGS) is revolutionizing molecular epidemiology by providing new approaches to undertake whole genome sequencing (WGS) in diagnostic settings for a variety of human and veterinary pathogens. Previous sequencing protocols have been subject to biases such as those encountered during PCR amplification and cell culture, or are restricted by the need for large quantities of starting material. We describe here a simple and robust methodology for the generation of whole genome sequences on the Illumina MiSeq. This protocol is specific for foot-and-mouth disease virus (FMDV) or other polyadenylated RNA viruses and circumvents both the use of PCR and the requirement for large amounts of initial template. RESULTS: The protocol was successfully validated using five FMDV-positive clinical samples from the 2001 epidemic in the United Kingdom, as well as a panel of representative viruses from all seven serotypes. In addition, this protocol was successfully used to recover 94% of an FMDV genome that had previously been identified as cell-culture negative. Genome sequences from three other non-FMDV polyadenylated RNA viruses (EMCV, ERAV, VESV) were also obtained with minor protocol amendments. We calculated that a minimum coverage depth of 22 reads was required to produce an accurate consensus sequence for FMDV O. This was achieved in the five FMDV O/UKG isolates and the type O FMDV from the serotype panel, with the exception of the 5′ genomic termini and the area immediately flanking the poly(C) region. CONCLUSIONS: We have developed a universal WGS method for FMDV and other polyadenylated RNA viruses. This method works successfully from a limited quantity of starting material and eliminates the requirement for genome-specific PCR amplification. This protocol has the potential to generate consensus-level sequences within a routine high-throughput diagnostic environment.
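
    The depth-thresholded consensus call described above can be sketched as follows. The input format and names are hypothetical; a real pipeline would derive per-position base counts from a BAM pileup (e.g. with pysam) rather than from toy dictionaries.

    ```python
    # Simplified consensus calling with the 22-read depth threshold
    # mentioned in the abstract; input format is hypothetical.
    def call_consensus(base_counts, min_depth=22):
        """base_counts: list of dicts mapping base -> read count, one dict
        per genome position. Positions below min_depth are masked 'N'."""
        consensus = []
        for counts in base_counts:
            depth = sum(counts.values())
            if depth < min_depth:
                consensus.append("N")  # too shallow for a reliable call
            else:
                consensus.append(max(counts, key=counts.get))
        return "".join(consensus)

    pileup = [{"A": 30}, {"C": 25, "T": 2}, {"G": 10}]  # toy 3-position pileup
    print(call_consensus(pileup))  # "ACN": last base masked (depth 10 < 22)
    ```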

    Emergence of heat extremes attributable to anthropogenic influences

    Climate scientists have demonstrated that a substantial fraction of the probability of numerous recent extreme events may be attributed to human-induced climate change. However, for temperature extremes occurring over previous decades, it is likely that a fraction of their probability was also attributable to anthropogenic influences. We identify the first record-breaking warm summers and years for which a discernible contribution can be attributed to human influence. We find a significant human contribution to the probability of record-breaking global temperature events as early as the 1930s. Since then, the last 16 record-breaking hot years globally have all had an anthropogenic contribution to their probability of occurrence. Aerosol-induced cooling delays the timing of a significant human contribution to record-breaking events in some regions. Without human-induced climate change, recent hot summers and years would have been very unlikely to occur.
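
    Statements like these rest on the standard event-attribution calculation: comparing an event's exceedance probability in a factual climate (with anthropogenic forcing) against a counterfactual one (without). The sketch below computes the probability ratio and the fraction of attributable risk (FAR); the probabilities are invented for illustration.

    ```python
    # Standard event-attribution quantities; input probabilities invented.
    def attribution(p_factual: float, p_counterfactual: float):
        """Probability ratio and FAR for an extreme-event threshold, given
        its exceedance probability with (factual) and without
        (counterfactual) anthropogenic forcing."""
        pr = p_factual / p_counterfactual
        far = 1.0 - p_counterfactual / p_factual
        return pr, far

    # Example: a record warm year with 10% probability under human
    # influence but 1% without it.
    pr, far = attribution(0.10, 0.01)
    print(pr, far)  # 10.0, 0.9 -> 90% of the probability attributable
    ```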

    P2X receptors: epithelial ion channels and regulators of salt and water transport.

    When the results from electrophysiological studies of renal epithelial cells are combined with data from in vivo tubule microperfusion experiments and immunohistochemical surveys of the nephron, the accumulated evidence suggests that ATP-gated ion channels, the P2X receptors, play a specialized role in the regulation of ion and water movement across the renal tubule and are integral to electrolyte and fluid homeostasis. In this short review, we discuss the concept of P2X receptors as regulators of salt and water salvage pathways, while also acknowledging their accepted role as ATP-gated ion channels.