
    Aeolianite and barrier dune construction spanning the last two glacial-interglacial cycles from the southern Cape coast, South Africa

    The southern Cape region of South Africa has extensive coastal aeolianites and barrier dunes. Whilst previously reported, limited knowledge of their age has precluded an understanding of their relationship with the climatic and sea-level fluctuations that took place during the Late Quaternary. Sedimentological and geomorphological studies combined with an optical dating programme reveal aeolianite development and barrier dune construction spanning at least the last two glacial–interglacial cycles. Aeolianite deposition occurred on the southern Cape coast at ca 67–80, 88–90, 104–128, 160–189 and >200 ka before present. Using this and other published data, coupled with a better understanding of Late Quaternary sea-level fluctuations and palaeocoastline configurations, it is concluded that these depositional phases appear to be controlled by interglacial and subsequent interstadial sea-level high stands. These marine transgressions and regressions allowed onshore carbonate-rich sediment movement and subsequent aeolian reworking to occur at similar points in the landscape on a number of occasions. The lack of carbonates in more recent dunes (Oxygen Isotope Stages 1/2 and 4/5) is attributed not to leaching but to changes in carbonate production in the sediment source area, caused by increased terrigenous material and/or changes in the balance between the warm Agulhas and nutrient-rich Benguela ocean currents.

    Cost effectiveness of an intervention to increase uptake of hepatitis C virus testing and treatment (HepCATT): cluster randomised controlled trial in primary care

    Objective To evaluate the effectiveness and cost effectiveness of a complex intervention in primary care that aims to increase uptake of hepatitis C virus (HCV) case finding and treatment. Design Pragmatic, two armed, practice level, cluster randomised controlled trial and economic evaluation. Setting and participants 45 general practices in South West England (22 randomised to intervention and 23 to control arm). Outcome data were collected from all intervention practices and 21/23 control practices. Total number of flagged patients was 24 473 (about 5% of practice list). Intervention Electronic algorithm and flag on practice systems identifying patients with HCV risk markers (such as history of opioid dependence or HCV tests with no evidence of referral to hepatology), staff educational training in HCV, and practice posters/leaflets to increase patients’ awareness. Flagged patients were invited by letter for an HCV test (with one follow-up) and had on-screen pop-ups to encourage opportunistic testing. The intervention lasted one year, with practices recruited April to December 2016. Main outcome measures Primary outcome: uptake of HCV testing. Secondary outcomes: number of positive HCV tests and yield (proportion HCV positive); HCV treatment assessment at hepatology; cost effectiveness. Results Baseline HCV testing of flagged patients (six months before study start) was 608/13 097 (4.6%) in intervention practices and 380/11 376 (3.3%) in control practices. During the study 2071 (16%) of flagged patients in the intervention practices and 1163 (10%) in control practices were tested for HCV: overall intervention effect as an adjusted rate ratio of 1.59 (95% confidence interval 1.21 to 2.08; P<0.001). HCV antibodies were detected in 129 patients from intervention practices and 51 patients from control practices (adjusted rate ratio 2.24, 1.47 to 3.42) with weak evidence of an increase in yield (6.2% v 4.4%; adjusted risk ratio 1.40, 0.99 to 1.95). 
Referral and assessment increased in intervention practices compared with control practices (adjusted rate ratio 5.78, 1.6 to 21.6) with a risk difference of 1.3 per 1000 and a “number needed to help” of one extra HCV diagnosis, referral, and assessment per 792 (95% confidence interval 558 to 1883) patients flagged. The average cost of HCV case finding was £4.03 (95% confidence interval £2.27 to £5.80) per at risk patient and £3165 per additional patient assessed at hepatology. The incremental cost effectiveness ratio was £6212 per quality adjusted life year (QALY), with 92.5% probability of being below £20 000 per QALY. Conclusion HepCATT had a modest impact but is a low cost intervention that merits optimisation and implementation as part of an NHS strategy to increase HCV testing and treatment.
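The headline figures above follow mechanically from standard definitions; a minimal sketch, assuming only that "number needed to help" is the reciprocal of the absolute risk difference and that the QALY check is a simple threshold comparison (function names are illustrative, not from the trial's analysis code):

```python
def number_needed_to_help(risk_difference):
    """NNH: reciprocal of the absolute risk difference."""
    return 1.0 / risk_difference

def below_threshold(icer, threshold=20_000):
    """NICE-style check: is the cost per QALY below the threshold?"""
    return icer < threshold

# Rounded risk difference from the abstract: ~1.3 extra diagnoses,
# referrals, and assessments per 1000 flagged patients. The paper's
# NNH of 792 reflects unrounded inputs; the rounded figure lands nearby.
nnh = number_needed_to_help(1.3 / 1000)
print(round(nnh))             # 769

# £6212 per QALY against the £20 000 per QALY willingness-to-pay threshold.
print(below_threshold(6212))  # True
```

The gap between 769 and the reported 792 is purely rounding in the published risk difference, not a different formula.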

    A Recombinant Blood-Stage Malaria Vaccine Reduces Plasmodium falciparum Density and Exerts Selective Pressure on Parasite Populations in a Phase 1-2b Trial in Papua New Guinea

    The malaria vaccine Combination B comprises recombinant Plasmodium falciparum ring-infected erythrocyte surface antigen and 2 merozoite surface proteins (MSP1 and MSP2) formulated in oil-based adjuvant. A phase 1-2b double-blind, randomized, placebo-controlled trial in 120 children (5-9 years old) in Papua New Guinea demonstrated a 62% (95% confidence limits: 13%, 84%) reduction in parasite density in children not pretreated with sulfadoxine-pyrimethamine. Vaccinees had a lower prevalence of parasites carrying the MSP2-3D7 allelic form (corresponding to that in the vaccine) and a higher incidence of morbid episodes associated with FC27-type parasites. These results demonstrate functional activity of Combination B against P. falciparum in individuals with previous malaria exposure. The specific effects on parasites with particular msp2 genotypes suggest that the MSP2 component, at least in part, accounted for the activity. The vaccine-induced selection pressure exerted on the parasites and its consequences for morbidity strongly argue for developing vaccines comprising conserved antigens and/or multiple components covering all important allelic types.

    Lack of phenotypic and evolutionary cross-resistance against parasitoids and pathogens in Drosophila melanogaster

    Background When organisms are attacked by multiple natural enemies, the evolution of a resistance mechanism to one natural enemy will be influenced by the degree of cross-resistance to another natural enemy. Cross-resistance can be positive, when a resistance mechanism against one natural enemy also offers resistance to another; or negative, in the form of a trade-off, when an increase in resistance against one natural enemy results in a decrease in resistance against another. Using Drosophila melanogaster, an important model system for the evolution of invertebrate immunity, we test for the existence of cross-resistance against parasites and pathogens, at both a phenotypic and evolutionary level. Methods We used a field strain of D. melanogaster to test whether surviving parasitism by the parasitoid Asobara tabida has an effect on the resistance against Beauveria bassiana, an entomopathogenic fungus; and whether infection with the microsporidian Tubulinosema kingi has an effect on the resistance against A. tabida. We used lines selected for increased resistance to A. tabida to test whether increased parasitoid resistance has an effect on resistance against B. bassiana and T. kingi. We used lines selected for increased tolerance against B. bassiana to test whether increased fungal resistance has an effect on resistance against A. tabida. Results/Conclusions We found no positive cross-resistance or trade-offs in the resistance to parasites and pathogens. This is an important finding, given the use of D. melanogaster as a model system for the evolution of invertebrate immunity. The lack of any cross-resistance to parasites and pathogens, at both the phenotypic and the evolutionary level, suggests that evolution of resistance against one class of natural enemies is largely independent of evolution of resistance against the other.

    Training Hospital Providers in Basic CPR Skills in Botswana: Acquisition, Retention and Impact of Novel Training Techniques

    Objective Globally, one third of deaths each year are from cardiovascular diseases, yet no strong evidence supports any specific method of CPR instruction in a resource-limited setting. We hypothesized that both existing and novel CPR training programs significantly impact skills of hospital-based healthcare providers (HCP) in Botswana. Methods HCP were prospectively randomized to 3 training groups: instructor led, limited instructor with manikin feedback, or self-directed learning. Data was collected prior to training, immediately after and at 3 and 6 months. Excellent CPR was prospectively defined as having at least 4 of 5 characteristics: depth, rate, release, no flow fraction, and no excessive ventilation. GEE was performed to account for within subject correlation. Results Of 214 HCP trained, 40% resuscitate ≥1/month, 28% had previous formal CPR training, and 65% required additional skills remediation to pass using AHA criteria. Excellent CPR skill acquisition was significant (infant: 32% vs. 71%, p < 0.01; adult 28% vs. 48%, p < 0.01). Infant CPR skill retention was significant at 3 (39% vs. 70%, p < 0.01) and 6 months (38% vs. 67%, p < 0.01), and adult CPR skills were retained to 3 months (34% vs. 51%, p = 0.02). On multivariable analysis, low cognitive score and need for skill remediation, but not instruction method, impacted CPR skill performance. Conclusions HCP in resource-limited settings resuscitate frequently, with little CPR training. Using existing training, HCP acquire and retain skills, yet often require remediation. Novel techniques with increased student:instructor ratio and feedback manikins were not different compared to traditional instruction.
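The prospective definition of excellent CPR above ("at least 4 of 5 characteristics") can be sketched as a simple classifier; the criterion names below are assumptions paraphrased from the abstract, not the study's actual scoring code:

```python
# Characteristics listed in the abstract (names paraphrased for code).
CRITERIA = ("depth", "rate", "release", "no_flow_fraction", "no_excess_ventilation")

def is_excellent(results):
    """Classify a CPR attempt as excellent when >= 4 of 5 criteria are met.

    results: dict mapping each criterion name to True/False; missing
    criteria are treated as not met.
    """
    met = sum(bool(results.get(c)) for c in CRITERIA)
    return met >= 4

print(is_excellent({
    "depth": True, "rate": True, "release": True,
    "no_flow_fraction": True, "no_excess_ventilation": False,
}))  # True: 4 of 5 criteria met
```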

    Socioeconomic status and the 25 x 25 risk factors as determinants of premature mortality: a multicohort study and meta-analysis of 1.7 million men and women

    Background In 2011, WHO member states signed up to the 25 x 25 initiative, a plan to cut mortality due to non-communicable diseases by 25% by 2025. However, socioeconomic factors influencing non-communicable diseases have not been included in the plan. In this study, we aimed to compare the contribution of socioeconomic status to mortality and years of life lost with that of the 25 x 25 conventional risk factors. Methods We did a multicohort study and meta-analysis with individual-level data from 48 independent prospective cohort studies with information about socioeconomic status, indexed by occupational position, 25 x 25 risk factors (high alcohol intake, physical inactivity, current smoking, hypertension, diabetes, and obesity), and mortality, for a total population of 1 751 479 (54% women) from seven high-income WHO member countries. We estimated the association of socioeconomic status and the 25 x 25 risk factors with all-cause mortality and cause-specific mortality by calculating minimally adjusted and mutually adjusted hazard ratios (HR) and 95% CIs. We also estimated the population attributable fraction and the years of life lost due to suboptimal risk factors. Findings During 26.6 million person-years at risk (mean follow-up 13.3 years [SD 6.4 years]), 310 277 participants died. HRs for the 25 x 25 risk factors and mortality varied between 1.04 (95% CI 0.98-1.11) for obesity in men and 2.17 (2.06-2.29) for current smoking in men. Participants with low socioeconomic status had greater mortality compared with those with high socioeconomic status (HR 1.42, 95% CI 1.38-1.45 for men; 1.34, 1.28-1.39 for women); this association remained significant in mutually adjusted models that included the 25 x 25 factors (HR 1.26, 1.21-1.32, men and women combined). The population attributable fraction was highest for smoking, followed by physical inactivity then socioeconomic status.
Low socioeconomic status was associated with a 2.1-year reduction in life expectancy between ages 40 and 85 years; the corresponding years of life lost were 0.5 years for high alcohol intake, 0.7 years for obesity, 3.9 years for diabetes, 1.6 years for hypertension, 2.4 years for physical inactivity, and 4.8 years for current smoking. Interpretation Socioeconomic circumstances, in addition to the 25 x 25 factors, should be targeted by local and global health strategies and health risk surveillance to reduce mortality.
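Population attributable fractions like those ranked above are commonly derived with Levin's formula from an exposure prevalence and a relative risk; a minimal sketch, with the caveat that the prevalence value below is invented for illustration and the study's own (mutually adjusted) estimator may differ:

```python
def levin_paf(prevalence, relative_risk):
    """Levin's formula: fraction of deaths attributable to an exposure.

    PAF = p(RR - 1) / (1 + p(RR - 1)), where p is the exposure
    prevalence and RR the relative risk (here approximated by the HR).
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: an assumed 25% smoking prevalence combined with the
# abstract's HR of 2.17 for current smoking in men.
paf = levin_paf(0.25, 2.17)
print(round(paf, 3))  # 0.226
```

Under these assumed inputs roughly a fifth of deaths would be attributable to smoking, which is why it tops the ranking despite socioeconomic status having a broader reach.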

    De novo mutations in SMCHD1 cause Bosma arhinia microphthalmia syndrome and abrogate nasal development

    Bosma arhinia microphthalmia syndrome (BAMS) is an extremely rare and striking condition characterized by complete absence of the nose with or without ocular defects. We report here that missense mutations in the epigenetic regulator SMCHD1 mapping to the extended ATPase domain of the encoded protein cause BAMS in all 14 cases studied. All mutations were de novo where parental DNA was available. Biochemical tests and in vivo assays in Xenopus laevis embryos suggest that these mutations may behave as gain-of-function alleles. This finding is in contrast to the loss-of-function mutations in SMCHD1 that have been associated with facioscapulohumeral muscular dystrophy (FSHD) type 2. Our results establish SMCHD1 as a key player in nasal development and provide biochemical insight into its enzymatic function that may be exploited for development of therapeutics for FSHD.

    Prostration and the prognosis of death in African children with severe malaria

    Objectives: Malaria is still one of the main reasons for hospitalization in children living in sub-Saharan Africa. Rapid risk stratification at admission is essential for optimal medical care and improved prognosis. Whereas coma, deep breathing, and, to a lesser degree, severe anemia are established predictors of malaria-related death, the value of assessing prostration for risk stratification is less certain. Methods: Here we used a retrospective multi-center analysis comprising over 33,000 hospitalized children from four large studies, including two observational studies from the Severe Malaria in African Children network, a randomized controlled treatment study, and the phase 3 RTS,S malaria vaccine trial, to evaluate known risk factors of mortality, with a specific emphasis on the role of prostration. Results: Despite comparable age profiles of the participants, we found significant inter- and intra-study variation in the incidence of fatal malaria as well as in the derived risk ratios associated with the four risk factors: coma, deep breathing, anemia, and prostration. Despite pronounced variations, prostration was significantly associated with an increased risk of mortality (P <0.001) and its consideration resulted in improved predictive performance, both in a multivariate model and a univariate model based on the Lambaréné Organ Dysfunction Score. Conclusion: Prostration is an important clinical criterion to determine severe pediatric malaria with possible fatal outcomes.