
    The impact of low erythrocyte density in human blood on the fitness and energetic reserves of the African malaria vector Anopheles gambiae

    Background Anaemia is a common health problem in the developing world. This condition is characterized by a reduction in erythrocyte density, arising primarily from malnutrition and/or infectious diseases such as malaria. As red blood cells are the primary source of protein for haematophagous mosquitoes, any reduction could impede the ability of mosquito vectors to transmit malaria by influencing their fitness or that of the parasites they transmit. The aim of this study was to determine the impact of differences in the density of red blood cells in human blood on malaria vector (Anopheles gambiae sensu stricto) fitness. The hypotheses tested are that mosquito vector energetic reserves and fitness are negatively influenced by reductions in the red cell density of host human blood meals commensurate with those expected from severe anaemia.

    Methods Mosquitoes (An. gambiae s.s.) were offered human blood meals at packed cell volumes (PCV) consistent with severe anaemia (15%) or with normal blood (50%). Associations between mosquito energetic reserves (lipid, glucose and glycogen), fitness measures (reproduction and survival), and blood meal PCV were investigated.

    Results The amount of protein that malaria vectors acquired from blood feeding (indexed by haematin excretion) was significantly reduced at low blood PCV. However, mosquitoes feeding on blood of low PCV had the same oviposition rates as those feeding on blood of normal PCV, and showed an increase in egg production of around 15%. The long-term survival of An. gambiae s.s. was reduced after feeding on low PCV blood, but PCV had no significant impact on the proportion of mosquitoes surviving through the minimal period required to develop and transmit malaria parasites (estimated as 14 days post-blood feeding). The impact of blood PCV on the energetic reserves of mosquitoes was relatively minor.

    Conclusions These results suggest that feeding on human hosts whose PCV has been depleted by severe anaemia does not significantly reduce the fitness or transmission potential of malaria vectors, and indicate that mosquitoes may be able to exploit resources for reproduction more efficiently from blood of low rather than normal PCV.

    Exploring Cell Tropism as a Possible Contributor to Influenza Infection Severity

    Several mechanisms have been proposed to account for the marked increase in severity of human infections with avian influenza strains compared with human strains, including increased cytokine expression, poor immune response, and differences in target cell receptor affinity. Here, the potential effect of target cell tropism on disease severity is studied using a mathematical model for in-host influenza viral infection in a cell population consisting of two different cell types. The two cell types differ only in their susceptibility to infection and rate of virus production. We show the existence of a parameter regime which is characterized by high viral loads sustained long after the onset of infection. This finding suggests that differences in cell tropism between influenza strains could be sufficient to cause significant differences in viral titer profiles, similar to those observed in infections with certain strains of influenza A virus. The two-target-cell mathematical model offers good agreement with experimental data from severe influenza infections, as does the usual single-target-cell model, albeit with biologically unrealistic parameters. Both models predict that while neuraminidase inhibitors and adamantanes are only effective when administered early to treat an uncomplicated seasonal infection, they can be effective against more severe influenza infections even when administered late.
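    As a concrete illustration, a two-cell-type model of the kind described above can be written as a small system of ordinary differential equations. The sketch below assumes the standard target-cell-limited formulation (uninfected cells, infected cells, free virus) extended to two cell populations that share the same infected-cell death and virus clearance rates but differ in infection rate and virus production rate, matching the abstract's description; all parameter values and initial conditions are illustrative placeholders, not the authors' fitted estimates.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-target-cell model: the two cell types differ only in
# susceptibility (beta) and virus production rate (p), as in the abstract.
# All numbers below are placeholders, not fitted values from the paper.
beta1, beta2 = 2.7e-5, 2.7e-7   # infection rates of type-1 and type-2 cells
p1, p2       = 1.2e-2, 1.2e-1   # virus production rates per infected cell
delta, c     = 4.0, 3.0         # infected-cell death and virus clearance, per day

def rhs(t, y):
    T1, I1, T2, I2, V = y
    return [
        -beta1 * T1 * V,                 # uninfected type-1 cells
        beta1 * T1 * V - delta * I1,     # infected type-1 cells
        -beta2 * T2 * V,                 # uninfected type-2 cells
        beta2 * T2 * V - delta * I2,     # infected type-2 cells
        p1 * I1 + p2 * I2 - c * V,       # free virus
    ]

y0 = [4e8, 0.0, 1e8, 0.0, 7.5e-2]        # initial cell counts and inoculum
sol = solve_ivp(rhs, (0, 14), y0, dense_output=True, rtol=1e-8)

t = np.linspace(0, 14, 200)
V = sol.sol(t)[4]
print(f"peak titer ~{V.max():.2e} at day {t[V.argmax()]:.1f}")
```

    Varying beta2 and p2 relative to the type-1 values is what produces the sustained high-viral-load regime the abstract refers to: a slowly infected but highly productive second cell population can keep titers elevated long after the first population is depleted.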

    Factors Associated with Acquisition of Human Infective and Animal Infective Trypanosome Infections in Domestic Livestock in Western Kenya

    Trypanosomiasis is regarded as a constraint on livestock production in Western Kenya, where responsibility for tsetse and trypanosomiasis control has increasingly shifted from the state to the individual livestock owner. To assess the sustainability of these localised control efforts, this study investigates biological and management risk factors associated with trypanosome infections, detected by polymerase chain reaction (PCR), in a range of domestic livestock at the local scale in Busia, Kenya. Busia District also remains endemic for human sleeping sickness, with sporadic cases still reported.

    In total, trypanosome infections were detected in 11.9% (329) of the 2773 livestock sampled in Busia District. Multivariable logistic regression revealed that host species and cattle age affected overall trypanosome infection, with significantly increased odds of infection for cattle older than 18 months and significantly lower odds of infection in pigs and small ruminants. Different grazing and watering management practices did not affect the odds of trypanosome infection, adjusted by host species. Neither anaemia nor condition score significantly affected the odds of trypanosome infection in cattle. Human infective Trypanosoma brucei rhodesiense was detected in 21.5% of animals infected with T. brucei s.l. (29/135), amounting to 1% (29/2773) of all sampled livestock, with significantly higher odds of T. b. rhodesiense infection in T. brucei s.l. infected pigs (OR = 4.3, 95% CI 1.5-12.0) than in T. brucei s.l. infected cattle or small ruminants.

    Although cattle are the dominant reservoir of trypanosome infection, it is unlikely that targeted treatment of only visibly diseased cattle will achieve sustainable interruption of transmission for either animal infective or zoonotic human infective trypanosomiasis, since most infections were detected in cattle that did not exhibit classical clinical signs of trypanosomiasis. Pigs were also found to be reservoirs of infection for T. b. rhodesiense and present a risk to local communities.
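    For readers unfamiliar with how the adjusted odds ratios above are obtained, the sketch below fits a multivariable logistic regression on simulated records and exponentiates the coefficients. The column names and data are hypothetical stand-ins, not the Busia dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical records mimicking the study's variables; values are
# simulated for illustration only, not the actual Busia data.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "infected": rng.integers(0, 2, n),
    "species": rng.choice(["cattle", "pig", "small_ruminant"], n),
    "age_over_18m": rng.integers(0, 2, n),
})

# Multivariable logistic regression of infection status on host species
# and age class, analogous to the analysis described above.
model = smf.logit("infected ~ C(species) + age_over_18m", data=df).fit(disp=0)

# Exponentiated coefficients give adjusted odds ratios with 95% CIs.
ors = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors)
```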

    Vitamin D status predicts 30-day mortality in hospitalised cats

    Vitamin D insufficiency, defined as low serum concentrations of the major circulating form of vitamin D, 25 hydroxyvitamin D (25(OH)D), has been associated with the development of numerous infectious, inflammatory, and neoplastic disorders in humans. In addition, vitamin D insufficiency has been found to be predictive of mortality for many disorders. However, interpretation of human studies is difficult, since vitamin D status is influenced by many factors, including diet, season, latitude, and exposure to UV radiation. In contrast, domesticated cats do not produce vitamin D cutaneously, and most cats are fed a commercial diet containing a relatively standard amount of vitamin D. Consequently, domesticated cats are an attractive model system in which to examine the relationship between serum 25(OH)D and health outcomes. The hypothesis of this study was that vitamin D status would predict short-term, all-cause mortality in domesticated cats. Serum concentrations of 25(OH)D, together with a wide range of other clinical, haematological, and biochemical parameters, were measured in 99 consecutively hospitalised cats. Cats which died within 30 days of initial assessment had significantly lower serum 25(OH)D concentrations than cats which survived. In a logistic regression model including 12 clinical variables, serum 25(OH)D concentration in the lower tertile was significantly predictive of mortality. The odds ratio of mortality within 30 days was 8.27 (95% confidence interval 2.54-31.52) for cats with a serum 25(OH)D concentration in the lower tertile. In conclusion, this study demonstrates that low serum 25(OH)D concentration is an independent predictor of short-term mortality in cats.
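    As a worked example of the tertile-based odds ratio reported above, the sketch below flags the lower tertile of simulated 25(OH)D values and computes an unadjusted odds ratio with a Wald 95% confidence interval from the resulting 2x2 table. Note that the study's own estimate was adjusted within a multivariable model; the data and cut-off here are simulated for illustration.

```python
import numpy as np

# Hypothetical data: serum 25(OH)D (nmol/l) and 30-day deaths for 99
# hospitalised cats; values are simulated, not the study's measurements.
rng = np.random.default_rng(7)
vitd = rng.normal(100, 30, 99)
died = rng.random(99) < np.where(vitd < 85, 0.35, 0.08)

# Flag the lower tertile of 25(OH)D, as in the analysis above.
lower_tertile = vitd <= np.quantile(vitd, 1 / 3)

# Unadjusted odds ratio from the 2x2 table, with a Wald 95% CI:
# OR = (a*d)/(b*c), SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d).
a = np.sum(lower_tertile & died)       # lower tertile, died
b = np.sum(lower_tertile & ~died)      # lower tertile, survived
c = np.sum(~lower_tertile & died)      # upper tertiles, died
d = np.sum(~lower_tertile & ~died)     # upper tertiles, survived
or_ = (a * d) / (b * c)
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = or_ * np.exp(-1.96 * se), or_ * np.exp(1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```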

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available.

    Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1.

    Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics.

    Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64 to 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (N = 1,063, all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment.

    Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that, in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
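    The CHA2DS2-VASc components spelled out in the Results translate directly into a scoring function. The sketch below implements the standard published weighting (2 points for age ≥75 years and for prior stroke/TIA/thromboembolism, 1 point for each remaining factor, maximum 9); it is a minimal illustration, not code from the registry.

```python
def cha2ds2_vasc(chf: bool, hypertension: bool, age: int, diabetes: bool,
                 prior_stroke_tia: bool, vascular_disease: bool,
                 female: bool) -> int:
    """CHA2DS2-VASc stroke-risk score: 1 point each for congestive heart
    failure, hypertension, diabetes, vascular disease, age 65-74, and
    female sex; 2 points each for age >=75 and prior stroke/TIA."""
    score = chf + hypertension + diabetes + vascular_disease + female
    score += 2 if prior_stroke_tia else 0
    score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)
    return score

# Example: a 71-year-old woman with hypertension scores 3, placing her in
# the high-risk (>=2) group that made up 86.1% of the phase 2 cohort.
print(cha2ds2_vasc(chf=False, hypertension=True, age=71, diabetes=False,
                   prior_stroke_tia=False, vascular_disease=False,
                   female=True))
```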