
    Is the Utility of the GLIM Criteria Used to Diagnose Malnutrition Suitable for Bicultural Populations? Findings from Life and Living in Advanced Age Cohort Study in New Zealand (LiLACS NZ).

    OBJECTIVES: To investigate associations between nutrition risk (determined by SCREEN-II) and malnutrition (diagnosed by the GLIM criteria) with five-year mortality in Māori and non-Māori of advanced age. DESIGN: A longitudinal cohort study. SETTING: Bay of Plenty and Lakes regions of New Zealand. PARTICIPANTS: 255 Māori and 400 non-Māori octogenarians. MEASUREMENTS: All participants were screened for nutrition risk using the Seniors in the Community: Risk Evaluation for Eating and Nutrition (SCREEN-II); those at high nutrition risk (SCREEN-II score below the high-risk cut-off) were assessed for malnutrition using the GLIM criteria. RESULTS: Nutrition risk was not significantly associated with five-year mortality for Māori (P > 0.05) but was for non-Māori. This association remained significant after adjustment for other predictors of death (OR (95% CI); 0.50 (0.29, 0.86), P < 0.05). Reduced food intake was the only GLIM criterion predictive of five-year mortality for Māori (HR (95% CI); 10.77 (4.76, 24.38), P < 0.001). For non-Māori, both aetiologic and phenotypic GLIM criteria were associated with five-year mortality. CONCLUSION: Nutrition risk, but not malnutrition diagnosed by the GLIM criteria, was significantly associated with mortality for Māori. Conversely, both nutrition risk and malnutrition were significantly associated with mortality for non-Māori. Appropriate phenotypic criteria for diverse populations are needed within the GLIM framework.
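
    The reported HRs and ORs come from standard regression-based survival analysis. A minimal sketch of that kind of analysis in Python using the lifelines package, with an entirely hypothetical data file and column names, not the study's actual code:

        # Sketch of a Cox proportional-hazards model of the kind behind the
        # reported HRs. The CSV file and column names are hypothetical.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("lilacs_nz.csv")  # hypothetical data layout

        cph = CoxPHFitter()
        cph.fit(
            df[["followup_years", "died_5y", "glim_reduced_intake", "age", "sex"]],
            duration_col="followup_years",
            event_col="died_5y",
        )
        cph.print_summary()  # hazard ratios with 95% CIs per predictor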

    Going beyond personal protection against mosquito bites to eliminate malaria transmission: population suppression of malaria vectors that exploit both human and animal blood

    Protecting individuals and households against mosquito bites with long-lasting insecticidal nets (LLINs) or indoor residual spraying (IRS) can suppress entire populations of unusually efficient malaria vector species that predominantly feed indoors on humans. Mosquitoes that usually feed on animals are less reliant on human blood, so they are far less vulnerable to the population suppression effects of such human-targeted insecticidal measures. Fortunately, the dozens of mosquito species that primarily feed on animals are also relatively inefficient vectors of malaria, so personal protection against mosquito bites may be sufficient to eliminate transmission. However, a handful of mosquito species are particularly problematic vectors of residual malaria transmission, because they feed readily on both humans and animals. These unusual vectors feed often enough on humans to be potent malaria vectors, but also often enough on animals to evade population control with LLINs, IRS or any other insecticidal personal protection measure targeted only to humans. Anopheles arabiensis and A. coluzzii in Africa, A. darlingi in South America and A. farauti in Oceania, as well as A. culicifacies species E, A. fluviatilis species S, A. lesteri and A. minimus in Asia, all feed readily on either humans or animals and collectively mediate residual malaria transmission across most of the tropics. Eliminating malaria transmission by vectors exhibiting such dual host preferences will require aggressive mosquito population abatement, rather than just personal protection of humans. Population suppression of even these particularly troublesome vectors is achievable with a variety of existing vector control technologies that remain underdeveloped or underexploited.
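
    The population-suppression argument lends itself to a toy calculation: human-targeted insecticides remove mosquitoes only in proportion to how often they attempt to feed on protected humans, so a vector taking half its blood meals from animals loses far less survival per feeding cycle. A minimal sketch in Python; every parameter value is an illustrative assumption, not a figure from the text:

        # Toy model: per-feeding-cycle survival under LLIN pressure.
        def cycle_survival(p_base, human_blood_index, llin_coverage, llin_kill):
            """p_base: baseline survival per cycle without nets;
            human_blood_index: fraction of blood meals taken on humans;
            llin_coverage: fraction of humans sleeping under nets;
            llin_kill: probability an attempt on a protected human is fatal."""
            p_killed = human_blood_index * llin_coverage * llin_kill
            return p_base * (1.0 - p_killed)

        for hbi in (0.95, 0.50):  # human-specialist vs dual-feeding vector
            s = cycle_survival(p_base=0.85, human_blood_index=hbi,
                               llin_coverage=0.8, llin_kill=0.7)
            print(f"HBI={hbi:.2f}: survival per cycle = {s:.2f}")

    Under these assumed numbers the human specialist drops to 0.40 survival per cycle while the dual feeder retains 0.61, which is the intuition behind why dual-feeding species sustain residual transmission.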

    Dissemination and implementation of an educational tool for veterans on complementary and alternative medicine: a case study

    Background Predicting when and where pathogens will emerge is difficult, yet, as shown by the recent Ebola and Zika epidemics, effective and timely responses are key. It is therefore crucial to transition from reactive to proactive responses for these pathogens. To better identify priorities for outbreak mitigation and prevention, we developed a cohesive framework combining disparate methods and data sources, and assessed subnational pandemic potential for four viral haemorrhagic fevers in Africa: Crimean–Congo haemorrhagic fever, Ebola virus disease, Lassa fever, and Marburg virus disease. Methods In this multistage analysis, we quantified three stages underlying the potential of widespread viral haemorrhagic fever epidemics. Environmental suitability maps were used to define stage 1, index-case potential, which assesses populations at risk of infection due to spillover from zoonotic hosts or vectors, identifying where index cases could present. Stage 2, outbreak potential, iterates upon an existing framework, the Index for Risk Management, to measure potential for secondary spread in people within specific communities. For stage 3, epidemic potential, we combined local and international scale connectivity assessments with stage 2 to evaluate possible spread of local outbreaks nationally, regionally, and internationally. Findings We found epidemic potential to vary within Africa, with regions where viral haemorrhagic fever outbreaks have previously occurred (eg, western Africa) and areas currently considered non-endemic (eg, Cameroon and Ethiopia) both ranking highly. Tracking transitions between stages showed how an index case can escalate into a widespread epidemic in the absence of intervention (eg, Nigeria and Guinea). Our analysis showed Chad, Somalia, and South Sudan to be highly susceptible to any outbreak at subnational levels. Interpretation Our analysis provides a unified assessment of potential epidemic trajectories, with the aim of allowing national and international agencies to pre-emptively evaluate needs and target resources. Within each country, our framework identifies at-risk subnational locations in which to improve surveillance, diagnostic capabilities, and health systems in parallel with the design of policies for optimal responses at each stage. In conjunction with pandemic preparedness activities, assessments such as ours can identify regions where needs and provisions do not align, and thus should be targeted for future strengthening and support.
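
    A schematic sketch of how three conditional stages can compose into one score per subnational unit; the inputs and the multiplicative combination below are invented for illustration and do not reproduce the study's actual scoring:

        # Schematic: composing the three stages into one score per region.
        # Inputs and the multiplicative combination are illustrative only.
        from dataclasses import dataclass

        @dataclass
        class Region:
            suitability: float   # stage 1: index-case potential (0-1)
            irm: float           # stage 2: outbreak potential, IRM-style (0-1)
            connectivity: float  # stage 3: local/international connectivity (0-1)

        def epidemic_potential(r: Region) -> float:
            # Each stage conditions on the previous: no index case, no outbreak;
            # no outbreak, nothing to spread. A product captures that chain.
            return r.suitability * r.irm * r.connectivity

        print(epidemic_potential(Region(0.9, 0.7, 0.8)))  # high on all stages
        print(epidemic_potential(Region(0.9, 0.7, 0.1)))  # poorly connected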

    Flower vs. Leaf Feeding by Pieris brassicae: Glucosinolate-Rich Flower Tissues are Preferred and Sustain Higher Growth Rate

    Interactions between butterflies and caterpillars in the genus Pieris and plants in the family Brassicaceae are among the best explored in the field of insect–plant biology. However, we report here for the first time that Pieris brassicae, commonly assumed to be a typical folivore, actually prefers to feed on flowers of three Brassica nigra genotypes rather than on their leaves. First- and second-instar caterpillars were observed to feed primarily on leaves, whereas late second and early third instars migrated via the small leaves of the flower branches to the flower buds and flowers. Once flower feeding began, no further leaf feeding was observed. We investigated growth rates of caterpillars having access exclusively to either leaves of flowering plants or flowers. In addition, we analyzed glucosinolate concentrations in leaves and flowers. Late-second- and early-third-instar P. brassicae caterpillars moved upward into the inflorescences of B. nigra and fed on buds and flowers until the end of the final (fifth) instar, after which they entered into the wandering stage, leaving the plant in search of a pupation site. Flower feeding sustained a significantly higher growth rate than leaf feeding. Flowers contained levels of glucosinolates up to five times higher than those of leaves. Five glucosinolates were identified: the aliphatic sinigrin, the aromatic phenylethylglucosinolate, and three indole glucosinolates: glucobrassicin, 4-methoxyglucobrassicin, and 4-hydroxyglucobrassicin. Tissue type and genotype were the most important factors affecting levels of identified glucosinolates. Sinigrin was by far the most abundant compound in all three genotypes. Sinigrin, 4-hydroxyglucobrassicin, and phenylethylglucosinolate were present at significantly higher levels in flowers than in leaves. In response to caterpillar feeding, sinigrin levels in both leaves and flowers were significantly higher than in undamaged plants, whereas 4-hydroxyglucobrassicin leaf levels were lower. Our results show that feeding on flower tissues, containing higher concentrations of glucosinolates, provides P. brassicae with a nutritional benefit in terms of higher growth rate. This preference appears to be in contrast to published negative effects of volatile glucosinolate breakdown products on the closely related Pieris rapae.
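
    Growth-rate comparisons of this kind are typically expressed as relative growth rate, RGR = (ln m2 − ln m1)/(t2 − t1). A minimal sketch of the computation in Python, with made-up caterpillar masses rather than the study's measurements:

        # Relative growth rate from two mass measurements:
        # RGR = (ln m2 - ln m1) / (t2 - t1). All values are made up.
        import math

        def rgr(m1, m2, t1, t2):
            """RGR in mg/mg/day from masses m1, m2 at days t1, t2."""
            return (math.log(m2) - math.log(m1)) / (t2 - t1)

        flower_fed = rgr(m1=5.0, m2=95.0, t1=0, t2=8)
        leaf_fed = rgr(m1=5.0, m2=60.0, t1=0, t2=8)
        print(f"flower-fed: {flower_fed:.3f}, leaf-fed: {leaf_fed:.3f}")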

    Cross-Sectional Dating of Novel Haplotypes of HERV-K 113 and HERV-K 115 Indicate These Proviruses Originated in Africa before Homo sapiens

    The human genome contains many human endogenous retroviruses (HERVs), of which HERV-K113 and HERV-K115 are the only known full-length proviruses that are insertionally polymorphic. Although a handful of previously published papers have documented their prevalence in the global population, to date there has been no report on their prevalence in the United States population. Here, we studied the geographic distribution of K113 and K115 among 156 HIV-1+ subjects from the United States, including African Americans, Hispanics, and Caucasians. In the individuals studied, we found higher insertion frequencies of K113 (21%) and K115 (35%) in African Americans compared with Caucasians (K113 9% and K115 6%) within the United States. We also report the presence of three single nucleotide polymorphism sites in the K113 5′ long terminal repeats (LTRs) and four in the K115 5′ LTR that together constituted four haplotypes for K113 and five haplotypes for K115. HERV insertion times can be estimated from the sequence differences between the 5′ and 3′ LTR of each insertion, but this dating method cannot be used with HERV-K115. We developed a method to estimate insertion times by applying coalescent inference to 5′ LTR sequences within our study population and validated this approach using an independent estimate derived from the genetic distance between K113 5′ and 3′ LTR sequences. Using our method, we estimated the insertion dates of K113 and K115 to be a minimum of 800,000 and 1.1 million years ago, respectively. Both these insertion dates predate the emergence of anatomically modern Homo sapiens.
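
    The LTR-based dating mentioned above rests on a simple relation: the 5′ and 3′ LTRs are identical at integration and then diverge independently, so a divergence of d substitutions per site implies a minimum insertion age of roughly T = d/(2μ) for substitution rate μ. A minimal sketch in Python; the rate below is an assumed order-of-magnitude value, not the rate used in the study:

        # LTR-divergence dating: both LTRs are identical at integration, and
        # divergence d accumulates along two lineages, so T = d / (2 * mu).
        MU = 2.5e-9  # substitutions per site per year (assumed)

        def insertion_age(divergence_per_site, mu=MU):
            """Minimum insertion age in years from 5'-3' LTR divergence."""
            return divergence_per_site / (2.0 * mu)

        print(f"{insertion_age(0.004):.2e} years")  # 0.4% divergence -> ~8e5 yr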

    Incorporation of uranium into hematite during crystallization from ferrihydrite

    Ferrihydrite was exposed to U(VI)-containing cement leachate (pH 10.5) and aged to induce crystallization of hematite. A combination of chemical extractions, TEM, and XAS techniques provided the first evidence that adsorbed U(VI) (≈3000 ppm) was incorporated into hematite during ferrihydrite aggregation and the early stages of crystallization, with continued uptake occurring during hematite ripening. Analysis of EXAFS and XANES data indicated that the U(VI) was incorporated into a distorted, octahedrally coordinated site replacing Fe(III). Fitting of the EXAFS showed the uranyl bonds lengthened from 1.81 to 1.87 Å, in contrast to previous studies that have suggested the uranyl bond is lost altogether upon incorporation into hematite. The results of this study both provide a new mechanistic understanding of uranium incorporation into hematite and define the nature of the bonding environment of uranium within the mineral structure. Immobilization of U(VI) by incorporation into hematite has clear and important implications for limiting uranium migration in natural and engineered environments. © 2014 American Chemical Society