
    Can Low-Severity Fire Reverse Compositional Change in Montane Forests of the Sierra Nevada, California, USA?

    Throughout the Sierra Nevada, nearly a century of fire suppression has altered the tree species composition, forest structure, and fire regimes that were previously characteristic of montane forests. Species composition is fundamentally important because species differ in their tolerances to fire and environmental stressors, and these differences dictate future forest structure and influence fire regime attributes. In some lower montane stands, shade-tolerant, fire-sensitive species have driven a threefold increase in tree density that may intensify the risk of high-severity fire. In upper montane forests, which were historically characterized by longer fire return intervals, the effects of fire exclusion are both less apparent and less studied. Although land managers have been reintroducing fire to lower and upper montane forests for >4 decades, the potentially restorative effects of these actions on species composition remain largely unassessed. We used tree diameter and species data from 51 recently burned and 46 unburned plots located throughout lower and upper montane forests in Yosemite National Park and Sequoia & Kings Canyon National Parks to examine the effects of low- to moderate-severity (hereafter, lower-severity) fire on the demography of seven prevalent tree species. We found that (1) burned and unburned plots differed in tree density for some species and diameter classes (including Abies concolor 30–45 cm dbh and A. magnifica), with differences evident for A. concolor but not for Calocedrus decurrens, and (2) variability in tree density among plots that burned at lower severity exceeded the range of tree densities reported in historical data sets. High proportions of shade-tolerant species in some postfire stands may increase the prevalence of shade-tolerant species in the future, a potential concern for managers who seek to minimize ladder fuels and promote forest structure that is less prone to high-severity fire.

    A Sub-Microscopic Gametocyte Reservoir Can Sustain Malaria Transmission

    Novel diagnostic tools, including PCR and high field gradient magnetic fractionation (HFGMF), have improved detection of asexual Plasmodium falciparum parasites and especially infectious gametocytes in human blood. These techniques indicate that a significant number of people carry gametocyte densities below the conventional threshold of detection achieved by standard light microscopy (LM). To determine how low-level gametocytemia may affect transmission in present large-scale efforts for P. falciparum control in endemic areas, we developed a refinement of the classical Ross-Macdonald model of malaria transmission, introducing multiple infective compartments to model the potential impact of highly prevalent, low-gametocytemia reservoirs in the population. Models were calibrated using field-based data, and several numerical experiments were conducted to assess the effect of high and low gametocytemia on P. falciparum transmission and control. Special consideration was given to the impact of long-lasting insecticide-treated bed nets (LLINs), presently considered the most efficient way to prevent transmission, and particularly to LLIN coverage similar to the goals targeted by the Roll Back Malaria and Global Fund malaria control campaigns. Our analyses indicate that models which include only moderate-to-high gametocytemia (detectable by LM) predict finite eradication times after LLIN introduction, whereas models that include a low-gametocytemia reservoir (requiring PCR or HFGMF detection) predict much more stable, persistent transmission. Our modeled outcomes result in significantly different estimates of the level and duration of control needed to achieve malaria elimination when submicroscopic gametocytes are included. It will be very important to complement current methods of surveillance with enhanced diagnostic techniques that detect asexual parasites and gametocytes in order to more accurately plan, monitor, and guide malaria control programs aimed at eliminating malaria.
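
    The multi-compartment refinement described above lends itself to a brief illustration. The sketch below is a minimal, assumed formulation rather than the authors' calibrated model: a Ross-Macdonald-style system of ordinary differential equations with two human infectious classes, one detectable by light microscopy and one sub-microscopic with much lower infectiousness to mosquitoes. All parameter names and values are illustrative assumptions.

        # Minimal sketch (assumed formulation, not the paper's calibrated model):
        # a Ross-Macdonald-style system with an extra human compartment whose
        # infectiousness to mosquitoes is low, standing in for a sub-microscopic
        # gametocyte reservoir. All parameter values are illustrative.
        from scipy.integrate import solve_ivp

        a = 0.3        # mosquito biting rate (bites per mosquito per day)
        b = 0.5        # probability an infectious bite infects a human
        c_hi = 0.5     # human-to-mosquito infectiousness, microscopic gametocytemia
        c_lo = 0.05    # human-to-mosquito infectiousness, sub-microscopic gametocytemia
        m = 10.0       # mosquitoes per human
        g = 0.1        # mosquito death rate (per day)
        r_hi = 1 / 20  # rate of leaving the detectable class (per day)
        r_lo = 1 / 60  # recovery rate from the sub-microscopic class (per day)

        def rhs(t, y):
            """y = (I_hi, I_lo, Z): detectable and sub-microscopic human fractions, infectious mosquito fraction."""
            I_hi, I_lo, Z = y
            S = 1.0 - I_hi - I_lo                    # susceptible human fraction
            dI_hi = m * a * b * Z * S - r_hi * I_hi  # new infections enter the detectable class
            dI_lo = r_hi * I_hi - r_lo * I_lo        # infections wane into the sub-microscopic reservoir
            dZ = a * (c_hi * I_hi + c_lo * I_lo) * (1.0 - Z) - g * Z
            return [dI_hi, dI_lo, dZ]

        sol = solve_ivp(rhs, (0.0, 365.0), [0.01, 0.0, 0.001])
        print("human prevalence after one year:", sol.y[0, -1] + sol.y[1, -1])

    Varying c_lo and m (for example, lowering m to mimic LLIN coverage) in this sketch gives a rough sense of how a low-infectivity reservoir can change the predicted response to vector control, in the spirit of the numerical experiments described above.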

    The Role of Human Movement in the Transmission of Vector-Borne Pathogens

    Vector-borne diseases constitute a largely neglected and enormous burden on public health in many resource-challenged environments, demanding efficient control strategies that could be developed through improved understanding of pathogen transmission. Human movement, which determines exposure to vectors, is a key behavioral component of vector-borne disease epidemiology that remains poorly understood. We develop a conceptual framework that organizes past studies by the scale of movement and then examine fine-scale movements (i.e., people going through their regular, daily routines) that determine exposure to insect vectors and shape the dynamics of pathogen transmission. We develop a model to quantify the risk of vector contact across the locations people visit, with emphasis on mosquito-borne dengue virus in the Amazonian city of Iquitos, Peru. An example scenario illustrates how movement generates variation in exposure risk across individuals, how transmission rates within sites can be increased, and why risk within a site is not determined solely by vector density, as is commonly assumed. Our analysis illustrates the importance of human movement for pathogen transmission, yet little is known about it, especially for the populations most at risk from vector-borne diseases such as dengue and leishmaniasis. We outline several important considerations for designing epidemiological studies that encourage investigation of individual human movement, based on our experience studying dengue.
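
    The site-based risk model described above can be sketched briefly. The code below is a minimal, assumed formulation rather than the published model: an individual's expected exposure is taken to be the sum, over the places they visit, of time spent there weighted by that site's vector density and biting rate. Site names, parameter values, and the two routines are hypothetical.

        # Minimal sketch (assumed formulation, not the published model): expected
        # vector contact as time spent at each visited site weighted by that
        # site's vector density and biting rate. All names and numbers are
        # hypothetical illustrations.
        from dataclasses import dataclass

        @dataclass
        class Site:
            vector_density: float  # vectors present at the site (illustrative units)
            biting_rate: float     # bites per vector per hour

        def exposure(hours_at_site: dict, sites: dict) -> float:
            """Expected bites = sum over visited sites of hours * density * biting rate."""
            return sum(hours * sites[name].vector_density * sites[name].biting_rate
                       for name, hours in hours_at_site.items())

        sites = {
            "home":   Site(vector_density=5.0, biting_rate=0.02),
            "work":   Site(vector_density=1.0, biting_rate=0.02),
            "market": Site(vector_density=8.0, biting_rate=0.02),
        }

        # Two hypothetical individuals with the same total time but different routines
        person_a = {"home": 14.0, "work": 8.0, "market": 2.0}
        person_b = {"home": 20.0, "market": 4.0}

        print("expected exposure, person A:", exposure(person_a, sites))
        print("expected exposure, person B:", exposure(person_b, sites))

    Even with identical per-vector biting behavior at every site, the two hypothetical routines yield different expected exposure, which is the kind of individual-level variation in risk that the framework above is meant to capture.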

    Coffee and its waste repel gravid Aedes albopictus females and inhibit the development of their embryos
