117 research outputs found

    Diversity and Community Structure of Stream Insects in a Minimally Disturbed Forested Watershed in Southern Illinois

    The Lusk Creek Watershed, located in Pope County, IL, long has been recognized as a high-quality area of biological significance, but surveys of the stream macroinvertebrate fauna have been limited. Thus, a survey of the benthic insect community at 11 sites in the upper portion of Lusk Creek was conducted from May 2003 to April 2005. A total of 20,888 specimens, mostly immatures, were examined during the study and represented eight orders. The Diptera was by far the most abundant order, with 18,590 specimens, almost all of which were members of the Chironomidae or Simuliidae. Members of the EPT (Ephemeroptera, Plecoptera, Trichoptera) contributed 1,550 specimens. The Coleoptera was represented by 647 specimens, most of which were members of Stenelmis (Elmidae) (n = 612). The Shannon diversity index (H′) ranged from 1.07 to 2.01 for individual sites and was indicative of relatively undisturbed streams in this region. Jackknife analyses of richness estimated that as many as 37 taxa were unobserved in this survey. Results provide information on reference conditions in the region and a foundation for future monitoring.
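
    The two metrics reported above have simple closed forms: the Shannon index is H′ = -Σ p_i ln(p_i) over taxon proportions p_i, and the first-order jackknife richness estimate is S_obs + f1(n-1)/n, where f1 is the number of taxa found at exactly one of the n sites. The sketch below illustrates both calculations in Python; the abundance values and taxon names are hypothetical, and the survey's own analysis may have used different software or a different jackknife variant.

    import math

    def shannon_index(counts):
        """Shannon diversity H' from per-taxon abundances at one site."""
        total = sum(counts)
        props = [c / total for c in counts if c > 0]
        return -sum(p * math.log(p) for p in props)

    def jackknife1_richness(sites):
        """First-order jackknife richness from a list of per-site taxon sets."""
        n = len(sites)
        observed = set().union(*sites)
        # f1 = taxa occurring at exactly one site ("uniques")
        f1 = sum(1 for t in observed if sum(t in s for s in sites) == 1)
        return len(observed) + f1 * (n - 1) / n

    # Hypothetical example: abundances of five taxa at one site, and
    # presence/absence of taxa across three sites
    print(shannon_index([612, 200, 50, 20, 5]))
    print(jackknife1_richness([{"Stenelmis", "Chironomidae"},
                               {"Chironomidae", "Simuliidae"},
                               {"Chironomidae", "Baetis"}]))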

    The Stream Biome Gradient Concept: factors controlling lotic systems across broad biogeographic scales

    Citation: Dodds, W. K., Gido, K., Whiles, M. R., Daniels, M. D., & Grudzinski, B. P. (2015). The Stream Biome Gradient Concept: factors controlling lotic systems across broad biogeographic scales. Freshwater Science, 34(1), 1-19. doi:10.1086/679756

    We propose the Stream Biome Gradient Concept as a way to predict macroscale biological patterns in streams. This concept is based on the hypothesis that many abiotic and biotic features of streams change predictably along climate (temperature and precipitation) gradients because of direct influences of climate on hydrology, geomorphology, and interactions mediated by terrestrial vegetation. The Stream Biome Gradient Concept generates testable hypotheses related to continental variation among streams worldwide and allows aquatic scientists to understand how results from one biome might apply to a less-studied biome. Some predicted factors change monotonically across the biome/climate gradients, whereas others have maxima or minima in the central portion of the gradient. For example, predictions across the gradient from drier deserts through grasslands to wetter forests include more permanent flow, less bare ground, lower erosion and sediment transport rates, decreased importance of autochthonous C inputs to food webs, and greater stream animal species richness. In contrast, effects of large ungulate grazers on streams are expected to be greater in grasslands than in forests or deserts, and fire is expected to have weaker effects in grassland streams than in desert and forest streams along biome gradients with changing precipitation and constant latitude or elevation. Understanding historic patterns among biomes can help describe the evolutionary template at relevant biogeographic scales, can be used to broaden other conceptual models of stream ecology, and could lead to better management and conservation across the broadest scales.

    Stream Invertebrate Responses to a Catastrophic Decline in Consumer Diversity

    Tadpoles are often abundant and diverse consumers in headwater streams in the Neotropics. However, their populations are declining catastrophically in many regions, in part because of a chytrid fungal pathogen. These declines are occurring along a moving disease front in Central America and offer the rare opportunity to quantify the consequences of a sudden, dramatic decline in consumer diversity in a natural system. As part of the Tropical Amphibian Declines in Streams (TADS) project, we examined stream macroinvertebrate assemblage structure and production for 2 y in 4 stream reaches at 2 sites in Panama. One site initially had healthy amphibians but experienced declines during our study (El Copé), and the other had already experienced a decline in 1996 (Fortuna). During the 1st y, total macroinvertebrate abundance, biomass, and production were generally similar among sites and showed no consistent patterns between pre- and post-decline streams. However, during the 2nd y, tadpole densities declined precipitously at El Copé, and total macroinvertebrate production was significantly lower in the El Copé streams than in Fortuna streams. Functional structure differed between sites. Abundance, biomass, and production of filterers generally were higher at Fortuna, and shredders generally were higher at El Copé. However, shredder production declined significantly in both El Copé reaches in the 2nd y as tadpoles declined. Nonmetric multidimensional scaling (NMDS) based on abundance and production indicated that assemblages differed between sites, and patterns were linked to variations in relative availability of basal resources. Our results indicate that responses of remaining consumers to amphibian declines might not be evident in coarse metrics (e.g., total abundance and biomass), but functional and assemblage structure responses did occur. Ongoing, long-term studies at these sites might reveal further ecological consequences of the functional and taxonomic shifts we observed.
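
    The ordination mentioned above (NMDS) can be reproduced in outline with standard tools: compute a dissimilarity matrix between stream reaches from their taxon abundances and embed it in two dimensions. The sketch below uses Bray-Curtis dissimilarity and scikit-learn's nonmetric MDS; the abundance matrix, the choice of dissimilarity, and the library are illustrative assumptions, not the authors' actual workflow.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    # Rows = stream reaches, columns = taxa (hypothetical abundances)
    abundance = np.array([
        [120, 30,  5,  0],
        [ 90, 45,  2,  1],
        [ 10,  5, 60, 40],
        [  8,  2, 75, 30],
    ])

    # Bray-Curtis dissimilarity between reaches
    dist = squareform(pdist(abundance, metric="braycurtis"))

    # Two-dimensional nonmetric MDS on the precomputed dissimilarities
    nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
               random_state=0, n_init=10)
    scores = nmds.fit_transform(dist)
    print(scores)        # ordination coordinates for each reach
    print(nmds.stress_)  # lower stress = better representation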

    Evidence for the Persistence of Food Web Structure After Amphibian Extirpation in a Neotropical Stream

    Species losses are predicted to simplify food web structure, and disease-driven amphibian declines in Central America offer an opportunity to test this prediction. Assessment of insect community composition, combined with gut content analyses, was used to generate periphyton-insect food webs for a Panamanian stream, both pre- and post-amphibian decline. We then used network analysis to assess the effects of amphibian declines on food web structure. Although 48% of consumer taxa, including many insect taxa, were lost between pre- and post-amphibian decline sampling dates, connectance declined by less than 3%. We then quantified the resilience of food web structure by calculating the number of expected cascading extirpations from the loss of tadpoles. This analysis showed the expected effects of species loss on connectance and linkage density to be more than 60% and 40% greater, respectively, than was actually observed. Instead, new trophic linkages in the post-decline food web reorganized the food web topology, changing the identity of "hub" taxa, and consequently reducing the effects of amphibian declines on many food web attributes. Resilience of food web attributes was driven by a combination of changes in consumer diets, particularly those of insect predators, as well as the appearance of generalist insect consumers, suggesting that food web structure is maintained by factors independent of the original trophic linkages.
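
    Connectance and linkage density are simple ratios of links (L) to species (S): directed connectance is commonly taken as L/S^2 and linkage density as L/S, although the exact variant used in the study is not restated here. The hypothetical numbers below are chosen only to show how a large loss of taxa can coexist with a nearly unchanged connectance when surviving consumers add new links.

    def connectance(links, species):
        """Directed connectance C = L / S**2."""
        return links / species**2

    def linkage_density(links, species):
        """Linkage density D = L / S."""
        return links / species

    # Hypothetical pre-decline web: 50 taxa, 250 feeding links
    pre_C, pre_D = connectance(250, 50), linkage_density(250, 50)
    # Hypothetical post-decline web: 26 taxa remain (48% lost), but new
    # generalist links form, so 66 links persist
    post_C, post_D = connectance(66, 26), linkage_density(66, 26)
    print(pre_C, post_C)   # 0.10 vs ~0.098: under 3% change in connectance
    print(pre_D, post_D)   # 5.0 vs ~2.54: linkage density falls by about half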

    Effects of an Infectious Fungus, Batrachochytrium dendrobatidis, on Amphibian Predator-Prey Interactions

    The effects of parasites and pathogens on host behaviors may be particularly important in predator-prey contexts, since few animal behaviors are more crucial for ensuring immediate survival than the avoidance of lethal predators in nature. We examined the effects of an emerging fungal pathogen of amphibians, Batrachochytrium dendrobatidis, on anti-predator behaviors of tadpoles of four frog species. We also investigated whether amphibian predators consumed infected prey, and whether B. dendrobatidis caused differences in predation rates among prey in laboratory feeding trials. We found differences in anti-predator behaviors among larvae of four amphibian species, and show that infected tadpoles of one species (Anaxyrus boreas) were more active and sought refuge more frequently when exposed to predator chemical cues. Salamander predators consumed infected and uninfected tadpoles of three other prey species at similar rates in feeding trials, and predation risk among prey was unaffected by B. dendrobatidis. Collectively, our results show that even sub-lethal exposure to B. dendrobatidis can alter fundamental anti-predator behaviors in some amphibian prey species, and suggest the unexplored possibility that indiscriminate predation between infected and uninfected prey (i.e., non-selective predation) could increase the prevalence of this widely distributed pathogen in amphibian populations. Because salamanders are among the most prominent predators in many amphibian systems, and because salamanders are themselves susceptible to B. dendrobatidis, our work suggests the importance of considering host susceptibility and behavioral changes that could arise from infection in both predators and prey.

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
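
    As a rough illustration of the type of analysis described, and not the study's actual code, cohort, or variable names, mortality status could be regressed on a serum miRNA level with logistic regression, and the miRNA correlated with a soluble biomarker such as IL-6; the data below are simulated placeholders.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "case": rng.integers(0, 2, 200),    # 1 = died on ART, 0 = matched control
        "mir21": rng.normal(0, 1, 200),     # relative miRNA abundance (hypothetical units)
        "il6": rng.lognormal(0, 0.5, 200),  # serum IL-6, pg/mL (hypothetical)
    })

    # Odds of mortality per unit change in miR-21 abundance
    model = smf.logit("case ~ mir21", data=df).fit(disp=False)
    print(model.summary())

    # Correlation between miR-21 abundance and IL-6
    rho, p = spearmanr(df["mir21"], df["il6"])
    print(rho, p)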

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group; risk was substantially higher in the medium and high risk groups (risk score >= 5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
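
    The scoring and number-needed-to-harm logic described above can be sketched as follows; the point values, patient profile, and 5-year risks in this example are hypothetical placeholders (the published score assigns both negative and positive points), and only the arithmetic mirrors the abstract: points are summed per patient, and NNTH is the reciprocal of the absolute 5-year risk increase after starting a drug.

    def ckd_risk_score(points, patient):
        """Sum the points for every risk factor the patient has."""
        return sum(points[factor] for factor, present in patient.items() if present)

    # Hypothetical points loosely mirroring the kinds of predictors listed above
    points = {"older_age": 4, "ivdu": 2, "hcv": 1, "low_egfr": 4,
              "female": 1, "low_nadir_cd4": 1, "hypertension": 1,
              "diabetes": 1, "cvd": 2}

    patient = {"older_age": True, "ivdu": False, "hcv": True, "low_egfr": False,
               "female": False, "low_nadir_cd4": True, "hypertension": True,
               "diabetes": False, "cvd": False}
    print(ckd_risk_score(points, patient))  # 7 -> would fall in a higher band

    def nnth(baseline_5y_risk, on_drug_5y_risk):
        """NNTH = 1 / absolute risk increase over 5 years."""
        return 1.0 / (on_drug_5y_risk - baseline_5y_risk)

    # Hypothetical: 5-y CKD risk rises from 0.25% to 0.31% after starting a drug
    print(round(nnth(0.0025, 0.0031)))  # ~1667 patients treated per extra CKD case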