
    Rhodolith Beds Are Major CaCO3 Bio-Factories in the Tropical South West Atlantic

    Rhodoliths are nodules of non-geniculate coralline algae that occur in shallow waters (<150 m depth) subjected to episodic disturbance. Rhodolith beds stand with kelp beds, seagrass meadows, and coralline algal reefs as one of the world's four largest macrophyte-dominated benthic communities. The geographic distribution of rhodolith beds is discontinuous, with large concentrations off Japan, Australia and the Gulf of California, as well as in the Mediterranean, North Atlantic, eastern Caribbean and Brazil. Although there are major gaps in terms of seabed habitat mapping, the largest rhodolith beds are purported to occur off Brazil, where these communities are recorded across a wide latitudinal range (2°N-27°S). To quantify their extent, we carried out an inter-reefal seabed habitat survey on the Abrolhos Shelf (16°50′-19°45′S) off eastern Brazil, and confirmed the most expansive and contiguous rhodolith bed in the world, covering about 20,900 km2. The distribution, extent, composition and structure of this bed were assessed with side-scan sonar, remotely operated vehicles, and SCUBA. The mean rate of CaCO3 production was estimated from in situ growth assays at 1.07 kg m−2 yr−1, with a total production rate of 0.025 Gt yr−1, comparable to those of the world's largest biogenic CaCO3 deposits. These gigantic rhodolith beds, of areal extent equivalent to the Great Barrier Reef, Australia, are a critical yet poorly understood component of the tropical South Atlantic Ocean. Given the relatively high vulnerability of coralline algae to ocean acidification, these beds are likely to experience a profound restructuring in the coming decades.
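The headline production figure can be sanity-checked from the two numbers quoted in the abstract (bed area and mean per-area rate). This is a back-of-the-envelope sketch only; the small gap between it and the published 0.025 Gt yr−1 presumably reflects the full per-site growth data rather than a single mean rate.

```python
# Back-of-the-envelope check: total CaCO3 production = area x mean rate.
# Inputs are the two values quoted in the abstract.

AREA_KM2 = 20_900          # areal extent of the Abrolhos rhodolith bed
RATE_KG_M2_YR = 1.07       # mean in situ CaCO3 production rate

area_m2 = AREA_KM2 * 1e6               # 1 km^2 = 1e6 m^2
total_kg_yr = area_m2 * RATE_KG_M2_YR  # bulk production per year, in kg
total_gt_yr = total_kg_yr / 1e12       # 1 Gt = 1e12 kg

print(f"{total_gt_yr:.3f} Gt/yr")  # ~0.022 Gt/yr, same order as the reported 0.025
```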

    Transcriptomic Analysis of Toxoplasma Development Reveals Many Novel Functions and Structures Specific to Sporozoites and Oocysts

    Sexual reproduction of Toxoplasma gondii occurs exclusively within enterocytes of the definitive felid host. The resulting immature oocysts are excreted into the environment during defecation, where in the days following, they undergo a complex developmental process. Within each oocyst, this culminates in the generation of two sporocysts, each containing four sporozoites. A single felid host is capable of shedding millions of oocysts, which can survive for years in the environment, are resistant to most methods of microbial inactivation during water treatment, and are capable of producing infection in warm-blooded hosts at doses as low as 1-10 ingested oocysts. Despite its extremely interesting developmental biology and crucial role in initiating an infection, almost nothing is known about the oocyst stage beyond morphological descriptions. Here, we present a complete transcriptomic analysis of the oocyst from beginning to end of its development. In addition, and to identify genes whose expression is unique to this developmental form, we compared the transcriptomes of developing oocysts with those of in vitro-derived tachyzoites and in vivo-derived bradyzoites. Our results reveal many genes whose expression is specifically up- or down-regulated in different developmental stages, including many genes that are likely critical to oocyst development, wall formation, resistance to environmental destruction and sporozoite infectivity. Of special note is the up-regulation of genes that appear "off" in tachyzoites and bradyzoites but that encode homologues of proteins known to serve key functions in those asexual stages, including a novel pairing of sporozoite-specific paralogues of AMA1 and RON2, two proteins that have recently been shown to form a crucial bridge during tachyzoite invasion of host cells. This work provides the first in-depth insight into the development and functioning of one of the most important but least studied stages in the Toxoplasma life cycle.
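The stage-specific "up- or down-regulated" calls in a comparison like this typically come down to a fold-change between stages. A minimal sketch of that comparison, using entirely made-up expression values and illustrative gene labels (none of these numbers or names are from the study):

```python
import math

# Hypothetical TPM-style expression values for illustration only;
# gene labels and numbers are NOT taken from the study's data.
oocyst = {"sporozoite_AMA1_paralog": 850.0, "wall_protein": 1200.0, "housekeeping": 300.0}
tachyzoite = {"sporozoite_AMA1_paralog": 2.0, "wall_protein": 1.5, "housekeeping": 310.0}

def log2_fold_change(a, b, pseudo=1.0):
    """log2 expression ratio, with a pseudocount to avoid division by zero."""
    return math.log2((a + pseudo) / (b + pseudo))

for gene in oocyst:
    lfc = log2_fold_change(oocyst[gene], tachyzoite[gene])
    status = ("up in oocyst" if lfc > 1
              else "down in oocyst" if lfc < -1
              else "unchanged")
    print(f"{gene}: log2FC = {lfc:+.1f} ({status})")
```

A gene that is strongly expressed in oocysts but effectively "off" in tachyzoites, like the sporozoite-specific paralogues described above, shows up as a large positive log2 fold-change.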

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt onto the effectiveness of circulating miRNA as early predictors of mortality or the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
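The correlation step described above (miRNA abundance vs. a soluble biomarker such as IL-6) reduces to computing a correlation coefficient across paired serum measurements. A minimal stdlib-only sketch with synthetic values; the numbers below are illustrative, not the study's data:

```python
import statistics

# Synthetic paired serum measurements for illustration only
# (e.g. relative miR-21 abundance vs. IL-6 concentration).
mir21 = [2.1, 3.4, 1.8, 4.0, 2.9, 3.7, 2.5, 3.1]
il6   = [1.0, 2.2, 0.9, 2.8, 1.7, 2.5, 1.4, 2.0]

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(mir21, il6)
print(f"r = {r:.2f}")  # close to +1 for these synthetic, strongly paired values
```

In practice a study like this would also report a p-value and typically use rank-based (Spearman) correlation for skewed biomarker distributions; the sketch shows only the core calculation.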

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the D:A:D Study Group, Royal Free Hospital Clinic Cohort, INSIGHT Study Group, SMART Study Group, and ESPRIT Study Group. Background Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with markedly higher risks in the medium and high risk groups (risk score >= 5, 505 events in the high risk group).
Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians when weighing the benefits of certain antiretrovirals against the risk of CKD, and for identifying those at greatest risk of CKD. Peer reviewed.
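The mechanics of the abstract's two key quantities, an additive points-based risk score and the number needed to harm, can be sketched as follows. The point values, risk-group cut-offs, and 5-y risks below are illustrative placeholders, not the published D:A:D coefficients:

```python
# Sketch of an additive risk score plus an NNTH calculation.
# All numeric values here are illustrative, not the published D:A:D model.

def risk_score(points):
    """D:A:D-style score: scaled points for each risk factor present, summed."""
    return sum(points)

def risk_group(score):
    """Illustrative low/medium/high banding for an additive score."""
    if score < 0:
        return "low"
    elif score <= 4:
        return "medium"
    return "high"

def nnth(risk_on_drug, risk_off_drug):
    """Number needed to harm = 1 / absolute risk increase over the period."""
    return 1.0 / (risk_on_drug - risk_off_drug)

# Example: a hypothetical 5-y CKD risk of 0.31% on a nephrotoxic regimen
# vs. 0.25% off it gives an NNTH on the order of the low-risk values quoted.
print(round(nnth(0.0031, 0.0025)))  # 1667
```

The practical reading is the abstract's own: when the absolute baseline risk is low, the NNTH is very large (thousands of patient-years of treatment per extra CKD case), so the same drug carries a very different clinical trade-off in the low and high risk groups.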