
    Lectin-Dependent Enhancement of Ebola Virus Infection via Soluble and Transmembrane C-type Lectin Receptors

    Mannose-binding lectin (MBL) is a key soluble effector of the innate immune system that recognizes pathogen-specific surface glycans. Surprisingly, low-producing MBL genetic variants that may predispose children and immunocompromised individuals to infectious diseases are more common than would be expected in human populations. Since certain immune defense molecules, such as immunoglobulins, can be exploited by invasive pathogens, we hypothesized that MBL might also enhance infections in some circumstances. Consequently, the low and intermediate MBL levels commonly found in human populations might be the result of balancing selection. Using model infection systems with pseudotyped and authentic glycosylated viruses, we demonstrated that MBL indeed enhances infection of Ebola, Hendra, Nipah and West Nile viruses in low complement conditions. Mechanistic studies with Ebola virus (EBOV) glycoprotein pseudotyped lentiviruses confirmed that MBL binds to N-linked glycan epitopes on viral surfaces in a specific manner via the MBL carbohydrate recognition domain, which is necessary for enhanced infection. MBL mediates lipid-raft-dependent macropinocytosis of EBOV via a pathway that appears to require less actin or early endosomal processing compared with the filovirus canonical endocytic pathway. Using a validated RNA interference screen, we identified C1QBP (gC1qR) as a candidate surface receptor that mediates MBL-dependent enhancement of EBOV infection. We also identified dectin-2 (CLEC6A) as a potentially novel candidate attachment factor for EBOV. Our findings support the concept of an innate immune haplotype that represents critical interactions between MBL and complement component C4 genes and that may modify susceptibility or resistance to certain glycosylated pathogens. 
Therefore, higher levels of native or exogenous MBL could be deleterious in the setting of relative hypocomplementemia, which can occur genetically or because of immunodepletion during active infections. Our findings confirm our hypothesis that the pressure of infectious diseases may have contributed in part to the evolutionary selection of MBL mutant haplotypes.

    Global Retinoblastoma Presentation and Analysis by National Income Level.

    Importance: Early diagnosis of retinoblastoma, the most common intraocular cancer, can save both a child's life and vision. However, anecdotal evidence suggests that many children across the world are diagnosed late. To our knowledge, the clinical presentation of retinoblastoma has never been assessed on a global scale. Objectives: To report the retinoblastoma stage at diagnosis in patients across the world during a single year, to investigate associations between clinical variables and national income level, and to investigate risk factors for advanced disease at diagnosis. Design, Setting, and Participants: A total of 278 retinoblastoma treatment centers were recruited from June 2017 through December 2018 to participate in a cross-sectional analysis of treatment-naive patients with retinoblastoma who were diagnosed in 2017. Main Outcomes and Measures: Age at presentation, proportion of familial history of retinoblastoma, and tumor stage and metastasis. Results: The cohort included 4351 new patients from 153 countries; the median age at diagnosis was 30.5 (interquartile range, 18.3-45.9) months, and 1976 patients (45.4%) were female. Most patients (n = 3685 [84.7%]) were from low- and middle-income countries (LMICs). Globally, the most common indication for referral was leukocoria (n = 2638 [62.8%]), followed by strabismus (n = 429 [10.2%]) and proptosis (n = 309 [7.4%]). Patients from high-income countries (HICs) were diagnosed at a median age of 14.1 months, with 656 of 666 (98.5%) patients having intraocular retinoblastoma and 2 (0.3%) having metastasis. Patients from low-income countries were diagnosed at a median age of 30.5 months, with 256 of 521 (49.1%) having extraocular retinoblastoma and 94 of 498 (18.9%) having metastasis. Lower national income level was associated with older presentation age, higher proportion of locally advanced disease and distant metastasis, and smaller proportion of familial history of retinoblastoma. 
Advanced disease at diagnosis was more common in LMICs even after adjusting for age (odds ratio for low-income countries vs upper-middle-income countries and HICs, 17.92 [95% CI, 12.94-24.80], and for lower-middle-income countries vs upper-middle-income countries and HICs, 5.74 [95% CI, 4.30-7.68]). Conclusions and Relevance: This study is estimated to have included more than half of all new retinoblastoma cases worldwide in 2017. Children from LMICs, where the main global retinoblastoma burden lies, presented at an older age with more advanced disease and demonstrated a smaller proportion of familial history of retinoblastoma, likely because many do not reach a childbearing age. Given that retinoblastoma is curable, these data are concerning and mandate intervention at national and international levels. Further studies are needed to investigate factors, other than age at presentation, that may be associated with advanced disease in LMICs.
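As a brief illustration of how odds ratios with Wald confidence intervals of the kind reported above are derived from a 2×2 table (a sketch only; the counts below are hypothetical and are not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% Wald CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(90, 10, 30, 70)  # OR = (90*70)/(10*30) = 21.0
```

The confidence interval is symmetric on the log scale, which is why published intervals such as 17.92 (12.94-24.80) are asymmetric around the point estimate.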

    The global retinoblastoma outcome study: a prospective, cluster-based analysis of 4064 patients from 149 countries

    DATA SHARING: The study data will become available online once all analyses are complete. BACKGROUND: Retinoblastoma is the most common intraocular cancer worldwide. There is some evidence to suggest that major differences exist in treatment outcomes for children with retinoblastoma from different regions, but these differences have not been assessed on a global scale. We aimed to report 3-year outcomes for children with retinoblastoma globally and to investigate factors associated with survival. METHODS: We did a prospective cluster-based analysis of treatment-naive patients with retinoblastoma who were diagnosed between Jan 1, 2017, and Dec 31, 2017, then treated and followed up for 3 years. Patients were recruited from 260 specialised treatment centres worldwide. Data were obtained from participating centres on primary and additional treatments, duration of follow-up, metastasis, eye globe salvage, and survival outcome. We analysed time to death and time to enucleation with Cox regression models. FINDINGS: The cohort included 4064 children from 149 countries. The median age at diagnosis was 23·2 months (IQR 11·0–36·5). Extraocular tumour spread (cT4 of the cTNMH classification) at diagnosis was reported in five (0·8%) of 636 children from high-income countries, 55 (5·4%) of 1027 children from upper-middle-income countries, 342 (19·7%) of 1738 children from lower-middle-income countries, and 196 (42·9%) of 457 children from low-income countries. Enucleation surgery was available for all children and intravenous chemotherapy was available for 4014 (98·8%) of 4064 children. The 3-year survival rate was 99·5% (95% CI 98·8–100·0) for children from high-income countries, 91·2% (89·5–93·0) for children from upper-middle-income countries, 80·3% (78·3–82·3) for children from lower-middle-income countries, and 57·3% (52·1–63·0) for children from low-income countries.
In these models, independent factors for worse survival were residence in low-income countries versus high-income countries (hazard ratio 16·67; 95% CI 4·76–50·00), cT4 advanced tumour versus cT1 (8·98; 4·44–18·18), and older age at diagnosis in children up to 3 years (1·38 per year; 1·23–1·56). For children aged 3–7 years, the mortality risk decreased slightly (p=0·0104 for the change in slope). INTERPRETATION: This study, estimated to include approximately half of all new retinoblastoma cases worldwide in 2017, shows profound inequity in the survival of children depending on the national income level of their country of residence. In high-income countries, death from retinoblastoma is rare, whereas in low-income countries estimated 3-year survival is just over 50%. Although essential treatments are available in nearly all countries, early diagnosis and treatment in low-income countries are key to improving survival outcomes. FUNDING: The Queen Elizabeth Diamond Jubilee Trust and the Wellcome Trust.
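A per-year hazard ratio from a Cox model compounds multiplicatively across a covariate difference; a minimal sketch of that scaling, assuming proportional hazards with a log-linear age effect (the function name is ours, not the study's):

```python
def scaled_hr(hr_per_unit, units):
    """Scale a per-unit Cox hazard ratio to a larger covariate
    difference: HR over k units = (HR per unit) ** k."""
    return hr_per_unit ** units

# An HR of 1.38 per year of age implies, for a child 2 years older
# (within the up-to-3-years range where the effect was estimated):
hr_2yr = scaled_hr(1.38, 2)  # 1.38 ** 2, roughly 1.9
```

This is why even a modest per-year hazard ratio translates into a substantially elevated risk for children diagnosed years later than their peers.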

    Soil Carbon Accumulation under Switchgrass Barriers

    The benefits of grass barriers or hedges for reducing offsite transport of non-point-source water pollutants from croplands are well recognized, but their ancillary benefits for soil properties have received less attention. We studied the 15-yr cumulative effects of narrow, perennial switchgrass (Panicum virgatum L.) barriers on soil organic C (SOC), total N, particulate organic matter (POM), and associated soil structural properties, compared with the cropped area, on an Aksarben silty clay loam (fine, smectitic, mesic Typic Argiudoll) with 5.4% slope in eastern Nebraska. Five switchgrass barriers were established in 1998 at ~38-m intervals parallel to the crop rows in a field under a conventional-tillage and no-till grain sorghum [Sorghum bicolor (L.) Moench]–soybean [Glycine max (L.) Merr.]–corn (Zea mays L.) rotation. Compared with the cropped area, switchgrass barriers accumulated about 0.85 Mg ha⁻¹ yr⁻¹ of SOC and 80 kg ha⁻¹ yr⁻¹ of total soil N at the 0- to 15-cm soil depth. Switchgrass barriers also increased coarse POM by 60%. Mean weight diameter of water-stable aggregates increased by 70% at 0 to 15 cm and by 40% at 15 to 60 cm, indicating that switchgrass barriers improved soil aggregation at depth. Large (4.75–8 mm) macroaggregates under switchgrass barriers contained 30% more SOC than those under the cropped area. Switchgrass-induced changes in SOC concentration were positively associated with aggregate stability (r = 0.89***) and porosity (r = 0.47*). Overall, switchgrass barriers integrated with intensively managed agroecosystems can increase the SOC pool and improve soil structural properties.
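A back-of-the-envelope check of what the reported annual rates imply over the 15-yr study period (a sketch; the cumulative totals below are inferred from the per-year rates, not taken from the paper):

```python
def cumulative_stock(rate_per_yr, years):
    """Cumulative accumulation assuming a constant annual rate."""
    return rate_per_yr * years

# 0.85 Mg SOC ha^-1 yr^-1 over 15 yr -> about 12.75 Mg SOC ha^-1
soc_gain = cumulative_stock(0.85, 15)

# 80 kg N ha^-1 yr^-1 over 15 yr -> about 1200 kg N ha^-1
n_gain = cumulative_stock(80, 15)
```

The constant-rate assumption is a simplification; SOC accumulation under perennial grasses typically slows as soils approach a new equilibrium.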

    Watershed Scale Impacts of Buffers and Upland Conservation Practices on Agrochemical Delivery to Streams

    Conservation buffers are designed to reduce sediment and agrichemical runoff to surface water. Much is known about the plot- and field-scale effectiveness of buffers, but little is known about their watershed-scale impact. Our objective was to estimate the watershed-scale impact of grass buffers by comparing sediment and agrichemical losses from two adjacent 141–165 ha watersheds, one with conservation buffers and one without. Rainfall-derived runoff events from 2002–2003 were monitored for water runoff, TSS, phosphorus, and atrazine loss. The conservation watershed included 0.8 km of grass buffers and 0.8 km of riparian forest buffer, ridge-tilled corn, a corn–beans–alfalfa rotation, terraces, and grassed waterways. The control watershed had no buffers and featured disk-tilled continuous corn and grassed waterways. The same atrazine application rate and method for corn was used in each watershed. Total rainfall during the April–June monitoring period was similar in 2002 and 2003; however, the conservation watershed produced only 27 mm of runoff, compared with 47 mm from the control. Over two years, TSS and phosphorus losses per hectare were reduced by 97% and 95%, respectively, in the conservation watershed. Atrazine loss per hectare was 57% less in the conservation watershed. A separation technique showed that, for 2002, other conservation practices reduced TSS by 84% and buffers reduced TSS by an additional 13% compared with the control. Similarly, other conservation practices reduced atrazine losses by 29% and buffers accounted for an additional 31%. At the watershed scale, buffers can add benefit to a conservation system.
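The separation technique partitions the total watershed-scale reduction into a share attributable to upland practices and the additional share attributable to buffers; a minimal sketch of that additive accounting, assuming the shares simply sum to the total (function and variable names are ours, not the authors'):

```python
def buffer_share(total_reduction_pct, upland_reduction_pct):
    """Additional reduction (percentage points) attributed to buffers,
    assuming total reduction = upland-practice share + buffer share."""
    return total_reduction_pct - upland_reduction_pct

# 2002 TSS: 97% total reduction, 84% from other conservation practices
tss_from_buffers = buffer_share(97, 84)       # 13 percentage points

# 2002 atrazine: 29% from other practices plus the buffer contribution
atrazine_from_buffers = buffer_share(60, 29)  # 31 percentage points
```

The 60% total used for atrazine here is simply the sum of the two reported 2002 shares (29% + 31%); the two-year aggregate reduction reported in the abstract was 57%.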

    A Graduate-Level Field Course in Irrigation and Agricultural Water Management for an Immersive Learning Experience

    Effective irrigation and agricultural water management (IAWM) is critical for food security and water security. A key requirement in designing, implementing, and operating IAWM is the necessary knowledge and capacity on the farm, in the service industry, and within the supply chain. Educational opportunities that not only teach the relevant principles of irrigated agriculture but also the necessary applied skills are essential. An Irrigation Field Course was initiated by the IHE Delft Institute for Water Education (IHE Delft) and was later developed as a joint field course with IHE Delft, the University of Nebraska-Lincoln (UNL), and the Daugherty Water for Food Global Institute (DWFI). The field course was designed as an immersive two-week experience combining hands-on exercises (both laboratory and field), brief lectures to prepare students for the labs, data analysis, lab reports, and tours. Laboratory topics included surface, sprinkler, and drip irrigation systems; flow in pipelines and open channels; and irrigation well hydraulics. Tours addressed broader topics, including IAWM extension programs, technology for irrigation management, manufacturing, water resources management, and impacts on ecosystems. One benefit of the field course is that it provides a vantage point from which a student gains a clearer systems perspective of irrigated agriculture, whether their home country is in an irrigation development phase or a water conservation phase. Students often use the data-collection skills from the field course in their graduate research projects. A survey was conducted to assess which components of the course were most helpful to the students and to compare the format of the field course to a typical semester-long course.
Students indicated that the immersive two-week experience, with hands-on learning and tours providing a systems perspective, was particularly helpful (60% “strongly agree”) compared with a lecture-based irrigation course.