
    Biotechnologies as catalysts for driving net zero

    The R&D impact delivered by this work extends to policy development and to the benefits of circularity, green growth and reduced carbon emissions achieved through anaerobic digestion, which (1) recovers a variety of organic wastes and low-value biomass and (2) produces bioenergy and fertiliser. Other biotechnologies under development can recover resources for the production of fuels (CH4, H2 and NH3), chemicals (e.g. volatile fatty acids), biopolymers (e.g. polyhydroxyalkanoates) and single-cell proteins for use in animal feed. Biotechnologies delivering solutions for Power-to-X, for energy storage and for the capture and use of carbon have also been a focus of our research. Monitoring and control methodologies for these biotechnologies have been developed, including analytical technologies such as FTNIR, GC-IMS and qPCR. Work continues on the valorisation of digestates as microbial and algal growth media and on the recovery of nutrients (NPK). Evaluations of the fate of polymers in the environment, their biochemical recycling, and the production of biostimulants for soil and crop improvement, nitrogen fixation and emissions reduction are all in progress. The technologies currently span TRL 3-6 and require further R&D to progress to commercialisation. Deploying industrial biotechnologies is essential: they act as sustainable catalysts for change and for delivering net zero, the circular economy and green growth. Biotechnologies can benefit the sustainability of cities and improve their relationship with, and integration into, surrounding rural areas.

    The CAR-HEMATOTOX risk-stratifies patients for severe infections and disease progression after CD19 CAR-T in R/R LBCL

    BACKGROUND: CD19-directed chimeric antigen receptor T-cell therapy (CAR-T) represents a promising treatment modality for an increasing number of B-cell malignancies. However, prolonged cytopenias and infections substantially contribute to the toxicity burden of CAR-T. The recently developed CAR-HEMATOTOX (HT) score, composed of five pre-lymphodepletion variables (absolute neutrophil count, platelet count, hemoglobin, C-reactive protein, and ferritin), enables risk stratification of hematological toxicity. METHODS: In this multicenter retrospective analysis, we characterized early infection events (days 0-90) and clinical outcomes in 248 patients receiving standard-of-care CD19 CAR-T for relapsed/refractory large B-cell lymphoma. This included a derivation cohort (cohort A, 179 patients) and a second independent validation cohort (cohort B, 69 patients). Cumulative incidence curves were calculated for all-grade, grade ≥3, and specific infection subtypes. Clinical outcomes were studied via Kaplan-Meier estimates. RESULTS: In a multivariate analysis adjusted for other baseline features, the HT score identified patients at high risk for severe infections (adjusted HR 6.4, 95% CI 3.1 to 13.1). HT(high) patients more frequently developed severe infections (40% vs 8%, p<0.0001), particularly severe bacterial infections (27% vs 0.9%, p<0.0001). Additionally, multivariate analysis of post-CAR-T factors revealed that infection risk was increased by prolonged neutropenia (≥14 days) and corticosteroid use (≥9 days), and decreased with fluoroquinolone prophylaxis. Antibacterial prophylaxis significantly reduced the likelihood of severe bacterial infections in HT(high) patients (16% vs 46%, p<0.001), but not in HT(low) patients (0% vs 2%, p=n.s.). Collectively, HT(high) patients experienced worse median progression-free survival (3.4 vs 12.6 months) and overall survival (9.1 months vs not reached), and were hospitalized longer (median 20 vs 16 days). Severe infections represented the most common cause of non-relapse mortality after CAR-T and were associated with poor survival outcomes. A trend toward increased non-relapse mortality in HT(high) patients was observed (8.0% vs 3.7%, p=0.09). CONCLUSIONS: These data demonstrate the utility of the HT score to risk-stratify patients for infectious complications and poor survival outcomes prior to CD19 CAR-T. High-risk patients likely benefit from anti-infective prophylaxis and should be closely monitored for potential infections and relapse.
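
    To make the scoring logic concrete, here is a minimal Python sketch of a CAR-HEMATOTOX-style composite score. The thresholds, point weights, and HT(high) cutoff are illustrative assumptions, not the validated values from the publication.

```python
# Minimal sketch of a CAR-HEMATOTOX-style composite score.
# All cutoffs, point weights, and the HT(high) boundary below are
# illustrative assumptions; use the published, validated values in practice.

def ht_score(anc, platelets, hemoglobin, crp, ferritin):
    """Count pre-lymphodepletion variables crossing an (assumed) adverse
    threshold; a higher total indicates higher hematotoxicity risk."""
    points = 0
    points += anc < 1.2         # absolute neutrophil count, 10^9/L (assumed)
    points += platelets < 175   # platelet count, 10^9/L (assumed)
    points += hemoglobin < 9.0  # g/dL (assumed)
    points += crp > 3.0         # C-reactive protein, mg/dL (assumed)
    points += ferritin > 650    # ng/mL (assumed)
    return points

def risk_group(score, cutoff=2):
    # Dichotomize into the two groups compared in the abstract (assumed cutoff).
    return "HT(high)" if score >= cutoff else "HT(low)"

print(risk_group(ht_score(anc=0.8, platelets=120, hemoglobin=8.5,
                          crp=5.0, ferritin=900)))  # -> HT(high)
```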

    Transcriptomics and adaptive genomics of the asymptomatic bacteriuria Escherichia coli strain 83972

    Escherichia coli strains are the major cause of urinary tract infections in humans. Such strains can be divided into virulent UPEC strains, which cause symptomatic infections, and commensal-like strains, which cause asymptomatic bacteriuria (ABU). The best-characterized ABU strain is strain 83972. Global gene expression profiling of strain 83972 has been carried out under seven different sets of environmental conditions, ranging from laboratory minimal medium to human bladders. The data reveal highly specific gene expression responses to different conditions, and a number of potential fitness factors for the human urinary tract could be identified. Presence/absence gene expression data were also used as an adaptive genomics tool to model the gene pool of 83972, using the UPEC strain CFT073 primarily as a scaffold. In our analysis, 96% of the transcripts scored as present in strain 83972 can be found in CFT073, and genes on six of the seven pathogenicity islands were expressed in 83972. Despite the very different patient symptom profiles, the two strains seem to be very similar. Genes expressed in CFT073 but not in 83972 were identified and can be considered virulence factor candidates. Strain 83972 thus appears to be a deconstructed pathogen rather than a commensal strain that has acquired fitness properties.
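
    The presence/absence comparison described above reduces to set operations over gene inventories. A toy Python sketch follows; the gene names are invented placeholders, not the strains' actual annotations.

```python
# Toy illustration of the presence/absence comparison: treat each strain
# as a set of genes scored present/expressed, then ask which CFT073 genes
# are missing from 83972 (virulence factor candidates). The gene lists
# below are invented placeholders, not the strains' actual annotations.

genes_83972 = {"geneA", "geneB", "geneC", "geneD"}
genes_cft073 = {"geneA", "geneB", "geneC", "geneD", "papX", "hlyX"}

shared = genes_83972 & genes_cft073
candidates = genes_cft073 - genes_83972  # in CFT073, absent from 83972

print(f"{len(shared) / len(genes_83972):.0%} of 83972 genes found in CFT073")
print("virulence factor candidates:", sorted(candidates))
```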

    Comparative genomics of Escherichia coli isolated from patients with inflammatory bowel disease

    BACKGROUND: Inflammatory bowel disease (IBD) describes a state of idiopathic, chronic inflammation of the gastrointestinal tract. The two main phenotypes of IBD are Crohn's disease (CD) and ulcerative colitis (UC). The major cause of IBD-associated mortality is colorectal cancer. Although both host-genetic and exogenous factors have been implicated, the aetiology of IBD is still not well understood. In this study we characterized thirteen Escherichia coli strains from patients with IBD by comparative genomic hybridization, employing a microarray based on 31 sequenced E. coli genomes from a wide range of commensal and pathogenic isolates. RESULTS: The IBD isolates, obtained from patients with UC and CD, displayed remarkably heterogeneous genomic profiles with little or no evidence of group-specific determinants. No IBD-specific genes were evident when compared with the prototypic CD isolate LF82, suggesting that the IBD-inducing effect of the strains is multifactorial. Several of the IBD isolates carried a number of extraintestinal pathogenic E. coli (ExPEC)-related virulence determinants such as the pap, sfa, cdt and hly genes. The isolates were also found to carry genes of ExPEC-associated genomic islands. CONCLUSIONS: Combined, these data suggest that E. coli isolates obtained from UC and CD patients represent a heterogeneous population of strains with genomic profiles indistinguishable from those of ExPEC isolates. Our findings indicate that IBD induction by E. coli strains is multifactorial and that a range of gene products may be involved in triggering the disease.
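
    One way to quantify the heterogeneity of such presence/absence CGH profiles, not named in the abstract but a common choice, is pairwise Jaccard distances between strains, as in the hedged Python sketch below; the binary matrix is random placeholder data, not the study's hybridization results.

```python
# Quantifying heterogeneity among CGH presence/absence profiles via
# pairwise Jaccard distances. The 13 x 500 binary matrix below is random
# placeholder data, not the study's results.
import numpy as np

rng = np.random.default_rng(0)
n_strains, n_genes = 13, 500  # 13 IBD isolates; probe count is assumed
profiles = rng.integers(0, 2, size=(n_strains, n_genes))  # 1 = gene present

def jaccard_distance(a, b):
    union = np.logical_or(a, b).sum()
    inter = np.logical_and(a, b).sum()
    return 1.0 - inter / union if union else 0.0

pairs = [(i, j) for i in range(n_strains) for j in range(i + 1, n_strains)]
dists = [jaccard_distance(profiles[i], profiles[j]) for i, j in pairs]
print(f"mean pairwise Jaccard distance: {np.mean(dists):.2f}")
```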

    Two Years with COVID-19: The Electronic Frailty Index Identifies High-Risk Patients in the Stockholm GeroCovid Study

    INTRODUCTION: Frailty, a measure of biological aging, has been linked to worse COVID-19 outcomes. However, as the mortality differs across the COVID-19 waves, it is less clear whether a medical record-based electronic frailty index (eFI) that we have previously developed for older adults could be used for risk stratification in hospitalized COVID-19 patients. OBJECTIVES: The aim of the study was to examine the association of frailty with mortality, readmission, and length of stay in older COVID-19 patients and to compare the predictive accuracy of the eFI to other frailty and comorbidity measures. METHODS: This was a retrospective cohort study using electronic health records (EHRs) from nine geriatric clinics in Stockholm, Sweden, comprising 3,980 COVID-19 patients (mean age 81.6 years) admitted between March 2020 and March 2022. Frailty was assessed using a 48-item eFI developed for Swedish geriatric patients, the Clinical Frailty Scale, and the Hospital Frailty Risk Score. Comorbidity was measured using the Charlson Comorbidity Index. We analyzed in-hospital mortality and 30-day readmission using logistic regression, 30-day and 6-month mortality using Cox regression, and the length of stay using linear regression. Predictive accuracy of the logistic regression and Cox models was evaluated by area under the receiver operating characteristic curve (AUC) and Harrell's C-statistic, respectively. RESULTS: Across the study period, the in-hospital mortality rate decreased from 13.9% in the first wave to 3.6% in the latest (Omicron) wave. Controlling for age and sex, a 10% increment in the eFI was significantly associated with higher risks of in-hospital mortality (odds ratio = 2.95; 95% confidence interval = 2.42-3.62), 30-day mortality (hazard ratio [HR] = 2.39; 2.08-2.74), 6-month mortality (HR = 2.29; 2.04-2.56), and a longer length of stay (β-coefficient = 2.00; 1.65-2.34) but not with 30-day readmission. The association between the eFI and in-hospital mortality remained robust across the waves, even after the vaccination rollout. Among all measures, the eFI had the best discrimination for in-hospital (AUC = 0.780), 30-day (Harrell's C = 0.733), and 6-month mortality (Harrell's C = 0.719). CONCLUSION: An eFI based on routinely collected EHRs can be applied in identifying high-risk older COVID-19 patients during the continuing pandemic.
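
    As an illustration of the analysis pattern described in the methods (logistic regression evaluated by AUC, Cox regression evaluated by Harrell's C), here is a hedged Python sketch on simulated data; the variable names, effect sizes, and cohort are assumptions, not the study's records.

```python
# Simulated illustration of the analysis pattern: logistic regression for
# in-hospital mortality (evaluated by AUC) and a Cox model for 30-day
# mortality (evaluated by Harrell's C). Effect sizes, variables, and the
# cohort below are assumptions, not the study's data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
efi = rng.uniform(0.0, 0.5, n)                # electronic frailty index (0-1)
age = rng.normal(82, 7, n)
p_death = 1 / (1 + np.exp(-(-5 + 8 * efi + 0.02 * (age - 82))))
died = rng.random(n) < p_death                # simulated in-hospital death

X = np.column_stack([efi, age])
clf = LogisticRegression().fit(X, died)
print("AUC:", roc_auc_score(died, clf.predict_proba(X)[:, 1]))

df = pd.DataFrame({
    "efi10": efi * 10,  # per-10% increment, as reported in the abstract
    "age": age,
    "time": rng.exponential(60 / (1 + 5 * efi)).clip(max=30),  # days
    "event": died.astype(int),
})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print("Harrell's C:", cph.concordance_index_)
```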

    The Missing Part of Seed Dispersal Networks: Structure and Robustness of Bat-Fruit Interactions

    Mutualistic networks are crucial to the maintenance of ecosystem services. Unfortunately, what we know about seed dispersal networks is based only on bird-fruit interactions. Therefore, we aimed to fill part of this gap by investigating bat-fruit networks. It is known from population studies that (i) some bat species depend more on fruits than others, and (ii) some specialized frugivorous bats prefer particular plant genera. We tested whether those preferences affected the structure and robustness of the whole network and the functional roles of species. Nine bat-fruit datasets from the literature were analyzed; all networks showed lower complementary specialization (H2' = 0.37±0.10, mean ± SD) than pollination networks and similar nestedness (NODF = 0.56±0.12). All networks were modular (M = 0.32±0.07) and had on average four cohesive subgroups (modules) of tightly connected bats and plants. The composition of those modules followed the genus-genus associations observed at the population level (Artibeus-Ficus, Carollia-Piper, and Sturnira-Solanum), although a few of those plant genera were dispersed also by other bats. Bat-fruit networks showed high robustness to simulated cumulative removals of both bats (R = 0.55±0.10) and plants (R = 0.68±0.09). Primary frugivores interacted with a larger proportion of the plants available and also occupied more central positions; furthermore, their extinction caused larger changes in network structure. We conclude that bat-fruit networks are highly cohesive and robust mutualistic systems, in which redundancy is high within modules, although modules are complementary to each other. Dietary specialization seems to be an important structuring factor that affects the topology, guild structure and functional roles in bat-fruit networks.
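
    The robustness values R quoted above are areas under attack-tolerance curves: species of one guild are removed cumulatively while tracking the fraction of the other guild that retains at least one partner. A minimal Python sketch on a random toy matrix, assuming a random removal order:

```python
# Robustness R as the area under an attack-tolerance curve: bats are
# removed cumulatively and we track the fraction of plants that still
# have at least one disperser. The interaction matrix is a random toy.
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((8, 15)) < 0.3  # rows = bat species, cols = plant species

def robustness(A, order=None):
    n_bats = A.shape[0]
    order = rng.permutation(n_bats) if order is None else order
    remaining = A.copy()
    curve = [remaining.any(axis=0).mean()]  # fraction of plants with a partner
    for bat in order:
        remaining[bat, :] = False
        curve.append(remaining.any(axis=0).mean())
    ys = np.asarray(curve)
    return ((ys[:-1] + ys[1:]) / 2).sum() / n_bats  # trapezoidal area

print("R for random removal of bats:", round(robustness(A), 2))
```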

    Crop pests and predators exhibit inconsistent responses to surrounding landscape composition

    The idea that noncrop habitat enhances pest control and represents a win–win opportunity to conserve biodiversity and bolster yields has emerged as an agroecological paradigm. However, while noncrop habitat in landscapes surrounding farms sometimes benefits pest predators, natural enemy responses remain heterogeneous across studies and effects on pests are inconclusive. The observed heterogeneity in species responses to noncrop habitat may be biological in origin or could result from variation in how habitat and biocontrol are measured. Here, we use a pest-control database encompassing 132 studies and 6,759 sites worldwide to model natural enemy and pest abundances, predation rates, and crop damage as a function of landscape composition. Our results showed that although landscape composition explained significant variation within studies, pest and enemy abundances, predation rates, crop damage, and yields each exhibited different responses across studies, sometimes increasing and sometimes decreasing in landscapes with more noncrop habitat but overall showing no consistent trend. Thus, models that used landscape-composition variables to predict pest-control dynamics demonstrated little potential to explain variation across studies, though prediction did improve when comparing studies with similar crop and landscape features. Overall, our work shows that surrounding noncrop habitat does not consistently improve pest management, meaning habitat conservation may bolster production in some systems and depress yields in others. Future efforts to develop tools that inform farmers when habitat conservation truly represents a win–win would benefit from increased understanding of how landscape effects are modulated by local farm management and the biology of pests and their enemies.
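
    A hedged sketch of the kind of cross-study model this implies: enemy abundance regressed on the proportion of surrounding noncrop habitat with study-level random effects, so that per-study slopes may differ in sign. The data are simulated and the model structure is an assumption, not the paper's exact specification.

```python
# Cross-study model sketch: natural enemy abundance regressed on the
# proportion of noncrop habitat, with random intercepts and slopes per
# study so responses can differ in sign. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_studies, sites = 30, 20
study = np.repeat(np.arange(n_studies), sites)
noncrop = rng.uniform(0, 1, n_studies * sites)   # proportion noncrop habitat
slopes = rng.normal(0, 1, n_studies)             # study-specific responses
abundance = slopes[study] * noncrop + rng.normal(0, 0.5, study.size)

df = pd.DataFrame({"study": study, "noncrop": noncrop, "abundance": abundance})
fit = smf.mixedlm("abundance ~ noncrop", df, groups="study",
                  re_formula="~noncrop").fit()
print(fit.summary())  # fixed effect near zero despite strong per-study slopes
```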

    State of the climate in 2018

    In 2018, the dominant greenhouse gases released into Earth’s atmosphere (carbon dioxide, methane, and nitrous oxide) continued their increase. The annual global average carbon dioxide concentration at Earth’s surface was 407.4 ± 0.1 ppm, the highest in the modern instrumental record and in ice core records dating back 800 000 years. Combined, greenhouse gases and several halogenated gases contribute just over 3 W m⁻² to radiative forcing and represent a nearly 43% increase since 1990. Carbon dioxide is responsible for about 65% of this radiative forcing. With a weak La Niña in early 2018 transitioning to a weak El Niño by the year’s end, the global surface (land and ocean) temperature was the fourth highest on record, with only 2015 through 2017 being warmer. Several European countries reported record high annual temperatures. There were also more high, and fewer low, temperature extremes than in nearly all of the 68-year extremes record. Madagascar recorded a record daily temperature of 40.5°C in Morondava in March, while South Korea set its record high of 41.0°C in August in Hongcheon. Nawabshah, Pakistan, recorded its highest temperature of 50.2°C, which may be a new daily world record for April. Globally, the annual lower troposphere temperature was third to seventh highest, depending on the dataset analyzed. The lower stratospheric temperature was approximately fifth lowest. The 2018 Arctic land surface temperature was 1.2°C above the 1981–2010 average, tying for third highest in the 118-year record, following 2016 and 2017. June’s Arctic snow cover extent was almost half of what it was 35 years ago. Across Greenland, however, regional summer temperatures were generally below or near average. Additionally, a satellite survey of 47 glaciers in Greenland indicated a net increase in area for the first time since records began in 1999. Increasing permafrost temperatures were reported at most observation sites in the Arctic, with the overall increase of 0.1°–0.2°C between 2017 and 2018 being comparable to the highest rate of warming ever observed in the region. On 17 March, Arctic sea ice extent marked the second smallest annual maximum in the 38-year record, larger than only 2017. The minimum extent in 2018 was reached on 19 September and again on 23 September, tying 2008 and 2010 for the sixth lowest extent on record. The 23 September date tied 1997 as the latest sea ice minimum date on record. First-year ice now dominates the ice cover, comprising 77% of the March 2018 ice pack compared to 55% during the 1980s. Because thinner, younger ice is more vulnerable to melting out in summer, this shift in sea ice age has contributed to the decreasing trend in minimum ice extent. Regionally, Bering Sea ice extent was at record lows for almost the entire 2017/18 ice season. For the Antarctic continent as a whole, 2018 was warmer than average. On the highest points of the Antarctic Plateau, the automatic weather station Relay (74°S) broke or tied six monthly temperature records throughout the year, with August breaking its record by nearly 8°C. However, cool conditions in the western Bellingshausen Sea and Amundsen Sea sector contributed to a low melt season overall for 2017/18. High SSTs contributed to low summer sea ice extent in the Ross and Weddell Seas in 2018, underpinning the second lowest Antarctic summer minimum sea ice extent on record.
Despite conducive conditions for its formation, the ozone hole at its maximum extent in September was near the 2000–18 mean, likely due to an ongoing slow decline in stratospheric chlorine monoxide concentration. Across the oceans, globally averaged SST decreased slightly since the record El Niño year of 2016 but was still far above the climatological mean. On average, SST has been increasing at a rate of 0.10° ± 0.01°C decade⁻¹ since 1950. The warming appeared largest in the tropical Indian Ocean and smallest in the North Pacific. The deeper ocean continues to warm year after year. For the seventh consecutive year, global annual mean sea level became the highest in the 26-year record, rising to 81 mm above the 1993 average. As anticipated in a warming climate, the hydrological cycle over the ocean is accelerating: dry regions are becoming drier and wet regions rainier. Closer to the equator, 95 named tropical storms were observed during 2018, well above the 1981–2010 average of 82. Eleven tropical cyclones reached Saffir–Simpson scale Category 5 intensity. North Atlantic Major Hurricane Michael’s landfall intensity of 140 kt was the fourth strongest for any continental U.S. hurricane landfall in the 168-year record. Michael caused more than 30 fatalities and $25 billion (U.S. dollars) in damages. In the western North Pacific, Super Typhoon Mangkhut led to 160 fatalities and $6 billion (U.S. dollars) in damages across the Philippines, Hong Kong, Macau, mainland China, Guam, and the Northern Mariana Islands. Tropical Storm Son-Tinh was responsible for 170 fatalities in Vietnam and Laos. Nearly all the islands of Micronesia experienced at least moderate impacts from various tropical cyclones. Across land, many areas around the globe received copious precipitation, notable at different time scales. Rodrigues and Réunion Island near southern Africa each reported their third wettest year on record. In Hawaii, 1262 mm of precipitation at Waipā Gardens (Kauai) on 14–15 April set a new U.S. record for 24-h precipitation. In Brazil, the city of Belo Horizonte received nearly 75 mm of rain in just 20 minutes, nearly half its monthly average. Globally, fire activity during 2018 was the lowest since the start of the record in 1997, with a combined burned area of about 500 million hectares. This reinforced the long-term downward trend in fire emissions driven by changes in land use in frequently burning savannas. However, wildfires burned 3.5 million hectares across the United States, well above the 2000–10 average of 2.7 million hectares. Combined, U.S. wildfire damages for the 2017 and 2018 wildfire seasons exceeded $40 billion (U.S. dollars).