274 research outputs found

    Hybrid forward osmosis-reverse osmosis for wastewater reuse and seawater desalination: Understanding the optimal feed solution to minimise fouling

    To enhance the energy efficiency of seawater desalination, the forward osmosis–reverse osmosis (FO-RO) hybrid system has recently been developed. In this process, the FO "pre-treatment" step uses seawater (SW) as draw solution to filter wastewater (WW) while reducing the seawater osmotic pressure, thereby lowering the operating pressure required by the RO stage to desalinate the diluted SW. However, membrane fouling is a major issue that needs to be addressed, and proper selection of suitable WWs is necessary before proceeding to large-scale FO-RO desalination plants. In this study, long-term experiments were carried out with a state-of-the-art FO membrane using real WW and SW solutions. A combination of water flux modelling and membrane characterisation was used to assess the degree of membrane fouling and its impact on process performance. An initial water flux as high as 22.5 L m−2 h−1 was observed when using secondary effluent, which also caused negligible flux decline. By contrast, biologically treated wastewater and primary effluent caused mild and severe flux decline, respectively (25% and 50% flux decline after 80 hours, compared with non-fouling conditions). Ammonia leakage into the diluted seawater was also measured, leading to the conclusion that, if biologically treated wastewater is used as feed, the final NH4+ concentration in the draw is likely to be negligible.
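    The abstract mentions water flux modelling without specifying the model. The sketch below assumes the classic solution-diffusion FO flux equation with internal and external concentration polarisation (active layer facing the feed); all parameter values are illustrative placeholders, not the paper's measurements.

```python
# Minimal sketch of FO water-flux modelling under the assumed
# solution-diffusion model with concentration polarisation.
import math
from scipy.optimize import brentq

A = 1.0         # water permeability, L m^-2 h^-1 bar^-1 (assumed)
K = 0.05        # support-layer solute resistivity, m^2 h L^-1 (assumed)
k = 100.0       # external mass-transfer coefficient, L m^-2 h^-1 (assumed)
pi_draw = 28.0  # bulk osmotic pressure of seawater draw, bar (assumed)
pi_feed = 1.5   # bulk osmotic pressure of wastewater feed, bar (assumed)

def flux_residual(jw):
    # Implicit flux equation: internal CP dilutes the draw at the
    # membrane (exp(-Jw*K)); external CP concentrates the feed (exp(Jw/k)).
    return A * (pi_draw * math.exp(-jw * K) - pi_feed * math.exp(jw / k)) - jw

jw = brentq(flux_residual, 1e-6, A * pi_draw)  # bracket [~0, A*pi_draw]
print(f"Predicted water flux: {jw:.1f} L m^-2 h^-1")
```

    Fitting a fouled membrane's observed flux against this no-fouling prediction is one way to quantify the flux-decline percentages reported above.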

    Myeloid Diagnostic and Prognostic Markers of Immune Suppression in the Blood of Glioma Patients.

    Although gliomas are confined to the central nervous system, their negative influence over the immune system extends to the peripheral circulation. The immune suppression exerted by myeloid cells can affect both response to therapy and disease outcome. We analyzed the expansion of several myeloid parameters in the blood of low- and high-grade glioma patients and assessed their relevance as biomarkers of disease and clinical outcome. Methods: Peripheral blood was obtained from 134 low- and high-grade glioma patients. CD14+, CD14+/p-STAT3+, CD14+/PD-L1+, and CD15+ cells, as well as four myeloid-derived suppressor cell (MDSC) subsets, were evaluated by flow cytometry. Arginase-1 (ARG1) quantity and activity were determined in plasma. A multivariable logistic regression model was used to obtain a diagnostic score discriminating glioma patients from healthy controls and between glioma grades. A glioblastoma prognostic model was determined by multiple Cox regression using clinical and myeloid parameters. Results: Changes in myeloid parameters associated with immune suppression allowed us to define a diagnostic score estimating the risk of being a glioma patient. The same parameters, together with age, allowed calculation of a risk score differentiating each glioma grade. A prognostic model for glioblastoma patients emerged from the multiple Cox analysis, highlighting the role of MDSC, p-STAT3, and ARG1 activity, together with clinical parameters, in predicting patient outcome. Conclusions: This work emphasizes the role of systemic immune suppression carried out by myeloid cells in gliomas. The identification of biomarkers associated with the immune landscape, diagnosis, and outcome of glioblastoma patients lays the ground for their clinical use.
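    As a rough illustration of how a multivariable logistic-regression diagnostic score of this kind can be built, the sketch below uses scikit-learn; the column names, CSV layout, and train/test handling are hypothetical assumptions, not the authors' pipeline.

```python
# Hedged sketch: a diagnostic score from myeloid parameters via
# multivariable logistic regression. All names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical columns mirroring the measured myeloid parameters.
features = ["CD14_pos", "CD14_pSTAT3", "CD14_PDL1", "CD15_pos",
            "MDSC1", "MDSC2", "MDSC3", "MDSC4", "ARG1_activity"]
df = pd.read_csv("myeloid_panel.csv")               # assumed file layout
X, y = df[features], df["is_glioma"]                # 1 = glioma, 0 = healthy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# The fitted probability of class 1 acts as the diagnostic score:
# the estimated risk of being a glioma patient given the myeloid profile.
risk = model.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, risk))
```

    Refitting the same model per grade contrast, with age added as a covariate, would give the grade-differentiating risk scores the abstract describes.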

    SARS-CoV-2 Breakthrough Infections: Incidence and Risk Factors in a Large European Multicentric Cohort of Health Workers.

    Background: This study investigated the incidence of SARS-CoV-2 breakthrough infections and their determinants in a large European cohort of more than 60,000 health workers. Methods: A multicentric retrospective cohort study involving 12 European centers was carried out within the ORCHESTRA project, collecting data up to 18 November 2021 on fully vaccinated health workers. The cumulative incidence of SARS-CoV-2 breakthrough infections was investigated, together with its association with occupational and socio-demographic characteristics (age, sex, job title, previous SARS-CoV-2 infection, antibody titer levels, and time from completion of the vaccination course). Results: Among 64,172 health workers from the 12 European health centers, 797 breakthrough infections were observed (cumulative incidence of 1.2%). The primary analysis, using individual data from 8 of the 12 centers, showed that age and previous infection significantly modified breakthrough infection rates. In the meta-analysis of aggregated data from all centers, previous SARS-CoV-2 infection and the standardized antibody titer were inversely related to the risk of breakthrough infection (p = 0.008 and p = 0.007, respectively). Conclusion: The inverse correlation of antibody titer with the risk of breakthrough infection supports the evidence that vaccination plays a primary role in infection prevention, especially in health workers. Cellular immunity, previous clinical conditions, and vaccination timing should be further investigated.
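    The abstract does not detail the meta-analysis of aggregated center data; a common choice is inverse-variance random-effects pooling. The sketch below hand-codes the DerSimonian-Laird estimator on made-up center-level log relative risks, purely to show the mechanics; the numbers are not the ORCHESTRA estimates.

```python
# DerSimonian-Laird random-effects pooling of per-center effect estimates.
# Effect sizes and standard errors below are illustrative placeholders.
import numpy as np

log_rr = np.array([-1.2, -0.8, -1.5, -0.9, -1.1])  # hypothetical log RRs
se     = np.array([0.40, 0.35, 0.50, 0.30, 0.45])  # hypothetical SEs

w_fixed = 1.0 / se**2                        # inverse-variance weights
mu_f = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_rr - mu_f)**2)     # Cochran's Q heterogeneity
dof = len(log_rr) - 1
C = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - dof) / C)               # between-center variance

w_rand = 1.0 / (se**2 + tau2)                # random-effects weights
mu = np.sum(w_rand * log_rr) / np.sum(w_rand)
se_mu = np.sqrt(1.0 / np.sum(w_rand))
lo, hi = mu - 1.96 * se_mu, mu + 1.96 * se_mu
print(f"Pooled RR {np.exp(mu):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```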

    Rise and Demise of Bioinformatics? Promise and Progress

    The field of bioinformatics and computational biology has gone through a number of transformations during the past 15 years, establishing itself as a key component of new biology. This spectacular growth has been challenged by a number of disruptive changes in science and technology. Despite the apparent fatigue of the linguistic use of the term itself, bioinformatics has grown perhaps to a point beyond recognition. We explore both historical aspects and future trends and argue that, as the field expands, key questions remain unanswered and acquire new meaning, while at the same time the range of applications is widening to cover an ever-increasing number of biological disciplines. These trends appear to be pointing to a redefinition of certain objectives, milestones, and possibly the field itself.

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability between colorectal cancer patients undergoing delayed and non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only, and a sensitivity analysis explored the impact of longer delays. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male, and more comorbid, and to have a higher body mass index, rectal cancer, and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely due to micro-metastatic disease.
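    An adjusted odds ratio of this kind would typically come from a multivariable logistic model of complete resection on delay plus the confounders listed above; the sketch below assumes such a model in statsmodels, with hypothetical column names rather than the study's actual variable list.

```python
# Hedged sketch of the adjustment step via multivariable logistic
# regression; column names and file layout are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("colorectal_cohort.csv")   # assumed one row per patient

# complete_resection: 1 = R0 curative resection; delayed: 1 = >4 weeks.
fit = smf.logit(
    "complete_resection ~ delayed + age + sex + bmi + rectal + early_stage",
    data=df,
).fit()

# Exponentiating the coefficient for delay gives the adjusted OR and CI.
params, ci = fit.params, fit.conf_int()
or_delay = np.exp(params["delayed"])
ci_lo, ci_hi = np.exp(ci.loc["delayed"])
print(f"Adjusted OR for delay: {or_delay:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```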