
    SPARC is a new myeloid-derived suppressor cell marker licensing suppressive activities

    Myeloid-derived suppressor cells (MDSC) are well-known key negative regulators of the immune response during tumor growth; however, knowledge of their capacity to influence and adapt to different tumor microenvironments, and of the markers that identify these capacities, remains scattered. Here we show that secreted protein acidic and rich in cysteine (SPARC) identifies, in both humans and mice, MDSC with immune suppressive capacity and pro-tumoral activities, including the induction of epithelial-to-mesenchymal transition (EMT) and angiogenesis. In mice, the genetic deletion of SPARC reduced MDSC immune suppression and reverted EMT. Sparc−/− MDSC were less suppressive overall, and the granulocytic fraction was more prone to extrude neutrophil extracellular traps (NET). Surprisingly, arginase-I and NOS2, whose expression can be controlled by STAT3, were not down-regulated in Sparc−/− MDSC, although these cells were less suppressive than their wild-type (WT) counterparts. Flow cytometry analysis showed equal phosphorylation of STAT3 but reduced ROS production, which was associated with reduced nuclear translocation of the NF-kB p50 subunit in Sparc−/− compared with WT MDSC. The limited p50 in nuclei reduces the formation of immunosuppressive p50:p50 homodimers in favor of inflammatory p65:p50 heterodimers. Supporting this hypothesis, TNF production by Sparc−/− MDSC was significantly higher than by WT MDSC. Although associated with tumor-induced chronic inflammation, TNF, when produced at high levels, becomes a key factor in mediating tumor rejection. It is therefore foreseeable that an imbalance in TNF production could skew MDSC toward an inflammatory, anti-tumor phenotype. Notably, TNF is also required for inflammation-driven NETosis. The high level of TNF in Sparc−/− MDSC might explain the increased spontaneous NET formation that we detected both in vitro and in vivo, in association with signs of endothelial damage. We propose SPARC as a new potential marker of MDSC, in both humans and mice, with the additional feature of controlling MDSC suppressive activity while preventing an excessive inflammatory state through control of the NF-kB signaling pathway.

    The Seveso studies on early and long-term effects of dioxin exposure: a review.

    The industrial accident that occurred in the town of Seveso, Italy, in 1976 exposed a large population to substantial amounts of relatively pure 2,3,7,8-tetrachlorodibenzo-p-dioxin. Extensive monitoring of soil levels and measurements of a limited number of human blood samples allowed classification of the exposed population into three categories: A (highest exposure), B (intermediate exposure), and R (lowest exposure). Early health investigations, including liver function, immune function, neurologic impairment, and reproductive effects, yielded inconclusive results. Chloracne (nearly 200 cases with a definite exposure dependence) was the only effect established with certainty. Long-term studies were conducted using the large population living in the surrounding noncontaminated territory as reference. An excess mortality from cardiovascular and respiratory diseases was uncovered, possibly related to the psychosocial consequences of the accident in addition to the chemical contamination. An excess of diabetes cases was also found. Results of cancer incidence and mortality follow-up showed an increased occurrence of cancer of the gastrointestinal sites and of the lymphatic and hematopoietic tissue. Experimental and epidemiologic data as well as mechanistic knowledge support the hypothesis that the observed cancer excesses are associated with dioxin exposure. Results cannot be viewed as conclusive. The study is continuing in an attempt to overcome the existing limitations (few individual exposure data, short latency period, and small population size for certain cancer types) and to explore new research paths (e.g., differences in individual susceptibility).

    Techno-economic assessment of SEWGS technology when applied to integrated steel-plant for CO2 emission mitigation

    Mitigation of CO2 emissions in the industrial sector is one of the main climate challenges for the coming decades. This work, carried out within the STEPWISE H2020 project, performs a preliminary techno-economic assessment of the Sorption Enhanced Water Gas Shift (SEWGS) technology when integrated into an iron and steel plant to mitigate CO2 emissions. The SEWGS separates the CO2 from the iron and steel off-gases with residual energy content (i.e. Blast Furnace Gas, Basic Oxygen Furnace Gas and Coke Oven Gas); the produced H2 is sent to the power generation section to produce the electricity required by the steel plant, while the CO2 is compressed and transported for storage. Detailed mass and energy balances are performed together with a SEWGS cost estimation to assess the energy penalty and additional costs related to CO2 capture. Results demonstrate the potential of SEWGS to capture over 80 % of the CO2 in the off-gases, which results in a plant-wide CO2 emission reduction of 40 % with a Specific Energy Consumption for CO2 Avoided (SPECCA) of around 1.9 MJ/kgCO2. SEWGS outperforms a commercial amine scrubbing technology, which has a SPECCA of 2.5 MJ/kgCO2 and only 20 % of CO2 avoided. The cost of CO2 avoided, calculated on the basis of a fully integrated steel plant, is around 33 €/tCO2 compared to 38 €/tCO2 for the amine technology.
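    The headline figures above follow the standard SPECCA and cost-of-CO2-avoided definitions used in capture studies. The Python sketch below shows the mechanics of both formulas; the per-tonne energy, emission, and cost inputs are hypothetical placeholders chosen only so the outputs land near the reported ~1.9 MJ/kgCO2 and ~33 €/tCO2, not values taken from the paper.

    def specca(q_capture, q_ref, e_ref, e_capture):
        """Specific Primary Energy Consumption for CO2 Avoided [MJ/kgCO2].

        q_*: specific primary energy use [MJ per tonne of product]
        e_*: specific CO2 emission [kgCO2 per tonne of product]
        """
        return (q_capture - q_ref) / (e_ref - e_capture)

    def cost_of_co2_avoided(c_capture, c_ref, e_ref, e_capture):
        """Cost of CO2 avoided [EUR/tCO2] from levelized product costs [EUR/tonne]."""
        return (c_capture - c_ref) / ((e_ref - e_capture) / 1000.0)  # kgCO2 -> tCO2

    # Hypothetical per-tonne-of-steel figures, for illustration only.
    q_ref, q_cap = 20_000.0, 21_500.0  # MJ/t steel, without / with capture
    e_ref, e_cap = 1_900.0, 1_140.0    # kgCO2/t steel (about 40 % lower with capture)
    c_ref, c_cap = 400.0, 425.0        # EUR/t steel, without / with capture
    print(f"SPECCA = {specca(q_cap, q_ref, e_ref, e_cap):.2f} MJ/kgCO2")
    print(f"Cost of CO2 avoided = {cost_of_co2_avoided(c_cap, c_ref, e_ref, e_cap):.1f} EUR/tCO2")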

    Legionella detection in water networks as per ISO 11731:2017: Can different filter pore sizes and direct placement on culture media influence laboratory results?

    Determination of Legionella concentrations in water networks is useful for predicting legionellosis risks. The standard culture technique using concentration with membrane filters is the most commonly used method for environmental surveillance of Legionella. The aim of this study was to verify whether filtration with different filter pore sizes (0.2 and 0.45 µm) according to ISO 11731:2017, followed by directly placing the filters on culture media, can influence Legionella detection. Three laboratories participated in an experimental study that tested a known suspension of Legionella pneumophila (Lpn) serogroup 1 (ATCC 33152) (approximate final cell density of 15 CFU/mL). E. coli (ATCC 11775) and Pseudomonas aeruginosa (ATCC 25668) were included as control tests. The average (95% CI) percentage of recovery of Lpn was 65% using 0.45-µm filters and 15% using 0.2-µm filters (p < 0.0001). For the control tests, the average (95% CI) percentage of recovery was higher with 0.45-µm vs. 0.2-µm filters: 97% vs. 64% for Escherichia coli (p < 0.00001) and 105% vs. 97% (p = 0.0244) for P. aeruginosa. Our results showed that the 0.45-µm filters provided the greatest detection of Legionella. Because the current national guidelines leave the choice of membrane porosity to the operator, experimental studies are important for directing operators towards an informed choice to standardize Legionella environmental surveillance methods.
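    As a concrete illustration of the recovery comparison described above, the short Python sketch below computes percentage recovery from colony counts on the two filter types. Only the spiked density (about 15 CFU/mL) comes from the abstract; the filtered volume and plate counts are hypothetical, chosen merely to reproduce recoveries in the reported range.

    import statistics

    SPIKED_DENSITY_CFU_PER_ML = 15.0  # from the test suspension
    VOLUME_ML = 10.0                  # assumed volume filtered per membrane
    EXPECTED_CFU = SPIKED_DENSITY_CFU_PER_ML * VOLUME_ML

    counts_045_um = [98, 105, 92]  # hypothetical colony counts, 0.45-um filters
    counts_020_um = [20, 25, 22]   # hypothetical colony counts, 0.2-um filters

    def percent_recovery(counts):
        return [100.0 * c / EXPECTED_CFU for c in counts]

    for label, counts in (("0.45 um", counts_045_um), ("0.2 um", counts_020_um)):
        recovery = percent_recovery(counts)
        print(f"{label}: mean recovery {statistics.mean(recovery):.0f}% "
              f"(sd {statistics.stdev(recovery):.1f})")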

    Metal(loid)s role in the pathogenesis of amyotrophic lateral sclerosis: Environmental, epidemiological, and genetic data

    Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disorder of the motor system. The etiology is still unknown and the pathogenesis remains unclear. ALS is familial in 10% of cases, with a Mendelian pattern of inheritance. In the remaining sporadic cases, a multifactorial origin is assumed, in which several predisposing genes interact with environmental factors. The etiological role of environmental factors, such as pesticides, exposure to electromagnetic fields, and metals, has been frequently investigated, with controversial findings. Studies in the past two decades have highlighted possible roles of metals, and dysregulation of ionic homeostasis has been proposed as the main trigger of motor-neuron degeneration. This study aims to evaluate the possible role of environmental factors in the etiopathogenesis of ALS, with particular attention to metal contamination, focusing on the industrial Briga area in the province of Novara (Piedmont region, North Italy), characterized by: i) a higher incidence of sporadic ALS (sALS) in comparison with the entire province, and ii) reported environmental pollution. Environmental data from surface, ground and discharge waters, and from soils were collected and specifically analyzed for metal content. Considering the significance of genetic mechanisms in ALS, a characterization of the main ALS genes was performed to evaluate the genetic contribution for the sALS patients living in the study area. The main finding of this study is that in the Briga area the most common metal contaminants are Cu, Zn, Cr, and Ni (widely used in tip-plating processes), which are above legal limits in surface waters, discharge waters, and soil. In addition, other metals and metalloids, such as Cd, Pb, Mn, and As, show severe contamination in the same area. Results of genetic analyses show that sALS patients in the Briga area do not carry recurrent mutations or an excess of mutations in the four main ALS causative genes (SOD1, TARDBP, FUS, C9ORF72) or at the ATXN2 CAG repeat locus. This study supports the hypothesis that the higher incidence of sALS in the Briga area may be related to environmental metal(loid) contamination, along with other environmental factors. Further studies, implementing analysis of genetic polymorphisms as well as investigations with long-term follow-up, may yield key insights into the etiology of ALS. The interplay between the different approaches (environmental, chemical, epidemiological, genetic) of our work provides new insights and methodology for understanding the disease etiology.
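    The exceedance screening implied above (comparing measured metal concentrations against regulatory limits for a given environmental matrix) can be illustrated in a few lines of Python. All concentration values and limits below are hypothetical placeholders, not data or legal thresholds from the study.

    # Assumed surface-water limits and a hypothetical sample, both in ug/L.
    limits_ug_per_l = {"Cu": 1000.0, "Zn": 3000.0, "Cr": 50.0, "Ni": 20.0}
    sample_ug_per_l = {"Cu": 1500.0, "Zn": 3500.0, "Cr": 80.0, "Ni": 35.0}

    for metal, concentration in sample_ug_per_l.items():
        limit = limits_ug_per_l[metal]
        status = "EXCEEDS" if concentration > limit else "within"
        print(f"{metal}: {concentration:.0f} ug/L ({status} the {limit:.0f} ug/L limit)")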

    2,3,7,8-Tetrachlorodibenzo-p-dioxin plasma levels in Seveso 20 years after the accident.

    In 1976, near Seveso, Italy, an industrial accident caused the release of large quantities of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) into the atmosphere, resulting in the highest levels of the toxicant ever recorded in humans. The contaminated area was divided into three zones (A, B, R) corresponding to decreasing TCDD levels in soil, and a cohort including all residents was enumerated. The population of the surrounding noncontaminated area (non-ABR) was chosen as the referent population. Two decades after the accident, plasma TCDD levels were measured in 62 subjects randomly sampled from the highest exposed zones (A and B) and 59 subjects from non-ABR, frequency matched for age, gender, and cigarette smoking status. Subjects living in the exposed areas have persistently elevated plasma TCDD levels (range = 1.2-89.9 ppt; geometric mean = 53.2 and 11.0 ppt for Zone A and Zone B, respectively). Levels decrease significantly with distance from the accident site (p = 0.0001), down to general population values (4.9 ppt) in non-ABR, thus validating the original zone classification based on environmental measurements. Women have higher TCDD levels than men in the entire study area (p = 0.0003 in Zone B; p = 0.007 in non-ABR). This gender difference persists after adjustment for location within the zone, consumption of meat derived from locally raised animals, age, body mass index, and smoking. There is no evidence for a gender difference in exposure, so variation in metabolism or elimination due to body fat or hormone-related factors may explain this finding. Elevated TCDD levels in women may contribute to adverse reproductive, developmental, and cancer outcomes.
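    Since the plasma TCDD concentrations above are summarized as geometric means (the conventional summary for such right-skewed exposure data), a minimal Python sketch of that zone-wise calculation follows. The individual plasma values are hypothetical, chosen only so the outputs fall near the reported 53.2, 11.0, and 4.9 ppt; only the zone labels come from the abstract.

    import math

    plasma_ppt = {
        "Zone A":  [35.0, 60.0, 72.0, 55.0],  # hypothetical ppt values
        "Zone B":  [8.0, 12.0, 10.5, 14.0],
        "non-ABR": [4.0, 5.5, 4.8, 5.2],
    }

    def geometric_mean(values):
        return math.exp(sum(math.log(v) for v in values) / len(values))

    for zone, values in plasma_ppt.items():
        print(f"{zone}: geometric mean = {geometric_mean(values):.1f} ppt (n={len(values)})")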