77 research outputs found
An innovative index to incorporate transcriptomic data into weight of evidence approaches for environmental risk assessment
The sharp decrease in the cost of RNA sequencing and rapid improvements in the computational analysis of eco-toxicogenomic data have brought new insights into the adverse effects of chemicals on aquatic organisms. Yet transcriptomics is generally applied only qualitatively in environmental risk assessments, hampering more effective exploitation of this evidence in multidisciplinary studies. In view of this limitation, a methodology is presented here to quantitatively elaborate transcriptional data in support of environmental risk assessment. The proposed methodology uses results from the application of Gene Set Enrichment Analysis to recent studies investigating the response of Mytilus galloprovincialis and Ruditapes philippinarum exposed to contaminants of emerging concern. The degree of change in gene sets and the relevance of the corresponding physiological reactions are integrated in the calculation of a hazard index. The outcome is then classified into one of five hazard classes (from absent to severe), providing an evaluation of whole-transcriptome effects of chemical exposure. Application to experimental and simulated datasets showed that the method can effectively discriminate different levels of altered transcriptomic response when compared to expert judgement (Spearman correlation coefficient of 0.96). A further application to data collected in two independent studies of Salmo trutta and Xenopus tropicalis exposed to contaminants confirmed that the methodology can potentially be extended to other aquatic species. This methodology can serve as a proof of concept for the integration of "genomic tools" into environmental risk assessment based on multidisciplinary investigations. To this end, the proposed transcriptomic hazard index can now be incorporated into quantitative Weight of Evidence approaches and weighed, together with results from other types of analysis, to elucidate the role of chemicals in adverse ecological effects
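As a purely illustrative sketch of how such a transcriptomic hazard index could combine GSEA output, the following Python snippet weights the absolute normalized enrichment score (NES) of each significantly enriched gene set by an expert-assigned physiological relevance and maps the result onto five classes; the weighting scheme and cut-offs are assumptions for illustration, not the published method:

```python
import numpy as np

def hazard_index(nes_scores, relevance_weights):
    """Relevance-weighted mean of absolute NES values (hypothetical)."""
    nes = np.abs(np.asarray(nes_scores, dtype=float))
    w = np.asarray(relevance_weights, dtype=float)
    return float(np.sum(nes * w) / np.sum(w))

def hazard_class(hi, cutoffs=(1.0, 1.5, 2.0, 2.5)):
    # Five classes from absent to severe; cut-off values are illustrative
    labels = ["absent", "slight", "moderate", "major", "severe"]
    return labels[int(np.searchsorted(cutoffs, hi))]

# Example: three enriched gene sets with their relevance weights
hi = hazard_index([1.8, 2.4, 1.2], [0.5, 1.0, 0.3])
print(f"hazard index = {hi:.2f} -> {hazard_class(hi)}")
```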
Human body odors of happiness and fear modulate the late positive potential component during neutral face processing: a preliminary ERP study on healthy subjects
Human body odors (HBOs) are powerful stimuli that can affect emotional, cognitive, and behavioral processes. However, the physiological response to HBOs has yet to be fully characterized. Here, we analyzed self-assessed emotion perception and EEG event-related potentials (ERPs) in 17 healthy young women during simultaneous visual-olfactory stimulation. In particular, we evaluated the effect of happiness and fear HBOs on the amplitude of ERP waveforms elicited by neutral face processing. In addition, we evaluated the subjective valence and arousal ratings of the presented neutral faces by means of the Self-Assessment Manikin test. We observed a significant increase in the amplitude of the late positive potential (LPP) at central left sites (i.e., C3) during the administration of HBOs with respect to clean air. On the other hand, we did not observe any significant change in the subjective valence and arousal scores, nor in the early ERP components (i.e., P100, N170, vertex positive potential). Our preliminary results suggest that fear and happiness HBOs can induce a protracted increase in the LPP, possibly reflecting an automatic and sustained engagement with emotionally significant content
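One simple way to formalize the LPP contrast described above is a paired test on per-subject mean amplitudes in an LPP time window; a minimal sketch in which the file names, the time window, and the choice of test are assumptions, not the study's actual pipeline:

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject mean LPP amplitudes at C3 (e.g. in a
# 400-800 ms post-stimulus window), one value per participant/condition.
lpp_hbo = np.load("lpp_c3_hbo.npy")        # assumed precomputed
lpp_air = np.load("lpp_c3_clean_air.npy")  # assumed precomputed

# Paired comparison: HBO condition vs clean air in the same subjects
t_stat, p_val = stats.ttest_rel(lpp_hbo, lpp_air)
print(f"paired t({len(lpp_hbo) - 1}) = {t_stat:.2f}, p = {p_val:.3f}")
```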
Local lymph node involvement in pediatric renal cell carcinoma: A report from the Italian TREP project
Background. One of the most important adverse prognostic factors in adult renal cell carcinoma (RCC) is retroperitoneal lymph node involvement. The aim of this article is to study the prognostic significance of local lymph node involvement in pediatric RCC and the role of retroperitoneal lymph node dissection (RLND) at diagnosis. Procedure. The series included 16 patients with RCC and lymph node involvement registered in the Italian Rare Tumors in Pediatric Age (TREP) project, accounting for 26.2% of the 61 pediatric RCC cases observed at AIEOP centers. Results. A radical nephrectomy was performed in all cases: at diagnosis in 12 cases and after preoperative chemotherapy (CT) in 4 cases. As part of the same procedure, 9 patients underwent RLND and 7 received a more limited lymph node resection. Five patients (31.2%) developed disease recurrence 2-34 months after diagnosis (median, 6 months) and 1 developed disease progression; 6 patients died, 1 of them from secondary leukemia. Among the nine patients receiving RLND, eight are alive and disease free, compared with only one survivor among the seven receiving a more limited lymph node resection. The estimated 25-year progression-free survival (PFS) and overall survival (OS) rates for all patients were 61.4% (95% CI 33.2-80.5) and 50.8% (95% CI 16.5-77.5), respectively. Conclusions. Lymph node involvement is an unfavorable prognostic factor in children with RCC, and RLND appears to be a critical factor in improving outcome. However, compared to similar adult patients, the outcome in children appears to be better, suggesting that there may be critical differences in pediatric RCC itself, or in the host. © 2008 Wiley-Liss, Inc
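Survival estimates of the kind quoted above are typically obtained with the Kaplan-Meier estimator; a minimal sketch using the lifelines library, in which the data file and column names are assumptions:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical time-to-event data: follow-up in months and an event
# indicator (1 = progression/relapse, 0 = censored).
df = pd.read_csv("trep_rcc.csv")  # assumed columns: months, progressed

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["progressed"], label="PFS")
print(kmf.survival_function_)    # estimated PFS over time
print(kmf.confidence_interval_)  # 95% CI, as quoted in the abstract
```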
Contaminants from dredged sediments alter the transcriptome of Manila clam and induce shifts in microbiota composition
Background The reuse of dredged sediments in ports and lagoons is a major issue, as it should not affect the quality and equilibrium of ecosystems. In the lagoon of Venice, sediment management is of crucial importance, as sediments are often used to build up structures necessary to limit erosion. However, the impact of sediment reuse on the organisms inhabiting this delicate area is poorly known. The Manila clam is a filter-feeding species of high economic and ecological value for the Venice lagoon that has experienced a drastic decline over recent decades. In order to define the molecular mechanisms behind sediment toxicity, we exposed clams to sediments sampled from different sites within one of the Venice lagoon's navigable canals close to the industrial area. Moreover, we investigated the impacts of dredged sediments on the clam's microbial communities.
Results Trace element and organic chemical concentrations increased from the city of Venice towards the sites close to the industrial area of Porto Marghera, where PCDD/F and PCB concentrations were up to 120 times higher than in the southern lagoon. While bioaccumulation of organic contaminants of industrial origin reflected the chemical concentrations of the sediments, metal bioaccumulation was not consistent with the metal concentrations measured in sediments, probably owing to the activation of ABC transporters. At the transcriptional level, we found persistent activation of the mTORC1 signalling pathway, which is central to the coordination of cellular responses to chemical stress. Microbiota characterization showed the over-representation of potential opportunistic pathogens following exposure to the most contaminated sediments, leading to activation of the host immune response. Despite the limited acquisition of new microbial species from sediments, the latter play an important role in shaping Manila clam microbial communities.
Conclusions Sediment management in the Venice lagoon will increase in the coming years to maintain and create new canals, as well as to allow the operation of the new mobile gates at the three Venice lagoon inlets. Our data reveal important transcriptional and microbial changes in Manila clams after exposure to sediments; the reuse of dredged sediments therefore represents a potential risk for the conservation of this species, and possibly for other organisms inhabiting the Venice lagoon
Design, development and implementation of an irradiation station at the neutron time-of-flight facility at CERN
A new parasitic, mixed-field, neutron-dominated irradiation station has recently been commissioned at the European Laboratory for Particle Physics (CERN). The station is installed within the neutron time-of-flight (n_TOF) facility, taking advantage of the secondary radiation produced by the neutron spallation target, with neutron energies ranging from 0.025 eV to several hundreds of MeV. The new station allows radiation damage studies to be performed in irradiation conditions closer to those encountered during the operation of particle accelerators; the irradiation tests carried out in the station will be complementary to the standard tests on materials, usually performed with gamma sources. Samples will be exposed to neutron-dominated doses in the MGy range per year, with minimal impact on the operation of the n_TOF facility. The station has 24 irradiation positions, each hosting up to 100 cm3 of sample material. In view of its proximity to the n_TOF target, inside the protective shielding, the irradiation station and its operating procedures have been carefully developed to take into account the safety of personnel and to avoid any unwanted impact on the operation of the n_TOF facility and its experiments. Owing to the residual radioactivity of the whole area around the n_TOF target and of the irradiated samples, access to the irradiation station is forbidden to human operators even when the n_TOF facility is not in operation. Robots are used for the remote installation and retrieval of the samples, and other optimizations of the handling procedures were developed in compliance with radiation protection regulations and with the aim of minimizing doses to personnel. The sample containers were designed to be radiation tolerant and compatible with remote handling, and were subjected to detailed risk analysis and testing during their development. The whole life cycle of the irradiated materials, including their post-irradiation examination and final disposal, was considered and optimized
Prognostic Value of the Fibrosis-4 Index in Human Immunodeficiency Virus Type-1 Infected Patients Initiating Antiretroviral Therapy with or without Hepatitis C Virus
Objective: To evaluate the Fibrosis-4 (FIB-4) index as a predictor of major liver-related events (LRE) and liver-related death (LRD) in human immunodeficiency virus (HIV) type-1 patients initiating combination antiretroviral therapy (cART). Design: Retrospective analysis of a prospective cohort study. Setting: Italian HIV care centers participating in the ICONA Foundation cohort. Participants: Treatment-naive patients enrolled in ICONA were selected who initiated cART, had hepatitis C virus (HCV) serology results, were HBsAg negative, and had an available FIB-4 index at cART start and during follow-up. Methods: Cox regression models were used to determine the association of FIB-4 with the risk of major LRE (gastrointestinal bleeding, ascites, hepatic encephalopathy, hepato-renal syndrome or hepatocellular carcinoma) or LRD. Results: 3,475 patients were enrolled: 73.3% were male and 27.2% were HCV seropositive. At baseline (time of cART initiation), their median age was 39 years, their median CD4+ T-cell count was 260 cells/µL, and their median HIV RNA was 4.9 log10 copies/mL; 65.9% had a FIB-4 < 1.45, 26.4% 1.45-3.25 and 7.7% > 3.25. Over a follow-up of 18,662 person-years, 41 events were observed: 25 major LRE and 16 LRD (incidence rate, IR, 2.2 per 1,000 PYFU [95% confidence interval, CI 1.6-3.0]). The IR was higher in HCV-seropositive than in HCV-seronegative patients (5.9 vs 0.5 per 1,000 PYFU). A higher baseline FIB-4 category compared to < 1.45 (FIB-4 1.45-3.25: HR 3.55, 95% CI 1.09-11.58; FIB-4 > 3.25: HR 4.25, 1.21-14.92) and the time-updated FIB-4 (FIB-4 1.45-3.25: HR 3.40, 1.02-11.40; FIB-4 > 3.25: HR 21.24, 6.75-66.84) were independently predictive of major LRE/LRD, after adjusting for HIV- and HCV-related variables, alcohol consumption and type of cART. Conclusions: The FIB-4 index at cART initiation, and its modification over time, are risk factors for major LRE or LRD, independently of HCV infection, and could be used to monitor patients on cART
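For reference, the FIB-4 index is computed from age, transaminases and platelet count as FIB-4 = (age × AST) / (platelets × √ALT); a minimal sketch using the cut-offs from the abstract (the example values are illustrative):

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 = (age [y] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

def fib4_category(score):
    # Cut-offs used in the abstract: < 1.45, 1.45-3.25, > 3.25
    if score < 1.45:
        return "< 1.45"
    return "1.45-3.25" if score <= 3.25 else "> 3.25"

# Illustrative example: 39 years old, AST 45 U/L, ALT 50 U/L, platelets 180
score = fib4(39, 45, 50, 180)
print(f"FIB-4 = {score:.2f}, category {fib4_category(score)}")
```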
Switching to dual/monotherapy determines an increase in CD8+ in HIV-infected individuals: An observational cohort study
Background: The CD4/CD8 ratio has been associated with the risk of AIDS and non-AIDS events. We describe trends in immunological parameters in people who underwent a switch to monotherapy or dual therapy, compared to a control group remaining on triple antiretroviral therapy (ART). Methods: We included patients in ICONA who started a three-drug combination ART regimen from an ART-naïve status and achieved a viral load ≤ 50 copies/mL; they were subsequently switched to another triple regimen or to a mono or dual regimen. Standard linear regression at fixed points in time (12-24 months after the switch) and linear mixed model analysis with random intercepts and slopes were used to compare CD4 and CD8 counts and their ratio over time according to regimen type (triple vs. dual and vs. mono). Results: A total of 1241 patients were included; 1073 switched to triple regimens, 104 to dual regimens (72 with 1 nucleoside reverse transcriptase inhibitor (NRTI), 32 NRTI-sparing), and 64 to monotherapy. At 12 months after the switch, in the multivariable linear regression the mean change in the log10 CD4/CD8 ratio for patients on dual therapy was -0.03 (95% confidence interval (CI) -0.05, -0.0002), and the mean change in CD8 count was +99 (95% CI +12.1, +186.3), taking those on triple therapy as reference. In contrast, there was no evidence of a difference in CD4 count change. When using all counts, there was evidence of a significant difference in the slopes of the ratio and the CD8 count between people switched to a triple regimen (points/year change: ratio = +0.056, CD8 = -25.7) and those switched to a dual regimen (ratio = -0.029, CD8 = +110.4). Conclusions: We found an increase in CD8 lymphocytes in people who were switched to dual regimens compared to those who were switched to triple regimens. Patients on monotherapy did not show significant differences. The long-term implications of this difference should be ascertained
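The mixed model with random intercepts and slopes described above can be sketched with statsmodels; this is a minimal illustration in which the data file, column names and coding are assumptions, not the authors' code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per visit, with the log10
# CD4/CD8 ratio, years since switch, regimen type and patient id.
df = pd.read_csv("icona_visits.csv")  # assumed file layout

# Random intercept and slope for time within each patient; the
# time-by-regimen interaction estimates the difference in slopes
# between regimen types (triple as the reference level).
model = smf.mixedlm(
    "log10_cd4_cd8 ~ years_since_switch * C(regimen, Treatment('triple'))",
    data=df,
    groups=df["patient_id"],
    re_formula="~years_since_switch",
)
print(model.fit().summary())
```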
Is physician assessment of alcohol consumption useful in predicting risk of severe liver disease among people with HIV and HIV/HCV co-infection?
Background: Alcohol consumption is a known risk factor for liver disease in HIV-infected populations. Therefore, knowledge of alcohol consumption behaviour and of the risk of disease progression associated with hazardous drinking is important in the overall management of HIV disease. We aimed to assess the usefulness of routinely collected data on alcohol consumption in predicting the risk of severe liver disease (SLD) among people living with HIV (PLWHIV), with or without hepatitis C infection, seen for routine clinical care in Italy. Methods: We included PLWHIV from two observational cohorts in Italy (ICONA and HepaICONA). Alcohol consumption was assessed by physician interview and categorized according to the Italian National Institute for Food and Nutrition guidelines into four categories: abstainer, moderate, hazardous, and unknown. SLD was defined as the presence of FIB-4 > 3.25, a clinical diagnosis of liver disease, or liver-related death. Cox regression analysis was used to evaluate the association between the level of alcohol consumption at baseline and the risk of SLD. Results: Among the 9542 PLWHIV included, the distribution of alcohol consumption categories was: abstainers 3422 (36%), moderate drinkers 2279 (23%), hazardous drinkers 637 (7%) and unknown 3204 (34%). Compared to moderate drinking, hazardous drinking was associated with a higher risk of SLD (adjusted hazard ratio, aHR = 1.45; 95% CI: 1.03-2.03). After additionally controlling for mode of HIV transmission, HCV infection and smoking, the association was attenuated (aHR = 1.32; 95% CI: 0.94-1.85). There was no evidence that the association was stronger when restricting to the HIV/HCV co-infected population. Conclusions: Using a brief physician interview, we found evidence of an association between hazardous alcohol consumption and subsequent risk of SLD among PLWHIV, but this was not independent of HIV mode of transmission, HCV infection and smoking. More effort should be made to improve the quality and validity of data on alcohol consumption in cohorts of HIV/HCV-infected individuals
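The Cox regression described above can be sketched with the lifelines library; a minimal illustration in which the dataset, column names and covariate set are assumptions, with moderate drinking as the reference category as in the abstract:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis dataset: one row per patient with follow-up
# time, an SLD event indicator, and baseline covariates.
df = pd.read_csv("hepaicona_sld.csv")  # assumed file layout

# Dummy-code alcohol consumption with 'moderate' as the reference level
df["alcohol"] = pd.Categorical(
    df["alcohol"], categories=["moderate", "abstainer", "hazardous", "unknown"])
df = pd.get_dummies(df, columns=["alcohol"], drop_first=True)

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "sld_event", "alcohol_abstainer",
        "alcohol_hazardous", "alcohol_unknown", "age", "hcv_positive",
        "smoker"]],
    duration_col="followup_years",
    event_col="sld_event",
)
cph.print_summary()  # adjusted hazard ratios with 95% CIs
```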
- …