    Underground radiobiology: a perspective at Gran Sasso National Laboratory

    The scientific community and institutions (e.g., the ICRP) consider that the Linear No-Threshold (LNT) model, which extrapolates stochastic risk at low dose/low dose rate from the risk at moderate/high doses, provides a prudent basis for the practical purposes of radiological protection. However, biological responses at low dose/dose rate that challenge the LNT model have been reported, and important pieces of evidence have come from radiobiology studies conducted in Deep Underground Laboratories (DULs). These extreme, ultra-low radiation environments are ideal locations for below-background radiobiology experiments, of interest for both basic and applied science. The INFN Gran Sasso National Laboratory (LNGS) (Italy) is the site where most underground radiobiological data have been collected so far and where the first in vivo underground experiment was carried out, using Drosophila melanogaster as a model organism. At present, many DULs around the world have implemented dedicated programs, meetings and proposals. The general message emerging from studies conducted in DULs using protozoa, bacteria, mammalian cells and whole organisms (flies, worms, fish) is that environmental radiation may trigger biological mechanisms that increase the capability to cope with stress. However, several issues remain open, among them: the role of the quality of the radiation spectrum in modulating the biological response, the dependence on the biological endpoint and on the model system considered, and the overall effect at the organism level (detrimental or beneficial). At LNGS, we recently launched the RENOIR experiment, aimed at improving knowledge of the environmental radiation spectrum and investigating the specific role of the gamma component in the biological response of Drosophila melanogaster.
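
    As a purely illustrative aside, the LNT extrapolation mentioned above amounts to a single proportionality between dose and excess stochastic risk, with no threshold below which the risk vanishes. The minimal Python sketch below uses a hypothetical risk coefficient chosen only for illustration; it is not a value taken from the paper or from ICRP publications.

        # Minimal sketch of the Linear No-Threshold (LNT) extrapolation:
        # excess stochastic risk is taken as proportional to dose, with no threshold.
        # The risk coefficient is a hypothetical placeholder for illustration only.

        def lnt_excess_risk(dose_sv: float, risk_per_sv: float = 0.05) -> float:
            """Excess risk = risk_per_sv * dose, applied down to arbitrarily low doses."""
            return risk_per_sv * dose_sv

        # Extrapolating from a moderate dose down to ultra-low, below-background doses:
        for dose in (0.1, 0.001, 1e-6):  # Sv
            print(f"{dose:>8} Sv -> excess risk {lnt_excess_risk(dose):.2e}")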

    Meta-analytic study on substance intake and work-related accidents calls for attention to bio-psycho-social factors

    Accidents at work are a major concern because of their social and economic impact. Their causes are highly variable and often linked to avoidable risk behaviors, of which substance use is a prime example. The aim of this paper was to meta-analytically review the scientific literature on substance intake and its link to work-related accidents. From an initial pool of 19,954 papers, we considered a final sample of 27, clustered into three groups according to substance class (alcohol, recreational drugs, medicines). Despite different pharmacological effects, substances consumed for recreational purposes significantly increased the risk of work-related accidents (odds ratio: alcohol 1.78, recreational drugs 1.47), whereas medicines did not; however, these results require caution due to the heterogeneity of the included studies and suspected publication bias. While bio-psycho-social factors could have helped to explain this association, the selected studies neglected both the varied effects and the root causes of recreational substance consumption. Future studies and interventions should take this complexity into account in order to move beyond a mere description of the phenomenon.
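
    For readers unfamiliar with how pooled effect sizes such as the odds ratios above are obtained, the following sketch shows inverse-variance pooling of log odds ratios with a DerSimonian-Laird random-effects adjustment, a standard meta-analytic procedure. The 2x2 tables are invented for illustration and are not the studies included in this meta-analysis, nor is this necessarily the exact pooling model the authors used.

        import numpy as np

        # Hypothetical 2x2 tables: (exposed cases, exposed controls,
        # unexposed cases, unexposed controls) -- illustration only.
        studies = [(30, 70, 40, 160), (12, 48, 25, 215), (55, 95, 60, 190)]

        log_or = np.array([np.log((a * d) / (b * c)) for a, b, c, d in studies])
        var = np.array([1/a + 1/b + 1/c + 1/d for a, b, c, d in studies])

        # Fixed-effect (inverse-variance) pooling
        w = 1 / var
        pooled_fe = np.sum(w * log_or) / np.sum(w)

        # DerSimonian-Laird between-study variance, then random-effects pooling
        q = np.sum(w * (log_or - pooled_fe) ** 2)
        tau2 = max(0.0, (q - (len(studies) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1 / (var + tau2)
        pooled_re = np.sum(w_re * log_or) / np.sum(w_re)
        se_re = np.sqrt(1 / np.sum(w_re))

        print(f"pooled OR = {np.exp(pooled_re):.2f} "
              f"(95% CI {np.exp(pooled_re - 1.96 * se_re):.2f}"
              f"-{np.exp(pooled_re + 1.96 * se_re):.2f})")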

    A systematic review on the role of substance consumption in work-related road traffic crashes reveals the importance of biopsychosocial factors in prevention

    Objective: Since many jobs involve driving, a relevant share of all road traffic crashes (RTC) is work-related. Statistics covering all crashes suggest that they are significantly associated with the consumption of substances, but the root causes are not yet clear. The objective of the present paper was to systematically review the scientific literature concerning substance consumption and work-related RTC. We queried the PubMed and Scopus electronic databases according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Articles were included if they reported all necessary data and survived a quality assessment. We selected a final sample of 30 articles from an initial pool of 7113. As hypothesized, taking any of the considered substances was found to increase the risk of work-related RTC. Descriptive statistics on work-related RTC showed a higher average positivity rate for medicines (14.8%) than for alcohol (3.02%) and drugs (0.84%). Interestingly, the impact of some medications is not convincingly explained by the mere occurrence of side effects, suggesting that psychosocial and/or medical conditions could be better predictors of RTC. We therefore propose an intervention and prevention model that also considers biopsychosocial factors, which should be examined in future research.

    Irradiation of Mesenchymal Stromal Cells With Low and High Doses of Alpha Particles Induces Senescence and/or Apoptosis

    The use of high-linear energy transfer charged particles is gaining attention as a medical tool because of their efficient cell-killing ability. Considerable interest has developed in the use of targeted alpha-particle therapy for the treatment of micrometastases. Moreover, the use of helium beams is gaining momentum, especially for treating pediatric tumors. We analyzed the effects of alpha particles on bone marrow mesenchymal stromal cells (MSCs), which contain a subpopulation of stem cells capable of generating adipocytes, chondrocytes, and osteocytes. Further, these cells contribute to the maintenance of homeostasis in the body. MSCs were irradiated with low and high doses of alpha particles or X-rays, and a comparative biological analysis was performed. At a low dose (40 mGy), alpha particles exhibited a limited negative effect on the biology of MSCs compared with X-rays. No significant perturbation of the cell cycle was observed, and only a minimal increase in apoptosis or senescence was detected. Self-renewal was preserved, as revealed by the CFU assay. On the contrary, with 2000 mGy of alpha particles, we observed adverse effects on the vitality, functionality, and stemness of MSCs. These results are a consequence of the different proportions of cells targeted by alpha particles or X-rays and of the quality of the induced DNA damage. The present study suggests that radiotherapy with alpha particles may spare healthy stem cells more effectively than X-ray treatments, an observation that should be taken into consideration by physicians when planning irradiation of tumor areas close to stem cell niches, such as the bone marrow. J. Cell. Biochem. 118: 2993–3002, 2017. © 2017 Wiley Periodicals, Inc.

    Effectiveness of Flattening-Filter-Free versus Flattened Beams in V79 and Glioblastoma Patient-Derived Stem-like Cells

    Literature data on the administration of conventional high-dose beams with flattening filters (FF) or without them (flattening-filter-free, FFF) show conflicting results on biological effects at the cellular level. To contribute to this field, we irradiated V79 Chinese hamster lung fibroblasts and two patient-derived glioblastoma stem-like cell lines (GSCs, named #1 and #83) using a clinical 10 MV accelerator with FF (at 4 Gy/min) and FFF (at two dose rates, 4 and 24 Gy/min) beams. Cell killing, DNA damage induction (determined using the γ-H2AX assay), and gene expression were studied. No significant differences in the early survival of V79 cells were observed as a function of dose rate or of FF versus FFF beams, while a trend toward reduced late survival was observed at the highest dose rate with the FFF beam. GSCs showed similar survival levels at the two dose rates, both delivered in the FFF regimen. The amount of DNA damage measured at both dose rates after 2 h was much higher in line #1 than in line #83, with statistically significant differences between the two dose rates only in line #83. The gene expression analysis of the two GSC lines indicates gene signatures mimicking the prognosis of glioblastoma (GBM) patients derived from a public database. Overall, the results support the current use of FFF beams and highlight the possibility of identifying patients with candidate gene signatures who could benefit from irradiation with FFF beams at a high dose rate.
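
    As background for the clonogenic survival comparisons above, the sketch below fits the linear-quadratic model, the customary description of cell survival versus dose, to made-up data points. The abstract does not state which survival model the authors used, and neither the dose points nor the surviving fractions come from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        # Linear-quadratic model: surviving fraction S(D) = exp(-(alpha*D + beta*D^2)).
        def lq_survival(dose, alpha, beta):
            return np.exp(-(alpha * dose + beta * dose**2))

        # Illustrative dose (Gy) / surviving-fraction points, not measurements from the paper.
        dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
        sf = np.array([1.0, 0.70, 0.45, 0.15, 0.04, 0.01])

        (alpha, beta), _ = curve_fit(lq_survival, dose, sf, p0=(0.2, 0.02))
        print(f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2, "
              f"alpha/beta = {alpha/beta:.1f} Gy")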

    Fluid balance and urine volume are independent predictors of mortality in acute kidney injury

    Introduction: In ICUs, both fluid overload and oliguria are common complications associated with increased mortality among critically ill patients, particularly in acute kidney injury (AKI). Although fluid overload is an expected complication of oliguria, it remains unclear whether their effects on mortality are independent of each other. The aim of this study was to evaluate the impact of both fluid balance and urine volume on outcomes and to determine whether they behave as independent predictors of mortality in adult ICU patients with AKI. Methods: We performed a secondary analysis of data from a multicenter, prospective cohort study in 10 Italian ICUs. AKI was defined by the renal Sequential Organ Failure Assessment (SOFA) score (creatinine >3.5 mg/dL or urine output (UO) <500 mL/d). Oliguria was defined as a UO <500 mL/d. Mean fluid balance (MFB) and mean urine volume (MUV) were calculated as the arithmetic mean of all daily values. Use of diuretics was recorded daily. To assess the impact of MFB and MUV on the mortality of AKI patients, multivariate analysis was performed by Cox regression. Results: Of the 601 included patients, 132 had AKI during their ICU stay, and mortality in this group was 50%. Non-surviving AKI patients had a higher MFB (1.31 ± 1.24 versus 0.17 ± 0.72 L/day; P < 0.001) and a lower MUV (1.28 ± 0.90 versus 2.35 ± 0.98 L/day; P < 0.001) compared with survivors. In the multivariate analysis, MFB (adjusted hazard ratio (HR) 1.67 per L/day, 95% CI 1.33 to 2.09; P < 0.001) and MUV (adjusted HR 0.47 per L/day, 95% CI 0.33 to 0.67; P < 0.001) remained independent risk factors for 28-day mortality after adjustment for age, gender, diabetes, hypertension, diuretic use, non-renal SOFA score and sepsis. Diuretic use was associated with better survival in this population (adjusted HR 0.25, 95% CI 0.12 to 0.52; P < 0.001). Conclusions: In this multicenter ICU study, a higher fluid balance and a lower urine volume were both important factors associated with the 28-day mortality of AKI patients.
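
    To make the adjusted hazard ratios above concrete, the sketch below fits a Cox proportional-hazards model with the lifelines Python library on a small synthetic dataset. The covariate names echo the abstract, but the data, the library choice, and the model specification are assumptions for illustration, not the study's actual analysis.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 200

        # Synthetic 28-day follow-up data; covariates loosely echo the abstract.
        df = pd.DataFrame({
            "mean_fluid_balance": rng.normal(0.7, 1.0, n),   # L/day
            "mean_urine_volume": rng.normal(1.8, 0.9, n),    # L/day
            "diuretic_use": rng.integers(0, 2, n),           # 0/1
            "time": rng.integers(1, 29, n),                  # days of follow-up, max 28
            "event": rng.integers(0, 2, n),                  # 1 = death within 28 days
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        print(np.exp(cph.params_))  # adjusted hazard ratios per unit of each covariate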

    Utilization of Small Changes in Serum Creatinine with Clinical Risk Factors to Assess the Risk of AKI in Critically Ill Adults

    BACKGROUND AND OBJECTIVES: Disease biomarkers require appropriate clinical context to be used effectively. Combining clinical risk factors with small changes in serum creatinine has been proposed to improve the assessment of AKI. This approach was developed in order to identify the risk of AKI early in a patient's clinical course. We set out to assess the performance of this combination approach. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: A secondary analysis of data from a prospective multicenter intensive care unit cohort study (September 2009 to April 2010) was performed. Patients at high risk under this combination approach were defined by an early increase in serum creatinine of 0.1-0.4 mg/dl, depending on the number of clinical factors predisposing to AKI. AKI was defined and staged using the Acute Kidney Injury Network criteria. The primary outcome was evolution to severe AKI (Acute Kidney Injury Network stages 2 and 3) within 7 days in the intensive care unit. RESULTS: Of 506 patients, 214 (42.2%) had an early creatinine elevation and were deemed at high risk for AKI. This group was more likely to subsequently develop the primary endpoint (16.4% versus 1.0% for those not at high risk, P<0.001). The sensitivity of this grouping for severe AKI was 92%, the specificity was 62%, the positive predictive value was 16%, and the negative predictive value was 99%. After adjustment for Sequential Organ Failure Assessment score, serum creatinine, and hazard tier for AKI, early creatinine elevation remained an independent predictor of severe AKI (adjusted relative risk, 12.86; 95% confidence interval, 3.52 to 46.97). Addition of early creatinine elevation to the best clinical model improved prediction of the primary outcome (area under the receiver operating characteristic curve increased from 0.75 to 0.83, P<0.001). CONCLUSION: Critically ill patients at high AKI risk, based on the combination of clinical factors and early creatinine elevation, are significantly more likely to develop severe AKI. As initially hypothesized, the high-risk combination group methodology can be used to identify patients at low risk for severe AKI, in whom AKI biomarker testing may be expected to have low yield. The high-risk combination group methodology could thus allow clinicians to optimize biomarker use.
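
    The test characteristics quoted above can be reproduced, to rounding precision, from the reported group sizes and event rates. The 2x2 counts below are reconstructed from the abstract's own figures (214 of 506 patients at high risk; 16.4% versus 1.0% progressing to severe AKI), so they are an approximation rather than the study's raw data.

        # Approximate 2x2 table reconstructed from the reported percentages.
        tp = round(0.164 * 214)   # high risk, severe AKI        -> 35
        fp = 214 - tp             # high risk, no severe AKI     -> 179
        fn = round(0.010 * 292)   # not high risk, severe AKI    -> 3
        tn = 292 - fn             # not high risk, no severe AKI -> 289

        sensitivity = tp / (tp + fn)   # ~0.92
        specificity = tn / (tn + fp)   # ~0.62
        ppv = tp / (tp + fp)           # ~0.16
        npv = tn / (tn + fn)           # ~0.99

        print(f"sens {sensitivity:.2f}, spec {specificity:.2f}, "
              f"PPV {ppv:.2f}, NPV {npv:.2f}")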