Radiation safety based on the sky shine effect in reactor
In reactor operation, neutrons and gamma rays are the dominant forms of radiation. As protection, lead and concrete shields are built around the reactor. However, radiation can still penetrate the water shielding inside the reactor pool. This gives rise to the sky shine effect, a physical phenomenon in which radiation from a nuclear source is scattered by the atmosphere and transmitted back down over a wide area around the facility. The scattered radiation falls on the surrounding area and raises the radiation dose there. High doses of exposure can cause stochastic or deterministic effects in a person. Therefore, this study was conducted to measure the radiation dose from the sky shine effect scattered around the reactor at different distances and different heights above the reactor platform. In this paper, the radiation dose from the sky shine effect was measured using an experimental method
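The falloff of sky shine dose with distance described above can be illustrated with a toy calculation. This is a minimal sketch, not the authors' method: it assumes a simple empirical functional form (inverse-square falloff with exponential air attenuation) that is often fitted to sky shine survey data, and the constants `k` and `attenuation_len_m` are hypothetical placeholders, not values from this study.

```python
import math

def skyshine_dose_rate(distance_m, k=1.0e4, attenuation_len_m=300.0):
    """Toy empirical model of sky shine dose rate vs distance.

    D(d) = k * exp(-d / lambda) / d^2 -- a functional form commonly
    fitted to sky shine survey data. k and lambda are hypothetical
    illustration values, not taken from the study described above.
    """
    return k * math.exp(-distance_m / attenuation_len_m) / distance_m ** 2

# Dose rate decreases as the survey point moves away from the reactor.
for d in (10, 20, 50, 100):
    print(f"{d:4d} m : {skyshine_dose_rate(d):.4f} (arbitrary units)")
```

In a real survey, measurements at several distances and heights would be fitted to such a curve to characterize the scattered component.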
The energy-oriented management of public historic buildings: An integrated approach and methodology applications
In the European framework, there is a strong drive to develop integrated approaches aimed at understanding and improving the energy behavior of public historic buildings within urban contexts. However, the examples already provided tend to address the issue from mono-disciplinary perspectives, losing the opportunity for a coordinated view. The research suggests a methodology to reach the definition of a three-dimensional database, which incorporates spatial models and energy information, with the final goal of merging heterogeneous information that is useful to interpret the overall framework and to design sustainable development scenarios. The platform achieves GIS (Geographic Information System) and BIM (Building Information Modeling) integration by using the CityGML data model, supporting multi-scale analyses without breaks in continuity, ranging from the urban to the building level. The discussion combines the applicative case with the theoretical background, deepening the role of a solid knowledge framework as a basis for sustainable interventions on public historic buildings. To better explain and test the methodology, a case study on the University built heritage of Pavia is presented and three possible outputs deriving from the database are discussed. The example demonstrates the strength of the approach, which is able to provide a variety of results coming from a unique source of information, ensuring coherence and unambiguousness at all levels of investigation.
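The core idea of the multi-scale database, joining urban-scale (GIS-like) records with building-scale (BIM-like) energy attributes on a shared building identifier, can be sketched as follows. This is a hedged illustration in the spirit of the CityGML-based integration described above, not the authors' platform; all field names, IDs, and values are hypothetical.

```python
# Urban scale: footprint and district information per building ID.
gis_layer = {
    "PV-ENG-01": {"district": "Centro", "footprint_m2": 1450},
    "PV-ENG-02": {"district": "Centro", "footprint_m2": 980},
}

# Building scale: energy attributes, e.g. from an audit or BIM model.
bim_layer = {
    "PV-ENG-01": {"heated_volume_m3": 5200, "annual_kwh": 410000},
    "PV-ENG-02": {"heated_volume_m3": 3100, "annual_kwh": 265000},
}

def merge_scales(gis, bim):
    """Join the two layers on building ID and derive one indicator
    (energy intensity, kWh per heated m^3) from the merged record."""
    merged = {}
    for bid in gis.keys() & bim.keys():
        record = {**gis[bid], **bim[bid]}
        record["kwh_per_m3"] = record["annual_kwh"] / record["heated_volume_m3"]
        merged[bid] = record
    return merged

db = merge_scales(gis_layer, bim_layer)
for bid, rec in sorted(db.items()):
    print(bid, rec["district"], f"{rec['kwh_per_m3']:.1f} kWh/m3")
```

The single merged record per building is what allows analyses at both scales to draw on one coherent source of information.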
Machine Learning and Integrative Analysis of Biomedical Big Data.
Recent developments in high-throughput technologies have accelerated the accumulation of massive amounts of omics data from multiple sources: genome, epigenome, transcriptome, proteome, metabolome, etc. Traditionally, data from each source (e.g., genome) is analyzed in isolation using statistical and machine learning (ML) methods. Integrative analysis of multi-omics and clinical data is key to new biomedical discoveries and advancements in precision medicine. However, data integration poses new computational challenges as well as exacerbates the ones associated with single-omics studies. Specialized computational approaches are required to effectively and efficiently perform integrative analysis of biomedical data acquired from diverse modalities. In this review, we discuss state-of-the-art ML-based approaches for tackling five specific computational challenges associated with integrative analysis: curse of dimensionality, data heterogeneity, missing data, class imbalance, and scalability issues.
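One of the challenges named above, class imbalance, can be illustrated with the simplest possible remedy: random oversampling of the minority class before training. This is a hedged sketch with synthetic labels and counts, not a method from the review; real multi-omics pipelines typically use more principled techniques (e.g. SMOTE or class-weighted losses).

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Duplicate minority-class samples (with replacement) until every
    class has as many samples as the largest class."""
    rng = random.Random(seed)
    by_class = {}
    for s, y in zip(samples, labels):
        by_class.setdefault(y, []).append(s)
    target = max(len(group) for group in by_class.values())
    out_x, out_y = [], []
    for y, group in by_class.items():
        resampled = group + [rng.choice(group) for _ in range(target - len(group))]
        out_x.extend(resampled)
        out_y.extend([y] * target)
    return out_x, out_y

# Hypothetical tiny cohort: 3 controls, 1 case (severely imbalanced).
X = [[0.1], [0.2], [0.3], [0.9]]
y = ["control", "control", "control", "case"]
Xb, yb = oversample_minority(X, y)
print(yb.count("control"), yb.count("case"))  # balanced class counts
```

Balancing the classes this way prevents a classifier from trivially predicting the majority label, at the cost of duplicated minority samples.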
Validation and clinical implementation of an accurate Monte Carlo code for pencil beam scanning proton therapy.
Monte Carlo (MC)-based dose calculations are generally superior to analytical dose calculations (ADC) in modeling the dose distribution for proton pencil beam scanning (PBS) treatments. The purpose of this paper is to present a methodology for commissioning and validating an accurate MC code for PBS utilizing a parameterized source model, including an implementation of a range shifter, that can independently check the ADC in a commercial treatment planning system (TPS) and the fast Monte Carlo dose calculation in an open-source platform (MCsquare). The source model parameters (including beam size, angular divergence, and energy spread) and protons per MU were extracted and tuned at the nozzle exit by comparing Tool for Particle Simulation (TOPAS) simulations with a series of commissioning measurements using a scintillation screen/CCD camera detector and ionization chambers. The range shifter was simulated as an independent object with its geometric and material information. The MC calculation platform was validated through comprehensive measurements of single spots, field size factors (FSF), and three-dimensional dose distributions of spread-out Bragg peaks (SOBPs), both with and without the range shifter. Differences in field size factors and absolute output at various depths of SOBPs between measurement and simulation were within 2.2%, with and without a range shifter, indicating an accurate source model. TOPAS was also validated against anthropomorphic lung phantom measurements. Comparison of dose distributions and DVHs for representative liver and lung cases between the independent MC and analytical dose calculations from a commercial TPS further highlights the limitations of the ADC in highly heterogeneous geometries. The fast MC platform has been implemented within our clinical practice to provide additional independent dose validation/QA of the commercial ADC for patient plans. Using the independent MC, we can more efficiently commission the ADC by reducing the amount of measured data required for low-dose "halo" modeling, especially when a range shifter is employed.
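The validation criterion reported above (measurement vs. simulation within 2.2%) amounts to a percent-difference check over the commissioning quantities. The sketch below illustrates that kind of QA check for field size factors; the FSF values are hypothetical placeholders, not the paper's data, and only the 2.2% tolerance is taken from the text.

```python
# Hypothetical measured and simulated field size factors (FSF),
# keyed by field size in cm. Values are illustrative only.
measured_fsf  = {2: 0.912, 4: 0.965, 8: 0.991, 12: 1.000}
simulated_fsf = {2: 0.921, 4: 0.958, 8: 0.995, 12: 1.000}

def max_percent_diff(meas, sim):
    """Largest |simulated - measured| / measured over all field sizes,
    expressed in percent."""
    return max(abs(sim[k] - meas[k]) / meas[k] * 100.0 for k in meas)

worst = max_percent_diff(measured_fsf, simulated_fsf)
print(f"worst FSF deviation: {worst:.2f}%")
# Tolerance mirroring the 2.2% agreement reported in the abstract.
assert worst <= 2.2, "MC source model outside validation tolerance"
```

In practice the same comparison would be repeated for single-spot profiles and SOBP output at multiple depths, with and without the range shifter.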