216 research outputs found

    VOXEL-LEVEL ABSORBED DOSE CALCULATIONS WITH A DETERMINISTIC GRID-BASED BOLTZMANN SOLVER FOR NUCLEAR MEDICINE AND THE CLINICAL VALUE OF VOXEL-LEVEL CALCULATIONS

    Voxel-level absorbed dose (VLAD) is rarely calculated for nuclear medicine (NM) procedures involving unsealed sources or 90Y microspheres (YM). The current standard of practice for absorbed dose calculations in NM uses MIRD S-values, which 1) assume a uniform distribution in organs, 2) do not use patient-specific geometry, and 3) lack a tumor model. VLADs overcome these limitations. One reason VLADs are not routinely performed is the difficulty of obtaining accurate absorbed doses in a clinically acceptable time. The deterministic grid-based Boltzmann solver (GBBS) was recently applied in radiation oncology, where it was reported to be fast and accurate for both megavoltage photons and high-dose-rate nuclide-based photon brachytherapy. This dissertation had two goals. The first was to demonstrate that the general GBBS code ATTILA™ can be used for VLADs in NM, where primary photon and electron sources are distributed throughout the patient. The GBBS was evaluated in voxel-S-value geometries, where agreement with Monte Carlo (MC) in the source voxel was 6% for 90Y and 131I; 20% differences were seen for mono-energetic 10 keV photons in bone. An adaptive tetrahedral mesh (ATM) generation procedure was developed using information from both the SPECT and the CT for 90Y and 131I patients. The ATM, combined with increased energy transport cutoffs, enabled GBBS transport to execute in under 2 minutes (90Y) and 10 minutes (131I). GBBS absorbed doses to tumors and organs were within 4.5% of MC, and dose-volume histograms were indistinguishable from MC. The second goal was to demonstrate the value of VLADs using 21 YM patients. Package-insert dosimetry was not able to predict mean VLAD tumor absorbed doses. The partition model had a large bias (factor of 0.39) and uncertainty (±128 Gy). Dose-response curves for hepatocellular carcinoma tumors were generated using logistic regression. The dose covering 70% of the volume (D70) predicted binary modified RECIST response with an area under the curve of 80.3%; a D70 threshold of 88 Gy yielded 89% specificity and 69% sensitivity. The GBBS was shown to be fast and accurate, flaws in clinical dosimetry models were highlighted, and dose-response curves were generated. The findings in this dissertation support the adoption of VLADs in NM.
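
    The abstract reports logistic-regression dose-response modeling on D70 and an 88 Gy threshold evaluated by AUC, sensitivity and specificity. The following is a minimal illustrative sketch of that kind of analysis, not the dissertation's code; the per-tumor arrays, sample sizes and values below are hypothetical placeholders.

```python
# Illustrative sketch: logistic dose-response curve on per-tumor D70 values and
# evaluation of a fixed dose threshold. Data below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical per-tumor data: D70 in Gy and binary mRECIST response (1 = responder).
d70_gy = np.array([35, 52, 61, 74, 90, 95, 110, 130, 150, 180], dtype=float)
responded = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

# Logistic regression gives the dose-response curve P(response | D70).
model = LogisticRegression().fit(d70_gy.reshape(-1, 1), responded)
p_response = model.predict_proba(d70_gy.reshape(-1, 1))[:, 1]
print("AUC:", roc_auc_score(responded, p_response))

# Evaluate a fixed D70 threshold (88 Gy is the value quoted in the abstract).
predicted = (d70_gy >= 88.0).astype(int)
tp = np.sum((predicted == 1) & (responded == 1))
tn = np.sum((predicted == 0) & (responded == 0))
fp = np.sum((predicted == 1) & (responded == 0))
fn = np.sum((predicted == 0) & (responded == 1))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```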

    Numerical modeling of igniting non-premixed combustion systems using FGM


    A fuzzy neural network based dynamic data allocation model on heterogeneous multi-GPUs for large-scale computations

    The parallel computation capabilities of modern GPU (Graphics Processing Unit) processors have attracted increasing attention from researchers and engineers conducting high-computational-throughput studies. However, current single-GPU-based engineering solutions often struggle to fulfill their real-time requirements, so the multi-GPU-based approach has become a popular and cost-effective choice for meeting these demands. In such cases, computational load balancing over multiple GPU “nodes” is often the key bottleneck that affects the quality and performance of the runtime system. Existing load-balancing approaches are mainly based on the assumption that all GPU nodes in the same computing framework have equal computational performance, which is often not the case due to cluster design and other legacy issues. This paper presents a novel dynamic load balancing (DLB) model for rapid data division and allocation on heterogeneous GPU nodes based on an innovative fuzzy neural network (FNN). In this research, a five-state-parameter feedback mechanism characterizing overall cluster and node performance is proposed. The corresponding FNN-based DLB model is capable of monitoring and predicting individual node performance under different workload scenarios. A real-time adaptive scheduler has been devised to reorganize the data inputs to each node when necessary to maintain runtime computational performance. The devised model has been implemented on two-dimensional (2D) discrete wavelet transform (DWT) tasks for evaluation. Experimental results show that this DLB model enables high computational throughput while satisfying the real-time and precision requirements of complex computational tasks.
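
    To make the scheduling idea concrete, here is a minimal sketch of proportional data allocation on heterogeneous nodes with runtime performance feedback. It is not the paper's implementation: a simple exponential-moving-average throughput estimate stands in for the FNN predictor, the node names and speeds are simulated, and a sleep call stands in for a 2D DWT kernel.

```python
# Minimal sketch of dynamic load balancing with runtime feedback (assumed design,
# not the paper's code). An EMA throughput estimate replaces the FNN predictor.
import time

class Node:
    def __init__(self, name, simulated_speed):
        self.name = name
        self.throughput = 1.0          # rows/second estimate, refined at runtime
        self._speed = simulated_speed  # hidden "true" speed, used only to simulate work

    def process(self, rows):
        start = time.perf_counter()
        time.sleep(rows / self._speed * 1e-4)   # stand-in for a 2D DWT kernel launch
        elapsed = time.perf_counter() - start
        measured = rows / elapsed
        # Feedback step: update the performance estimate (stand-in for the FNN).
        self.throughput = 0.7 * self.throughput + 0.3 * measured

def allocate(total_rows, nodes):
    """Split the image rows proportionally to each node's predicted throughput."""
    total = sum(n.throughput for n in nodes)
    shares = [int(total_rows * n.throughput / total) for n in nodes]
    shares[-1] += total_rows - sum(shares)      # hand the rounding remainder to the last node
    return shares

nodes = [Node("gpu0", 9000.0), Node("gpu1", 3000.0)]   # heterogeneous nodes
for step in range(5):                                   # repeated frames / iterations
    shares = allocate(total_rows=4096, nodes=nodes)
    for node, rows in zip(nodes, shares):
        node.process(rows)
    print(step, shares)
```

    Over successive iterations the shares drift toward each node's measured speed, which is the behaviour the adaptive scheduler in the paper aims to achieve with its FNN-based predictor.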

    Perceptual organization in image analysis : a mathematical approach based on scale, orientation and curvature


    Learning Invariant Representations of Images for Computational Pathology


    Uncertainty propagation and sensitivity analysis techniques in building performance simulation to support conceptual building and system design

    Due to advances in computing and modeling, the Architecture, Engineering and Construction (AEC) industry has arrived at an era of digital empiricism. Computational simulation tools are widely used across many engineering disciplines for design, evaluation and analysis. Experts in the field agree that design decisions taken during the early design stages have a significant impact on the real performance of the building. Nevertheless, building performance simulation is still hardly used during conceptual design. The European Commission has targeted a 20% reduction of CO2 emissions, a 20% increase in energy efficiency and a 20% increase in the use of renewable energy by 2020. These ambitious aims have resulted in the recast of the Energy Performance of Buildings Directive, demanding nearly zero-energy buildings for new buildings and major refurbishments by 2020. The formulated aim requires, for the first time, an integrated design of the building’s demand and supply systems. The current research was triggered by the above observation. It uses semi-structured interviews and critical reviews of literature and software to establish the reasons that prevent Heating, Ventilation and Air Conditioning (HVAC) consultants from adopting Building Performance Simulation (BPS) tools and to identify the needs of practitioners during the conceptual design stage. In response to the identified needs, a rapid iterative development process is deployed to produce a prototypical software tool. Finally, the tool is heuristically evaluated with expert users to assess its capability to support the conceptual design process. The results obtained from the interviews and reviews highlight that HVAC consultants work with an increasing number of design alternatives to prevent dysfunctional buildings. The complexity of design problems is increasing, on the one hand due to the need for early integration of the engineering disciplines and on the other hand due to the challenges of meeting the ever more stringent requirements for new buildings. Furthermore, design teams run the risk of identifying only suboptimal solutions to the design problem when they limit themselves too early to a small number of design alternatives. The use of simulation tools facilitates a quick turnaround of performance evaluations for a large number of design alternatives early in the design process. By doing so, performance simulation tools have the potential to supplement design experience and support decision making. However, simulation tools are perceived by many as too detailed to be readily used for conceptual design support. Research findings suggest that tools for the early design stages are required to enable parametric studies and to provide facilities to explore the relationships between potential design decisions and performance aspects. Tools need to be able to dynamically scale the resolution of their interfaces to fit the different levels of information density characteristic of the different design stages. In addition, they need to be flexible enough to facilitate expansion of the system representations with innovative design concepts as the design progresses. Because of the need for parametric studies and the exploration of the relationships between potential design decisions and performance aspects, this research explores the extension and application of BPS tools with techniques for uncertainty propagation and sensitivity analysis for conceptual design support.
This endeavor requires (1) the evaluation and selection of an extension strategy, (2) the determination of the format and availability of input for techniques for uncertainty propagation and sensitivity analysis, and (3) the development of knowledge regarding the extent and content of the design option space. To avoid the need to modify the source code of BPS tools, an external strategy is applied that embeds an existing simulation engine in a shell with extra features for statistical pre- and post-processing through Latin hypercube sampling and regression-based sensitivity analysis (a minimal sketch of this workflow follows the abstract). With regard to model resolution, the results suggest that it is more beneficial to use detailed models with adaptive interfaces than simpler tools. The advantages are twofold. Firstly, the BPS tool can use an existing validated simulation model rather than a specifically developed abstract model with limited applicability. Secondly, the model is able to provide consistent feedback throughout the lifetime of the building. Within the iterative process, the conceptual design stage has some distinctive tasks, such as exploring the option space and generating and evaluating design concepts. The option space is multidimensional, owing to its multi-disciplinary set-up and the wide-ranging interests of the participating practitioners. An empirical study conducted as part of the research demonstrates the presence of at least two attributes, four subsystem categories and four relationships. Depending on the experience of the practicing designer, components, attributes and relationships are used to very different extents. While experienced HVAC consultants seem to work mainly with relationships when compiling a design concept, novice designers prefer to work with components. The sampling-based analysis strategy requires knowledge about the uncertainty of the parametric model input in the form of probability distribution functions. On the basis of a survey on internal gains for offices, this thesis concludes that current design guidelines provide useful data in a suitable format. Measurements conducted in an office building in Amsterdam confirm the trend towards decreasing equipment gains and a proportional increase in lighting gains. However, in the absence of data from which to derive a probability density function, this research suggests the definition of "exploratory" scenarios. It is common practice to use "normative" scenarios as input in building performance studies aiming to prove compliance with building regulations; the use of "exploratory" scenarios is less common. Scenario-based load profiles have to meet three criteria: they have to be (1) locally representative, (2) up to date and (3) matched to the workplace culture. As part of this thesis, exploratory data sets were developed representing climate change scenarios for the Netherlands. The exploratory scenarios facilitate the assessment of the robustness of the future performance of design alternatives. Tests with the Dutch data sets confirm that neither the current reference data nor the projected reference data provide valid results for predicting uncertainty ranges for the peak cooling load as a potential robustness indicator. A simulation-based comparative robustness assessment of three HVAC concepts over 15 and 30 years is reported. The results indicate robust future performance for the floor-cooling-based design alternative with respect to thermal comfort and cooling energy demand.
The software prototype shows that detailed simulation tools with features for uncertainty propagation and sensitivity analysis provide the facilities to explore the consequences of potential design decisions for performance aspects. In addition, they enable parametric studies and make it possible to quantify parameter interactions and their collective impact on a performance aspect. A heuristic usability evaluation of the software prototype confirms its value to design practice: 85% of the HVAC consultants approached state that the uncertainty of performance aspects is an important parameter for supporting conceptual design. More importantly, 80% of the practitioners consider the prototype to have great potential to reduce the number of necessary design iterations. This thesis concludes that simulation tools that quantitatively address the uncertainties and sensitivities related to conceptual building design generate value by (1) providing an indication of the accuracy of the performance predictions; (2) allowing the identification of parameters and systems to which performance metrics react sensitively and insensitively, respectively; and (3) enabling a robustness assessment of design alternatives.
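
The sketch below illustrates the sampling and sensitivity workflow named in the abstract: Latin hypercube sampling of uncertain inputs followed by regression-based sensitivity analysis (standardized regression coefficients). It is a minimal illustration only; the parameter names, bounds and the toy peak-cooling-load function stand in for the thesis's inputs and the external simulation engine, and are not values from the work itself.

```python
# Illustrative uncertainty/sensitivity sketch: Latin hypercube sampling plus
# standardized regression coefficients (SRC). The toy cooling-load function
# replaces an external building performance simulation run; inputs are assumed.
import numpy as np
from scipy.stats import qmc

names = ["equipment_gain_W_m2", "lighting_gain_W_m2", "occupancy_m2_pp", "u_value_W_m2K"]
lo = np.array([5.0, 4.0, 8.0, 0.2])
hi = np.array([25.0, 12.0, 20.0, 1.2])

# 1) Latin hypercube sample of the uncertain inputs, scaled to their ranges.
sampler = qmc.LatinHypercube(d=len(names), seed=0)
X = qmc.scale(sampler.random(n=200), lo, hi)

# 2) Stand-in for the simulation engine: a toy peak-cooling-load model [W/m2].
def peak_cooling_load(x):
    equip, light, occ_density, u_val = x
    return equip + light + 1200.0 / occ_density + 15.0 * u_val

y = np.apply_along_axis(peak_cooling_load, 1, X)

# 3) Regression-based sensitivity: fit a linear model on standardized inputs/outputs.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
coeffs, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(ys)), Xs]), ys, rcond=None)
for name, src in zip(names, coeffs[1:]):
    print(f"{name}: SRC = {src:+.2f}")

# Uncertainty propagation summary for the performance indicator.
print("mean peak load:", y.mean(), "5th-95th percentile:", np.percentile(y, [5, 95]))
```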