    Measurement and mathematical modeling of thermally induced injury and heat shock protein expression kinetics in normal and cancerous prostate cells.

    Abstract
    Purpose: Hyperthermia can induce heat shock protein (HSP) expression in tumours, which enhances tumour viability and increases resistance to subsequent thermal, chemotherapy, and radiation treatments. The objective of this study was to determine the relationship between hyperthermia protocols, HSP expression kinetics, and cell death, and to develop corresponding computational models that predict the response of normal and cancerous prostate cells.
    Methods: HSP expression kinetics and cell viability were measured in PC3 prostate cancer cells and RWPE-1 normal prostate cells subjected to hyperthermia protocols of 44 °C to 60 °C for 1 to 30 min. Hsp27, Hsp60, and Hsp70 expression kinetics were determined by western blotting and visualised with immunofluorescence and confocal microscopy. Based on the measured HSP expression data, a mathematical model was developed for predicting thermally induced HSP expression. Cell viability was measured with propidium iodide staining and flow cytometry to quantify the injury parameters necessary for predicting cell death following hyperthermia.
    Results: Significant Hsp27 and Hsp70 levels were induced in both cell types, with maximum HSP expression occurring at 16 h post-heating and diminishing substantially after 72 h. PC3 cells were slightly more sensitive to thermal stress than RWPE-1 cells. Arrhenius analysis of the injury data suggested a transition between injury mechanisms at 54 °C. The HSP expression and injury models were effective at predicting cellular response to hyperthermia.
    Conclusion: Measuring the HSP expression kinetics and cell viability associated with hyperthermia enabled the development of thermal dosimetry guidelines and of predictive models for HSP expression and cell injury as a function of thermal stress, which can be used to investigate and design more effective hyperthermia therapies.
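    The injury analysis above follows the standard Arrhenius thermal-damage formulation, in which a damage integral Omega(t) = ∫ A·exp(−Ea/(R·T(τ))) dτ accumulates over the temperature history T(τ) and exp(−Omega) gives the predicted surviving fraction. The Python sketch below illustrates that calculation; the frequency factor A and activation energy Ea are hypothetical placeholders of plausible magnitude, not the constants fitted in the study, and a transition between injury mechanisms such as the one reported at 54 °C would be captured by fitting separate (A, Ea) pairs above and below that temperature.

    import numpy as np

    R = 8.314  # universal gas constant, J/(mol*K)

    def arrhenius_injury(temps_c, dt_s, A=1.0e70, Ea=4.5e5):
        """Integrate Omega = sum of A*exp(-Ea/(R*T)) * dt over a temperature history.

        temps_c : temperatures in degrees C, sampled every dt_s seconds
        A, Ea   : frequency factor (1/s) and activation energy (J/mol);
                  hypothetical values, not the study's fitted constants.
        Returns the damage integral Omega and the predicted surviving fraction.
        """
        T = np.asarray(temps_c) + 273.15           # convert to kelvin
        omega = np.sum(A * np.exp(-Ea / (R * T))) * dt_s
        survival = np.exp(-omega)                  # fraction of undamaged cells
        return omega, survival

    # Example: a 10-minute exposure at a constant 50 degrees C, sampled once per second.
    omega, survival = arrhenius_injury(np.full(600, 50.0), dt_s=1.0)
    print(f"Omega = {omega:.3f}, surviving fraction = {survival:.3f}")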

    Dynamic Data Driven Application System for Wildfire Spread Simulation

    Wildfires have significant impact on both ecosystems and human society. To manage wildfires effectively, simulation models are used to study and predict wildfire spread. The accuracy of wildfire spread simulations depends on many factors, including GIS data, fuel data, weather data, and high-fidelity wildfire behavior models. Unfortunately, due to the dynamic and complex nature of wildfire, it is impractical to obtain all of these data without error, so predictions from the simulation model will differ from the behavior of a real wildfire. Without assimilating data from the real wildfire and dynamically adjusting the simulation, the difference between the simulation and the real wildfire is likely to grow continuously. With the development of sensor technologies and advances in computing infrastructure, dynamic data driven application systems (DDDAS) have become an active research area in recent years. In a DDDAS, data obtained from wireless sensors are fed into the simulation model as measurements used to evaluate the model's output and adjust its state, thereby improving simulation results. To improve the accuracy of wildfire spread simulations, we apply the concept of DDDAS to wildfire spread simulation by dynamically assimilating sensor data from real wildfires into the simulation model. The assimilation system relates the system model to observations of the true state and uses analysis approaches to obtain state estimates. We employ Sequential Monte Carlo (SMC) methods (also called particle filters) to carry out data assimilation in this work. Based on the structure of DDDAS, this dissertation presents the data assimilation system and data assimilation results for wildfire spread simulations. We carry out sensitivity analyses for different densities, frequencies, and qualities of sensor data, and quantify the effectiveness of SMC methods based on different measurement metrics. Furthermore, to improve simulation results, the image-morphing technique is introduced into the DDDAS for wildfire spread simulation.
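    The SMC data assimilation described above repeats a predict–reweight–resample cycle each time new sensor data arrive. The sketch below shows one bootstrap particle filter step in Python; the transition and likelihood functions are hypothetical one-dimensional stand-ins for the actual wildfire spread model and sensor model.

    import numpy as np

    rng = np.random.default_rng(0)

    def pf_step(particles, weights, observation, transition, likelihood):
        """Advance the particles one step and assimilate one sensor observation."""
        # 1. Predict: propagate each particle through the simulation model.
        particles = transition(particles)
        # 2. Update: reweight particles by how well they explain the observation.
        weights = weights * likelihood(observation, particles)
        weights = weights / weights.sum()
        # 3. Resample when the effective sample size collapses.
        ess = 1.0 / np.sum(weights ** 2)
        if ess < 0.5 * len(particles):
            idx = rng.choice(len(particles), size=len(particles), p=weights)
            particles = particles[idx]
            weights = np.full(len(particles), 1.0 / len(particles))
        return particles, weights

    # Toy usage: a 1-D "fire front position" with noisy drift and a noisy sensor.
    transition = lambda x: x + 1.0 + rng.normal(0.0, 0.5, size=x.shape)
    likelihood = lambda z, x: np.exp(-0.5 * (z - x) ** 2)
    particles = rng.normal(0.0, 1.0, size=1000)
    weights = np.full(1000, 1.0 / 1000)
    particles, weights = pf_step(particles, weights, observation=1.2,
                                 transition=transition, likelihood=likelihood)
    print("posterior mean estimate:", np.sum(weights * particles))

    Resampling only when the effective sample size drops below a threshold is a common heuristic that limits the extra variance resampling itself introduces.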

    Distributed Particle Filters for Data Assimilation in Simulation of Large Scale Spatial Temporal Systems

    Assimilating real-time sensor data into a running simulation model can improve simulation results for large-scale spatial temporal systems such as wildfires, road traffic, and floods. Particle filters are important methods for supporting data assimilation. While particle filters can work effectively with sophisticated simulation models, they have high computation cost due to the large number of particles needed to converge to the true system state. This is especially true for large-scale spatial temporal simulation systems, which have high-dimensional state spaces and high computation costs of their own. To address the performance issue of particle filter-based data assimilation, this dissertation developed distributed particle filters and applied them to large-scale spatial temporal systems. We first implemented a particle filter-based data assimilation framework and carried out data assimilation to estimate system state and model parameters in an application of wildfire spread simulation. We then developed advanced particle routing methods in distributed particle filters to route particles among the Processing Units (PUs) after resampling in an effective and efficient manner. In particular, for distributed particle filters with centralized resampling, we developed two routing policies, named the minimal transfer particle routing policy and the maximal balance particle routing policy. For distributed particle filters with decentralized resampling, we developed a hybrid particle routing approach that combines global routing with local routing to take advantage of both. The developed routing policies are evaluated in terms of communication cost and data assimilation accuracy in data assimilation for large-scale wildfire spread simulations. Moreover, as cloud computing gains popularity, we developed a parallel and distributed particle filter based on Hadoop and MapReduce to support large-scale data assimilation.
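    To make the routing problem concrete, the sketch below reconstructs the intuition behind a minimal-transfer policy for centralized resampling: once resampling fixes how many copies of each parent particle survive, as many copies as possible stay on the PU that already holds the parent, and only the surplus is shipped to under-loaded PUs. This is an illustrative reconstruction under assumed data structures, not the dissertation's exact algorithm.

    from collections import Counter

    def minimal_transfer_plan(copy_counts, owner_pu, num_pus, capacity):
        """Plan particle movement after centralized resampling.

        copy_counts[i] : resampled copies of parent particle i
        owner_pu[i]    : PU currently holding parent particle i
        capacity       : target number of particles per PU after routing
        Returns (kept, transfers), where transfers is a list of
        (parent_index, src_pu, dst_pu, n_copies) tuples.
        """
        load = Counter()
        kept, overflow = {}, []
        for i, n in enumerate(copy_counts):
            pu = owner_pu[i]
            stay = min(n, capacity - load[pu])      # copies that avoid any transfer
            load[pu] += stay
            kept[i] = stay
            if n - stay > 0:
                overflow.append((i, pu, n - stay))  # surplus copies must move
        transfers = []
        for i, src, n in overflow:                  # route the surplus to PUs with room
            for dst in range(num_pus):
                if n == 0:
                    break
                room = capacity - load[dst]
                if room > 0:
                    move = min(n, room)
                    load[dst] += move
                    transfers.append((i, src, dst, move))
                    n -= move
        return kept, transfers

    # Toy usage: 4 parent particles on 2 PUs, 3 particles per PU after routing.
    kept, transfers = minimal_transfer_plan([4, 0, 1, 1], [0, 0, 1, 1],
                                            num_pus=2, capacity=3)
    print(kept, transfers)  # one surplus copy of particle 0 moves from PU 0 to PU 1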