
    Modelling the impact of treatment uncertainties in radiotherapy

    Uncertainties are an inevitable part of the radiotherapy process. Uncertainty in the dose deposited in the tumour arises from organ motion, patient positioning errors, fluctuations in machine output, delineation of regions of interest, the imaging modality used, and treatment planning algorithm assumptions, among other factors; there is uncertainty in the dose required to eradicate a tumour due to interpatient variation in patient-specific factors such as radiosensitivity; and there is uncertainty in the dose-volume constraints that limit dose to normal tissue. This thesis comprises three major streams of research: investigation of the actual dose delivered to target and normal tissue, the effect of dose uncertainty on radiobiological indices, and techniques to display dose uncertainty in a treatment planning system. All of the analyses are performed on the dose distribution from a four-field box treatment using 6 MV photons. The treatment fields use uniform margins of 0.5 cm, 1.0 cm, and 1.5 cm between the clinical target volume and the planning target volume. The major work is preceded by a thorough literature review of the size of setup and organ motion errors for various organs and setup techniques used in radiotherapy. A Monte Carlo (MC) code was written to simulate both the treatment planning and delivery phases of radiotherapy treatment. Using MC, the mean and the variation in treatment dose are calculated both for an individual patient and across a population of patients. In particular, the possible discrepancy in tumour position when it is localised from a single CT scan, and the magnitude of the reduction in dose variation achieved with multiple CT scans, are investigated. A novel convolution kernel that incorporates multiple pretreatment CT scans into the calculation of mean treatment dose is derived. Variations in the dose deposited in the prostate and rectal wall are assessed for each margin and for various magnitudes of systematic and random error and penumbra gradients. The linear-quadratic model is used to calculate prostate tumour control probability (TCP), incorporating the actual (modelled) delivered prostate dose. The Källman S-model is used to calculate normal tissue complication probability (NTCP), incorporating the actual (modelled) fraction dose in the deforming rectal wall. The impact of each treatment uncertainty on the variation in each radiobiological index is calculated for all margin sizes. Thesis (Ph.D.)--Department of Physics and Mathematical Physics, 2002
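    The tumour control probability calculation described above follows the standard Poisson/linear-quadratic form; the sketch below illustrates how a dose uncertainty can be propagated into a spread of TCP values. It is a minimal illustration, not the thesis's Monte Carlo code, and every parameter (alpha, beta, clonogen number, dose levels, dose spread) is assumed for the example.

```python
import numpy as np

# Minimal sketch (not the thesis's actual code): Poisson TCP with
# linear-quadratic (LQ) cell survival, TCP = exp(-N * S(D)), where
# S(D) = exp(-(alpha*D + beta*d*D)) for total dose D delivered in
# fractions of size d. Parameter values below are illustrative only.

def lq_survival(total_dose, dose_per_fraction, alpha=0.15, beta=0.05):
    """Surviving fraction under the LQ model for fractionated delivery."""
    return np.exp(-(alpha * total_dose + beta * dose_per_fraction * total_dose))

def poisson_tcp(total_dose, dose_per_fraction, clonogen_number=1e7, **lq_params):
    """Tumour control probability: probability that no clonogen survives."""
    surviving_cells = clonogen_number * lq_survival(total_dose, dose_per_fraction, **lq_params)
    return np.exp(-surviving_cells)

# Propagate a dose uncertainty into a TCP spread by sampling the delivered dose.
rng = np.random.default_rng(0)
planned_dose, frac_dose = 70.0, 2.0                 # Gy, Gy/fraction (illustrative)
delivered = rng.normal(planned_dose, 2.0, 10000)    # assumed 2 Gy standard deviation
tcp_samples = poisson_tcp(delivered, frac_dose)
print(f"mean TCP = {tcp_samples.mean():.3f}, sd = {tcp_samples.std():.3f}")
```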

    Liftoff and Transition Database Generation for Launch Vehicles Using Data-Fusion-Based Modeling

    A data fusion technique for merging multiple data sources with differing fidelity and resolution was developed to support the production of aerodynamic line load databases for the Liftoff and Transition (LOT) flight phase of the Space Launch System (SLS). The technique uses a reduced-order model based on a high-fidelity line load data set from Computational Fluid Dynamics (CFD) to predict solutions for a much larger solution space. Even higher-fidelity force and moment information (from wind-tunnel tests) is then used to adjust the model. The adjustment uses constrained optimization through the method of Lagrange multipliers to minimize the deviation of the line load distribution from the spatially dense CFD solution, while ensuring that the integrated force and moment values match those observed in physical wind-tunnel measurements. Though the wind-tunnel data are operationally dense (available at many flow conditions), they are spatially coarse (only the overall forces and moments are available). Conversely, CFD for such complex configurations is expensive, and thus operationally sparse. Data fusion techniques are necessary to make the most efficient use of available information, delivering accurate results within time and resource constraints.
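    The Lagrange-multiplier adjustment described above has a simple closed form when the force and moment constraints are linear in the sectional load. The sketch below is an assumed minimal version, not the SLS database tooling: it nudges a CFD line load so that its integrated force and pitching moment match wind-tunnel values while staying as close as possible (in a least-squares sense) to the CFD distribution; the quadrature scheme, reference point, and all numbers are illustrative.

```python
import numpy as np

# With linear constraints A @ f = b, the Lagrange-multiplier solution to
# "minimize ||f - f_cfd||^2 subject to A @ f = b" is
#   f = f_cfd + A.T @ inv(A @ A.T) @ (b - A @ f_cfd).

def adjust_line_load(f_cfd, stations, force_wt, moment_wt, x_ref=0.0):
    """Adjust a sectional load f_cfd so its integrated force and moment about
    x_ref match wind-tunnel values while staying closest to f_cfd."""
    # Trapezoidal quadrature weights: sum(w * f) == np.trapz(f, stations).
    w = np.zeros_like(stations)
    dx = np.diff(stations)
    w[:-1] += dx / 2.0
    w[1:] += dx / 2.0
    A = np.vstack([w, w * (stations - x_ref)])   # rows: force integral, moment integral
    b = np.array([force_wt, moment_wt])
    correction = A.T @ np.linalg.solve(A @ A.T, b - A @ f_cfd)
    return f_cfd + correction

# Illustrative use with made-up numbers.
x = np.linspace(0.0, 10.0, 101)                  # axial stations
f_cfd = np.sin(0.3 * x) + 1.0                    # CFD line load (illustrative)
f_adj = adjust_line_load(f_cfd, x, force_wt=12.0, moment_wt=55.0)
print(np.trapz(f_adj, x), np.trapz(f_adj * x, x))   # -> 12.0 and 55.0
```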

    Decoherence rates for Galilean covariant dynamics

    We introduce a measure of decoherence for a class of density operators. For Gaussian density operators in one dimension, it coincides with an index used by Morikawa (1990). Spatial decoherence rates are derived for three large classes of the Galilean covariant quantum semigroups introduced by Holevo. We also characterize the relaxation to a Gaussian state for these dynamics and give a theorem for the convergence of the Wigner function to the probability distribution of the classical analog of the process.
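    As a rough illustration of the kind of index involved, a one-dimensional Gaussian density operator can be written in the position representation with a diagonal spread and an off-diagonal coherence length, and their ratio serves as a spatial decoherence measure; whether this form matches the index of Morikawa (1990) used in the paper is an assumption here.
\[
\rho(x,x') \;\propto\; \exp\!\left[-\frac{(x+x')^{2}}{8\sigma^{2}} \;-\; \frac{(x-x')^{2}}{2\ell^{2}} \;+\; \frac{i\,p_{0}\,(x-x')}{\hbar}\right],
\qquad
\delta \;=\; \frac{\ell}{2\sigma},
\]
    where \(\sigma\) is the spatial spread along the diagonal \(x=x'\), \(\ell\) is the coherence length transverse to it, and \(\delta \to 0\) signals strong spatial decoherence.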

    Measuring the Efficiency of an FCC Spectrum Auction

    FCC spectrum auctions sell licenses to provide mobile phone service in designated geographic territories. We propose a method to structurally estimate the deterministic component of bidder valuations and apply it to the 1995–1996 C-block auction. We base our estimation of bidder values on a pairwise stability condition, which implies that two bidders cannot exchange licenses in a way that increases total surplus. Pairwise stability holds in many theoretical models of simultaneous ascending auctions, including some models of intimidatory collusion and demand reduction. Pairwise stability is also approximately satisfied in data that we examine from economic experiments. The lack of post-auction resale also suggests pairwise stability. Using our estimates of deterministic valuations, we measure the allocative efficiency of the C-block outcome.
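    A hedged sketch of how pairwise stability can be turned into an estimator: every one-for-one license swap between two winning bidders yields an inequality that the observed allocation should satisfy, and a maximum-score-style objective counts the satisfied inequalities over candidate parameters. The valuation form, data, and search grid below are purely illustrative and are not the paper's specification.

```python
import itertools
import numpy as np

# Pairwise stability: no two winning bidders can swap one license each and
# raise total deterministic surplus. Given a parameterized valuation
# v(package; theta), count how many swap inequalities the observed allocation
# satisfies and pick theta to maximize that count (maximum-score style).

def valuation(package, pops, theta):
    """Illustrative bidder value for a set of license indices."""
    pop = pops[list(package)].sum()
    return pop + theta * pop ** 2    # theta scales an illustrative complementarity

def score(theta, allocation, pops):
    """Number of satisfied pairwise-swap inequalities at parameter theta."""
    satisfied = 0
    for (i, Ai), (j, Aj) in itertools.combinations(allocation.items(), 2):
        for a, b in itertools.product(Ai, Aj):
            Ai_swap = (Ai - {a}) | {b}
            Aj_swap = (Aj - {b}) | {a}
            lhs = valuation(Ai, pops, theta) + valuation(Aj, pops, theta)
            rhs = valuation(Ai_swap, pops, theta) + valuation(Aj_swap, pops, theta)
            satisfied += lhs >= rhs
    return satisfied

# Toy data: 3 bidders, 6 licenses with populations, an observed allocation.
pops = np.array([4.0, 1.0, 3.0, 2.0, 5.0, 1.5])
allocation = {"A": {0, 4}, "B": {2, 3}, "C": {1, 5}}
best_theta = max(np.linspace(-0.1, 0.1, 41), key=lambda t: score(t, allocation, pops))
print("estimated theta:", best_theta)
```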

    Do Input Quality and Structural Productivity Estimates Drive Measured Differences in Firm Productivity?

    Firms in the same industry can differ in measured total factor productivity (TFP) by multiples of 3. Griliches (1957) suggests one explanation: the quality of inputs differs across firms. Labor inputs are traditionally measured only as the number of workers. We investigate whether adjusting for the quality of labor inputs substantially decreases measured TFP dispersion. We add labor market history variables such as experience and firm and industry tenure, as well as general human capital measures such as schooling and sex. We also investigate whether an innovative structural estimator for productivity due to Olley and Pakes (1996) substantially decreases measured residual TFP. Combining labor quality and structural estimates of productivity, the one-standard-deviation difference in residual TFP in manufacturing drops from 0.70 to 0.67 multiples. Neither the structural productivity measure nor the detailed input quality measures explain the very large measured residual TFP dispersion, despite statistically precise coefficient estimates. Keywords: production function estimation; total factor productivity; input quality; structural estimates of productivity
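    The exercise described above can be pictured as comparing residual dispersion before and after adding labor quality controls to a log production function. The sketch below uses a plain OLS residual and simulated data in place of the paper's Olley-Pakes estimator and matched worker-firm data, so the coefficients and dispersion numbers are illustrative only.

```python
import numpy as np

# Measured TFP is taken as the residual of a log production function; the
# question is how much its dispersion shrinks once labor quality controls
# are added. An Olley-Pakes-style correction for input simultaneity is
# omitted here; plain OLS stands in for the structural estimator.

def residual_tfp_dispersion(log_y, X):
    """Std. dev. of residuals from regressing log output on inputs X."""
    X1 = np.column_stack([np.ones(len(log_y)), X])
    beta, *_ = np.linalg.lstsq(X1, log_y, rcond=None)
    return np.std(log_y - X1 @ beta)

rng = np.random.default_rng(1)
n = 5000
log_k = rng.normal(0, 1, n)                     # log capital (simulated)
log_l = rng.normal(0, 1, n)                     # log labor headcount (simulated)
quality = rng.normal(0, 1, n)                   # schooling/tenure index (simulated)
log_y = 0.3 * log_k + 0.6 * log_l + 0.2 * quality + rng.normal(0, 0.6, n)

basic = residual_tfp_dispersion(log_y, np.column_stack([log_k, log_l]))
with_quality = residual_tfp_dispersion(log_y, np.column_stack([log_k, log_l, quality]))
print(f"residual TFP sd: {basic:.3f} -> {with_quality:.3f} after quality controls")
```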