
    Treatment planning comparison for head and neck cancer between photon, proton, and combined proton-photon therapy - from a fixed beam line to an arc.

    BACKGROUND AND PURPOSE: This study investigates whether combined proton-photon therapy (CPPT) improves treatment plan quality compared to single-modality intensity-modulated radiation therapy (IMRT) or intensity-modulated proton therapy (IMPT) for head and neck cancer (HNC) patients. Different proton beam arrangements for CPPT and IMPT are compared, which could be of specific interest for potential future upright-positioned treatments. Furthermore, it is evaluated whether the benefits of CPPT persist under inter-fractional anatomical changes in HNC treatments.
    MATERIAL AND METHODS: Five HNC patients with a planning CT and multiple (4-7) repeated CTs were studied. CPPT plans with simultaneously optimized photon and proton fluences, single-modality IMPT plans, and IMRT plans were optimized on the planning CT and then recalculated and reoptimized on each repeated CT. For CPPT and IMPT, plans with different degrees of freedom for the proton beams were optimized: fixed horizontal beam line (FHB), gantry-like, and arc-like plans were compared.
    RESULTS: Target coverage for CPPT without adaptation is insufficient (average V95% = 88.4%), while adapted plans recover the initial treatment plan quality for the target (average V95% = 95.5%) and organs at risk. CPPT with increased proton beam flexibility improves plan quality and reduces the normal tissue complication probability (NTCP) of xerostomia and dysphagia. On average, xerostomia NTCP reductions relative to IMRT are -2.7%/-3.4%/-5.0% for CPPT FHB/CPPT Gantry/CPPT Arc; the corresponding differences for IMPT FHB/IMPT Gantry/IMPT Arc are +0.8%/-0.9%/-4.3%.
    CONCLUSION: CPPT for HNC requires adaptive treatments. Increasing proton beam flexibility in CPPT, either with a gantry or with an upright-positioned patient, improves treatment plan quality. However, the photon component is then substantially reduced, so the balance between improved plan quality and cost must be investigated further.

    Spin Flip Probabilities in ^208Pb Measured with 200 MeV Protons

    This research was sponsored by the National Science Foundation Grant NSF PHY-931478

    An approach for estimating dosimetric uncertainties in deformable dose accumulation in pencil beam scanning proton therapy for lung cancer

    Deformable image registration (DIR) is an important component of dose accumulation and the associated clinical outcome evaluation in radiotherapy. However, the resulting deformation vector field (DVF) is subject to unavoidable discrepancies when different algorithms are applied, leading to dosimetric uncertainties in the accumulated dose. We propose here an approach for proton therapy to estimate the dosimetric uncertainties that follow from modeled or estimated DVF uncertainties. A patient-specific DVF uncertainty model was built on the first treatment fraction by correlating, at each voxel, the magnitude differences among five DIR results with the magnitude of a single reference DIR. In the following fractions, only the reference DIR needs to be applied, and DVF geometric uncertainties are estimated by this model. The associated dosimetric uncertainties are then derived from the estimated geometric DVF uncertainty, the dose gradient of the recalculated fractional dose distribution, and a direction factor from the reference DIR applied in that fraction. The estimated dose uncertainty was compared to the reference dose uncertainty obtained when different DIRs were applied individually for each dose warping. The approach was validated on seven NSCLC patients, each with nine repeated CTs. In a conservative voxel-to-voxel comparison, the proposed model-based method reproduces the 'reference' dosimetric uncertainty to within +/- 5% of the prescribed dose for 77% of the voxels in the body and 66%-98% of the voxels in the investigated structures. We thus propose a method to estimate DIR-induced uncertainties in dose accumulation for proton therapy of lung tumors.
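The per-voxel estimate described above (geometric DVF uncertainty scaled by the local dose gradient magnitude and a direction factor from the reference DIR) can be sketched as follows; the function name and the numbers are illustrative, not taken from the paper:

```python
import math

def dose_uncertainty(dvf_sigma_mm, dose_gradient, direction_factor):
    """Estimate per-voxel dose uncertainty (Gy) as the product of the
    DVF geometric uncertainty (mm), the magnitude of the local dose
    gradient (Gy/mm, given as a 3-vector), and a direction factor in
    [0, 1] derived from the reference DIR (illustrative sketch)."""
    grad_mag = math.sqrt(sum(g * g for g in dose_gradient))
    return dvf_sigma_mm * grad_mag * direction_factor

# A voxel with 2 mm estimated DVF uncertainty in a steep dose gradient:
u = dose_uncertainty(2.0, (1.2, 0.9, 0.0), 0.8)
```

Voxels in flat dose regions (gradient near zero) contribute little dosimetric uncertainty even when their geometric DVF uncertainty is large, which is exactly the behaviour the product form above captures.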

    Neural parameters estimation for brain tumor growth modeling

    Understanding the dynamics of brain tumor progression is essential for optimal treatment planning. Cast in a mathematical formulation, the problem is typically viewed as the evaluation of a system of partial differential equations that describes the physiological processes governing tumor growth. To personalize the model, i.e., to find a set of parameters relevant to the tumor dynamics of a particular patient, the model is informed by empirical data, e.g., medical images obtained from diagnostic modalities such as magnetic resonance imaging. Existing model-observation coupling schemes require a large number of forward integrations of the biophysical model and rely on simplifying assumptions about the functional form linking the output of the model with the image information. In this work, we propose a learning-based technique for estimating tumor growth model parameters from medical scans. The technique allows for explicit evaluation of the posterior distribution of the parameters by sequentially training a mixture-density network, relaxing the constraint on the functional form and reducing the number of samples that must be propagated through the forward model for the estimation. We test the method on synthetic and real scans of rats injected with brain tumors to calibrate the model and to predict tumor progression.
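A mixture-density network outputs the parameters of a Gaussian mixture, so the posterior over a growth parameter can then be evaluated in closed form. A minimal sketch of that evaluation step, with made-up parameter values and the network itself omitted:

```python
import math

def mixture_density(x, weights, means, sigmas):
    """Evaluate a 1-D Gaussian mixture density at x. In the setting
    described above, (weights, means, sigmas) would come from the
    output head of a mixture-density network for one tumor growth
    parameter (values here are illustrative, not from the paper)."""
    return sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
        for w, m, s in zip(weights, means, sigmas)
    )

# Bimodal posterior over, e.g., a diffusion coefficient (made-up values):
p = mixture_density(0.1, [0.6, 0.4], [0.1, 0.3], [0.02, 0.05])
```

Because the posterior is an explicit mixture, quantities such as the mode or credible intervals can be read off without drawing additional samples from the forward model.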

    Is there a Pronounced Giant Dipole Resonance in ^4He?

    A four-nucleon calculation of the total ^4He photodisintegration cross section is performed. The full final-state interaction is taken into account for the first time; this is achieved via the method of the Lorentz integral transform. Semi-realistic NN interactions are employed. In contrast to the known partial two-body ^4He(\gamma,n)^3He and ^4He(\gamma,p)^3H cross sections, our total cross section exhibits a pronounced giant resonance. Thus, in contrast to older (\gamma,np) data, we predict quite a strong contribution of the (\gamma,np) channel at the giant resonance peak energy.
    Comment: 10 pages, LaTeX (REVTeX), 4 PostScript figures, to appear in Phys. Rev. Lett.

    Metacognition as Evidence for Evidentialism

    Metacognition is the monitoring and controlling of cognitive processes. I examine the role of metacognition in ‘ordinary retrieval cases’, cases in which it is intuitive that via recollection the subject has a justified belief. Drawing on psychological research on metacognition, I argue that evidentialism has a unique, accurate prediction in each ordinary retrieval case: the subject has evidence for the proposition she justifiedly believes. But, I argue, process reliabilism has no unique, accurate predictions in these cases. I conclude that ordinary retrieval cases better support evidentialism than process reliabilism. This conclusion challenges several common assumptions. One is that non-evidentialism alone allows for a naturalized epistemology, i.e., an epistemology that is fully in accordance with scientific research and methodology. Another is that process reliabilism fares much better than evidentialism in the epistemology of memory.

    The Epistemic Status of Processing Fluency as Source for Judgments of Truth

    This article combines findings from cognitive psychology on the role of processing fluency in truth judgments with epistemological theory on the justification of belief. We first review evidence that repeated exposure to a statement increases the subjective ease with which that statement is processed. This increased processing fluency, in turn, increases the probability that the statement is judged to be true. The basic question discussed here is whether the use of processing fluency as a cue to truth is epistemically justified. In the present analysis, based on Bayes’ Theorem, we adopt the reliable-process account of justification presented by Goldman (1986) and show that fluency is a reliable cue to truth, under the assumption that the majority of statements one has been exposed to are true. In the final section, we broaden the scope of this analysis and discuss how processing fluency as a potentially universal cue to judged truth may contribute to cultural differences in commonsense beliefs.
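The Bayesian form of the reliability argument can be made concrete: if most statements one encounters are true, and repetition makes true statements fluent more often than false ones, then fluency raises the posterior probability of truth. A minimal sketch with made-up probabilities (none of the numbers come from the article):

```python
def posterior_true(prior_true, p_fluent_given_true, p_fluent_given_false):
    """Bayes' theorem: P(statement is true | it is processed fluently)."""
    p_fluent = (p_fluent_given_true * prior_true
                + p_fluent_given_false * (1.0 - prior_true))
    return p_fluent_given_true * prior_true / p_fluent

# If 70% of encountered statements are true and true statements are
# more often fluent (0.8 vs 0.4), fluency is a reliable cue:
post = posterior_true(0.70, 0.8, 0.4)  # > 0.70
```

Note that the conclusion flips if the environment is dominated by false statements: with a low prior, the same likelihoods leave the posterior below one half, which is why the assumption that most encountered statements are true carries the argument.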

    Optimal margin and edge-enhanced intensity maps in the presence of motion and uncertainty

    In radiation therapy, intensity maps involving margins have long been used to counteract the dose blurring that arises from motion. More recently, intensity maps with increased intensity near the edge of the tumour (edge enhancements) have been studied to evaluate their ability to offset similar effects on tumour coverage. In this paper, we present a mathematical methodology to derive margin and edge-enhanced intensity maps that aim to provide tumour coverage while delivering the minimum total dose. We show that if the tumour is at most about twice as large as the standard deviation of the blurring distribution, the optimal intensity map is a pure scaling increase of the static intensity map, without any margins or edge enhancements. Otherwise, if the tumour size is roughly twice (or more) the standard deviation of motion, then margins and edge enhancements are preferred, and we present formulae to calculate the exact dimensions of these intensity maps. Furthermore, we extend our analysis to scenarios where the parameters of the motion distribution are not known with certainty but can take any value in some range. In these cases, we derive a similar threshold to determine the structure of an optimal margin intensity map.
    National Cancer Institute (U.S.) (grant R01-CA103904)
    National Cancer Institute (U.S.) (grant R01-CA118200)
    Natural Sciences and Engineering Research Council of Canada (NSERC)
    Siemens Aktiengesellschaft
    Massachusetts Institute of Technology. Hugh Hampton Young Memorial Fund fellowship
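The threshold described above (tumour size relative to roughly twice the standard deviation of the motion blur) can be sketched as a decision rule; the function name and the use of an exact factor of 2 as a sharp cutoff are illustrative simplifications of the paper's analysis:

```python
def intensity_map_strategy(tumour_size, motion_sigma, factor=2.0):
    """Rule of thumb from the analysis above: when the tumour is at
    most about twice the standard deviation of the blurring
    distribution, a pure scaling of the static intensity map is
    optimal; beyond that, margins and edge enhancements are
    preferred. (The cutoff is treated as sharp for clarity.)"""
    if tumour_size <= factor * motion_sigma:
        return "scaled static map"
    return "margin / edge-enhanced map"

# Small tumour relative to motion blur: pure scaling suffices
strategy = intensity_map_strategy(tumour_size=6.0, motion_sigma=4.0)
```

Under motion-parameter uncertainty, the paper derives a similar threshold over a range of possible sigma values; the sketch above would then be evaluated against the worst case in that range rather than a single sigma.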