Primordial Gravitational Waves Enhancement
We reconsider the enhancement of primordial gravitational waves that arises
from a quantum gravitational model of inflation. A distinctive feature of this
model is that the end of inflation witnesses a brief phase during which the
Hubble parameter oscillates in sign, changing the usual Hubble friction to
anti-friction. An earlier analysis of this model was based on numerically
evolving the graviton mode functions after guessing their initial conditions
near the end of inflation. The current study is instead based on an equation
that directly evolves the normalized squared magnitude of the mode functions.
We also make a very reliable estimate of the initial condition using a rapidly
convergent expansion valid in the sub-horizon regime. Results are obtained for the
energy density per logarithmic wave number as a fraction of the critical
density. These results exhibit how the enhanced signal depends upon the number
of oscillatory periods; they also show the resonant effects associated with
particular wave numbers.
Comment: 25 pages, 14 figures
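The friction/anti-friction effect described above can be sketched numerically. Below is a minimal toy integration, assuming a single tensor mode obeying h'' + 3H h' + (k/a)^2 h = 0 with constant H and scale factor a; this is a simplification for illustration, not the paper's full oscillating-H model.

```python
import numpy as np

def mode_amplitude(H, k=1.0, a=1.0, dt=1e-3, steps=20000):
    """Semi-implicit Euler integration of the toy tensor-mode equation
    h'' + 3 H h' + (k/a)^2 h = 0 with constant H and scale factor a.
    Returns the final oscillation envelope sqrt(h^2 + (h'/omega)^2)."""
    omega = k / a
    h, hdot = 1.0, 0.0  # unit initial amplitude, initially at rest
    for _ in range(steps):
        hdot += (-3.0 * H * hdot - omega**2 * h) * dt
        h += hdot * dt
    return np.hypot(h, hdot / omega)

# Hubble friction (H > 0) damps the mode; anti-friction (H < 0) amplifies it.
damped = mode_amplitude(H=0.1)      # envelope decays like exp(-3 H t / 2)
amplified = mode_amplitude(H=-0.1)  # envelope grows like exp(3 |H| t / 2)
```

With H > 0 the envelope decays like exp(-3Ht/2); flipping the sign of H turns the damping term into a driving term, which is the anti-friction amplification the abstract refers to.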
Interaction of gravitational waves with matter
We develop a unified formalism for describing the interaction of
gravitational waves with matter that clearly separates the effects of general
relativity from those due to interactions in the matter. Using it, we derive a
general expression for the dispersion of gravitational waves in matter in terms
of correlation functions for the matter in flat spacetime. The self energy of a
gravitational wave is shown to have contributions analogous to the paramagnetic
and diamagnetic contributions to the self energy of an electromagnetic wave. We
apply the formalism to some simple systems - free particles, an interacting
scalar field, and a fermionic superfluid.
Comment: NORDITA-2011-8
Wave Propagation in Stochastic Spacetimes: Localization, Amplification and Particle Creation
Here we study novel effects associated with electromagnetic wave propagation
in a Robertson-Walker universe and the Schwarzschild spacetime with a small
amount of metric stochasticity. We find that localization of electromagnetic
waves occurs in a Robertson-Walker universe with time-independent metric
stochasticity, while time-dependent metric stochasticity induces exponential
instability in the particle production rate. For the Schwarzschild metric,
time-independent randomness can decrease the total luminosity of Hawking
radiation, due to multiple scattering of waves outside the black hole, and give
rise to event horizon fluctuations and thus fluctuations in the Hawking
temperature.
Comment: 26 pages, 1 PostScript figure, submitted to Phys. Rev. D on July 29, 199
Superadiabatic-type magnetic amplification in conventional cosmology
We consider the evolution of cosmological magnetic fields in FRW models and
outline a geometrical mechanism for their superadiabatic amplification on large
scales. The mechanism operates within standard electromagnetic theory and
applies to FRW universes with open spatial sections. We discuss the general
relativistic nature of the effect and show how it modifies the adiabatic
magnetic evolution. Assuming a universe that is only marginally open today, we
estimate the main features of the superadiabatically amplified residual field.
Comment: Minor changes. Published version
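For orientation, the adiabatic baseline that the abstract contrasts against is the standard flux-conservation scaling (a textbook result, not this paper's specific open-FRW mechanism):

```latex
% Adiabatic decay of a large-scale magnetic field in an FRW background
% with scale factor a:
B_{\rm ad} \propto a^{-2}
% "Superadiabatic" amplification means slower decay, B \propto a^{-n}
% with n < 2, so the field gains relative to the adiabatic baseline:
\frac{B}{B_{\rm ad}} \propto a^{\,2-n} \gg 1 \quad (n < 2,\ a \to \infty)
```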
Decaying Vacuum Energy and Deflationary Cosmology in Open and Closed Universes
We consider a nonsingular deflationary cosmological model with decaying
vacuum energy density in universes of arbitrary spatial curvature. Irrespective
of the value of the curvature parameter, the models are characterized by an
arbitrary time scale that determines the initial temperature of the universe
and the largest value of the vacuum energy density, whose slow decay generates
all the presently observed matter-energy of the universe. If this time scale is
of the order of the Planck time, the models begin at the Planck temperature and
the present-day value of the cosmological constant satisfies the theoretically
suggested bound. It is also shown that all models allow a density parameter
compatible with observations, and that the age of the universe is large enough
to agree with observations even with the high value of the Hubble parameter
suggested by recent measurements.
Comment: 20 pages, 2 figures (available from the authors), uses LaTeX
Development and Optimization of a Machine-Learning Prediction Model for Acute Desquamation After Breast Radiation Therapy in the Multicenter REQUITE Cohort
Purpose
Some patients with breast cancer treated by surgery and radiation therapy experience clinically significant toxicity, which may adversely affect cosmesis and quality of life. There is a paucity of validated clinical prediction models for radiation toxicity. We used machine learning (ML) algorithms to develop and optimise a clinical prediction model for acute breast desquamation after whole breast external beam radiation therapy in the prospective multicenter REQUITE cohort study.
Methods and Materials
Using demographic and treatment-related features (m = 122) from patients (n = 2058) at 26 centers, we trained 8 ML algorithms with 10-fold cross-validation in a 50:50 random-split data set with class stratification to predict acute breast desquamation. Based on performance in the validation data set, the logistic model tree, random forest, and naïve Bayes models were taken forward to cost-sensitive learning optimisation.
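The cost-sensitive optimisation step can be illustrated with a simple threshold-moving sketch in Python. This is a stand-in illustration on hypothetical synthetic scores, not the study's actual cost-sensitive random forest; it shows how an asymmetric false-negative:false-positive penalty shifts the operating point of a classifier.

```python
import numpy as np

def cost_optimal_threshold(scores, labels, fn_cost=90.0, fp_cost=1.0):
    """Choose the decision threshold on predicted scores that minimises
    the asymmetric misclassification cost fn_cost*FN + fp_cost*FP."""
    labels = np.asarray(labels, dtype=bool)
    best_t, best_cost = 0.5, np.inf
    for t in np.unique(scores):
        pred = scores >= t
        fn = np.count_nonzero(labels & ~pred)   # missed positive cases
        fp = np.count_nonzero(~labels & pred)   # false alarms
        cost = fn_cost * fn + fp_cost * fp
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Hypothetical imbalanced data: ~10% positives with higher mean score.
rng = np.random.default_rng(0)
labels = np.r_[np.zeros(900, bool), np.ones(100, bool)]
scores = np.r_[rng.normal(0.3, 0.1, 900), rng.normal(0.6, 0.1, 100)]

t_heavy = cost_optimal_threshold(scores, labels, fn_cost=90.0)  # FN-averse
t_equal = cost_optimal_threshold(scores, labels, fn_cost=1.0)   # symmetric
```

Heavily penalising false negatives pushes the operating threshold down, trading specificity for sensitivity; this is the same trade-off the cost-sensitive learners in the study make during training.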
Results
One hundred and ninety-two patients experienced acute desquamation. Resampling and cost-sensitive learning optimisation improved classification performance. Based on maximising sensitivity (true positives), the “hero” model was the cost-sensitive random forest algorithm with a false-negative:false-positive misclassification penalty of 90:1, containing m = 114 predictive features. Model sensitivity and specificity were 0.77 and 0.66, respectively, with an area under the curve of 0.77 in the validation cohort.
Conclusions
ML algorithms with resampling and cost-sensitive learning generated clinically valid prediction models for acute desquamation using patient demographic and treatment features. Further external validation and inclusion of genomic markers in ML prediction models are worthwhile to identify patients at increased risk of toxicity who may benefit from supportive intervention or even a change in treatment plan.