Quantum Tests of the Foundations of General Relativity
The role of the equivalence principle in the context of non-relativistic
quantum mechanics and matter wave interferometry, especially atom beam
interferometry, will be discussed. A generalised form of the weak equivalence
principle, capable of covering quantum phenomena as well, will be proposed.
It is shown that this generalised equivalence principle is valid for matter
wave interferometry and for the dynamics of expectation values. In addition,
the use of this equivalence principle makes it possible to determine the
structure of the interaction of quantum systems with gravitational and inertial
fields. It is also shown that the path of the mean value of the position
operator in the case of gravitational interaction does fulfill this generalised
equivalence principle.
Comment: Classical and Quantum Gravity 15, 13 (1998)
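The statement about the mean of the position operator can be illustrated by the Ehrenfest theorem for a uniform gravitational field (a standard textbook relation, stated here as background rather than taken from the paper):

```latex
% Ehrenfest theorem for a particle of mass m in a potential V(x)
\frac{d^{2}}{dt^{2}}\langle \hat{x} \rangle
  = -\frac{1}{m}\left\langle \frac{\partial V}{\partial x} \right\rangle ,
\qquad
V(x) = m g x
\;\Longrightarrow\;
\frac{d^{2}}{dt^{2}}\langle \hat{x} \rangle = -g .
```

Because the mass cancels, the trajectory of the expectation value is mass-independent, which is the sense in which a weak-equivalence-type principle can hold for the dynamics of expectation values.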
Matching LTB and FRW spacetimes through a null hypersurface
Matching of a LTB metric representing dust matter to a background FRW
universe across a null hypersurface is studied. In general, an unrestricted
matching is possible only if the background FRW is flat or open. There is in
general no gravitational impulsive wave present on the null hypersurface which
is shear-free and expanding. Special cases of vanishing pressure or energy
density on the hypersurface are discussed. In the case of vanishing energy
momentum tensor of the null hypersurface, i.e. in the case of a null boundary,
it turns out that all possible definitions of the Hubble parameter on the null
hypersurface, whether those of LTB or that of FRW, are equivalent, and that a
flat FRW can only be joined smoothly to a flat LTB.
Comment: 9 pages
Tests of relativity using a microwave resonator
The frequencies of a cryogenic sapphire oscillator and a hydrogen maser are
compared to set new constraints on a possible violation of Lorentz invariance.
We determine the variation of the oscillator frequency as a function of its
orientation (Michelson-Morley test) and of its velocity (Kennedy-Thorndike
test) with respect to a preferred frame candidate. We constrain the
corresponding parameters of the Mansouri and Sexl test theory to values
equivalent to the best previous result for the former and representing a
30-fold improvement for the latter.
Comment: 8 pages, 2 figures, submitted to Physical Review Letters (October 3, 2002)
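For context, in the Mansouri–Sexl (RMS) framework such resonator experiments bound two parameter combinations; the forms below are the ones commonly used in the literature and are stated as background, not taken from this abstract:

```latex
% Orientation-sensitive (Michelson-Morley) and velocity-sensitive
% (Kennedy-Thorndike) combinations of the RMS parameters
P_{\mathrm{MM}} = \tfrac{1}{2} - \hat{\beta} + \hat{\delta} ,
\qquad
P_{\mathrm{KT}} = \hat{\beta} - \hat{\alpha} - 1 .
```

Both combinations vanish for the special-relativistic values \(\hat{\alpha} = -\tfrac{1}{2}\), \(\hat{\beta} = \tfrac{1}{2}\), \(\hat{\delta} = 0\), so a nonzero measured value of either would signal a Lorentz violation.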
Long-term follow-up of patients undergoing resection of TNM Stage I colorectal cancer: an analysis of tumour and host determinants of outcome
Background
Screening for colorectal cancer improves cancer-specific survival (CSS) through the detection of early-stage disease; however, its impact on overall survival (OS) is unclear. The present study examined tumour and host determinants of outcome in TNM Stage I disease.
Methods
All patients with pathologically confirmed TNM Stage I disease across 4 hospitals in the North of Glasgow between 2000 and 2008 were included. The preoperative modified Glasgow Prognostic Score (mGPS) was used as a marker of the host systemic inflammatory response (SIR).
Results
A total of 191 patients were identified; 105 (55%) were male, 91 (48%) were over 75 years of age and 7 (4%) underwent an emergency operation. Among those with a preoperative CRP result (n = 150), 35 (24%) had an elevated mGPS. Median follow-up of survivors was 116 months (minimum 72 months), during which 88 (46%) patients died: 7 (8%) postoperative deaths, 15 (17%) cancer-related deaths and 66 (75%) non-cancer-related deaths. Five-year CSS was 95% and OS was 76%. On univariate analysis, advancing age (p < 0.001), emergency presentation (p = 0.008) and an elevated mGPS (p = 0.012) were associated with reduced OS. On multivariate analysis, only age (HR 3.611, 95% CI 2.049–6.365, p < 0.001) and an elevated mGPS (HR 2.173, 95% CI 1.204–3.921, p = 0.010) retained significance.
Conclusions
In patients undergoing resection for TNM Stage I colorectal cancer, an elevated mGPS was an objective independent marker of poorer OS. These patients may benefit from a targeted intervention.
An Evolutionary Approach to Load Balancing Parallel Computations
We present a new approach to balancing the workload in a multicomputer when the problem is decomposed into subproblems mapped to the processors. It is based on a hybrid genetic algorithm. A number of design choices for genetic algorithms are combined in order to ameliorate the problem of premature convergence that is often encountered in implementations of classical genetic algorithms. The algorithm is hybridized by including a hill climbing procedure, which significantly improves the efficiency of the evolution. Moreover, it makes use of problem-specific information to avoid some computational costs and to reinforce favourable aspects of the genetic search at appropriate points. The experimental results show that the hybrid genetic algorithm can find solutions within 3% of the optimum in a reasonable time. They also suggest that this approach is not biased towards particular problem structures.
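The hybrid idea can be sketched in a few lines, assuming a simple model where independent task costs are assigned to processors and fitness is the makespan (maximum processor load). All names and parameter values below are illustrative, not the paper's implementation:

```python
import random

def makespan(assign, costs, n_procs):
    """Maximum processor load for a task-to-processor assignment."""
    loads = [0.0] * n_procs
    for task, proc in enumerate(assign):
        loads[proc] += costs[task]
    return max(loads)

def hill_climb(assign, costs, n_procs):
    """Greedy local search: move one task at a time from the most
    loaded processor to the least loaded one while the makespan of
    that pair improves."""
    assign = assign[:]
    while True:
        loads = [0.0] * n_procs
        for task, proc in enumerate(assign):
            loads[proc] += costs[task]
        src = loads.index(max(loads))
        dst = loads.index(min(loads))
        moved = False
        for task, proc in enumerate(assign):
            if proc == src and max(loads[src] - costs[task],
                                   loads[dst] + costs[task]) < loads[src]:
                assign[task] = dst
                moved = True
                break
        if not moved:
            return assign

def hybrid_ga(costs, n_procs, pop_size=20, generations=30, seed=0):
    """Hybrid GA: truncation selection, uniform crossover, point
    mutation, and hill climbing applied to every offspring."""
    rng = random.Random(seed)
    n_tasks = len(costs)
    pop = [[rng.randrange(n_procs) for _ in range(n_tasks)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: makespan(a, costs, n_procs))
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            if rng.random() < 0.3:  # point mutation
                child[rng.randrange(n_tasks)] = rng.randrange(n_procs)
            children.append(hill_climb(child, costs, n_procs))
        pop = parents + children
    return min(pop, key=lambda a: makespan(a, costs, n_procs))
```

The hill-climbing pass after crossover and mutation is the hybridisation step: it repairs poor offspring cheaply, so the genetic search spends its evaluations on locally optimised candidates.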
Factors associated with the efficacy of polyp detection during routine flexible sigmoidoscopy
Objective: Flexible sigmoidoscopy reduces the incidence of colonic cancer through the detection and removal of premalignant adenomas. However, the efficacy of the procedure is variable. The aim of the present study was to examine factors associated with the efficacy of detecting polyps during flexible sigmoidoscopy.
Design and patients: Retrospective observational cohort study of all individuals undergoing routine flexible sigmoidoscopy in NHS Greater Glasgow and Clyde from January 2013 to January 2016.
Results: A total of 7713 patients were included. Median age was 52 years and 50% were male. Polyps were detected in 1172 (13%) patients. On multivariate analysis, increasing age (OR 1.020 (1.016–1.023) p<0.001), male sex (OR 1.23 (1.10–1.38) p<0.001) and the use of any bowel preparation (OR 3.55 (1.47–8.57) p<0.001) were associated with increasing numbers of polyps being detected. There was no significant difference in the number of polyps found in patients who had received an oral laxative preparation compared with an enema (OR 3.81 (1.57–9.22) vs 3.45 (1.43–8.34)), or in those who received sedation versus those who had not (OR 1.00 vs 1.04 (0.91–1.17) p=0.591). Furthermore, the highest number of polyps was found when the sigmoidoscope was inserted to the descending colon (OR 1.30 (1.04–1.63)).
Conclusions: Increasing age, male sex and the utilisation of any bowel preparation were associated with an increased polyp detection rate. However, the use of sedation or oral laxative preparation appears to confer no additional benefit. In addition, the results indicate that insertion to the descending colon optimises the efficacy of flexible sigmoidoscopy polyp detection.
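For readers unfamiliar with how such figures are reported: an odds ratio and its 95% confidence interval are obtained from a fitted logistic-regression coefficient and its standard error by exponentiation. The sketch below uses hypothetical β and SE values, not numbers from this study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio with a 95% confidence interval from a
    logistic-regression coefficient `beta` and standard error `se`."""
    return (math.exp(beta),           # point estimate
            math.exp(beta - z * se),  # lower bound
            math.exp(beta + z * se))  # upper bound

# hypothetical coefficient for illustration: beta = ln(1.23), se = 0.06
or_, lo, hi = odds_ratio_ci(math.log(1.23), 0.06)
```

Multiplying and dividing by the same factor exp(1.96·SE) is why reported confidence intervals around an OR are asymmetric on the raw scale but symmetric on the log scale.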
Parallel Genetic Algorithms with Application to Load Balancing for Parallel Computing
A new coarse grain parallel genetic algorithm (PGA) and a new implementation of a data-parallel GA are presented in this paper. They are based on models of natural evolution in which the population is formed of discontinuous or continuous subpopulations. In addition to simulating natural evolution, the intrinsic parallelism in the two PGAs minimizes the possibility of premature convergence that implementations of classic GAs often encounter. Intrinsic parallelism also allows the evolution of fit genotypes in a smaller number of generations in the PGAs than in sequential GAs, leading to superlinear speed-ups. The PGAs have been implemented on a hypercube and a Connection Machine, and their operation is demonstrated by applying them to the load balancing problem in parallel computing. The PGAs have found near-optimal solutions which are comparable to the solutions of a simulated annealing algorithm and are better than those produced by a sequential GA and by other load balancing methods. On the one hand, the PGAs accentuate the advantage of parallel computers for simulating natural evolution; on the other, they represent new techniques for load balancing parallel computations.
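A coarse-grain (island-model) PGA of the kind described can be sketched as follows. This is a generic illustration on a toy OneMax objective, not the authors' hypercube or Connection Machine implementation, and all parameter values are assumptions:

```python
import random

def evolve(pop, fitness, rng, n_gens):
    """Evolve one subpopulation: truncation selection, one-point
    crossover, occasional bit-flip mutation."""
    for _ in range(n_gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: len(pop) // 2]
        children = []
        while len(elite) + len(children) < len(pop):
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, len(p1))
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:
                child[rng.randrange(len(child))] ^= 1
            children.append(child)
        pop[:] = elite + children

def island_ga(n_islands=4, island_size=20, genome_len=32,
              epochs=10, gens_per_epoch=5, seed=0):
    """Coarse-grain PGA: islands evolve independently and, after each
    epoch, each island's best individual migrates around a ring,
    replacing the worst individual of the next island."""
    rng = random.Random(seed)
    fitness = sum  # OneMax: maximise the number of 1-bits
    islands = [[[rng.randrange(2) for _ in range(genome_len)]
                for _ in range(island_size)] for _ in range(n_islands)]
    for _ in range(epochs):
        for pop in islands:
            evolve(pop, fitness, rng, gens_per_epoch)
        bests = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):  # ring migration
            worst = min(range(len(pop)), key=lambda j: fitness(pop[j]))
            pop[worst] = bests[(i - 1) % n_islands][:]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)
```

Keeping the subpopulations isolated between migration events is what counters premature convergence: each island can explore a different region of the search space before the occasional exchange of good genotypes.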
Thick planar domain wall: its thin wall limit and dynamics
We consider a planar gravitating thick domain wall, treated as a spacetime
with finite thickness glued to two vacuum spacetimes on
each side of it. Darmois junction conditions written on the boundaries of the
thick wall with the embedding spacetimes reproduce the Israel junction
condition across the wall in the limit of infinitesimal thickness. The thick
planar domain wall located at a fixed position is then transformed to a new
coordinate system in which its dynamics can be formulated. It is shown that the
wall's core expands as if it were a thin wall. The thickness in the new
coordinates is not constant anymore and its time dependence is given.
Comment: 11 pages, to appear in IJMP
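The thin-wall limit invoked above is governed by the Israel junction condition, which relates the jump in extrinsic curvature across the wall to its surface stress-energy (standard form in units G = c = 1; stated as background, not quoted from the paper):

```latex
% Israel junction condition across a hypersurface with induced
% metric h_{ab}; [X] denotes the jump X^{+} - X^{-} across it
[K_{ab}] - h_{ab}\,[K] = -8\pi\, S_{ab} .
```

Darmois matching corresponds to \(S_{ab} = 0\), i.e. \([K_{ab}] = 0\), at each boundary of the finite-thickness wall; as the thickness shrinks to zero, the accumulated change in extrinsic curvature across the wall becomes the distributional surface term \(S_{ab}\) of the Israel condition.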
Time scale based analysis of in-situ crystal formation in droplet undergoing rapid dehydration
The surface structure of crystalline particles affects their functionality in drug delivery. Predicting the final structure of particles that crystallize readily within the spray drying process is of interest for many applications. A theoretical framework was developed for predicting the crystal structure precipitating on the surface of the particle. The model is based on the dimensionless Damköhler number (Da) as an indicator of final particle morphology. Timescales of evaporation and reaction were required to calculate the Damköhler number. The modified evaporation timescale was estimated from the time available for the crystal to precipitate after supersaturation, and the reaction timescale from the induction time. Mannitol was produced under different processing conditions to validate the theoretical model. Results showed that at high Damköhler numbers the surface structure of the particle was rough, while smaller Damköhler numbers led to relatively smooth particle surfaces. Additionally, although the beta polymorph was dominant in all of the experiments, the alpha polymorph also precipitated in the experiments with a large Damköhler number. The theoretical framework developed will be a useful predictive tool to guide the manipulation of particle crystallization in spray dryers.
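The timescale argument can be captured in a few lines. The threshold separating "rough" from "smooth" below is an assumption for illustration only, since the abstract states only that larger Da correlates with rougher surfaces:

```python
def damkohler(t_evaporation, t_induction):
    """Damkohler number as a timescale ratio: here taken as the
    modified evaporation timescale over the induction (reaction)
    timescale, per the framework described in the abstract."""
    if t_induction <= 0:
        raise ValueError("induction timescale must be positive")
    return t_evaporation / t_induction

def predicted_morphology(da, threshold=1.0):
    """Illustrative classifier: large Da -> rough crystalline surface,
    small Da -> smooth surface. The threshold value is a placeholder,
    not a number from the study."""
    return "rough" if da > threshold else "smooth"
```

A large Da means evaporation is slow relative to crystal induction, leaving time for crystals to nucleate and grow on the receding droplet surface, which is consistent with the rough morphologies reported for the high-Da experiments.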
Limits on the Time Evolution of Space Dimensions from Newton's Constant
Limits are imposed upon the possible rate of change of extra spatial
dimensions in a decrumpling model Universe with time variable spatial
dimensions (TVSD) by considering the time variation of (1+3)-dimensional
Newton's constant. Previous studies of the time variation of the
(1+3)-dimensional Newton's constant in TVSD theory did not include the effects
of the volume of the extra dimensions or of the surface area of the unit
sphere in D space dimensions. Our main result is that the absolute value of
the present rate of change of the number of spatial dimensions is less than
about 10^{-14} yr^{-1}. Our results would appear to provide a prima facie case
for ruling out the TVSD model. We show that, based on observational bounds on
the present-day variation of Newton's constant, the spatial dimension of the
Universe when it was at the Planck scale must have been less than or equal to
3.09. If the dimension of space at the Planck scale is constrained to be
fractional and very close to 3, the whole edifice of the TVSD model loses
credibility.
Comment: 22 pages, accepted for publication in Int. J. Mod. Phys.