The Interface between Quantum Mechanics and General Relativity
The generation, as well as the detection, of gravitational radiation by means
of charged superfluids is considered. One example of such a "charged
superfluid" consists of a pair of Planck-mass-scale, ultracold "Millikan oil
drops," each with a single electron on its surface, in which the oil of the
drop is replaced by superfluid helium. When levitated in a magnetic trap, and
subjected to microwave-frequency electromagnetic radiation, a pair of such
"Millikan oil drops" separated by a microwave wavelength can become an
efficient quantum transducer between quadrupolar electromagnetic and
gravitational radiation. This leads to the possibility of a Hertz-like
experiment, in which the source of microwave-frequency gravitational radiation
consists of one pair of "Millikan oil drops" driven by microwaves, and the
receiver of such radiation consists of another pair of "Millikan oil drops" in
the far field driven by the gravitational radiation generated by the first
pair. The second pair then back-converts the gravitational radiation into
detectable microwaves. The enormous enhancement of the conversion efficiency
for these quantum transducers over that for electrons arises from the fact that
there exists macroscopic quantum phase coherence in these charged superfluid
systems.

Comment: 22 pages, 7 figures; Lamb Medal lecture on January 5, 2006 at the Physics of Quantum Electronics Winter Colloquium at Snowbird, Utah; accepted for publication in J. Mod. Opt.
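The mass and length scales in the abstract can be checked with a short numerical sketch. The 10 GHz drive frequency below is an assumed illustrative value (the abstract says only "microwave-frequency"), and the constants are standard CODATA values:

```python
import math

# CODATA values in SI units.
HBAR = 1.054571817e-34  # reduced Planck constant, J s
C = 2.99792458e8        # speed of light, m / s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2

def planck_mass():
    """Planck mass m_P = sqrt(hbar * c / G), about 22 micrograms."""
    return math.sqrt(HBAR * C / G)

def wavelength(frequency_hz):
    """Free-space wavelength lambda = c / f for the drive radiation."""
    return C / frequency_hz

m_p = planck_mass()     # ~2.18e-8 kg: macroscopic for a quantum object
lam = wavelength(10e9)  # ~3 cm pair separation at an assumed 10 GHz drive
```

The point of the sketch is only that a Planck-mass drop is tens of micrograms, large enough to be a laboratory object, while a microwave wavelength gives a centimeter-scale pair separation.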
The effectiveness of refactoring, based on a compatibility testing taxonomy and a dependency graph
In this paper, we describe and then appraise a testing taxonomy proposed by van Deursen and Moonen (VD&M) based on the post-refactoring repeatability of tests. Four categories of refactoring are identified by VD&M, ranging from semantic-preserving to incompatible: for the former, no new tests are required, while for the latter, a completely new test set has to be developed. In our appraisal of the taxonomy, we heavily stress the need for the inter-dependence of the refactoring categories to be considered when making refactoring decisions, and we base that need on a refactoring dependency graph developed as part of the research. We demonstrate that while incompatible refactorings may be harmful and time-consuming from a testing perspective, semantic-preserving refactorings can have equally unpleasant hidden ramifications despite their advantages. In fact, refactorings which fall into neither category have the most interesting properties. We support our results with empirical refactoring data drawn from seven Java open-source systems (OSS), and from the same analysis we form a tentative categorization of code smells.
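The inter-dependence argument can be illustrated with a toy dependency graph: a transitive-reachability pass shows how one refactoring decision can ripple into others, so even a semantic-preserving refactoring may force tests elsewhere to be reconsidered. The refactoring names and edges below are hypothetical examples, not data from the paper:

```python
from collections import deque

# Hypothetical example: refactoring names and edges are illustrative,
# not data from the paper. An edge u -> v means applying u may ripple
# into v, so v's tests must be reconsidered as well.
DEPENDENTS = {
    "Rename Method":  ["Move Method"],
    "Move Method":    ["Extract Method"],
    "Extract Method": [],
}

def affected(refactoring, graph):
    """All refactorings transitively reachable from the starting one (BFS)."""
    seen, queue = set(), deque([refactoring])
    while queue:
        current = queue.popleft()
        for nxt in graph.get(current, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

Here `affected("Rename Method", DEPENDENTS)` returns both downstream refactorings, showing that the testing cost of a decision depends on the graph, not just on the category of the refactoring itself.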
Guide for Directors of Community-based Organizations / Guía de Dirigentes de Organizaciones de Base
Towards Work-Efficient Parallel Parameterized Algorithms
Parallel parameterized complexity theory studies how fixed-parameter
tractable (fpt) problems can be solved in parallel. Previous theoretical work
focused on parallel algorithms that are very fast in principle, but did not
take into account that when we only have a small number of processors (between
2 and, say, 1024), it is more important that the parallel algorithms are
work-efficient. In the present paper we investigate how work-efficient fpt
algorithms can be designed. We review standard methods from fpt theory, like
kernelization, search trees, and interleaving, and prove trade-offs for them
between work efficiency and runtime improvements. This results in a toolbox for
developing work-efficient parallel fpt algorithms.

Comment: Prior full version of the paper that will appear in Proceedings of
the 13th International Conference and Workshops on Algorithms and Computation
(WALCOM 2019), February 27 - March 02, 2019, Guwahati, India. The final
authenticated version is available online at
https://doi.org/10.1007/978-3-030-10564-8_2
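As one concrete instance of the search-tree method reviewed in the abstract, here is the classic O(2^k) bounded search tree for Vertex Cover. This is a sequential sketch; the paper's concern is how to parallelize such branchings work-efficiently, which this example does not attempt:

```python
def vertex_cover_branch(edges, k):
    """Classic bounded search tree for Vertex Cover: every edge (u, v) must
    have u or v in the cover, so branch on both choices. The recursion depth
    is at most k, giving O(2^k) leaves with linear work per node."""
    if not edges:
        return set()   # all edges covered
    if k == 0:
        return None    # budget exhausted but uncovered edges remain
    u, v = edges[0]
    for chosen in (u, v):
        # Keep only the edges not covered by the chosen endpoint.
        remaining = [(a, b) for (a, b) in edges if chosen not in (a, b)]
        sub = vertex_cover_branch(remaining, k - 1)
        if sub is not None:
            return sub | {chosen}
    return None
```

For a triangle, `vertex_cover_branch([(1, 2), (2, 3), (1, 3)], 1)` correctly fails, while k = 2 finds a cover. The two branches at each node are independent, which is what makes such search trees a natural target for the parallel, work-efficient treatment the paper develops.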
Dynamical Formation Signatures of Black Hole Binaries in the First Detected Mergers by LIGO
The dynamical formation of stellar-mass black hole-black hole binaries has long been a promising source of gravitational waves for the Laser Interferometer Gravitational-Wave Observatory (LIGO). Mass segregation, gravitational focusing, and multibody dynamical interactions naturally increase the interaction rate between the most massive black holes in dense stellar systems, eventually leading them to merge. We find that dynamical interactions, particularly three-body binary formation, enhance the merger rate of black hole binaries with total mass M_tot roughly as ∝ M_tot^β, with β ≳ 4. We find that this relation holds mostly independently of the initial mass function, but the exact value depends on the degree of mass segregation. The detection rate of such massive black hole binaries is only further enhanced by LIGO's greater sensitivity to massive black hole binaries with M_tot ≲ 80 M_⊙. We find that for power-law BH mass functions dN/dM ∝ M^(-α) with α ≤ 2, LIGO is most likely to detect black hole binaries with a mass twice that of the maximum initial black hole mass and a mass ratio near one. Repeated mergers of black holes inside the cluster result in ~5% of mergers being observed with total masses between two and three times the maximum initial black hole mass. Using these relations, one may be able to invert the observed distribution to recover the initial mass function with multiple detections of merging black hole binaries.
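The steep M_tot^β rate scaling can be illustrated with a toy calculation. The masses (in solar masses) and the choice β = 4 below are illustrative values consistent with the abstract's β ≳ 4, not results taken from the paper:

```python
# Toy illustration of the M_tot^beta merger-rate scaling from the abstract.
# The masses (solar masses) and beta = 4 are illustrative values only.
def relative_rate(m1, m2, beta=4.0):
    """Relative dynamical merger-rate weight, proportional to (m1 + m2)**beta."""
    return (m1 + m2) ** beta

# A pair at an assumed 40 Msun maximum initial BH mass outweighs a
# 10 + 10 Msun pair by (80 / 20)**4 = 256 in relative merger rate.
heavy_vs_light = relative_rate(40, 40) / relative_rate(10, 10)
```

The factor of 256 for a 4x mass difference shows why the heaviest pairs dominate the dynamical merger rate even when they are rare in the initial mass function.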