Note on Ward-Horadam H(x) - binomials' recurrences and related interpretations, II
We deliver here a second new recurrence formula,
where the array is appointed by a sequence of
functions which, in the predominantly considered cases, were chosen to be
polynomials. Secondly, we supply a review of selected related combinatorial
interpretations of generalized binomial coefficients. We then also propose a
kind of transfer of the interpretation of coefficients onto
coefficients' interpretations, thus bringing us back to
and Donald Ervin Knuth's relevant investigations of decades
ago. Comment: 57 pages, 8 figures
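The notion of generalized binomial coefficients mentioned above can be illustrated with a hedged, generic sketch (not the paper's H(x)-array, whose defining sequence is not reproduced here): for any sequence a_1, a_2, ... of nonzero numbers one sets C_a(n, k) = (a_n a_{n-1} ... a_{n-k+1}) / (a_k ... a_1). Taking a_n to be the Fibonacci numbers yields the classical "Fibonomial" coefficients, which are always integers.

```python
# Generalized binomial coefficients for an arbitrary sequence; the
# Fibonacci case below gives the Fibonomial coefficients.  This is a
# generic illustration, not the specific H(x)-binomials of the paper.
from fractions import Fraction

def generalized_binomial(n, k, a):
    """C_a(n, k) for a sequence a with a[1], a[2], ... defined (a[0] unused)."""
    result = Fraction(1)
    for i in range(k):
        result *= Fraction(a[n - i], a[i + 1])
    return result

# Fibonacci numbers F_0..F_8 = 0, 1, 1, 2, 3, 5, 8, 13, 21
F = [0, 1]
for _ in range(7):
    F.append(F[-1] + F[-2])

print([generalized_binomial(4, 2, F), generalized_binomial(5, 2, F)])
# Fibonomials: C_F(4,2) = (3*2)/(1*1) = 6, C_F(5,2) = (5*3)/(1*1) = 15
```

That these ratios of products always come out integral is exactly the kind of fact the combinatorial interpretations surveyed in the paper explain.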
Knots, BPS states, and algebraic curves
We analyze relations between BPS degeneracies related to
Labastida-Marino-Ooguri-Vafa (LMOV) invariants, and algebraic curves associated
to knots. We introduce a new class of such curves that we call extremal
A-polynomials, discuss their special properties, and determine exact and
asymptotic formulas for the corresponding (extremal) BPS degeneracies. These
formulas lead to nontrivial integrality statements in number theory, as well as
to an improved integrality conjecture stronger than the known M-theory
integrality predictions. Furthermore we determine the BPS degeneracies encoded
in augmentation polynomials and show their consistency with known colored
HOMFLY polynomials. Finally we consider refined BPS degeneracies for knots,
determine them from the knowledge of super-A-polynomials, and verify their
integrality. We illustrate our results with twist knots, torus knots, and
various other knots with up to 10 crossings. Comment: 43 pages, 6 figures
Interferometry with independent Bose-Einstein condensates: parity as an EPR/Bell quantum variable
When independent Bose-Einstein condensates (BEC), described quantum
mechanically by Fock (number) states, are sent into interferometers, the
measurement of the output port at which the particles are detected provides a
binary measurement, with two possible results. With two interferometers
and two BEC's, the parity (product of all results obtained at each
interferometer) has all the features of an Einstein-Podolsky-Rosen quantity,
with perfect correlations predicted by quantum mechanics when the settings
(phase shifts of the interferometers) are the same. When they are different,
significant violations of Bell inequalities are obtained. These violations do
not tend to zero when the number of particles increases, and can therefore
be obtained with arbitrarily large systems, but a condition is that all
particles should be detected. We discuss the general experimental requirements
for observing such effects, the necessary detection of all particles in
correlation, the role of the pixels of the CCD detectors, and that of the
alignments of the interferometers in terms of matching of the wave fronts of
the sources in the detection regions. Another scheme involving three
interferometers and three BEC's is discussed; it leads to Greenberger-Horne-Zeilinger
(GHZ) sign contradictions, as in the usual GHZ case with three
particles, but for an arbitrarily large number of them. Finally,
generalizations of the Hardy impossibilities to an arbitrarily large number of
particles are introduced. BEC's provide a large versatility for observing
violations of local realism in a variety of experimental arrangements. Comment: appendix added
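The Bell-inequality violations discussed above can be made concrete with a hedged numerical illustration (a generic two-qubit CHSH computation, not the BEC parity scheme of the paper): for measurements with outcomes ±1 on a singlet state, the quantum correlation at settings a and b is E(a, b) = -cos(a - b), and the standard choice of angles gives |S| = 2√2, beyond the bound of 2 satisfied by any local realist model.

```python
# Generic CHSH illustration (assumed singlet-state correlations, not the
# paper's BEC parity observable): quantum mechanics predicts
# E(a, b) = -cos(a - b) for +/-1-valued outcomes at angles a and b.
from math import cos, pi, sqrt

def E(a, b):
    """Quantum correlation of the two +/-1 outcomes for settings a, b."""
    return -cos(a - b)

a, ap = 0.0, pi / 2          # first observer's two settings
b, bp = pi / 4, 3 * pi / 4   # second observer's two settings

# CHSH combination; local realism requires |S| <= 2
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828 > 2
```

The abstract's point is that an analogous violation survives for arbitrarily many particles when the parity at each interferometer plays the role of the ±1 outcome.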
Q(sqrt(-3))-Integral Points on a Mordell Curve
We use an extension of quadratic Chabauty to number fields, recently developed by the author with Balakrishnan, Besser and Müller, combined with a sieving technique, to determine the integral points over Q(√−3) on the Mordell curve y^2 = x^3 - 4.
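For orientation, the ordinary rational-integer points on this curve can be found by a naive search (a hedged sketch only; it does not reproduce the paper's quadratic Chabauty computation over Q(√−3), which requires genuinely number-field methods):

```python
# Brute-force search for integer points on the Mordell curve
# y^2 = x^3 - 4 with |x| bounded; illustrative only, since the paper
# determines the full set of Q(sqrt(-3))-integral points.
from math import isqrt

def integer_points(bound):
    """Return all (x, y) in Z^2 with |x| <= bound and y^2 = x^3 - 4."""
    points = []
    for x in range(-bound, bound + 1):
        rhs = x**3 - 4
        if rhs < 0:
            continue  # no real solution, hence no integer one
        y = isqrt(rhs)
        if y * y == rhs:
            points.append((x, y))
            if y != 0:
                points.append((x, -y))
    return points

print(sorted(integer_points(1000)))
# [(2, -2), (2, 2), (5, -11), (5, 11)]
```

The points (2, ±2) and (5, ±11) are the classical integral solutions over Z; the paper's contribution is the determination of the larger set of integral points over the quadratic field.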
Computer Science for Continuous Data: Survey, Vision, Theory, and Practice of a Computer Analysis System
Building on George Boole's work, Logic provides a rigorous foundation for the powerful tools in Computer Science that underlie the nowadays ubiquitous processing of discrete data, such as strings or graphs. Concerning continuous data, already Alan Turing had applied "his" machines to formalize and study the processing of real numbers: an aspect of his oeuvre that we transform from theory to practice. The present essay surveys the state of the art and envisions the future of Computer Science for continuous data: natively, beyond brute-force discretization, based on and guided by and extending classical discrete Computer Science, as a bridge between Pure and Applied Mathematics.
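The idea of processing real numbers natively, beyond a fixed discretization, can be sketched in a hedged way (a generic exact-real-arithmetic toy, not the computer analysis system the essay surveys): a computable real is represented by a procedure that delivers a rational approximation to any requested accuracy.

```python
# Toy exact-real computation: approximate sqrt(2) to any accuracy eps
# by interval bisection over exact rationals.  A generic illustration of
# computing with continuous data, not the essay's actual system.
from fractions import Fraction

def sqrt2(eps):
    """Return a rational q with q <= sqrt(2) < q + eps."""
    lo, hi = Fraction(1), Fraction(2)  # sqrt(2) lies in [1, 2]
    while hi - lo >= eps:
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo

q = sqrt2(Fraction(1, 10**6))
print(float(q))  # 1.4142...
```

The accuracy parameter, not a fixed word length, controls the precision: this is the Turing-style view of real-number computation that the essay builds on.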