A note on the fourth moment of Dirichlet L-functions
We prove an asymptotic formula for the fourth power mean of Dirichlet
L-functions averaged over primitive characters to modulus q and over t\in [0,T]
which is particularly effective when q \ge T. In this range the correct order
of magnitude was not previously known.
Comment: 8 pages
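The quantity studied can be displayed as follows (the notation is an illustrative sketch, not quoted from the paper; the starred sum runs over primitive characters, and \phi^*(q) counts them):

```latex
\sum_{\chi \;(\mathrm{mod}\; q)}^{*} \int_0^T \Big| L\big(\tfrac12 + it,\, \chi\big) \Big|^4 \, dt
\;\asymp\; \phi^*(q)\, T \,(\log qT)^4 ,
```

where the asymptotic formula of the paper gives the main term, of the expected size \phi^*(q)\, T\, (\log qT)^4 up to an explicit arithmetic factor, in the range q \ge T.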
Galois Unitaries, Mutually Unbiased Bases, and MUB-balanced states
A Galois unitary is a generalization of the notion of anti-unitary operators.
They act only on those vectors in Hilbert space whose entries belong to some
chosen number field. For Mutually Unbiased Bases the relevant number field is a
cyclotomic field. By including Galois unitaries we are able to remove a
mismatch between the finite projective group acting on the bases on the one
hand, and the set of those permutations of the bases that can be implemented as
transformations in Hilbert space on the other hand. In particular we show that
there exist transformations that cycle through all the bases in every dimension
which is an odd power of an odd prime. (For even primes unitary MUB-cyclers
exist.) These transformations have eigenvectors, which are MUB-balanced states
(i.e. rotationally symmetric states in the original terminology of Wootters and
Sussman) if and only if d = 3 modulo 4. We conjecture that this construction
yields all such states in odd prime power dimension.
Comment: 32 pages, 2 figures, AMS LaTeX. Version 2: minor improvements plus a few additional references
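The defining action can be sketched as follows (an illustrative formula under assumed notation, not quoted from the abstract): a Galois unitary pairs an ordinary unitary U with a Galois automorphism g of the relevant cyclotomic field, acting on vectors whose components c_i lie in that field:

```latex
U_g\Big( \sum_i c_i \, e_i \Big) \;=\; \sum_i g(c_i)\, U e_i ,
\qquad g \in \mathrm{Gal}\big(\mathbb{Q}(\omega)/\mathbb{Q}\big),
```

with \{e_i\} a fixed basis and \omega a root of unity. When g is complex conjugation this reduces to an ordinary anti-unitary, which is the sense in which Galois unitaries generalize anti-unitaries.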
Cross-Language Learning for Program Classification using Bilateral Tree-Based Convolutional Neural Networks
Towards the vision of translating code that implements an algorithm from one programming language into another, this paper proposes an approach for automated program classification using bilateral tree-based convolutional neural networks (BiTBCNNs). It is layered on top of two tree-based convolutional neural networks (TBCNNs), each of which recognizes the algorithm of code written in an individual programming language. The combination layer of the networks recognizes the similarities and differences among code in different programming languages. The BiTBCNNs are trained using source code in different languages that is known to implement the same algorithms and/or functionalities. For a preliminary evaluation, we use 3591 Java and 3534 C++ code snippets, implementing 6 algorithms, crawled systematically from GitHub. We obtained over 90% accuracy in the cross-language binary classification task of telling whether two given code snippets implement the same algorithm. For the algorithm classification task, i.e., predicting which of the six algorithm labels is implemented by an arbitrary C++ code snippet, we achieved over 80% precision.
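The bilateral architecture above can be sketched minimally as follows. This is a simplified stand-in, not the paper's implementation: the `encode_tree` function mean-pools node-type embeddings where a real TBCNN applies tree-based convolution and dynamic pooling over the AST, the class and method names are hypothetical, and the weights are untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_tree(node_type_ids, emb):
    """Stand-in for a TBCNN encoder: mean-pool embeddings of the AST
    node types. A real TBCNN convolves over the tree structure."""
    return emb[np.asarray(node_type_ids)].mean(axis=0)

class BiTBCNNSketch:
    """Two per-language encoders feeding a shared combination layer
    that scores whether a Java snippet and a C++ snippet implement
    the same algorithm (the binary cross-language task)."""

    def __init__(self, java_vocab, cpp_vocab, dim=16):
        self.emb_java = rng.normal(size=(java_vocab, dim))
        self.emb_cpp = rng.normal(size=(cpp_vocab, dim))
        # combination layer: concatenated encodings -> one logit
        self.w = rng.normal(size=2 * dim) * 0.1

    def same_algorithm_prob(self, java_ids, cpp_ids):
        h_java = encode_tree(java_ids, self.emb_java)
        h_cpp = encode_tree(cpp_ids, self.emb_cpp)
        logit = np.concatenate([h_java, h_cpp]) @ self.w
        return 1.0 / (1.0 + np.exp(-logit))
```

In this design the per-language encoders stay independent, so each can be pretrained on monolingual code, while only the combination layer learns cross-language similarity.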
On the variance of sums of arithmetic functions over primes in short intervals and pair correlation for L-functions in the Selberg class
We establish the equivalence of conjectures concerning the pair correlation
of zeros of L-functions in the Selberg class and the variances of sums of a
related class of arithmetic functions over primes in short intervals. This
extends the results of Goldston & Montgomery [7] and Montgomery & Soundararajan
[11] for the Riemann zeta-function to other L-functions in the Selberg class.
Our approach is based on the statistics of the zeros because the analogue of
the Hardy-Littlewood conjecture for the auto-correlation of the arithmetic
functions we consider is not available in general. One of our main findings is
that the variances of sums of these arithmetic functions over primes in short
intervals have a different form when the degree of the associated L-functions
is 2 or higher to that which holds when the degree is 1 (e.g. the Riemann
zeta-function). Specifically, when the degree is 2 or higher there are two
regimes in which the variances take qualitatively different forms, whilst in
the degree-1 case there is a single regime.
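The degree-1 prototype of such an equivalence is the classical Goldston-Montgomery theorem for the Riemann zeta-function (stated here as an illustrative sketch under assumed normalizations, not quoted from the paper): assuming the Riemann Hypothesis, Montgomery's pair correlation conjecture is equivalent to an asymptotic for the variance of primes in short intervals,

```latex
\int_1^X \big( \psi(x + \delta x) - \psi(x) - \delta x \big)^2 \, dx
\;\sim\; \tfrac{1}{2}\, \delta\, X^2 \log(1/\delta),
```

as \delta \to 0 in a suitable range, where \psi is the Chebyshev function. The abstract's finding is that for degree 2 or higher the analogous variance exhibits two qualitatively different regimes rather than one.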