Algorithms for FFT Beamforming Radio Interferometers
Radio interferometers consisting of identical antennas arranged on a regular
lattice permit fast Fourier transform beamforming, which reduces the
correlation cost from $O(N^2)$ in the number of antennas $N$ to
$O(N \log N)$. We develop a formalism for describing this process and
apply this formalism to derive a number of algorithms with a range of
observational applications. These include algorithms for forming arbitrarily
pointed tied-array beams from the regularly spaced Fourier-transform formed
beams, sculpting the beams to suppress sidelobes while only losing
percent-level sensitivity, and optimally estimating the position of a detected
source from its observed brightness in the set of beams. We also discuss the
effect that correlations in the visibility-space noise, due to cross-talk and
sky contributions, have on the optimality of Fourier transform beamforming,
showing that it does not strictly preserve the sky information of the
correlation, even for an idealized array. Our results have applications to a
number of upcoming interferometers, in particular the Canadian Hydrogen
Intensity Mapping Experiment--Fast Radio Burst (CHIME/FRB) project.
Comment: 17 pages, 4 figures, accepted to Ap
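The cost saving described in this abstract can be illustrated with a minimal sketch: on a regular lattice, forming the full grid of beams is just a 2D FFT over the antenna axes (the array size, random voltage samples, and time averaging below are hypothetical stand-ins, not the CHIME/FRB pipeline):

```python
import numpy as np

# Hypothetical example: a 16x16 regular antenna lattice with complex
# voltage samples signals[t, y, x] (values here are random stand-ins).
rng = np.random.default_rng(0)
n_side = 16
shape = (64, n_side, n_side)
signals = rng.normal(size=shape) + 1j * rng.normal(size=shape)

# A 2D FFT over the two lattice axes forms a regular grid of tied-array
# beams in O(N log N) operations per time sample (N = number of
# antennas), versus O(N^2) for explicit pairwise correlation.
beams = np.fft.fft2(signals, axes=(1, 2))

# Time-averaged intensity in each FFT-formed beam.
intensity = np.mean(np.abs(beams) ** 2, axis=0)
print(intensity.shape)  # (16, 16): one value per grid beam
```

The FFT-formed beams land on a fixed grid of pointings; the paper's algorithms for arbitrarily pointed tied-array beams and sidelobe sculpting operate on top of this regularly spaced set.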
Nearly Optimal Private Convolution
We study computing the convolution of a private input $x$ with a public
input $h$, while satisfying the guarantees of $(\epsilon, \delta)$-differential
privacy. Convolution is a fundamental operation, intimately related to Fourier
Transforms. In our setting, the private input may represent a time series of
sensitive events or a histogram of a database of confidential personal
information. Convolution then captures important primitives including linear
filtering, which is an essential tool in time series analysis, and aggregation
queries on projections of the data.
We give a nearly optimal algorithm for computing convolutions while
satisfying $(\epsilon, \delta)$-differential privacy. Surprisingly, we follow
the simple strategy of adding independent Laplacian noise to each Fourier
coefficient and bounding the privacy loss using the composition theorem of
Dwork, Rothblum, and Vadhan. We derive a closed form expression for the optimal
noise to add to each Fourier coefficient using convex programming duality. Our
algorithm is very efficient -- it is essentially no more computationally
expensive than a Fast Fourier Transform.
To prove near optimality, we use the recent discrepancy lower bounds of
Muthukrishnan and Nikolov and derive a spectral lower bound using a
characterization of discrepancy in terms of determinants.
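The mechanism sketched in this abstract is simple to state in code: noise the Fourier coefficients of the private input, then convolve in the Fourier domain. In the sketch below, `scales` is a caller-supplied stand-in for the per-coefficient optimal noise magnitudes the paper derives via convex programming duality:

```python
import numpy as np

def private_convolution(x, h, scales, rng=None):
    """Sketch of the Fourier-domain mechanism: convolve a private input
    x with a public input h, adding independent Laplace noise to each
    Fourier coefficient of x. `scales` is a hypothetical stand-in for
    the paper's optimal per-coefficient noise magnitudes."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(x)
    per_coef = np.broadcast_to(np.asarray(scales, dtype=float), n)
    X = np.fft.fft(x)
    # Independent Laplace noise on the real and imaginary parts of each
    # Fourier coefficient of the private input.
    noise = rng.laplace(scale=per_coef) + 1j * rng.laplace(scale=per_coef)
    H = np.fft.fft(h, n)
    # Multiply in the Fourier domain and invert: circular convolution,
    # essentially no more expensive than the FFTs themselves.
    return np.real(np.fft.ifft((X + noise) * H))
```

As `scales` goes to zero the output reduces to the exact circular convolution, which makes the accuracy side of the trade-off easy to probe empirically.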
Compressive Mining: Fast and Optimal Data Mining in the Compressed Domain
Real-world data typically contain repeated and periodic patterns. This
suggests that they can be effectively represented and compressed using only a
few coefficients of an appropriate basis (e.g., Fourier, Wavelets, etc.).
However, distance estimation when the data are represented using different sets
of coefficients is still a largely unexplored area. This work studies the
optimization problems related to obtaining the \emph{tightest} lower/upper
bound on Euclidean distances when each data object is potentially compressed
using a different set of orthonormal coefficients. Our technique leads to
tighter distance estimates, which translates into more accurate search,
learning and mining operations \textit{directly} in the compressed domain.
We formulate the problem of estimating lower/upper distance bounds as an
optimization problem. We establish the properties of optimal solutions, and
leverage the theoretical analysis to develop a fast algorithm to obtain an
\emph{exact} solution to the problem. The suggested solution provides the
tightest estimation of the $\ell_2$-norm or the correlation. We show that typical
data-analysis operations, such as k-NN search or k-Means clustering, can
operate more accurately using the proposed compression and distance
reconstruction technique. We compare it with many other prevalent compression
and reconstruction techniques, including random projections and PCA-based
techniques. We highlight a surprising result, namely that when the data are
highly sparse in some basis, our technique may even outperform PCA-based
compression.
The contributions of this work are generic as our methodology is applicable
to any sequential or high-dimensional data as well as to any orthogonal data
transformation used for the underlying data compression scheme.
Comment: 25 pages, 20 figures, accepted in VLD
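A deliberately simplified version of the problem this abstract studies can be sketched as follows. Each object keeps its top-k orthonormal coefficients plus the residual energy of the discarded ones; shared coefficients contribute exact distance terms, and the rest are bounded by their norms. The function names are ours, and these bounds are looser than the paper's exact optimal solution:

```python
import numpy as np

def compress(x, k):
    """Keep the k largest-magnitude coefficients of an orthonormal
    transform (here the orthonormal DFT) plus the residual energy of
    the discarded coefficients."""
    c = np.fft.fft(x, norm="ortho")
    keep = np.argsort(np.abs(c))[-k:]
    kept = {int(i): c[i] for i in keep}
    resid = float(np.sum(np.abs(c) ** 2) - sum(abs(v) ** 2 for v in kept.values()))
    return kept, max(resid, 0.0)

def distance_bounds(cx, ex, cy, ey):
    """Simple lower/upper bounds on the Euclidean distance when each
    object kept a (possibly different) set of coefficients. On shared
    coefficients the contribution is exact; everything else is bounded
    via the norms of the two leftover pieces."""
    common = cx.keys() & cy.keys()
    d2 = sum(abs(cx[i] - cy[i]) ** 2 for i in common)
    # Exact norms of the parts of each object outside the shared set.
    rx = np.sqrt(sum(abs(v) ** 2 for i, v in cx.items() if i not in common) + ex)
    ry = np.sqrt(sum(abs(v) ** 2 for i, v in cy.items() if i not in common) + ey)
    lower = np.sqrt(d2 + (rx - ry) ** 2)
    upper = np.sqrt(d2 + (rx + ry) ** 2)
    return lower, upper
```

Because the transform is orthonormal, the true distance in coefficient space equals the Euclidean distance between the raw objects, so the bounds bracket the quantity that k-NN search or k-Means clustering actually needs.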
Optimal photonic indistinguishability tests in multimode networks
Particle indistinguishability is at the heart of quantum statistics that
regulates fundamental phenomena such as the electronic band structure of
solids, Bose-Einstein condensation and superconductivity. Moreover, it is
necessary in practical applications such as linear optical quantum computation
and simulation, in particular for Boson Sampling devices. It is thus crucial to
develop tools to certify genuine multiphoton interference between multiple
sources. Here we show that so-called Sylvester interferometers are near-optimal
for the task of discriminating the behaviors of distinguishable and
indistinguishable photons. We report the first implementations of integrated
Sylvester interferometers with 4 and 8 modes with an efficient, scalable and
reliable 3D-architecture. We perform two-photon interference experiments
capable of identifying indistinguishable photon behaviour with a Bayesian
approach using very small data sets. Furthermore, we experimentally employ
this new device for the assessment of scattershot Boson Sampling. These results open
the way to the application of Sylvester interferometers for the optimal
assessment of multiphoton interference experiments.
Comment: 9+10 pages, 6+6 figures, added supplementary material, completed and
updated bibliography
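The discriminating power of Sylvester interferometers comes from strict two-photon suppression laws, which a minimal sketch can demonstrate. The function names are ours; the unitary is the standard Sylvester (normalized Hadamard) construction, and the permanent formulas are the usual two-photon special case:

```python
import numpy as np

def sylvester(n_modes):
    """Sylvester interferometer unitary for n_modes = 2**k, built as
    repeated Kronecker products of the normalized 2x2 Hadamard."""
    h = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    u = np.array([[1.0]])
    while u.shape[0] < n_modes:
        u = np.kron(u, h)
    return u

def two_photon_coincidence(u, in_modes, out_modes, indistinguishable=True):
    """Probability of one photon in each of out_modes given one photon
    in each of in_modes. Indistinguishable photons: squared modulus of
    the permanent of the 2x2 submatrix; distinguishable photons:
    permanent of the element-wise transition probabilities."""
    a = u[np.ix_(out_modes, in_modes)]
    if indistinguishable:
        perm = a[0, 0] * a[1, 1] + a[0, 1] * a[1, 0]
        return abs(perm) ** 2
    p = np.abs(a) ** 2
    return p[0, 0] * p[1, 1] + p[0, 1] * p[1, 0]
```

For the 4-mode Sylvester matrix with photons entering modes 0 and 1, the coincidence in output modes 0 and 1 is exactly zero for indistinguishable photons but nonzero for distinguishable ones, which is the kind of sharply separated behavior the Bayesian test exploits.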