Bipartite entangled stabilizer mutually unbiased bases as maximum cliques of Cayley graphs
We examine the existence and structure of particular sets of mutually
unbiased bases (MUBs) in bipartite qudit systems. In contrast to well-known
power-of-prime MUB constructions, we restrict ourselves to using maximally
entangled stabilizer states as MUB vectors. Consequently, these bipartite
entangled stabilizer MUBs (BES MUBs) provide no local information, but are
sufficient and minimal for decomposing a wide variety of interesting operators
including (mixtures of) Jamiolkowski states, entanglement witnesses and more.
The problem of finding such BES MUBs can be mapped, in a natural way, to that
of finding maximum cliques in a family of Cayley graphs. Some relationships
with known power-of-prime MUB constructions are discussed, and observables for
BES MUBs are given explicitly in terms of Pauli operators.
Comment: 8 pages, 1 figure
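The clique-search formulation can be illustrated on a toy Cayley graph. The group, connection set, and brute-force search below are illustrative stand-ins chosen for readability, not the BES-MUB construction itself; a minimal stdlib sketch:

```python
from itertools import combinations

def cayley_edges(n, connection_set):
    """Edges of the Cayley graph of Z_n with a symmetric
    connection set S: g ~ h iff (g - h) mod n lies in S."""
    return {frozenset((g, (g + s) % n))
            for g in range(n) for s in connection_set}

def max_clique(n, edges):
    """Brute-force maximum-clique search; fine for toy-sized graphs."""
    for size in range(n, 0, -1):
        for cand in combinations(range(n), size):
            if all(frozenset(p) in edges for p in combinations(cand, 2)):
                return set(cand)
    return set()

# Toy example: Z_8 with connection set {+-1, +-2}.
edges = cayley_edges(8, {1, 2, 6, 7})
print(max_clique(8, edges))  # → {0, 1, 2}
```

In the paper's mapping the vertices would be stabilizer states and the connection set would encode unbiasedness, so a maximum clique corresponds to a maximal set of mutually unbiased bases; the exhaustive search above is only practical for small graphs.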
The SIC Question: History and State of Play
Recent years have seen significant advances in the study of symmetric
informationally complete (SIC) quantum measurements, also known as maximal sets
of complex equiangular lines. Previously, the published record contained
solutions up to dimension 67, and was with high confidence complete up through
dimension 50. Computer calculations have now furnished solutions in all
dimensions up to 151, and in several cases beyond that, as large as dimension
844. These new solutions exhibit an additional type of symmetry beyond the
basic definition of a SIC, and so verify a conjecture of Zauner in many new
cases. The solutions in dimensions 68 through 121 were obtained by Andrew
Scott, and his catalogue of distinct solutions is, with high confidence,
complete up to dimension 90. Additional results in dimensions 122 through 151
were calculated by the authors using Scott's code. We recap the history of the
problem, outline how the numerical searches were done, and pose some
conjectures on how the search technique could be improved. In order to
facilitate communication across disciplinary boundaries, we also present a
comprehensive bibliography of SIC research.
Comment: 16 pages, 1 figure, many references; v3: updating bibliography,
dimension eight hundred forty four
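The defining SIC property, d^2 unit vectors with pairwise overlaps |<psi_i|psi_j>|^2 = 1/(d+1), can be verified numerically in the one case where the answer is elementary: in d = 2 a SIC is a regular tetrahedron of states on the Bloch sphere. A minimal stdlib sketch (the helper name is illustrative):

```python
import math, cmath
from itertools import combinations

def bloch_to_state(n):
    """Pure qubit state (c0, c1) with Bloch vector n = (x, y, z)."""
    x, y, z = n
    theta = math.acos(z)
    phi = math.atan2(y, x)
    return (math.cos(theta / 2), cmath.exp(1j * phi) * math.sin(theta / 2))

# Regular tetrahedron on the Bloch sphere -> a SIC in dimension d = 2.
s = 1 / math.sqrt(3)
states = [bloch_to_state(n) for n in
          [(s, s, s), (s, -s, -s), (-s, s, -s), (-s, -s, s)]]

# Equiangularity condition: |<psi_i|psi_j>|^2 = 1/(d + 1) = 1/3.
for a, b in combinations(states, 2):
    overlap = abs(a[0].conjugate() * b[0] + a[1].conjugate() * b[1]) ** 2
    assert abs(overlap - 1 / 3) < 1e-12
print("all pairwise overlaps equal 1/3")
```

This only checks the geometry; the numerical searches discussed in the paper work in much higher dimensions and rely on Weyl-Heisenberg covariance rather than explicit Bloch-sphere coordinates.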
Hybrid LSTM and Encoder-Decoder Architecture for Detection of Image Forgeries
With advanced image journaling tools, one can easily alter the semantic
meaning of an image by exploiting certain manipulation techniques such as
copy-clone, object splicing, and removal, which can mislead viewers. At the
same time, identifying these manipulations is challenging because manipulated
regions are not visually apparent. This paper proposes a
high-confidence manipulation localization architecture which utilizes
resampling features, Long Short-Term Memory (LSTM) cells, and an
encoder-decoder network to segment manipulated regions from non-manipulated ones.
Resampling features are used to capture artifacts like JPEG quality loss,
upsampling, downsampling, rotation, and shearing. The proposed network exploits
larger receptive fields (spatial maps) and frequency domain correlation to
analyze the discriminative characteristics between manipulated and
non-manipulated regions by incorporating the encoder and LSTM networks.
Finally, the decoder network learns the mapping from low-resolution feature
maps to pixel-wise predictions for image tamper localization. With the
predicted mask provided by the final (softmax) layer of the proposed
architecture, end-to-end training is performed to learn the network parameters
through back-propagation using ground-truth masks. Furthermore, a large image
splicing dataset is
introduced to guide the training process. The proposed method is capable of
localizing image manipulations at pixel level with high precision, which is
demonstrated through rigorous experimentation on three diverse datasets.
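The per-pixel softmax and cross-entropy objective described above can be sketched in isolation. The logits and mask below are made-up stand-ins for the network's final-layer output and a ground-truth tamper mask, not values from the paper:

```python
import math

def softmax(logits):
    """Per-pixel softmax over class logits (here: pristine vs. manipulated)."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def pixelwise_cross_entropy(logit_map, mask):
    """Mean cross-entropy between predicted per-pixel class
    probabilities and a binary ground-truth tamper mask."""
    loss = 0.0
    for logits, label in zip(logit_map, mask):
        probs = softmax(logits)
        loss -= math.log(probs[label])
    return loss / len(mask)

# Hypothetical 4-pixel example: logits for [pristine, manipulated].
logit_map = [[2.0, -1.0], [0.5, 0.5], [-1.5, 2.5], [3.0, 0.0]]
mask = [0, 0, 1, 0]  # ground truth: only pixel 2 is manipulated
print(round(pixelwise_cross_entropy(logit_map, mask), 4))  # → 0.2021
```

In the actual architecture this loss would be back-propagated through the decoder, LSTM, and encoder jointly, which is what makes the training end-to-end.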
Minimal average degree aberration and the state polytope for experimental designs
For a particular experimental design, there is interest in finding which
polynomial models can be identified in the usual regression set up. The
algebraic methods based on Groebner bases provide a systematic way of doing
this. The algebraic method does not in general produce all estimable models,
but it can be shown to yield models of minimal average degree, in a
well-defined sense, in both weighted and unweighted versions. This provides
an alternative measure to that based on "aberration" and moreover is applicable
to any experimental design. A simple algorithm is given and bounds are derived
for the criteria, which may be used to give asymptotic Nyquist-like
estimability rates as model and sample sizes increase.
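The estimability question can be illustrated by a direct rank check on the model matrix; note this brute-force check is a stand-in for the Groebner-basis machinery, not the paper's algorithm. For a hypothetical half fraction of the 2x2 factorial design, the saturated models that pass the check with minimal average degree are exactly the ones the algebraic method would single out:

```python
from itertools import combinations, product

def det(m):
    """Determinant by Laplace expansion (fine for tiny matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def model_matrix(design, monomials):
    """Rows index design points (x, y); columns index terms x**a * y**b."""
    return [[x ** a * y ** b for (a, b) in monomials] for (x, y) in design]

# Hypothetical half fraction of the 2x2 factorial design.
design = [(1, 1), (-1, -1)]
candidates = list(product(range(2), repeat=2))  # exponent pairs: 1, y, x, xy

# A saturated model is estimable iff its model matrix is invertible.
estimable = []
for model in combinations(candidates, len(design)):
    if det(model_matrix(design, model)) != 0:
        avg_degree = sum(a + b for a, b in model) / len(model)
        estimable.append((list(model), avg_degree))

for model, deg in sorted(estimable, key=lambda t: t[1]):
    print(model, deg)  # {1, y} and {1, x} attain the minimal average degree 0.5
```

Here the models {1, x} and {1, y} have average degree 0.5 while {x, xy} and {y, xy} have 1.5, matching the intuition that the Groebner approach returns a minimal-average-degree identifiable model rather than all estimable ones.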