Static replica approach to critical correlations in glassy systems
We discuss the slow relaxation phenomenon in glassy systems by means of
replicas, constructing a static field-theory approach to the problem. At the
mean field level we study how criticality in the four point correlation
functions arises because of the presence of soft modes and we derive an
effective replica field theory for these critical fluctuations. By using this
at the Gaussian level we obtain many physical quantities: the correlation
length, the exponent parameter that controls the Mode-Coupling dynamical
exponents for the two-point correlation functions, and the prefactor of the
critical part of the four point correlation functions. Moreover we perform a
one-loop computation in order to identify the region in which the mean field
Gaussian approximation is valid. The result is a Ginzburg criterion for the
glass transition. We define and compute in this way a proper Ginzburg number.
Finally, we present numerical values of all these quantities obtained from the
Hypernetted Chain approximation for the replicated liquid theory.
Comment: 34 pages, 1 figure; to be published in J. Chem. Phys. for a special
issue on the Glass Transition
Regular Combinators for String Transformations
We focus on (partial) functions that map input strings to a monoid such as
the set of integers with addition and the set of output strings with
concatenation. The notion of regularity for such functions has been defined
using two-way finite-state transducers, (one-way) cost register automata, and
MSO-definable graph transformations. In this paper, we give an algebraic and
machine-independent characterization of this class analogous to the definition
of regular languages by regular expressions. When the monoid is commutative, we
prove that every regular function can be constructed from constant functions
using the combinators of choice, split sum, and iterated sum, that are analogs
of union, concatenation, and Kleene-*, respectively, but enforce unique (or
unambiguous) parsing. Our main result is for the general case of
non-commutative monoids, which is of particular interest for capturing regular
string-to-string transformations for document processing. We prove that the
following additional combinators suffice for constructing all regular
functions: (1) the left-additive versions of split sum and iterated sum, which
allow transformations such as string reversal; (2) sum of functions, which
allows transformations such as copying of strings; and (3) function
composition, or alternatively, a new concept of chained sum, which allows
output values from adjacent blocks to mix.
Comment: This is the full version, with omitted proofs and constructions, of
the conference paper currently in submission
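The combinators named in the abstract can be illustrated with a toy sketch. Here a "regular function" is modeled as a Python partial map from strings to a monoid (returning None where undefined), and choice, split sum, and iterated sum enforce unambiguous parsing by rejecting or refusing multiple parses. All names and the brute-force factorization search are ours, purely for illustration; they are not the paper's constructions. Note how the left-additive iterated sum of the single-character identity yields string reversal, as the abstract indicates.

```python
# Toy model: a partial function str -> monoid value, encoded as a Python
# callable returning None where undefined. `add` is the monoid operation.
# Illustrative sketch only; not the paper's machine-independent definitions.

def choice(f, g):
    # union of two functions; parsing must be unambiguous (disjoint domains)
    def h(w):
        a, b = f(w), g(w)
        if a is not None and b is not None:
            raise ValueError("ambiguous choice")
        return a if a is not None else b
    return h

def split_sum(f, g, add):
    # w = u v with f(u) and g(v) defined; the split must be unique
    def h(w):
        results = [add(f(w[:i]), g(w[i:]))
                   for i in range(len(w) + 1)
                   if f(w[:i]) is not None and g(w[i:]) is not None]
        if len(results) > 1:
            raise ValueError("ambiguous split")
        return results[0] if results else None
    return h

def iterated_sum(f, add, unit, left=False):
    # w = u1 u2 ... uk with each ui in dom(f); sum the outputs,
    # in reverse order when left=True (the left-additive version)
    def factorizations(w):
        if w == "":
            yield []
            return
        for i in range(1, len(w) + 1):
            if f(w[:i]) is not None:
                for rest in factorizations(w[i:]):
                    yield [w[:i]] + rest
    def h(w):
        facts = list(factorizations(w))
        if len(facts) != 1:
            return None  # undefined unless the parse is unique
        blocks = [f(u) for u in facts[0]]
        if left:
            blocks.reverse()
        out = unit
        for b in blocks:
            out = add(out, b)
        return out
    return h

# Output monoid: strings under concatenation (non-commutative).
char = lambda w: w if len(w) == 1 else None       # identity on single chars
copy_id = iterated_sum(char, str.__add__, "")     # the identity function
reverse = iterated_sum(char, str.__add__, "", left=True)  # string reversal

print(copy_id("abc"))   # abc
print(reverse("abc"))   # cba
```

With a commutative monoid such as integers under addition, the left-additive and ordinary versions coincide, which matches the abstract's split between the commutative and general cases.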
Discrete phase space based on finite fields
The original Wigner function provides a way of representing in phase space
the quantum states of systems with continuous degrees of freedom. Wigner
functions have also been developed for discrete quantum systems, one popular
version being defined on a 2N x 2N discrete phase space for a system with N
orthogonal states. Here we investigate an alternative class of discrete Wigner
functions, in which the field of real numbers that labels the axes of
continuous phase space is replaced by a finite field having N elements. There
exists such a field if and only if N is a power of a prime; so our formulation
can be applied directly only to systems for which the state-space dimension
takes such a value. Though this condition may seem limiting, we note that any
quantum computer based on qubits meets the condition and can thus be
accommodated within our scheme. The geometry of our N x N phase space also
leads naturally to a method of constructing a complete set of N+1 mutually
unbiased bases for the state space.
Comment: 60 pages; minor corrections and additional references in v2 and v3;
improved historical introduction in v4; references to quantum error
correction in v5; v6 corrects the value quoted for the number of similarity
classes for N=
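For a prime dimension p the finite field is simply Z_p, and the N+1 mutually unbiased bases mentioned in the abstract can be written down explicitly. The sketch below uses the standard construction for an odd prime (the computational basis plus p bases with components proportional to powers of a p-th root of unity) and verifies unbiasedness numerically for p = 3; the quadratic-phase formula is the textbook one, not necessarily the paper's phase-space derivation, and it needs modification for p = 2.

```python
import cmath

# Mutually unbiased bases in an odd prime dimension p (finite field Z_p).
# Bases: the computational basis plus B_k, k = 0..p-1, with vectors
#   v_{k,m}[j] = omega^(k*j^2 + m*j) / sqrt(p),   omega = e^(2*pi*i/p).
# Unbiasedness across distinct bases follows from quadratic Gauss sums.

p = 3
omega = cmath.exp(2j * cmath.pi / p)

def vec(k, m):
    return [omega ** (k * j * j + m * j) / p ** 0.5 for j in range(p)]

comp = [[1.0 if i == j else 0.0 for j in range(p)] for i in range(p)]
bases = [comp] + [[vec(k, m) for m in range(p)] for k in range(p)]

def inner(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

# Check: each basis is orthonormal, and any two vectors drawn from
# different bases satisfy |<a|b>|^2 = 1/p.
for bi, B in enumerate(bases):
    for bj, C in enumerate(bases):
        for a in B:
            for b in C:
                ov = abs(inner(a, b)) ** 2
                if bi == bj:
                    assert abs(ov - (1.0 if a is b else 0.0)) < 1e-9
                else:
                    assert abs(ov - 1 / p) < 1e-9
print(f"{len(bases)} mutually unbiased bases in dimension {p}")  # 4 bases
```

For prime powers N = p^n the same idea goes through with arithmetic in the finite field GF(p^n) in place of Z_p, which is exactly the setting the abstract describes.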
Positive maps, majorization, entropic inequalities, and detection of entanglement
In this paper, we discuss some general connections between the notions of
positive map, weak majorization and entropic inequalities in the context of
detection of entanglement among bipartite quantum systems. First, based on the
fact that any positive map can be written as the difference of two completely
positive maps, we propose a possible way to generalize the
Nielsen-Kempe majorization criterion. Then we present two methods of
derivation of some general classes of entropic inequalities useful for the
detection of entanglement. While the first one follows from the aforementioned
generalized majorization relation and the concept of Schur-concave
decreasing functions, the second is based on some functional inequalities. What
is important is that, contrary to the Nielsen-Kempe majorization criterion and
entropic inequalities, our criteria allow for the detection of entangled states
with positive partial transposition when using indecomposable positive maps. We
also point out that if a state with at least one maximally mixed subsystem is
detected by some necessary criterion based on the positive map \Lambda, then
there exist entropic inequalities derived from \Lambda (by both procedures)
that also detect this state. In this sense, they are equivalent to the
necessary criterion [I\otimes\Lambda](\varrho_{AB})\geq 0. Moreover, our
inequalities provide a way of constructing multi-copy entanglement witnesses
and therefore are promising from the experimental point of view. Finally, we
discuss some of the derived inequalities in the context of the recently
introduced protocol of state merging and the possibility of approximating the
mean value of a linear entanglement witness.
Comment: the published version, 25 pages in NJP format, 6 figures
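The decomposition fact the abstract builds on, that any positive map is a difference of two completely positive maps, can be checked concretely for the transpose map on a qubit. Its Choi matrix is the SWAP operator, whose positive and negative parts are the symmetric and antisymmetric projectors (I ± SWAP)/2; both are positive semidefinite, so they are Choi matrices of completely positive maps whose difference is the transpose. The helper names and the sample input below are ours, for illustration only.

```python
# Numerical sketch: transpose map T = CP1 - CP2 on one qubit.
# Choi matrix of T is SWAP; split it as (I + SWAP)/2 - (I - SWAP)/2,
# a difference of two PSD matrices, i.e. of two completely positive maps.

def mat(rows): return [list(r) for r in rows]

def add(A, B, s=1):
    return [[a + s * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

I4 = [[float(i == j) for j in range(4)] for i in range(4)]
# SWAP on C^2 (x) C^2 in the basis |00>, |01>, |10>, |11>
SWAP = mat([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])

P_sym = [[x / 2 for x in row] for row in add(I4, SWAP)]       # +1 eigenprojector
P_anti = [[x / 2 for x in row] for row in add(I4, SWAP, -1)]  # -1 eigenprojector

# Both parts are symmetric projectors (P^2 = P), hence PSD:
# valid Choi matrices of completely positive maps.
assert mul(P_sym, P_sym) == P_sym and mul(P_anti, P_anti) == P_anti
# Their difference recovers the Choi matrix of the transpose map.
assert add(P_sym, P_anti, -1) == SWAP

def apply_choi(J, X):
    # Lambda(X)[k][l] = sum_{i,j} X[i][j] * J[2*i + k][2*j + l]
    return [[sum(X[i][j] * J[2 * i + k][2 * j + l]
                 for i in range(2) for j in range(2))
             for l in range(2)] for k in range(2)]

X = mat([[1, 2], [3, 4]])
T1, T2 = apply_choi(P_sym, X), apply_choi(P_anti, X)
diff = add(T1, T2, -1)
assert diff == [[1, 3], [2, 4]]  # the transpose of X, as claimed
print("transpose = CP1 - CP2 verified on a sample input")
```

The transpose is the canonical positive-but-not-completely-positive map behind partial-transposition tests, which makes it a natural worked example for the decomposition used in the abstract.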
Black-Box Uselessness: Composing Separations in Cryptography
Black-box separations have been successfully used to identify the limits of a powerful set of tools in cryptography, namely those of black-box reductions. They allow proving that a large set of techniques are not capable of basing one primitive P on another primitive Q. Such separations, however, do not say anything about the power of the combination of primitives Q1, Q2 for constructing P, even if P cannot be based on Q1 or Q2 alone.
By introducing and formalizing the notion of black-box uselessness, we develop a framework that allows us to make such conclusions. At an informal level, we call a primitive Q black-box useless (BBU) for a primitive P if Q cannot help constructing P in a black-box way, even in the presence of another primitive Z. This is formalized by saying that Q is BBU for P if, for any auxiliary primitive Z, whenever there exists a black-box construction of P from (Q, Z), there must already also exist a black-box construction of P from Z alone. We also formalize various other notions of black-box uselessness, and consider in particular the setting of efficient black-box constructions in which the number of queries to Q is below a threshold.
Impagliazzo and Rudich (STOC '89) initiated the study of black-box separations by separating key agreement from one-way functions (OWFs). We prove a number of initial results in this direction, which indicate that one-way functions are perhaps also black-box useless for key agreement. In particular, we show that OWFs are black-box useless in any construction of key agreement in either of the following settings: (1) the key agreement has perfect correctness and one of the parties calls the OWF a constant number of times; (2) the key agreement consists of a single round of interaction (as in Merkle-type protocols). We conjecture that OWFs are indeed black-box useless for general key agreement.
We also show that certain techniques for proving black-box separations can be lifted to the uselessness regime. In particular, we show that the lower bounds of Canetti, Kalai, and Paneth (TCC '15) as well as Garg, Mahmoody, and Mohammed (Crypto '17 & TCC '17) for assumptions behind indistinguishability obfuscation (IO) can be extended to derive black-box uselessness of a variety of primitives for obtaining (approximately correct) IO. These results follow the so-called "compiling out" technique, which we prove to imply black-box uselessness.
Eventually, we study the complementary landscape of black-box uselessness, namely black-box helpfulness. We put forth the conjecture that one-way functions are black-box helpful for building collision-resistant hash functions. We define two natural relaxations of this conjecture, and prove that both of these conjectures are implied by a natural conjecture regarding random permutations equipped with a collision finder oracle, as defined by Simon (Eurocrypt '98). This conjecture may also be of interest in other contexts, such as amplification of hardness.