14 research outputs found
Small Pseudo-Random Families of Matrices: Derandomizing Approximate Quantum Encryption
A quantum encryption scheme (also called private quantum channel, or state
randomization protocol) is a one-time pad for quantum messages. If two parties
share a classical random string, one of them can transmit a quantum state to
the other so that an eavesdropper gets little or no information about the state
being transmitted. Perfect encryption schemes leak no information at all about
the message. Approximate encryption schemes leak a non-zero (though small)
amount of information but require a shorter shared random key. Approximate
schemes with short keys have been shown to have a number of applications in
quantum cryptography and information theory.
This paper provides the first deterministic, polynomial-time constructions of
quantum approximate encryption schemes with short keys. Previous constructions
(quant-ph/0307104) are probabilistic--that is, they show that if the operators
used for encryption are chosen at random, then with high probability the
resulting protocol will be a secure encryption scheme. Moreover, the resulting
protocol descriptions are exponentially long. Our protocols use keys of the
same length as (or shorter than) those of the probabilistic constructions: to
encrypt n qubits approximately, one needs n + o(n) bits of shared key.
An additional contribution of this paper is a connection between classical
combinatorial derandomization and constructions of pseudo-random matrix
families in a continuous space.
Comment: 11 pages, no figures. In Proceedings of RANDOM 2004, Cambridge, MA, August 2004.
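As background for the one-time-pad analogy, here is a minimal numerical sketch of the perfect single-qubit scheme (the standard Pauli one-time pad, not this paper's derandomized construction): averaging over the four equally likely Pauli encryptions leaves an eavesdropper with the maximally mixed state, independent of the message.

```python
import numpy as np

# Single-qubit Pauli one-time pad: a 2-bit key selects one of I, X, Y, Z.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
Y = 1j * X @ Z

psi = np.array([[1], [1j]], dtype=complex) / np.sqrt(2)  # arbitrary pure state
rho = psi @ psi.conj().T

# Without the key, the eavesdropper sees the average over the four equally
# likely encryptions P rho P^dagger -- the maximally mixed state I/2.
avg = sum(P @ rho @ P.conj().T for P in (I2, X, Y, Z)) / 4
print(np.allclose(avg, I2 / 2))  # True: nothing about psi leaks
```

The approximate schemes in the paper shrink the key below 2 bits per qubit by replacing the full Pauli family with a much smaller, pseudo-random family of operators.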
Better lossless condensers through derandomized curve samplers
Lossless condensers are unbalanced expander graphs with expansion close to optimal. Equivalently, they may be viewed as functions that use a short random seed to map a source on n bits to a source on many fewer bits while preserving all of the min-entropy. The work of M. Capalbo et al. (2002) shows how to build lossless condensers when the graphs are slightly unbalanced. The highly unbalanced case is also important, but the only known construction does not condense the source well. We give explicit constructions of lossless condensers that condense close to optimally and use near-optimal seed length.
Our main technical contribution is a randomness-efficient method for sampling F^D (where F is a field) with low-degree curves. This problem was addressed before in the works of E. Ben-Sasson et al. (2003) and D. Moshkovitz and R. Raz (2006), but those solutions apply only to degree-one curves, i.e., lines. Our technique is new and elegant: we use sub-sampling and obtain our curve samplers by composing a sequence of low-degree manifolds, starting with high-dimension, low-degree manifolds and proceeding through lower and lower dimension manifolds with (moderately) growing degrees, until we finish with dimension-one, low-degree manifolds, i.e., curves. The technique may be of independent interest.
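A toy illustration of what "lossless" means, using a made-up source and map (not the paper's construction): a condenser output is shorter than its input, and it is lossless on a source if it preserves the source's min-entropy, which any map that is injective on the source's support does exactly.

```python
import math

def min_entropy(dist):
    # Min-entropy of a distribution given as {outcome: probability}.
    return -math.log2(max(dist.values()))

# Hypothetical source on 4 bits, uniform over 4 strings: min-entropy 2.
src = {0b0000: 0.25, 0b0011: 0.25, 0b1100: 0.25, 0b1111: 0.25}

def condense(x):
    # Made-up 4-bit -> 2-bit map; it happens to be injective on src's support.
    return ((x >> 3) << 1) | (x & 1)

out = {}
for x, prob in src.items():
    y = condense(x)
    out[y] = out.get(y, 0.0) + prob

print(min_entropy(src), min_entropy(out))  # 2.0 2.0 -- no min-entropy lost
```

The hard part, which the paper addresses, is achieving this with a short seed for every source of sufficient min-entropy, not just one known source as above.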
Randomness-efficient Low Degree Tests and Short PCPs via Epsilon-Biased Sets
We present the first explicit construction of Probabilistically Checkable Proofs (PCPs) and Locally Testable Codes (LTCs) of fixed constant query complexity which have almost-linear (= n^{1+o(1)}) size. Such objects were recently shown to exist (non-constructively) by Goldreich and Sudan [17]. Previous explicit constructions required size n^{1+Ω(ε)} with 1/ε queries.
Randomness-Efficient Curve Samplers
Curve samplers are sampling algorithms that proceed by viewing the domain as a vector space over a finite field, and randomly picking a low-degree curve in it as the sample. Curve samplers exhibit a nice property besides the sampling property: the restriction of low-degree polynomials over the domain to the sampled curve is still low-degree. This property is often used in combination with the sampling property and has found many applications, including PCP constructions, local decoding of codes, and algebraic PRG constructions.
The randomness complexity of curve samplers is a crucial parameter for their applications. It is known that (non-explicit) curve samplers using O(log N + log(1/δ)) random bits exist, where N is the domain size and δ is the confidence error. The question of explicitly constructing randomness-efficient curve samplers was first raised in [TSU06], where curve samplers with near-optimal randomness complexity were obtained.
We present an explicit construction of low-degree curve samplers with optimal randomness complexity (up to a constant factor), sampling curves of degree (m log_q (1/δ))^(O(1)) in F^m_q. Our construction is a delicate combination of several components, including extractor machinery, limited independence, iterated sampling, and list-recoverable codes.
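A small sketch of the restriction property these abstracts rely on, with a made-up polynomial and curve over F_101: composing a total-degree-3 polynomial with a degree-2 curve yields a univariate polynomial of degree at most 3 * 2 = 6, recovered here by Lagrange interpolation from 7 sample points.

```python
p = 101  # a small prime field F_p for illustration

def f(x, y):
    # A fixed polynomial of total degree 3 over F_p (made-up example).
    return (x * x * y + y) % p

def curve(t):
    # A fixed degree-2 curve in F_p^2 (made-up; real samplers pick it randomly).
    return (t * t % p, (3 * t + 1) % p)

def lagrange_coeffs(pts):
    """Coefficients (constant term first) of the interpolating poly mod p."""
    n = len(pts)
    coeffs = [0] * n
    for i, (ti, vi) in enumerate(pts):
        num, denom = [1], 1  # numerator poly prod_{j != i} (X - tj)
        for j, (tj, _) in enumerate(pts):
            if j != i:
                num = [(a - tj * b) % p for a, b in zip([0] + num, num + [0])]
                denom = denom * (ti - tj) % p
        inv = pow(denom, p - 2, p)  # modular inverse via Fermat's little theorem
        for k, c in enumerate(num):
            coeffs[k] = (coeffs[k] + vi * inv * c) % p
    return coeffs

# 7 = 3*2 + 1 points determine the restriction of f to the curve.
coeffs = lagrange_coeffs([(t, f(*curve(t))) for t in range(7)])
degree = max(k for k, c in enumerate(coeffs) if c != 0)
print(degree)  # 5, comfortably within the bound deg(f) * deg(curve) = 6
```

This low-degree restriction is what lets applications (PCPs, local decoding) reason about a global polynomial from its values on one sampled curve.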
Probabilistically Checkable Reconfiguration Proofs and Inapproximability of Reconfiguration Problems
Motivated by the inapproximability of reconfiguration problems, we present a
new PCP-type characterization of PSPACE, which we call a probabilistically
checkable reconfiguration proof (PCRP): Any PSPACE computation can be encoded
into an exponentially long sequence of polynomially long proofs such that every
adjacent pair of the proofs differs in at most one bit, and every proof can be
probabilistically checked by reading a constant number of bits.
Using the new characterization, we prove PSPACE-completeness of approximate
versions of many reconfiguration problems, such as the Maxmin SAT
Reconfiguration problem. This resolves the open problem posed by Ito, Demaine,
Harvey, Papadimitriou, Sideri, Uehara, and Uno (ISAAC 2008; Theor. Comput. Sci.
2011) as well as the Reconfiguration Inapproximability Hypothesis by Ohsaka
(STACS 2023) affirmatively. We also establish PSPACE-completeness of
approximating the Maxmin Clique Reconfiguration problem within some constant
factor.
Comment: 31 pages.
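For intuition about the objects being verified, here is a toy reconfiguration sequence over a made-up CNF formula: adjacent assignments differ in exactly one bit and every assignment must satisfy the formula. A PCRP lets such (exponentially long) sequences be checked probabilistically with O(1) queries; this sketch just checks one directly.

```python
# Clauses as tuples of literals: i means x_i, -i means NOT x_i (made-up formula).
clauses = [(1, 2), (-1, 3), (2, -3)]

def satisfies(a, clauses):
    # a: {variable index: bool}; clause is satisfied if some literal is true.
    return all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses)

def valid_reconfiguration(seq, clauses):
    # Every adjacent pair differs in exactly one bit...
    one_bit_steps = all(sum(u[i] != v[i] for i in u) == 1
                        for u, v in zip(seq, seq[1:]))
    # ...and every assignment along the way satisfies the formula.
    return one_bit_steps and all(satisfies(a, clauses) for a in seq)

seq = [{1: True,  2: True, 3: True},
       {1: False, 2: True, 3: True},
       {1: False, 2: True, 3: False}]
print(valid_reconfiguration(seq, clauses))  # True
```

The inapproximability results concern the maxmin variant: how few clauses must be violated, in the worst intermediate step, by any such sequence between two given assignments.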
Property Testing with Online Adversaries
The online manipulation-resilient testing model, proposed by Kalemaj,
Raskhodnikova and Varma (ITCS 2022 and Theory of Computing 2023), studies
property testing in situations where access to the input degrades continuously
and adversarially. Specifically, after each query made by the tester is
answered, the adversary can intervene and either erase or corrupt data
points. In this work, we investigate a more nuanced version of the online model
in order to overcome old and new impossibility results for the original model.
We start by presenting an optimal tester for linearity and a lower bound for
low-degree testing of Boolean functions in the original model. We overcome the
lower bound by allowing batch queries, where the tester gets a group of queries
answered between manipulations of the data. Our batch size is small enough so
that function values for a single batch on their own give no information about
whether the function is of low degree. Finally, to overcome the impossibility
results of Kalemaj et al. for sortedness and the Lipschitz property of
sequences, we extend the model to include adversaries that make less than one
erasure per query. For sortedness, we characterize the rate of
erasures for which online testing can be performed, exhibiting a sharp
transition from optimal query complexity to impossibility of testability (with
any number of queries). Our online tester works for a general class of local
properties of sequences. One feature of our results is that we get new (and in
some cases, simpler) optimal algorithms for several properties in the standard
property testing model.
Comment: To be published in the 15th Innovations in Theoretical Computer Science conference (ITCS 2024).
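The linearity property mentioned above is classically tested with the BLR test, which the optimal online-adversary tester builds on; a minimal sketch in the standard model (no adversary), using a made-up linear and non-linear function over F_2^n:

```python
import random

def blr_test(f, n, trials=200, rng=random.Random(0)):
    # BLR test: a linear f over F_2^n satisfies f(x) XOR f(y) = f(x XOR y)
    # for all x, y; each trial makes three queries to f.
    for _ in range(trials):
        x, y = rng.getrandbits(n), rng.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False  # caught a violation: f is not linear
    return True

parity = lambda x: bin(x & 0b1011).count("1") % 2  # parity of a mask: linear
shifted = lambda x: 1 ^ parity(x)                  # complemented: never linear

print(blr_test(parity, 4), blr_test(shifted, 4))  # True False
```

In the online model each trial's three correlated queries are exactly what an adversary can disrupt by erasing values between them, which is why the paper's batch-query variant answers small groups of queries between manipulations.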