60 research outputs found
Approximating Cumulative Pebbling Cost Is Unique Games Hard
The cumulative pebbling complexity of a directed acyclic graph G is defined
as cc(G) = min_P Σ_i |P_i|, where the minimum is taken over all
legal (parallel) black pebblings P = (P_1, ..., P_t) of G and |P_i| denotes the number of
pebbles on the graph during round i. Intuitively, cc(G) captures
the amortized Space-Time complexity of pebbling m copies of G in parallel.
The cumulative pebbling complexity of a graph is of particular interest in
the field of cryptography as cc(G) is tightly related to the
amortized Area-Time complexity of the Data-Independent Memory-Hard Function
(iMHF) f_{G,H} [AS15] defined using a constant-indegree directed acyclic
graph (DAG) G and a random oracle H. A secure iMHF should have
amortized Space-Time complexity as high as possible, e.g., to deter a brute-force
password attacker who wants to find x such that f_{G,H}(x) = h. Thus, to
analyze the (in)security of a candidate iMHF f_{G,H}, it is crucial to
estimate the value cc(G), but currently, upper and lower bounds for
leading iMHF candidates differ by several orders of magnitude. Blocki and Zhou
recently showed that it is NP-Hard to compute cc(G), but
their techniques do not even rule out an efficient
(1 + ε)-approximation algorithm for any constant ε > 0. We
show that for any constant c > 0, it is Unique Games hard to approximate
cc(G) to within a factor of c.
(See the paper for the full abstract.)
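To make the pebbling definitions concrete, here is a minimal Python sketch (our own illustration, not from the paper) that checks the placement legality of a parallel black pebbling and computes its cumulative cost Σ_i |P_i|; the 4-node path DAG is a hypothetical example.

```python
# Minimal sketch: cumulative cost of a (parallel) black pebbling.
# A pebbling is a sequence of pebble sets P_1, ..., P_t; a placement is
# legal if every newly pebbled node already has all of its parents
# pebbled. (Sink coverage is not checked in this sketch.)

def is_legal(parents, pebbling):
    prev = set()
    for P in pebbling:
        for v in P - prev:  # newly placed pebbles this round
            if not set(parents.get(v, [])) <= prev:
                return False
        prev = P
    return True

def cumulative_cost(pebbling):
    return sum(len(P) for P in pebbling)  # cost of this pebbling: sum_i |P_i|

# Hypothetical 4-node path DAG 1 -> 2 -> 3 -> 4.
parents = {2: [1], 3: [2], 4: [3]}
pebbling = [{1}, {1, 2}, {2, 3}, {3, 4}]
assert is_legal(parents, pebbling)
print(cumulative_cost(pebbling))  # 7; cc(G) is the minimum over all legal pebblings
```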
Computationally Data-Independent Memory Hard Functions
Memory hard functions (MHFs) are an important cryptographic primitive that are used to design egalitarian proofs of work and in the construction of moderately expensive key-derivation functions resistant to brute-force attacks. Broadly speaking, MHFs can be divided into two categories: data-dependent memory hard functions (dMHFs) and data-independent memory hard functions (iMHFs). iMHFs are resistant to certain side-channel attacks as the memory access pattern induced by the honest evaluation algorithm is independent of the potentially sensitive input, e.g., a password. While dMHFs are potentially vulnerable to side-channel attacks (the induced memory access pattern might leak useful information to a brute-force attacker), they can achieve higher cumulative memory complexity (CMC) than an iMHF. In particular, any iMHF that can be evaluated in N steps on a sequential machine has CMC at most O((N^2 log log N)/log N). By contrast, the dMHF scrypt achieves maximal CMC Ω(N^2), though the CMC of scrypt would be reduced to just O(N) after a side-channel attack.
In this paper, we introduce the notion of computationally data-independent memory hard functions (ciMHFs). Intuitively, we require that the memory access pattern induced by the (randomized) ciMHF evaluation algorithm appears to be independent from the standpoint of a computationally bounded eavesdropping attacker, even if the attacker selects the initial input. We then ask whether it is possible to circumvent the known upper bound for iMHFs and build a ciMHF with CMC Ω(N^2). Surprisingly, we answer the question in the affirmative when the ciMHF evaluation algorithm is executed on a two-tiered memory architecture (RAM/Cache).
We introduce the notion of a k-restricted dynamic graph to quantify the continuum between unrestricted dMHFs (k = N) and iMHFs (k = 1). For any ε > 0 we show how to construct a k-restricted dynamic graph with k = Ω(N^(1-ε)) that provably achieves maximum cumulative pebbling cost Ω(N^2). We can use k-restricted dynamic graphs to build a ciMHF provided that the cache is large enough to hold k hash outputs and the dynamic graph satisfies a certain property that we call "amenable to shuffling". In particular, we prove that the induced memory access pattern is indistinguishable to a polynomial time attacker who can monitor the locations of read/write requests to RAM, but not cache. We also show that when k = o(N^(1/log log N)), any k-restricted graph with constant indegree has cumulative pebbling cost o(N^2). Our results almost completely characterize the spectrum of k-restricted dynamic graphs.
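As a rough illustration of the k-restricted notion (our own toy sketch, not the paper's construction), the following Python snippet builds a dynamic graph in which each node's data-dependent parent is confined to a window of k candidate nodes; the window rule and all names are assumptions made for illustration.

```python
import random

# Toy sketch of a k-restricted dynamic graph: node v always has static
# parent v-1, and its dynamic parent is drawn from a fixed set of k
# allowed candidates, so k = 1 behaves like an iMHF (fully
# data-independent) and k = N like an unrestricted dMHF.

def dynamic_parent(v, k, label):
    # Allowed candidates for node v: the k nodes just below v (assumption).
    window = list(range(max(0, v - k), v))
    return window[label % len(window)]  # data-dependent choice within window

N, k = 16, 4
labels = [random.randrange(2**32) for _ in range(N)]  # stand-in for hash labels
edges = [(v - 1, v) for v in range(1, N)]             # static edges
edges += [(dynamic_parent(v, k, labels[v]), v) for v in range(1, N)]
print(edges[:6])
```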
Towards Human Computable Passwords
An interesting challenge for the cryptography community is to design
authentication protocols that are so simple that a human can execute them
without relying on a fully trusted computer. We propose several candidate
authentication protocols for a setting in which the human user can only receive
assistance from a semi-trusted computer --- a computer that stores information
and performs computations correctly but does not provide confidentiality. Our
schemes use a semi-trusted computer to store and display public challenges
C_i ∈ [n]^k. The human user memorizes a random secret mapping σ: [n] → Z_10
and authenticates by computing responses f(σ(C_i))
to a sequence of public challenges, where
f: Z_10^k → Z_10 is a function that is easy for the
human to evaluate. We prove that any statistical adversary needs to sample
m = Ω̃(n^{s(f)}) challenge-response pairs to recover σ, for
a security parameter s(f) that depends on two key properties of f. To
obtain our results, we apply the general hypercontractivity theorem to lower
bound the statistical dimension of the distribution over challenge-response
pairs induced by f and σ. Our lower bounds apply to arbitrary functions f
(not just to functions that are easy for a human to evaluate),
and generalize recent results of Feldman et al. As an application, we propose a
family of human computable password functions f_{k_1,k_2} in which the user
needs to perform 2k_1 + 2k_2 + 1 primitive operations (e.g., adding two digits or
remembering σ(i)), and we show that s(f_{k_1,k_2}) = min{k_1 + 1, (k_2 + 1)/2}.
For these schemes, we prove that forging passwords is equivalent to recovering
the secret mapping. Thus, our human computable password schemes can maintain
strong security guarantees even after an adversary has observed the user login
to many different accounts.
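The challenge-response pattern can be illustrated with a toy Python sketch; here f is a plain mod-10 sum chosen only for brevity (it is not one of the paper's candidate functions and makes no security claim), and the parameters are arbitrary.

```python
import random

# Toy sketch of the challenge-response scheme: the user memorizes a
# secret mapping sigma: [n] -> Z_10 and answers a public challenge
# C = (c_1, ..., c_k) with f(sigma(c_1), ..., sigma(c_k)).
# This f is a mod-10 sum for illustration only; the paper's candidate
# functions f_{k1,k2} are designed for human computability and
# provable security, which this toy f does not claim.

n, k, d = 30, 3, 10
sigma = [random.randrange(d) for _ in range(n)]  # secret mapping [n] -> Z_10

def f(digits):
    return sum(digits) % 10                      # illustrative choice of f

def respond(challenge):
    return f([sigma[c] for c in challenge])

challenge = random.sample(range(n), k)           # public challenge in [n]^k
print(challenge, respond(challenge))
```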
Differentially Private Data Analysis of Social Networks via Restricted Sensitivity
We introduce the notion of restricted sensitivity as an alternative to global
and smooth sensitivity to improve accuracy in differentially private data
analysis. The definition of restricted sensitivity is similar to that of global
sensitivity except that instead of quantifying over all possible datasets, we
take advantage of any beliefs about the dataset that a querier may have, to
quantify over a restricted class of datasets. Specifically, given a query f and
a hypothesis H about the structure of a dataset D, we show generically how to
transform f into a new query f_H whose global sensitivity (over all datasets
including those that do not satisfy H) matches the restricted sensitivity of
the query f. Moreover, if the belief of the querier is correct (i.e., D is in
H) then f_H(D) = f(D). If the belief is incorrect, then f_H(D) may be
inaccurate.
We demonstrate the usefulness of this notion by considering the task of
answering queries regarding social networks, which we model as a combination of
a graph and a labeling of its vertices. In particular, while our generic
procedure is computationally inefficient, for the specific definition of H as
graphs of bounded degree, we exhibit efficient ways of constructing f_H using
different projection-based techniques. We then analyze two important query
classes: subgraph counting queries (e.g., number of triangles) and local
profile queries (e.g., number of people who know a spy and a computer scientist
who know each other). We demonstrate that the restricted sensitivity of such
queries can be significantly lower than their smooth sensitivity. Thus, using
restricted sensitivity we can maintain privacy whether or not D is in H, while
providing more accurate results in the event that H holds true.
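A minimal Python sketch of the projection idea for H = "graphs of maximum degree at most k", assuming a naive edge-truncation projection and a simple edge-count query (the paper's projections and query classes are more refined; the sensitivity bound in the comment is a simplifying assumption):

```python
import random

# Minimal sketch of restricted sensitivity with H = "max degree <= k".
# The edge-dropping projection and the sensitivity bound below are
# illustrative assumptions, not the paper's exact constructions.

def project_to_degree(adj, k):
    # Keep at most k neighbors per vertex (an arbitrary truncation rule),
    # then keep an edge only if both endpoints kept it.
    kept = {v: set(sorted(nbrs)[:k]) for v, nbrs in adj.items()}
    return {v: {u for u in nbrs if v in kept[u]} for v, nbrs in kept.items()}

def count_edges(adj):
    return sum(len(nbrs) for nbrs in adj.values()) // 2

def private_edge_count(adj, k, eps):
    proj = project_to_degree(adj, k)
    restricted_sens = k  # assumption: one vertex change moves <= k edges under H
    # Laplace(0, sens/eps) noise as a difference of two exponentials.
    noise = (random.expovariate(eps / restricted_sens)
             - random.expovariate(eps / restricted_sens))
    return count_edges(proj) + noise

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(private_edge_count(adj, k=2, eps=0.5))
```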
A New Connection Between Node and Edge Depth Robust Graphs
Given a directed acyclic graph (DAG) G = (V,E), we say that G is (e,d)-depth-robust (resp. (e,d)-edge-depth-robust) if for any set S ⊆ V (resp. S ⊆ E) of at most |S| ≤ e nodes (resp. edges) the graph G-S contains a directed path of length d. While edge-depth-robust graphs are potentially easier to construct, many applications in cryptography require node depth-robust graphs with small indegree. We create a graph reduction that transforms an (e, d)-edge-depth-robust graph with m edges into an (e/2, d)-depth-robust graph with O(m) nodes and constant indegree. One immediate consequence of this result is the first construction of a provably ((n log log n)/log n, n/(log n)^(1 + log log n))-depth-robust graph with constant indegree, where previous constructions for e = (n log log n)/log n had d = O(n^(1-ε)).
Our reduction crucially relies on ST-robust graphs, a new graph property we introduce which may be of independent interest. We say that a directed acyclic graph with n inputs and n outputs is (k_1, k_2)-ST-robust if we can remove any k_1 nodes and there exists a subgraph containing at least k_2 inputs and k_2 outputs such that each of the k_2 inputs is connected to all of the k_2 outputs. If the graph is (k_1, n-k_1)-ST-robust for all k_1 ≤ n, we say that the graph is maximally ST-robust. We show how to construct maximally ST-robust graphs with constant indegree and O(n) nodes. Given a family 𝔾 of ST-robust graphs and an arbitrary (e, d)-edge-depth-robust graph G, we construct a new constant-indegree graph Reduce(G, 𝔾) by replacing each node in G with an ST-robust graph from 𝔾. We also show that ST-robust graphs can be used to construct (tight) proofs-of-space and (asymptotically) improved wide-block labeling functions.
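For intuition, (e, d)-depth-robustness can be verified by brute force on small DAGs, as in this illustrative Python sketch (our own, exponential in e, so only for toy examples):

```python
from itertools import combinations

# Brute-force check that a DAG G = (V, E) is (e, d)-depth-robust:
# for every set S of at most e nodes, G - S must still contain a
# directed path with d edges.

def longest_path(nodes, edges):
    # Nodes are assumed topologically ordered by their integer value.
    dist = {v: 0 for v in nodes}
    for u, v in sorted(edges):
        dist[v] = max(dist[v], dist[u] + 1)
    return max(dist.values(), default=0)

def is_depth_robust(nodes, edges, e, d):
    for r in range(e + 1):
        for S in combinations(nodes, r):
            rest = set(nodes) - set(S)
            kept = [(u, v) for u, v in edges if u in rest and v in rest]
            if longest_path(rest, kept) < d:
                return False
    return True

# Hypothetical 6-node path 0 -> 1 -> ... -> 5.
nodes = range(6)
edges = [(i, i + 1) for i in range(5)]
print(is_depth_robust(nodes, edges, e=1, d=2))  # True: any single deletion leaves a length-2 path
```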
- …