Constructions of Almost Optimal Resilient Boolean Functions on Large Even Number of Variables
In this paper, a technique on constructing nonlinear resilient Boolean
functions is described. By using several sets of disjoint spectra functions on
a small number of variables, an almost optimal resilient function on a large
even number of variables can be constructed. It is shown that, given any resiliency order m, one can construct infinitely many n-variable (n even) m-resilient functions with almost optimal nonlinearity. A large class of highly nonlinear resilient functions that were previously unknown is obtained. Then one
method to optimize the degree of the constructed functions is proposed. Last,
an improved version of the main construction is given.
Comment: 14 pages, 2 tables
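Claims like these can be verified on small instances via the Walsh–Hadamard transform: a Boolean function is m-resilient iff its Walsh spectrum vanishes at every point of Hamming weight at most m, and its nonlinearity is 2^(n-1) minus half the largest spectral magnitude. A minimal brute-force checker (my own illustration, not the paper's construction; the example function is an illustrative choice):

```python
def walsh_transform(f_table, n):
    # W_f(a) = sum over x of (-1)^(f(x) XOR <a,x>), for every a in {0,1}^n
    W = {}
    for a in range(2 ** n):
        s = 0
        for x in range(2 ** n):
            dot = bin(a & x).count("1") & 1  # inner product <a,x> mod 2
            s += (-1) ** (f_table[x] ^ dot)
        W[a] = s
    return W

def resiliency_order(f_table, n):
    # largest m such that W_f(a) = 0 for all a of Hamming weight <= m
    W = walsh_transform(f_table, n)
    m = -1
    for t in range(n + 1):
        if all(W[a] == 0 for a in range(2 ** n) if bin(a).count("1") == t):
            m = t
        else:
            break
    return m

def nonlinearity(f_table, n):
    # distance to the nearest affine function
    W = walsh_transform(f_table, n)
    return 2 ** (n - 1) - max(abs(v) for v in W.values()) // 2

# Example: f(x1,x2,x3,x4) = x1 XOR x2 XOR x3*x4 (x1 = most significant bit)
n = 4
f = [(x >> 3 & 1) ^ (x >> 2 & 1) ^ ((x >> 1 & 1) & (x & 1)) for x in range(2 ** n)]
```

On this example the checker reports resiliency order 1 and nonlinearity 4, consistent with the spectrum being supported only on points of weight at least 2.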
Two-Source Condensers with Low Error and Small Entropy Gap via Entropy-Resilient Functions
In their seminal work, Chattopadhyay and Zuckerman (STOC'16) constructed a two-source extractor with error epsilon for n-bit sources having min-entropy polylog(n/epsilon). Unfortunately, the construction's running time is poly(n/epsilon), which means that with polynomial-time constructions, only polynomially small errors are possible. Our main result is a poly(n, log(1/epsilon))-time computable two-source condenser. For any k >= polylog(n/epsilon), our condenser transforms two independent (n,k)-sources into a distribution over m = k - O(log(1/epsilon)) bits that is epsilon-close to having min-entropy m - o(log(1/epsilon)), hence achieving an entropy gap of o(log(1/epsilon)).
The bottleneck for obtaining low error in recent constructions of two-source extractors lies in the use of resilient functions. Informally, such a function receives input bits from r players, with the property that its output has small bias even if a bounded number of corrupted players feed adversarial inputs after seeing the inputs of the other players. The drawback of using resilient functions is that the error cannot be smaller than ln r / r. This, in turn, forces the running time of the construction to be polynomial in 1/epsilon.
A key component in our construction is a variant of resilient functions which we call entropy-resilient functions. This variant can be seen as playing the above game for several rounds, each round outputting one bit. The goal of the corrupted players is to reduce, with as high a probability as they can, the min-entropy accumulated throughout the rounds. We show that while the bias decreases only polynomially with the number of players in a one-round game, the players' success probability decreases exponentially in the entropy gap they attempt to incur in the repeated game.
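To make the one-round game concrete, here is a small sketch (my own illustration, not from the paper) that computes, for the r-player majority function with r odd, the exact probability that t colluding players who move last can force the output either way. This "controllable set" is what underlies the ln r / r barrier mentioned above:

```python
from math import comb

def majority_control_prob(r, t):
    # Probability, over the honest players' uniform coins, that t colluding
    # players moving last can force the majority output to either value.
    # Assumes r is odd so majority is well defined.
    h = r - t                # number of honest players
    total = 2 ** h
    thresh = r // 2 + 1      # number of 1s needed for the output to be 1
    controllable = 0
    for s in range(h + 1):   # s = number of honest players voting 1
        can_force_one = s + t >= thresh   # colluders all vote 1
        can_force_zero = s < thresh       # colluders all vote 0
        if can_force_one and can_force_zero:
            controllable += comb(h, s)
    return controllable / total
```

For example, `majority_control_prob(9, 1)` equals 70/256: a single corrupted player controls the outcome exactly when the eight honest votes split 4–4. The control probability grows roughly like t/sqrt(r), matching the intuition that majority's one-round bias shrinks only polynomially in r.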
Bloom Filters in Adversarial Environments
Many efficient data structures use randomness, allowing them to improve upon
deterministic ones. Usually, their efficiency and correctness are analyzed
using probabilistic tools under the assumption that the inputs and queries are
independent of the internal randomness of the data structure. In this work, we
consider data structures in a more robust model, which we call the adversarial
model. Roughly speaking, this model allows an adversary to choose inputs and
queries adaptively according to previous responses. Specifically, we consider a
data structure known as "Bloom filter" and prove a tight connection between
Bloom filters in this model and cryptography.
A Bloom filter represents a set of elements approximately, by using fewer
bits than a precise representation. The price for succinctness is allowing some
errors: for any element in the set it should always answer `Yes', and for any element not in the set it should answer `Yes' only with small probability.
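For reference, a textbook (non-adversarial) Bloom filter sketch; the bit-array size, number of hash functions, and the use of salted SHA-256 as the hash family are illustrative assumptions, not details from the paper:

```python
import hashlib

class BloomFilter:
    """Textbook Bloom filter: k salted hashes into an m-bit array.
    Inserted elements always answer `Yes'; non-elements answer `Yes'
    only with small (false-positive) probability."""

    def __init__(self, m_bits, k_hashes):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray((m_bits + 7) // 8)

    def _indexes(self, item):
        # Derive k indices by hashing the item with k different salts.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def query(self, item):
        # `Yes' iff all k bits are set; guaranteed for inserted items.
        return all(self.bits[idx // 8] >> (idx % 8) & 1
                   for idx in self._indexes(item))
```

Note that this standard analysis presumes queries are independent of the hash choices; the adversarial model discussed above is precisely the setting where that presumption fails.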
In the adversarial model, we consider both efficient adversaries (that run in
polynomial time) and computationally unbounded adversaries that are only
bounded in the number of queries they can make. For computationally bounded
adversaries, we show that non-trivial (memory-wise) Bloom filters exist if and
only if one-way functions exist. For unbounded adversaries we show that there
exists a Bloom filter for sets of size n and error epsilon that is secure against t queries and uses only O(n log(1/epsilon) + t) bits of memory. In comparison, n log(1/epsilon) bits is the best possible under a non-adaptive adversary.
Efficient public-key cryptography with bounded leakage and tamper resilience
We revisit the question of constructing public-key encryption and signature schemes with security in the presence of bounded leakage and memory tampering attacks. For signatures we obtain the first construction in the standard model; for public-key encryption we obtain the first construction free of pairings (avoiding non-interactive zero-knowledge proofs). Our constructions are based on generic building blocks and, as we show, also admit efficient instantiations under fairly standard number-theoretic assumptions.
The model of bounded tamper resistance was recently put forward by Damgård et al. (Asiacrypt 2013) as an attractive path to achieve security against arbitrary memory tampering attacks without making hardware assumptions (such as the existence of a protected self-destruct or key-update mechanism), the only restriction being on the number of allowed tampering attempts (which is a parameter of the scheme). This makes it possible to circumvent known impossibility results for unrestricted tampering (Gennaro et al., TCC 2010), while still capturing realistic tampering attacks.
Derandomization and Group Testing
The rapid development of derandomization theory, which is a fundamental area
in theoretical computer science, has recently led to many surprising
applications outside its initial intention. We will review some recent such
developments related to combinatorial group testing. In its most basic setting,
the aim of group testing is to identify a set of "positive" individuals in a
population of items by taking groups of items and asking whether there is a
positive in each group.
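As a toy illustration of this basic setting (adaptive testing, rather than the non-adaptive explicit schemes the survey focuses on), binary splitting locates a single positive with about log2(n) pooled tests:

```python
def find_positive(items, group_is_positive):
    """Adaptive binary splitting: locate one positive item, given an oracle
    group_is_positive(group) -> True iff the group contains a positive."""
    assert group_is_positive(items), "no positive present"
    while len(items) > 1:
        mid = len(items) // 2
        left = items[:mid]
        # Keep whichever half still tests positive.
        items = left if group_is_positive(left) else items[mid:]
    return items[0]
```

With a population of 16 items and one positive, this uses 4 splitting tests after the initial sanity check, versus 16 individual tests. The derandomization results discussed next concern the harder non-adaptive regime, where all pools must be fixed in advance.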
In particular, we will discuss explicit constructions of optimal or
nearly-optimal group testing schemes using "randomness-conducting" functions.
Among such developments are constructions of error-correcting group testing
schemes using randomness extractors and condensers, as well as threshold group
testing schemes from lossless condensers.
Comment: Invited Paper in Proceedings of the 48th Annual Allerton Conference on Communication, Control, and Computing, 2010
Children’s experiences of domestic violence and abuse: siblings’ accounts of relational coping
This article explores how young people see their relationships, particularly their sibling relationships, in families affected by domestic violence, and how relationality emerges in their accounts as a resource for building an agentic sense of self. The ‘voice’ of children is largely absent from the domestic violence literature, which typically portrays them as passive, damaged and relationally incompetent. Children’s own understandings of their relational worlds are often overlooked, and consequently existing models of children’s social interactions give inadequate accounts of their meaning-making-in-context. Drawing on a larger study of children’s experiences of domestic violence and abuse, this paper uses two case studies of sibling relationships to explore young people’s use of relational resources for coping with violence in the home. The paper explores how relationality and coping intertwine in young people’s accounts, and disrupts the taken-for-granted assumption that children’s ‘premature caring’ or ‘parentification’ is (only) a pathological response to domestic violence. This has implications for understanding young people’s experiences in the present, and for supporting their capacity for relationship building in the future.
Randomness Extraction in AC0 and with Small Locality
Randomness extractors, which extract high quality (almost-uniform) random
bits from biased random sources, are important objects both in theory and in
practice. While there has been significant progress in obtaining near-optimal
constructions of randomness extractors in various settings, the computational
complexity of randomness extractors is still much less studied. In particular,
it is not clear whether randomness extractors with good parameters can be
computed in several interesting complexity classes that are much weaker than P.
In this paper we study randomness extractors in the following two models of
computation: (1) constant-depth circuits (AC0), and (2) the local computation
model. Previous work in these models, such as [Vio05a], [GVW15] and [BG13],
only achieve constructions with weak parameters. In this work we give explicit
constructions of randomness extractors with much better parameters. As an
application, we use our AC0 extractors to study pseudorandom generators in AC0,
and show that we can construct both cryptographic pseudorandom generators
(under reasonable computational assumptions) and unconditional pseudorandom
generators for space bounded computation with very good parameters.
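For intuition about what extraction means, the classical von Neumann unbiasing trick is the simplest example: it turns any i.i.d. biased bit source into exactly uniform bits. (This is far weaker than the constructions above — it needs independent bits and wastes most of the input — and is offered only as background, not as the paper's method.)

```python
def von_neumann_extract(bits):
    # Consume the input in pairs: 01 -> output 0, 10 -> output 1,
    # 00 and 11 -> discard. For i.i.d. bits of any fixed bias p,
    # both surviving pairs occur with probability p(1-p), so each
    # output bit is exactly uniform.
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

For example, the input 01 10 11 00 10 yields the output bits 0, 1, 1. The expected output rate is p(1-p) bits per input bit, which is one reason modern extractors, including the low-complexity ones studied here, are far more involved.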
Our constructions combine several previous techniques in randomness
extractors, as well as introduce new techniques to reduce or preserve the
complexity of extractors, which may be of independent interest. These include
(1) a general way to reduce the error of strong seeded extractors while
preserving the AC0 property and small locality, and (2) a seeded randomness
condenser with small locality.
Comment: 62 pages