Algorithms for Extended Alpha-Equivalence and Complexity
Equality of expressions in lambda calculi, higher-order programming languages and calculi, and process calculi is defined as alpha-equivalence. Permutability of bindings in let-constructs and structural congruence axioms extend alpha-equivalence. We analyse these extended alpha-equivalences and show that there are calculi with polynomial-time algorithms, that a multiple-binding "let" may make alpha-equivalence as hard as graph isomorphism, and that the replication operator in the pi-calculus may lead to an EXPSPACE-hard alpha-equivalence problem.
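For context on the polynomial-time baseline: plain alpha-equivalence (without permutable bindings) can be decided by converting both terms to de Bruijn indices and comparing. A minimal sketch, assuming a nested-tuple term representation of my own rather than anything from the paper:

```python
# A minimal sketch (not the paper's algorithm): plain alpha-equivalence
# via de Bruijn indices. Assumed term shapes: ('var', name),
# ('lam', name, body), ('app', fun, arg).

def to_de_bruijn(term, env=()):
    """Replace bound names by binder depth; free names are kept as-is."""
    kind = term[0]
    if kind == 'var':
        name = term[1]
        # env.index finds the innermost binder, so shadowing is handled
        return ('idx', env.index(name)) if name in env else term
    if kind == 'lam':
        _, name, body = term
        return ('lam', to_de_bruijn(body, (name,) + env))
    _, fun, arg = term
    return ('app', to_de_bruijn(fun, env), to_de_bruijn(arg, env))

def alpha_eq(s, t):
    """Terms are alpha-equivalent iff their de Bruijn forms coincide."""
    return to_de_bruijn(s) == to_de_bruijn(t)

# \x. x y  and  \z. z y  differ only in the bound name:
assert alpha_eq(('lam', 'x', ('app', ('var', 'x'), ('var', 'y'))),
                ('lam', 'z', ('app', ('var', 'z'), ('var', 'y'))))
```

The hardness results in the abstract arise precisely when this simple positional scheme breaks down, e.g. when a multiple-binding let allows its bindings to be permuted.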
Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
In this work we analyze the sample complexity of classification by
differentially private algorithms. Differential privacy is a strong and
well-studied notion of privacy introduced by Dwork et al. (2006) that ensures
that the output of an algorithm leaks little information about the data point
provided by any of the participating individuals. The sample complexity of private
PAC and agnostic learning was studied in a number of prior works starting with
Kasiviswanathan et al. (2008), but several basic questions remain open, most
notably whether learning with privacy requires more samples than learning
without privacy.
We show that the sample complexity of learning with (pure) differential
privacy can be arbitrarily higher than the sample complexity of learning
without the privacy constraint or the sample complexity of learning with
approximate differential privacy. Our second contribution and the main tool is
an equivalence between the sample complexity of (pure) differentially private
learning of a concept class C (or SCDP(C)) and the randomized one-way
communication complexity of the evaluation problem for concepts from C. Using
this equivalence we prove the following bounds:
1. SCDP(C) = Ω(LDim(C)), where LDim(C) is Littlestone's (1987) dimension
characterizing the number of mistakes in the online mistake-bound learning
model. Known bounds on LDim(C) then imply that SCDP(C) can be much higher
than the VC dimension of C.
2. For any t, there exists a class C such that LDim(C) = 2 but SCDP(C) ≥ t.
3. For any t, there exists a class C such that the sample complexity of
(pure) ε-differentially private PAC learning is Ω(t/ε) but the sample
complexity of the relaxed (ε,δ)-differentially private PAC learning is
O(log(1/δ)/ε). This resolves an open problem of Beimel et al. (2013b).
Comment: Extended abstract appears in Conference on Learning Theory (COLT) 2014
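To make the quantity in item 1 concrete: LDim(C) is the maximum depth of a mistake tree, which for a finite class can be computed by recursing on points that split the class. The sketch below is mine, not code from the paper; it illustrates the LDim-versus-VC gap with thresholds, which have VC dimension 1 but Littlestone dimension roughly log of the domain size.

```python
# A minimal sketch (an illustration, not the paper's construction):
# brute-force Littlestone dimension of a finite concept class, where a
# concept is a tuple of 0/1 labels over a fixed finite domain.

def ldim(concepts, domain):
    """Largest depth of a mistake tree an adversary can force against C."""
    if len(concepts) <= 1:
        return 0  # at most one consistent concept: no further mistakes
    best = 0
    for x in domain:
        zeros = frozenset(c for c in concepts if c[x] == 0)
        ones = frozenset(c for c in concepts if c[x] == 1)
        if zeros and ones:  # x splits C, so a mistake can be forced at x
            best = max(best, 1 + min(ldim(zeros, domain),
                                     ldim(ones, domain)))
    return best

# Thresholds h_t(x) = 1 iff x >= t over a 7-point domain: 8 concepts,
# VC dimension 1, but Littlestone dimension 3 (binary search on t).
domain = range(7)
thresholds = frozenset(tuple(1 if x >= t else 0 for x in domain)
                       for t in range(8))
print(ldim(thresholds, domain))  # prints 3
```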
Nominal Unification of Higher Order Expressions with Recursive Let
A sound and complete algorithm for nominal unification of higher-order
expressions with a recursive let is described, and shown to run in
non-deterministic polynomial time. We also explore specializations like nominal
letrec-matching for plain expressions and for DAGs and determine the complexity
of corresponding unification problems.
Comment: Pre-proceedings paper presented at the 26th International Symposium
on Logic-Based Program Synthesis and Transformation (LOPSTR 2016), Edinburgh,
Scotland, UK, 6-8 September 2016 (arXiv:1608.02534)
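For a flavour of the nominal setting, the basic operation is atom swapping (a b)·t, which, unlike capture-avoiding substitution, is applied uniformly to every atom, including binders. A minimal sketch, with a term representation that is my assumption rather than the paper's:

```python
# A minimal sketch (my illustration, not the paper's algorithm): the atom
# swapping (a b)·t at the heart of nominal techniques. Because the swap
# also renames binders, it needs no capture check, which keeps nominal
# unification rules simple to state.

def swap(a, b, term):
    """Apply the transposition (a b) to every atom occurring in the term."""
    kind = term[0]
    if kind == 'atom':
        name = term[1]
        return ('atom', b if name == a else a if name == b else name)
    if kind == 'lam':  # the bound atom is swapped too
        _, name, body = term
        new = b if name == a else a if name == b else name
        return ('lam', new, swap(a, b, body))
    _, fun, arg = term
    return ('app', swap(a, b, fun), swap(a, b, arg))

# (a b) · \a. a b  =  \b. b a
print(swap('a', 'b', ('lam', 'a', ('app', ('atom', 'a'), ('atom', 'b')))))
```

The non-determinism in the unification algorithm enters elsewhere, e.g. when guessing how the multiple bindings of a letrec correspond, which is also what links the problem to graph isomorphism in the matching specializations.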