Spectral Signatures in Backdoor Attacks
A recent line of work has uncovered a new form of data poisoning: so-called
\emph{backdoor} attacks. These attacks are particularly dangerous because they
do not affect a network's behavior on typical, benign data. Rather, the network
only deviates from its expected output when triggered by a perturbation planted
by an adversary.
In this paper, we identify a new property of all known backdoor attacks,
which we call \emph{spectral signatures}. This property allows us to utilize
tools from robust statistics to thwart the attacks. We demonstrate the efficacy
of these signatures in detecting and removing poisoned examples on real image
sets and state-of-the-art neural network architectures. We believe that
understanding spectral signatures is a crucial first step towards designing ML
systems secure against such backdoor attacks.
Comment: 16 pages, accepted to NIPS 201
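The detection idea sketched in the abstract can be illustrated on synthetic data: if poisoned examples leave a strong direction in the learned representations, scoring each example by its squared projection onto the top singular direction of the centered feature matrix separates them from clean data. The features, dimensions, and shift below are all invented for illustration; this is a toy sketch of the spectral idea, not the paper's exact pipeline.

```python
import random

random.seed(0)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Toy "learned representations": 95 clean points near the origin and
# 5 poisoned points shifted along a fixed direction, mimicking the
# spectral signature a backdoor trigger can leave in feature space.
dim = 8
clean = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(95)]
poisoned = [[random.gauss(0, 1) + 4.0 for _ in range(dim)] for _ in range(5)]
feats = clean + poisoned

# Center the representations.
mean = [sum(x[j] for x in feats) / len(feats) for j in range(dim)]
centered = [[x[j] - mean[j] for j in range(dim)] for x in feats]

# Power iteration for the top right singular vector of the centered matrix.
v = [1.0] * dim
for _ in range(50):
    Av = [dot(x, v) for x in centered]                      # A v
    w = [sum(Av[i] * centered[i][j] for i in range(len(centered)))
         for j in range(dim)]                               # A^T (A v)
    norm = sum(c * c for c in w) ** 0.5
    v = [c / norm for c in w]

# Outlier score: squared projection onto the top singular direction.
scores = [dot(x, v) ** 2 for x in centered]
flagged = sorted(range(len(feats)), key=lambda i: -scores[i])[:5]
print(sorted(flagged))  # the 5 poisoned examples (indices 95..99)
```

Removing the highest-scoring examples and retraining is then the natural cleaning step.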
Robust polynomial regression up to the information theoretic limit
We consider the problem of robust polynomial regression, where one receives
samples that are usually within $\sigma$ of a polynomial $y(x)$, but have a
$\rho$ chance of being arbitrary adversarial outliers. Previously, it was known
how to efficiently estimate $y(x)$ only when $\rho < \frac{1}{\log d}$. We give
an algorithm that works for the entire feasible range of $\rho < 1/2$, while
simultaneously improving other parameters of the problem. We complement our
algorithm, which gives a factor 2 approximation, with impossibility results
that show, for example, that a $1.09$ approximation is impossible even with
infinitely many samples.
Comment: 19 Pages. To appear in FOCS 201
Efficient Statistics, in High Dimensions, from Truncated Samples
We provide an efficient algorithm for the classical problem, going back to
Galton, Pearson, and Fisher, of estimating, with arbitrary accuracy, the
parameters of a multivariate normal distribution from truncated samples.
Truncated samples from a $d$-variate normal ${\cal N}(\mathbf{\mu}, \mathbf{\Sigma})$
means a sample is only revealed if it falls in some subset
$S \subseteq \mathbb{R}^d$; otherwise the samples are hidden and their count in
proportion to the revealed samples is also hidden. We show that the mean
$\mathbf{\mu}$ and covariance matrix $\mathbf{\Sigma}$ can be estimated with
arbitrary accuracy in polynomial-time, as long as we have oracle access to $S$,
and $S$ has non-trivial measure under the unknown $d$-variate normal
distribution. Additionally we show that without oracle access to $S$, any
non-trivial estimation is impossible.
Comment: to appear at 59th Annual IEEE Symposium on Foundations of Computer
Science (FOCS), 201
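A quick simulation shows why truncation makes the problem nontrivial: the naive empirical mean of the revealed samples is badly biased, which is exactly what a corrected, truncation-aware estimator must undo. The truncation set and parameters below are chosen purely for illustration.

```python
import random

random.seed(2)

# Truncation toy: a standard normal sample (true mean 0) is revealed
# only if it lands in S = [0, inf); hidden samples, and their count,
# are discarded.
n = 200_000
revealed = [x for x in (random.gauss(0.0, 1.0) for _ in range(n)) if x >= 0]

# The naive empirical mean of revealed samples concentrates near
# E[X | X >= 0] = sqrt(2/pi) ~ 0.798, far from the true mean 0.0 --
# hence the need for an estimator that corrects for the truncation.
naive_mean = sum(revealed) / len(revealed)
print(round(naive_mean, 2))  # ~0.8
```

With oracle access to $S$, one can reweight or maximize the truncated likelihood to recover the true parameters; without it, the abstract notes, no non-trivial estimation is possible.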
Byzantine Stochastic Gradient Descent
This paper studies the problem of distributed stochastic optimization in an
adversarial setting where, out of the $m$ machines which allegedly compute
stochastic gradients every iteration, an $\alpha$-fraction are Byzantine, and
can behave arbitrarily and adversarially. Our main result is a variant of
stochastic gradient descent (SGD) which finds $\varepsilon$-approximate
minimizers of convex functions in
$T = \tilde{O}\big(\frac{1}{\varepsilon^2 m} + \frac{\alpha^2}{\varepsilon^2}\big)$
iterations. In contrast, traditional mini-batch SGD needs
$T = O\big(\frac{1}{\varepsilon^2 m}\big)$ iterations, but cannot tolerate
Byzantine failures. Further, we provide a lower bound showing that, up to
logarithmic factors, our algorithm is information-theoretically optimal both in
terms of sample complexity and time complexity.
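The failure mode the abstract describes, and one simple remedy, can be sketched on a one-dimensional convex problem: averaging worker gradients lets even a small Byzantine fraction hijack SGD, while a robust aggregator such as the coordinate-wise median tolerates any $\alpha < 1/2$. This median-based sketch is a standard illustration of Byzantine-robust aggregation, not the paper's own algorithm; all constants below are invented.

```python
import random

random.seed(3)

# Setting: minimize the convex f(x) = (x - 5)^2 with m workers,
# of which an alpha-fraction are Byzantine.
m, alpha, steps, lr = 10, 0.2, 500, 0.05
byzantine = set(range(int(alpha * m)))

def worker_gradients(x):
    out = []
    for i in range(m):
        if i in byzantine:
            out.append(-1e3)  # adversarial gradient pushing x away
        else:
            out.append(2 * (x - 5) + random.gauss(0, 0.1))  # honest stochastic gradient
    return out

def median(vals):
    return sorted(vals)[len(vals) // 2]

x_mean, x_med = 0.0, 0.0
for _ in range(steps):
    x_mean -= lr * (sum(worker_gradients(x_mean)) / m)  # mean aggregation: hijacked
    x_med -= lr * median(worker_gradients(x_med))       # median aggregation: robust

print(abs(x_med - 5) < 0.5)    # True: robust run reaches the minimizer
print(abs(x_mean - 5) > 10)    # True: averaging is dragged far away
```

The paper's contribution is sharper than this toy: its variant of SGD matches the stated iteration bound and is information-theoretically optimal up to logarithmic factors.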