Randomized Smoothing for Stochastic Optimization
We analyze convergence rates of stochastic optimization procedures for
non-smooth convex optimization problems. By combining randomized smoothing
techniques with accelerated gradient methods, we obtain convergence rates of
stochastic optimization procedures, both in expectation and with high
probability, that have optimal dependence on the variance of the gradient
estimates. To the best of our knowledge, these are the first variance-based
rates for non-smooth optimization. We give several applications of our results
to statistical estimation problems, and provide experimental results that
demonstrate the effectiveness of the proposed algorithms. We also describe how
a combination of our algorithm with recent work on decentralized optimization
yields a distributed stochastic optimization algorithm that is order-optimal.
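The core randomized-smoothing idea behind this abstract can be sketched in a few lines: replace a non-smooth objective f with its Gaussian convolution f_mu(x) = E[f(x + mu*Z)] and descend on an unbiased zeroth-order estimate of its gradient. The objective (an l1 norm), step size, and sample counts below are illustrative assumptions, not the paper's algorithm, which additionally combines the smoothing with accelerated gradient methods.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # A non-smooth convex objective: the l1 norm (illustrative choice).
    return np.abs(x).sum()

def smoothed_grad(x, mu=0.1, n_samples=64):
    """Monte Carlo estimate of the gradient of the Gaussian-smoothed
    objective f_mu(x) = E[f(x + mu*Z)], Z ~ N(0, I), via the
    zeroth-order identity grad f_mu(x) = E[(f(x + mu*Z) - f(x)) * Z] / mu.
    Subtracting f(x) keeps the estimate unbiased (E[Z] = 0) while
    greatly reducing its variance."""
    z = rng.standard_normal((n_samples, x.size))
    diffs = np.array([f(x + mu * zi) - f(x) for zi in z])
    return (diffs[:, None] * z).mean(axis=0) / mu

# Plain stochastic gradient descent on the smoothed surrogate.
x = np.full(5, 3.0)
for _ in range(500):
    x -= 0.05 * smoothed_grad(x)

print(round(f(x), 2))  # final objective, close to the minimum at 0
```

Smoothing trades a small bias (controlled by mu) for a differentiable surrogate whose stochastic gradients have bounded variance, which is what enables the variance-dependent rates the abstract refers to.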
Smoothed Analysis of Dynamic Networks
We generalize the technique of smoothed analysis to distributed algorithms in
dynamic network models. Whereas standard smoothed analysis studies the impact
of small random perturbations of input values on algorithm performance metrics,
dynamic graph smoothed analysis studies the impact of random perturbations of
the underlying changing network graph topologies. Similar to the original
application of smoothed analysis, our goal is to study whether known strong
lower bounds in dynamic network models are robust or fragile: do they withstand
small (random) perturbations, or do such deviations push the graphs far enough
from a precise pathological instance to enable much better performance? Fragile
lower bounds are likely not relevant for real-world deployment, while robust
lower bounds represent a true difficulty caused by dynamic behavior. We apply
this technique to three standard dynamic network problems with known strong
worst-case lower bounds: random walks, flooding, and aggregation. We prove that
these bounds provide a spectrum of robustness when subjected to smoothing:
some are extremely fragile (random walks), some are moderately robust
(flooding), and some are extremely robust (aggregation).
Black-Box Certification with Randomized Smoothing: A Functional Optimization Based Framework
Randomized classifiers have been shown to provide a promising approach for
achieving certified robustness against adversarial attacks in deep learning.
However, most existing methods only leverage Gaussian smoothing noise and only
work for ℓ2 perturbations. We propose a general framework of adversarial
certification with non-Gaussian noise and for more general types of attacks,
from a unified functional optimization perspective. Our new framework allows us
to identify a key trade-off between accuracy and robustness via designing
smoothing distributions, helping to design new families of non-Gaussian
smoothing distributions that work more efficiently in different settings,
including attacks measured in different ℓp norms. Our proposed methods achieve
better certification results than previous works and provide a new perspective
on randomized smoothing certification.
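For context, the baseline Gaussian scheme that this abstract generalizes can be sketched as follows: predict by majority vote of a base classifier under Gaussian noise, and certify an ℓ2 radius R = σ·Φ⁻¹(p), where p is the top-class probability. The linear base classifier, σ, and sample count below are hypothetical, and the empirical vote fraction stands in for a proper statistical confidence bound.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

def base_classifier(x):
    # Toy linear base classifier standing in for a neural network
    # (hypothetical, for illustration only).
    return int(x.sum() > 0)

def smoothed_predict(x, sigma=0.5, n=1000):
    """Majority-vote prediction of the base classifier under Gaussian
    noise, plus a certified l2 radius R = sigma * Phi^{-1}(p_hat)
    computed from the empirical top-class vote fraction."""
    noise = sigma * rng.standard_normal((n, x.size))
    votes = np.array([base_classifier(x + e) for e in noise])
    p_hat = max(votes.mean(), 1 - votes.mean())
    top = int(votes.mean() > 0.5)
    # Clip p_hat away from 1.0 so the inverse CDF stays finite.
    radius = sigma * NormalDist().inv_cdf(min(p_hat, 1 - 1e-9))
    return top, radius

label, r = smoothed_predict(np.array([1.0, 1.0, -0.5]))
print(label, round(r, 3))  # certified label and its l2 radius
```

The accuracy/robustness trade-off the abstract identifies shows up directly here: a larger σ certifies a larger radius for the same vote fraction, but adds noise that lowers the vote fraction itself; non-Gaussian smoothing distributions reshape this trade-off for other attack norms.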