
    Optimal Unateness Testers for Real-Valued Functions: Adaptivity Helps

    We study the problem of testing unateness of functions $f\colon \{0,1\}^d \to \mathbb{R}$. We give an $O((d/\epsilon) \cdot \log(d/\epsilon))$-query nonadaptive tester and an $O(d/\epsilon)$-query adaptive tester and show that both testers are optimal for a fixed distance parameter $\epsilon$. Previously known unateness testers worked only for Boolean functions, and their query complexity had worse dependence on the dimension in both the adaptive and the nonadaptive case. Moreover, no lower bounds for testing unateness were known. We generalize our results to obtain optimal unateness testers for functions $f\colon [n]^d \to \mathbb{R}$. Our results establish that adaptivity helps with testing unateness of real-valued functions on domains of the form $\{0,1\}^d$ and, more generally, $[n]^d$. This stands in contrast to the situation for monotonicity testing, where there is no adaptivity gap for functions $f\colon [n]^d \to \mathbb{R}$.
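    For intuition, the sketch below implements a bare-bones edge tester for unateness: sample random hypercube edges, record the strict direction of change along each sampled coordinate, and reject once some coordinate is seen both to increase and to decrease. This is only an illustration of the property being tested; the function name, query budget, and sampling scheme are our choices, and it is neither the paper's optimal adaptive algorithm nor calibrated to the distance parameter $\epsilon$.

```python
import random

def unateness_edge_tester(f, d, num_queries=1000, rng=random):
    """Simplified edge tester for unateness of f: {0,1}^d -> R.

    A function is unate if, in every coordinate, it is either
    nondecreasing or nonincreasing. We sample random hypercube edges
    and reject iff some coordinate exhibits both a strict increase and
    a strict decrease, which certifies a violation. (Illustrative
    sketch only: the query budget here is heuristic.)
    """
    seen = {}  # coordinate -> set of observed strict directions
    for _ in range(num_queries):
        x = [rng.randrange(2) for _ in range(d)]
        i = rng.randrange(d)
        y = list(x)
        y[i] ^= 1
        lo, hi = (x, y) if x[i] == 0 else (y, x)  # lo has bit i = 0
        diff = f(tuple(hi)) - f(tuple(lo))
        if diff != 0:
            dirs = seen.setdefault(i, set())
            dirs.add(1 if diff > 0 else -1)
            if len(dirs) == 2:  # coordinate i goes both up and down
                return False  # certified violation of unateness
    return True  # consistent with unateness on all sampled edges

# Example: f(x) = x_0 - 2*x_1 is unate (increasing in x_0, decreasing in x_1).
print(unateness_edge_tester(lambda x: x[0] - 2 * x[1], d=2))  # True
```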

    Local Lipschitz Filters for Bounded-Range Functions with Applications to Arbitrary Real-Valued Functions

    We study local filters for the Lipschitz property of real-valued functions $f\colon V \to [0,r]$, where the Lipschitz property is defined with respect to an arbitrary undirected graph $G=(V,E)$. We give nearly optimal local Lipschitz filters both with respect to $\ell_1$-distance and $\ell_0$-distance. Previous work considered only unbounded-range functions over $[n]^d$. Jha and Raskhodnikova (SICOMP '13) gave an algorithm for such functions with lookup complexity exponential in $d$, which Awasthi et al. (ACM Trans. Comput. Theory) showed was necessary in this setting. We demonstrate that important applications of local Lipschitz filters can be accomplished with filters for bounded-range functions. For functions $f\colon [n]^d \to [0,r]$, we circumvent the lower bound and achieve running time $(d^r \log n)^{O(\log r)}$ for the $\ell_1$-respecting filter and $d^{O(r)} \operatorname{polylog} n$ for the $\ell_0$-respecting filter. Our local filters provide a novel Lipschitz extension that can be implemented locally. Furthermore, we show that our algorithms have nearly optimal dependence on $r$ for the domain $\{0,1\}^d$. In addition, our lower bound resolves an open question of Awasthi et al., removing one of the conditions necessary for their lower bound for general range. We prove our lower bound via a reduction from distribution-free Lipschitz testing and a new technique for proving hardness for adaptive algorithms. We provide two applications of our local filters to arbitrary real-valued functions. In the first application, we use them in conjunction with the Laplace mechanism for differential privacy and noisy binary search to provide mechanisms for privately releasing outputs of black-box functions, even in the presence of malicious clients. In the second application, we use our local filters to obtain the first nontrivial tolerant tester for the Lipschitz property.
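    Why does a bounded range buy locality? With values in $[0,r]$, the classical McShane extension $g(x) = \min_y\,(f(y) + d(x,y))$ is determined by lookups within distance $r$ of $x$, since farther terms already exceed $f(x) \le r$. The sketch below is our illustration of this observation on $\{0,1\}^d$ with integer $r$, not the paper's filter (which gives the $\ell_1$- and $\ell_0$-respecting guarantees and handles adaptivity): it returns a value of a 1-Lipschitz function that agrees with $f$ whenever $f$ is already 1-Lipschitz, using $d^{O(r)}$ lookups.

```python
from itertools import combinations

def mcshane_local_filter(lookup, x, r):
    """Local Lipschitz 'filter' for f: {0,1}^d -> [0, r] (integer r)
    via the McShane extension g(x) = min_y (f(y) + dist(x, y)).

    Since f(y) >= 0 and f(x) <= r, any y at Hamming distance >= r
    contributes f(y) + dist >= r >= f(x), so the minimum is attained
    inside the radius-(r-1) ball: only d^{O(r)} lookups are needed.
    g is always 1-Lipschitz, and g == f whenever f already is.
    """
    d = len(x)
    best = min(lookup(x), r)  # clamp to the promised range [0, r]
    for k in range(1, r):  # enumerate the Hamming ball of radius r-1
        for flips in combinations(range(d), k):
            y = list(x)
            for i in flips:
                y[i] ^= 1
            best = min(best, min(lookup(tuple(y)), r) + k)
    return best

# Example: a non-Lipschitz spike gets smoothed to a 1-Lipschitz value.
f = lambda x: 3 if x == (0, 0, 0) else 0  # violates Lipschitz at the origin
print(mcshane_local_filter(f, (0, 0, 0), r=3))  # 1, not 3: neighbors pull it down
```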

    Differentially Private Empirical Risk Minimization with Sparsity-Inducing Norms

    Differentially private learning aims to preserve prediction quality while bounding the privacy impact on individuals whose information is contained in the data. We consider differentially private risk minimization problems with regularizers that induce structured sparsity. These regularizers are known to be convex, but they are often non-differentiable. We analyze standard differentially private algorithms, such as output perturbation, Frank-Wolfe, and objective perturbation. Output perturbation is a differentially private algorithm known to perform well for minimizing risks that are strongly convex; previous works have derived excess risk bounds for it that are independent of the dimensionality. In this paper, we assume a particular class of convex but non-smooth regularizers that induce structured sparsity, together with loss functions for generalized linear models. We also consider differentially private Frank-Wolfe algorithms that optimize the dual of the risk minimization problem. We derive excess risk bounds for both of these algorithms; both bounds depend on the Gaussian width of the unit ball of the dual norm. We also show that objective perturbation of the risk minimization problem is equivalent to output perturbation of a dual optimization problem. This is the first work to analyze the dual optimization problems of risk minimization in the context of differential privacy.
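    To make output perturbation concrete, here is a minimal sketch of the standard mechanism on a strongly convex instance: L2-regularized logistic regression, the classical $2/(n\lambda)$ sensitivity bound of Chaudhuri et al. for $\lambda$-strongly convex objectives with 1-Lipschitz losses, and the Gaussian mechanism. The function name and hyperparameters are our choices; this is the generic recipe, not the paper's analysis for sparsity-inducing norms.

```python
import numpy as np

def private_logreg_output_perturbation(X, y, lam, eps, delta, rng=None):
    """Output perturbation for L2-regularized logistic regression.

    Trains w* = argmin (1/n) sum_i log(1 + exp(-y_i <w, x_i>)) + (lam/2)||w||^2
    and releases w* plus Gaussian noise. For ||x_i|| <= 1 the loss is
    1-Lipschitz, so the L2 sensitivity of w* is 2/(n*lam) (Chaudhuri et al.);
    the Gaussian mechanism then yields (eps, delta)-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape

    # Plain gradient descent on the strongly convex objective (not tuned).
    w = np.zeros(d)
    for _ in range(500):
        margins = y * (X @ w)
        grad = -(X * (y / (1 + np.exp(margins)))[:, None]).mean(axis=0) + lam * w
        w -= 0.5 * grad

    sensitivity = 2.0 / (n * lam)
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    return w + rng.normal(0.0, sigma, size=d)

# Toy usage: features scaled to ||x_i|| <= 1, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X /= np.maximum(1, np.linalg.norm(X, axis=1))[:, None]
y = np.sign(X @ np.ones(5) + 1e-9)
print(private_logreg_output_perturbation(X, y, lam=0.1, eps=1.0, delta=1e-5, rng=rng))
```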

    Parameterized Property Testing of Functions

    We investigate the parameters in terms of which the complexity of sublinear-time algorithms should be expressed. Our goal is to find input parameters that are tailored to the combinatorics of the specific problem being studied and to design algorithms that run faster when these parameters are small. This direction enables us to surpass the (worst-case) lower bounds, expressed in terms of the input size, for several problems. Our aim is to develop an understanding of the complexity of sublinear-time algorithms comparable to the one that research in parameterized complexity has enabled for classical algorithms. Specifically, we focus on testing properties of functions. By parameterizing the query complexity in terms of the size $r$ of the image of the input function, we obtain testers for monotonicity and convexity of functions of the form $f\colon [n] \to \mathbb{R}$ with query complexity $O(\log r)$, with no dependence on $n$. The result for monotonicity circumvents the $\Omega(\log n)$ lower bound by Fischer (Inf. Comput., 2004) for this problem. We present several other parameterized testers, providing compelling evidence that expressing the query complexity of property testers in terms of the input size is not always the best choice.
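    For context, the sketch below implements the classical $O(\log n)$-query binary-search spot-checker for monotonicity of $f\colon [n] \to \mathbb{R}$ (in the style of Ergün et al.), the regime to which Fischer's $\Omega(\log n)$ lower bound applies; the paper's parameterized tester, which replaces $\log n$ by $\log r$, is not reproduced here. The function name and repetition count are our choices.

```python
import random

def monotonicity_tester(f, n, eps=0.1, rng=random):
    """Classic binary-search monotonicity spot-checker for f: [n] -> R.

    Ties are broken by index, so f is nondecreasing iff the keys
    (f(j), j) are strictly increasing. Each round picks a uniform i and
    binary-searches for its key as if the keys were sorted; if f is
    nondecreasing, every search succeeds (one-sided error), while
    functions far from monotone make many searches go astray.
    O(log n) queries per round.
    """
    for _ in range(int(2 / eps) + 1):
        i = rng.randrange(n)
        key = (f(i), i)
        lo, hi, found = 0, n - 1, False
        while lo <= hi:
            mid = (lo + hi) // 2
            mid_key = (f(mid), mid)
            if mid_key == key:
                found = True
                break
            if mid_key < key:
                lo = mid + 1
            else:
                hi = mid - 1
        if not found:
            return False  # the search missed i: a witness against monotonicity
    return True

# A sorted array always passes; a reversed one is rejected with high probability.
a = list(range(100))
print(monotonicity_tester(lambda j: a[j], 100))       # True
print(monotonicity_tester(lambda j: a[99 - j], 100))  # False (w.h.p.)
```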