
    Isoperimetric Inequalities for Real-Valued Functions with Applications to Monotonicity Testing

    We generalize the celebrated isoperimetric inequality of Khot, Minzer, and Safra (SICOMP 2018) for Boolean functions to the case of real-valued functions f: {0,1}^d → ℝ. Our main tool in the proof of the generalized inequality is a new Boolean decomposition that represents every real-valued function f over an arbitrary partially ordered domain as a collection of Boolean functions over the same domain, roughly capturing the distance of f to monotonicity and the structure of its violations of monotonicity. We apply our generalized isoperimetric inequality to improve algorithms for testing monotonicity and approximating the distance to monotonicity for real-valued functions. Our tester for monotonicity has query complexity Õ(min(r√d, d)), where r is the size of the image of the input function. (The best previously known tester makes O(d) queries, as shown by Chakrabarty and Seshadhri (STOC 2013).) Our tester is nonadaptive and has 1-sided error. We prove a matching lower bound for nonadaptive, 1-sided error testers for monotonicity. We also show that the distance to monotonicity of real-valued functions that are ε-far from monotone can be approximated nonadaptively within a factor of O(√(d log d)) with query complexity polynomial in 1/ε and the dimension d. This query complexity is known to be nearly optimal for nonadaptive algorithms even for the special case of Boolean functions. (The best previously known distance approximation algorithm for real-valued functions, by Fattal and Ron (TALG 2010), achieves an O(d log r)-approximation.)
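
    For readers new to the model: here is a minimal sketch of the classic edge tester, the nonadaptive, 1-sided-error baseline such results build on (the paper's Õ(min(r√d, d))-query algorithm is more involved). The function f, the dimension d, and the query budget are illustrative inputs; the tester queries random hypercube edges and rejects only on a witnessed violation, so it never rejects a monotone function.

    ```python
    import random

    def edge_tester(f, d, num_queries):
        """Classic nonadaptive, 1-sided-error edge tester for f: {0,1}^d -> R.
        A baseline sketch, not the improved tester from the abstract."""
        for _ in range(num_queries):
            x = [random.randint(0, 1) for _ in range(d)]
            i = random.randrange(d)          # pick a random edge direction
            lo, hi = list(x), list(x)
            lo[i], hi[i] = 0, 1              # endpoints of the i-th edge through x
            if f(tuple(lo)) > f(tuple(hi)):  # lo precedes hi in the hypercube order
                return "reject"              # witnessed violation: 1-sided error
        return "accept"

    # Example: sum(x) is monotone, so the tester always accepts it.
    print(edge_tester(lambda x: sum(x), d=8, num_queries=200))
    ```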

    Improved Monotonicity Testers via Hypercube Embeddings


    Adaptive Boolean Monotonicity Testing in Total Influence Time

    Testing monotonicity of a Boolean function f: {0,1}^n → {0,1} is an important problem in the field of property testing. It has led to connections with many interesting combinatorial questions on the directed hypercube: routing, random walks, and new isoperimetric theorems. Denoting the proximity parameter by ε, the best tester is the nonadaptive Õ(ε^{-2}√n)-query tester of Khot-Minzer-Safra (FOCS 2015). A series of recent results by Belovs-Blais (STOC 2016) and Chen-Waingarten-Xie (STOC 2017) have led to Ω̃(n^{1/3}) lower bounds for adaptive testers. Reducing this gap is a significant question that touches on the role of adaptivity in monotonicity testing of Boolean functions. We approach this question from the perspective of parametrized property testing, a concept recently introduced by Pallavoor-Raskhodnikova-Varma (ACM TOCT 2017), where one seeks to understand the performance of testers with respect to parameters other than just the input size. Our result is an adaptive monotonicity tester with one-sided error whose query complexity is O(ε^{-2} I(f) log^5 n), where I(f) is the total influence of the function. Therefore, adaptivity provably helps monotonicity testing for low-influence functions.
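
    To make the parameter in the bound concrete: the total influence is I(f) = Σ_i Pr_x[f(x) ≠ f(x^{⊕i})], where x^{⊕i} flips the i-th bit of x. Below is a hedged Monte Carlo sketch that estimates I(f) by sampling random edges. It illustrates the quantity that governs the tester's query complexity, not the tester itself; f, n, and the sample count are illustrative inputs.

    ```python
    import random

    def estimate_total_influence(f, n, samples=10000):
        """Monte Carlo estimate of I(f) = sum_i Pr_x[f(x) != f(x with bit i flipped)].
        Uses the identity I(f) = n * Pr_{x,i}[f(x) != f(x^{flip i})]."""
        hits = 0
        for _ in range(samples):
            x = [random.randint(0, 1) for _ in range(n)]
            i = random.randrange(n)
            y = list(x)
            y[i] ^= 1                        # flip one uniformly random coordinate
            if f(tuple(x)) != f(tuple(y)):
                hits += 1
        return n * hits / samples

    # Example: the dictator function x -> x_0 has total influence 1.
    print(estimate_total_influence(lambda x: x[0], n=10))
    ```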

    Monotonicity Testing for Boolean Functions over Graph Products

    We establish a directed analogue of Chung and Tetali's isoperimetric inequality for graph products. We use this inequality to obtain new bounds on the query complexity for testing monotonicity of Boolean-valued functions over products of general posets.

    Nearly Optimal Bounds for Sample-Based Testing and Learning of k-Monotone Functions

    We study monotonicity testing of functions f: {0,1}^d → {0,1} using sample-based algorithms, which are only allowed to observe the value of f on points drawn independently from the uniform distribution. A classic result by Bshouty-Tamon (J. ACM 1996) proved that monotone functions can be learned with exp(O(min{(1/ε)√d, d})) samples, and it is not hard to show that this bound extends to testing. Prior to our work, the only lower bound for this problem was Ω(√(exp(d)/ε)) in the small-ε parameter regime, when ε = O(d^{-3/2}), due to Goldreich-Goldwasser-Lehman-Ron-Samorodnitsky (Combinatorica 2000). Thus, the sample complexity of monotonicity testing was wide open for ε ≫ d^{-3/2}. We resolve this question, obtaining a tight lower bound of exp(Ω(min{(1/ε)√d, d})) for all ε at most a sufficiently small constant. In fact, we prove a much more general result, showing that the sample complexity of k-monotonicity testing and learning for functions f: {0,1}^d → [r] is exp(Θ(min{(rk/ε)√d, d})). For testing with one-sided error we show that the sample complexity is exp(Θ(d)). Beyond the hypercube, we prove nearly tight bounds (up to polylog factors of d, k, r, 1/ε in the exponent) of exp(Θ̃(min{(rk/ε)√d, d})) on the sample complexity of testing and learning measurable k-monotone functions f: ℝ^d → [r] under product distributions. Our upper bound improves upon the previous bound of exp(Õ(min{(k/ε²)√d, d})) by Harms-Yoshida (ICALP 2022) for Boolean functions (r = 2).
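
    For intuition about the objects being tested: under one standard definition, f: {0,1}^d → {0,1} is k-monotone if there is no increasing chain of k+1 points on which f alternates starting from the value 1; k = 1 recovers ordinary monotonicity, and the abstract's [r]-valued and ℝ^d settings generalize this. The brute-force checker below (exponential in d, illustration only, with hypothetical inputs f, d, k) implements that definition directly.

    ```python
    from itertools import product

    def is_k_monotone(f, d, k):
        """Brute-force check of one standard definition: f is k-monotone if
        there is no chain x1 < x2 < ... < x_{k+1} on which f alternates
        starting from f(x1) = 1. Exponential in d; illustration only."""
        points = list(product([0, 1], repeat=d))
        below = lambda x, y: all(a <= b for a, b in zip(x, y)) and x != y

        def extend(chain):
            # chain already alternates and starts at value 1
            if len(chain) == k + 1:
                return True          # forbidden alternating chain found
            last = chain[-1]
            want = 1 - f(last)       # next value must alternate
            return any(extend(chain + [y])
                       for y in points if below(last, y) and f(y) == want)

        return not any(extend([x]) for x in points if f(x) == 1)

    # Majority on 3 bits is monotone, hence 1-monotone.
    maj3 = lambda x: int(sum(x) >= 2)
    print(is_k_monotone(maj3, d=3, k=1))  # True
    ```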