Nearly Optimal Bounds for Sample-Based Testing and Learning of k-Monotone Functions
We study monotonicity testing of functions f : {0,1}^d -> {0,1} using sample-based algorithms, which are only allowed to observe the value of f on points drawn independently from the uniform distribution. A classic result by Bshouty-Tamon (J. ACM 1996) proved that monotone functions can be learned with exp(O(min(sqrt(d)/epsilon, d))) samples, and it
is not hard to show that this bound extends to testing. Prior to our work, the only lower bound for this problem held in the small parameter regime, when epsilon = Theta(d^{-3/2}), due to Goldreich-Goldwasser-Lehman-Ron-Samorodnitsky (Combinatorica 2000). Thus, the sample complexity of monotonicity testing was wide open for epsilon >> d^{-3/2}. We resolve this question, obtaining a tight lower bound of exp(Omega(min(sqrt(d)/epsilon, d))) for all epsilon
at most a sufficiently small constant. In fact, we prove a much more general result, showing that the sample complexity of k-monotonicity testing and learning for functions f : {0,1}^d -> [r] is exp(Theta(min(sqrt(d) rk/epsilon, d))). For testing with one-sided error we show that the sample complexity is exp(Theta(d)).
Beyond the hypercube, we prove nearly tight bounds (up to polylog factors of d, k, 1/epsilon in the exponent) of exp(Theta~(min(sqrt(d) rk/epsilon, d))) on the sample complexity of testing and learning measurable k-monotone functions under product distributions. Our upper bound improves upon the previous bound of exp(O~(min(sqrt(d) k/epsilon^2, d))) by Harms-Yoshida (ICALP 2022) for Boolean functions (r = 2).
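As a toy illustration of the sample-based model discussed above (not the paper's algorithms), the sketch below shows, for tiny d, a tester that observes f only on uniformly random points and accepts iff the sample is consistent with some monotone function. All function names here are hypothetical.

```python
import random

def sample_based_monotonicity_tester(f, d, num_samples):
    """Sample-based model: the tester sees f only on uniform random points.
    Accept iff the sampled labeled points are consistent with some monotone
    function, i.e. no sampled pair x <= y (coordinatewise) has f(x) > f(y).
    One-sided: a monotone f is always accepted."""
    points = {tuple(random.randint(0, 1) for _ in range(d))
              for _ in range(num_samples)}
    labels = {p: f(p) for p in points}
    below = lambda x, y: all(a <= b for a, b in zip(x, y))
    return all(not (below(x, y) and labels[x] > labels[y])
               for x in labels for y in labels)
```

Because the check only ever rejects on a witnessed violation, a monotone function is accepted with certainty, matching the one-sided-error setting mentioned in the abstract.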
Adaptive Lower Bound for Testing Monotonicity on the Line
In the property testing model, the task is to distinguish objects possessing some property from objects that are far from it. One such property is monotonicity, where the objects are functions from one poset to another. This is an active area of research. In this paper we study the query complexity of epsilon-testing monotonicity of a function f : [n] -> [r]. All our lower bounds are for adaptive two-sided testers.
- We prove a nearly tight lower bound for this problem in terms of r. The bound is Omega((log r)/(log log r)) when epsilon = 1/2. No previous satisfactory lower bound in terms of r was known.
- We completely characterise the query complexity of this problem in terms of n for smaller values of epsilon. The complexity is Theta(epsilon^{-1} log(epsilon n)). Apart from giving the lower bound, this improves on the best known upper bound.
Finally, we give an alternative proof of the Omega(epsilon^{-1} d log n - epsilon^{-1} log epsilon^{-1}) lower bound for testing monotonicity on the hypergrid [n]^d due to Chakrabarty and Seshadhri (RANDOM 2013).
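For intuition about the object being tested: the distance of f : [n] -> [r] to monotonicity is the fraction of values that must change, which equals n minus the length of the longest non-decreasing subsequence, divided by n. A minimal sketch of that exact computation (illustrative only; it is not one of the testers above):

```python
import bisect

def dist_to_monotone(vals):
    """Distance of f : [n] -> [r], given as a list of values, to monotone:
    (n - length of longest non-decreasing subsequence) / n."""
    tails = []  # patience sorting; bisect_right admits ties (non-decreasing)
    for v in vals:
        i = bisect.bisect_right(tails, v)
        if i == len(tails):
            tails.append(v)
        else:
            tails[i] = v
    return (len(vals) - len(tails)) / len(vals)
```

A tester then only needs to distinguish distance 0 from distance greater than epsilon, which is what allows sublinear query complexity.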
Optimal Unateness Testers for Real-Valued Functions: Adaptivity Helps
We study the problem of testing unateness of functions f : {0,1}^d -> R. We give an O((d/epsilon) log(d/epsilon))-query nonadaptive tester and an O(d/epsilon)-query adaptive tester and show that both testers are optimal for a fixed distance parameter epsilon. Previously known unateness testers worked only for Boolean functions, and their query complexity had worse dependence on the dimension both in the adaptive and in the nonadaptive case. Moreover, no lower bounds for testing unateness were known. We generalize our results to obtain optimal unateness testers for functions f : [n]^d -> R.
Our results establish that adaptivity helps with testing unateness of real-valued functions on domains of the form {0,1}^d and, more generally, [n]^d. This stands in contrast to the situation for monotonicity testing, where there is no adaptivity gap for functions f : [n]^d -> R.
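For small d, unateness can be checked exactly by brute force: f is unate iff it is monotone after flipping some subset of coordinates, equivalently, iff no coordinate has both a strictly increasing and a strictly decreasing edge. A minimal sketch of that definition (the testers above query far fewer points):

```python
import itertools

def is_unate(f, d):
    """f : {0,1}^d -> R is unate iff for every coordinate i, the hypercube
    edges in direction i are all non-decreasing or all non-increasing."""
    for i in range(d):
        saw_increase = saw_decrease = False
        for x in itertools.product((0, 1), repeat=d):
            if x[i] == 0:
                y = x[:i] + (1,) + x[i + 1:]
                if f(x) < f(y):
                    saw_increase = True
                if f(x) > f(y):
                    saw_decrease = True
        if saw_increase and saw_decrease:
            return False
    return True
```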
Parameterized Property Testing of Functions
We investigate the parameters in terms of which the complexity of sublinear-time algorithms should be expressed. Our goal is to find input parameters that are tailored to the combinatorics of the specific problem being studied and design algorithms that run faster when these parameters are small. This direction enables us to surpass the (worst-case) lower bounds, expressed in terms of the input size, for several problems. Our aim is to develop a similar level of understanding of the complexity of sublinear-time algorithms to the one that was enabled by research in parameterized complexity for classical algorithms.
Specifically, we focus on testing properties of functions. By parameterizing the query complexity in terms of the size r of the image of the input function, we obtain testers for monotonicity and convexity of functions of the form f : [n] -> R with query complexity O(log r), with no dependence on n. The result for monotonicity circumvents the Omega(log n) lower bound by Fischer (Inf. Comput., 2004) for this problem. We present several other parameterized testers, providing compelling evidence that expressing the query complexity of property testers in terms of the input size is not always the best choice.
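For reference, the two properties being parameterized, stated for f : [n] -> R given as a list of values: monotone means the values are non-decreasing, and convex means the successive differences are non-decreasing. A sketch of the exact linear-time checks (definitions only, not the O(log r)-query testers):

```python
def is_monotone_line(vals):
    """f : [n] -> R is monotone iff its values are non-decreasing."""
    return all(a <= b for a, b in zip(vals, vals[1:]))

def is_convex_line(vals):
    """f : [n] -> R is (discretely) convex iff the successive
    differences f(i+1) - f(i) are non-decreasing."""
    diffs = [b - a for a, b in zip(vals, vals[1:])]
    return all(x <= y for x, y in zip(diffs, diffs[1:]))
```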
Isoperimetric Inequalities for Real-Valued Functions with Applications to Monotonicity Testing
We generalize the celebrated isoperimetric inequality of Khot, Minzer, and Safra (SICOMP 2018) for Boolean functions to the case of real-valued functions f : {0,1}^d -> R. Our main tool in the proof of the generalized inequality is a new Boolean decomposition that represents every real-valued function f over an arbitrary partially ordered domain as a collection of Boolean functions over the same domain, roughly capturing the distance of f to monotonicity and the structure of violations of f to monotonicity.
We apply our generalized isoperimetric inequality to improve algorithms for testing monotonicity and approximating the distance to monotonicity for real-valued functions. Our tester for monotonicity has query complexity O~(min(r sqrt(d), d)), where r is the size of the image of the input function. (The best previously known tester makes O(d) queries, as shown by Chakrabarty and Seshadhri (STOC 2013).) Our tester is nonadaptive and has 1-sided error. We prove a matching lower bound for nonadaptive, 1-sided error testers for monotonicity. We also show that the distance to monotonicity of real-valued functions that are epsilon-far from monotone can be approximated nonadaptively within a factor of O(sqrt(d log d)) with query complexity polynomial in 1/epsilon and the dimension d. This query complexity is known to be nearly optimal for nonadaptive algorithms even for the special case of Boolean functions. (The best previously known distance approximation algorithm for real-valued functions, by Fattal and Ron (TALG 2010), achieves O(d log r)-approximation.)
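For intuition, the basic quantity such isoperimetric inequalities control can be enumerated by brute force for tiny d: a violated edge of f : {0,1}^d -> R is a hypercube edge (x, y) with x below y in one coordinate but f(x) > f(y). A minimal sketch (illustrative only; it does not implement the Boolean decomposition from the paper):

```python
import itertools

def violated_edges(f, d):
    """All hypercube edges (x, y) with x < y in one coordinate
    but f(x) > f(y); empty iff f is monotone."""
    bad = []
    for x in itertools.product((0, 1), repeat=d):
        for i in range(d):
            if x[i] == 0:
                y = x[:i] + (1,) + x[i + 1:]
                if f(x) > f(y):
                    bad.append((x, y))
    return bad
```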
Testing Hereditary Properties of Sequences
A hereditary property of a sequence is one that is preserved when restricting to subsequences. We show that there exist hereditary properties of sequences that cannot be tested with sublinear queries, resolving an open question posed by Newman et al. This proof relies crucially on an infinite alphabet, however; for finite alphabets, we observe that any hereditary property can be tested with a constant number of queries.
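For short sequences, heredity of a property on a given input can be checked by brute force over all subsequences. A minimal sketch with hypothetical helper names; sortedness serves as an example of a hereditary property:

```python
from itertools import combinations

def holds_hereditarily(prop, seq):
    """True iff prop holds for seq and for every subsequence of seq."""
    n = len(seq)
    return all(prop([seq[i] for i in idx])
               for k in range(n + 1)
               for idx in combinations(range(n), k))

# Example hereditary property: every subsequence of a sorted sequence is sorted.
is_sorted = lambda s: all(a <= b for a, b in zip(s, s[1:]))
```

For a genuinely hereditary property like sortedness, this check reduces to checking the full sequence, which is exactly what heredity means.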