We study monotonicity testing of functions f:{0,1}^d→{0,1}
using sample-based algorithms, which are only allowed to observe the value of
f on points drawn independently from the uniform distribution. A classic
result by Bshouty-Tamon (J. ACM 1996) proved that monotone functions can be
learned with exp(O(min{(1/ε)√d, d})) samples, and it
is not hard to show that this bound extends to testing. Prior to our work the
only lower bound for this problem was Ω(√(exp(d)/ε)) in
the small-ε parameter regime, when ε = O(d^{-3/2}), due
to Goldreich-Goldwasser-Lehman-Ron-Samorodnitsky (Combinatorica 2000). Thus,
the sample complexity of monotonicity testing was wide open for ε ≫ d^{-3/2}. We resolve this question, obtaining a tight lower bound of
exp(Ω(min{(1/ε)√d, d})) for all ε
at most a sufficiently small constant. In fact, we prove a much more general
result, showing that the sample complexity of k-monotonicity testing and
learning for functions f:{0,1}^d→[r] is
exp(Θ(min{(rk/ε)√d, d})). For testing with
one-sided error we show that the sample complexity is exp(Θ(d)).
Beyond the hypercube, we prove nearly tight bounds (up to polylog factors of
d,k,r,1/ε in the exponent) of
exp(Θ(min{(rk/ε)√d, d})) on the
sample complexity of testing and learning measurable k-monotone functions f:ℝ^d→[r] under product distributions. Our upper bound
improves upon the previous bound of
exp(O(min{(k/ε²)√d, d})) by
Harms-Yoshida (ICALP 2022) for Boolean functions (r=2).
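As a purely illustrative sketch (not part of the paper), the two regimes of the exponent min{(rk/ε)√d, d} can be checked numerically; the helper function sample_exponent below is hypothetical:

```python
import math

def sample_exponent(d, eps, r=2, k=1):
    """Exponent in the exp(Theta(min{(rk/eps)*sqrt(d), d})) sample bound
    for k-monotonicity testing/learning of f:{0,1}^d -> [r]."""
    return min((r * k / eps) * math.sqrt(d), d)

# For monotone Boolean functions (r=2, k=1):
d = 10_000
# Very small eps: (rk/eps)*sqrt(d) exceeds d, so the trivial exp(Theta(d))
# term is the minimum.
assert sample_exponent(d, eps=1e-6) == d
# Constant eps: (2/0.1)*sqrt(10_000) = 2000 < 10_000, so the
# dimension-dependent term (rk/eps)*sqrt(d) dominates the bound.
assert sample_exponent(d, eps=0.1) == 2000.0
```

The crossover between the two regimes occurs around ε ≈ rk/√d, which for r=2, k=1 and d=10,000 is ε ≈ 0.02.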