The Chernoff bound is a fundamental tool in theoretical computer science. It has
been used extensively in randomized algorithm design and stochastic analysis.
Discrepancy theory, which deals with finding a bi-coloring of a set system such
that the coloring of each set is balanced, has a huge number of applications in
the design of approximation algorithms.
applications in approximation algorithms design. Chernoff bound [Che52] implies
that a random bi-coloring of any set system with n sets and n elements will
have discrepancy O(nlogn) with high probability, while the famous
result by Spencer [Spe85] shows that there exists an O(n) discrepancy
solution.
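As a quick sketch of this standard argument (the constants below are
illustrative, not taken from a specific source): for a set system
$S_1, \dots, S_n \subseteq [n]$ and a uniform random coloring
$x \in \{\pm 1\}^n$, the Chernoff--Hoeffding bound gives, for each set $S_i$,
\[
\Pr\Big[\Big|\sum_{j \in S_i} x_j\Big| > t\Big]
\le 2\exp\big(-t^2/(2|S_i|)\big)
\le 2\exp\big(-t^2/(2n)\big),
\]
so taking $t = 2\sqrt{n \ln(2n)}$ and a union bound over the $n$ sets shows
that every set has discrepancy $O(\sqrt{n \log n})$ except with probability at
most $1/(2n)$.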
The study of hyperbolic polynomials dates back to the early 20th century, when
they were used by G{\aa}rding to solve PDEs [G{\aa}r59]. In recent years,
further applications have been found in control theory, optimization, real
algebraic geometry, and beyond. In particular, the breakthrough result by
Marcus, Spielman, and Srivastava [MSS15] uses the theory of hyperbolic
polynomials to prove the Kadison-Singer conjecture [KS59], which is closely
related to discrepancy theory.
In this paper, we present a list of new results for hyperbolic polynomials:
* We show two nearly optimal hyperbolic Chernoff bounds: one for Rademacher
sums of arbitrary vectors and another for random vectors in the hyperbolic
cone.
* We show a hyperbolic anti-concentration bound.
* We generalize the hyperbolic Kadison-Singer theorem [Br\"a18] to vectors in
sub-isotropic position, and prove a hyperbolic Spencer theorem for vectors of
any constant hyperbolic rank (the hyperbolic notions used here are recalled
after this list).
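For readers outside the area, we recall the standard definitions behind these
statements (a background recap in our words, not one of the paper's results).
A polynomial $p \in \mathbb{R}[x_1, \dots, x_m]$ is hyperbolic with respect to
a direction $e$ if $p(e) \neq 0$ and, for every $x \in \mathbb{R}^m$, the
univariate restriction
\[
t \mapsto p(te - x)
\]
has only real roots $\lambda_1(x) \ge \cdots \ge \lambda_d(x)$, called the
hyperbolic eigenvalues of $x$. The hyperbolic cone is
$\Gamma_+ = \{x : \lambda_d(x) \ge 0\}$, and the hyperbolic rank of $x$ is its
number of nonzero eigenvalues.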
The classical matrix Chernoff and discrepancy results are based on the
determinant polynomial. To the best of our knowledge, this paper is the first
work to show either concentration or anti-concentration results for hyperbolic
polynomials. We hope our findings provide more insights into hyperbolic
polynomials and discrepancy theory.
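To make this connection concrete (a standard example, stated here for
context): on the space of real symmetric $d \times d$ matrices,
$p(X) = \det(X)$ is hyperbolic with respect to the identity $I$, since
\[
t \mapsto \det(tI - X)
\]
is the characteristic polynomial of $X$ and has only real roots. Its
hyperbolic eigenvalues are the ordinary eigenvalues and its hyperbolic cone is
the positive semidefinite cone, so hyperbolic Chernoff bounds specialize to
matrix Chernoff bounds in this case.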