The knockoff filter of Barber and Candès (arXiv:1404.5609) is a flexible
framework for multiple testing in supervised learning models, based on
introducing synthetic predictor variables to control the false discovery rate
(FDR). Using the conditional calibration framework of Fithian and Lei
(arXiv:2007.10438), we introduce the calibrated knockoff procedure, a method
that uniformly improves the power of any knockoff procedure. We implement our
method for fixed-X knockoffs and show theoretically and empirically that the
improvement is especially notable in two contexts where knockoff methods can be
nearly powerless: when the rejection set is small, and when the structure of
the design matrix prevents us from constructing good knockoff variables. In
these contexts, calibrated knockoffs even outperform competing FDR-controlling
methods like the (dependence-adjusted) Benjamini-Hochberg procedure in many
scenarios.
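
To make the setup concrete, below is a minimal sketch (not from the paper) of the plain fixed-X knockoff filter that the calibrated procedure builds on. It assumes the equicorrelated knockoff construction, a simple marginal-correlation statistic, and the knockoff+ threshold; the paper's conditional-calibration step is not implemented here.

```python
import numpy as np

def equi_knockoffs(X):
    # Equicorrelated fixed-X knockoffs (Barber & Candes 2015); requires
    # n >= 2p and columns of X normalized to unit l2 norm.
    n, p = X.shape
    G = X.T @ X
    s = min(1.0, 2.0 * np.linalg.eigvalsh(G)[0]) * np.ones(p)
    G_inv = np.linalg.inv(G)
    S = np.diag(s)
    # C satisfies C.T @ C = 2*S - S @ G_inv @ S (jitter for stability).
    C = np.linalg.cholesky(2 * S - S @ G_inv @ S + 1e-10 * np.eye(p)).T
    # Orthonormal basis orthogonal to col(X), via QR of [X | random block].
    R = np.random.default_rng(0).standard_normal((n, p))
    Q, _ = np.linalg.qr(np.hstack([X, R]))
    return X @ (np.eye(p) - G_inv @ S) + Q[:, p:2 * p] @ C

def knockoff_filter(X, y, q=0.1):
    # Knockoff+ filter with the marginal-correlation statistic
    # W_j = |X_j' y| - |Xk_j' y|.
    Xk = equi_knockoffs(X)
    W = np.abs(X.T @ y) - np.abs(Xk.T @ y)
    for t in np.sort(np.abs(W[W != 0])):      # candidate thresholds
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:                      # knockoff+ FDR estimate
            return np.where(W >= t)[0]
    return np.array([], dtype=int)

# Toy usage: 200 observations, 50 predictors, 10 true signals.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 50))
X /= np.linalg.norm(X, axis=0)                # unit-norm columns
beta = np.zeros(50)
beta[:10] = 3.0
y = X @ beta + rng.standard_normal(200)
print(knockoff_filter(X, y, q=0.2))
```

The marginal-correlation statistic is chosen only for brevity; lasso-based statistics are typically more powerful, and the equicorrelated construction is exactly the setting where, per the abstract, a poorly conditioned design can leave knockoffs nearly powerless.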