We study the finite sample behavior of Lasso-based inference methods such as
post double Lasso and debiased Lasso. We show that these methods can exhibit
substantial omitted variable biases (OVBs) due to Lasso not selecting relevant
controls. This phenomenon can occur even when the coefficients are sparse and
the sample size is large, and even larger than the number of controls. Therefore,
relying on the existing asymptotic inference theory can be problematic in
empirical applications. We compare the Lasso-based inference methods to modern
high-dimensional OLS-based methods and provide practical guidance.
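
To fix ideas, post double Lasso (Belloni, Chernozhukov, and Hansen) first runs a Lasso of the outcome on the controls and a Lasso of the treatment on the controls, then runs OLS of the outcome on the treatment and the union of selected controls. The following is a minimal sketch on simulated data, not the paper's implementation; all variable names, the data-generating process, and the use of cross-validated Lasso are our illustrative choices:

```python
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))                 # controls
d = 0.5 * X[:, 0] + rng.standard_normal(n)      # treatment, confounded by X[:, 0]
y = 1.0 * d + 0.5 * X[:, 0] + rng.standard_normal(n)  # true treatment effect = 1.0

# Step 1: Lasso of y on the controls; Step 2: Lasso of d on the controls.
sel_y = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)
sel_d = np.flatnonzero(LassoCV(cv=5).fit(X, d).coef_)

# Step 3: OLS of y on d and the union of the selected controls.
# If Lasso drops a relevant confounder (e.g. X[:, 0]) in both steps,
# this OLS suffers omitted variable bias -- the phenomenon studied here.
union = np.union1d(sel_y, sel_d)
Z = np.column_stack([d, X[:, union]]) if union.size else d.reshape(-1, 1)
theta_hat = LinearRegression().fit(Z, y).coef_[0]
print(theta_hat)
```

When the confounder is selected in either Lasso step, the final OLS estimate is close to the true effect; the bias documented above arises precisely when neither step selects it.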