Drawing statistical inferences from large datasets in a model-robust way is
an important problem in statistics and data science. In this paper, we propose
methods for statistical inference in linear regression that are robust to large
and unequal noise levels across observational units (i.e., heteroskedasticity).
We leverage the Hadamard estimator, which is unbiased for the variances of the
ordinary least-squares coefficient estimates. This is in contrast to White's
popular sandwich estimator, which can be substantially biased in high
dimensions. We propose to estimate the signal strength, noise level,
signal-to-noise ratio, and mean squared error via the Hadamard estimator. We
develop a new degrees-of-freedom adjustment that gives more accurate confidence
intervals than variants of White's sandwich estimator. Moreover, we provide
conditions ensuring that the estimator is well-defined by studying a new random
matrix ensemble in which the entries of a random orthogonal projection matrix
are squared. We also show approximate normality using the second-order
Poincaré inequality. Our work provides improved statistical theory and methods
for linear regression in high dimensions.
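
As a brief illustration, the following display is a minimal sketch of the
construction referred to above, under the standard heteroskedastic linear model
and our reading of the abstract; the notation ($H$ for the hat matrix,
$M = I_n - H$, $\circ$ for the entrywise product, $\hat\varepsilon^{\circ 2}$
for the entrywise square of the residual vector) is introduced here only for
this sketch, and the paper's exact formulation may differ.
\[
  y = X\beta + \varepsilon, \qquad
  \operatorname{Cov}(\varepsilon) = \operatorname{diag}(\sigma_1^2, \dots, \sigma_n^2),
  \qquad
  \hat\varepsilon = (I_n - H)\, y = M\varepsilon,
\]
where $H = X (X^\top X)^{-1} X^\top$ and $M = I_n - H$. Taking expectations of
the squared residuals entrywise,
\[
  \mathbb{E}\bigl[\hat\varepsilon_i^{\,2}\bigr]
  = \sum_{j=1}^n M_{ij}^2 \, \sigma_j^2,
  \qquad \text{i.e.} \qquad
  \mathbb{E}\bigl[\hat\varepsilon^{\circ 2}\bigr] = (M \circ M)\, \sigma^2,
\]
so that, whenever $M \circ M$ is invertible, the plug-in estimates
\[
  \hat\sigma^2 = (M \circ M)^{-1} \hat\varepsilon^{\circ 2},
  \qquad
  \widehat{\operatorname{Cov}}(\hat\beta)
  = (X^\top X)^{-1} X^\top \operatorname{diag}(\hat\sigma^2)\, X\, (X^\top X)^{-1},
\]
are unbiased for the per-observation noise variances and for
$\operatorname{Cov}(\hat\beta)$, conditionally on $X$. Since $M$ is an
orthogonal projection matrix, $M \circ M$ has the form of the squared-projection
ensemble mentioned above, and its invertibility is the kind of well-definedness
condition the abstract refers to.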