
Analysis of the limiting spectral measure of large random matrices of the separable covariance type

Abstract

Consider the random matrix $\Sigma = D^{1/2} X \widetilde D^{1/2}$, where $D$ and $\widetilde D$ are deterministic Hermitian nonnegative matrices with respective dimensions $N \times N$ and $n \times n$, and where $X$ is a random matrix with independent and identically distributed centered entries of variance $1/n$. Assume that the dimensions $N$ and $n$ grow to infinity at the same pace, and that the spectral measures of $D$ and $\widetilde D$ converge as $N, n \to \infty$ towards two probability measures. It is then known that the spectral measure of $\Sigma\Sigma^*$ converges towards a probability measure $\mu$ characterized by its Stieltjes transform. In this paper, it is shown that $\mu$ has a density away from zero, that this density is analytic wherever it is positive, and that it behaves in most cases as $\sqrt{|x - a|}$ near an edge $a$ of its support. A complete characterization of the support of $\mu$ is also provided. Besides its mathematical interest, this analysis finds applications in a certain class of statistical estimation problems.

Comment: Correction of the proof of Lemma 3.
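As a numerical illustration (not part of the paper), the following minimal Python/NumPy sketch samples a matrix from the separable covariance model above and plots the empirical spectral distribution of $\Sigma\Sigma^*$, which approximates the limiting measure $\mu$ for large $N, n$. The diagonal choices of $D$ and $\widetilde D$, the dimensions, and the Gaussian entries of $X$ are hypothetical examples, chosen only to satisfy the assumptions stated in the abstract.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative simulation of Sigma = D^{1/2} X Dtilde^{1/2}.
# D, Dtilde, N, n below are hypothetical choices, not taken from the paper.
rng = np.random.default_rng(0)
N, n = 500, 1000  # dimensions growing at the same pace (ratio N/n = 1/2 here)

# Deterministic nonnegative diagonal matrices whose spectral measures converge:
# D has eigenvalues 1 and 3 with equal mass, Dtilde has eigenvalues 1 and 2.
d = np.repeat([1.0, 3.0], N // 2)
d_tilde = np.repeat([1.0, 2.0], n // 2)

# X has i.i.d. centered entries with variance 1/n (Gaussian here for simplicity).
X = rng.standard_normal((N, n)) / np.sqrt(n)

# Sigma = D^{1/2} X Dtilde^{1/2}; spectrum of Sigma Sigma^*.
Sigma = np.sqrt(d)[:, None] * X * np.sqrt(d_tilde)[None, :]
eigvals = np.linalg.eigvalsh(Sigma @ Sigma.conj().T)

# The histogram of eigenvalues approximates the density of mu away from zero;
# a square-root decay is expected near most edges of the support.
plt.hist(eigvals, bins=60, density=True)
plt.xlabel("eigenvalue of $\\Sigma\\Sigma^*$")
plt.ylabel("empirical density")
plt.show()
```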
