
    Jointly Sparse Support Recovery via Deep Auto-encoder with Applications in MIMO-based Grant-Free Random Access for mMTC

    In this paper, a data-driven approach is proposed to jointly design the common sensing (measurement) matrix and the joint support recovery method for complex signals, using a standard deep auto-encoder for real numbers. The auto-encoder in the proposed approach includes an encoder that mimics the noisy linear measurement process for jointly sparse signals with a common sensing matrix, and a decoder that approximately performs joint sparse support recovery based on the empirical covariance matrix of the noisy linear measurements. The proposed approach can effectively exploit the common support and the properties of sparsity patterns to achieve high recovery accuracy, and has significantly shorter computation time than existing methods. We also study an application example, i.e., device activity detection in Multiple-Input Multiple-Output (MIMO)-based grant-free random access for massive machine-type communications (mMTC). The numerical results show that the proposed approach can provide pilot sequences and device activity detection with better detection accuracy and substantially shorter computation time than well-known recovery methods. Comment: 5 pages, 8 figures, to be published in IEEE SPAWC 2020. arXiv admin note: text overlap with arXiv:2002.0262
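
    Purely as an illustration of the architecture described above (not the authors' code), the sketch below uses a trainable linear layer as the common sensing matrix, adds measurement noise, forms the empirical covariance matrix of the measurements, and decodes it into per-entry support probabilities. All dimensions, layer widths, and the training step are illustrative assumptions.

    import torch
    import torch.nn as nn

    N, L, M = 100, 30, 16   # signal dimension, measurements per signal, signals sharing a support
    K = 10                  # number of active (nonzero) entries

    class JointSupportAE(nn.Module):
        def __init__(self):
            super().__init__()
            # encoder: trainable common sensing matrix A (L x N)
            self.A = nn.Parameter(torch.randn(L, N) / L ** 0.5)
            # decoder: maps vectorized covariance features to support probabilities
            self.decoder = nn.Sequential(
                nn.Linear(L * L, 256), nn.ReLU(),
                nn.Linear(256, N), nn.Sigmoid())

        def forward(self, X, noise_std=0.1):
            # noisy linear measurements of the jointly sparse columns of X
            Y = self.A @ X + noise_std * torch.randn(L, X.shape[-1])
            # empirical covariance matrix of the measurements
            cov = (Y @ Y.T) / X.shape[-1]
            return self.decoder(cov.reshape(1, -1)).squeeze(0)

    # one illustrative training step on a random jointly sparse batch
    model = JointSupportAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    support = torch.zeros(N)
    support[torch.randperm(N)[:K]] = 1.0
    X = support.unsqueeze(1) * torch.randn(N, M)   # columns share the same support
    loss = nn.functional.binary_cross_entropy(model(X), support)
    opt.zero_grad(); loss.backward(); opt.step()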

    Measurement Matrix Design for Compressive Sensing Based MIMO Radar

    In colocated multiple-input multiple-output (MIMO) radar using compressive sensing (CS), a receive node compresses its received signal via a linear transformation, referred to as the measurement matrix. The samples are subsequently forwarded to a fusion center, where an L1-optimization problem is formulated and solved for the target information. CS-based MIMO radar exploits target sparsity in the angle-Doppler-range space and thus achieves the high localization performance of traditional MIMO radar with far fewer measurements. The measurement matrix is vital to CS recovery performance. This paper considers the design of measurement matrices that achieve an optimality criterion depending on the coherence of the sensing matrix (CSM) and/or the signal-to-interference ratio (SIR). The first approach minimizes a performance penalty that is a linear combination of the CSM and the inverse SIR. The second imposes a structure on the measurement matrix and determines the involved parameters so that the SIR is enhanced. Depending on the transmit waveforms, the second approach can significantly improve the SIR while maintaining a CSM comparable to that of a Gaussian random measurement matrix (GRMM). Simulations indicate that the proposed measurement matrices can improve detection accuracy as compared to a GRMM.
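
    As an illustration only (not the paper's exact formulation), the sketch below designs a measurement matrix P by gradient descent on a penalty that is a linear combination of a smooth surrogate for the coherence of the effective sensing matrix P @ Psi and the inverse output SIR. The dictionary Psi, the desired return s, the interference covariance R_i, and the weight lam are assumed stand-ins.

    import torch

    L, K, G = 64, 20, 128                        # snapshot length, measurements, dictionary size
    Psi = torch.randn(L, G)                      # dictionary of target returns over the grid (stand-in)
    s = torch.randn(L, 1)                        # desired target return (stand-in)
    R_i = torch.eye(L) + 0.5 * torch.ones(L, L)  # interference covariance (stand-in)

    P = torch.randn(K, L, requires_grad=True)    # measurement matrix to be designed
    opt = torch.optim.Adam([P], lr=1e-2)
    lam = 0.1                                    # weight on the inverse-SIR term

    for _ in range(500):
        Phi = P @ Psi                                    # effective sensing matrix
        Phi_n = Phi / Phi.norm(dim=0, keepdim=True)      # unit-norm columns
        Gram = Phi_n.T @ Phi_n
        off = Gram - torch.diag(torch.diag(Gram))        # off-diagonal inner products
        coherence_surrogate = (off ** 2).sum()           # smooth proxy for the max coherence
        sir = (P @ s).pow(2).sum() / torch.trace(P @ R_i @ P.T)
        loss = coherence_surrogate + lam / sir           # CSM proxy plus inverse SIR
        opt.zero_grad(); loss.backward(); opt.step()

    with torch.no_grad():
        Phi_n = (P @ Psi) / (P @ Psi).norm(dim=0, keepdim=True)
        mu = (Phi_n.T @ Phi_n - torch.eye(G)).abs().max()
        print(f"final coherence: {mu.item():.3f}, final SIR: {sir.item():.2f}")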

    Discrimination on the Grassmann Manifold: Fundamental Limits of Subspace Classifiers

    We present fundamental limits on the reliable classification of linear and affine subspaces from noisy, linear features. Drawing an analogy between discrimination among subspaces and communication over vector wireless channels, we propose two Shannon-inspired measures to characterize asymptotic classifier performance. First, we define the classification capacity, which characterizes necessary and sufficient conditions for the misclassification probability to vanish as the signal dimension, the number of features, and the number of subspaces to be discerned all approach infinity. Second, we define the diversity-discrimination tradeoff which, by analogy with the diversity-multiplexing tradeoff of fading vector channels, characterizes relationships between the number of discernible subspaces and the misclassification probability as the noise power approaches zero. We derive upper and lower bounds on these measures that are tight in many regimes. Numerical results, including a face recognition application, validate the theory in practice. Comment: 19 pages, 4 figures. Revised submission to IEEE Transactions on Information Theory.
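
    The sketch below is not from the paper; it simply simulates the setting under illustrative assumptions: signals drawn from one of several random low-dimensional subspaces are observed through noisy linear features, and a nearest-subspace rule in the feature domain is evaluated by Monte Carlo to estimate the misclassification probability.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k, m, C = 64, 4, 16, 8     # ambient dimension, subspace dimension, features, classes
    noise_std = 0.05

    # random k-dimensional subspaces (orthonormal bases) and a random feature map
    U = [np.linalg.qr(rng.standard_normal((n, k)))[0] for _ in range(C)]
    A = rng.standard_normal((m, n)) / np.sqrt(m)

    # orthogonal projectors onto the feature-domain images A @ U_c of each subspace
    projectors = []
    for Uc in U:
        B = np.linalg.qr(A @ Uc)[0]
        projectors.append(B @ B.T)

    def classify(y):
        # nearest-subspace rule: smallest residual after projecting onto A @ U_c
        residuals = [np.linalg.norm(y - Pc @ y) for Pc in projectors]
        return int(np.argmin(residuals))

    # Monte Carlo estimate of the misclassification probability
    errors, trials = 0, 2000
    for _ in range(trials):
        c = rng.integers(C)
        x = U[c] @ rng.standard_normal(k)               # signal from subspace c
        y = A @ x + noise_std * rng.standard_normal(m)  # noisy linear features
        errors += classify(y) != c
    print(f"empirical misclassification rate: {errors / trials:.3f}")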