
    Multivariate Extension of Matrix-based Rényi's α-order Entropy Functional

    The matrix-based Rényi's α-order entropy functional was recently introduced using the normalized eigenspectrum of a Hermitian matrix of the projected data in a reproducing kernel Hilbert space (RKHS). However, the current theory of the matrix-based Rényi's α-order entropy functional only defines the entropy of a single variable or the mutual information between two random variables. In the information theory and machine learning communities, one is also frequently interested in multivariate information quantities, such as the multivariate joint entropy and various interaction quantities among multiple variables. In this paper, we first define the matrix-based Rényi's α-order joint entropy among multiple variables. We then show how this definition eases the estimation of various information quantities that measure the interactions among multiple variables, such as interaction information and total correlation. We finally present an application to feature selection to show how our definition provides a simple yet powerful way to estimate, directly from data, a quantity that is widely acknowledged to be intractable. A real example on hyperspectral image (HSI) band selection is also provided.

    Comment: To appear in IEEE Transactions on Pattern Analysis and Machine Intelligence. Matlab code is available from Google Drive at https://drive.google.com/open?id=1SlxzEOX8RbnLwCgRyqGwMOL7vuT90Gje or Baidu Cloud at https://pan.baidu.com/s/1xupfXCmIV20gXPr0TicGkg (access code: d1sa).
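    The following Python sketch illustrates the kind of estimator the abstract describes, assuming the standard matrix-based construction: a Gaussian Gram matrix normalized to unit trace, entropy computed from its eigenspectrum, and joint entropy obtained from the Hadamard product of the per-variable matrices. The function names, kernel width sigma, and the choice α = 1.01 are illustrative assumptions, not the authors' released Matlab code.

    import numpy as np
    from scipy.spatial.distance import cdist

    def normalized_gram(x, sigma=1.0):
        """Normalized Gram matrix A with unit trace, built from a Gaussian kernel (assumed kernel choice)."""
        x = np.atleast_2d(x).reshape(len(x), -1)
        k = np.exp(-cdist(x, x, "sqeuclidean") / (2 * sigma ** 2))
        # Normalize so that trace(A) = 1: A_ij = K_ij / (n * sqrt(K_ii * K_jj))
        d = np.sqrt(np.diag(k))
        return k / (len(x) * np.outer(d, d))

    def renyi_entropy(a, alpha=1.01):
        """Matrix-based Renyi alpha-order entropy: S_alpha(A) = log2(sum_i lambda_i(A)^alpha) / (1 - alpha)."""
        eigvals = np.linalg.eigvalsh(a)
        eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negative eigenvalues from round-off
        return np.log2(np.sum(eigvals ** alpha)) / (1.0 - alpha)

    def joint_entropy(grams, alpha=1.01):
        """Multivariate joint entropy from the Hadamard product of normalized Gram matrices."""
        h = np.ones_like(grams[0])
        for a in grams:
            h = h * a              # element-wise (Hadamard) product
        h = h / np.trace(h)        # renormalize to unit trace
        return renyi_entropy(h, alpha)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.normal(size=(200, 3))
        y = x[:, :1] + 0.1 * rng.normal(size=(200, 1))
        ax, ay = normalized_gram(x), normalized_gram(y)
        # Mutual information estimate: I(x; y) = S(A_x) + S(A_y) - S(A_x, A_y)
        mi = renyi_entropy(ax) + renyi_entropy(ay) - joint_entropy([ax, ay])
        print(f"matrix-based Renyi MI estimate: {mi:.3f} bits")

    The same joint_entropy call with more than two Gram matrices gives the multivariate joint entropy, from which quantities such as total correlation can be assembled by adding and subtracting marginal and joint terms.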