On the estimation of Tsallis entropy and a novel information measure based on its properties

Abstract

This letter explores a plug-in estimator of the second-order Tsallis entropy based on Kernel Density Estimation (KDE) and its implicit regularization process. First, it is shown that the expected value of the estimator corresponds to the entropy of an Additive White Gaussian Noise (AWGN) model. Then, we prove several relevant properties of the Tsallis entropy: it is monotonically non-decreasing under the addition of random variables, its derivative with respect to the Gaussian noise power is monotonically non-increasing, and it is concave in the additive noise power. From these properties, we derive an information metric that provides an alternative to the regularization strategy.

This work was supported in part by the Spanish Ministry of Science and Innovation project RODIN under Grant PID2019-105717RB-C22; in part by the Generalitat de Catalunya under Grant 2021 SGR 01033; in part by Fellowship 2019 FI 00620 and Fellowship 2023 FI "Joan Oró" 00050 from the Generalitat de Catalunya and the European Social Fund; and in part by Fellowship 2022 FPI-UPC 028 from the Universitat Politècnica de Catalunya and Banc de Santander. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Giuseppe Thadeu Freitas de Abreu.
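As a rough illustration of the kind of estimator the abstract describes (the paper itself is not reproduced here, so the function name, the univariate setting, and the bandwidth handling below are assumptions, not the authors' implementation): the second-order Tsallis entropy is H_2(X) = 1 − ∫ p(x)² dx, and for a Gaussian KDE the integral of the squared density estimate has a closed form, since the convolution of two Gaussian kernels of bandwidth h is a Gaussian of variance 2h². A plug-in estimate therefore needs no numerical integration:

```python
import numpy as np

def tsallis2_plugin(samples, h):
    """Plug-in estimate of the second-order Tsallis entropy
    H_2(X) = 1 - integral of p(x)^2 dx, using a Gaussian KDE
    with bandwidth h on 1-D data (illustrative sketch).

    For a Gaussian kernel, the integral of the squared KDE is
    (1/n^2) * sum_{i,j} N(x_i - x_j; 0, 2h^2), i.e. a pairwise
    Gaussian evaluated at the sample differences.
    """
    x = np.asarray(samples, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]  # all pairwise differences x_i - x_j
    # Gaussian density with variance 2h^2 evaluated at each difference
    pairwise = np.exp(-diffs**2 / (4.0 * h**2)) / (2.0 * h * np.sqrt(np.pi))
    quad = pairwise.sum() / n**2     # estimate of integral of p_hat(x)^2 dx
    return 1.0 - quad
```

The diagonal (i = j) terms contribute a bias of order 1/(n·h), which is one face of the implicit regularization that the KDE bandwidth introduces and that the letter analyzes through the AWGN interpretation.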
