
    Sensitivity Analysis for Binary Sampling Systems via Quantitative Fisher Information Lower Bounds

    This article addresses the sensitivity of sensor systems with minimal signal digitization complexity with respect to the estimation of analog model parameters. Digital measurements are available exclusively in hard-limited form, and the parameters of the analog received signals are to be inferred through efficient algorithms. As a benchmark, the achievable estimation accuracy is assessed via theoretical error bounds. This requires a characterization of the parametric likelihood, which is challenging for multivariate binary distributions. In this context, we analyze the Fisher information matrix of the exponential family and derive a conservative approximation for arbitrary models. The conservative information matrix rests on a surrogate exponential family, defined by two equivalences to the real data-generating system. This probabilistic notion enables the design of estimators that consistently achieve the sensitivity level defined by the inverse of the conservative information matrix without characterizing the distributions involved. For parameter estimation with multivariate binary samples, using an equivalent quadratic exponential distribution tames the computational complexity of the conservative information matrix, so that a quantitative assessment of the achievable error level becomes tractable. We exploit this for the performance analysis of signal parameter estimation with an array of low-complexity binary sensors, examining the achievable sensitivity in comparison to an ideal system with receivers supporting data acquisition at infinite amplitude resolution. Additionally, we demonstrate data-driven sensitivity analysis through the presented framework by learning the guaranteed achievable performance when processing sensor data obtained with recursive binary sampling schemes as implemented in ΣΔ-modulating analog-to-digital converters.
    Comment: Former title was: Fisher Information Lower Bounds with Applications in Hardware-Aware Nonlinear Signal Processing
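
    As a point of reference for the single-parameter, single-sensor case (a classical illustration, not the multivariate conservative bound derived in the article): for a hard-limited observation y = sign(θ + w) with Gaussian noise w ~ N(0, σ²), the Fisher information about θ has a closed form and can be compared to the ideal infinite-resolution receiver, where the well-known ratio 2/π ≈ 0.64 appears at θ = 0. The Python sketch below (function names are illustrative, not from the article) evaluates both quantities.

    ```python
    # Minimal sketch: Fisher information of a 1-bit (hard-limited) Gaussian
    # measurement versus an ideal infinite-resolution measurement.
    # Classical single-sensor example; not the article's multivariate bound.
    import numpy as np
    from scipy.stats import norm

    def fisher_ideal(sigma):
        """Fisher information about the mean of N(theta, sigma^2)."""
        return 1.0 / sigma**2

    def fisher_one_bit(theta, sigma):
        """Fisher information about theta from y = sign(theta + w), w ~ N(0, sigma^2)."""
        p = norm.cdf(theta / sigma)              # P(y = +1)
        dp = norm.pdf(theta / sigma) / sigma     # derivative of p w.r.t. theta
        return dp**2 / (p * (1.0 - p))           # Fisher information of a Bernoulli(p)

    sigma = 1.0
    for theta in [0.0, 0.5, 1.0]:
        ratio = fisher_one_bit(theta, sigma) / fisher_ideal(sigma)
        print(f"theta={theta:.1f}: 1-bit / ideal Fisher information = {ratio:.4f}")
    # At theta = 0 the ratio is 2/pi ~ 0.6366, i.e. roughly a 2 dB sensitivity loss.
    ```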

    Non-Linear Transformations of Gaussians and Gaussian-Mixtures with implications on Estimation and Information Theory

    This paper investigates the statistical properties of non-linear transformations (NLT) of random variables in order to establish useful tools for estimation and information theory. Specifically, the paper focuses on linear regression analysis of the NLT output and derives sufficient general conditions under which the input-output regression coefficient equals the partial regression coefficient of the output with respect to an (additive) part of the input. A special case is represented by zero-mean Gaussian inputs obtained as the sum of other zero-mean Gaussian random variables. The paper shows how this property can be generalized to the regression coefficient of non-linear transformations of Gaussian mixtures. Owing to its generality, and to the wide use of Gaussians and Gaussian mixtures as statistical models for many phenomena, this theoretical framework can find applications in multiple disciplines, such as communication, estimation, and information theory, when part of the nonlinear transformation input is the quantity of interest and the other part is the noise. In particular, the paper shows how these properties can be exploited to simplify closed-form computation of the signal-to-noise ratio (SNR), the estimation mean-squared error (MSE), and bounds on the mutual information in additive non-Gaussian (possibly non-linear) channels, also establishing relationships among them.
    Comment: 26 pages, 4 figures (8 sub-figures), submitted to IEEE Trans. on Information Theory 20th April 201
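
    For the zero-mean Gaussian special case, the stated equality can be checked numerically: with X = S + N, where S and N are independent zero-mean Gaussians, and Y = g(X) for a nonlinearity g, the regression coefficient Cov(Y, X)/Var(X) matches the partial regression coefficient Cov(Y, S)/Var(S), a Bussgang/Stein-type identity. The Monte Carlo sketch below is an illustrative check assuming a tanh nonlinearity; it is not code from the paper.

    ```python
    # Minimal Monte Carlo check of the regression-coefficient property for
    # zero-mean Gaussian inputs: with X = S + N (S, N independent Gaussians)
    # and Y = g(X), the full and partial regression coefficients coincide.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2_000_000
    var_s, var_n = 1.0, 0.5

    s = rng.normal(0.0, np.sqrt(var_s), n)    # signal part of the input
    w = rng.normal(0.0, np.sqrt(var_n), n)    # additive noise part
    x = s + w                                 # NLT input
    y = np.tanh(x)                            # example nonlinearity g(.)

    coeff_full = np.cov(y, x, bias=True)[0, 1] / np.var(x)  # regression of Y on the full input X
    coeff_part = np.cov(y, s, bias=True)[0, 1] / np.var(s)  # partial regression of Y on S

    print(f"Cov(Y,X)/Var(X) = {coeff_full:.4f}")
    print(f"Cov(Y,S)/Var(S) = {coeff_part:.4f}   (agrees up to Monte Carlo error)")
    ```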