Toward Minimal-Sufficiency in Regression Tasks: An Approach Based on a Variational Estimation Bottleneck

Abstract

We propose a new variational estimation bottleneck based on a mean-squared error metric to aid regression tasks. This bottleneck, which draws inspiration from the variational information bottleneck used in classification, consists of two components: (1) one captures a notion of Vr-sufficiency, quantifying the ability of an estimator in some class of estimators Vr to infer the quantity of interest; (2) the other captures a notion of Vr-minimality, quantifying the ability of the estimator to generalize to new data. We demonstrate how to train this bottleneck for regression problems. We also conduct experiments in image denoising and deraining applications, showing that our proposed approach can yield neural network regressors that offer better performance without overfitting.
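To make the two-component structure concrete, the following is a minimal sketch of a variational-bottleneck-style regression objective: an MSE term playing the role of the sufficiency component and a KL penalty on a Gaussian latent code playing the role of the minimality component. The function names, the Gaussian encoder with a standard-normal prior, and the weight `beta` are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def kl_gauss_std_normal(mu, log_var):
    # KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims.
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)

def vib_regression_loss(y_pred, y_true, mu, log_var, beta=1e-3):
    # Sufficiency-style term: mean-squared error between prediction and target.
    mse = np.mean((y_pred - y_true) ** 2, axis=-1)
    # Minimality-style term: KL penalty compressing the latent representation.
    kl = kl_gauss_std_normal(mu, log_var)
    return np.mean(mse + beta * kl)

# Toy example with random data (shapes: batch of 8, latent dim 4).
rng = np.random.default_rng(0)
y_true = rng.normal(size=(8, 1))
y_pred = y_true + 0.1 * rng.normal(size=(8, 1))
mu = rng.normal(scale=0.1, size=(8, 4))
log_var = np.full((8, 4), -2.0)
loss = vib_regression_loss(y_pred, y_true, mu, log_var)
```

Increasing `beta` trades prediction accuracy for a more compressed latent code, which is the mechanism the abstract associates with better generalization.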
