We consider the problem of combining a (possibly uncountably infinite) set of
affine estimators in the non-parametric regression model with heteroscedastic
Gaussian noise. Focusing on the exponentially weighted aggregate, we prove a
PAC-Bayesian-type inequality that leads to sharp oracle inequalities in both
discrete and continuous settings.
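In the notation standard for this construction (the family $(\hat f_\lambda)_{\lambda \in \Lambda}$, the prior $\pi$, the unbiased risk estimates $\hat r_\lambda$, and the temperature $\beta > 0$ are assumed here for illustration rather than fixed by the abstract), the exponentially weighted aggregate can be sketched as
\[
\hat f_{\mathrm{EWA}} = \int_\Lambda \hat f_\lambda \, \hat\pi(d\lambda),
\qquad
\frac{d\hat\pi}{d\pi}(\lambda) \propto \exp\bigl(-\hat r_\lambda / \beta\bigr),
\]
so that estimators with a smaller estimated risk receive exponentially larger weight.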
The framework is general enough to cover combinations of various procedures
such as least squares regression, kernel ridge regression, shrinkage
estimators, and many other estimators used in
the literature on statistical inverse problems. As a consequence, we show that
the proposed aggregate provides an adaptive estimator in the exact minimax
sense, without discretizing the range of tuning parameters or splitting
the set of observations. We also illustrate numerically the good performance
achieved by the exponentially weighted aggregate.
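The following is a minimal numerical sketch of this aggregation scheme, assuming a finite grid of ridge estimators weighted by Stein-type unbiased risk estimates; the grid, the temperature choice, and the synthetic data are illustrative assumptions, not the paper's experimental setup (which, in particular, does not require discretizing the tuning parameter).

import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic Gaussian regression: y = X w + xi, Var(xi_i) = sigma_i^2.
n, d = 200, 30
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
sigma = 0.5 + rng.random(n)              # known (illustrative) noise levels
y = X @ w_true + sigma * rng.standard_normal(n)

# Finite grid of tuning parameters (a discretization used only for this sketch).
lambdas = np.logspace(-3, 3, 50)
fits, risks = [], []
for lam in lambdas:
    # Each ridge estimator is affine in y: f_hat = A y, A = X (X'X + lam I)^{-1} X'.
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    f_hat = A @ y
    # Stein-type unbiased risk estimate for an affine estimator with diagonal
    # noise covariance Sigma: ||y - f_hat||^2 + 2 tr(A Sigma) - tr(Sigma).
    sure = np.sum((y - f_hat) ** 2) + 2 * np.diag(A) @ sigma**2 - np.sum(sigma**2)
    fits.append(f_hat)
    risks.append(sure)

risks = np.array(risks)
beta = 8 * np.max(sigma) ** 2            # temperature of order sigma^2 (assumption)
weights = np.exp(-(risks - risks.min()) / beta)  # shift by the minimum for stability
weights /= weights.sum()

# Exponentially weighted aggregate: a convex combination of the individual fits.
f_ewa = weights @ np.array(fits)
print("min SURE over grid:", risks.min())
print("EWA squared error vs true signal:", np.sum((f_ewa - X @ w_true) ** 2))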