Divergence-based spectral approximation with degree constraint as a concave optimization problem

Abstract

The Kullback-Leibler pseudo-distance, or divergence, can be used as a criterion for spectral approximation. Unfortunately, this criterion is not convex over the most general classes of rational spectra. In this work it will be shown that divergence minimization is equivalent to a constrained entropy minimization problem, whose concave structure can be exploited in order to guarantee global convergence in the most general case.
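
For context, a commonly used form of the Kullback-Leibler pseudo-distance between two spectral densities Phi and Psi on the unit circle is sketched below; this is only an illustrative form, and the exact functional, normalization, and degree constraint used in the paper may differ.

    \[
    \mathbb{D}(\Psi \,\|\, \Phi)
      = \int_{-\pi}^{\pi} \Psi(e^{i\theta})
        \log \frac{\Psi(e^{i\theta})}{\Phi(e^{i\theta})}
        \, \frac{d\theta}{2\pi}
    \]

Minimizing such a functional over a family of rational spectra of bounded degree is the spectral approximation problem the abstract refers to; the stated contribution is its reformulation as a constrained entropy minimization whose concave structure permits globally convergent methods.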
