This work is concerned with the derivation and analysis of a modified vectorial kernel orthogonal greedy algorithm (VKOGA) for the approximation of nonlinear vectorial functions. The algorithm pursues a simultaneous approximation of all vector components over a shared linear subspace of the underlying function Hilbert space in a greedy fashion [16, 37] and inherits the selection principle of the f/P-Greedy algorithm. For the considered algorithm we perform a limit analysis of the selection criteria for already included subspace basis functions. We show that the approximation gain is bounded globally, and that in the multivariate case the limit functions correspond to a directional Hermite interpolation. We further prove algebraic convergence, similar to existing results but improved by a dimension-dependent factor, and introduce a new a-posteriori error bound. Comparisons to related variants of our algorithm are presented. A targeted application of this algorithm is the model reduction of multiscale models.

Sparse approximation of nonlinear functions is a challenging task that arises in many different areas of modern computing. A key element in finding sparse representations is the concept of m-term approximation [31, 8], which is essentially a measure of how well a function from a given function space can be approximated by linearly combining m functions out of a given dictionary.
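To make the greedy, component-shared selection concrete, the following is a minimal Python sketch of a vectorial kernel greedy approximation: all vector components share one set of selected centers, and each step adds the data site with the largest residual norm across components (an f-greedy-type criterion). The Gaussian kernel, the function names, and the small regularization term are illustrative assumptions, not the paper's actual implementation, which uses the f/P-greedy selection criterion and more efficient incremental (Newton-basis) updates rather than re-solving the full system each step.

```python
import numpy as np

def gaussian_kernel(X, Y, eps=1.0):
    # Gaussian RBF kernel matrix between point sets X (n,d) and Y (m,d)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

def greedy_vectorial_approx(X, F, m, eps=1.0):
    """Select m shared centers greedily for the vectorial data F (n,q):
    at each step, pick the point whose residual has the largest Euclidean
    norm over all q components, then interpolate on the chosen centers."""
    selected = []
    residual = F.copy()
    coeffs = None
    for _ in range(m):
        # f-greedy-type criterion: largest residual norm across components
        i = int(np.argmax(np.linalg.norm(residual, axis=1)))
        selected.append(i)
        # interpolate all components on the shared center set
        K = gaussian_kernel(X[selected], X[selected], eps)
        reg = 1e-12 * np.eye(len(selected))  # guard against ill-conditioning
        coeffs = np.linalg.solve(K + reg, F[selected])
        residual = F - gaussian_kernel(X, X[selected], eps) @ coeffs
    return np.array(selected), coeffs
```

Because the residual vanishes at already selected sites after each interpolation step, no center is chosen twice; the sparsity level is controlled directly by the number of greedy iterations m.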