60 research outputs found

    Aspects of nonlinear approximation with dictionaries


    Weighted Thresholding and Nonlinear Approximation

    We present a new method for performing nonlinear approximation with redundant dictionaries. The method constructs an m-term approximation of the signal by thresholding with respect to a weighted version of its canonical expansion coefficients, thereby accounting for dependencies between the coefficients. The main result is an associated strong Jackson embedding, which provides an upper bound on the corresponding reconstruction error. To complement the theoretical results, we compare the proposed method to the pure greedy method and the Windowed-Group Lasso by denoising music signals with elements from a Gabor dictionary. Comment: 22 pages, 3 figures
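    The thresholding step described in this abstract can be sketched in a few lines. The following is a minimal illustration under simplifying assumptions (a finite dictionary stored as a matrix whose columns are unit-norm atoms, canonical coefficients taken as inner products, and a user-supplied weight vector standing in for the dependency model); the function name `weighted_m_term` and the weighting scheme are hypothetical, not the paper's exact construction.

    ```python
    import numpy as np

    def weighted_m_term(signal, dictionary, weights, m):
        """m-term approximation by thresholding weighted coefficients.

        dictionary: (n, N) array, columns are atoms (assumption: finite, dense).
        weights:    (N,) positive weights modeling coefficient dependencies.
        """
        # canonical expansion coefficients: inner products with the atoms
        coeffs = dictionary.conj().T @ signal
        # score each coefficient by its weighted magnitude
        scored = weights * np.abs(coeffs)
        # keep only the m atoms with the largest weighted scores
        keep = np.argsort(scored)[-m:]
        sparse = np.zeros_like(coeffs)
        sparse[keep] = coeffs[keep]
        # reconstruct from the retained terms
        return dictionary @ sparse
    ```

    With uniform weights this reduces to plain hard thresholding of the canonical coefficients; non-uniform weights let neighboring (e.g. time-frequency adjacent) coefficients influence which terms survive.
    
    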

    Sample Complexity of Offline Reinforcement Learning with Deep ReLU Networks

    We study the statistical theory of offline reinforcement learning (RL) with deep ReLU network function approximation. We analyze a variant of the fitted-Q iteration (FQI) algorithm under a new dynamic condition that we call Besov dynamic closure, which encompasses the conditions from prior analyses of deep neural network function approximation. Under Besov dynamic closure, we prove that the FQI-type algorithm enjoys a sample complexity of Õ(κ^{1 + d/α} ⋅ ϵ^{−2 − 2d/α}), where κ is a distribution-shift measure, d is the dimensionality of the state-action space, α is the (possibly fractional) smoothness parameter of the underlying MDP, and ϵ is a user-specified precision. This improves on the sample complexity of Õ(K ⋅ κ^{2 + d/α} ⋅ ϵ^{−2 − d/α}) in the prior result [Yang et al., 2019], where K is an algorithmic iteration number which is arbitrarily large in practice. Importantly, our sample complexity is obtained under the new general dynamic condition and a data-dependent structure, where the latter is either ignored in prior algorithms or improperly handled by prior analyses. This is the first comprehensive analysis of offline RL with deep ReLU network function approximation under a general setting. Comment: A short version published in the ICML Workshop on Reinforcement Learning Theory, 202
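    For orientation, the fitted-Q iteration scheme the abstract analyzes can be sketched in its simplest form. The sketch below replaces the paper's deep ReLU regressor with a lookup table, so the "fit" step reduces to averaging the Bellman regression targets over the offline dataset; the function name and dataset format are illustrative assumptions, not the analyzed algorithm.

    ```python
    import numpy as np

    def fitted_q_iteration(dataset, n_states, n_actions, gamma=0.9, n_iters=50):
        """Tabular FQI sketch on an offline dataset of
        (state, action, reward, next_state) tuples.

        Each iteration regresses Q(s, a) onto the Bellman targets
        r + gamma * max_a' Q(s', a') computed from the current Q.
        A table stands in for the deep ReLU network regressor.
        """
        Q = np.zeros((n_states, n_actions))
        for _ in range(n_iters):
            # build regression targets from the current Q (Bellman backup)
            targets = {}
            for s, a, r, s_next in dataset:
                y = r + gamma * Q[s_next].max()
                targets.setdefault((s, a), []).append(y)
            # "fit" step: least-squares over a table = averaging the targets
            Q_next = Q.copy()
            for (s, a), ys in targets.items():
                Q_next[s, a] = np.mean(ys)
            Q = Q_next
        return Q
    ```

    In the offline setting the dataset is fixed in advance, so the distribution-shift measure κ in the bound captures how well this logged data covers the state-action pairs the learned policy visits.
    
    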

    Wavelet and Multiscale Methods

    Various scientific models demand ever finer resolutions of relevant features. Paradoxically, increasing computational power serves only to heighten this demand: the wealth of available data itself becomes a major obstruction. Extracting essential information from complex structures, and developing rigorous models to quantify the quality of that information, leads to tasks that are not tractable by standard numerical techniques. The last decade has seen the emergence of several new computational methodologies to address this situation. Their common features are the nonlinearity of the solution methods and the ability to separate solution characteristics living on different length scales. Perhaps the most prominent examples are multigrid methods and adaptive grid solvers for partial differential equations, which have substantially advanced the frontiers of computability for certain problem classes in numerical analysis. Other highly visible examples are: regression techniques in nonparametric statistical estimation; the design of universal estimators in the context of mathematical learning theory and machine learning; the investigation of greedy algorithms in complexity theory; compression techniques and encoding in signal and image processing; the solution of global operator equations through the compression of fully populated matrices arising from boundary integral equations, with the aid of multipole expansions and hierarchical matrices; and attacking problems in high spatial dimensions with sparse grid or hyperbolic wavelet concepts. This workshop aimed to deepen the understanding of the underlying mathematical concepts that drive this new evolution of computation and to promote the exchange of ideas emerging in various disciplines.