5,843 research outputs found

    Multiscale Decompositions and Optimization

    In this paper, the following type of Tikhonov regularization problem will be systematically studied:
    \[(u_t, v_t) := \argmin_{u+v=f} \{\|v\|_X + t\|u\|_Y\},\]
    where $Y$ is a smooth space such as a $\BV$ space or a Sobolev space and $X$ is the space in which we measure distortion. Examples of the above problem occur in denoising in image processing, in numerically treating inverse problems, and in the sparse recovery problem of compressed sensing. It is also at the heart of interpolation of linear operators by the real method of interpolation. We shall characterize the minimizing pair $(u_t, v_t)$ for $(X,Y) = (L_2(\Omega), \BV(\Omega))$ as a primary example and generalize Yves Meyer's result in [11] and Antonin Chambolle's result in [6]. After that, the following multiscale decomposition scheme will be studied:
    \[u_{k+1} := \argmin_{u \in \BV(\Omega) \cap L_2(\Omega)} \{\tfrac{1}{2}\|f-u\|^2_{L_2} + t_k |u - u_k|_{\BV}\},\]
    where $u_0 = 0$ and $\Omega$ is a bounded Lipschitz domain in $\R^d$. This method was introduced by Eitan Tadmor et al., and we will improve the $L_2$ convergence result in \cite{Tadmor}. Other pairs such as $(X,Y) = (L_p, W^1(L_\tau))$ and $(X,Y) = (\ell_2, \ell_p)$ will also be mentioned. In the end, the numerical implementation for $(X,Y) = (L_2(\Omega), \BV(\Omega))$ and the corresponding convergence results will be given.
    Comment: 33 pages
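
    A minimal numerical sketch of the multiscale scheme above: substituting $w = u - u_k$ turns each step into an ROF/TV denoising problem applied to the residual $f - u_k$, after which $u_{k+1} = u_k + w$. The sketch below uses skimage's denoise_tv_chambolle as a stand-in TV solver; the geometric schedule t_k = t_0 / 2^k, the mapping of t_k onto the solver's weight parameter, and the test image are illustrative assumptions, not the paper's implementation.

```python
# Hierarchical (multiscale) TV decomposition sketch:
#   u_{k+1} = argmin_u  1/2 ||f - u||_{L2}^2 + t_k |u - u_k|_BV
# With w = u - u_k, each step is ROF denoising of the residual f - u_k;
# denoise_tv_chambolle is used here as an approximate ROF solver.
import numpy as np
from skimage import data, img_as_float
from skimage.restoration import denoise_tv_chambolle

def multiscale_tv_decomposition(f, t0=0.5, levels=6):
    """Return the successive approximations u_1, ..., u_K of f."""
    u = np.zeros_like(f)          # u_0 = 0
    approximations = []
    t = t0
    for _ in range(levels):
        residual = f - u
        # ROF step on the residual; 'weight' plays the role of t_k.
        w = denoise_tv_chambolle(residual, weight=t)
        u = u + w                 # u_{k+1} = u_k + w_k
        approximations.append(u.copy())
        t /= 2.0                  # smaller t_k at the next level keeps finer detail
    return approximations

if __name__ == "__main__":
    f = img_as_float(data.camera())
    for k, u_k in enumerate(multiscale_tv_decomposition(f), start=1):
        print(f"u_{k}: L2 norm of residual = {np.linalg.norm(f - u_k):.3f}")
```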

    A scale-based approach to finding effective dimensionality in manifold learning

    The discovery of low-dimensional manifolds in high-dimensional data is one of the main goals of manifold learning. We propose a new approach to identifying the effective dimension (intrinsic dimension) of low-dimensional manifolds. The scale-space viewpoint is the key to our approach, enabling us to meet the challenge of noisy data. Our approach finds the effective dimensionality of the data over all scales without any prior knowledge. It performs better than other methods, especially in the presence of relatively large noise, and is computationally efficient.
    Comment: Published at http://dx.doi.org/10.1214/07-EJS137 in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org)
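
    As a rough illustration of how the estimated dimension can depend on the scale at which the data are examined (a generic local-PCA proxy, not the authors' estimator), the sketch below looks at neighborhoods of growing radius around a point and counts the principal components needed to explain most of the local variance; the 95% variance threshold and the noisy-circle test data are assumptions made for the example.

```python
# Scale-dependent intrinsic dimension via local PCA (generic illustration).
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_pca_dimension(points, center_idx, radius, var_threshold=0.95):
    """Number of principal components explaining var_threshold of the
    variance inside the ball of the given radius around one point."""
    nn = NearestNeighbors(radius=radius).fit(points)
    idx = nn.radius_neighbors(points[center_idx:center_idx + 1],
                              return_distance=False)[0]
    if len(idx) < 2:
        return 0
    local = points[idx] - points[idx].mean(axis=0)
    svals = np.linalg.svd(local, compute_uv=False)
    explained = np.cumsum(svals**2) / np.sum(svals**2)
    return int(np.searchsorted(explained, var_threshold) + 1)

if __name__ == "__main__":
    # Noisy circle in R^3: intrinsic dimension 1, ambient dimension 3.
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 2 * np.pi, 2000)
    X = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)]
    X += 0.02 * rng.normal(size=X.shape)
    for r in [0.05, 0.2, 0.5, 1.0]:
        d = local_pca_dimension(X, center_idx=0, radius=r)
        print(f"radius {r:4.2f}: estimated local dimension {d}")
```

    At very small radii the noise dominates and inflates the estimate, while at very large radii curvature does; a scale-space approach looks across this whole range rather than fixing one radius in advance.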

    Asymptotic stability for neural networks with mixed time-delays: The discrete-time case

    This paper is concerned with the stability analysis problem for a new class of discrete-time recurrent neural networks with mixed time-delays. The mixed time-delays, which consist of both discrete and distributed time-delays, are addressed for the first time when analyzing the asymptotic stability of discrete-time neural networks. The activation functions are not required to be differentiable or strictly monotonic. The existence of the equilibrium point is first proved under mild conditions. By constructing a new Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the discrete-time neural networks to be globally asymptotically stable. As an extension, we further consider the stability analysis problem for the same class of neural networks but with state-dependent stochastic disturbances. All the conditions obtained are expressed in terms of LMIs whose feasibility can be easily checked using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.
    This work was supported in part by the Biotechnology and Biological Sciences Research Council (BBSRC) of the UK under Grants BB/C506264/1 and 100/EGM17735, the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grants GR/S27658/01 and EP/C524586/1, an International Joint Project sponsored by the Royal Society of the UK, the Natural Science Foundation of Jiangsu Province of China under Grant BK2007075, the National Natural Science Foundation of China under Grant 60774073, and the Alexander von Humboldt Foundation of Germany.
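
    To make the LMI feasibility idea concrete, here is a small sketch in Python with cvxpy (used here as a stand-in for the Matlab LMI Toolbox mentioned above) that checks a textbook delay-independent Lyapunov–Krasovskii condition for a discrete-time linear system with a single constant delay, x_{k+1} = A x_k + A_d x_{k-d}. The matrices and this particular LMI are illustrative assumptions, not the mixed-delay neural-network conditions derived in the paper.

```python
# Delay-independent stability LMI for x_{k+1} = A x_k + A_d x_{k-d}:
# find P > 0, Q > 0 such that
#   [ A'PA - P + Q    A'PA_d       ]
#   [ A_d'PA          A_d'PA_d - Q ]  <  0,
# which makes V(k) = x_k'P x_k + sum_{i=k-d}^{k-1} x_i'Q x_i decrease.
import numpy as np
import cvxpy as cp

A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
Ad = np.array([[0.10, 0.00],
               [0.05, 0.10]])
n = A.shape[0]
eps = 1e-6

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

# Block LMI; linear in the decision variables P and Q.
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ Ad],
             [Ad.T @ P @ A,        Ad.T @ P @ Ad - Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve()
print("LMI feasible (stable for any constant delay)?",
      prob.status == cp.OPTIMAL)
```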