
    Theoretical Interpretations and Applications of Radial Basis Function Networks

    Medical applications have usually treated Radial Basis Function Networks simply as Artificial Neural Networks. However, RBFNs are Knowledge-Based Networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, together with a brief survey of dynamic learning algorithms. The interpretations of RBFNs can suggest applications that are particularly interesting in medical domains.
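    The Regularization Network view of an RBFN reduces training to a linear least-squares problem once the basis centers are fixed. A minimal sketch, assuming Gaussian basis functions, centers taken from the training data, and a ridge-regularized linear output layer (illustrative choices, not an algorithm from the survey):

    ```python
    import numpy as np

    def rbf_design(X, centers, gamma):
        # Gaussian activations from pairwise squared distances
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def fit_rbfn(X, y, centers, gamma=1.0, ridge=1e-6):
        # Output weights via ridge-regularized normal equations
        Phi = rbf_design(X, centers, gamma)
        return np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]),
                               Phi.T @ y)

    def predict_rbfn(X, centers, gamma, w):
        return rbf_design(X, centers, gamma) @ w

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0])
    centers = X[::20]                      # every 20th point as a center
    w = fit_rbfn(X, y, centers, gamma=1.0)
    pred = predict_rbfn(X, centers, 1.0, w)
    print(np.abs(pred - y).mean())         # small error on a smooth target
    ```

    Swapping the quadratic regularizer for other penalties, or the centers for support vectors, is what yields the alternative interpretations listed above.
    
    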

    Three dimensional loop quantum gravity: physical scalar product and spin foam models

    In this paper, we address the problem of the dynamics in three dimensional loop quantum gravity with zero cosmological constant. We construct a rigorous definition of Rovelli's generalized projection operator from the kinematical Hilbert space (corresponding to the quantization of the infinite dimensional kinematical configuration space of the theory) to the physical Hilbert space. In particular, we provide the definition of the physical scalar product, which can be represented in terms of a sum over (finite) spin-foam amplitudes. We thereby establish a clear-cut connection between the canonical quantization of three dimensional gravity and spin-foam models. We emphasize two main properties of the result: first, that no cut-off in the kinematical degrees of freedom of the theory is introduced (in contrast to standard `lattice' methods), and second, that no ill-defined sums over spins (`bubble' divergences) are present in the spin foam representation. Comment: Typos corrected; version appearing in Class. Quant. Grav.

    A Comparative Study of Pairwise Learning Methods based on Kernel Ridge Regression

    Many machine learning problems can be formulated as predicting labels for a pair of objects. Problems of that kind are often referred to as pairwise learning, dyadic prediction, or network inference problems. During the last decade kernel methods have played a dominant role in pairwise learning. They still obtain state-of-the-art predictive performance, but a theoretical analysis of their behavior has been underexplored in the machine learning literature. In this work we review and unify existing kernel-based algorithms that are commonly used in different pairwise learning settings, ranging from matrix filtering to zero-shot learning. To this end, we focus on closed-form efficient instantiations of Kronecker kernel ridge regression. We show that independent-task kernel ridge regression, two-step kernel ridge regression, and a linear matrix filter arise naturally as special cases of Kronecker kernel ridge regression, implying that all these methods implicitly minimize a squared loss. In addition, we analyze universality, consistency, and spectral filtering properties. Our theoretical results provide valuable insights for assessing the advantages and limitations of existing pairwise learning methods. Comment: arXiv admin note: text overlap with arXiv:1606.0427
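    The closed-form efficiency of Kronecker kernel ridge regression comes from the eigendecomposition "vec trick": the system (Kv ⊗ Ku + λI) vec(A) = vec(Y) is solved in the joint eigenbasis of the two factor kernels, at O(n³ + m³) cost instead of O(n³m³). A minimal sketch, assuming linear kernels and toy data (function and variable names are illustrative, not from the paper):

    ```python
    import numpy as np

    def kron_krr_fit(Ku, Kv, Y, lam):
        # Solve (Kv ⊗ Ku + lam*I) vec(A) = vec(Y) via eigendecompositions:
        # Ku = U diag(su) U^T,  Kv = V diag(sv) V^T
        su, U = np.linalg.eigh(Ku)
        sv, V = np.linalg.eigh(Kv)
        # Elementwise shrinkage in the joint eigenbasis
        S = np.outer(su, sv) + lam
        A = U @ ((U.T @ Y @ V) / S) @ V.T
        return A   # dual coefficients; predictions are Ku @ A @ Kv

    rng = np.random.default_rng(1)
    Xu = rng.normal(size=(30, 4))
    Xv = rng.normal(size=(20, 3))
    Ku, Kv = Xu @ Xu.T, Xv @ Xv.T          # linear kernels for the sketch
    Y = rng.normal(size=(30, 20))
    A = kron_krr_fit(Ku, Kv, Y, lam=0.5)

    # Sanity check against the explicit Kronecker system (small problem only)
    lhs = np.kron(Kv, Ku) + 0.5 * np.eye(30 * 20)
    vecA = np.linalg.solve(lhs, Y.reshape(-1, order="F"))
    print(np.allclose(A.reshape(-1, order="F"), vecA))
    ```

    The special cases mentioned above correspond to different shrinkage rules applied to the same pair of eigenbases, which is why they all implicitly minimize a squared loss.
    
    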