How the Dimension of Space Affects the Products of Pre-Biotic Evolution: The Spatial Population Dynamics of Structural Complexity and The Emergence of Membranes
We show that autocatalytic networks of epsilon-machines and their population
dynamics differ substantially between spatial (geographically distributed) and
nonspatial (panmictic) populations. Generally, regions of spacetime-invariant
autocatalytic networks---or domains---emerge in geographically distributed
populations. These are separated by functional membranes of complementary
epsilon-machines that actively translate between the domains and are
responsible for their growth and stability. We analyze both spatial and
nonspatial populations, determining the algebraic properties of the
autocatalytic networks that allow for space to affect the dynamics and so
generate autocatalytic domains and membranes. In addition, we analyze
populations of intermediate spatial architecture, delineating the thresholds at
which spatial memory (information storage) begins to determine the character of
the emergent autocatalytic organization.

Comment: 9 pages, 7 figures, 2 tables;
http://cse.ucdavis.edu/~cmg/compmech/pubs/ss.ht
A Comparative Study of Pairwise Learning Methods based on Kernel Ridge Regression
Many machine learning problems can be formulated as predicting labels for a
pair of objects. Problems of that kind are often referred to as pairwise
learning, dyadic prediction or network inference problems. During the last
decade, kernel methods have played a dominant role in pairwise learning. They
still achieve state-of-the-art predictive performance, but a theoretical
analysis of their behavior remains underexplored in the machine learning
literature.
In this work we review and unify existing kernel-based algorithms that are
commonly used in different pairwise learning settings, ranging from matrix
filtering to zero-shot learning. To this end, we focus on closed-form efficient
instantiations of Kronecker kernel ridge regression. We show that independent
task kernel ridge regression, two-step kernel ridge regression and a linear
matrix filter all arise naturally as special cases of Kronecker kernel ridge
regression, implying that each of these methods implicitly minimizes a squared
loss.
In addition, we analyze universality, consistency and spectral filtering
properties. Our theoretical results provide valuable insights for assessing the
advantages and limitations of existing pairwise learning methods.

Comment: arXiv admin note: text overlap with arXiv:1606.0427