An algorithm for recognition of n-collapsing words
Abstract: A word w over a finite alphabet Σ is n-collapsing if for an arbitrary deterministic finite automaton A=〈Q,Σ,δ〉, the inequality |δ(Q,w)|≤|Q|−n holds provided that |δ(Q,u)|≤|Q|−n for some word u∈Σ+ (depending on A). We prove that the property of being n-collapsing is algorithmically recognizable for any given positive integer n. We also prove that the language of all n-collapsing words is context-sensitive.
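The quantity |δ(Q,w)| in the definition above — the size of the image of the full state set after reading a word — can be computed directly. The following minimal sketch does so for a toy DFA; the automaton, alphabet, and word are illustrative assumptions, not taken from the paper.

```python
def image_size(states, delta, word):
    """Return |delta(Q, w)|: the number of distinct states reachable
    from the full state set Q after reading the word w."""
    current = set(states)
    for letter in word:
        current = {delta[(q, letter)] for q in current}
    return len(current)

# Toy 3-state DFA over {a, b} (an assumption for illustration):
# 'a' merges states 0 and 1, 'b' cyclically permutes the states.
states = {0, 1, 2}
delta = {
    (0, 'a'): 0, (1, 'a'): 0, (2, 'a'): 2,
    (0, 'b'): 1, (1, 'b'): 2, (2, 'b'): 0,
}

# Reading "a" compresses Q from 3 states to 2, i.e. |Q| - 1,
# while "b" is a permutation and compresses nothing.
print(image_size(states, delta, "a"))  # → 2
print(image_size(states, delta, "b"))  # → 3
```

A word w is then n-collapsing when this image size drops to at most |Q|−n for *every* DFA that admits some compressing word u; checking a single automaton, as above, only illustrates the definition.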
From neural PCA to deep unsupervised learning
A network supporting deep unsupervised learning is presented. The network is
an autoencoder with lateral shortcut connections from the encoder to decoder at
each level of the hierarchy. The lateral shortcut connections allow the higher
levels of the hierarchy to focus on abstract invariant features. While standard
autoencoders are analogous to latent variable models with a single layer of
stochastic variables, the proposed network is analogous to hierarchical latent
variable models. Learning combines the denoising autoencoder and denoising
source separation frameworks. Each layer of the network contributes to the cost
function a term which measures the distance of the representations produced by
the encoder and the decoder. Since training signals originate from all levels
of the network, all layers can learn efficiently even in deep networks. The
speedup offered by cost terms from higher levels of the hierarchy and the
ability to learn invariant features are demonstrated in experiments.

Comment: A revised version of an article that has been accepted for publication in Advances in Independent Component Analysis and Learning Machines (2015), edited by Ella Bingham, Samuel Kaski, Jorma Laaksonen and Jouko Lampinen.
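The per-layer cost described in the abstract — each level measuring the distance between the encoder's representation and the decoder's reconstruction — can be sketched as follows. The layer sizes, data, and uniform layer weights are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_cost(z, z_hat, weight=1.0):
    """Cost term for one level of the hierarchy: weighted mean squared
    distance between the encoder representation z and the decoder's
    reconstruction z_hat."""
    return weight * np.mean((z - z_hat) ** 2)

# Pretend encoder outputs at three levels of the hierarchy
# (batch of 8, with layer widths 32, 16, 4 -- assumed values).
encoder_reps = [rng.standard_normal((8, d)) for d in (32, 16, 4)]
# Decoder reconstructions, here simulated as noisy copies.
decoder_reps = [z + 0.1 * rng.standard_normal(z.shape) for z in encoder_reps]

# The total cost sums contributions from every level, so training
# signals originate at all depths rather than only at the bottom.
total = sum(layer_cost(z, zh) for z, zh in zip(encoder_reps, decoder_reps))
print(total > 0.0)
```

Because each level contributes its own term, gradients reach even the deepest layers directly, which is the mechanism the abstract credits for efficient learning in deep networks.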