Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review
The paper characterizes classes of functions for which deep learning can be
exponentially better than shallow learning. Deep convolutional networks are a
special case of these conditions, though weight sharing is not the main reason
for their exponential advantage.
What you will gain by rounding: theory and algorithms for rounding rank
When factorizing binary matrices, we often have to make a choice between using expensive combinatorial methods
that retain the discrete nature of the data and using continuous methods that can be more efficient but destroy the discrete structure. Alternatively, we can first compute a continuous factorization and subsequently apply a rounding procedure to obtain a discrete representation. But what will we gain by rounding? Will this yield lower reconstruction errors? Is it easy
to find a low-rank matrix that rounds to a given binary matrix? Does it matter which threshold we use for rounding? Does it
matter if we allow for only non-negative factorizations? In this paper, we approach these and further questions by presenting
and studying the concept of rounding rank. We show that rounding rank is related to linear classification, dimensionality
reduction, and nested matrices. We also report on an extensive experimental study that compares different algorithms for finding good factorizations under the rounding rank model.
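The round-after-factorizing idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithms: the rank-k factorization here is a plain truncated SVD, and the threshold 0.5 and the function name are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A binary matrix to approximate.
B = (rng.random((20, 15)) < 0.4).astype(int)

def rounded_rank_k_approx(B, k, tau=0.5):
    """Continuous rank-k factorization (truncated SVD) of B,
    rounded elementwise at threshold tau to recover a binary matrix."""
    U, s, Vt = np.linalg.svd(B.astype(float), full_matrices=False)
    approx = (U[:, :k] * s[:k]) @ Vt[:k, :]
    return (approx >= tau).astype(int)

# Reconstruction error of the rounded approximation at a few ranks:
for k in (1, 3, 5):
    R = rounded_rank_k_approx(B, k)
    print(f"rank {k}: {int((R != B).sum())} wrong entries")
```

The smallest k for which the rounded reconstruction matches B exactly is, informally, the quantity the abstract calls the rounding rank of B (with respect to this factorization and threshold).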
Effective dimension of finite semigroups
In this paper we discuss various aspects of the problem of determining the
minimal dimension of an injective linear representation of a finite semigroup
over a field. We outline some general techniques and results, and apply them to
numerous examples. Comment: To appear in J. Pure Appl. Al
Tables of subspace codes
One of the main problems of subspace coding asks for the maximum possible
cardinality of a subspace code with minimum distance at least $d$ over
$\mathbb{F}_q^n$, where the dimensions of the codewords, which are vector
spaces, are contained in $K \subseteq \{0, 1, \dots, n\}$. In the special case of
$K = \{k\}$ one speaks of constant dimension codes. Since this (still) emerging
field is flourishing on the one hand, and on the other has many
connections to classical objects from Galois geometry, it is difficult to
maintain an overview of the current state of knowledge. To this end
we have implemented an online database of the (at least to us) known results
at \url{subspacecodes.uni-bayreuth.de}. The aim of this recurrently updated
technical report is to provide a user guide describing how this technical tool can be used
in research projects and to document the theoretic and
algorithmic knowledge implemented so far. Comment: 44 pages, 6 tables, 7 screenshots
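The distance notion underlying such codes is the standard subspace distance $d_S(U, W) = \dim(U + W) - \dim(U \cap W)$. A minimal sketch of computing it over $\mathbb{F}_2$ via ranks (the helper names are assumptions for this example, not part of the database's tooling):

```python
import numpy as np

def rank_gf2(M):
    """Rank of a 0/1 matrix over GF(2) via Gaussian elimination."""
    M = (np.array(M) % 2).astype(int)
    rank, (rows, cols) = 0, M.shape
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]   # move pivot row up
        for r in range(rows):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]               # eliminate over GF(2)
        rank += 1
    return rank

def subspace_distance(U, W):
    """d_S(U, W) = dim(U+W) - dim(U ∩ W) = 2*dim(U+W) - dim U - dim W.
    U and W are generator matrices whose rows span the subspaces."""
    d_sum = rank_gf2(np.vstack([U, W]))
    return 2 * d_sum - rank_gf2(U) - rank_gf2(W)

# Two 2-dimensional subspaces of F_2^4 sharing a 1-dimensional intersection:
U = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])
W = np.array([[1, 0, 0, 0], [0, 0, 1, 0]])
print(subspace_distance(U, W))  # → 2
```

A constant dimension code with minimum distance $d$ is then a set of $k$-dimensional subspaces whose pairwise subspace distance is at least $d$; the tables bound the maximum size of such sets.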