
    Universal Approximation Depth and Errors of Narrow Belief Networks with Discrete Units

    We generalize recent theoretical work on the minimal number of layers of narrow deep belief networks that can approximate any probability distribution on the states of their visible units arbitrarily well. We relax the setting of binary units (Sutskever and Hinton, 2008; Le Roux and Bengio, 2008, 2010; Montúfar and Ay, 2011) to units with arbitrary finite state spaces, and the vanishing approximation error to an arbitrary approximation error tolerance. For example, we show that a $q$-ary deep belief network with $L \geq 2 + \frac{q^{\lceil m-\delta \rceil}-1}{q-1}$ layers of width $n \leq m + \log_q(m) + 1$ for some $m \in \mathbb{N}$ can approximate any probability distribution on $\{0,1,\ldots,q-1\}^n$ without exceeding a Kullback-Leibler divergence of $\delta$. Our analysis covers discrete restricted Boltzmann machines and naïve Bayes models as special cases.
    Comment: 19 pages, 5 figures, 1 table
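    As a worked instance of the stated bound (with illustrative values $q = 2$, $m = 4$, $\delta = 1$, chosen here for concreteness rather than taken from the paper): the width bound gives $n \leq 4 + \log_2(4) + 1 = 7$, and the depth bound gives
    $$L \geq 2 + \frac{2^{\lceil 4 - 1 \rceil} - 1}{2 - 1} = 2 + 7 = 9,$$
    so a binary deep belief network of width at most 7 and depth at least 9 can approximate any distribution on $\{0,1\}^7$ to within Kullback-Leibler divergence 1.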

    Top-Down Composition of Software Architectures

    This paper discusses an approach for top-down composition of software architectures. First, an architecture is derived that addresses functional requirements only. This architecture contains a number of variability points, which are then filled in to address quality concerns. The quality requirements and associated architectural solution fragments are captured in a so-called Feature-Solution (FS) graph. The solution fragments captured in this graph are used to iteratively compose an architecture. Our versatile composition technique allows for pre- and post-refinements, as well as refinements that involve multiple variability points. In addition, the use of the FS graph supports Aspect-Oriented Programming (AOP) at the architecture level.
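    To make the composition idea concrete, the following is a minimal sketch in Python; the names (FSGraph, SolutionFragment, compose) and the dictionary-based representation are illustrative assumptions, not the paper's API:

    from dataclasses import dataclass, field

    @dataclass
    class SolutionFragment:
        name: str
        targets: list  # variability points this fragment fills

    @dataclass
    class FSGraph:
        # Maps a quality concern (feature) to the solution fragments addressing it.
        edges: dict = field(default_factory=dict)

        def add(self, concern, fragment):
            self.edges.setdefault(concern, []).append(fragment)

        def compose(self, architecture, concerns):
            # Iteratively fill variability points for each selected quality concern;
            # a single fragment may refine multiple variability points at once.
            for concern in concerns:
                for fragment in self.edges.get(concern, []):
                    for point in fragment.targets:
                        architecture.setdefault(point, []).append(fragment.name)
            return architecture

    graph = FSGraph()
    graph.add("security", SolutionFragment("auth_filter", ["request_handler"]))
    graph.add("performance", SolutionFragment("cache_layer", ["request_handler", "storage"]))

    # Functional architecture with open variability points, prior to quality refinement.
    arch = {"request_handler": [], "storage": []}
    print(graph.compose(arch, ["security", "performance"]))
    # {'request_handler': ['auth_filter', 'cache_layer'], 'storage': ['cache_layer']}

    In this sketch, each concern weaves its fragments into one or more open variability points of the functional architecture, which is what makes the approach resemble AOP at the architecture level.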