Algorithmic information and incompressibility of families of multidimensional networks
This article presents a theoretical investigation of string-based generalized
representations of families of finite networks in a multidimensional space.
First, we study the recursive labeling of networks whose nodes carry an
arbitrary (finite) number of dimensions (or aspects), such as time instants or
layers; such networks are formalized as multiaspect graphs. We
show that, unlike classical graphs, the algorithmic information of a
multidimensional network is not in general dominated by the algorithmic
information of the binary sequence that determines the presence or absence of
edges. This universal algorithmic approach sets limitations and conditions for
irreducible information content analysis in comparing networks with a large
number of dimensions, such as multilayer networks. Nevertheless, we show that
there are particular cases of infinite nesting families of finite
multidimensional networks with a unified recursive labeling such that each
member of these families is incompressible. From these results, we study
network topological properties and equivalences in irreducible information
content of multidimensional networks in comparison to their isomorphic
classical graphs.
Comment: Extended preprint version of the paper.
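Kolmogorov complexity itself is uncomputable, but the dominance claim above can at least be probed empirically with a lossless compressor as a crude, computable stand-in for algorithmic information. The sketch below, with function names of our own choosing (it is not the paper's formalism), compares the raw and compressed lengths of a graph's edge bit-string:

```python
import zlib

def edge_bitstring(adj):
    """Upper-triangular adjacency bits of a simple undirected graph, as bytes.

    The resulting 0/1 string is the 'binary sequence that determines the
    presence or absence of edges' discussed in the abstract.
    """
    n = len(adj)
    bits = ''.join(str(adj[i][j]) for i in range(n) for j in range(i + 1, n))
    return bits.encode()

def compressed_len(data: bytes) -> int:
    """Length of zlib output at maximum effort: a computable upper bound
    standing in for the (uncomputable) algorithmic information of the data."""
    return len(zlib.compress(data, 9))
```

A highly regular edge sequence (e.g. a complete or periodic graph) compresses far below its raw length, while an edge sequence with no exploitable structure does not; for multidimensional networks one would additionally have to encode the node labeling across aspects, which is exactly where the abstract says the edge sequence stops dominating.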
Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity
The relationship between the Bayesian approach and the minimum description
length approach is established. We sharpen and clarify the general modeling
principles MDL and MML, abstracted as the ideal MDL principle and defined from
Bayes's rule by means of Kolmogorov complexity. The basic condition under which
the ideal principle should be applied is encapsulated as the Fundamental
Inequality, which in broad terms states that the principle is valid when the
data are random relative to every contemplated hypothesis, and these
hypotheses are in turn random relative to the (universal) prior. Basically, the ideal
principle states that the prior probability associated with the hypothesis
should be given by the algorithmic universal probability, and the sum of the
log universal probability of the model plus the log of the probability of the
data given the model should be minimized. If we restrict the model class to
finite sets, then application of the ideal principle turns into Kolmogorov's
minimal sufficient statistic. In general we show that data compression is
almost always the best strategy, both in hypothesis identification and
prediction.
Comment: 35 pages, LaTeX. Submitted to IEEE Trans. Inform. Theory.
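Written out (our paraphrase of the abstract, with K the prefix Kolmogorov complexity and \mathbf{m} the universal discrete semimeasure), the ideal MDL selection reads:

```latex
H_{\mathrm{MDL}}
  = \operatorname*{arg\,min}_{H}
    \Bigl[ -\log \mathbf{m}(H) \;-\; \log P(D \mid H) \Bigr],
\qquad
-\log \mathbf{m}(H) = K(H) + O(1),
```

where the second identity is the coding theorem, so the prior term is the algorithmic universal probability of the hypothesis; under the Fundamental Inequality the data are random relative to H, and the data term likewise agrees with the length of a shortest description of the data given the model, up to the stated precision.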
Counting smaller elements in the Tamari and m-Tamari lattices
We introduce new combinatorial objects, the interval-posets, that encode
intervals of the Tamari lattice. We then find a combinatorial interpretation of
the bilinear operator that appears in the functional equation of Tamari
intervals described by Chapoton. Thus, we retrieve this functional equation and
prove that the polynomial recursively computed from the bilinear operator on
each tree T counts the number of trees smaller than T in the Tamari order. Then
we show that a similar (m+1)-linear operator is also used in the functional
equation of m-Tamari intervals. We explain how the m-Tamari lattices can be
interpreted in terms of (m+1)-ary trees or a certain class of binary trees. We
then use the interval-posets to recover the functional equation of m-Tamari
intervals and to prove a generalized formula that counts the number of elements
smaller than or equal to a given tree in the m-Tamari lattice.
Comment: 46 pages + 3 pages of code appendix, 27 figures. Long version of
arXiv:1212.0751. To appear in Journal of Combinatorial Theory, Series
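For small sizes the counting statement can be checked by brute force, without the paper's interval-poset machinery. The sketch below is our own construction: binary trees are nested tuples (None for a leaf), and we adopt the convention that the covering relation of the Tamari lattice is the right rotation ((A,B),C) -> (A,(B,C)), so the downset of a tree T is reached by applying the inverse rotation repeatedly.

```python
def trees(n):
    """All binary trees with n internal nodes, as nested tuples (None = leaf)."""
    if n == 0:
        return [None]
    out = []
    for k in range(n):
        for left in trees(k):
            for right in trees(n - 1 - k):
                out.append((left, right))
    return out

def down_covers(t):
    """Trees covered by t in the Tamari order: apply one inverse rotation
    (A,(B,C)) -> ((A,B),C) at some node of t."""
    if t is None:
        return
    left, right = t
    if right is not None:
        b, c = right
        yield ((left, b), c)
    for l2 in down_covers(left):
        yield (l2, right)
    for r2 in down_covers(right):
        yield (left, r2)

def count_smaller_or_equal(t):
    """Number of trees <= t in the Tamari lattice (size of the downset of t)."""
    seen = {t}
    stack = [t]
    while stack:
        u = stack.pop()
        for v in down_covers(u):
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen)
```

Summing count_smaller_or_equal over all trees of size n recovers the number of Tamari intervals, 2(4n+1)!/((n+1)!(3n+2)!), which gives 13 for n = 3 and makes a handy cross-check against the functional equation discussed in the abstract.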