
    Geometrical complexity of data approximators

    Many methods have been developed to approximate a cloud of vectors embedded in a high-dimensional space by simpler objects: from principal points and linear manifolds to self-organizing maps, neural gas, elastic maps, and various types of principal curves and principal trees. For each type of approximator, a corresponding measure of complexity has also been developed. These measures are necessary to balance accuracy against complexity and to define the optimal approximations of a given type. We propose a measure of complexity (geometrical complexity) which is applicable to approximators of several types and which allows data approximations of different types to be compared.

    Comment: 10 pages, 3 figures, minor correction and extension

    Data complexity measured by principal graphs

    How does one measure the complexity of a finite set of vectors embedded in a multidimensional space? This is a non-trivial question which can be approached in many different ways. Here we suggest a set of data complexity measures using universal approximators, principal cubic complexes. Principal cubic complexes generalise the notion of principal manifolds to datasets with non-trivial topologies. The type of a principal cubic complex is determined by its dimension and a grammar of elementary graph transformations. The simplest grammar produces principal trees. We introduce three natural types of data complexity: 1) geometric (deviation of the data's approximator from some "idealized" configuration, such as deviation from harmonicity); 2) structural (how many elements of a principal graph are needed to approximate the data); and 3) construction complexity (how many applications of elementary graph transformations are needed to construct the principal object starting from the simplest one). We compute these measures for several simulated and real-life data distributions and show them in "accuracy-complexity" plots, helping to optimize the accuracy/complexity ratio. We discuss various issues connected with measuring data complexity. Software for computing data complexity measures from principal cubic complexes is provided as well.

    Comment: Computers and Mathematics with Applications, in press
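The geometric complexity described above can be made concrete with a small sketch. This is not the authors' implementation; it assumes one plausible reading of "deviation from harmonicity": a node of an embedded graph is harmonic when it lies at the centroid of its neighbours, so the score sums the squared deviations over internal nodes.

```python
import numpy as np

def harmonicity_deviation(positions, edges):
    """Sum of squared distances between each internal node and the
    centroid of its graph neighbours (a hypothetical harmonicity score).

    positions: dict node -> np.ndarray point in R^d
    edges: iterable of (u, v) node pairs
    """
    neighbours = {u: [] for u in positions}
    for u, v in edges:
        neighbours[u].append(v)
        neighbours[v].append(u)
    dev = 0.0
    for u, nbrs in neighbours.items():
        if len(nbrs) >= 2:  # leaves deviate trivially, so skip them
            centroid = np.mean([positions[v] for v in nbrs], axis=0)
            dev += float(np.sum((positions[u] - centroid) ** 2))
    return dev
```

For a path graph embedded as evenly spaced collinear points the score is zero; displacing an internal node off the segment between its neighbours makes it positive, so the score grows as the embedding departs from the "idealized" harmonic configuration.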

    The Prediction Value

    We introduce the prediction value (PV) as a measure of players' informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i's prediction value equals the difference between the conditional expectations of v(S) when i cooperates or not. We characterize the prediction value as a special member of the class of (extended) values which satisfy anonymity, linearity and a consistency property. Every n-player binomial semivalue coincides with the PV for a particular family of probability distributions over coalitions. The PV can thus be regarded as a power index in specific cases. Conversely, some semivalues -- including the Banzhaf but not the Shapley value -- can be interpreted in terms of informational importance.

    Comment: 26 pages, 2 tables
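The definition in this abstract admits a direct toy computation: player i's prediction value is E[v(S) | i in S] minus E[v(S) | i not in S] under a distribution p over coalitions. A minimal sketch (the game v and the uniform distribution below are illustrative choices, not from the paper):

```python
from itertools import combinations

def prediction_value(n, v, p, i):
    """PV_i = E[v(S) | i in S] - E[v(S) | i not in S] under distribution p."""
    coalitions = [frozenset(c) for r in range(n + 1)
                  for c in combinations(range(n), r)]

    def cond_exp(member):
        # Conditional expectation of v(S) given whether i belongs to S.
        sel = [S for S in coalitions if (i in S) == member]
        mass = sum(p[S] for S in sel)
        return sum(p[S] * v(S) for S in sel) / mass

    return cond_exp(True) - cond_exp(False)

# Illustrative 2-player game: v(S) = |S|, uniform distribution over coalitions.
n = 2
coalitions = [frozenset(c) for r in range(n + 1)
              for c in combinations(range(n), r)]
p = {S: 1 / len(coalitions) for S in coalitions}
pv = prediction_value(n, lambda S: len(S), p, 0)  # expected: 1.0
```

Here conditioning on player 0 cooperating raises the expected worth from 0.5 to 1.5, so the prediction value is 1.0; the sketch assumes every conditioning event has positive probability.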

    Theoretical framework for quantum networks

    We present a framework to treat quantum networks and all possible transformations thereof, including as special cases all possible manipulations of quantum states, measurements, and channels, such as, e.g., cloning, discrimination, estimation, and tomography. Our framework is based on the concepts of the quantum comb, which describes all transformations achievable by a given quantum network, and the link product, the operation of connecting two quantum networks. Quantum networks are treated both from a constructive point of view, based on connections of elementary circuits, and from an axiomatic one, based on a hierarchy of admissible quantum maps. In the axiomatic context a fundamental property is shown, which we call universality of quantum memory channels: any admissible transformation of quantum networks can be realized by a suitable sequence of memory channels. We pose the open problem of whether this property fails for some non-quantum theory, e.g., for no-signaling boxes.

    Comment: 23 pages, revtex

    Lorenz, Gödel and Penrose: New perspectives on determinism and causality in fundamental physics

    Although meteorologist Ed Lorenz is best known for his pioneering work on chaotic unpredictability, the key discovery at the core of his work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's Incompleteness Theorem. These properties, combined with some recent developments in theoretical and observational cosmology, motivate what is referred to as the 'cosmological invariant set postulate': that the universe U can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set I_U in its state space. Symbolic representations of I_U are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, whilst the cosmological invariant set appears to have a rich enough structure to allow a description of quantum probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory from being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational Lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, a 'gravitational theory of the quantum' is proposed based on the geometry of I_U, with potential observational consequences for the dark universe.

    Comment: This manuscript has been accepted for publication in Contemporary Physics and is based on the author's 9th Dennis Sciama Lecture, given in Oxford and Trieste