    Representation Learning: A Review and New Perspectives

    The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide, to varying degrees, the different explanatory factors of variation behind the data. Although specific domain knowledge can be used to help design representations, learning with generic priors can also be used, and the quest for AI is motivating the design of more powerful representation-learning algorithms implementing such priors. This paper reviews recent work in the area of unsupervised feature learning and deep learning, covering advances in probabilistic models, auto-encoders, manifold learning, and deep networks. This motivates longer-term unanswered questions about the appropriate objectives for learning good representations, for computing representations (i.e., inference), and the geometrical connections between representation learning, density estimation and manifold learning.
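
    The review surveys auto-encoders among other representation learners. As a self-contained toy illustration of the idea (not code from the paper; the data, sizes, and hyperparameters are invented), the NumPy sketch below trains a one-hidden-layer autoencoder by gradient descent, so that the hidden activations act as a learned representation of the input.

    # Minimal single-hidden-layer autoencoder trained by plain gradient descent.
    # Illustrative only: data, sizes and hyperparameters are arbitrary choices.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20)) @ rng.normal(size=(20, 20)) * 0.1  # correlated toy data

    n_in, n_hid, lr = X.shape[1], 5, 0.01
    W_enc = rng.normal(scale=0.1, size=(n_in, n_hid))   # encoder weights
    W_dec = rng.normal(scale=0.1, size=(n_hid, n_in))   # decoder weights

    for step in range(2000):
        H = np.tanh(X @ W_enc)                 # hidden code = learned representation
        X_hat = H @ W_dec                      # linear reconstruction
        err = X_hat - X
        grad_dec = H.T @ err / len(X)          # backprop of the squared-error loss
        grad_H = err @ W_dec.T * (1.0 - H**2)  # tanh derivative
        grad_enc = X.T @ grad_H / len(X)
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc

    codes = np.tanh(X @ W_enc)                 # 5-dimensional representation of X
    print("final reconstruction MSE:", float(np.mean(err**2)))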

    Groupoid Semantics for Thermal Computing

    A groupoid semantics is presented for systems with both logical and thermal degrees of freedom. We apply this to a syntactic model for encryption, and obtain an algebraic characterization of the heat produced by the encryption function, as predicted by Landauer's principle. Our model has a linear representation theory that reveals an underlying quantum semantics, giving for the first time a functorial classical model for quantum teleportation and other quantum phenomena. Comment: We describe a groupoid model for thermodynamic computation, and a quantization procedure that turns encrypted communication into quantum teleportation. Everything is done using higher category theory.
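
    The heat bound invoked here is Landauer's principle: erasing information dissipates at least k_B·T·ln 2 joules per erased bit. As a rough, generic illustration of that accounting (not the paper's groupoid construction; the map, input distribution, and temperature are invented), the Python sketch below compares input and output entropies of a many-to-one function and evaluates the resulting lower bound on dissipated heat.

    # Back-of-the-envelope Landauer accounting for a many-to-one (irreversible)
    # function under a uniform input distribution. Generic illustration only,
    # not the paper's groupoid semantics.
    import math
    from collections import Counter

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # temperature in kelvin (illustrative)

    def entropy_bits(counts):
        """Shannon entropy (in bits) of the distribution given by counts."""
        total = sum(counts)
        return -sum(c / total * math.log2(c / total) for c in counts if c)

    # Hypothetical map on 4-bit inputs that simply discards one input bit,
    # so one bit of information is erased per evaluation.
    inputs = [(k, m) for k in range(2) for m in range(8)]
    outputs = Counter(m for _, m in inputs)

    H_in = entropy_bits([1] * len(inputs))        # uniform over 16 inputs: 4 bits
    H_out = entropy_bits(list(outputs.values()))  # 8 equally likely outputs: 3 bits
    erased_bits = H_in - H_out

    q_min = erased_bits * k_B * T * math.log(2)   # Landauer bound on dissipated heat
    print(f"erased: {erased_bits:.1f} bits, minimum heat: {q_min:.3e} J")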

    Factor Graphs for Quantum Probabilities

    A factor-graph representation of quantum-mechanical probabilities (involving any number of measurements) is proposed. Unlike standard statistical models, the proposed representation uses auxiliary variables (state variables) that are not random variables. All joint probability distributions are marginals of some complex-valued function q, and it is demonstrated how the basic concepts of quantum mechanics relate to factorizations and marginals of q. Comment: To appear in IEEE Transactions on Information Theory, 201
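
    A hedged toy reading of the central claim (observed probabilities arise as marginals of a complex-valued function q over auxiliary state variables): the NumPy sketch below builds such a q for a single projective measurement of a pure state and checks that summing out the two auxiliary state variables reproduces the Born-rule probabilities. The state, measurement basis, and variable names are invented for illustration and are not the paper's notation.

    # Toy example: measurement probabilities as marginals of a complex-valued
    # function q over auxiliary (non-random) state variables s and s'.
    import numpy as np

    rng = np.random.default_rng(1)

    # Random pure state psi of a 3-level system and a random measurement basis U.
    psi = rng.normal(size=3) + 1j * rng.normal(size=3)
    psi /= np.linalg.norm(psi)
    U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

    # q(x, s, s') = U[x, s] psi[s] * conj(U[x, s'] psi[s']) is complex valued.
    q = np.einsum('xs,s,xt,t->xst', U, psi, U.conj(), psi.conj())

    p_marginal = q.sum(axis=(1, 2)).real    # marginalize out the state variables
    p_born = np.abs(U @ psi) ** 2           # Born rule, for comparison

    assert np.allclose(p_marginal, p_born) and np.isclose(p_marginal.sum(), 1.0)
    print(p_marginal)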

    Receiver Architectures for MIMO-OFDM Based on a Combined VMP-SP Algorithm

    Iterative information processing, based either on heuristics or on analytical frameworks, has been shown to be a very powerful tool for the design of efficient, yet feasible, wireless receiver architectures. Within this context, algorithms performing message passing on a probabilistic graph, such as the sum-product (SP) and variational message passing (VMP) algorithms, have become increasingly popular. In this contribution, we apply a combined VMP-SP message-passing technique to the design of receivers for MIMO-OFDM systems. The message-passing equations of the combined scheme can be obtained from the equations of the stationary points of a constrained region-based free-energy approximation. When applied to a MIMO-OFDM probabilistic model, we obtain a generic receiver architecture performing iterative channel weight and noise precision estimation, equalization, and data decoding. We show that this generic scheme can be particularized to a variety of different receiver structures, ranging from high-performance iterative structures to low-complexity receivers. This allows the signal processing to be flexibly tailored to the requirements of each specific application. The numerical assessment of our solutions, based on Monte Carlo simulations, corroborates the high performance of the proposed algorithms and their superiority over heuristic approaches.
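
    As a drastically simplified, hedged illustration of the variational part of such a receiver (not the paper's MIMO-OFDM architecture; the model, priors, and symbol names are invented), the sketch below runs VMP-style updates on a pilot-aided scalar channel y = h·x + n with unknown complex channel weight h and unknown noise precision, alternating a complex-Gaussian update of q(h) with a Gamma update of q(gamma).

    # Toy VMP iteration for a pilot-aided scalar channel y = h*x + n:
    # alternate a complex-Gaussian update of q(h) with a Gamma update of the
    # noise precision q(gamma). A drastic simplification, shown only to convey
    # the flavor of the channel-weight / noise-precision updates.
    import numpy as np

    rng = np.random.default_rng(2)
    N = 64
    x = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)  # QPSK pilots
    h_true, gamma_true = 0.8 - 0.3j, 20.0
    noise = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2 * gamma_true)
    y = h_true * x + noise

    prec0, a0, b0 = 1e-3, 1e-3, 1e-3   # vague priors: h ~ CN(0, 1/prec0), gamma ~ Gamma(a0, b0)
    E_gamma = 1.0                      # initial guess for E[gamma]

    for _ in range(20):
        # q(h): complex Gaussian combining the prior with the expected likelihood.
        prec_h = prec0 + E_gamma * np.sum(np.abs(x) ** 2)
        mean_h = E_gamma * np.sum(np.conj(x) * y) / prec_h
        # q(gamma): Gamma, using the expected squared residual under q(h).
        resid = np.sum(np.abs(y - mean_h * x) ** 2) + np.sum(np.abs(x) ** 2) / prec_h
        E_gamma = (a0 + N) / (b0 + resid)

    print("estimated channel weight:", mean_h, " estimated noise precision:", E_gamma)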