Computation of distances for regular and context-free probabilistic languages
Several mathematical distances between probabilistic languages have been investigated in the literature, motivated by applications in language modeling, computational biology, syntactic pattern matching and machine learning. In most cases, only pairs of probabilistic regular languages were considered. In this paper we extend the previous results to pairs of languages generated by a probabilistic context-free grammar and a probabilistic finite automaton.
Postprint. Peer reviewed.
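The setting the abstract describes can be illustrated with a toy computation: a minimal Python sketch that approximates the L1 distance between the string distributions of a small probabilistic context-free grammar and a probabilistic finite automaton by enumerating strings up to a length cutoff. The grammar, automaton, and truncation approach are illustrative assumptions, not the paper's construction (which computes such distances exactly); both toy models happen to put all their mass on strings of the form a^n b, so we can sum over n.

```python
# Toy models (assumptions for illustration), both over strings a^n b:
#   PCFG:  S -> 'a' S (prob 0.5) | 'b' (prob 0.5)   =>  P(a^n b) = 0.5**(n+1)
#   PFA:   loop on 'a' with prob 0.4, emit 'b' and halt with prob 0.6
#          =>  P(a^n b) = 0.4**n * 0.6

def pcfg_prob(n):
    """Probability the toy PCFG assigns to the string a^n b."""
    return 0.5 ** (n + 1)

def pfa_prob(n):
    """Probability the toy PFA assigns to the string a^n b."""
    return 0.4 ** n * 0.6

def truncated_l1(cutoff=50):
    """Approximate L1 distance, summing over strings up to the cutoff."""
    return sum(abs(pcfg_prob(n) - pfa_prob(n)) for n in range(cutoff + 1))

print(round(truncated_l1(), 6))  # → 0.2
```

The truncation error at cutoff 50 is below 1e-15 here because both tails decay geometrically; the paper's point is precisely that such distances can be computed without this kind of enumeration.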
Recent Advances in Optimal Transport for Machine Learning
Recently, Optimal Transport has been proposed as a probabilistic framework in
Machine Learning for comparing and manipulating probability distributions. This
is rooted in its rich history and theory, and has offered new solutions to
different problems in machine learning, such as generative modeling and
transfer learning. In this survey we explore contributions of Optimal Transport
for Machine Learning over the period 2012 -- 2022, focusing on four sub-fields
of Machine Learning: supervised, unsupervised, transfer and reinforcement
learning. We further highlight recent developments in computational Optimal
Transport and their interplay with Machine Learning practice.
Comment: 20 pages, 5 figures, under review
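A centerpiece of the computational Optimal Transport the survey covers is the Sinkhorn algorithm for entropy-regularized OT. Below is a minimal numpy sketch: the toy histograms, grid, regularization strength `eps`, and iteration count are arbitrary choices for illustration, not anything prescribed by the survey.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """Entropy-regularized OT: alternately rescale rows and columns of the
    Gibbs kernel until the plan's marginals match a and b; returns the plan."""
    K = np.exp(-C / eps)           # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)          # fit column marginals
        u = a / (K @ v)            # fit row marginals
    return u[:, None] * K * v[None, :]

# Toy problem: move mass from the left end of a 1D grid to the right end.
x = np.linspace(0.0, 1.0, 5)
a = np.array([0.5, 0.5, 0.0, 0.0, 0.0])
b = np.array([0.0, 0.0, 0.0, 0.5, 0.5])
C = (x[:, None] - x[None, :]) ** 2     # squared-distance cost matrix
P = sinkhorn(a, b, C)
cost = float((P * C).sum())            # regularized transport cost
```

As `eps` shrinks, `cost` approaches the unregularized OT cost (0.5625 for this toy problem); larger `eps` spreads the plan out and speeds up convergence.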
Unsupervised Generative Modeling Using Matrix Product States
Generative modeling, which learns a joint probability distribution from data
and generates samples according to it, is an important task in machine learning
and artificial intelligence. Inspired by the probabilistic interpretation of
quantum physics, we propose a generative model using matrix product states,
a tensor network originally proposed for describing (particularly
one-dimensional) entangled quantum states. Our model enjoys efficient learning
analogous to the density matrix renormalization group method, which allows
dynamically adjusting dimensions of the tensors and offers an efficient direct
sampling approach for generative tasks. We apply our method to generative
modeling of several standard datasets including the Bars and Stripes, random
binary patterns and the MNIST handwritten digits to illustrate the abilities,
features and drawbacks of our model relative to popular generative models such
as the Hopfield model, Boltzmann machines and generative adversarial networks.
Our work sheds light on promising directions for developing quantum-inspired
algorithms for unsupervised machine learning, which could potentially be
realized on quantum devices.
Comment: 11 pages, 12 figures (not including the TNs). GitHub page: https://congzlwag.github.io/UnsupGenModbyMPS
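The direct sampling the abstract mentions can be sketched in a few lines of numpy: an MPS "Born machine" assigns each binary configuration a probability proportional to its squared amplitude, and precomputed right environments let us draw sites one by one from exact conditionals. The random tensors here are stand-ins for trained ones, the site count and bond dimension are arbitrary, and the paper's DMRG-style training loop is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, D = 4, 2, 3   # sites, physical dimension (binary), bond dimension
# MPS tensors A[i] of shape (left bond, physical, right bond);
# a trained model would supply these instead of random values.
A = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == n - 1 else D))
     for i in range(n)]

# Right environments: E[i] accumulates the outer products of all partial
# amplitudes over sites i..n-1, so E[0] yields the normalization.
E = [None] * (n + 1)
E[n] = np.ones((1, 1))
for i in range(n - 1, -1, -1):
    E[i] = np.einsum('axb,cxd,bd->ac', A[i], A[i], E[i + 1])
Z = float(E[0][0, 0])   # sum of squared amplitudes over all 2**n configs

def exact_prob(config):
    """Born-rule probability of a configuration: squared amplitude over Z."""
    u = np.ones(1)
    for i, x in enumerate(config):
        u = u @ A[i][:, x, :]
    return float(u[0] ** 2) / Z

def sample():
    """Draw one configuration site by site from the exact conditionals."""
    u = np.ones(1)          # amplitude of the prefix chosen so far
    config = []
    for i in range(n):
        w = [u @ A[i][:, x, :] for x in range(d)]        # candidate prefixes
        p = np.array([wx @ E[i + 1] @ wx for wx in w])   # marginal weights
        x = rng.choice(d, p=p / p.sum())
        config.append(int(x))
        u = w[x]
    return config
```

A quick sanity check: `exact_prob` summed over all 2**n configurations gives 1, and repeated calls to `sample()` reproduce those probabilities in the limit, with no rejection or Markov-chain mixing involved.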
Probabilistic modeling and machine learning in structural and systems biology
This supplement contains extended versions of a selected subset of papers presented at the workshop PMSB 2007, Probabilistic Modeling and Machine Learning in Structural and Systems Biology, held in Tuusula, Finland, June 17–18, 2006.