Absolute Convergence of Rational Series is Semi-decidable
We study \emph{real-valued absolutely convergent rational series}, i.e. functions $r: \Sigma^* \rightarrow \mathbb{R}$, defined over a free monoid $\Sigma^*$, that can be computed by a multiplicity automaton $A$ and such that $\sum_{w \in \Sigma^*} |r(w)| < \infty$. We prove that any absolutely convergent rational series $r$ can be computed by a multiplicity automaton $A$ which has the property that $r_{|A|}$ is simply convergent, where $r_{|A|}$ is the series computed by the automaton $|A|$ derived from $A$ by taking the absolute values of all its parameters. Then, we prove that the set composed of all absolutely convergent rational series is semi-decidable and we show that the sum $\sum_{w \in \Sigma^*} |r(w)|$ can be estimated to any accuracy rate for any absolutely convergent rational series $r$. We also introduce a spectral radius-like parameter $\rho_{|r|}$ which satisfies the following property: $r$ is absolutely convergent iff $\rho_{|r|} < 1$.
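The objects in this abstract can be illustrated with a small sketch. All parameters below are made up; the only things taken from the abstract are the standard multiplicity-automaton semantics $r(w) = \alpha^T M_{w_1} \cdots M_{w_n} \beta$ and the idea of taking absolute values of all parameters and testing a spectral-radius-like quantity against 1:

```python
import numpy as np

# Hypothetical 2-state multiplicity automaton over the alphabet {a, b}.
# All numbers are illustrative, not from the paper.
alpha = np.array([1.0, 0.0])                    # initial weights
beta = np.array([0.5, 0.5])                     # final weights
M = {
    "a": np.array([[0.3, 0.1], [0.0, 0.2]]),    # transition weights for 'a'
    "b": np.array([[0.1, 0.2], [0.1, 0.1]]),    # transition weights for 'b'
}

def r(word):
    """Value of the rational series on a word over {a, b}."""
    v = alpha
    for symbol in word:
        v = v @ M[symbol]
    return float(v @ beta)

# Automaton |A|: take absolute values of every parameter. If the spectral
# radius of sum_sigma |M_sigma| is < 1, then sum_w |r(w)| converges.
M_abs_sum = sum(np.abs(m) for m in M.values())
rho = max(abs(np.linalg.eigvals(M_abs_sum)))
print(rho < 1)
```

Here the parameters are already nonnegative, so $|A| = A$ and simple convergence of $r_{|A|}$ coincides with absolute convergence of $r$.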
Sequential Density Estimation via Nonlinear Continuous Weighted Finite Automata
Weighted finite automata (WFAs) have been widely applied in many fields. One of the classic problems for WFAs is probability distribution estimation over sequences of discrete symbols. Although WFAs have been extended to handle continuous input data, yielding continuous WFAs (CWFAs), it remains unclear how to approximate density functions over sequences of continuous random variables using WFA-based models, due to limits on the expressiveness of the model as well as the tractability of approximating density functions via CWFAs. In this paper, we first propose a nonlinear extension of the CWFA model, which we refer to as the nonlinear continuous WFA (NCWFA), to improve its expressiveness. We then leverage the RNADE method, a well-known neural-network-based density estimator, and propose the RNADE-NCWFA model, which computes a density function by design. We show that this model is strictly more expressive than the Gaussian HMM model, which CWFAs cannot approximate. Empirically, we conduct a synthetic experiment on data generated by a Gaussian HMM, focusing on the model's ability to estimate densities for sequences longer than those seen in training. We observe that our model performs best among the compared baseline methods.
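The "classic problem" the abstract starts from, distribution estimation over sequences of discrete symbols, can be sketched with a toy probabilistic WFA. Every parameter below is illustrative; the stopping weights are chosen so that the computed values form a probability distribution over all finite strings:

```python
import numpy as np

# Toy probabilistic WFA over the alphabet {0, 1} (made-up parameters).
# It assigns P(w) = alpha^T . A[w_1] ... A[w_n] . beta; choosing
# beta = (I - A[0] - A[1]) @ 1 makes the values sum to 1 over all strings.
alpha = np.array([1.0, 0.0])
A = {0: np.array([[0.2, 0.2], [0.1, 0.3]]),
     1: np.array([[0.1, 0.1], [0.2, 0.1]])}
S = A[0] + A[1]
beta = (np.eye(2) - S) @ np.ones(2)   # stopping weights: [0.4, 0.3]

def prob(seq):
    """Probability of a sequence of symbols from {0, 1}."""
    v = alpha
    for s in seq:
        v = v @ A[s]
    return float(v @ beta)

# Total mass over all strings of length <= 50, computed with matrix powers;
# it approaches 1 because the spectral radius of S is below 1.
mass = sum(alpha @ np.linalg.matrix_power(S, k) @ beta for k in range(51))
print(round(mass, 6))
```

The CWFA/NCWFA models in the paper replace these finite transition matrices with maps over continuous inputs; the linear-algebraic evaluation above is only the discrete starting point.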
Some improvements of the spectral learning approach for probabilistic grammatical inference
Spectral methods offer new and elegant solutions in probabilistic grammatical inference. We propose two ways to improve them. First, we show how a linear representation, or equivalently a weighted automaton, output by the spectral learning algorithm can be taken as an initial point for the Baum-Welch algorithm, in order to increase the likelihood of the observation data. Secondly, we show how the inference problem can naturally be expressed in the framework of Structured Low-Rank Approximation. Both ideas are tested on a benchmark extracted from the PAutomaC challenge.
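The spectral learning step that produces the initial weighted automaton can be sketched in a few lines. This is the standard Hankel-matrix SVD construction run on exact values of a made-up 2-state rational series; the basis of prefixes and suffixes and all parameters are illustrative, not from the paper:

```python
import numpy as np

# Made-up target: a 2-state weighted automaton over {a, b} whose values
# f(w) = alpha^T . M[w_1] ... M[w_n] . beta we pretend were estimated.
alpha = np.array([1.0, 0.0])
beta = np.array([0.4, 0.3])
M = {"a": np.array([[0.2, 0.2], [0.1, 0.3]]),
     "b": np.array([[0.1, 0.1], [0.2, 0.1]])}

def f(w):
    v = alpha
    for s in w:
        v = v @ M[s]
    return float(v @ beta)

# Hankel matrix H[p, s] = f(ps) and its symbol-shifted versions.
prefixes = ["", "a", "b"]
suffixes = ["", "a", "b"]
H = np.array([[f(p + s) for s in suffixes] for p in prefixes])
H_sig = {c: np.array([[f(p + c + s) for s in suffixes] for p in prefixes])
         for c in "ab"}

# Rank-2 truncated SVD, then recover automaton parameters.
n = 2
_, _, Vt = np.linalg.svd(H)
V = Vt[:n].T
P = H @ V                                     # forward factor
A = {c: np.linalg.pinv(P) @ H_sig[c] @ V for c in "ab"}
a0 = H[0] @ V                                 # row of the empty prefix
b0 = np.linalg.pinv(P) @ H[:, 0]              # column of the empty suffix

def f_hat(w):
    v = a0
    for s in w:
        v = v @ A[s]
    return float(v @ b0)
```

With exact values and a complete basis, `f_hat` reproduces `f`; with estimated Hankel entries, the recovered parameters would be the kind of initial point the abstract proposes handing to Baum-Welch.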
Residual Nominal Automata
Nominal automata are models for accepting languages over infinite alphabets.
In this paper we refine the hierarchy of nondeterministic nominal automata, by
developing the theory of residual nominal automata. In particular, we show that
they admit canonical minimal representatives, and that the universality problem
becomes decidable. We also study exact learning of these automata, and settle
questions that were left open about their learnability via observations.
Residual Nominal Automata
We are motivated by the following question: which nominal languages admit an active learning algorithm? This question was left open in previous work, and is particularly challenging for languages recognised by nondeterministic automata. To answer it, we develop the theory of residual nominal automata, a subclass of nondeterministic nominal automata. We prove that this class has canonical representatives, which can always be constructed via a finite number of observations. This property enables active learning algorithms, and makes up for the fact that residuality - a semantic property - is undecidable for nominal automata. Our construction for canonical residual automata is based on a machine-independent characterisation of residual languages, for which we develop new results in nominal lattice theory. Studying residuality in the context of nominal languages is a step towards a better understanding of learnability of automata with some sort of nondeterminism.