60 research outputs found
A Geometric Approach to the stabilisation of certain sequences of Kronecker coefficients
We give another proof, using tools from Geometric Invariant Theory, of a
result due to S. Sam and A. Snowden in 2014, concerning the stability of
Kronecker coefficients. This result states that some sequences of Kronecker coefficients eventually stabilise, and our method gives a geometric bound
from which the stabilisation occurs. We perform the explicit computation of
such a bound on two examples, one being the classical case of Murnaghan's
stability. Moreover, we see that our techniques apply to other coefficients
arising in Representation Theory: namely to some plethysm coefficients and in
the case of the tensor product of representations of the hyperoctahedral group.
Comment: Manuscripta Mathematica, Springer Verlag, in press, ⟨https://doi.org/10.1007/s00229-018-1021-4⟩
Subword balance, position indices and power sums
In this paper, we investigate various ways of characterizing words, mainly over a binary alphabet, using information about the positions of occurrences of letters in words. We introduce two new measures associated with words: the position index and the sum of position indices. We establish some characterizations, connections with Parikh matrices, and connections with power sums. One particular emphasis concerns the effect of morphisms and iterated morphisms on words.
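The Parikh matrices mentioned above admit a concrete computation. The following is a minimal sketch for a binary alphabet {a, b} with a < b, using the standard definition of the Parikh matrix as a product of elementary upper-triangular matrices; the function name is illustrative, not from the paper:

```python
import numpy as np

def parikh_matrix(word):
    """Parikh matrix of a word over the ordered binary alphabet a < b.

    In the resulting 3x3 upper-triangular matrix M:
      M[0, 1] = number of occurrences of 'a',
      M[1, 2] = number of occurrences of 'b',
      M[0, 2] = number of occurrences of 'ab' as a scattered subword.
    """
    # Elementary matrices for each letter: identity plus a single 1
    # on the superdiagonal, in the position indexed by the letter.
    Ma = np.eye(3, dtype=int)
    Ma[0, 1] = 1
    Mb = np.eye(3, dtype=int)
    Mb[1, 2] = 1

    M = np.eye(3, dtype=int)
    for c in word:
        M = M @ (Ma if c == 'a' else Mb)
    return M

# For "abab": two a's, two b's, and three scattered occurrences of "ab".
print(parikh_matrix("abab"))
```

The scattered-subword count in the top-right corner is exactly the kind of positional information the measures in this paper refine.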
Equivalence Problems for Tree Transducers: A Brief Survey
The decidability of equivalence for three important classes of tree
transducers is discussed. Each class can be obtained as a natural restriction
of deterministic macro tree transducers (MTTs): (1) no context parameters,
i.e., top-down tree transducers, (2) linear size increase, i.e., MSO definable
tree transducers, and (3) monadic input and output ranked alphabets. For the
full class of MTTs, decidability of equivalence remains a long-standing open
problem.
Comment: In Proceedings AFL 2014, arXiv:1405.527
Ten Conferences WORDS: Open Problems and Conjectures
In connection with the development of the field of Combinatorics on Words, we present a list of open problems and conjectures that were stated during the last ten meetings WORDS. We intend to keep the present document up to date by adding information on advances in solving these problems.
Stability of machine learning algorithms
In the literature, predictive accuracy is often the primary criterion for evaluating a learning algorithm. In this thesis, I will introduce novel concepts of stability into the machine learning community. A learning algorithm is said to be stable if it produces consistent predictions under small perturbations of the training samples. Stability is an important aspect of a learning procedure because unstable predictions can reduce users' trust in the system and also harm the reproducibility of scientific conclusions. As a prototypical example, the stability of the classification procedure will be discussed extensively. In particular, I will present two new concepts of classification stability.

The first is the decision boundary instability (DBI), which measures the variability of linear decision boundaries generated from homogeneous training samples. Combining DBI with the generalization error (GE), we propose a two-stage algorithm for selecting the most accurate and stable classifier. The proposed classifier selection method brings statistical-inference thinking into the machine learning community. Our selection method is shown to be consistent in the sense that the selected classifier simultaneously achieves the minimal GE and the minimal DBI. Various simulations and real examples further demonstrate the superiority of our method over several alternative approaches.

The second is the classification instability (CIS). CIS is a general measure of stability that generalizes DBI to nonlinear classifiers. This allows us to establish a sharp convergence rate of CIS for general plug-in classifiers under a low-noise condition. As one of the simplest plug-in classifiers, the nearest neighbor classifier is studied extensively. Motivated by an asymptotic expansion formula for the CIS of the weighted nearest neighbor classifier, we propose a new classifier called the stabilized nearest neighbor (SNN) classifier.
Our theoretical developments further push the frontier of statistical theory in machine learning. In particular, we prove that SNN attains the minimax optimal convergence rate in risk, as well as the established sharp convergence rate in CIS. Extensive simulations and real-data experiments demonstrate that SNN achieves a considerable improvement in stability over existing classifiers with no sacrifice of predictive accuracy.
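The notion of classification instability described above can be illustrated with a simple empirical estimate: the fraction of query points on which two classifiers, trained on independent halves of the data, disagree. This is a minimal sketch in that spirit, not the thesis's actual procedure; the k-NN classifier, function names, and the halving scheme are all illustrative assumptions:

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=5):
    # Brute-force k-nearest-neighbour classifier with majority vote
    # over binary labels {0, 1}.
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    votes = ytr[idx]
    return (votes.mean(axis=1) > 0.5).astype(int)

def estimate_instability(X, y, k=5, n_pairs=10, seed=None):
    # Empirical instability: average disagreement between classifiers
    # trained on two disjoint random halves of the sample, evaluated
    # on the full sample as query points.
    rng = np.random.default_rng(seed)
    n = len(X)
    disagreements = []
    for _ in range(n_pairs):
        perm = rng.permutation(n)
        a, b = perm[: n // 2], perm[n // 2:]
        pa = knn_predict(X[a], y[a], X, k)
        pb = knn_predict(X[b], y[b], X, k)
        disagreements.append((pa != pb).mean())
    return float(np.mean(disagreements))
```

On well-separated classes this estimate is close to 0; noisy, overlapping classes or very small k push it upward, which matches the intuition that a stable procedure should be insensitive to which half of the data it happened to see.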
Conferences WORDS, years 1997-2017: Open Problems and Conjectures
In connection with the development of the field of Combinatorics on Words, we present a list of open problems and conjectures that were stated in the context of the eleven international meetings WORDS, which were held from 1997 to 2017.