A Classification of Trapezoidal Words
Trapezoidal words are finite words having at most n+1 distinct factors of
length n, for every n>=0. They encompass finite Sturmian words. We distinguish
trapezoidal words into two disjoint subsets: open and closed trapezoidal words.
A trapezoidal word is closed if its longest repeated prefix has exactly two
occurrences in the word, the second one being a suffix of the word. Otherwise
it is open. We show that open trapezoidal words are all primitive and that
closed trapezoidal words are all Sturmian. We then show that trapezoidal
palindromes are closed (and therefore Sturmian). This allows us to characterize
the special factors of Sturmian palindromes. We end with several open problems.
Comment: In Proceedings WORDS 2011, arXiv:1108.341
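The two definitions above lend themselves to a direct computational check. A minimal sketch (not from the paper; it uses naive factor enumeration, fine for short words):

```python
def is_trapezoidal(w):
    # Trapezoidal: at most n+1 distinct factors of length n, for every n >= 0.
    return all(len({w[i:i + n] for i in range(len(w) - n + 1)}) <= n + 1
               for n in range(len(w) + 1))

def is_closed(w):
    # Closed: the longest repeated prefix has exactly two occurrences,
    # the second one being a suffix of the word.
    if len(w) <= 1:
        return True  # assumption: the empty word and single letters count as closed
    def occ(f):
        # number of (possibly overlapping) occurrences of f in w
        return sum(w[i:i + len(f)] == f for i in range(len(w) - len(f) + 1))
    lrp = ""
    for k in range(1, len(w)):
        if occ(w[:k]) >= 2:
            lrp = w[:k]
    return lrp != "" and occ(lrp) == 2 and w.endswith(lrp)
```

For example, `is_closed("abab")` holds (the longest repeated prefix "ab" occurs twice, the second time as a suffix), while `is_closed("aab")` does not.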
Enumeration and Structure of Trapezoidal Words
Trapezoidal words are words having at most n+1 distinct factors of length n,
for every n >= 0. They therefore encompass finite Sturmian words. We give
combinatorial characterizations of trapezoidal words and exhibit a formula for
their enumeration. We then separate trapezoidal words into two disjoint
classes: open and closed. A trapezoidal word is closed if it has a factor that
occurs only as a prefix and as a suffix; otherwise it is open. We investigate
open and closed trapezoidal words, in relation with their special factors. We
prove that Sturmian palindromes are closed trapezoidal words and that a closed
trapezoidal word is a Sturmian palindrome if and only if its longest repeated
prefix is a palindrome. We also define a new class of words, \emph{semicentral
words}, and show that they are characterized by the property that they can be
written as uxyu, for a central word u and two different letters x and y.
Finally, we investigate the prefixes of the Fibonacci word with respect to the
property of being open or closed trapezoidal words, and show that the sequence
of open and closed prefixes of the Fibonacci word follows the Fibonacci
sequence.
Comment: Accepted for publication in Theoretical Computer Science
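The open/closed pattern of Fibonacci-word prefixes can be observed directly. A hypothetical sketch (not the paper's code) that generates the Fibonacci word by its morphism and classifies each prefix:

```python
def fibonacci_word(n):
    # Iterate the Fibonacci morphism a -> ab, b -> a until length >= n.
    w = "a"
    while len(w) < n:
        w = "".join("ab" if c == "a" else "a" for c in w)
    return w[:n]

def is_closed(w):
    # Closed: the longest repeated prefix occurs exactly twice,
    # the second occurrence being a suffix of the word.
    if len(w) <= 1:
        return True
    def occ(f):
        return sum(w[i:i + len(f)] == f for i in range(len(w) - len(f) + 1))
    lrp = ""
    for k in range(1, len(w)):
        if occ(w[:k]) >= 2:
            lrp = w[:k]
    return lrp != "" and occ(lrp) == 2 and w.endswith(lrp)

w = fibonacci_word(12)
labels = "".join("C" if is_closed(w[:k]) else "O" for k in range(1, 13))
print(labels)  # runs of C's and O's; cf. the Fibonacci-run structure above
```

Here "C" marks a closed prefix and "O" an open one; the lengths of the successive runs exhibit the Fibonacci behavior described in the abstract.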
Rich, Sturmian, and trapezoidal words
In this paper we explore various interconnections between rich words,
Sturmian words, and trapezoidal words. Rich words, first introduced in
arXiv:0801.1656 by the second and third authors together with J. Justin and S.
Widmer, constitute a new class of finite and infinite words characterized by
having the maximal number of palindromic factors. Every finite Sturmian word is
rich, but not conversely. Trapezoidal words were first introduced by the first
author in studying the behavior of the subword complexity of finite Sturmian
words. Unfortunately this property does not characterize finite Sturmian words.
In this note we show that the only trapezoidal palindromes are Sturmian. More
generally we show that Sturmian palindromes can be characterized either in
terms of their subword complexity (the trapezoidal property) or in terms of
their palindromic complexity. We also obtain a similar characterization of rich
palindromes in terms of a relation between palindromic complexity and subword
complexity.
Comment: 7 pages
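The palindromic-complexity bound underlying richness is easy to test in code. A small illustrative sketch (not from the paper), using the fact that a word of length n has at most n+1 distinct palindromic factors, the empty word included:

```python
def is_rich(w):
    # Rich: the number of distinct palindromic factors (counting the
    # empty word) attains the maximal value len(w) + 1.
    pals = {""}
    pals.update(w[i:j]
                for i in range(len(w))
                for j in range(i + 1, len(w) + 1)
                if w[i:j] == w[i:j][::-1])
    return len(pals) == len(w) + 1
```

For instance, the finite Sturmian word "abac" is rich (palindromic factors: the empty word, a, b, c, aba), while "abca" is not, since its last position contributes no new palindromic suffix.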
Classification of bijections between 321- and 132-avoiding permutations
It is well-known, and was first established by Knuth in 1969, that the number
of 321-avoiding permutations is equal to that of 132-avoiding permutations. In
the literature one can find many subsequent bijective proofs of this fact. It
turns out that some of the published bijections can easily be obtained from
others. In this paper we describe all bijections we were able to find in the
literature and show how they are related to each other via "trivial"
bijections. We classify the bijections according to statistics preserved (from
a fixed, but large, set of statistics), obtaining substantial extensions of
known results. Thus, we give a comprehensive survey and a systematic analysis
of these bijections. We also give a recursive description of the algorithmic
bijection given by Richards in 1988 (combined with a bijection by Knuth from
1969). This bijection is equivalent to the celebrated bijection of Simion and
Schmidt (1985), as well as to the bijection given by Krattenthaler in 2001, and
it respects 11 statistics, the largest number of statistics respected by any
of the bijections.
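The underlying equinumerosity is easy to verify by brute force for small n. An illustrative sketch (not any of the surveyed bijections), counting avoiders of a pattern given as the relative order of a subsequence:

```python
from itertools import combinations, permutations

def avoids(p, pattern):
    # p avoids pattern if no subsequence of p is order-isomorphic to pattern.
    for idx in combinations(range(len(p)), len(pattern)):
        sub = [p[i] for i in idx]
        rank = sorted(sub)
        # relative-order pattern of the subsequence (1 = smallest)
        if tuple(rank.index(x) + 1 for x in sub) == pattern:
            return False
    return True

def count_avoiders(n, pattern):
    # Count permutations of {1, ..., n} avoiding the given pattern.
    return sum(avoids(p, pattern) for p in permutations(range(1, n + 1)))
```

Both `count_avoiders(n, (3, 2, 1))` and `count_avoiders(n, (1, 3, 2))` give the n-th Catalan number, consistent with Knuth's 1969 result.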
Robust learning with implicit residual networks
In this effort, we propose a new deep architecture utilizing residual blocks
inspired by implicit discretization schemes. As opposed to the standard
feed-forward networks, the outputs of the proposed implicit residual blocks are
defined as the fixed points of the appropriately chosen nonlinear
transformations. We show that this choice leads to the improved stability of
both forward and backward propagation, has a favorable impact on
generalization power, and allows one to control the robustness of the network with
only a few hyperparameters. In addition, the proposed reformulation of ResNet
does not introduce new parameters and can potentially lead to a reduction in
the number of required layers due to improved forward stability. Finally, we
derive a memory-efficient training algorithm, propose a stochastic
regularization technique, and provide numerical results in support of our
findings.
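The fixed-point definition of the block output can be illustrated with a toy version (a hypothetical sketch, not the authors' architecture: a scalar weight and tanh nonlinearity chosen so the map is a contraction and plain fixed-point iteration converges):

```python
import math

def implicit_residual_block(x, weight=0.3, tol=1e-10, max_iter=200):
    # The block output y is defined implicitly as the fixed point of
    #   y = x + f(y),   with f(y) = weight * tanh(y) elementwise.
    # |weight| < 1 makes f a contraction, so the iteration converges.
    y = list(x)  # initial guess: the block input
    for _ in range(max_iter):
        y_new = [xi + weight * math.tanh(yi) for xi, yi in zip(x, y)]
        if max(abs(a - b) for a, b in zip(y_new, y)) < tol:
            return y_new
        y = y_new
    return y

x = [0.5, -1.0]
y = implicit_residual_block(x)
```

In practice one would solve the fixed-point equation with a more robust method and differentiate through it implicitly; this sketch only shows how the output is characterized by the equation rather than by an explicit forward pass.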