Toward an Energy Efficient Language and Compiler for (Partially) Reversible Algorithms
We introduce a new programming language for expressing reversibility,
Energy-Efficient Language (Eel), geared toward algorithm design and
implementation. Eel is the first language to take advantage of a partially
reversible computation model, where programs can be composed of both reversible
and irreversible operations. In this model, irreversible operations cost energy
for every bit of information created or destroyed. To handle programs of
varying degrees of reversibility, Eel supports a log stack to automatically
trade energy costs for space costs, and introduces many powerful control logic
operators including protected conditional, general conditional, protected
loops, and general loops. In this paper, we present the design and compiler for
the three language levels of Eel along with an interpreter to simulate and
annotate the incurred energy costs of a program.
Comment: 17 pages, 0 additional figures, pre-print to be published in The 8th Conference on Reversible Computing (RC2016)
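The partially reversible cost model described above can be illustrated with a small sketch. This is a hypothetical toy model, not Eel itself: it assumes a program trace of labeled steps, charges nothing for reversible steps, and charges the Landauer bound (kT ln 2 per bit) for each bit an irreversible step erases. The `run` and `erase_cost_joules` helpers are invented for illustration.

```python
import math

# Hypothetical toy model (not Eel itself): reversible steps are free in
# principle; each irreversibly erased bit costs at least k*T*ln(2) energy
# by Landauer's principle.

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T_ROOM = 300.0              # temperature, kelvin

def erase_cost_joules(bits_erased: int) -> float:
    """Minimum energy to irreversibly erase `bits_erased` bits at T_ROOM."""
    return bits_erased * K_BOLTZMANN * T_ROOM * math.log(2)

def run(program) -> float:
    """Sum energy over a toy trace of ('rev' | 'irrev', bits) steps:
    reversible steps cost nothing, irreversible steps pay the bound."""
    total = 0.0
    for kind, bits in program:
        if kind == "irrev":
            total += erase_cost_joules(bits)
    return total

# A trace mixing reversible and irreversible operations, as in the
# partially reversible model: only the 8 + 1 erased bits cost energy.
trace = [("rev", 0), ("irrev", 8), ("rev", 0), ("irrev", 1)]
print(run(trace))
```

In the paper's model a log stack can convert such energy costs into space costs by recording the erased information instead of destroying it; the sketch only accounts for the energy side of that trade-off.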
The "handedness" of language: Directional symmetry breaking of sign usage in words
Language, which allows complex ideas to be communicated through symbolic
sequences, is a characteristic feature of our species and manifested in a
multitude of forms. Using large written corpora for many different languages
and scripts, we show that the occurrence probability distributions of signs at
the left and right ends of words have a distinct heterogeneous nature.
Characterizing this asymmetry using quantitative inequality measures, viz.
information entropy and the Gini index, we show that the beginning of a word is
less restrictive in sign usage than the end. This property is not simply
attributable to the use of common affixes as it is seen even when only word
roots are considered. We use the existence of this asymmetry to infer the
direction of writing in undeciphered inscriptions that agrees with the
archaeological evidence. Unlike traditional investigations of phonotactic
constraints which focus on language-specific patterns, our study reveals a
property valid across languages and writing systems. As both language and
writing are unique aspects of our species, this universal signature may reflect
an innate feature of the human cognitive phenomenon.
Comment: 10 pages, 4 figures + Supplementary Information (15 pages, 8 figures), final corrected version
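The asymmetry measures named in the abstract can be sketched directly. This is an assumed illustration, not the paper's code: it computes Shannon entropy and the Gini index over the distributions of initial and final letters of a tiny word sample (the word list is invented; the paper uses large corpora across many scripts).

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a count distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

def gini(counts):
    """Gini index of a count distribution (0 = equal, ->1 = unequal)."""
    xs = sorted(counts)
    n, s = len(xs), sum(xs)
    return sum((2 * i - n + 1) * x for i, x in enumerate(xs)) / (n * s)

# Toy sample chosen so beginnings vary while endings repeat (English
# plural -s); a real analysis would use a large corpus.
words = ["analysis", "books", "corpus", "data", "entropy", "forms", "glyphs"]
first = Counter(w[0] for w in words)
last = Counter(w[-1] for w in words)

# The paper's claim, in these terms: word beginnings are less
# restrictive, i.e. higher entropy and lower inequality than endings.
print(entropy(first.values()), entropy(last.values()))
print(gini(first.values()), gini(last.values()))
```

Run on a large corpus, the direction of this inequality (which word end is less restrictive) is what the paper uses to infer writing direction in undeciphered inscriptions.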
On Hilberg's Law and Its Links with Guiraud's Law
Hilberg (1990) supposed that finite-order excess entropy of a random human
text is proportional to the square root of the text length. Assuming that
Hilberg's hypothesis is true, we derive Guiraud's law, which states that the
number of word types in a text is greater than proportional to the square root
of the text length. Our derivation is based on some mathematical conjecture in
coding theory and on several experiments suggesting that words can be defined
approximately as the nonterminals of the shortest context-free grammar for the
text. Such an operational definition of words can be applied even to texts
deprived of spaces, which do not allow for Mandelbrot's ``intermittent
silence'' explanation of Zipf's and Guiraud's laws. In contrast to
Mandelbrot's, our model assumes some probabilistic long-memory effects in human
narration and might be capable of explaining Menzerath's law.
Comment: To appear in Journal of Quantitative Linguistics
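Guiraud's law as stated above, that the number of word types grows at least as fast as the square root of the text length, can be checked numerically. The following is an illustrative sketch under assumed data, not the paper's method: it builds a toy token stream with Zipf-like repetition and tracks the vocabulary size V(n) and the quotient V(n)/sqrt(n) at a few prefix lengths.

```python
import math
import random

def type_counts(tokens, checkpoints):
    """Vocabulary size V(n) at each prefix length n in `checkpoints`."""
    seen, out, cps = set(), [], set(checkpoints)
    for n, tok in enumerate(tokens, 1):
        seen.add(tok)
        if n in cps:
            out.append((n, len(seen)))
    return out

# Toy "text": token w_i appears roughly 200/i times (Zipf-like), then
# shuffled with a fixed seed so types are spread through the stream.
tokens = [f"w{i}" for i in range((1), 200) for _ in range(200 // i)]
random.Random(0).shuffle(tokens)

for n, v in type_counts(tokens, [100, 400, len(tokens)]):
    # Guiraud's quotient V(n)/sqrt(n): the law says it should not
    # shrink toward zero as the text grows.
    print(n, v, v / math.sqrt(n))
```

The paper's derivation connects this growth rate to Hilberg's square-root scaling of excess entropy via the nonterminals of the shortest context-free grammar for the text; the sketch only shows the empirical quantity being explained.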