    Upper Bound on the Products of Particle Interactions in Cellular Automata

    Particle-like objects are observed to propagate and interact in many spatially extended dynamical systems. For one of the simplest classes of such systems, one-dimensional cellular automata, we establish a rigorous upper bound on the number of distinct products that these interactions can generate. The upper bound is controlled by the structural complexity of the interacting particles, a quantity defined here that measures the amount of spatio-temporal information a particle stores. Along the way we establish a number of properties of domains and particles that follow from the computational-mechanics analysis of cellular automata, thereby elucidating why that approach is of general utility. The upper bound is tested against several relatively complex domain-particle cellular automata and found to be tight.
    Comment: 17 pages, 12 figures, 3 tables, http://www.santafe.edu/projects/CompMech/papers/ub.html V2: References and accompanying text modified to comply with legal demands arising from ongoing intellectual property litigation among third parties. V3: Accepted for publication in Physica D. References added and other small changes made per referee suggestions.
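    For orientation, the setting of the paper, a one-dimensional cellular automaton in which particle-like structures propagate against a regular background domain, can be reproduced in a few lines of code. The sketch below evolves elementary CA rule 54 (one of the domain-particle automata studied in this literature) from a random initial condition; the rule number, lattice width, and number of steps are illustrative choices, not parameters taken from the paper.

    # Minimal sketch: evolve an elementary (binary, radius-1) cellular automaton.
    # Rule number, lattice width and depth are illustrative, not from the paper.
    import numpy as np

    def eca_step(state, rule):
        """One synchronous update of elementary CA `rule` with periodic boundaries."""
        left = np.roll(state, 1)
        right = np.roll(state, -1)
        neighborhood = (left << 2) | (state << 1) | right   # each cell gets a value 0..7
        rule_table = (rule >> np.arange(8)) & 1             # lookup table from the rule number
        return rule_table[neighborhood]

    rng = np.random.default_rng(0)
    state = rng.integers(0, 2, size=200)
    history = [state]
    for _ in range(100):
        state = eca_step(state, rule=54)
        history.append(state)

    # Particle-like defects appear as localized deviations from the periodic background.
    for row in history[:20]:
        print("".join(".#"[c] for c in row))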

    Multi-Agent Simulation of Emergence of Schwa Deletion Pattern in Hindi

    Recently, there has been a revival of interest in multi-agent simulation techniques for exploring the nature of language change. However, a lack of appropriate validation of simulation experiments against real language data often calls into question the general applicability of these methods in modeling realistic language change. We address this issue by modeling the phenomenon of schwa deletion in Hindi within a multi-agent simulation framework. The pattern of Hindi schwa deletion and its diachronic nature are well studied, not only out of general linguistic inquiry, but also to facilitate Hindi grapheme-to-phoneme conversion, a preprocessing step for text-to-speech synthesis. We show that, under certain conditions, the schwa deletion pattern observed in modern Hindi emerges in the system from an initial state of no deletion. The simulation framework described in this work can be extended to model other phonological changes as well.
    Keywords: Language Change, Linguistic Agent, Language Game, Multi-Agent Simulation, Schwa Deletion
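    As a purely illustrative toy (not the authors' model), the shape of such a simulation can be sketched as a simple language game: each agent holds a probability of deleting a word-medial schwa, hearers nudge that probability toward whatever variant the speaker produced, and a small articulatory-effort bias favours the shorter, deleted form. All agent counts, rates, and biases below are hypothetical.

    # Toy language-game sketch (not the paper's model): agents hold a probability of
    # deleting a word-medial schwa; each hearer moves its probability toward the form
    # the speaker produced, with a small effort bias favouring the deleted variant.
    # All parameters are illustrative.
    import random

    random.seed(1)

    N_AGENTS = 100
    ROUNDS = 20000
    EFFORT_BIAS = 0.02      # extra pull toward the shorter, schwa-deleted form
    LEARNING_RATE = 0.05

    agents = [0.01] * N_AGENTS          # population starts near "no deletion"

    for _ in range(ROUNDS):
        speaker, hearer = random.sample(range(N_AGENTS), 2)
        produced_deleted = random.random() < agents[speaker] + EFFORT_BIAS
        target = 1.0 if produced_deleted else 0.0
        agents[hearer] += LEARNING_RATE * (target - agents[hearer])
        agents[hearer] = min(1.0, max(0.0, agents[hearer]))

    print(f"mean deletion probability after {ROUNDS} rounds:",
          round(sum(agents) / N_AGENTS, 3))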

    Optimal coding and the origins of Zipfian laws

    The problem of compression in standard information theory consists of assigning codes as short as possible to numbers. Here we consider the problem of optimal coding under an arbitrary coding scheme and show that it predicts Zipf's law of abbreviation, namely a tendency in natural languages for more frequent words to be shorter. We apply this result to investigate optimal coding also under so-called non-singular coding, a scheme where unique segmentation is not warranted but each code stands for a distinct number. Optimal non-singular coding predicts that the length of a word should grow approximately as the logarithm of its frequency rank, which is again consistent with Zipf's law of abbreviation. Optimal non-singular coding in combination with the maximum entropy principle also predicts Zipf's rank-frequency distribution. Furthermore, our findings on optimal non-singular coding challenge common beliefs about random typing: it turns out that random typing is in fact an optimal coding process, in stark contrast with the common assumption that it is detached from cost-cutting considerations. Finally, we discuss the implications of optimal coding for the construction of a compact theory of Zipfian laws and other linguistic laws.
    Comment: in press in the Journal of Quantitative Linguistics; definition of concordant pair corrected, proofs polished, references updated.
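    The non-singular prediction is easy to check numerically: if the r-th most frequent word is given the r-th shortest distinct string over an alphabet of size A, its length grows roughly as log_A(r). The sketch below is not the paper's derivation, just a numerical illustration with an arbitrary alphabet size and vocabulary size.

    # Sketch of optimal non-singular coding: assign the most frequent items the shortest
    # distinct strings over an alphabet of size A and check that code length grows roughly
    # like the logarithm of the frequency rank (Zipf's law of abbreviation).
    # Alphabet size and number of ranks are illustrative choices.
    import math
    from itertools import count

    def optimal_nonsingular_lengths(n_items, alphabet_size):
        """Length of the r-th shortest string (r = 1..n_items) over the given alphabet."""
        lengths = []
        for length in count(1):
            available = alphabet_size ** length          # distinct strings of this length
            lengths.extend([length] * min(available, n_items - len(lengths)))
            if len(lengths) >= n_items:
                return lengths

    A = 26
    lengths = optimal_nonsingular_lengths(10_000, A)
    for rank in (1, 10, 100, 1_000, 10_000):
        print(f"rank {rank:>6}: code length {lengths[rank - 1]}, "
              f"log_A(rank) = {math.log(rank, A):.2f}")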

    On the possible Computational Power of the Human Mind

    The aim of this paper is to address the question: can an artificial neural network (ANN) model be used as a possible characterization of the power of the human mind? We discuss what the relationship might be between such a model and its natural counterpart. A possible characterization of the mind's different power capabilities is suggested in terms of the information contained in it (its computational complexity) or achievable by it. This characterization takes advantage of recent results on natural neural networks (NNN) and on the computational power of arbitrary artificial neural networks (ANN). If neural networks are accepted as a model of the human mind's operation, these results become quite relevant.
    Comment: Complexity, Science and Society Conference, 2005, University of Liverpool, UK. 23 pages.

    Density Matching for Bilingual Word Embedding

    Recent approaches to cross-lingual word embedding have generally been based on linear transformations between the sets of embedding vectors in the two languages. In this paper, we propose an approach that instead expresses the two monolingual embedding spaces as probability densities defined by a Gaussian mixture model, and matches the two densities using a method called normalizing flow. The method requires no explicit supervision and can be learned with only a seed dictionary of words that have identical strings. We argue that this formulation has several intuitively attractive properties, particularly with respect to improving robustness and generalization to mappings between difficult language pairs or word pairs. On a benchmark data set of bilingual lexicon induction and cross-lingual word similarity, our approach achieves competitive or superior performance compared to state-of-the-art published results, with particularly strong results on etymologically distant and/or morphologically rich languages.
    Comment: Accepted by NAACL-HLT 2019.
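    A much-simplified sketch of the density-matching idea is given below: the target embedding space is modelled as a Gaussian mixture, and a plain linear map (standing in for the paper's normalizing flow) is fitted so that mapped source embeddings land in regions of high target density. The data, dimensionality, and optimizer are synthetic and illustrative, not the paper's setup.

    # Simplified sketch of density matching (not the paper's implementation, which uses a
    # normalizing flow): model the target space with a Gaussian mixture, then fit a linear
    # map that maximizes the target-GMM log-likelihood of the mapped source embeddings.
    # All data and settings below are synthetic/illustrative.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    dim, n = 2, 500

    # Synthetic stand-ins for two monolingual embedding spaces: the "source" space is a
    # rotated, noisy copy of the "target" space.
    target = np.concatenate([rng.normal(-2, 0.5, (n, dim)), rng.normal(2, 0.5, (n, dim))])
    rotation = np.array([[0.0, -1.0], [1.0, 0.0]])
    source = target @ rotation.T + rng.normal(0, 0.1, target.shape)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(target)

    def neg_loglik(w_flat):
        W = w_flat.reshape(dim, dim)
        return -gmm.score(source @ W.T)     # mean log-likelihood under the target GMM

    result = minimize(neg_loglik, np.eye(dim).ravel(), method="Nelder-Mead")
    W = result.x.reshape(dim, dim)
    print("learned map:\n", np.round(W, 2))
    print("mean target log-likelihood of mapped source:", round(-result.fun, 2))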

    Dysfunctions of highly parallel real-time machines as 'developmental disorders': Security concerns and a Caveat Emptor

    A cognitive paradigm for gene expression in developmental biology that is based on rigorous application of the asymptotic limit theorems of information theory can be adapted to highly parallel real-time computing. The coming Brave New World of massively parallel 'autonomic' and 'Self-X' machines, driven by the explosion of multi-core and molecular computing technologies, will not be spared patterns of canonical and idiosyncratic failure analogous to the developmental disorders affecting organisms that have had the relentless benefit of a billion years of evolutionary pruning. This paper provides a warning both to potential users of these machines and, given that many such disorders can be induced by external agents, to those concerned with larger-scale matters of homeland security.