804 research outputs found

    Generating All Permutations by Context-Free Grammars in Greibach Normal Form

    We consider context-free grammars G_n in Greibach normal form and, particularly, in Greibach m-form (m = 1, 2) that generate the finite language L_n of all n! strings that are permutations of n different symbols (n ≥ 1). These grammars are investigated with respect to their descriptional complexity, i.e., we determine the number of nonterminal symbols and the number of production rules of G_n as functions of n. As in the case of Chomsky normal form, these descriptional complexity measures grow faster than any polynomial function.
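For concreteness, one natural construction in Greibach 1-form uses a nonterminal per non-empty subset of the n symbols. This is only an illustrative sketch (the names `gnf_sizes`, `A_S` are ours, and this need not be one of the families G_n the paper analyzes), but it already exhibits the super-polynomial growth the abstract mentions:

```python
from math import comb

def gnf_sizes(n):
    """Sizes of a sketch grammar for L_n in Greibach 1-form:
    one nonterminal A_S per non-empty subset S of the n symbols, with
        A_S    -> a A_{S - {a}}   for each a in S, when |S| >= 2
        A_{a}  -> a               when S = {a}
    Every right-hand side is a terminal followed by at most one
    nonterminal, i.e. Greibach 1-form."""
    nonterminals = 2**n - 1
    # |S| productions per subset S: sum_{k=1..n} k * C(n, k) = n * 2^(n-1)
    productions = sum(k * comb(n, k) for k in range(1, n + 1))
    assert productions == n * 2**(n - 1)
    return nonterminals, productions
```

Already for n = 10 this gives 1023 nonterminals and 5120 productions, well beyond any fixed-degree polynomial's early growth.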

    Generating all permutations by context-free grammars in Chomsky normal form

    Let L_n be the finite language of all n! strings that are permutations of n different symbols (n ≥ 1). We consider context-free grammars G_n in Chomsky normal form that generate L_n. In particular we study a few families {G_n}_{n≥1}, satisfying L(G_n) = L_n for n ≥ 1, with respect to their descriptional complexity, i.e. we determine the number of nonterminal symbols and the number of production rules of G_n as functions of n.
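In the Chomsky-normal-form setting the same subset idea needs an extra helper nonterminal per terminal symbol, since every rule must have the shape A → BC or A → a. A minimal sketch (our own naming; not necessarily one of the families {G_n} the paper studies) that also checks the generated language really is L_n for small n:

```python
from itertools import permutations

def perm_language(symbols):
    """Strings derived from N[S] in the subset construction in CNF:
        N[{a}] -> a
        N[S]   -> T[a] N[S - {a}]   for each a in S, when |S| >= 2
        T[a]   -> a
    Derivations of N[S] yield exactly the permutations of S."""
    if len(symbols) == 1:
        (a,) = symbols
        return {a}
    return {a + w
            for a in symbols
            for w in perm_language(symbols - {a})}

def grammar_size(n):
    """Counts for this sketch: (2^n - 1) subset nonterminals plus n
    helpers T[a]; n rules T[a] -> a, n rules N[{a}] -> a, and |S|
    binary rules per subset S with |S| >= 2, i.e. n*2^(n-1) - n."""
    nonterminals = (2**n - 1) + n
    productions = 2 * n + (n * 2**(n - 1) - n)
    return nonterminals, productions

# L_3 is all 3! = 6 permutations of {a, b, c}
assert perm_language(frozenset("abc")) == {"".join(p) for p in permutations("abc")}
```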

    Permuting operations on strings and their relation to prime numbers

    Some length-preserving operations on strings only permute the symbol positions in strings; such an operation X gives rise to a family {X_n}_{n≥2} of similar permutations. We investigate the structure and the order of the cyclic group generated by X_n. We call an integer n X-prime if X_n consists of a single cycle of length n (n ≥ 2). Then we show some properties of these X-primes, particularly, how X-primes are related to X′-primes as well as to ordinary prime numbers. Here X and X′ range over well-known examples (reversal, cyclic shift, shuffle, twist) and some new ones based on the Archimedes spiral and on the Josephus problem.

    Linguistics

    Contains reports on seven research projects. National Science Foundation (Grant G-16526); National Institutes of Health (Grant MH-04737-03); U.S. Air Force (Electronics Systems Division) under Contract AF19(628)-248.

    Permuting operations on strings: Their permutations and their primes

    We study some length-preserving operations on strings that permute the symbol positions in strings. These operations include some well-known examples (reversal, circular or cyclic shift, shuffle, twist, operations induced by the Josephus problem) and some new ones based on the Archimedes spiral. Such a permuting operation X gives rise to a family {p(X,n)}_{n≥2} of similar permutations. We investigate the structure and the order of the cyclic group generated by such a permutation p(X,n). We call an integer n X-prime if p(X,n) consists of a single cycle of length n (n ≥ 2). Then we show some properties of these X-primes, particularly, how X-primes are related to X′-primes as well as to ordinary prime numbers.
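The cycle-structure computations behind these abstracts are easy to sketch. A minimal Python illustration, with our own indexing conventions for two of the listed operations (the paper's exact definitions of p(X, n) may differ):

```python
from math import lcm

def cycle_lengths(perm):
    """Cycle lengths of a permutation given as a list: i goes to perm[i]."""
    seen, lengths = set(), []
    for start in range(len(perm)):
        if start in seen:
            continue
        length, j = 0, start
        while j not in seen:
            seen.add(j)
            j = perm[j]
            length += 1
        lengths.append(length)
    return lengths

def group_order(perm):
    """Order of the cyclic group generated by perm: lcm of its cycle lengths."""
    return lcm(*cycle_lengths(perm))

def is_single_cycle(perm):
    """The X-primality test from the abstract: one cycle covering all n positions."""
    return cycle_lengths(perm) == [len(perm)]

def cyclic_shift(n):
    """Our convention for cyclic shift: position i moves to (i + 1) mod n."""
    return [(i + 1) % n for i in range(n)]

def reversal(n):
    """Our convention for reversal: position i moves to n - 1 - i."""
    return [n - 1 - i for i in range(n)]

# Cyclic shift is a single n-cycle for every n, so every n >= 2 is
# shift-prime; reversal is a product of swaps, so its order is 2 for n >= 2.
assert is_single_cycle(cyclic_shift(7)) and group_order(cyclic_shift(7)) == 7
assert not is_single_cycle(reversal(7)) and group_order(reversal(7)) == 2
```

The interesting cases the papers study (shuffle, twist, the spiral and Josephus operations) plug into the same machinery once their position maps are written down.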

    Is linguistics a part of psychology?

    Noam Chomsky, the founding father of generative grammar and the instigator of some of its core research programs, claims that linguistics is a part of psychology, concerned with a class of cognitive structures employed in speaking and understanding. In a recent book, Ignorance of Language, Michael Devitt has challenged certain core aspects of linguistics, as prominent practitioners of the science conceive of it. Among Devitt’s major conclusions is that linguistics is not a part of psychology. In this thesis I defend Chomsky’s psychological conception of grammatical theory. My case for the psychological conception involves defending a set of psychological goals for generative grammars, centring on conditions of descriptive and explanatory adequacy. I argue that generative grammar makes an explanatory commitment to a distinction between a psychological system of grammatical competence and the performance systems engaged in putting that competence to use. I then defend the view that this distinction can be investigated by probing speakers’ linguistic intuitions. Building on the psychological goals of generative grammar and its explanatory commitment to a psychological theory of grammatical competence, I argue that generative grammar neither targets nor presupposes non-psychological grammatical properties. The latter non-psychological properties are dispensable to grammarians’ explanations because their explanatory goals can be met by the theory of grammatical competence to which they are committed. So generative grammars have psychological properties as their subject matter, and linguistics is a part of psychology.

    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    [EN] Programming languages seldom find proper anchorage in the philosophy of logic, language and science. What is more, philosophy of language seems to be restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in programming-language topics. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language to work on this subject, combining natural language processing and linguistics, logic programming and construction methodology on both algorithms and procedures, within an overall philosophizing declarative status. Not only this, but the dimension of the Fifth Generation Computer Systems project related to strong AI, wherein Prolog took a major role, and its historical frame in the crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serves, in exemplary form, to treat philosophy of logic, language and science in the contemporary age as well. In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes. We shall exemplify some here: the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offering the backbone in computer architecture; and the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon, and others on computability, studied thoroughly and in parallel, which permits us to interpret the evolving realm of programming languages. The proper line from the lambda calculus to the Algol family, the declarative and procedural split with the C language and Prolog, and the ensuing branching, explosion and further delimitation of programming languages are thereupon inspected so as to relate them to the proper syntax, semantics and philosophical élan of logic programming and Prolog.

    Generative Linguistics Meets Normative Inferentialism: Part 2

    This is the second installment of a two-part essay. Limitations of space prevented the publication of the full essay in a previous issue of the Journal (Pereplyotchik 2020). My overall goal is to outline a strategy for integrating generative linguistics with a broadly pragmatist approach to meaning and communication. Two immensely useful guides in this venture are Robert Brandom and Paul Pietroski. Squarely in the Chomskyan tradition, Pietroski’s recent book, Conjoining Meanings, offers an approach to natural-language semantics that rejects foundational assumptions widely held amongst philosophers and linguists. In particular, he argues against extensionalism—the view that meanings are (or determine) truth and satisfaction conditions. Having arrived at the same conclusion by way of Brandom’s deflationist account of truth and reference, I’ll argue that both theorists have important contributions to make to a broader anti-extensionalist approach to language. Part 1 of the essay was largely exegetical, laying out what I see as the core aspects of Brandom’s normative inferentialism (1) and Pietroski’s naturalistic semantics (2). Now, in Part 2, I argue that there are many convergences between these two theoretical frameworks and, contrary to first appearances, very few points of substantive disagreement between them. If the integration strategy that I propose is correct, then what appear to be sharply contrasting commitments are better seen as interrelated verbal differences that come down to different—but complementary—explanatory goals. The residual disputes are, however, stubborn. I end by discussing how to square Pietroski’s commitment to predicativism with Brandom’s argument that a predicativist language is in principle incapable of expressing ordinary conditionals.