2,757 research outputs found

    Exploring the N-th Dimension of Language

    Get PDF
    This paper is aimed at exploring the hidden fundamental computational property of natural language that has been so elusive that all attempts to characterize its real computational property have ultimately failed. Natural language was earlier thought to be context-free. However, it was gradually realized that this does not hold, given that a range of natural language phenomena (cross-serial dependencies, for example) have been shown to be of non-context-free character, all but scuttling plans to brand natural language context-free. It has therefore been suggested that natural language is mildly context-sensitive and, to some extent, context-free. In all, the issue of its exact computational property seems not yet to have been settled. Against this background, it will be proposed that this exact computational property of natural language is perhaps the N-th dimension of language, if what we mean by dimension is nothing but a universal (computational) property of natural language.
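    To make the non-context-free point concrete: the classic evidence involves cross-serial dependencies (e.g., in Swiss German), whose abstract pattern is the copy language {ww | w in {a,b}*}. No context-free grammar generates this language, yet a mildly context-sensitive formalism -- or a few lines of procedural code -- recognizes it easily. The Python sketch below illustrates the formal point; it is my illustration, not something from the paper itself.

        # Illustration only (not from the paper): the copy language {ww} is the
        # formal pattern behind cross-serial dependencies and is provably not
        # context-free, yet it is trivial to recognize procedurally.

        def is_copy_string(s: str) -> bool:
            """Return True iff s is some string w repeated twice (s == w + w)."""
            if len(s) % 2 != 0:
                return False
            mid = len(s) // 2
            return s[:mid] == s[mid:]

        if __name__ == "__main__":
            for example in ["abab", "aabb", "abaaba"]:
                print(example, is_copy_string(example))  # True, False, True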

    On Fodor on Darwin on Evolution

    Get PDF
    Jerry Fodor argues that Darwin was wrong about "natural selection" because (1) it is only a tautology rather than a scientific law that can support counterfactuals ("If X had happened, Y would have happened") and because (2) only minds can select. Hence Darwin's analogy with "artificial selection" by animal breeders was misleading, and evolutionary explanation is nothing but post-hoc historical narrative. I argue that Darwin was right on all counts. Until Darwin's "tautology," it had been believed that either (a) God had created all organisms as they are, or (b) organisms had always been as they are. Darwin revealed instead that (c) organisms have heritable traits that evolved across time through random variation, with survival and reproduction in (changing) environments determining (mindlessly) which variants were successfully transmitted to the next generation. This not only provided the (true) alternative (c), but also the methodology for investigating which traits had been adaptive, how and why; it also led to the discovery of the genetic mechanism of the encoding, variation and evolution of heritable traits. Fodor also draws erroneous conclusions from the analogy between Darwinian evolution and Skinnerian reinforcement learning. Fodor's skepticism about both evolution and learning may be motivated by an overgeneralization of Chomsky's "poverty of the stimulus argument" -- from the origin of Universal Grammar (UG) to the origin of the "concepts" underlying word meaning, which, Fodor thinks, must be "endogenous," rather than evolved or learned.

    Construction Grammar and Language Models

    Get PDF
    Recent progress in deep learning and natural language processing has given rise to powerful models that are primarily trained on a cloze-like task and show some evidence of having access to substantial linguistic information, including some constructional knowledge. This groundbreaking discovery presents an exciting opportunity for a synergistic relationship between computational methods and Construction Grammar research. In this chapter, we explore three distinct approaches to the interplay between computational methods and Construction Grammar: (i) computational methods for text analysis, (ii) computational Construction Grammar, and (iii) deep learning models, with a particular focus on language models. We touch upon the first two approaches as a contextual foundation for the use of computational methods before providing an accessible, yet comprehensive overview of deep learning models, which also addresses reservations construction grammarians may have. Additionally, we delve into experiments that explore the emergence of constructionally relevant information within these models while also examining the aspects of Construction Grammar that may pose challenges for these models. This chapter aims to foster collaboration between researchers in the fields of natural language processing and Construction Grammar. By doing so, we hope to pave the way for new insights and advancements in both these fields. Accepted for publication in The Cambridge Handbook of Construction Grammar, edited by Mirjam Fried and Kiki Nikiforidou.
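    As a concrete illustration of the cloze-like training objective mentioned above, the following minimal Python sketch queries a BERT-style masked language model through the Hugging Face transformers library; the model and the probe sentence are my choices for illustration, not prescriptions from the chapter.

        # Minimal sketch (illustrative, not from the chapter): probing a masked
        # language model with a cloze-style query, here in a ditransitive frame.
        from transformers import pipeline

        unmasker = pipeline("fill-mask", model="bert-base-uncased")

        # Each prediction is a dict with the filled-in token and its probability.
        for prediction in unmasker("She [MASK] him a letter."):
            print(f"{prediction['token_str']:>12}  {prediction['score']:.3f}")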

    Languages, machines, and classical computation

    Get PDF
    3rd ed., 2021. A circumscription of the classical theory of computation, building up from the Chomsky hierarchy, with the usual topics in formal language and automata theory.
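    As a small taste of those usual topics, here is a minimal Python sketch (my illustration, not an excerpt from the book) simulating a deterministic finite automaton, the machine class for the regular languages at the bottom of the Chomsky hierarchy; this DFA accepts binary strings containing an even number of 1s.

        # Illustration only: a DFA over {0,1} accepting strings with an even
        # number of 1s, given as a transition table plus start/accept states.
        DFA = {
            "start": "even",
            "accept": {"even"},
            "delta": {
                ("even", "0"): "even", ("even", "1"): "odd",
                ("odd", "0"): "odd",   ("odd", "1"): "even",
            },
        }

        def dfa_accepts(dfa: dict, s: str) -> bool:
            state = dfa["start"]
            for symbol in s:
                state = dfa["delta"][(state, symbol)]
            return state in dfa["accept"]

        if __name__ == "__main__":
            for w in ["", "1", "11", "1011"]:
                print(repr(w), dfa_accepts(DFA, w))  # True, False, True, False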