    Grammar, Ontology, and the Unity of Meaning

    Words have meaning. Sentences also have meaning, but their meaning differs in kind from any collection of the meanings of the words they contain. I discuss two puzzles related to this difference. The first is how the meanings of the parts of a sentence combine to give rise to a unified sentential meaning, as opposed to a mere collection of disparate meanings (UP1). The second is why the formal ontology of linguistic meaning changes when grammatical structure is built up (UP2). For example, the meaning of a sentence is a proposition evaluable for truth and falsity; in contrast, a collection of the meanings of its parts does not constitute a proposition and is not evaluable for truth. These two puzzles are closely related, since a change in formal ontology is the clearest sign of the unity of meaning. The most popular strategy for answering them is to take the meanings of the parts as abstractions from primitive sentence meanings. However, I argue that, given plausible psychological constraints, sentence meanings cannot be taken as explanatory primitives. Drawing on recent work in Generative Grammar and its philosophy, I suggest that the key to both unity questions is to distinguish strictly between lexical and grammatical meaning. The latter is irreducible and determines how lexical content is used in referential acts. I argue that these referential properties determine a formal ontology, which explains why and how formal ontology changes when grammatical structure is built up (UP2). As for UP1, I suggest that, strictly speaking, lexical meanings never combine. Instead, whenever grammar specifies a formal ontology for the lexical meanings entering a grammatical derivation, further lexical (or phrasal) meanings can only specify aspects of this recursive grammatical process. In this way, contemporary grammatical theory can be used to address old philosophical problems.