
    Limited Attention and Discourse Structure

    This squib examines the role of limited attention in a theory of discourse structure and proposes a model of attentional state that relates current hierarchical theories of discourse structure to empirical evidence about human discourse processing capabilities. First, I present examples that are not predicted by Grosz and Sidner's stack model of attentional state. Then I consider an alternative model of attentional state, the cache model, which accounts for the examples and which makes particular processing predictions. Finally, I suggest a number of ways that future research could distinguish the predictions of the cache model and the stack model.

    Comment: 9 pages, uses twoside,cl,lingmacro
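    The contrast between the two models of attentional state can be illustrated in a few lines of code. The sketch below is an illustrative reduction, not the author's formulation: the cache capacity of 3 and the least-recently-used eviction policy are assumptions for the example, and the entity names are invented.

```python
from collections import OrderedDict

class StackModel:
    """Grosz and Sidner's stack model: focus spaces are pushed and
    popped at discourse segment boundaries; capacity is unbounded."""
    def __init__(self):
        self.spaces = []

    def push(self, focus_space):
        self.spaces.append(focus_space)

    def pop(self):
        return self.spaces.pop()

class CacheModel:
    """The cache model: a small working store of discourse entities.
    Entities are retained by recency of use and evicted when capacity
    is exceeded; evicted entities must be retrieved from long-term
    memory at a processing cost. Capacity 3 is an assumption."""
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.cache = OrderedDict()

    def access(self, entity):
        if entity in self.cache:
            self.cache.move_to_end(entity)   # refresh recency
            return True                      # cheap: cache hit
        self.cache[entity] = None            # costly: retrieval
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used
        return False

cache = CacheModel(capacity=3)
for e in ["Rosa", "the bank", "Rosa", "the loan", "the manager"]:
    cache.access(e)
print(list(cache.cache))  # → ['Rosa', 'the loan', 'the manager']
```

    The point of the contrast is that the stack never forgets what it has pushed, whereas the cache's bounded capacity predicts processing costs for returning to entities that have been evicted.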

    PERFORMING ANAPHORA IN MODERN GREEK: A NEO-GRICEAN PRAGMATIC ANALYSIS

    The paper addresses the problem of interpreting anaphoric NPs in Modern Greek. It proposes a novel analysis based on the systematic interaction of the neo-Gricean pragmatic principles of communication, which provides a neat and elegant approach to NP-anaphora resolution. The findings of this study provide evidence for an account of NP-anaphora in terms of the division of labour between syntax and pragmatics and, more precisely, in terms of the systematic interaction of the neo-Gricean pragmatic principles.

    'Now' with Subordinate Clauses

    We investigate a novel use of the English temporal modifier ‘now’, in which it combines with a subordinate clause. We argue for a univocal treatment of the expression, on which the subordinating use is taken as basic and the non-subordinating uses are derived. We start by surveying central features of the latter uses which have been discussed in previous work, before introducing key observations regarding the subordinating use of ‘now’ and its relation to deictic and anaphoric uses. All of these data, it is argued, can be accounted for on our proposed analysis. We conclude by comparing ‘now’ to a range of other expressions which exhibit similar behavior.

    Centering, Anaphora Resolution, and Discourse Structure

    Centering was formulated as a model of the relationship between attentional state, the form of referring expressions, and the coherence of an utterance within a discourse segment (Grosz, Joshi and Weinstein, 1986; Grosz, Joshi and Weinstein, 1995). In this chapter, I argue that the restriction of centering to operating within a discourse segment should be abandoned in order to integrate centering with a model of global discourse structure. The within-segment restriction causes three problems. The first problem is that centers are often continued over discourse segment boundaries with pronominal referring expressions whose form is identical to those that occur within a discourse segment. The second problem is that recent work has shown that listeners perceive segment boundaries at various levels of granularity; if centering models a universal processing phenomenon, it is implausible that each listener is using a different centering algorithm. The third problem is that even for utterances within a discourse segment, there are strong contrasts between utterances whose adjacent utterance within a segment is hierarchically recent and those whose adjacent utterance is linearly recent. This chapter argues that these problems can be eliminated by replacing Grosz and Sidner's stack model of attentional state with an alternative model, the cache model. I show how the cache model is easily integrated with the centering algorithm, and provide several types of data from naturally occurring discourses that support the proposed integrated model. Future work should provide additional support for these claims with an examination of a larger corpus of naturally occurring discourses.

    Comment: 35 pages, uses elsart12, lingmacros, named, psfi
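    The centering transitions that the chapter builds on (Grosz, Joshi and Weinstein, 1995) can be sketched as a small classifier. This is the standard four-way transition table, not code from the chapter, and the entity names in the usage example are invented; the chapter's claim is that such transitions should be computed across segment boundaries as well as within them.

```python
def transition(cb_prev, cb_curr, cp_curr):
    """Classify the centering transition between two utterances.

    cb_prev: backward-looking center (Cb) of the previous utterance
             (None if undefined, e.g. discourse-initially),
    cb_curr: Cb of the current utterance,
    cp_curr: preferred center (Cp, highest-ranked forward-looking
             center) of the current utterance.
    """
    if cb_prev is None or cb_curr == cb_prev:
        # Cb is retained from the previous utterance.
        return "CONTINUE" if cb_curr == cp_curr else "RETAIN"
    # Cb has changed.
    return "SMOOTH-SHIFT" if cb_curr == cp_curr else "ROUGH-SHIFT"

print(transition("Rosa", "Rosa", "Rosa"))      # → CONTINUE
print(transition("Rosa", "the bank", "Rosa"))  # → ROUGH-SHIFT
```

    Transitions are conventionally ranked CONTINUE > RETAIN > SMOOTH-SHIFT > ROUGH-SHIFT, so a sequence of utterances can be scored for local coherence by the transitions it induces.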

    Natural Language and its Ontology

    This paper gives a characterization of the ontology implicit in natural language and the entities it involves, situates natural language ontology within metaphysics, and responds to Chomsky's dismissal of externalist semantics.

    Reviving the parameter revolution in semantics

    Montague and Kaplan began a revolution in semantics, which promised to explain how a univocal expression could make distinct truth-conditional contributions in its various occurrences. The idea was to treat context as a parameter at which a sentence is semantically evaluated. But the revolution has stalled. One salient problem comes from recurring demonstratives: "He is tall and he is not tall". For the sentence to be true at a context, each occurrence of the demonstrative must make a different truth-conditional contribution. But this difference cannot be accounted for by standard parameter sensitivity. Semanticists, consoled by the thought that this ambiguity would ultimately be needed anyhow to explain anaphora, have been too content to posit massive ambiguities in demonstrative pronouns. This article aims to revive the parameter revolution by showing how to treat demonstrative pronouns as univocal while providing an account of anaphora that doesn't end up re-introducing the ambiguity.

    Generating Abstractive Summaries from Meeting Transcripts

    Summaries of meetings are very important as they convey the essential content of discussions in a concise form. Reading and understanding whole documents is generally time consuming, so summaries play an important role for readers interested only in the important content of the discussions. In this work, we address the task of meeting document summarization. Automatic summarization systems developed so far for meeting conversations have been primarily extractive, resulting in unacceptable summaries that are hard to read: the extracted utterances contain disfluencies that affect the quality of the extractive summaries. To make summaries much more readable, we propose an approach to generating abstractive summaries by fusing important content from several utterances. We first separate meeting transcripts into topic segments, and then identify the important utterances in each segment using a supervised learning approach. In the text generation step, the dependency parses of the important utterances in each segment are combined into a directed graph, and the most informative and well-formed sub-graph, selected by integer linear programming (ILP), is used to generate a one-sentence summary for each topic segment. The ILP formulation reduces disfluencies by leveraging grammatical relations that are more prominent in non-conversational text, and therefore generates summaries that are comparable to human-written abstractive summaries. Experimental results show that our method can generate more informative summaries than the baselines. In addition, readability assessments by human judges, as well as log-likelihood estimates obtained from the dependency parser, show that our generated summaries are highly readable and well-formed.

    Comment: 10 pages, Proceedings of the 2015 ACM Symposium on Document Engineering, DocEng' 201
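    The ILP selection step can be illustrated at toy scale. The sketch below solves, by exhaustive search, the tiny 0-1 knapsack problem that the paper's ILP generalizes: choose the subset of candidate relations that maximizes informativeness under a length budget. The candidate phrases, their scores, and the budget are invented for illustration, and the real system adds grammaticality constraints derived from the fused dependency graph.

```python
from itertools import combinations

# Candidate dependency relations fused from several utterances in one
# topic segment: (phrase, informativeness score, word count).
# All values here are invented for illustration.
candidates = [
    ("the committee approved", 3.0, 3),
    ("the revised budget", 2.5, 3),
    ("um you know", 0.1, 3),          # disfluency: scores low
    ("for next quarter", 1.5, 3),
]

def best_selection(cands, budget):
    """Exhaustively pick the subset of relations maximizing total
    informativeness subject to a word budget. A real system would
    hand the full problem, with grammaticality constraints, to an
    ILP solver rather than enumerate subsets."""
    best, best_score = (), 0.0
    for r in range(1, len(cands) + 1):
        for subset in combinations(cands, r):
            words = sum(c[2] for c in subset)
            score = sum(c[1] for c in subset)
            if words <= budget and score > best_score:
                best, best_score = subset, score
    return [c[0] for c in best]

print(best_selection(candidates, budget=9))
# → ['the committee approved', 'the revised budget', 'for next quarter']
```

    Note how the disfluent candidate is priced out by its low score: the budget is spent on contentful relations, which is the mechanism by which the ILP formulation suppresses disfluencies.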