An Abstract Machine for Unification Grammars
This work describes the design and implementation of an abstract machine,
Amalia, for the linguistic formalism ALE, which is based on typed feature
structures. This formalism is one of the most widely accepted in computational
linguistics and has been used for designing grammars in various linguistic
theories, most notably HPSG. Amalia is composed of data structures and a set of
instructions, augmented by a compiler from the grammatical formalism to the
abstract instructions, and a (portable) interpreter of the abstract
instructions. The effect of each instruction is defined using a low-level
language that can be executed on ordinary hardware.
The advantages of the abstract machine approach are twofold. From a
theoretical point of view, the abstract machine gives a well-defined
operational semantics to the grammatical formalism. This ensures that grammars
specified using our system are endowed with a well-defined meaning. It makes it
possible, for example, to formally verify the correctness of a compiler for HPSG, given
an independent definition. From a practical point of view, Amalia is the first
system that employs a direct compilation scheme for unification grammars that
are based on typed feature structures. The use of Amalia results in a much
improved performance over existing systems.
In order to test the machine on a realistic application, we have developed a
small-scale, HPSG-based grammar for a fragment of the Hebrew language, using
Amalia as the development platform. This is the first application of HPSG to a
Semitic language.
Comment: Doctoral thesis, 96 pages, many PostScript figures; uses pstricks,
pst-node, psfig, fullname and a macros file
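The core operation that an engine like Amalia must perform repeatedly is the unification of feature structures. As a rough illustration only (Amalia itself compiles *typed* feature structures to abstract-machine instructions; this hypothetical sketch uses plain, untyped structures modelled as nested dictionaries):

```python
def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on clash."""
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        return fs1 if fs1 == fs2 else None  # atomic values must be identical
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:
                return None  # a clash anywhere makes the whole unification fail
            result[feat] = sub
        else:
            result[feat] = val  # feature present in only one structure: copy it
    return result

# Compatible agreement features merge...
np = unify({"cat": "NP", "agr": {"num": "sg"}},
           {"agr": {"num": "sg", "per": "3"}})
# ...while conflicting values fail.
clash = unify({"agr": {"num": "sg"}}, {"agr": {"num": "pl"}})
```

A direct-compilation scheme in the spirit of the abstract turns such recursive traversals into flat sequences of machine instructions specialised per grammar rule, which is where the reported speed-up over interpreted systems comes from.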
Classification-based phrase structure grammar: an extended revised version of HPSG
This thesis is concerned with a presentation of Classification-based Phrase Structure
Grammar (or cPSG), a grammatical theory that has grown out of extensive revisions
of, and extensions to, HPSG. The fundamental difference between this theory and HPSG
concerns the central role that classification plays in the grammar: the grammar classifies
strings, according to their feature structure descriptions, as being of various types.
Apart from the role of classification, the theory bears a close resemblance to HPSG,
though it is by no means a direct translation: it includes numerous revisions and extensions.
A central goal in the development of the theory has been its computational
implementation, which is included in the thesis.

The presentation may be divided into four parts. In the first, chapters 1 and 2, we
present the grammatical formalism within which the theory is stated. This consists of a
development of the notion of a classificatory system (chapter 1), and the incorporation
of hierarchality into that notion (chapter 2).

The second part concerns syntactic issues. Chapter 3 revises the HPSG treatment of
specifiers, complements and adjuncts, incorporating ideas that specifiers and complements
should be distinguished and presenting a treatment of adjuncts whereby the
head is selected for by the adjunct. Chapter 4 presents several options for an account of
unbounded dependencies. The accounts are based loosely on that of GPSG, and a reconstruction
of GPSG's Foot Feature Principle is presented which does not involve a notion
of default. Chapter 5 discusses coordination, employing an extension of Rounds-Kasper
logic to allow a treatment of cross-categorial coordination.

In the third part, chapters 6, 7 and 8, we turn to semantic issues. We begin (Chapter 6)
with a discussion of Situation Theory, the background semantic theory, attempting to
establish a precise and coherent version of the theory within which to work. Chapter 7
presents the bulk of the treatment of semantics, and can be seen as an extensive revision
of the HPSG treatment of semantics. The aim is to provide a semantic treatment which
is faithful to the version of Situation Theory presented in Chapter 6. Chapter 8 deals
with quantification, discussing the nature of quantification in Situation Theory before
presenting a treatment of quantification in cPSG. Some residual questions about the
semantics of coordinated noun phrases are also addressed in this chapter.

The final part, Chapter 9, concerns the actual computational implementation of the
theory. A parsing algorithm based on hierarchical classification is presented, along with
four strategies that might be adopted given that algorithm. Also discussed are some
implementation details. A concluding chapter summarises the arguments of the thesis
and outlines some avenues for future research.
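The idea of classifying strings by descending a hierarchy of types can be illustrated with a toy example. This is a hypothetical sketch, not the thesis's algorithm: each node in the hierarchy pairs a type name with a membership test, and classification returns the most specific type whose test the input satisfies.

```python
# Toy type hierarchy: each entry maps a type to (membership test, subtypes).
# The types and tests are illustrative only.
HIERARCHY = {
    "sign": (lambda s: True, ["phrase", "word"]),
    "word": (lambda s: " " not in s, []),
    "phrase": (lambda s: " " in s, ["headed-phrase"]),
    "headed-phrase": (lambda s: s.split()[0].islower(), []),
}

def classify(string, node="sign"):
    """Return the most specific type in the hierarchy matching the string."""
    test, subtypes = HIERARCHY[node]
    if not test(string):
        return None
    # Try to refine the classification; keep the first, most specific match.
    for sub in subtypes:
        refined = classify(string, sub)
        if refined:
            return refined
    return node  # no subtype matched: this node is the answer

# classify("runs") refines sign -> word;
# classify("the cat runs") refines sign -> phrase -> headed-phrase.
```

A parser built on this scheme can exploit the hierarchy's structure: once a string fails the test for a type, the whole subtree below that type is pruned, which is the source of the four parsing strategies the thesis compares.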
Morphology Within the Parallel Architecture Framework: the Centrality of the Lexicon Below the Word Level
The Parallel Architecture (PA) framework (Jackendoff 2002, 2007; Culicover & Jackendoff 2005) is one of the most complete constraint-based linguistic theories, encompassing phonology, syntax and semantics. However, it lacks a fully developed model of word formation. More recently, a theory called Relational Morphology (RM) (Jackendoff & Audring 2020) has been developed that integrates into the PA. The current study shows how the Slot Structure model (SSM; Benavides 2003, 2009, 2010), which is compatible with the PA and is based on the dual-route model and percolation of features (Pinker 1999, 2006; Huang & Pinker 2010), can provide a better account of morphology than RM and can also be incorporated into the PA, thus helping to make it a more explanatory framework. Spanish data are used as the basis to demonstrate the implementation of the SSM. The current paper demonstrates two key problems for RM, inconsistent and confusing coindexation and a proliferation of schemas, and shows that these issues do not arise in the Slot Structure model. Overall, the paper points out significant drawbacks in the RM framework, while at the same time showing how the PA's morphological component can be enriched with the Slot Structure model.
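Percolation of features, one of the mechanisms the abstract mentions, can be sketched in miniature: the head affix of a derived word projects its features up to the whole word, while the base supplies anything the affix leaves unspecified. The Spanish example and all feature names below are illustrative assumptions, not the paper's actual analysis.

```python
def derive(base, affix):
    """Combine a base with a head affix; head features percolate upward."""
    word = {"orth": base["orth"] + affix["orth"]}
    # Copy base features first, then let the head (affix) override them,
    # so the affix's category "wins" while unspecified slots come from the base.
    for feats in (base, affix):
        for key, val in feats.items():
            if key != "orth":
                word[key] = val
    return word

# Illustrative Spanish derivation: nacion + -al -> nacional (adjective).
nacion = {"orth": "nacion", "cat": "N", "gender": "f"}
al = {"orth": "al", "cat": "A"}  # adjectivizing suffix, the head
adj = derive(nacion, al)
# adj["cat"] is "A": the suffix's category percolates to the derived word.
```

The dual-route idea fits naturally on top of this: regular derivations go through a combinatorial routine like `derive`, while irregular forms are simply stored whole in the lexicon.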
Inheritance and complementation: a case study of easy adjectives and related nouns
Mechanisms for representing lexically the bulk of syntactic and semantic information for a language have been under active development, as is evident in the recent studies contained in this volume. Our study serves to highlight some of the most useful tools available for structured lexical representation, in particular (multiple) inheritance, default specification, and lexical rules. It then illustrates the value of these mechanisms in illuminating one corner of the lexicon involving an unusual kind of complementation among a group of adjectives exemplified by easy. The virtues of the structured lexicon are its succinctness and its tendency to highlight significant clusters of linguistic properties. From its succinctness follow two practical advantages, namely its ease of maintenance and modifiability. In order to suggest how important these may be practically, we extend the analysis of adjectival complementation in several directions. These further illustrate how the use of inheritance in lexical representation permits exact and explicit characterizations of phenomena in the language under study. We demonstrate how the use of the mechanisms employed in the analysis of easy enables us to give a unified account of related phenomena featuring nouns like pleasure, and even the adverbs (adjectival specifiers) too and enough. Along the way we motivate some elaborations of the Head-Driven Phrase Structure Grammar (HPSG) framework in which we couch our analysis, and offer several avenues for further study of this part of the English lexicon.
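How multiple inheritance with overridable defaults keeps a lexicon succinct can be suggested with a small sketch. The class names, features, and the complement value below are hypothetical illustrations, not the paper's type hierarchy: an easy-type adjective inherits general adjective defaults from one ancestor and the unusual complementation pattern from another, with the more specific constraint overriding the default.

```python
class LexicalType:
    constraints = {}

    @classmethod
    def feature_structure(cls):
        # Merge constraints from all ancestors. Python's MRO lists classes
        # most-specific first, so walking it in reverse lets specific
        # constraints override inherited defaults.
        fs = {}
        for ancestor in reversed(cls.__mro__):
            fs.update(getattr(ancestor, "constraints", {}))
        return fs

class Adjective(LexicalType):
    # Default: adjectives take no complements.
    constraints = {"cat": "adj", "comps": []}

class ToughPredicate(LexicalType):
    # The shared "tough" complementation pattern (illustrative notation):
    # an infinitival VP containing an NP gap.
    constraints = {"comps": ["VP[inf, gap:NP]"]}

class EasyAdjective(ToughPredicate, Adjective):
    constraints = {"orth": "easy"}

fs = EasyAdjective.feature_structure()
# The empty comps default from Adjective is overridden by the complementation
# constraint inherited from ToughPredicate; only "orth" is stated locally.
```

The practical payoff matches the abstract's point about maintainability: a revision to the shared pattern is made once, on the class that states it, and every lexical entry inheriting from it changes with no further edits.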