Without Specifiers: Phrase Structure and Events
This dissertation attempts to unify two reductionist hypotheses: that there is no relational difference between specifiers and complements, and that verbs do not have thematic arguments. I argue that these two hypotheses actually bear on each other and that we get a better theory if we pursue both of them.
The thesis is centered on the following hypothesis: each application of Spell-Out corresponds to a conjunct at logical form. Creating such a system requires a syntax designed so that each Spell-Out domain is mapped onto a conjunct, which is achieved by eliminating the relational difference between specifiers and complements. The conjuncts are then conjoined into Neo-Davidsonian representations that constitute logical forms. The theory is argued to provide a transparent mapping from syntactic structures to logical forms, such that the syntax yields a logical form in which the verb has no thematic arguments. In essence, the thesis is therefore an investigation into the structure of verbs.
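A standard Neo-Davidsonian logical form of the kind referred to here (an illustrative textbook example, not one drawn from the dissertation itself) treats the verb as a one-place predicate of events, with each argument contributed by a separate thematic conjunct:

```latex
\exists e\,[\,\mathrm{stab}(e) \wedge \mathrm{Agent}(e,\mathrm{Brutus}) \wedge \mathrm{Theme}(e,\mathrm{Caesar})\,]
```

Here "Brutus stabbed Caesar" is existentially quantified over an event $e$; the verb contributes no thematic arguments of its own, so each conjunct can in principle correspond to a separate Spell-Out domain.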
This theory of Spell-Out raises a number of questions and makes strong predictions about the structure of possible derivations. The thesis discusses a number of these: the nature of linearization and movement, left-branch extractions, and serial verb constructions, among others. It is shown how the present theory captures these phenomena, sometimes in better ways than previous analyses.
The thesis closes by discussing some more foundational issues related to transparency, the syntax-semantics interface, and the nature of basic semantic composition operations.
Coordination and comparatives
Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Linguistics and Philosophy, 1992, by Friederike Moltmann. Includes bibliographical references (v. 2, leaves 382–385).
On the Formal Flexibility of Syntactic Categories
This dissertation explores the formal flexibility of syntactic categories. The main proposal is that Universal Grammar (UG) only provides templatic guidance for syntactic category formation and organization but leaves many other issues open, including issues internal to a single category and issues at the intercategorial, system level: these points that UG "does not care about" turn out to enrich the categorial ontology of human language in important ways.
The dissertation consists of seven chapters. After a general introduction in Chapter 1, I lay out some foundational issues regarding features and categories in Chapter 2 and delineate a featural metalanguage comprising four components: specification, valuation, typing, and granularity. Based on that, I put forward a templatic definition for syntactic categories, which unifies the combinatorial and taxonomic perspectives under the notion of the mergeme. Then, a detailed overview of the "categorial universe" I work with is presented, which shows that the syntactic category system (SCS) is an intricate web structured by five layers of abstraction divided into three broad levels of concern: the individual level (layers 1–2), the global level (layers 3–4), and the supraglobal level (layer 5). In the subsequent chapters I explore the template-flexibility pairs at each abstraction layer, with Chapters 3–4 focusing on the first layer, Chapter 5 on the second layer, and Chapter 6 on the third and fourth layers; the fifth layer is not in the scope of this dissertation.
Chapter 3 examines a special type of category defined by an underspecified mergeme, the defective category, which behaves like a "chameleon" in that it gets assimilated into whatever nondefective category it merges with. This characteristic makes it potentially useful in analyzing certain adjunction structures, and I explore this potential through two case studies, one focusing on modifier-head compounds and the other on sentence-final particles. Chapter 4 examines another special type of category defined by the absence of a mergeme, the Root category. Deductive reasoning leads me to propose a generalized root syntax, according to which roots are not confined to lexical categorial environments but may legally merge with and hence "support" any non-Root category. I demonstrate the empirical consequences of this theory through a comprehensive study of the half-lexical, half-functional vocabulary items in Chinese.
Chapter 5 ascends to the second abstraction layer and raises the question of whether the categorial sequences (or projection hierarchies) in human language are necessarily totally ordered, as certain analytical devices (e.g., "flavored" categories) can only be theoretically maintained if we also allow categorial sequences to be partially ordered. After a diachronic study of the flavored verbalizer (stative) in Chinese resultative compounds, I conclude that while "flavoring" is indeed a possible type of flexibility in the SCS, it is the deviation rather than the norm, arising from non-UG or "third" factors, and hence should be used cautiously in syntactic analyses.
Chapter 6 ascends even higher on the ladder of abstraction and examines the global interconnection in the SCS ontology with the aid of mathematical Category theory. I formalize the functional parallelism across major parts of speech and the inheritance-based relations across granularity levels as Category-theoretic structures, which reveal further and more abstract templates and flexibility types in the SCS. A crucial mathematical concept in the formalization is epi-Adjunction. Finally, in Chapter 7 I summarize the main results of this dissertation and briefly discuss some potential directions for future research. My PhD is funded by the Cambridge Trust and the China Scholarship Council. I have also received travel grants and financial aid from Gonville and Caius College and the Faculty of Modern and Medieval Languages.
Aspects of emergent cyclicity in language and computation
This thesis has four parts, which correspond to the presentation and development of a theoretical
framework for the study of cognitive capacities qua physical phenomena, and a case study of locality conditions over natural languages.
Part I deals with computational considerations, setting the tone of the rest of the thesis, and introducing and defining critical concepts like 'grammar', 'automaton', and the relations between them. Fundamental questions concerning the place of formal language theory in
linguistic inquiry, as well as the expressibility of linguistic and computational concepts in
common terms, are raised in this part.
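The grammar-automaton relation raised here can be illustrated with its simplest instance, the regular (Type-3) level of the Chomsky hierarchy: a right-linear grammar and the finite-state automaton that recognizes the same language. The grammar and code below are a minimal sketch for illustration, not material from the thesis.

```python
# Illustrative sketch of the grammar-automaton correspondence at the
# regular (Type-3) level. The right-linear grammar S -> a S | b generates
# the language a*b; the finite-state automaton below recognizes it.

def make_dfa():
    """Transition table of a DFA for a*b, mirroring the grammar S -> aS | b."""
    # States: "S" (start, nonterminal S still active), "F" (accepting),
    # "X" (dead-end sink).
    return {
        ("S", "a"): "S",  # rule S -> a S: consume 'a', S remains active
        ("S", "b"): "F",  # rule S -> b: consume 'b', derivation complete
        ("F", "a"): "X",  # nothing may follow the final 'b'
        ("F", "b"): "X",
        ("X", "a"): "X",
        ("X", "b"): "X",
    }

def accepts(string):
    """Run the DFA over the input; accept iff it halts in state F."""
    state = "S"
    trans = make_dfa()
    for ch in string:
        if (state, ch) not in trans:
            return False  # symbol outside the alphabet {a, b}
        state = trans[(state, ch)]
    return state == "F"
```

Each grammar rule corresponds line-by-line to a transition, which is the sense in which a grammar is "implemented by means of" an automaton; context-free and context-sensitive grammars require correspondingly stronger machines.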
Part II further explores the issues addressed in Part I with particular emphasis on how
grammars are implemented by means of automata, and the properties of the formal languages
that these automata generate. We will argue against the equation between effective computation
and function-based computation, and introduce examples of computable procedures which are
nevertheless impossible to capture using traditional function-based theories. The connection
with cognition will be made in the light of dynamical frustrations: the irreconcilable tension
between mutually incompatible tendencies that hold for a given dynamical system. We will
provide arguments in favour of analyzing natural language as emerging from a tension between
different systems (essentially, semantics and morpho-phonology) which impose orthogonal
requirements over admissible outputs. The concept of level of organization or scale comes to
the foreground here; and apparent contradictions and incommensurabilities between concepts
and theories are revisited in a new light: that of dynamical nonlinear systems which are
fundamentally frustrated. We will also characterize the computational system that emerges from
such an architecture: the goal is to get a syntactic component which assigns the simplest
possible structural description to sub-strings, in terms of its computational complexity. A
system which can oscillate back and forth in the hierarchy of formal languages in assigning
structural representations to local domains will be referred to as a computationally mixed
system.
Part III is where the really fun stuff starts. Field theory is introduced, and its applicability to
neurocognitive phenomena is made explicit, with all due scale considerations. Physical and
mathematical concepts are permanently interacting as we analyze phrase structure in terms of
pseudo-fractals (in Mandelbrot’s sense) and define syntax as a (possibly unary) set of
topological operations over completely Hausdorff (CH) ultrametric spaces. These operations,
which make field perturbations interfere, transform that initial completely Hausdorff
ultrametric space into a metric Hausdorff space with a weaker separation axiom. Syntax, in this
proposal, is not 'generative' in any traditional sense (except the 'fully explicit theory' one):
rather, it partitions (technically, ‘parametrizes’) a topological space. Syntactic dependencies are
defined as interferences between perturbations over a field, which reduce the total entropy of
the system per cycle, at the cost of introducing further dimensions where attractors
corresponding to interpretations for a phrase marker can be found.
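For reference, the standard mathematical background invoked here (general definitions, not notation specific to the thesis): an ultrametric strengthens the ordinary triangle inequality, so passing from an ultrametric to a plain metric is the kind of weakening the abstract describes.

```latex
% Ultrametric (strong triangle) inequality, for all points x, y, z:
d(x, z) \le \max\{\, d(x, y),\ d(y, z) \,\}
% An ordinary metric satisfies only the weaker triangle inequality:
d(x, z) \le d(x, y) + d(y, z)
```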
Part IV is a sample of what we can gain by further pursuing the physics of language approach,
both in terms of empirical adequacy and theoretical elegance, not to mention the unlimited
possibilities of interdisciplinary collaboration. In this section we set our focus on island
phenomena as defined by Ross (1967), critically revisiting the most relevant literature on this
topic, and establishing a typology of constructions that are strong islands, which cannot be
violated. These constructions are particularly interesting because they limit the phase space of
what is expressible via natural language, and thus reveal crucial aspects of its underlying
dynamics. We will argue that a dynamically frustrated system which is characterized by
displaying mixed computational dependencies can provide straightforward characterizations of
cyclicity in terms of changes in dependencies in local domains.
Handbook of Lexical Functional Grammar
Lexical Functional Grammar (LFG) is a nontransformational theory of
linguistic structure, first developed in the 1970s by Joan Bresnan and
Ronald M. Kaplan, which assumes that language is best described and
modeled by parallel structures representing different facets of
linguistic organization and information, related by means of
functional correspondences. This volume has seven parts. Part I,
Overview and Introduction, provides an introduction to core syntactic
concepts and representations. Part II, Grammatical Phenomena, reviews
LFG work on a range of grammatical phenomena or constructions. Part
III, Grammatical modules and interfaces, provides an overview of LFG
work on semantics, argument structure, prosody, information structure,
and morphology. Part IV, Linguistic disciplines, reviews LFG work in
the disciplines of historical linguistics, learnability,
psycholinguistics, and second language learning. Part V, Formal and
computational issues and applications, provides an overview of
computational and formal properties of the theory, implementations,
and computational work on parsing, translation, grammar induction, and
treebanks. Part VI, Language families and regions, reviews LFG work
on languages spoken in particular geographical areas or in particular
language families. The final section, Comparing LFG with other
linguistic theories, discusses LFG work in relation to other
theoretical approaches.