
    From surface dependencies towards deeper semantic representations

    In the past, a divide could be seen between ’deep’ parsers on the one hand, which construct a semantic representation out of their input, but usually have significant coverage problems, and more robust parsers on the other hand, which are usually based on a (statistical) model derived from a treebank and have larger coverage, but leave the problem of semantic interpretation to the user. More recently, approaches have emerged that combine the robustness of data-driven (statistical) models with more detailed linguistic interpretation such that the output could be used for deeper semantic analysis. Cahill et al. (2002) use a PCFG-based parsing model in combination with a set of principles and heuristics to derive functional (f-)structures of Lexical-Functional Grammar (LFG). They show that the derived functional structures have a better quality than those generated by a parser based on a state-of-the-art hand-crafted LFG grammar. Advocates of Dependency Grammar usually point out that dependencies already are a semantically meaningful representation (cf. Menzel, 2003). However, parsers based on dependency grammar normally create underspecified representations with respect to certain phenomena such as coordination, apposition and control structures. In these areas they are too "shallow" to be directly used for semantic interpretation.

    In this paper, we adopt a similar approach to Cahill et al. (2002), using a dependency-based analysis to derive functional structure, and demonstrate the feasibility of this approach using German data. A major focus of our discussion is on the treatment of coordination and other potentially underspecified structures of the dependency input.

    F-structure is one of the two core levels of syntactic representation in LFG (Bresnan, 2001). Independently of surface order, it encodes abstract syntactic functions that constitute predicate-argument structure and other dependency relations such as subject, predicate and adjunct, as well as further semantic information such as the semantic type of an adjunct (e.g. directional). Normally, an f-structure is represented as a recursive attribute-value matrix, which is isomorphic to a directed graph representation. Figure 5 depicts an example target f-structure. As mentioned earlier, these deeper-level dependency relations can be used to construct logical forms as in the approaches of van Genabith and Crouch (1996), who construct underspecified discourse representations (UDRSs), and Spreyer and Frank (2005), who have robust minimal recursion semantics (RMRS) as their target representation. We therefore think that f-structures are a suitable target representation for automatic syntactic analysis in a larger pipeline mapping text to interpretation.

    In this paper, we report on the conversion from dependency structures to f-structures. Firstly, we evaluate the f-structure conversion in isolation, starting from hand-corrected dependencies based on the TüBa-D/Z treebank and Versley's (2005) conversion. Secondly, we start from tokenized text to evaluate the combined process of automatic parsing (using Foth and Menzel's (2006) parser) and f-structure conversion. As a test set, we randomly selected 100 sentences from TüBa-D/Z, which we annotated using a scheme very close to that of the TiGer Dependency Bank (Forst et al., 2004). In the next section, we sketch dependency analysis, the underlying theory of our input representations, and introduce four different representations of coordination. We also describe Weighted Constraint Dependency Grammar (WCDG), the dependency parsing formalism that we use in our experiments. Section 3 characterises the conversion of dependencies to f-structures. Our evaluation is presented in section 4, and finally, section 5 summarises our results and gives an overview of problems remaining to be solved.
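
    The abstract above describes an f-structure as a recursive attribute-value matrix that is isomorphic to a directed graph. The following is a minimal illustrative sketch, not the authors' code: it encodes a small, invented f-structure as nested Python dictionaries (attribute names such as PRED, SUBJ and ADJUNCT follow common LFG practice) and enumerates the equivalent directed-graph view.

```python
# Minimal illustrative sketch (not the authors' code): an LFG f-structure
# modelled as a recursive attribute-value matrix using nested dicts.
# Attribute names (PRED, SUBJ, ADJUNCT, ...) follow common LFG practice;
# the example values and the edges() helper are hypothetical.

f_structure = {
    "PRED": "laufen<SUBJ>",     # predicate with its argument frame
    "TENSE": "past",
    "SUBJ": {                   # governable grammatical function
        "PRED": "Kind",
        "NUM": "sg",
        "CASE": "nom",
    },
    "ADJUNCT": [                # set-valued attribute for modifiers
        {"PRED": "schnell", "ADJUNCT-TYPE": "manner"},
    ],
}

def edges(fs, path=()):
    """Enumerate the directed-graph view of the attribute-value matrix."""
    for attr, value in fs.items():
        if isinstance(value, dict):
            yield from edges(value, path + (attr,))
        elif isinstance(value, list):
            for i, sub in enumerate(value):
                yield from edges(sub, path + (attr, str(i)))
        else:
            yield ("/".join(path) or "root", attr, value)

for node, attr, value in edges(f_structure):
    print(f"{node}\t{attr} = {value}")
```

    Running the sketch prints one line per terminal attribute (e.g. `SUBJ  PRED = Kind`), which is one simple way to make the graph reading of the attribute-value matrix explicit.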

    Quantum Aspects of Semantic Analysis and Symbolic Artificial Intelligence

    Modern approaches to semantic analysis, if reformulated as Hilbert-space problems, reveal formal structures known from quantum mechanics. A similar situation is found in distributed representations of cognitive structures developed for the purposes of neural networks. We take a closer look at the similarities and differences between the above two fields and quantum information theory. Comment: version accepted in J. Phys. A (Letter to the Editor).

    Predication at the interface

    We try to show that predication plays a greater role in syntax than commonly assumed. Specifically, we will argue that predication to a large extent determines both the phrase structure of clauses and triggers syntactic processes that take place in clauses. If we are on the right path, this implies that syntax is basically semantically driven, given that predication is semantically construed.

    Methodological Principles of Investigating Semantic Structure of Ukrainian Axionomens of the Danube Region

    The article is dedicated to describing a procedure for the formalized analysis of axionomens. A matrix method for investigating words denoting spiritual values in the modern Ukrainian language is proposed. The matrix is defined as a two-dimensional structure which replaces the oversimplified notation systems used in componential analysis. It enables a researcher to study all the interconnections between the related meanings of different lexical units as well as between different meanings of a specific lexical unit. The matrix consists of two axes: a vertical one, which indicates the lexical stock, and a horizontal one, which represents the seme stock of the collected language material. Applying the matrix method in practice shows that the structural organization of the axiovocabulary becomes considerably more complicated, and that the internal mechanisms and dynamics of the semantic interactions of axionomens are revealed under the influence of extra-linguistic factors. The matrix presentation of non-material values makes it possible to describe in detail the structure of the lexical meanings of axionouns, which are not in chaotic order but clearly organized, to distinguish the degree of their related semantics, and to expose the functional character of the semes forming definite structures within the framework of the analyzed words. The proposed methodology for researching the relations between lexico-semantic groups is considered promising for studying all lexical sub-systems of the value paradigms of the English and French language societies.
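
    The matrix described above is essentially a two-dimensional table with lexical units on one axis and semes on the other. As a minimal illustrative sketch (the lexemes and semes below are invented placeholders, not the article's data), such a matrix and the degree of related semantics between two words could be modelled as follows:

```python
# Illustrative sketch only (the lexemes and semes are invented placeholders,
# not the article's data): a componential-analysis matrix with lexical units
# on the vertical axis and semes on the horizontal axis; 1 marks that a seme
# belongs to the meaning of the lexeme.

semes = ["spiritual", "moral", "social", "positive evaluation"]

matrix = {
    #                     spiritual  moral  social  pos. eval.
    "dobro (goodness)":  [1,         1,     0,      1],
    "chest (honour)":    [0,         1,     1,      1],
    "volia (freedom)":   [1,         0,     1,      1],
}

def shared_semes(a, b):
    """Semes common to two lexemes: a rough degree of related semantics."""
    return [s for s, x, y in zip(semes, matrix[a], matrix[b]) if x and y]

print(shared_semes("dobro (goodness)", "chest (honour)"))
# -> ['moral', 'positive evaluation']
```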

    Small clause results revisited

    The main purpose of this paper is to show that argument structure constructions like complex telic path of motion constructions (John walked to the store) or complex resultative constructions (The dog barked the chickens awake) are not to be regarded as "theoretical entities" (Jackendoff (1997b); Goldberg (1995)). As an alternative to these semanticocentric accounts, I argue that their epiphenomenal status can be shown iff we take into account some important insights from three syntactically-oriented works: (i) Hoekstra's (1988, 1992) analysis of small clause results, (ii) Hale & Keyser's (1993f.) configurational theory of argument structure, and (iii) Mateu & Rigau's (1999; i.p.) syntactic account of Talmy's (1991) typological distinction between 'satellite-framed languages' (e.g., English, German, Dutch, etc.) and 'verb-framed languages' (e.g., Catalan, Spanish, French, etc.). In particular, it is argued that the formation of the above-mentioned constructions involves a conflation process of two different syntactic argument structures, this process being carried out via a 'generalized transformation'. Accordingly, the so-called 'lexical subordination process' (Levin & Rapoport (1988)) is argued to involve a syntactic operation rather than a semantic one. Since we assume that the parametric variation involved in the constructions under study cannot be explained in purely semantic terms (Mateu & Rigau (1999)), Talmy's (1991) typological distinction is argued to be better stated in lexical syntactic terms.

    Proceedings of the Workshop Semantic Content Acquisition and Representation (SCAR) 2007

    This is the proceedings of the Workshop on Semantic Content Acquisition and Representation, held in conjunction with NODALIDA 2007 on May 24, 2007 in Tartu, Estonia.