
    Assessing the contribution of shallow and deep knowledge sources for word sense disambiguation

    Corpus-based techniques have proved to be very beneficial in the development of efficient and accurate approaches to word sense disambiguation (WSD), despite the fact that they generally represent relatively shallow knowledge. It has always been thought, however, that WSD could also benefit from deeper knowledge sources. We describe a novel approach to WSD that uses inductive logic programming to learn theories from first-order logic representations, allowing corpus-based evidence to be combined with any kind of background knowledge. This approach has been shown to be effective over several disambiguation tasks using a combination of deep and shallow knowledge sources. It is important to understand the contribution of the various knowledge sources used in such a system. This paper investigates the contribution of nine knowledge sources to the performance of the disambiguation models produced for the SemEval-2007 English lexical sample task. The outcome of this analysis will assist future work on WSD in concentrating on the most useful knowledge sources.
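
    A minimal sketch of how such a contribution analysis could be organised, assuming a generic ablation setup rather than the paper's ILP-based system: retrain (or re-score) the disambiguation model with each knowledge source removed in turn and record the drop in accuracy. All function and source names below are hypothetical.

```python
# Illustrative ablation analysis (hypothetical helper and source names):
# estimate how much each knowledge source contributes by retraining without it
# and measuring the drop in disambiguation accuracy on held-out data.
from typing import Callable, Dict, List, Sequence


def ablation_contributions(
    train_and_score: Callable[[Sequence[str]], float],
    knowledge_sources: List[str],
) -> Dict[str, float]:
    """Return the accuracy drop attributable to each knowledge source."""
    baseline = train_and_score(knowledge_sources)  # all sources enabled
    contributions = {}
    for source in knowledge_sources:
        reduced = [s for s in knowledge_sources if s != source]
        contributions[source] = baseline - train_and_score(reduced)
    return contributions


if __name__ == "__main__":
    # Dummy scorer standing in for a real train-and-evaluate cycle.
    weights = {"bag_of_words": 0.04, "pos_tags": 0.02, "syntactic_relations": 0.05,
               "selectional_preferences": 0.03, "dictionary_codes": 0.01}

    def fake_score(sources: Sequence[str]) -> float:
        return 0.60 + sum(weights[s] for s in sources)

    for src, gain in ablation_contributions(fake_score, list(weights)).items():
        print(f"{src}: +{gain:.2f} accuracy")
```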

    Negation, 'presupposition' and the semantics/pragmatics distinction

    A cognitive pragmatic approach is taken to some long-standing problem cases of negation, the so-called presupposition denial cases (e.g. 'The king of France is not bald: there is no king of France'). It is argued that a full account of the processes and levels of representation involved in their interpretation typically requires the sequential pragmatic derivation of two different propositions expressed. In the first, the presupposition is preserved; following the rejection of this proposition, the second involves the echoic (metalinguistic) use of material falling in the scope of the negation. The semantic basis for these processes is the standard anti-presuppositionalist wide-scope negation. A different view, developed by Burton-Roberts (1989a, b), takes presupposition to be a semantic relation encoded in natural language and so argues for a negation operator that does not cancel presuppositions. This view is shown to be flawed: it makes the false prediction that presupposition denial cases are semantic contradictions, and it rests on too narrow a view of the role of pragmatic inferencing.

    Remarks on the architecture of OT syntax grammars

    This paper argues for a particular architecture of OT syntax. This architecture has three core features: i) it is bidirectional: the usual production-oriented optimisation (called ‘first optimisation’ here) is accompanied by a second step that checks the recoverability of an underlying form; ii) this underlying form already contains a full-fledged syntactic specification; iii) the procedure checking for recoverability, especially, makes crucial use of semantic and pragmatic factors. The first section motivates the basic architecture. The second section shows, with two examples, how contextual factors are integrated. The third section examines its implications for learning theory, and the fourth section concludes with a broader discussion of the advantages and disadvantages of the proposed model.
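
    A minimal sketch of the two-step evaluation this architecture implies, with entirely hypothetical constraints, candidates and violation profiles: a production-oriented first optimisation followed by a recoverability check against the fully specified underlying form.

```python
# Minimal sketch of bidirectional evaluation: constraint names, weights,
# candidates and violation profiles are hypothetical stand-ins. Weighted sums
# crudely approximate strict constraint ranking.
from typing import Dict, List

CONSTRAINTS: Dict[str, int] = {"FAITH": 100, "OP-SPEC": 10, "STAY": 1}

# Toy violation profiles; a real grammar would compute these from structure.
VIOLATIONS: Dict[str, Dict[str, int]] = {
    "who did you see": {"FAITH": 0, "OP-SPEC": 0, "STAY": 1},
    "you saw who":     {"FAITH": 0, "OP-SPEC": 1, "STAY": 0},
}


def first_optimisation(candidates: List[str]) -> str:
    """Production-oriented step: pick the candidate with the lowest cost."""
    def cost(c: str) -> int:
        return sum(CONSTRAINTS[k] * v for k, v in VIOLATIONS[c].items())
    return min(candidates, key=cost)


def recoverable(surface: str, underlying: str,
                interpretations: Dict[str, str]) -> bool:
    """Second step: check that the winner maps back to the underlying form."""
    return interpretations.get(surface) == underlying


if __name__ == "__main__":
    underlying = "wh-question(see(you, x))"
    winner = first_optimisation(["who did you see", "you saw who"])
    ok = recoverable(winner, underlying,
                     {"who did you see": "wh-question(see(you, x))"})
    print(winner, "| recoverable:", ok)
```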

    Connectionist Inference Models

    The performance of symbolic inference tasks has long been a challenge to connectionists. In this paper, we present an extended survey of this area. Existing connectionist inference systems are reviewed, with particular reference to how they perform variable binding and rule-based reasoning, and whether they involve distributed or localist representations. The benefits and disadvantages of the different representations and systems are outlined, and conclusions are drawn regarding the capabilities of connectionist inference systems when compared with symbolic inference systems or when used for cognitive modeling.
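
    As one concrete illustration of the variable-binding problem discussed in such surveys, the sketch below shows tensor-product binding, a classic distributed binding scheme; the roles, fillers and dimensionality are arbitrary choices made for the example.

```python
# Sketch of tensor-product variable binding (filler x role outer products).
# Roles are chosen orthonormal, so a filler can be recovered exactly by
# contracting the binding matrix with its role vector.
import numpy as np

rng = np.random.default_rng(0)
dim = 32


def random_unit(d: int) -> np.ndarray:
    v = rng.normal(size=d)
    return v / np.linalg.norm(v)


role_agent, role_patient = np.eye(dim)[0], np.eye(dim)[1]  # orthonormal roles
john, mary = random_unit(dim), random_unit(dim)            # distributed fillers

# Bind: the sum of outer products encodes loves(john, mary) as one matrix.
binding = np.outer(john, role_agent) + np.outer(mary, role_patient)

# Unbind: contracting with a role vector retrieves the filler bound to it.
recovered_agent = binding @ role_agent
print("agent is john:", np.allclose(recovered_agent, john, atol=1e-6))
```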

    Specifying Logic Programs in Controlled Natural Language

    Writing specifications for computer programs is not easy, since one has to take into account the disparate conceptual worlds of the application domain and of software development. To bridge this conceptual gap we propose controlled natural language as a declarative and application-specific specification language. Controlled natural language is a subset of natural language that can be accurately and efficiently processed by a computer, but is expressive enough to allow natural usage by non-specialists. Specifications in controlled natural language are automatically translated into Prolog clauses, and hence become formal and executable. The translation uses a definite clause grammar (DCG) enhanced by feature structures. Inter-text references in the specification, e.g. anaphora, are resolved with the help of discourse representation theory (DRT). The generated Prolog clauses are added to a knowledge base. We have implemented a prototypical specification system that successfully processes the specification of a simple automated teller machine. Comment: 16 pages, compressed, uuencoded Postscript; published in Proceedings of CLNLP 95, COMPULOGNET/ELSNET/EAGLES Workshop on Computational Logic for Natural Language Processing, Edinburgh, April 3-5, 1995.
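
    The sketch below is not the paper's DCG-based pipeline; it is a toy Python stand-in showing the general idea of mapping one restricted sentence pattern of a controlled language onto a Prolog-style clause string. The sentence pattern and predicate names are invented for the example.

```python
# Toy illustration of controlled-language translation: one fixed sentence
# pattern ("Every <noun> that <verb>s a <noun> <verb>s.") is mapped onto a
# Prolog-style clause string. Real systems cover a much richer fragment.
import re

PATTERN = re.compile(
    r"^Every (?P<sub>\w+) that (?P<cond>\w+)s a (?P<obj>\w+) (?P<verb>\w+)s\.$"
)


def to_clause(sentence: str) -> str:
    m = PATTERN.match(sentence)
    if not m:
        raise ValueError(f"not in the controlled fragment: {sentence!r}")
    sub, cond, obj, verb = m.group("sub", "cond", "obj", "verb")
    # "Every customer that owns a card enters." ->
    # enters(X) :- customer(X), owns(X, Y), card(Y).
    return f"{verb}s(X) :- {sub}(X), {cond}s(X, Y), {obj}(Y)."


if __name__ == "__main__":
    print(to_clause("Every customer that owns a card enters."))
```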

    The Galileo PPS expert monitoring and diagnostic prototype

    The Galileo PPS Expert Monitoring Module (EMM) is a prototype system, implemented on a SUN workstation, that will demonstrate a knowledge-based approach to monitoring and diagnosis for the Galileo spacecraft Power/Pyro Subsystem. The prototype will simulate an analysis module functioning within the SFOC Engineering Analysis Subsystem Environment (EASE). This document describes the implementation of a prototype EMM for the Galileo spacecraft Power/Pyro Subsystem. Section 2 provides an overview of the issues in monitoring and diagnosis and a comparison between traditional and knowledge-based solutions to this problem. Section 3 describes various tradeoffs that must be considered when designing a knowledge-based approach to monitoring and diagnosis, and Section 4 discusses how these issues were resolved in constructing the prototype. Section 5 presents conclusions and recommendations for constructing a full-scale demonstration of the EMM. A glossary provides definitions of terms used in this text.
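
    A hypothetical sketch of the kind of knowledge-based monitoring rule such a module might encode: telemetry readings are checked against expert-supplied conditions, and matching rules emit diagnoses. The channel names, thresholds and rules are invented for illustration.

```python
# Hypothetical rule-based monitoring sketch: each rule pairs a condition over
# telemetry readings with a diagnostic message; diagnose() returns every
# diagnosis whose condition fires on the current sample.
from dataclasses import dataclass
from typing import Callable, Dict, List

Telemetry = Dict[str, float]


@dataclass
class Rule:
    name: str
    condition: Callable[[Telemetry], bool]
    diagnosis: str


RULES: List[Rule] = [
    Rule("bus_undervoltage",
         lambda t: t["bus_voltage"] < 28.0,
         "Possible load fault or degraded power source on the main bus."),
    Rule("pyro_arm_inconsistent",
         lambda t: t["pyro_armed"] == 1.0 and t["pyro_bus_current"] == 0.0,
         "Pyro bus reports armed but draws no current; check relay state."),
]


def diagnose(sample: Telemetry) -> List[str]:
    return [f"{r.name}: {r.diagnosis}" for r in RULES if r.condition(sample)]


if __name__ == "__main__":
    sample = {"bus_voltage": 27.2, "pyro_armed": 1.0, "pyro_bus_current": 0.0}
    print(diagnose(sample))
```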

    Adversarial Sets for Regularising Neural Link Predictors

    In adversarial training, a set of models learn together by pursuing competing goals, usually defined on single data instances. However, in relational learning and other non-i.i.d. domains, goals can also be defined over sets of instances. For example, a link predictor for the is-a relation needs to be consistent with the transitivity property: if is-a(x_1, x_2) and is-a(x_2, x_3) hold, then is-a(x_1, x_3) needs to hold as well. Here we use such assumptions to derive an inconsistency loss, measuring the degree to which the model violates the assumptions on an adversarially generated set of examples. The training objective is defined as a minimax problem, where an adversary finds the most offending adversarial examples by maximising the inconsistency loss, and the model is trained by jointly minimising a supervised loss and the inconsistency loss on the adversarial examples. This yields the first method that can use function-free Horn clauses (as in Datalog) to regularise any neural link predictor, with complexity independent of the domain size. We show that, for several link prediction models, the optimisation problem faced by the adversary has efficient closed-form solutions. Experiments on link prediction benchmarks indicate that, given suitable prior knowledge, our method can significantly improve neural link predictors on all relevant metrics. Comment: Proceedings of the 33rd Conference on Uncertainty in Artificial Intelligence (UAI), 2017.
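
    A rough sketch of the central idea, using a DistMult-style scoring function, a hinge-shaped inconsistency loss for the transitivity clause, and a crude random-search adversary; the paper's actual loss formulation and closed-form adversary differ.

```python
# Hedged NumPy sketch: an inconsistency loss measuring how much a scoring model
# violates transitivity of is-a on adversarially chosen entity representations.
# Scoring function, loss shape and adversary are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)
dim = 16
rel_is_a = rng.normal(size=dim)  # relation embedding (DistMult-style scoring)


def score(x: np.ndarray, y: np.ndarray) -> float:
    # Higher score = the model believes is-a(x, y) holds.
    return float(np.sum(x * rel_is_a * y))


def inconsistency(x1, x2, x3) -> float:
    # Violation of: is-a(x1, x2) and is-a(x2, x3) implies is-a(x1, x3).
    body = min(score(x1, x2), score(x2, x3))
    head = score(x1, x3)
    return max(0.0, body - head)


def adversary(num_samples: int = 2000):
    # Crude adversary: random search over unit-norm entity vectors for the
    # triple that most violates the clause (the paper derives better solutions).
    best, best_loss = None, -1.0
    for _ in range(num_samples):
        x1, x2, x3 = (v / np.linalg.norm(v) for v in rng.normal(size=(3, dim)))
        loss = inconsistency(x1, x2, x3)
        if loss > best_loss:
            best, best_loss = (x1, x2, x3), loss
    return best, best_loss


if __name__ == "__main__":
    _, worst = adversary()
    print(f"max inconsistency found: {worst:.3f}")  # term added to training loss
```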

    A framework for integrating syntax, semantics and pragmatics for computer-aided professional practice: With application of costing in construction industry

    Producing a bill of quantities is a knowledge-based, dynamic and collaborative process that evolves with variations and current evidence. However, within the context of information-system practice in BIM, knowledge of cost estimation has not been represented, nor has it been integrated into BIM-based processes. This paper sets out an innovative means of taking data from the BIM model linked to a project and using it to create the items of a bill of quantities, enabling cost estimation to be undertaken for the project. Our framework is founded on the view that three components are necessary to gain a full understanding of the domain being computerised: the information type to be assessed for compatibility (syntax), the definition of the pricing domain (semantics), and the precise implementation environment for the standards being taken into account (pragmatics). To achieve this, a prototype is created that allows cost items for the bill of quantities to be generated automatically, by means of a semantic web ontology and a forward-chaining algorithm. Within this paper, ‘cost items’ signify the elements included in a bill of quantities, including details of their description, quantity and price. As a means of validating the process, the authors applied it to the production of cost items, and the items created were compared with those produced by specialists. This innovative framework thus opens up a new way of applying semantic web ontologies and forward-chaining algorithms to construction professional practice, resulting in automatic cost estimation. The key outcome is that decoupling professional practice into the three components of syntax, semantics and pragmatics can provide tangible benefits to users in the domain.
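
    A minimal sketch of forward chaining over facts that might be extracted from a BIM model, assuming hypothetical fact formats, a single hard-coded rule and an invented unit rate; the actual framework works over a semantic web ontology.

```python
# Illustrative forward-chaining sketch: fact formats, the single rule and the
# unit rate are hypothetical. The rule fires repeatedly over the fact base
# until no new facts (here, cost items) can be derived.
from typing import Set, Tuple

Fact = Tuple[str, ...]

facts: Set[Fact] = {
    ("element", "wall_01", "concrete_wall"),
    ("quantity_m2", "wall_01", "42.0"),
}


def apply_rules(facts: Set[Fact]) -> Set[Fact]:
    """One hard-coded rule: concrete walls yield a priced cost item per m2."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        elements = [f for f in derived if f[0] == "element"]
        quantities = [f for f in derived if f[0] == "quantity_m2"]
        for (_, elem, kind) in elements:
            for (_, elem2, qty) in quantities:
                if elem == elem2 and kind == "concrete_wall":
                    rate = 85.0  # hypothetical unit rate per m2
                    item: Fact = ("cost_item", elem,
                                  "Formwork and pour, concrete wall",
                                  qty, f"{float(qty) * rate:.2f}")
                    if item not in derived:
                        derived.add(item)
                        changed = True
    return derived


if __name__ == "__main__":
    for fact in sorted(apply_rules(facts)):
        if fact[0] == "cost_item":
            print(fact)
```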