448 research outputs found

    Integrated plan generation and recognition : a logic-based approach

    The work presented in this paper is situated within the field of intelligent help systems, which aim to support users of application systems with the quality of assistance a qualified expert would provide. In order to provide such qualified support, our approach is based on the integration of plan generation and plan recognition components. Plan recognition in this context serves to identify the user's goals and so forms the basis for active user support. The planning component dynamically generates plans, which are proposed to the user as ways to reach her goal. We introduce a logic-based approach in which plan generation and plan recognition are performed on a common logical basis and both components work in a kind of cross-talk.
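    The cross-talk between the two components can be illustrated with a minimal sketch. The plan library, goal names, and actions below are invented for illustration; the paper's actual logic-based machinery is far richer.

    ```python
    # Minimal sketch of plan recognition feeding plan generation.
    # The plan library and action names are illustrative assumptions.

    PLAN_LIBRARY = {
        "print_file": ["open_file", "format_file", "send_to_printer"],
        "mail_file": ["open_file", "attach_file", "send_mail"],
    }

    def recognize(observed):
        """Return goals whose plans start with the observed action sequence."""
        return [g for g, steps in PLAN_LIBRARY.items()
                if steps[:len(observed)] == observed]

    def generate(goal, observed):
        """Propose the remaining steps of a recognized plan."""
        return PLAN_LIBRARY[goal][len(observed):]

    goals = recognize(["open_file", "attach_file"])   # ["mail_file"]
    rest = generate(goals[0], ["open_file", "attach_file"])  # ["send_mail"]
    ```

    Recognition narrows the candidate goals as actions are observed; generation then proposes the remaining steps for the single matching goal.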

    Design of a pattern description language for information extraction in document analysis

    Document analysis is concerned with extracting relevant information from documents available on paper. Various techniques can be applied to find the desired information in a text, ranging from simple search procedures to attempts at fully parsing a text. These techniques frequently originate in natural language processing (NLP), where they are used for processing electronic texts. Regardless of the technique employed, however, one always needs a language in which the syntax and semantics of the sought information can be described. This document presents such a language, which is tailored in particular to the requirements of document analysis but can also be used for processing electronic texts. The language is currently used for information extraction from, and classification of, German business letters.
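    The core idea of such a pattern language, pairing a syntactic pattern with a semantic action, can be sketched as follows. The pattern, its regex syntax, and the date example are hypothetical; the actual language described in the document is more expressive.

    ```python
    import re

    # Hypothetical pattern in the spirit of a pattern description language:
    # "syntax" describes the surface form, "semantics" normalizes the match.
    DATE_PATTERN = {
        "syntax": re.compile(r"(\d{1,2})\.(\d{1,2})\.(\d{4})"),  # German date form
        "semantics": lambda m: {"day": int(m.group(1)),
                                "month": int(m.group(2)),
                                "year": int(m.group(3))},
    }

    def extract(pattern, text):
        """Apply a pattern's semantics to every syntactic match in the text."""
        return [pattern["semantics"](m) for m in pattern["syntax"].finditer(text)]

    extract(DATE_PATTERN, "Ihr Schreiben vom 3.11.1994")
    # → [{'day': 3, 'month': 11, 'year': 1994}]
    ```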

    Uncertainty-valued Horn clauses

    There are many forms of uncertainty, each of which usually has more than one theoretical model. Therefore, a very flexible kind of uncertainty-valued Horn clause is introduced into RELFUN in section 1. These clauses have a head, several premises, and an uncertainty factor, which represents the uncertainty of the clause. The premises are all "functional" in the sense that each returns an uncertainty value. When uncertainty clauses are translated into footed clauses (non-ground, non-deterministic functions in RELFUN, which can then be compiled as usual), these premises and the uncertainty factor of an uncertainty rule are embedded into the arguments of a combination function. The combination function can be modified by the user: it may be a built-in or a user-defined function, either of which may be computed as the value of a higher-order function. Section 2 describes an application of uncertainty clauses to the uncertain concept of a "pet holder" according to German law. This and another example are then fully demonstrated in appendix A. Finally, appendix B gives a listing of the complete uncertainty translator in LISP.
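    The evaluation scheme can be sketched in a few lines. The clause syntax, the min-based combination function, and the rule base below are illustrative assumptions, not RELFUN's actual notation; the point is only that the combination function is pluggable.

    ```python
    # Sketch of uncertainty-valued Horn clause evaluation: each clause carries
    # an uncertainty factor, and premise values are folded together by a
    # user-modifiable combination function (here: min, purely as an example).

    def comb_min(factor, premise_values):
        """Default combination: the clause factor bounds the weakest premise."""
        return min([factor] + premise_values)

    RULES = {
        # head: (uncertainty factor, premises) -- invented example rule base
        "pet_holder": (0.9, ["keeps_animal", "animal_is_pet"]),
        "keeps_animal": (0.8, []),
        "animal_is_pet": (0.7, []),
    }

    def prove(goal, comb=comb_min):
        """Return the uncertainty value of a goal under the given combination."""
        factor, premises = RULES[goal]
        return comb(factor, [prove(p, comb) for p in premises])

    prove("pet_holder")  # min(0.9, 0.8, 0.7) == 0.7
    ```

    Passing a different `comb` (e.g. a product-based one) changes the uncertainty calculus without touching the rules, mirroring the user-modifiable combination function described above.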

    A cognitive analysis of event structure

    Events occupy a central place in natural language. Accordingly, an understanding of them is crucial for any theoretically well-motivated account of natural language understanding and generation. It is proposed here that speakers create a cognitive structure for each discourse and process it as they introduce sentences into the discourse. The structure for each sentence depends systematically on its tense, aspect, and situation type; its effect on the discourse also depends on the structures of the sentences that precede it. It is also argued that the perfective aspect introduces the structure of the given event in its entirety. The progressive, by contrast, introduces only the core of the structure of the given event, excluding, in particular, its preparatory processes and resultant state. Similarly, the perfect and the perfective can be distinguished on the basis of the temporal schemata they introduce. While the perfective presents the event as complete, the perfect presents it as complete and closed; i.e., the perfect prevents succeeding discourse from being interpreted as falling during the given event. This is surprising, since the perfect is otherwise simply the combination of the perfective and a tense. This paper also provides a key motivation for distinguishing between the preparatory processes and the preliminary stages of an event. This observation, which is crucial in distinguishing between the perfective and the progressive, has not previously been made in the literature.

    Linking flat predicate argument structures

    This report presents an approach to enriching flat and robust predicate argument structures with more fine-grained semantic information, extracted from underspecified semantic representations encoded in Minimal Recursion Semantics (MRS). Such representations are provided by a hand-built HPSG grammar with wide linguistic coverage. A specific semantic representation, called linked predicate argument structure (LPAS), has been worked out, which describes the explicit embedding relationships among predicate argument structures. LPASs can be used as a generic interface language for integrating semantic representations of different granularities. Some initial experiments have been conducted to convert MRS expressions into LPASs. A simple constraint solver has been developed to resolve the underspecified dominance relations between the predicates and their arguments in MRS expressions. LPASs are useful for high-precision information extraction and question answering tasks because of their fine-grained semantic structures. In addition, I have attempted to extend the lexicon of the HPSG English Resource Grammar (ERG) by exploiting WordNet, and to disambiguate the readings produced by HPSG parsing with the help of a probabilistic parser, in order to process texts from application domains. Following the presented approach, the ERG can be used to annotate a standard treebank, e.g., the Penn Treebank, with its fine-grained semantics. In this vein, I point out opportunities for a fruitful cooperation between the HPSG-annotated Redwoods Treebank and the Penn PropBank. In my current work, I exploit HPSG as an additional knowledge resource for the automatic learning of LPASs from dependency structures.
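    The kind of dominance resolution such a constraint solver performs can be sketched by brute force: enumerate total scope orders consistent with a partial dominance relation. The predicate names and constraints below are invented toy stand-ins, not actual MRS.

    ```python
    from itertools import permutations

    # Toy resolution of underspecified dominance constraints: find every total
    # order of scope-bearing predicates consistent with the partial dominance
    # relation. Predicates and constraints are illustrative assumptions.

    preds = ["every_x", "a_y", "see"]
    dominates = [("every_x", "see"), ("a_y", "see")]  # both quantifiers outscope the verb

    def readings(preds, dominates):
        """Enumerate scope orders satisfying all dominance constraints."""
        return [order for order in permutations(preds)
                if all(order.index(a) < order.index(b) for a, b in dominates)]

    readings(preds, dominates)
    # two readings: every_x over a_y over see, and a_y over every_x over see
    ```

    A real solver would prune rather than enumerate, but the output is the same: the underspecified representation stands for exactly these fully resolved readings.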

    Natural language semantics and compiler technology

    This paper recommends an approach to the implementation of semantic representation languages (SRLs) which exploits a parallelism between SRLs and programming languages (PLs). The design requirements of SRLs for natural language are similar in their goals to those of PLs. First, in both cases we seek modules in which both the surface representation (print form) and the underlying data structures are important. This requirement highlights the need for general tools allowing the printing and reading of expressions (data structures). Second, these modules need to cooperate with foreign modules, so the importance of interface technology (compilation) is paramount. Third, both compilers and semantic modules need "inferential" facilities for transforming (simplifying) complex expressions in order to ease subsequent processing. But the most important parallel is the need in both fields for tools which are useful in combination with a variety of concrete languages -- general-purpose parsers, printers, simplifiers (transformation facilities), and compilers. This need arises in PL technology from (among other things) the demand for experimentation in language design, which is again parallel to the case of SRLs. Using a compiler-based approach, we have implemented NLL, a public-domain software package for computational natural language semantics. Several interfaces exist both for grammar modules and for applications, using a variety of interface technologies, including especially compilation. We review here a variety of NLL applications, focusing on COSMA, an NL interface to a distributed appointment manager.

    The myth of domain-independent persistence

    The frame problem can be reduced to the problem of inferring the non-existence of causes for change. This paper concerns how these non-existence inferences are made and shows that many popular approaches lack generality because they rely on a domain-independent assumption of occurrence omniscience. It also shows how to represent and use appropriate domain-dependent knowledge in three successively more expressive versions, in which the causal theories are deductive, non-monotonic, and statistical.

    Using hierarchical constraint satisfaction for lathe-tool selection in a CIM environment

    In this paper we discuss how to treat the automatic selection of appropriate lathe tools in a computer-aided production planning (CAPP) application as a constraint satisfaction problem (CSP) over hierarchically structured finite domains. Conceptually, it is straightforward to formulate lathe-tool selection as a CSP; however, the choice of constraint and domain representations, and of the order in which the constraints are applied, is nontrivial if a computationally tractable system design is to be achieved. Since the domains appearing in technical applications can often be modeled as hierarchies, we investigate how constraint satisfaction algorithms can exploit this hierarchical structure. Moreover, many real-life problems are formulated in such a way that no solution satisfying all the given constraints can be found. Therefore, in order to bring AI technology into real-world applications, it is very important to be able to cope with conflicting constraints and to relax the given CSP until a (suboptimal) solution can be found. For these reasons, the constraint system CONTAX has been developed; it incorporates an extended hierarchical arc-consistency algorithm together with discrete constraint relaxation and has been used to implement the lathe-tool selection module of the ARC-TEC planning system.
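    The relaxation idea can be sketched with prioritized constraints: when no assignment satisfies all of them, the lowest-priority constraint is dropped and the search repeats. The domains, constraints, and priorities below are invented for illustration and bear no relation to CONTAX's actual algorithms.

    ```python
    from itertools import product

    # Toy constraint relaxation over finite domains. Constraints carry
    # priorities; an unsatisfiable CSP is relaxed by dropping the
    # lowest-priority constraint until a (suboptimal) solution appears.

    domains = {"tool": ["roughing", "finishing"], "feed": ["low", "high"]}
    constraints = [  # (priority, predicate) -- illustrative assumptions
        (2, lambda a: a["tool"] == "finishing"),
        (1, lambda a: not (a["tool"] == "finishing" and a["feed"] == "high")),
    ]

    def solve(domains, constraints):
        """Search all assignments; relax the weakest constraint on failure."""
        cs = sorted(constraints, reverse=True)  # highest priority first
        while True:
            for values in product(*domains.values()):
                assignment = dict(zip(domains, values))
                if all(check(assignment) for _, check in cs):
                    return assignment
            cs.pop()  # relax: drop the current lowest-priority constraint

    solve(domains, constraints)
    # {'tool': 'finishing', 'feed': 'low'}
    ```

    A hierarchical arc-consistency algorithm would additionally prune whole subtrees of a domain hierarchy at once instead of enumerating leaf values as done here.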

    Extended logic-plus-functional programming

    Extensions of logic and functional programming are integrated in RELFUN. Its valued clauses comprise Horn clauses (true-valued) and clauses with a distinguished "foot" premise (returning arbitrary values). Both the logic and functional components permit LISP-like varying-arity and higher-order operators. The DATAFUN sublanguage of the functional component is shown to be preferable to relational encodings of functions in DATALOG. RELFUN permits non-ground, non-deterministic functions; hence certain functions can be inverted using an "is"-primitive generalizing that of PROLOG. For function nestings a strict call-by-value strategy is employed. The reduction of these extensions to a relational sublanguage is discussed and their WAM compilation is sketched. Three examples ("serialise", "wang", and "eval") demonstrate the relational/functional style in use. The list expressions of RELFUN's LISP implementation are presented in an extended PROLOG-like syntax.
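    Two of the ideas above, footed clauses and function inversion via a generalized is-primitive, can be sketched in Python (not RELFUN syntax; the helper names and the enumeration-based inversion are illustrative assumptions).

    ```python
    # Illustrative sketch: a "footed" clause succeeds like a Horn clause but
    # returns the value of a distinguished foot premise, and a function can be
    # run "backwards" over a finite domain, mimicking a generalized is-primitive.

    def footed(premises, foot):
        """Build a footed clause: check all premises, then return the foot's
        value (a plain Horn clause would just return True)."""
        def clause(*args):
            if all(p(*args) for p in premises):
                return foot(*args)
            return None  # failure
        return clause

    def invert(f, value, domain):
        """Solve f(x) == value for x by enumeration over a finite domain."""
        return [x for x in domain if f(x) == value]

    square = footed([lambda x: x >= 0], lambda x: x * x)
    square(3)                       # 9
    invert(square, 9, range(10))    # [3]
    ```

    Real RELFUN inverts functions by resolution over non-ground terms rather than by enumerating a domain; the sketch only conveys the interface.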

    Plan generation using a method of deductive program synthesis

    In this paper we introduce a planning approach based on a method of deductive program synthesis. The program synthesis system we rely on takes first-order specifications and automatically derives recursive programs from them. It uses a set of transformation rules whose application is guided by an overall strategy; additionally, several heuristics considerably reduce the search space. We show, by means of an example taken from the blocks world, how even recursive plans can be obtained with this method. Some modifications of the synthesis strategy and heuristics are discussed which are necessary to obtain a powerful and automatic planning system. Finally, it is shown how subplans can be introduced and generated separately.
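    A recursive plan of the kind such a synthesis method derives can be illustrated directly. The tower representation and action name below are invented; the paper derives such recursions from first-order specifications rather than writing them by hand.

    ```python
    # Illustrative recursive blocks-world plan: clear a tower by moving each
    # block to the table, top-down. The recursion mirrors the shape of plans
    # a deductive synthesis system can derive from a specification.

    def unstack_all(tower):
        """Return the action sequence that moves every block of a tower
        (listed top to bottom) onto the table."""
        if not tower:
            return []
        top, rest = tower[0], tower[1:]
        return [("move_to_table", top)] + unstack_all(rest)

    unstack_all(["a", "b", "c"])
    # [('move_to_table', 'a'), ('move_to_table', 'b'), ('move_to_table', 'c')]
    ```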