
    Twin Networks: Matching the Future for Sequence Generation

    We propose a simple technique for encouraging generative RNNs to plan ahead. We train a "backward" recurrent network to generate a given sequence in reverse order, and we encourage states of the forward model to predict cotemporal states of the backward model. The backward network is used only during training, and plays no role during sampling or inference. We hypothesize that our approach eases modeling of long-term dependencies by implicitly forcing the forward states to hold information about the longer-term future (as contained in the backward states). We show empirically that our approach yields a 9% relative improvement on a speech recognition task and a significant improvement on a COCO caption generation task.
    Comment: 12 pages, 3 figures, published at ICLR 2018
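
    To make the matching idea concrete, here is a minimal PyTorch sketch of a twin-network training objective, under assumptions of my own: GRU encoders, an affine map from forward to backward states, an L2 matching penalty with gradients stopped on the backward states, and illustrative names (TwinRNN, twin_weight) that do not come from the paper's code.

        import torch
        import torch.nn as nn

        class TwinRNN(nn.Module):
            def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, emb_dim)
                self.fwd = nn.GRU(emb_dim, hidden_dim, batch_first=True)
                self.bwd = nn.GRU(emb_dim, hidden_dim, batch_first=True)
                # Affine map predicting backward states from forward states.
                self.match = nn.Linear(hidden_dim, hidden_dim)
                self.out_fwd = nn.Linear(hidden_dim, vocab_size)
                self.out_bwd = nn.Linear(hidden_dim, vocab_size)

            def forward(self, tokens):
                # tokens: (batch, seq_len) integer ids.
                x = self.embed(tokens)
                h_fwd, _ = self.fwd(x)                       # read left-to-right
                h_bwd_rev, _ = self.bwd(torch.flip(x, [1]))  # read right-to-left
                h_bwd = torch.flip(h_bwd_rev, [1])           # re-align so step t matches step t
                return h_fwd, h_bwd

        def twin_loss(model, tokens, twin_weight=1.0):
            h_fwd, h_bwd = model(tokens)
            ce = nn.CrossEntropyLoss()
            # Next-token losses for both directions (teacher forcing); the
            # backward model predicts the sequence in reverse order.
            logits_f = model.out_fwd(h_fwd[:, :-1])
            logits_b = model.out_bwd(h_bwd[:, 1:])
            nll_f = ce(logits_f.reshape(-1, logits_f.size(-1)), tokens[:, 1:].reshape(-1))
            nll_b = ce(logits_b.reshape(-1, logits_b.size(-1)), tokens[:, :-1].reshape(-1))
            # Matching penalty: the forward state should predict the cotemporal
            # backward state. Detaching the backward states (an assumption of
            # this sketch) trains the backward net only on its own likelihood.
            penalty = ((model.match(h_fwd) - h_bwd.detach()) ** 2).mean()
            return nll_f + nll_b + twin_weight * penalty

    At sampling time only the forward network and out_fwd would be used, consistent with the abstract's note that the backward network plays no role during inference.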

    Abstract Canonical Inference

    An abstract framework of canonical inference is used to explore how different proof orderings induce different variants of saturation and completeness. Notions like completion, paramodulation, saturation, redundancy elimination, and rewrite-system reduction are connected to proof orderings. Fairness of deductive mechanisms is defined in terms of proof orderings, distinguishing between (ordinary) "fairness," which yields completeness, and "uniform fairness," which yields saturation.
    Comment: 28 pages, no figures, to appear in ACM Trans. on Computational Logic
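
    As a rough illustration of the loop these notions parameterize, here is a Python sketch of a generic given-clause saturation procedure; infer and is_redundant are placeholder callables, and FIFO selection stands in for a proof-ordering-based fairness condition. Everything here is an assumption of this sketch, not the paper's formulation.

        from collections import deque

        def saturate(axioms, infer, is_redundant, max_steps=10_000):
            """Generic given-clause loop: process items, keep non-redundant
            consequences, stop when no unprocessed items remain."""
            processed = set()
            queue = deque(axioms)  # FIFO selection: nothing waits forever
            for _ in range(max_steps):
                if not queue:
                    return processed  # saturated: closed under non-redundant inference
                given = queue.popleft()
                if is_redundant(given, processed):
                    continue  # redundancy elimination
                for new in infer(given, processed):  # inferences with processed items
                    if not is_redundant(new, processed):
                        queue.append(new)
                processed.add(given)
            raise RuntimeError("step limit reached before saturation")

    Because every queued item is eventually selected, the loop satisfies the ordinary fairness the abstract associates with completeness; uniform fairness would constrain the selection further.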

    CLIPS as a knowledge based language

    CLIPS is a language for writing expert system applications on a personal or small computer. Here, the CLIPS programming language is described and compared to three other artificial intelligence (AI) languages (LISP, Prolog, and OPS5) with regard to the support each provides for implementing a knowledge-based system (KBS). A discussion is given of how CLIPS would be used in a control system.
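
    To suggest what a production-rule language like CLIPS provides natively, here is a toy forward-chaining match-act loop in Python; the tuple facts, rule encoding, and the control-system example are illustrative assumptions, not CLIPS syntax.

        def forward_chain(facts, rules, max_cycles=100):
            """Fire every rule whose condition matches until no rule adds a fact."""
            facts = set(facts)
            for _ in range(max_cycles):
                new_facts = set()
                for condition, action in rules:
                    if condition(facts):
                        new_facts |= action(facts) - facts
                if not new_facts:
                    return facts  # quiescent: nothing left to infer
                facts |= new_facts
            return facts

        # A control-flavored example, echoing the abstract: if the sensor
        # reads "hot", assert that the fan should be switched on.
        rules = [
            (lambda fs: ("sensor", "hot") in fs,
             lambda fs: {("fan", "on")}),
        ]
        print(forward_chain({("sensor", "hot")}, rules))
        # -> {('sensor', 'hot'), ('fan', 'on')}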