    Lexical attraction for text compression

    New methods of acquiring structural information in text documents may support better compression by identifying an appropriate prediction context for each symbol. The method of “lexical attraction” infers syntactic dependency structures from statistical analysis of large corpora. We describe the generation of a lexical attraction model, discuss its application to text compression, and explore its potential to outperform fixed-context models such as word-level PPM. Perhaps the most exciting aspect of this work is the prospect of using compression as a metric for structure discovery in text.
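    As a rough illustration of the underlying idea, the sketch below scores word pairs by pointwise mutual information (PMI), one common way to quantify lexical attraction between words. The corpus, function name, and scoring details are illustrative assumptions, not the paper's actual model:

```python
import math
from collections import Counter
from itertools import combinations

def lexical_attraction_scores(sentences):
    """Score word pairs by pointwise mutual information (PMI):
    log2( P(a, b) / (P(a) * P(b)) ), a common proxy for how strongly
    two words "attract" each other in a corpus."""
    word_counts = Counter()
    pair_counts = Counter()
    total_words = 0
    total_pairs = 0
    for sentence in sentences:
        words = sentence.lower().split()
        word_counts.update(words)
        total_words += len(words)
        # Count each co-occurrence of two words within the same sentence.
        for a, b in combinations(words, 2):
            pair_counts[(a, b)] += 1
            total_pairs += 1
    scores = {}
    for (a, b), n_ab in pair_counts.items():
        p_ab = n_ab / total_pairs
        p_a = word_counts[a] / total_words
        p_b = word_counts[b] / total_words
        scores[(a, b)] = math.log2(p_ab / (p_a * p_b))
    return scores

# Tiny hypothetical corpus; real models require large corpora and smoothing.
corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
    "the dog barked at the mouse",
]
ranked = sorted(lexical_attraction_scores(corpus).items(), key=lambda kv: -kv[1])
for pair, pmi in ranked[:5]:
    print(pair, round(pmi, 2))
```

    In a compression setting, such scores could be used to select, for each word, its highest-attraction neighbour as the prediction context, instead of the fixed-length window a model like word-level PPM uses.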

    Principles of Synthetic Intelligence

    Abstract. Understanding why the original project of Artificial Intelligence is widely regarded as a failure and has been abandoned even by much of contemporary AI research itself may prove crucial to achieving synthetic intelligence. Here, we take a brief look at some principles that we might consider to be lessons from the past five decades of AI. The author's own AI architecture, MicroPsi, attempts to contribute to that discussion.

    The MicroPsi Agent Architecture

    The MicroPsi agent architecture describes the interaction of emotion, motivation, and cognition in situated agents, based on the Psi theory of Dietrich Dörner. This theory touches on a number of questions, particularly about perception, representation, and bounded rationality, in very interesting ways, but, being formulated within psychology, it has had relatively little impact on the discussion of agents within computer science. MicroPsi is an attempt to address this by formulating the original theory in a more abstract and formal way, while enhancing it with additional concepts for building ontological categories and for attention. This paper gives an introduction to MicroPsi's components and representations.
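    MicroPsi represents agent knowledge as node nets in which activation spreads along weighted links. The following minimal sketch illustrates that general style of representation; the node names, decay parameter, and update rule are illustrative assumptions, not MicroPsi's actual formulation:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A unit in a toy node net; activation spreads along weighted links."""
    name: str
    activation: float = 0.0
    links: dict = field(default_factory=dict)  # target node name -> link weight

def spread(nodes, steps=3, decay=0.5):
    """Synchronous spreading activation: each step, every node passes
    weighted activation to its neighbours while its own level decays.
    The update rule is illustrative, not MicroPsi's actual equations."""
    for _ in range(steps):
        incoming = {name: 0.0 for name in nodes}
        for node in nodes.values():
            for target, weight in node.links.items():
                incoming[target] += node.activation * weight
        for name, node in nodes.items():
            node.activation = min(1.0, node.activation * decay + incoming[name])

# A hypothetical motive-to-action chain: an urge activates a behaviour.
nodes = {
    "hunger": Node("hunger", activation=1.0, links={"seek_food": 0.9}),
    "seek_food": Node("seek_food", links={"eat": 0.8}),
    "eat": Node("eat"),
}
spread(nodes)
for node in nodes.values():
    print(node.name, round(node.activation, 3))
```

    Running the loop shows activation flowing from the motivational node toward the action nodes, a crude stand-in for the motivation-modulated processing the architecture formalizes in far richer detail.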