
    A new way of linking information theory with cognitive science

    The relationship between the notion of *information* in information theory and the notion of *information processing* in cognitive science has long been controversial. But as the present paper shows, part of the disagreement arises from conflating different formulations of measurement. Clarifying these distinctions reveals that it is the context-free nature of Shannon's information average that is particularly problematic from the cognitive point of view. Context-sensitive evaluation is then shown to be a way of addressing the problems that arise.
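    The "context-free" character of Shannon's information average can be made concrete with the standard entropy calculation; the distributions below are arbitrary examples, and the point is only that the average depends on the probabilities alone, not on what the outcomes mean in any context.

```python
import math

def shannon_entropy(probs):
    """Shannon's information average H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The average is computed from probabilities alone: any two sources with
# the same distribution get the same H, whatever their outcomes signify.
uniform = [0.25, 0.25, 0.25, 0.25]   # H = 2.0 bits
skewed  = [0.5, 0.25, 0.125, 0.125]  # H = 1.75 bits
```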

    A natural cure for the pet fish problem: feature emergence as classificatory composition

    Where do emergent features come from? This has long been an intriguing puzzle. The concept of pet fish illustrates the difficulty. Most people expect pet fish to live in bowls, even though this is not something either pets or fish normally do. The inference that pet fish have the feature of living in bowls cannot be explained purely in terms of the constituents themselves. The feature seems to emerge. The present paper aims to explain this effect using notions of classificatory composition. Adjoined concept references are taken to construct classifications rather than combinations; a pet fish is taken to be a fish classified as a pet rather than a combination of a pet and a fish. It is also shown that, where concepts have a compositional representation, feature emergence can be accounted for in terms of compositional accommodation.
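    As a purely illustrative sketch (the data model below is invented for this listing, not the paper's formalism), classification-as-composition can be contrasted with naive feature intersection: treating "pet fish" as a fish classified as a pet lets a hypothetical accommodation rule attach a feature that neither constituent carries on its own.

```python
# Default features of the constituent concepts (illustrative values only).
CONCEPTS = {
    "pet":  {"kept_at_home", "cared_for"},
    "fish": {"has_fins", "lives_in_water"},
}

# Hypothetical accommodation rules: features triggered by classifying one
# concept under another, which neither constituent has by itself.
ACCOMMODATION = {
    ("pet", "fish"): {"lives_in_bowl"},
}

def combine_by_intersection(head, modifier):
    """Naive combination: only features shared by both concepts survive."""
    return CONCEPTS[head] & CONCEPTS[modifier]

def classify(head, modifier):
    """A <modifier> <head> is a <head> classified as a <modifier>: keep the
    head's features and let accommodation add emergent ones."""
    return CONCEPTS[head] | ACCOMMODATION.get((modifier, head), set())

pet_fish = classify("fish", "pet")  # includes "lives_in_bowl" with the fish features
```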

    Quantitative abstraction theory

    A quantitative theory of abstraction is presented. Its central feature is a growth formula defining the number of abstractions that may be formed by an individual agent in a given context. Implications of the theory for artificial intelligence and cognitive psychology are explored. Its possible applications to the issue of implicit v. explicit learning are also discussed.

    Attainment Scotland Fund evaluation. School case studies


    Indirect sensing through abstractive learning

    The paper discusses disparity issues in sensing tasks involving the production of a 'high-level' signal from 'low-level' signal sources. It introduces an abstraction theory which helps to explain the nature of the problem and to point the way to a solution. It proposes a solution based on the use of supervised adaptive methods drawn from artificial intelligence. Finally, it describes a set of empirical experiments carried out to evaluate the efficacy of the method.
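    The general setup the abstract describes can be sketched minimally: a supervised learner fitted to recover a high-level signal from low-level readings. The threshold model and the training data below are invented for illustration and stand in for whatever adaptive method the paper actually evaluates.

```python
def fit_threshold(samples):
    """Supervised sketch: learn a scalar threshold that maps a low-level
    reading x to a high-level binary signal (x >= t), by minimising
    the number of misclassified training examples."""
    candidates = sorted(x for x, _ in samples)
    def errors(t):
        return sum((x >= t) != y for x, y in samples)
    return min(candidates, key=errors)

# Invented training data: (low-level reading, high-level event present?)
train = [(0.1, False), (0.2, False), (0.4, False), (0.7, True), (0.9, True)]
threshold = fit_threshold(train)
```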

    Evaluation of the Attainment Scotland Fund. Headteacher survey 2018


    Three ways to link merge with hierarchical concept-combination

    In the Minimalist Program, language competence is seen to stem from a fundamental ability to construct hierarchical structure, an operation dubbed `Merge'. This raises the problem of how to view hierarchical concept-combination. This is a conceptual operation which also builds hierarchical structure. We can conceive of a garden that consists of a lawn and a flower-bed, for example, or a salad consisting of lettuce, fennel and rocket, or a crew consisting of a pilot and engineer. In such cases, concepts are put together in a way that makes one the accommodating element with respect to the others taken in combination. The accommodating element becomes the root of a hierarchical unit. Since this unit is itself a concept, the operation is inherently recursive. Does this mean the mind has two independent systems of hierarchical construction? Or is some form of integration more likely? Following a detailed examination of the operations involved, this paper shows there are three main ways in which Merge might be linked to hierarchical concept-combination. Also examined are the architectural implications that arise in each case.
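    The recursive, root-forming character of hierarchical concept-combination described in the abstract can be sketched in a few lines; the `Concept` structure and `accommodate` helper are illustrative stand-ins, not the paper's formalism, and "lunch" is an invented example showing the recursion.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """Illustrative concept node: a name plus constituent sub-concepts."""
    name: str
    parts: list = field(default_factory=list)

def accommodate(root, *parts):
    """Hierarchical concept-combination: the accommodating element becomes
    the root of a new unit whose constituents are the combined concepts."""
    return Concept(root.name, list(parts))

# A garden consisting of a lawn and a flower-bed.
garden = accommodate(Concept("garden"), Concept("lawn"), Concept("flower-bed"))

# The combined unit is itself a concept, so the operation is recursive:
# a salad can in turn be a constituent of a larger unit.
salad = accommodate(Concept("salad"),
                    Concept("lettuce"), Concept("fennel"), Concept("rocket"))
lunch = accommodate(Concept("lunch"), salad, Concept("bread"))
```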

    Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms

    Many different machine learning algorithms exist; taking into account each algorithm's hyperparameters, there is a staggeringly large number of possible alternatives overall. We consider the problem of simultaneously selecting a learning algorithm and setting its hyperparameters, going beyond previous work that addresses these issues in isolation. We show that this problem can be addressed by a fully automated approach, leveraging recent innovations in Bayesian optimization. Specifically, we consider a wide range of feature selection techniques (combining 3 search and 8 evaluator methods) and all classification approaches implemented in WEKA, spanning 2 ensemble methods, 10 meta-methods, 27 base classifiers, and hyperparameter settings for each classifier. On each of 21 popular datasets from the UCI repository, the KDD Cup 09, variants of the MNIST dataset and CIFAR-10, we show classification performance often much better than using standard selection/hyperparameter optimization methods. We hope that our approach will help non-expert users to more effectively identify machine learning algorithms and hyperparameter settings appropriate to their applications, and hence to achieve improved performance.
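    The combined algorithm selection and hyperparameter optimization problem the abstract describes can be illustrated with a minimal sketch. The search space and scoring function below are hypothetical stand-ins, and plain random search replaces the Bayesian optimization Auto-WEKA actually uses; the point is only that the algorithm choice and its hyperparameters are optimized jointly over one space.

```python
import random

# Hypothetical joint search space: each algorithm brings its own hyperparameters.
SEARCH_SPACE = {
    "knn":  {"n_neighbors": [1, 3, 5, 9]},
    "tree": {"max_depth": [2, 4, 8], "min_split": [2, 5]},
    "svm":  {"C": [0.1, 1.0, 10.0]},
}

def sample_configuration(rng):
    """Draw one point from the joint (algorithm, hyperparameters) space."""
    algo = rng.choice(sorted(SEARCH_SPACE))
    params = {name: rng.choice(values)
              for name, values in SEARCH_SPACE[algo].items()}
    return algo, params

def toy_score(algo, params):
    """Stand-in for cross-validated accuracy; a real system would train
    and evaluate the chosen classifier here."""
    base = {"knn": 0.80, "tree": 0.75, "svm": 0.85}[algo]
    return base + 0.01 * len(params)

def cash_random_search(n_trials=50, seed=0):
    """Jointly optimize algorithm choice and hyperparameters by random search."""
    rng = random.Random(seed)
    return max((sample_configuration(rng) for _ in range(n_trials)),
               key=lambda cfg: toy_score(*cfg))

algo, params = cash_random_search()
```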