
    Cognitive processes in categorical and associative priming: a diffusion model analysis

    Cognitive processes and mechanisms underlying different forms of priming were investigated using a diffusion model approach. In a series of six experiments, effects of prime-target associations and of a semantic and affective categorical match of prime and target were analyzed for different tasks. Significant associative and categorical priming effects were found in standard analyses of response times (RTs) and error frequencies. Results of diffusion model analyses revealed that priming effects of associated primes were mapped onto the drift rate parameter (v), while priming effects of a categorical match on a task-relevant dimension were mapped onto the extradecisional parameters (t(0) and d). These results support a spreading activation account of associative priming and an explanation of categorical priming in terms of response competition. Implications for the interpretation of priming effects and the use of priming paradigms in cognitive psychology and social cognition are discussed.
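    The logic of the diffusion model mapping can be illustrated with a minimal random-walk simulation (a sketch for orientation, not the authors' fitting procedure; the function name and parameter defaults are our assumptions): evidence accumulates toward one of two boundaries at rate v, so a larger drift rate raises accuracy and shortens decision times, while the non-decision time t0 shifts all RTs uniformly without affecting accuracy.

    ```python
    import numpy as np

    def simulate_ddm(v, a=1.0, t0=0.3, dt=0.001, sigma=1.0, n_trials=500, seed=1):
        """Simulate drift-diffusion trials; return (accuracy, mean RT).

        Evidence starts at a/2 and accumulates with drift v plus Gaussian
        noise until it hits 0 (error) or a (correct response). RT is the
        decision time plus the non-decision component t0.
        """
        rng = np.random.default_rng(seed)
        rts, correct = [], []
        for _ in range(n_trials):
            x, t = a / 2, 0.0
            while 0 < x < a:
                x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                t += dt
            rts.append(t0 + t)
            correct.append(x >= a)
        return float(np.mean(correct)), float(np.mean(rts))
    ```

    Under this picture, an associative prime that raises v improves both speed and accuracy, whereas a categorical-match effect located in t0 or d shifts RTs without changing the quality of evidence accumulation.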

    Parameter-dependent associative Yang-Baxter equations and Poisson brackets

    We discuss associative analogues of the classical Yang-Baxter equation that depend meromorphically on parameters. We discover that such equations enter into the description of a general class of parameter-dependent Poisson structures and of double Lie and Poisson structures in the sense of M. Van den Bergh. We propose a classification of all solutions of the one-dimensional associative Yang-Baxter equation.
    Comment: 18 pages, LATEX2, ws-ijgmmp style. Few typos corrected, acknowledgements added
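    For orientation, the constant (parameter-free) form of the associative Yang-Baxter equation, as introduced by Aguiar, for an element $r \in A \otimes A$ of an associative algebra $A$ reads

    ```latex
    r^{13} r^{12} - r^{12} r^{23} + r^{23} r^{13} = 0,
    ```

    where $r^{ij}$ denotes $r$ placed in the $i$-th and $j$-th tensor factors of $A \otimes A \otimes A$. The parameter-dependent versions studied here attach spectral parameters to $r$, with meromorphic dependence on them.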

    The exceptional holonomy groups and calibrated geometry

    The exceptional holonomy groups are G2 in 7 dimensions, and Spin(7) in 8 dimensions. Riemannian manifolds with these holonomy groups are Ricci-flat. This is a survey paper on exceptional holonomy, in two parts. Part I introduces the exceptional holonomy groups, and explains constructions for compact 7- and 8-manifolds with holonomy G2 and Spin(7). The simplest such constructions work by using techniques from complex geometry and Calabi-Yau analysis to resolve the singularities of a torus orbifold T^7/G or T^8/G, for G a finite group preserving a flat G2 or Spin(7)-structure on T^7 or T^8. There are also more complicated constructions which begin with a Calabi-Yau manifold or orbifold. Part II discusses the calibrated submanifolds of G2 and Spin(7)-manifolds: associative 3-folds and coassociative 4-folds for G2, and Cayley 4-folds for Spin(7). We explain the general theory, following Harvey and Lawson, and the known examples. Finally we describe the deformation theory of compact calibrated submanifolds, following McLean.
    Comment: 32 pages. Lectures given at a conference in Gokova, Turkey, May 200
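    Recalling the Harvey-Lawson definition behind these submanifold classes: a calibration on a Riemannian manifold $(M, g)$ is a closed $k$-form $\varphi$ satisfying

    ```latex
    \mathrm{d}\varphi = 0
    \quad\text{and}\quad
    \varphi|_{\xi} \le \mathrm{vol}_{\xi}
    \ \text{ for every oriented $k$-plane } \xi \subset T_x M,
    ```

    and an oriented $k$-submanifold $N \subset M$ is calibrated by $\varphi$ if $\varphi|_{T_x N} = \mathrm{vol}_{T_x N}$ at every point. Since $\varphi$ is closed, Stokes' theorem gives $\mathrm{Vol}(N) = \int_N \varphi = \int_{N'} \varphi \le \mathrm{Vol}(N')$ for any $N'$ homologous to $N$, so calibrated submanifolds are volume-minimizing in their homology class. Associative 3-folds, coassociative 4-folds, and Cayley 4-folds arise this way from the distinguished forms of the G2 and Spin(7) structures.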

    Network Sketching: Exploiting Binary Structure in Deep CNNs

    Convolutional neural networks (CNNs) with deep architectures have substantially advanced the state-of-the-art in computer vision tasks. However, deep networks are typically resource-intensive and thus difficult to deploy on mobile devices. Recently, CNNs with binary weights have shown compelling efficiency, whereas the accuracy of such models is usually unsatisfactory in practice. In this paper, we introduce network sketching as a novel technique for pursuing binary-weight CNNs, targeting more faithful inference and a better trade-off for practical applications. Our basic idea is to exploit binary structure directly in pre-trained filter banks and produce binary-weight models via tensor expansion. The whole process can be treated as a coarse-to-fine model approximation, akin to the pencil drawing steps of outlining and shading. To further speed up the generated models, namely the sketches, we also propose an associative implementation of binary tensor convolutions. Experimental results demonstrate that a proper sketch of AlexNet (or ResNet) outperforms the existing binary-weight models by large margins on the ImageNet large-scale classification task, while requiring only slightly more memory for network parameters.
    Comment: To appear in CVPR201
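    The coarse-to-fine binary expansion can be sketched as a greedy residual fit (a minimal illustration under our own assumptions, not the paper's implementation; `sketch_filter` and its defaults are hypothetical names): each step approximates the current residual by a single scaled sign tensor, then subtracts it, so later terms refine what earlier terms missed, much like outlining and then shading.

    ```python
    import numpy as np

    def sketch_filter(w, n_terms=3):
        """Greedily expand a real-valued filter tensor w as
        sum_i a_i * B_i, with each B_i in {-1, +1}^shape(w).

        For B = sign(residual), the scale a minimizing
        ||residual - a * B||^2 is the mean absolute residual.
        """
        residual = np.asarray(w, dtype=float).copy()
        terms = []
        for _ in range(n_terms):
            B = np.where(residual >= 0, 1.0, -1.0)  # binary tensor
            a = np.abs(residual).mean()             # least-squares scale
            terms.append((a, B))
            residual -= a * B                       # refine the leftover error
        return terms

    def reconstruct(terms):
        """Recombine the binary expansion into a real-valued tensor."""
        return sum(a * B for a, B in terms)
    ```

    Each added term strictly reduces the squared approximation error (by a^2 times the number of entries), which is the coarse-to-fine behavior the abstract describes; inference then only needs binary convolutions plus a few scalar multiplies per filter.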