Deep Neural Networks Rival the Representation of Primate IT Cortex for Core Visual Object Recognition
The primate visual system achieves remarkable visual object recognition
performance even in brief presentations and under changes to object exemplar,
geometric transformations, and background variation (a.k.a. core visual object
recognition). This performance is mediated by the representation
formed in inferior temporal (IT) cortex. In parallel, recent advances in
machine learning have led to ever higher performing models of object
recognition using artificial deep neural networks (DNNs). It remains unclear,
however, whether the representational performance of DNNs rivals that of the
brain. To accurately produce such a comparison, a major difficulty has been a
unifying metric that accounts for experimental limitations such as the amount
of noise, the number of neural recording sites, and the number of trials, and
computational limitations such as the complexity of the decoding classifier and
the number of classifier training examples. In this work we perform a direct
comparison that corrects for these experimental limitations and computational
considerations. As part of our methodology, we propose an extension of "kernel
analysis" that measures the generalization accuracy as a function of
representational complexity. Our evaluations show that, unlike previous
bio-inspired models, the latest DNNs rival the representational performance of
IT cortex on this visual object recognition task. Furthermore, we show that
models that perform well on measures of representational performance also
perform well on measures of representational similarity to IT and on measures
of predicting individual IT multi-unit responses. Whether these DNNs rely on
computational mechanisms similar to the primate visual system is yet to be
determined, but, unlike all previous bio-inspired models, that possibility
cannot be ruled out merely on representational performance grounds.
Comment: 35 pages, 12 figures; extends and expands upon arXiv:1301.353
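The paper's extended "kernel analysis" is not spelled out in this abstract, but its core idea, measuring generalization accuracy as a function of representational complexity, can be sketched as follows. Everything here is illustrative: the synthetic two-class features, the linear kernel, the nearest-centroid readout, and the use of the number of kernel principal components as the complexity axis are all assumptions, not the authors' exact protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "object classes" in a representation space (stand-in for
# DNN features or IT recordings; purely illustrative data).
n_per, d = 50, 20
X = np.vstack([rng.normal(0.0, 1.0, (n_per, d)) + 2.0,
               rng.normal(0.0, 1.0, (n_per, d)) - 2.0])
y = np.array([0] * n_per + [1] * n_per)

# Linear kernel on the representation, double-centred for kernel PCA.
K = X @ X.T
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H

# Eigendecomposition; columns sorted by decreasing eigenvalue.
w, V = np.linalg.eigh(Kc)
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

def accuracy_with_k_components(k):
    # Project samples onto the top-k kernel principal components and
    # classify with a nearest-class-centroid readout.
    Z = V[:, :k] * np.sqrt(np.maximum(w[:k], 0.0))
    centroids = np.vstack([Z[y == c].mean(axis=0) for c in (0, 1)])
    dists = ((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return float((dists.argmin(axis=1) == y).mean())

# Accuracy as a function of representational complexity
# (here: number of kernel principal components retained).
curve = {k: accuracy_with_k_components(k) for k in (1, 2, 5, 10)}
```

A representation that supports high accuracy at low complexity (few components) is, in this sense, a better one; comparing such curves across models and IT is the spirit of the analysis.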
Enabling computation of correlation bounds for finite-dimensional quantum systems via symmetrisation
We present a technique for reducing the computational requirements by several
orders of magnitude in the evaluation of semidefinite relaxations for bounding
the set of quantum correlations arising from finite-dimensional Hilbert spaces.
The technique, which we make publicly available through a user-friendly
software package, relies on the exploitation of symmetries present in the
optimisation problem to reduce the number of variables and the block sizes in
semidefinite relaxations. It is widely applicable in problems encountered in
quantum information theory and enables computations that were previously too
demanding. We demonstrate its advantages and general applicability in several
physical problems. In particular, we use it to robustly certify the
non-projectiveness of high-dimensional measurements in a black-box scenario
based on self-tests of d-dimensional symmetric informationally complete
POVMs.
Comment: A. T. and D. R. contributed equally to this project
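The variable reduction behind symmetrisation can be illustrated without an SDP solver. In a toy setting (a 6x6 moment matrix whose index set is permuted by the cyclic group Z_6, a hypothetical stand-in for the symmetry group of a real Bell-type problem), each orbit of matrix entries under the group action becomes a single free variable, shrinking the optimisation:

```python
# Toy illustration: count free SDP variables before and after
# symmetrising a 6x6 moment matrix under the cyclic group Z_6.
n = 6
group = [tuple((i + s) % n for i in range(n)) for s in range(n)]

# Orbits of unordered entry positions {i, j} under simultaneous
# permutation of rows and columns: one orbit = one free variable.
seen, orbits = set(), 0
for i in range(n):
    for j in range(i, n):
        if (i, j) in seen:
            continue
        orbits += 1
        for g in group:
            a, b = g[i], g[j]
            seen.add((min(a, b), max(a, b)))

free_before = n * (n + 1) // 2   # 21 variables without symmetry
free_after = orbits              # 4 variables after symmetrisation
```

The same orbit counting, together with block-diagonalisation of the symmetrised matrix, is what yields the orders-of-magnitude savings in realistic semidefinite relaxations; the actual software handles general symmetry groups, not just this cyclic example.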
Efficient solvability of Hamiltonians and limits on the power of some quantum computational models
We consider quantum computational models defined via a Lie-algebraic theory.
In these models, specified initial states are acted on by Lie-algebraic quantum
gates and the expectation values of Lie algebra elements are measured at the
end. We show that these models can be efficiently simulated on a classical
computer in time polynomial in the dimension of the algebra, regardless of the
dimension of the Hilbert space where the algebra acts. Similar results hold for
the computation of the expectation value of operators implemented by a
gate-sequence. We introduce a Lie-algebraic notion of generalized mean-field
Hamiltonians and show that they are efficiently ("exactly") solvable by means
of a Jacobi-like diagonalization method. Our results generalize earlier ones on
fermionic linear optics computation and provide insight into the source of the
power of the conventional model of quantum computation.
Comment: 6 pages; no figures
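The simulation argument can be sketched with the smallest nontrivial case. Take su(2) (algebra dimension 3) as a hypothetical toy algebra: a gate generated by a collective spin operator acts on the expectation values of the algebra elements by a 3x3 matrix in the adjoint representation, so a whole gate sequence costs time polynomial in the algebra dimension, independent of the 2^n-dimensional Hilbert space the spins live in. The initial state and sign conventions below are illustrative assumptions:

```python
import numpy as np

def adjoint_rotation_z(theta):
    # Adjoint action of a z-axis Lie-algebraic gate on the expectation
    # vector (<J_x>, <J_y>, <J_z>): an ordinary 3x3 rotation matrix.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Hypothetical initial state: n spins polarised along x, so
# <J_x> = n/2 and the other expectation values vanish. The Hilbert
# space has dimension 2^100, but we never touch it.
n_spins = 100
ev = np.array([n_spins / 2.0, 0.0, 0.0])

# Apply a sequence of Lie-algebraic gates by composing 3x3 matrices.
for theta in (np.pi / 2, np.pi / 2):
    ev = adjoint_rotation_z(theta) @ ev

# Two quarter turns about z map <J_x> to -<J_x>.
```

For a general algebra of dimension d the same bookkeeping uses d x d matrices, which is the source of the polynomial-in-d simulation cost stated in the abstract.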
A Deep Representation for Invariance And Music Classification
Representations in the auditory cortex might be based on mechanisms similar
to those in the visual ventral stream: modules for building invariance to
transformations, and multiple layers for compositionality and selectivity. In
this paper we propose the use of such computational modules for extracting
invariant and discriminative audio representations. Building on a theory of
invariance in hierarchical architectures, we propose a novel, mid-level
representation for acoustical signals, using the empirical distributions of
projections on a set of templates and their transformations. Under the
assumption that, by construction, this dictionary of templates is composed from
similar classes, and samples the orbit of variance-inducing signal
transformations (such as shift and scale), the resulting signature is
theoretically guaranteed to be unique, invariant to transformations and stable
to deformations. Modules of projection and pooling can then constitute layers
of deep networks, for learning composite representations. We present the main
theoretical and computational aspects of a framework for unsupervised learning
of invariant audio representations, empirically evaluated on music genre
classification.
Comment: 5 pages; CBMM Memo No. 002; to appear in IEEE 2014 International
Conference on Acoustics, Speech, and Signal Processing (ICASSP 2014)
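The projection-and-pooling construction can be sketched concretely for the shift transformation. Below, the "template orbit" is the set of all circular shifts of one random template, the projections are plain dot products, and pooling is an empirical histogram; the signal, template, histogram binning, and 1-D circular setting are all simplifying assumptions relative to the paper's framework:

```python
import numpy as np

rng = np.random.default_rng(1)

def signature(x, template, n_bins=8):
    # Project the signal onto every circular shift of the template
    # (its orbit under the shift transformation), then pool the
    # projections into an empirical histogram. Shifting x only
    # permutes this set of projections, so the histogram is a
    # shift-invariant signature.
    shifts = np.array([np.dot(x, np.roll(template, s))
                       for s in range(len(x))])
    hist, _ = np.histogram(shifts, bins=n_bins, range=(-10.0, 10.0))
    return hist / hist.sum()

x = rng.normal(0.0, 1.0, 64)
template = rng.normal(0.0, 1.0, 64)

sig_original = signature(x, template)
sig_shifted = signature(np.roll(x, 17), template)  # shifted copy of x
```

Stacking such projection-and-pooling modules, with templates drawn from the data itself, is what turns this single-layer signature into the deep, unsupervised representation the paper evaluates on genre classification.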