37 research outputs found

    Mathematics of the Neural Response

    We propose a natural image representation, the neural response, motivated by the neuroscience of the visual cortex. The inner product defined by the neural response leads to a similarity measure between functions, which we call the derived kernel. Based on a hierarchical architecture, we give a recursive definition of the neural response and associated derived kernel. The derived kernel can be used in a variety of application domains such as classification of images, strings of text, and genomics data.
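    The recursion described above can be illustrated with a minimal sketch: at each layer, the neural response of an input is its maximum previous-layer kernel value against each template, and the derived kernel is the normalized inner product of two such responses. All names and the toy base kernel below are illustrative assumptions, not the paper's implementation.

```python
import math

def normalized_dot(u, v):
    # Cosine similarity: the normalized inner product used as a kernel.
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def neural_response(patches, templates, kernel):
    # One layer of the recursion: for each template, pool (max) the
    # previous-layer kernel between every sub-patch and that template.
    return [max(kernel(p, t) for p in patches) for t in templates]

def derived_kernel(f_patches, g_patches, templates, kernel):
    # Derived kernel: normalized inner product of the two neural responses.
    nf = neural_response(f_patches, templates, kernel)
    ng = neural_response(g_patches, templates, kernel)
    return normalized_dot(nf, ng)

# Toy usage: identical inputs yield a derived-kernel value of 1.
templates = [[1, 0], [0, 1]]
patches = [[1.0, 1.0]]
k_same = derived_kernel(patches, patches, templates, normalized_dot)
```

    In a deeper hierarchy, `normalized_dot` at the base layer would itself be replaced by the derived kernel of the layer below, which is what makes the definition recursive.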

    Does invariant recognition predict tuning of neurons in sensory cortex?

    Tuning properties of simple cells in cortical V1 can be described in terms of a "universal shape" characterized by parameter values which hold across different species. This puzzling set of findings begs for a general explanation grounded on an evolutionarily important computational function of the visual cortex. We ask here whether these properties are predicted by the hypothesis that the goal of the ventral stream is to compute for each image a "signature" vector which is invariant to geometric transformations, with the additional assumption that the mechanism for continuously learning and maintaining invariance consists of the memory storage of a sequence of neural images of a few objects undergoing transformations (such as translation, scale changes and rotation) via Hebbian synapses. For V1 simple cells, the simplest version of this hypothesis is the online Oja rule, which implies that the tuning of neurons converges to the eigenvectors of the covariance of their input. Starting with a set of dendritic fields spanning a range of sizes, simulations supported by a direct mathematical analysis show that the solution of the associated "cortical equation" provides a set of Gabor-like wavelets with parameter values that are in broad agreement with the physiology data. We show, however, that the simple version of the Hebbian assumption does not predict all the physiological properties. The same theoretical framework also provides predictions about the tuning of cells in V4 and in the face patch AL which are in qualitative agreement with physiology data.
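    The convergence claim above, that Oja's rule drives a neuron's weight vector to the top eigenvector of the input covariance, can be checked with a small simulation. The data-generating setup below (a dominant direction plus small noise) is an illustrative assumption, not the paper's simulation.

```python
import math
import random

def oja_update(w, x, lr):
    # Oja's rule: w <- w + lr * y * (x - y * w), with y = w . x.
    # The -y*w term keeps |w| bounded, so w converges to the principal
    # eigenvector of the input covariance instead of growing without limit.
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]

random.seed(0)
principal = [3 / math.sqrt(10), 1 / math.sqrt(10)]  # dominant input direction
w = [1.0, 0.0]
for _ in range(5000):
    s = random.gauss(0, 1)    # signal along the principal axis
    n = random.gauss(0, 0.1)  # small noise along an off-axis direction
    x = [s * principal[0] + n, s * principal[1] - n]
    w = oja_update(w, x, lr=0.01)

# |w . principal| approaches 1 as w aligns with the top eigenvector
# (up to sign, since -w is an equally valid fixed point).
overlap = abs(sum(wi * pi for wi, pi in zip(w, principal)))
```

    In the paper's setting the inputs are neural images of transforming objects rather than synthetic Gaussians, and the resulting eigenvectors are the Gabor-like wavelets mentioned above.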

    GURLS: A Least Squares Library for Supervised Learning

    We present GURLS, a least squares, modular, easy-to-extend software library for efficient supervised learning. GURLS is targeted to machine learning practitioners as well as non-specialists. It offers a number of state-of-the-art training strategies for medium and large-scale learning, and routines for efficient model selection. The library is particularly well suited for multi-output problems (multi-category/multi-label). GURLS is currently available in two independent implementations: Matlab and C++. It takes advantage of the favorable properties of the regularized least squares algorithm to exploit advanced tools in linear algebra. Routines to handle computations with very large matrices by means of memory-mapped storage and distributed task execution are available. The package is distributed under the BSD license and is available for download at https://github.com/LCSL/GURLS.
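    The core computation behind such a library, regularized least squares (ridge regression), has a closed-form solution via the penalized normal equations. The one-feature sketch below is a hand-rolled illustration of that closed form, not GURLS code or its API.

```python
def rls_fit_1d(xs, ys, lam):
    # Ridge regression for one feature plus intercept: minimize
    # sum((y - (a*x + b))^2) + lam * a^2, solved in closed form.
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Penalized normal equations (the penalty only affects the slope):
    #   [sxx + lam, sx] [a]   [sxy]
    #   [sx,        n ] [b] = [sy ]
    det = (sxx + lam) * n - sx * sx
    a = (sxy * n - sx * sy) / det
    b = ((sxx + lam) * sy - sx * sxy) / det
    return a, b

# With lam = 0 this is ordinary least squares; lam > 0 shrinks the slope.
a0, b0 = rls_fit_1d([0, 1, 2, 3], [1, 3, 5, 7], 0.0)   # exact fit y = 2x + 1
a1, b1 = rls_fit_1d([0, 1, 2, 3], [1, 3, 5, 7], 5.0)   # shrunken slope
```

    In the multi-output, many-feature case the same structure becomes a single matrix solve shared across all outputs, which is the property GURLS exploits with linear-algebra tooling.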

    Unsupervised learning of invariant representations

    The present phase of Machine Learning is characterized by supervised learning algorithms relying on large sets of labeled examples (n → ∞). The next phase is likely to focus on algorithms capable of learning from very few labeled examples (n → 1), like humans seem able to do. We propose an approach to this problem and describe the underlying theory, based on the unsupervised, automatic learning of a "good" representation for supervised learning, characterized by small sample complexity. We consider the case of visual object recognition, though the theory also applies to other domains, like speech. The starting point is the conjecture, proved in specific cases, that image representations which are invariant to translation, scaling and other transformations can considerably reduce the sample complexity of learning. We prove that an invariant and selective signature can be computed for each image or image patch: the invariance can be exact in the case of group transformations and approximate under non-group transformations. A module performing filtering and pooling, like the simple and complex cells described by Hubel and Wiesel, can compute such a signature. The theory offers novel unsupervised learning algorithms for "deep" architectures for image and speech recognition. We conjecture that the main computational goal of the ventral stream of visual cortex is to provide a hierarchical representation of new objects/images which is invariant to transformations, stable, and selective for recognition, and show how this representation may be continuously learned in an unsupervised way during development and visual experience.
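    The filtering-and-pooling module can be sketched concretely for the exact group case: dot the input against every group-transformed template ("simple cells"), then pool over the transformation ("complex cells"). Pooling over the full group makes the signature exactly invariant. The 1-D cyclic-shift setting below is a toy assumption chosen so the group is small and the invariance is exact.

```python
def cyclic_shifts(v):
    # All cyclic translations of a 1-D "image" (the full group orbit).
    return [v[k:] + v[:k] for k in range(len(v))]

def signature(image, templates):
    # HW-module sketch: filter (dot product against each translate of a
    # stored template), then pool (max) over the transformation. Because
    # shifting the image only permutes the set of responses, the pooled
    # value is exactly translation invariant.
    sig = []
    for t in templates:
        responses = [sum(a * b for a, b in zip(image, g))
                     for g in cyclic_shifts(t)]
        sig.append(max(responses))
    return sig

# Integer data keeps the equality exact (no float-ordering effects).
image = [1, 2, 0, 3]
shifted = [3, 1, 2, 0]  # the same image, cyclically translated
templates = [[1, 0, 0, 0], [1, 1, 0, 0]]
```

    Selectivity comes from using several templates: different images generally produce different response sets, so their signatures differ even though each signature is invariant.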

    Learning new physics efficiently with nonparametric methods

    We present a machine learning approach for model-independent new physics searches. The corresponding algorithm is powered by recent large-scale implementations of kernel methods, nonparametric learning algorithms that can approximate any continuous function given enough data. Based on the original proposal by D'Agnolo and Wulzer (arXiv:1806.02350), the model evaluates the compatibility between experimental data and a reference model by implementing a hypothesis testing procedure based on the likelihood ratio. Model independence is enforced by avoiding any prior assumption about the presence or shape of new physics components in the measurements. We show that our approach has dramatic advantages compared to neural network implementations in terms of training times and computational resources, while maintaining comparable performance. In particular, we conduct our tests on higher-dimensional datasets, a step forward with respect to previous studies.
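    The likelihood-ratio idea behind this family of tests can be illustrated in its simplest, binned form: for each Poisson-counting bin, compare the best-fit rate (the observed count) against the reference prediction, with no assumption about what a signal would look like. This is a deliberately simplified illustration; the paper's method is unbinned and learns the data/reference discrepancy with kernel methods.

```python
import math

def poisson_llr(observed, expected):
    # Profile likelihood-ratio statistic for one Poisson bin:
    #   t = 2 * [ N * log(N / R) - (N - R) ]
    # with observed count N and reference expectation R. t = 0 when the
    # data match the reference exactly; large t signals tension with it,
    # regardless of the sign or shape of the deviation.
    if observed == 0:
        return 2.0 * expected
    return 2.0 * (observed * math.log(observed / expected)
                  - (observed - expected))

def total_test_statistic(counts, reference):
    # Sum the per-bin statistics over a binned measurement.
    return sum(poisson_llr(n, r) for n, r in zip(counts, reference))
```

    In the nonparametric setting, the role of the per-bin best-fit rate is played by a flexible function of the event features, and the maximized likelihood ratio becomes the learned test statistic.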

    A biology-driven approach identifies the hypoxia gene signature as a predictor of the outcome of neuroblastoma patients

    Background: Hypoxia is a condition of low oxygen tension occurring in the tumor microenvironment, and it is related to poor prognosis in human cancer. To examine the relationship between hypoxia and neuroblastoma, we generated and tested an in vitro derived hypoxia gene signature for its ability to predict patients' outcome.

    Results: We obtained the gene expression profile of 11 hypoxic neuroblastoma cell lines and derived a robust 62-probeset signature (NB-hypo), taking advantage of the strong discriminating power of the l1-l2 feature selection technique combined with the analysis of differential gene expression. We profiled gene expression of the tumors of 88 neuroblastoma patients and divided them according to the NB-hypo expression values by K-means clustering. NB-hypo successfully stratifies the neuroblastoma patients into good and poor prognosis groups. Multivariate Cox analysis revealed that NB-hypo is a significant independent predictor after controlling for commonly used risk factors, including the amplification of the MYCN oncogene. NB-hypo increases the resolution of the MYCN stratification by dividing patients with MYCN non-amplified tumors into good and poor outcome groups, suggesting that hypoxia is associated with the aggressiveness of neuroblastoma independently of MYCN amplification.

    Conclusions: Our results demonstrate that NB-hypo is a novel and independent prognostic factor for neuroblastoma and support the view that hypoxia is negatively correlated with tumor outcome. We show the power of the biology-driven approach in defining hypoxia as a critical molecular program in neuroblastoma and the potential for improvement in the current criteria for risk stratification.
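    The stratification step, splitting patients into two groups by signature expression with K-means, reduces in one dimension to a very small algorithm. The sketch below is a generic two-cluster 1-D K-means for illustration; it is not the study's code, and the extreme-value initialization is an assumption.

```python
def kmeans_1d(values, iters=50):
    # Two-cluster 1-D K-means: alternate assigning each value to the
    # nearest centroid and recomputing centroids as cluster means.
    # Returns a 0/1 label per value plus the final centroids.
    c = [min(values), max(values)]  # initialize centroids at the extremes
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c[0]) <= abs(v - c[1]) else 1
                  for v in values]
        for k in (0, 1):
            members = [v for v, l in zip(values, labels) if l == k]
            if members:
                c[k] = sum(members) / len(members)
    return labels, c

# Toy usage: two well-separated expression levels split cleanly.
labels, centroids = kmeans_1d([1.0, 2.0, 1.5, 8.0, 9.0, 8.5])
```

    In the study the clustering acts on the 62-probeset expression values rather than a single scalar, but the high/low split it produces plays the same role.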