
    Python for Information Theoretic Analysis of Neural Data

    Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and to gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way that is independent of any specific assumptions about which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from the calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.
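
    The limited sampling bias mentioned above arises because plug-in entropy estimates computed from finite trial counts are systematically biased downward. As a minimal, hedged illustration (the function names and the choice of the Miller-Madow correction are ours; the paper's own algorithms address the bias more thoroughly), a direct estimator of stimulus-response mutual information in NumPy might look like:

        import numpy as np

        def plugin_entropy(counts):
            # Maximum-likelihood ("plug-in") entropy estimate in bits.
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        def miller_madow_entropy(counts):
            # Plug-in estimate plus the first-order Miller-Madow bias
            # correction of (m - 1)/(2n) nats, where m is the number of
            # occupied response bins and n is the number of samples.
            n, m = counts.sum(), np.count_nonzero(counts)
            return plugin_entropy(counts) + (m - 1) / (2 * n * np.log(2))

        def mutual_information(stim, resp, entropy=miller_madow_entropy):
            # I(S;R) = H(R) - H(R|S), each entropy term bias-corrected.
            # stim, resp: integer-coded 1-D arrays of equal length.
            h_r = entropy(np.bincount(resp))
            h_r_given_s = sum(
                (resp[stim == s].size / resp.size)
                * entropy(np.bincount(resp[stim == s]))
                for s in np.unique(stim))
            return h_r - h_r_given_s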

    From cognitive science to cognitive neuroscience to neuroeconomics

    As an emerging discipline, neuroeconomics faces considerable methodological and practical challenges. In this paper, I suggest that these challenges can be understood by exploring the similarities and dissimilarities between the emergence of neuroeconomics and the emergence of cognitive and computational neuroscience two decades ago. From these parallels, I suggest that the major challenge facing theory formation in the neural and behavioural sciences is that of being under-constrained by data, making a detailed understanding of physical implementation necessary for theory construction in neuroeconomics. Rather than following a top-down strategy, neuroeconomists should be pragmatic in their use of available data from animal models, information regarding neural pathways and projections, computational models of neural function, functional imaging and behavioural data. By providing convergent evidence across multiple levels of organization, neuroeconomics will have its best prospects of success.

    Quantum Genetics and Quantum Automata Models of Quantum-Molecular Evolution Involved in the Evolution of Organisms and Species

    Previous theoretical and general approaches to the problems of quantum genetics and molecular evolution are considered in this article from the point of view of quantum automata theory, first published by the author in 1971 and further developed in several recent articles. The representation of genomes and interactome networks in categories of many-valued logic LMn-algebras that are naturally transformed during biological evolution, or that evolve through interactions with the environment, provides new insight into the mechanisms of molecular evolution, as well as organismal evolution, in terms of sequences of quantum automata. Phenotypic changes are expressed only when certain environmentally-induced quantum-molecular changes are coupled with an internal re-structuring of major submodules of the genome and interactome networks related to cell cycling and cell growth. Contrary to the commonly held view of 'standard' Darwinist models of evolution, the evolution of organisms and species occurs through coupled multi-molecular transformations induced not only by the environment but actually realized through internal re-organizations of genome and interactome networks. These biological, evolutionary processes involve certain epigenetic transformations that are responsible for phenotypic expression of the genome and interactome transformations initiated at the quantum-molecular level. It can thus be said that only quantum genetics can provide correct explanations of evolutionary processes that are initiated at the quantum, multi-molecular level and propagate to the higher levels of organismal and species evolution.

Biological evolution should therefore be regarded as a multi-scale process initiated by underlying quantum (coupled) multi-molecular transformations of the genomic and interactomic networks, followed by specific phenotypic transformations at the level of the organism and of the variable biogroupoids associated with the evolution of species, which are essential to the survival of the species. The theoretical framework introduced in this article also paves the way to a quantitative-biology approach to biological evolution at the quantum-molecular, as well as the organismal and species, levels. This is quite a substantial modification of the 'established' modern Darwinist theories, and also of several so-called 'molecular evolution' theories.

    Quantifying Resource Use in Computations

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent both in computer science, for instance in cryptanalysis, and in neuroscience, for instance in comparative neuroanatomy. A System-versus-Environment game formalism based on Computability Logic is proposed that allows one to define a computational work function describing the theoretical and physical resources needed to perform any purely algorithmic computation. Within this formalism, the cost of a computation is defined as the sum of information storage over the steps of the computation. The size of the computational device, e.g., the action table of a Universal Turing Machine, the number of transistors in silicon, or the number and complexity of synapses in a neural net, is explicitly included in the computational cost. The proposed cost function leads in a natural way to known computational trade-offs and can be used to estimate the computational capacity of real silicon hardware and neural nets. The theory is applied to a historical case of 56-bit DES key recovery, as an example of application to cryptanalysis. Furthermore, the relative computational capacities of human brain neurons and the C. elegans nervous system are estimated as an example of application to neural nets.
    Comment: 26 pages, no figures
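
    On one plausible reading of this cost definition (the accounting details below are our assumption, not the paper's exact formalism), the cost charges the information held at each step, with the fixed device description included in the per-step storage. A minimal sketch:

        def computation_cost(device_size_bits, storage_per_step_bits):
            # Cost of a computation, read as: the information held at each
            # step (working storage plus the fixed description of the device
            # itself, e.g. a UTM action table or a synapse count), summed
            # over all steps. Charging the device size per step rather than
            # once is our guess at the paper's accounting.
            return sum(device_size_bits + s for s in storage_per_step_bits)

        # Example: 10**6 steps holding 1 kbit of working storage,
        # on hardware whose description takes 4 kbit.
        cost = computation_cost(4096, (1024 for _ in range(10**6)))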

    Local and global gestalt laws: A neurally based spectral approach

    A mathematical model of figure-ground articulation is presented, taking into account both local and global gestalt laws. The model is compatible with the functional architecture of the primary visual cortex (V1). In particular, the local gestalt law of good continuity is described by means of suitable connectivity kernels that are derived from Lie group theory and are neurally implemented in the long-range connectivity of V1. Different kernels are compatible with the geometric structure of cortical connectivity; they are derived as the fundamental solutions of the Fokker-Planck, sub-Riemannian Laplacian and isotropic Laplacian equations. The kernels are used to construct matrices of connectivity among the features present in a visual stimulus. Global gestalt constraints are then introduced in terms of a spectral analysis of the connectivity matrix, showing that this processing can be cortically implemented in V1 by mean-field neural equations. This analysis groups local features and identifies the perceptual units with the highest saliency. Numerical simulations are performed and results are obtained by applying the technique to a number of stimuli.
    Comment: submitted to Neural Computation
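
    A hedged sketch of the spectral step (the variable names and the toy isotropic kernel are ours; the paper derives its kernels as fundamental solutions of the equations listed above): build an affinity matrix over oriented local features, then read the most salient perceptual unit off the leading eigenvector:

        import numpy as np

        def affinity_matrix(positions, orientations, sigma=0.5):
            # Toy isotropic connectivity kernel between oriented edge
            # elements; a stand-in for the anisotropic cortical kernels.
            d2 = np.sum((positions[:, None, :] - positions[None, :, :]) ** 2,
                        axis=-1)
            dth2 = (orientations[:, None] - orientations[None, :]) ** 2
            return np.exp(-(d2 + dth2) / (2 * sigma ** 2))

        def most_salient_group(A):
            # The eigenvector of the largest eigenvalue of the symmetric
            # connectivity matrix marks the perceptual unit of highest
            # saliency; threshold its components to recover the grouping.
            w, v = np.linalg.eigh(A)      # eigenvalues in ascending order
            lead = np.abs(v[:, -1])
            return lead > lead.mean()     # boolean mask of grouped features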

    Winner-Relaxing Self-Organizing Maps

    A new family of self-organizing maps, the Winner-Relaxing Kohonen Algorithm, is introduced as a generalization of a variant given by Kohonen in 1991. The magnification behaviour is calculated analytically. For the original variant a magnification exponent of 4/7 is derived; the generalized version allows one to steer the magnification exponent over the wide range from 1/2 to 1 in the one-dimensional case, and thus provides an optimal mapping in the sense of information theory. The Winner-Relaxing Algorithm requires minimal extra computation per learning step and is easy to implement.
    Comment: 14 pages (6 figures included). To appear in Neural Computation
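
    A minimal sketch of a winner-relaxing update on a one-dimensional map (the learning rate, sign convention and relaxing parameter lam are our assumptions; the paper derives the admissible coefficients, with the Kohonen 1991 variant corresponding to one particular value):

        import numpy as np

        def wrk_step(W, x, eps=0.05, sigma=1.0, lam=0.5):
            # One learning step: the standard Kohonen term for all units,
            # plus an extra "relaxing" term applied to the winner only.
            # lam = 0 recovers the plain self-organizing map; varying lam
            # steers the magnification exponent.
            s = np.argmin(np.linalg.norm(W - x, axis=1))      # winner unit
            idx = np.arange(len(W))
            h = np.exp(-((idx - s) ** 2) / (2 * sigma ** 2))  # neighborhood
            kohonen = h[:, None] * (x - W)
            relax = np.sum(kohonen[idx != s], axis=0)         # neighbors' pull
            W += eps * kohonen
            W[s] -= eps * lam * relax
            return W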

    Role of homeostasis in learning sparse representations

    Neurons in the input layer of primary visual cortex in primates develop edge-like receptive fields. One approach to understanding the emergence of this response is to state that neural activity has to represent sensory data efficiently with respect to the statistics of natural scenes. Furthermore, it is believed that such efficient coding is achieved using competition across neurons so as to generate a sparse representation, that is, one in which a relatively small number of neurons are simultaneously active. Indeed, different models of sparse coding, coupled with Hebbian learning and homeostasis, have been proposed that successfully match the observed emergent response. However, the specific role of homeostasis in learning such sparse representations is still largely unknown. By quantitatively assessing the efficiency of the neural representation during learning, we derive a cooperative homeostasis mechanism that optimally tunes the competition between neurons within the sparse coding algorithm. We apply this homeostasis while learning small patches taken from natural images and compare its efficiency with state-of-the-art algorithms. Results show that while different sparse coding algorithms give similar coding results, homeostasis provides an optimal balance for the representation of natural images within the population of neurons. Competition in sparse coding is optimized when it is fair. By contributing to optimizing statistical competition across neurons, homeostasis is crucial in providing a more efficient solution to the emergence of independent components.
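
    As a hedged sketch of the idea (the class and parameter names are ours, and the simple frequency penalty below is a stand-in for the optimal rule derived in the paper): a homeostatic gain can track how often each neuron wins the sparse-coding competition and boost under-used units so that the competition stays fair:

        import numpy as np

        class HomeostaticGain:
            # Keeps a running estimate of each unit's selection frequency
            # and rescales matching scores so over-used units are penalized.
            def __init__(self, n_units, eta=0.01):
                self.freq = np.full(n_units, 1.0 / n_units)
                self.eta = eta

            def rescale(self, scores):
                # Divide by relative usage: rarely chosen units get a boost.
                return scores / (self.freq * len(self.freq))

            def update(self, winner):
                # Exponential moving average of the selection histogram.
                self.freq *= 1.0 - self.eta
                self.freq[winner] += self.eta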

    Multiscale Discriminant Saliency for Visual Attention

    Bottom-up saliency, an early stage of human visual attention, can be cast as a binary classification problem between center and surround classes. The discriminant power of a feature for this classification is measured as the mutual information between the feature and the distribution over the two classes. Because the estimated discrepancy between the two feature classes depends strongly on the scale considered, multi-scale structure and discriminant power are integrated by employing discrete wavelet features and a hidden Markov tree (HMT). From the wavelet coefficients and HMT parameters, quad-tree-like label structures are constructed and used to compute the maximum a posteriori (MAP) estimate of the hidden class variables at the corresponding dyadic sub-squares. A saliency value for each dyadic square at each scale level is then computed from the discriminant power principle and the MAP estimate. Finally, the saliency maps across scales are integrated into a final saliency map by an information-maximization rule. Both standard quantitative tools, such as NSS, LCC and AUC, and qualitative assessments are used to evaluate the proposed multiscale discriminant saliency method (MDIS) against the well-known information-based saliency method AIM on the Bruce database with eye-tracking data. Simulation results are presented and analyzed to verify the validity of MDIS, as well as to point out its disadvantages for further research directions.
    Comment: 16 pages, ICCSA 2013 - BIOCA session
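
    A hedged sketch of the discriminant-power measure (the histogram estimator and equal class priors are our simplifications; the paper estimates it through the wavelet/HMT model): the mutual information between a feature and the binary center/surround label equals the Jensen-Shannon divergence between the two class-conditional distributions:

        import numpy as np

        def discriminant_power(center_vals, surround_vals, bins=32):
            # I(X;C) for C in {center, surround} with equal priors,
            # estimated from shared-support histograms of feature X.
            lo = min(center_vals.min(), surround_vals.min())
            hi = max(center_vals.max(), surround_vals.max())
            pc, _ = np.histogram(center_vals, bins, (lo, hi))
            ps, _ = np.histogram(surround_vals, bins, (lo, hi))
            pc, ps = pc / pc.sum(), ps / ps.sum()
            pm = 0.5 * (pc + ps)            # class-marginal distribution

            def kl(p, q):                   # KL divergence in bits
                m = p > 0
                return np.sum(p[m] * np.log2(p[m] / q[m]))

            return 0.5 * kl(pc, pm) + 0.5 * kl(ps, pm)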