
    Gene selection and disease prediction from gene expression data using a two-stage hetero-associative memory

    In general, gene expression microarrays consist of a vast number of genes and very few samples, which represents a critical challenge for disease prediction and diagnosis. This paper develops a two-stage algorithm that integrates feature selection and prediction by extending a type of hetero-associative neural network. In the first stage, the algorithm generates the associative memory, whereas the second stage picks the most relevant genes. To illustrate the applicability and efficiency of the proposed method, we use four different gene expression microarray databases and compare their classification performance against that of other renowned classifiers built on the whole (original) feature (gene) space. The experimental results show that the two-stage hetero-associative memory is quite competitive with standard classification models in terms of overall accuracy, sensitivity and specificity. In addition, it also produces a significant decrease in computational effort and an increase in the biological interpretability of microarrays because worthless (irrelevant and/or redundant) genes are discarded.
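The paper's relevance measure comes from its associative memory, which is not reproduced here. As a rough sketch of the general two-stage idea only (select a small gene subset, then classify in the reduced space), the following substitutes a simple class-separation score for stage one and a nearest-centroid classifier for stage two; all function names and parameters are illustrative, not the paper's:

```python
import numpy as np

def select_genes(X, y, k):
    """Stage 1 (stand-in): rank genes by a t-like class-separation
    score and keep the indices of the top k."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    s0, s1 = X[y == 0].std(0), X[y == 1].std(0)
    score = np.abs(m0 - m1) / (s0 + s1 + 1e-9)
    return np.argsort(score)[::-1][:k]

def predict(X_train, y_train, X_test, genes):
    """Stage 2 (stand-in): nearest-centroid prediction in the
    reduced gene space."""
    Xt, Xs = X_train[:, genes], X_test[:, genes]
    c0 = Xt[y_train == 0].mean(0)
    c1 = Xt[y_train == 1].mean(0)
    d0 = np.linalg.norm(Xs - c0, axis=1)
    d1 = np.linalg.norm(Xs - c1, axis=1)
    return (d1 < d0).astype(int)

# Synthetic "microarray": 100 genes, of which only the first 5 separate
# the two classes; the selector should recover them.
rng = np.random.default_rng(42)
y_train = np.array([0] * 20 + [1] * 20)
X_train = rng.normal(size=(40, 100))
X_train[y_train == 1, :5] += 3.0
genes = select_genes(X_train, y_train, 5)
```

The point of the two-stage structure is that the classifier in stage two only ever sees the reduced space, which is where the computational savings and interpretability gains come from.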

    Corticonic models of brain mechanisms underlying cognition and intelligence

    The concern of this review is brain theory or, more specifically, in its first part, a model of the cerebral cortex and the way it: (a) interacts with subcortical regions like the thalamus and the hippocampus to provide the higher-level brain functions that underlie cognition and intelligence, (b) handles and represents dynamical sensory patterns imposed by a constantly changing environment, (c) copes with the enormous number of such patterns encountered in a lifetime by means of dynamic memory that offers an immense number of stimulus-specific attractors for input patterns (stimuli) to select from, (d) selects an attractor through a process of "conjugation" of the input pattern with the dynamics of the thalamo–cortical loop, (e) distinguishes between redundant (structured) and non-redundant (random) inputs that are void of information, (f) can do categorical perception when there is access to vast associative memory laid out in the association cortex with the help of the hippocampus, and (g) makes use of "computation" at the edge of chaos and information-driven annealing to achieve all this. Other features and implications of the concepts presented for the design of computational algorithms and machines with brain-like intelligence are also discussed. The material and results presented suggest that a Parametrically Coupled Logistic Map Network (PCLMN) is a minimal model of the thalamo–cortical complex, and that marrying such a network to a suitable associative memory with re-entry or feedback forms a useful, albeit abstract, model of a cortical module of the brain that could facilitate building a simple artificial brain. In the second part of the review, the results of the numerical simulations and the conclusions drawn in the first part are linked to the most directly relevant works and views of other workers.
What emerges is a picture of brain dynamics on the mesoscopic and macroscopic scales that gives a glimpse of the nature of the long-sought-after brain code underlying intelligence and other higher-level brain functions. Physics of Life Reviews 4 (2007) 223–252. © 2007 Elsevier B.V. All rights reserved.
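The defining feature of a parametrically coupled logistic-map network is that units couple through each map's nonlinearity parameter rather than through its state. The exact PCLMN equations are given in the original paper; the sketch below is a generic, hypothetical version of that idea, with the coupling form, parameter range, and names chosen for illustration only:

```python
import numpy as np

def pclmn_step(x, W, mu0=3.5, amp=0.5):
    """One update of a parametrically coupled logistic-map lattice
    (generic sketch, not the paper's exact PCLMN). The net input to
    each unit modulates its logistic parameter mu_i, which is kept in
    [mu0 - amp, mu0 + amp] = [3.0, 4.0] so states remain in [0, 1]."""
    drive = W @ x
    mu = mu0 + amp * drive / (np.abs(drive).max() + 1e-12)
    return mu * x * (1.0 - x)

# Drive a small lattice, using an input "stimulus" as the initial state.
rng = np.random.default_rng(1)
n = 16
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
np.fill_diagonal(W, 0.0)           # no self-coupling
x = rng.uniform(0.1, 0.9, n)       # stimulus sets the initial condition
for _ in range(200):
    x = pclmn_step(x, W)
```

With mu pushed toward 4.0 for strongly driven units, individual maps operate near the chaotic regime, which loosely mirrors the review's "computation at the edge of chaos" theme.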

    Pattern Turnover within Synaptically Perturbed Neural Systems

    A critical level of synaptic perturbation within a trained artificial neural system induces the nucleation of novel activation patterns, many of which could qualify as viable ideas or action plans. In building massively parallel connectionist architectures requiring myriad coupled neural modules driven to ideate in this manner, the need has arisen to shift the attention of computational critics to only those portions of the neural "real estate" generating sufficiently novel activation patterns. The search for a suitable affordance to guide such attention has revealed that the rhythm of pattern generation by synaptically perturbed neural nets is a quantitative indicator of the novelty of their conceptual output, that cadence in turn characterized by a frequency and a corresponding temporal clustering that is discernible through fractal dimension. Anticipating that synaptic fluctuations are tantamount in effect to volume neurotransmitter release within cortex, a novel theory of both cognition and consciousness arises that is reliant upon the rate of transitions within cortical activation topologies.

    Latching dynamics in Potts neural networks

    One purpose of Computational Neuroscience is to understand, by using models, how at least some parts of the brain work, or how cognitive phenomena occur and are organized in terms of neuronal activity. The Hopfield model of a neural network, rooted in Statistical Physics and put forward by J. Hopfield in the 1980s, was one of the first attempts to explain how associative memory could work. It was successful in guiding experiments, e.g., in the hippocampus and the primate inferotemporal cortex. However, some higher-level cognitive functions that the brain accomplishes require, to be approached quantitatively, more advanced models beyond simple cued retrieval.
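The cued retrieval the abstract mentions can be stated compactly: in the standard Hopfield scheme, ±1 patterns are stored in a symmetric weight matrix by Hebbian learning, and a corrupted cue is cleaned up by iterating the network to a fixed point. A minimal sketch (pattern sizes here are illustrative):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: W is the sum of outer products of the
    stored +/-1 patterns, with no self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, cue, steps=10):
    """Iterate the sign update until a fixed point (or step limit);
    the fixed points include the stored patterns."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1          # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Store two orthogonal 8-unit patterns and retrieve one from a
# cue with a single flipped bit.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train_hopfield(patterns)
cue = patterns[0].copy()
cue[0] = -1
restored = recall(W, cue)
```

The quantitative limits of this cued-retrieval picture (e.g. storage capacity scaling with network size) are exactly where the Potts and latching extensions discussed above come in.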

    Memories for Life: A Review of the Science and Technology

    This paper discusses scientific, social and technological aspects of memory. Recent developments in our understanding of memory processes and mechanisms, and their digital implementation, have placed the encoding, storage, management and retrieval of information at the forefront of several fields of research. At the same time, the divisions between the biological, physical and the digital worlds seem to be dissolving. Hence opportunities for interdisciplinary research into memory are being created, between the life sciences, social sciences and physical sciences. Such research may benefit from immediate application into information management technology as a testbed. The paper describes one initiative, Memories for Life, as a potential common problem space for the various interested disciplines

    Hierarchical Associative Memory Based on Oscillatory Neural Network

    In this thesis we explore algorithms and develop architectures based on emerging nano-device technologies for cognitive computing tasks such as recognition, classification, and vision. In particular we focus on pattern matching in high-dimensional vector spaces to address the nearest neighbor search problem. Recent progress in nanotechnology provides us with novel nano-devices whose special nonlinear response characteristics fit cognitive tasks better than general-purpose computing. We build an associative memory (AM) by weakly coupling nano-oscillators as an oscillatory neural network and design a hierarchical tree structure to organize groups of AM units. For hierarchical recognition, we first examine an architecture where image patterns are partitioned into different receptive fields and processed by individual AM units in lower levels, and then abstracted using sparse coding techniques for recognition at higher levels. A second tree-structure model is developed as a more scalable AM architecture for large data sets. In this model, patterns are classified by hierarchical k-means clustering and organized in hierarchical clusters. Recognition then proceeds by comparing the input patterns with the centroids identified in the clustering process. The tree is explored in a "depth-only" manner until the closest image pattern is output. We also extend this search technique to incorporate a branch-and-bound algorithm. The models and corresponding algorithms are tested on two standard face recognition data sets. We show that the depth-only hierarchical model is very data-set dependent, performing with 97% or 67% recognition when compared to a single large associative memory, while the branch-and-bound search increases time by only a factor of two compared to the depth-only search.
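The tree-search model described above can be sketched in software terms: patterns are clustered recursively with k-means, and a query descends the tree by following the nearest centroid at each level with no backtracking (the "depth-only" strategy). This is a software analogue only; the thesis implements the per-node comparisons with oscillatory AM hardware, and all names and parameters below are illustrative:

```python
import numpy as np

def build_tree(vectors, k=2, leaf_size=4, rng=None):
    """Recursively cluster patterns with naive k-means; each inner
    node keeps its centroids, each leaf keeps its raw patterns."""
    rng = rng or np.random.default_rng(0)
    if len(vectors) <= leaf_size:
        return {"leaf": vectors}
    centroids = vectors[rng.choice(len(vectors), k, replace=False)].astype(float)
    for _ in range(10):                       # a few Lloyd iterations
        d = np.linalg.norm(vectors[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = vectors[labels == j].mean(axis=0)
    nonempty = [j for j in range(k) if (labels == j).any()]
    if len(nonempty) < 2:                     # clustering failed to split
        return {"leaf": vectors}
    return {"centroids": centroids[nonempty],
            "children": [build_tree(vectors[labels == j], k, leaf_size, rng)
                         for j in nonempty]}

def depth_only_search(tree, query):
    """Descend by nearest centroid at each level (no backtracking),
    then return the closest stored pattern in the final leaf."""
    node = tree
    while "centroids" in node:
        d = np.linalg.norm(node["centroids"] - query, axis=1)
        node = node["children"][int(d.argmin())]
    leaf = node["leaf"]
    return leaf[np.linalg.norm(leaf - query, axis=1).argmin()]

# Two well-separated blobs of stored patterns.
data = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
                 [10, 10], [10, 11], [11, 10], [11, 11]], dtype=float)
tree = build_tree(data)
hit = depth_only_search(tree, np.array([10.2, 10.1]))
```

The data-set dependence reported above follows from this structure: depth-only search commits to one branch per level, so a query near a cluster boundary can descend into the wrong subtree, which is what the branch-and-bound extension repairs at the cost of extra comparisons.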