
    Image Compression Using Cascaded Neural Networks

    Images form an increasingly large part of modern communications, creating a need for efficient and effective compression. Techniques developed for this purpose include transform coding, vector quantization and neural networks. In this thesis, a new neural network method is used to achieve image compression. This work extends the use of 2-layer neural networks to a combination of cascaded networks with one node in the hidden layer. A redistribution of the gray levels in the training phase is implemented in a random fashion to make the minimization of the mean square error applicable to a broad range of images. The computational complexity of this approach is analyzed in terms of the overall number of weights and overall convergence. Image quality is measured objectively, using peak signal-to-noise ratio, and subjectively, using perception. The effects of different image contents and compression ratios are assessed. Results show the performance superiority of cascaded neural networks compared to that of fixed-architecture training paradigms, especially at high compression ratios. The proposed method is implemented in MATLAB. The results obtained, such as compression ratio and computing time of the compressed images, are presented.
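    The abstract gives only the outline of the scheme (block-based coding through a one-hidden-node bottleneck, cascading of such networks, and PSNR-based evaluation). The sketch below is a minimal NumPy illustration of that idea, not the thesis's MATLAB implementation: it assumes linear activations, an 8 x 8 block size, residual-based cascading, and plain gradient descent, all of which are illustrative choices.

```python
# Minimal sketch, assuming linear activations and residual cascading; not the thesis's exact method.
import numpy as np

def to_blocks(img, b=8):
    """Split a grayscale image (values in [0, 1]) into flattened b x b blocks."""
    H, W = img.shape
    img = img[:H - H % b, :W - W % b]                  # crop to a multiple of the block size
    return img.reshape(H // b, b, -1, b).transpose(0, 2, 1, 3).reshape(-1, b * b)

def train_one_node(X, lr_scale=0.5, epochs=500, seed=0):
    """Fit a single-hidden-node linear autoencoder to the block matrix X by MSE gradient descent."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    w_enc = rng.normal(0.0, 0.1, d)                    # d -> 1 encoder weights
    w_dec = rng.normal(0.0, 0.1, d)                    # 1 -> d decoder weights
    lr = lr_scale / max(np.mean(np.sum(X * X, axis=1)), 1e-8)   # step size scaled to block energy
    for _ in range(epochs):
        h = X @ w_enc                                  # one compressed coefficient per block
        err = np.outer(h, w_dec) - X                   # reconstruction error
        g_dec, g_enc = err.T @ h / n, (err @ w_dec) @ X / n
        w_dec -= lr * g_dec
        w_enc -= lr * g_enc
    return w_enc, w_dec

def cascade_compress(X, stages=4):
    """Cascade one-node networks: each stage encodes the residual left by the previous one."""
    nets, codes, residual = [], [], X.copy()
    for s in range(stages):
        w_enc, w_dec = train_one_node(residual, seed=s)
        h = residual @ w_enc                           # this stage's compressed coefficients
        residual = residual - np.outer(h, w_dec)
        nets.append((w_enc, w_dec))
        codes.append(h)
    return nets, codes

def reconstruct(nets, codes, d):
    """Decode by summing each stage's contribution."""
    out = np.zeros((codes[0].size, d))
    for (_, w_dec), h in zip(nets, codes):
        out += np.outer(h, w_dec)
    return out

def psnr(orig, recon, peak=1.0):
    """Peak signal-to-noise ratio in dB."""
    return 10 * np.log10(peak ** 2 / np.mean((orig - recon) ** 2))
```

    In this sketch, keeping 4 cascade coefficients per 64-pixel block corresponds to a nominal 16:1 compression ratio (ignoring weight storage and quantization), and psnr() gives the objective quality figure the abstract refers to.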

    Neural Encoding and Decoding with Deep Learning for Natural Vision

    The overarching objective of this work is to bridge neuroscience and artificial intelligence to ultimately build machines that learn, act, and think like humans. In the context of vision, the brain enables humans to readily make sense of the visual world, e.g. recognizing visual objects. Developing human-like machines requires understanding the working principles underlying human vision. In this dissertation, I ask how the brain encodes and represents dynamic visual information from the outside world, whether brain activity can be directly decoded to reconstruct and categorize what a person is seeing, and whether neuroscience theory can be applied to artificial models to advance computer vision. To address these questions, I used deep neural networks (DNN) to establish encoding and decoding models describing the relationships between the brain and the visual stimuli. Using the DNN, the encoding models were able to predict the functional magnetic resonance imaging (fMRI) responses throughout the visual cortex given video stimuli; the decoding models were able to reconstruct and categorize the visual stimuli based on fMRI activity. To further advance the DNN model, I have implemented a new bidirectional and recurrent neural network based on the predictive coding theory. As a theory in neuroscience, predictive coding explains the interaction among feedforward, feedback, and recurrent connections. The results showed that this brain-inspired model significantly outperforms feedforward-only DNNs in object recognition. These studies have a positive impact on understanding the neural computations underlying human vision and on improving computer vision with knowledge from neuroscience.
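    As a rough, hedged illustration of the encoding-model idea described above (mapping DNN-derived stimulus features to voxel responses), the sketch below fits a ridge regression from a feature matrix to fMRI-like data and scores it with voxel-wise correlation. The random placeholder data, the regularization strength, and the omission of hemodynamic-response modeling are simplifying assumptions, not the dissertation's actual pipeline.

```python
# Minimal sketch of a linear encoding model; feature extraction and HRF handling are omitted.
import numpy as np

def fit_encoding_model(F, Y, alpha=1.0):
    """Ridge regression from stimulus features F (time x features) to fMRI responses Y (time x voxels)."""
    d = F.shape[1]
    return np.linalg.solve(F.T @ F + alpha * np.eye(d), F.T @ Y)   # weights: features x voxels

def voxelwise_accuracy(Y_true, Y_pred):
    """Pearson correlation between measured and predicted response, one value per voxel."""
    Yt = Y_true - Y_true.mean(axis=0)
    Yp = Y_pred - Y_pred.mean(axis=0)
    return (Yt * Yp).sum(0) / np.sqrt((Yt ** 2).sum(0) * (Yp ** 2).sum(0))

# Random stand-ins for DNN features and fMRI data, purely to make the sketch runnable.
rng = np.random.default_rng(0)
F_train, F_test = rng.normal(size=(200, 50)), rng.normal(size=(60, 50))
W_true = rng.normal(size=(50, 10))
Y_train = F_train @ W_true + 0.5 * rng.normal(size=(200, 10))
Y_test = F_test @ W_true + 0.5 * rng.normal(size=(60, 10))

W = fit_encoding_model(F_train, Y_train, alpha=10.0)
print(voxelwise_accuracy(Y_test, F_test @ W).round(2))   # one prediction accuracy per voxel
```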

    Construction of a multi-scale spiking model of macaque visual cortex

    Understanding the relationship between structure and dynamics of the mammalian cortex is a key challenge of neuroscience. So far, it has been tackled in two ways: by modeling neurons or small circuits in great detail, and through large-scale models representing each area with a small number of differential equations. To bridge the gap between these two approaches, we construct a spiking network model extending earlier work on the cortical microcircuit by Potjans & Diesmann (2014) to all 32 areas of the macaque visual cortex in the parcellation of Felleman & Van Essen (1991). The model takes into account specific neuronal densities and laminar thicknesses of the individual areas. The connectivity of the model combines recently updated binary tracing data from the CoCoMac database (Stephan et al., 2001) with quantitative tracing data providing connection densities (Markov et al., 2014a) and laminar connection patterns (Stephan et al., 2001; Markov et al., 2014b). We estimate missing data using structural regularities such as the exponential decay of connection densities with distance between areas (Ercsey-Ravasz et al., 2013) and a fit of laminar patterns versus logarithmic ratios of neuron densities. The model integrates a large body of knowledge on the structure of macaque visual cortex into a consistent framework that allows for progressive refinement.
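    One concrete ingredient of the abstract is the exponential decay of connection densities with inter-areal distance, used to estimate missing connections. The sketch below shows, with made-up placeholder numbers rather than CoCoMac or Markov et al. data, how such a rule could be fit by least squares on log densities and then used to predict the density of an unmeasured pair.

```python
# Minimal sketch of the exponential distance rule: density ~ c * exp(-lam * distance).
# The distances and densities below are placeholders, not CoCoMac or Markov et al. data.
import numpy as np

def fit_exponential_distance_rule(distances, densities):
    """Least-squares fit of log(density) = log(c) - lam * distance; returns (c, lam)."""
    A = np.column_stack([np.ones_like(distances), -distances])
    coef, *_ = np.linalg.lstsq(A, np.log(densities), rcond=None)
    return np.exp(coef[0]), coef[1]

def predict_density(distance, c, lam):
    """Estimate the connection density of an unmeasured area pair from its distance."""
    return c * np.exp(-lam * distance)

d_known = np.array([5.0, 10.0, 20.0, 35.0, 50.0])        # inter-areal distances (mm), placeholder
rho_known = np.array([0.30, 0.12, 0.03, 0.004, 0.0008])  # measured connection densities, placeholder
c, lam = fit_exponential_distance_rule(d_known, rho_known)
print(f"decay constant lambda ~ {lam:.3f} per mm")
print(f"estimated density for an unmeasured 25 mm pair: {predict_density(25.0, c, lam):.4f}")
```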

    Micro-, Meso- and Macro-Connectomics of the Brain

    Neurosciences, Neurology

    27th Annual Computational Neuroscience Meeting (CNS*2018): Part One


    Prefrontal rhythms for cognitive control

    Goal-directed behavior requires flexible selection among action plans and updating behavioral strategies when they fail to achieve desired goals. Lateral prefrontal cortex (LPFC) is implicated in the execution of behavior-guiding rule-based cognitive control, while anterior cingulate cortex (ACC) is implicated in monitoring processes and updating rules. Rule-based cognitive control requires selective processing, while process monitoring benefits from combinatorial processing. I used a combination of computational and experimental methods to investigate how network oscillations and neuronal heterogeneity contribute to cognitive control through their effects on selective versus combinatorial processing modes in LPFC and ACC. First, I adapted an existing LPFC model to explore input frequency- and coherence-based output selection mechanisms for flexible routing of rate-coded signals. I show that the oscillatory states of input-encoding populations can exert a stronger influence over downstream competition than their activity levels. This enables an output driven by a weaker resonant input signal to suppress lower-frequency competing responses to stronger, less resonant (though possibly higher-frequency) input signals. While signals are encoded in population firing rates, output selection and signal routing can be governed independently by the frequency and coherence of oscillatory inputs and their correspondence with output resonant properties. Flexible response selection and gating can be achieved by oscillatory state control mechanisms operating on input-encoding populations. These dynamic mechanisms enable experimentally observed LPFC beta and gamma oscillations to flexibly govern the selection and gating of rate-coded signals for downstream read-out. Furthermore, I demonstrate how differential drives to distinct interneuron populations can switch working memory representations between asynchronous and oscillatory states that support rule-based selection. Next, I analyzed physiological data from the LeBeau laboratory and built a de novo model constrained by the biological data. Experimental data demonstrated that fast network oscillations in both the beta and gamma frequency bands could be elicited in vitro in ACC, and that neurons exhibited a wide range of intrinsic properties. Computational modeling of the ACC network revealed that the frequency of the generated network oscillation depended on the time course of inhibition. Principal cell heterogeneity broadened the range of frequencies generated by the model network. In addition, with different frequency inputs to two neuronal assemblies, heterogeneity decreased competition and increased spike coherence between the networks, thus conferring a combinatorial advantage on the network. These findings suggest that oscillating neuronal populations can support either response selection (routing) or combination, depending on the interplay between the kinetics of synaptic inhibition and the degree of heterogeneity of principal cell intrinsic conductances. Such differences may support functional differences between the roles of LPFC and ACC in cognitive control.
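    The abstract's finding that oscillation frequency depends on the time course of inhibition can be illustrated, very loosely, with a generic Wilson-Cowan excitatory-inhibitory rate model rather than the thesis's biophysical LPFC and ACC networks. The sketch below integrates the classic limit-cycle parameterization and reports the dominant frequency of the excitatory rate for a few inhibitory time constants; the drive, time constants, and spectral analysis choices are illustrative assumptions.

```python
# Toy Wilson-Cowan E-I rate model (not the thesis's biophysical networks); classic
# limit-cycle couplings, with the drive and time constants chosen for illustration.
import numpy as np

def sigmoid(x, a, theta):
    return 1.0 / (1.0 + np.exp(-a * (x - theta))) - 1.0 / (1.0 + np.exp(a * theta))

def simulate(tau_i, tau_e=8.0, T=1000.0, dt=0.1, P=1.25, Q=0.0):
    """Euler-integrate the excitatory/inhibitory rate equations; returns the E-rate trace."""
    c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0             # E->E, I->E, E->I, I->I couplings
    E, I = 0.1, 0.1
    steps = int(T / dt)
    trace = np.empty(steps)
    for t in range(steps):
        dE = (-E + (1.0 - E) * sigmoid(c1 * E - c2 * I + P, 1.3, 4.0)) / tau_e
        dI = (-I + (1.0 - I) * sigmoid(c3 * E - c4 * I + Q, 2.0, 3.7)) / tau_i
        E, I = E + dt * dE, I + dt * dI
        trace[t] = E
    return trace

def dominant_frequency(trace, dt=0.1, discard_ms=200.0):
    """Peak of the power spectrum (Hz) after discarding the initial transient."""
    x = trace[int(discard_ms / dt):]
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=dt * 1e-3)       # dt is in ms, so d is in seconds
    return freqs[1:][np.argmax(spec[1:])]              # skip the DC bin

for tau_i in (8.0, 12.0, 16.0):
    f = dominant_frequency(simulate(tau_i))
    print(f"tau_i = {tau_i:4.1f} ms -> dominant frequency ~ {f:5.1f} Hz")
```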