
    Homeostatic structural plasticity increases the efficiency of small-world networks

    In networks with small-world topology, which are characterized by a high clustering coefficient and a short characteristic path length, information can be transmitted efficiently and at relatively low cost. The brain is composed of small-world networks, and evolution may have optimized brain connectivity for efficient information processing. Despite many studies on the impact of topology on information processing in neuronal networks, little is known about the development of network topology and the emergence of efficient small-world networks. We investigated how a simple growth process that favors short-range connections over long-range connections, combined with a synapse formation rule that generates homeostasis in postsynaptic firing rates, shapes neuronal network topology. Interestingly, we found that small-world networks benefited from homeostasis by an increase in efficiency, defined as the average inverse shortest path length through the network. Efficiency increased in particular as small-world networks approached the desired level of electrical activity. Ultimately, homeostatic small-world networks became almost as efficient as random networks. The increase in efficiency was caused by an emergent property of the homeostatic growth process: neurons started forming more long-range connections, albeit at a low rate, once their electrical activity was close to the homeostatic set-point. Although global network topology continued to change when neuronal activities were around the homeostatic equilibrium, the small-world property of the network was maintained over the entire course of development. Our results may help to explain how complex systems such as the brain can set up an efficient network topology in a self-organizing manner. Insights from our work may also lead to novel techniques for constructing large-scale neuronal networks by self-organization.
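The efficiency measure used in this abstract, the average inverse shortest path length, is straightforward to compute for an unweighted graph. A minimal sketch (the function name and the toy ring graph are illustrative, not taken from the paper):

```python
from collections import deque

def global_efficiency(adj):
    """Global efficiency: mean of 1/d(i, j) over all ordered node pairs,
    where d is the shortest-path length; unreachable pairs contribute 0."""
    n = len(adj)
    total = 0.0
    for src in adj:
        # BFS from src gives shortest path lengths in an unweighted graph
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# A 4-node ring: each node connects to its two neighbours
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
```

Adding a few long-range "shortcut" edges to such a ring is exactly what raises this number toward the value of a random graph, which is the effect the abstract reports.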

    An Algorithm for Finding Candidate Synaptic Sites in Computer Generated Networks of Neurons with Realistic Morphologies

    Neurons make synaptic connections at locations where axons and dendrites are sufficiently close in space. Typically, the required proximity is based on the dimensions of dendritic spines and axonal boutons. Based on this principle, one can search for such locations in networks formed by reconstructed or computer-generated neurons. Candidate synapses are then located where axons and dendrites lie within a given criterion distance of each other. Both experimentally reconstructed and model-generated neurons are usually represented morphologically by piecewise-linear structures (line pieces or cylinders). Proximity tests are then performed on all pairs of line pieces from axonal and dendritic branches. Applying only a distance test between line pieces may produce local clusters of synaptic sites when more than one pair of nearby line pieces from axonal and dendritic branches is sufficiently close, and may introduce a dependency on the length scale of the individual line pieces. The present paper describes a new algorithm for defining locations of candidate synapses that is based on the crossing requirement of a line piece pair, while the orthogonal distance between the line pieces is subjected to the distance criterion for testing 3D proximity.
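The geometric primitive that any such proximity test builds on is the minimal (orthogonal) distance between two 3D line pieces. The sketch below is not the paper's algorithm, only this underlying primitive; the clamped two-pass solver is a common simplification, and a production implementation would enumerate the boundary cases exhaustively:

```python
import numpy as np

def segment_distance(p0, p1, q0, q1):
    """Minimal distance between 3D segments p0-p1 and q0-q1 (np.array inputs)."""
    u, v, w = p1 - p0, q1 - q0, p0 - q0
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w, v @ w
    denom = a * c - b * b
    if denom > 1e-12:                       # segments not parallel
        s = np.clip((b * e - c * d) / denom, 0.0, 1.0)
    else:                                   # parallel or degenerate
        s = 0.0
    # Re-project each parameter onto [0, 1] given the other (approximation)
    t = np.clip((b * s + e) / c, 0.0, 1.0) if c > 1e-12 else 0.0
    s = np.clip((b * t - d) / a, 0.0, 1.0) if a > 1e-12 else 0.0
    return np.linalg.norm((p0 + s * u) - (q0 + t * v))

# Perpendicular "crossing" segments at orthogonal distance 1
a0, a1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
b0, b1 = np.array([0.5, -0.5, 1.0]), np.array([0.5, 0.5, 1.0])
```

A candidate synapse would then be accepted when this distance falls below the criterion distance and, per the paper's additional requirement, the closest approach occurs at a genuine crossing rather than at segment endpoints.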

    Mathematical modelling and numerical simulation of the morphological development of neurons

    BACKGROUND: The morphological development of neurons is a very complex process involving both genetic and environmental components. Mathematical modelling and numerical simulation are valuable tools in helping us unravel particular aspects of how individual neurons grow their characteristic morphologies and eventually form appropriate networks with each other. METHODS: A variety of mathematical models that consider (1) neurite initiation, (2) neurite elongation, (3) axon pathfinding, and (4) neurite branching and dendritic shape formation are reviewed. The different mathematical techniques employed are also described. RESULTS: Some comparison of modelling results with experimental data is made. A critique of the different modelling techniques is given, leading to a proposal for a unified modelling environment for models of neuronal development. CONCLUSION: A unified mathematical and numerical simulation framework should lead to an expansion of work on models of neuronal development, as has occurred with compartmental models of neuronal electrical activity.
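To give a flavour of the modelling style reviewed here, the toy model below has terminal segments elongate at a constant rate and branch with a probability diluted by the number of competing growth cones, loosely in the spirit of Van Pelt-style stochastic dendritic growth models. All names and parameter values are illustrative, not drawn from any specific model in the review:

```python
import random

def grow_neurite(n_steps, dt=0.1, v=1.0, b0=0.05, seed=0):
    """Toy stochastic growth: each terminal elongates by v*dt per step and
    branches with probability b0*dt/n, where n is the current number of
    terminals (a simple stand-in for competition between growth cones)."""
    rng = random.Random(seed)
    terminals = [0.0]                        # lengths of terminal segments
    for _ in range(n_steps):
        n = len(terminals)
        for i in range(n):
            terminals[i] += v * dt           # elongation
            if rng.random() < b0 * dt / n:   # branching event
                terminals.append(0.0)        # new daughter growth cone
    return terminals
```

Running such a model repeatedly and comparing the distribution of terminal counts and lengths against reconstructed neurons is the kind of model-versus-data comparison the review discusses.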

    Volumetric measurement of pulmonary nodules at low-dose chest CT: effect of reconstruction setting on measurement variability

    To assess volumetric measurement variability of pulmonary nodules detected at low-dose chest CT with three reconstruction settings, the volume of 200 solid pulmonary nodules was measured three times using commercially available semi-automated software on low-dose chest CT data-sets reconstructed with 1 mm section thickness and a soft kernel (A), 2 mm and a soft kernel (B), and 2 mm and a sharp kernel (C), respectively. Repeatability coefficients of the three measurements within each setting were calculated with the Bland-Altman method. A three-level model was applied to test the impact of reconstruction setting on the measured volume. The repeatability coefficients were 8.9%, 22.5% and 37.5% for settings A, B and C, respectively. Three-level analysis showed that settings A and C yielded a 1.29 times higher estimate of nodule volume than setting B (P = 0.03). The significant interaction among setting, nodule location and morphology demonstrated that the effect of the reconstruction setting differed for different types of nodules. Low-dose CT reconstructed with 1 mm section thickness and a soft kernel provided the most repeatable volume measurements. The wide, nodule-type-dependent range of agreement between volume measurements obtained with different reconstruction settings suggests that strict consistency of settings is required for serial CT studies.
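A common Bland-Altman formulation of the repeatability coefficient for paired repeat measurements, expressed as a percentage of nodule volume, can be sketched as follows (the paper uses three repeated measurements per setting, so its exact variant may differ from this two-measurement form):

```python
import math

def repeatability_coefficient(m1, m2):
    """Bland-Altman style repeatability coefficient, as a percentage:
    1.96 * sample SD of the pairwise relative volume differences."""
    rel_diffs = [2.0 * (a - b) / (a + b) for a, b in zip(m1, m2)]
    mean = sum(rel_diffs) / len(rel_diffs)
    sd = math.sqrt(sum((d - mean) ** 2 for d in rel_diffs)
                   / (len(rel_diffs) - 1))
    return 1.96 * sd * 100.0
```

Under this definition, two repeat measurements of the same nodule are expected to differ by less than the repeatability coefficient in about 95% of cases, which is how figures such as the 8.9% for setting A are read.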

    Novel Candidate Genes Associated with Hippocampal Oscillations

    The hippocampus is critical for a wide range of emotional and cognitive behaviors. Here, we performed the first genome-wide search for genes influencing hippocampal oscillations. We measured local field potentials (LFPs) using 64-channel multi-electrode arrays in acute hippocampal slices of 29 BXD recombinant inbred mouse strains. Spontaneous activity and carbachol-induced fast network oscillations were analyzed with spectral and cross-correlation methods, and the resulting traits were used for mapping quantitative trait loci (QTLs), i.e., regions on the genome that may influence hippocampal function. Using genome-wide hippocampal gene expression data, we narrowed the QTLs to eight candidate genes, including Plcb1, a phospholipase that is known to influence hippocampal oscillations. We also identified two genes coding for calcium channels, Cacna1b and Cacna1e, which mediate presynaptic transmitter release and have not previously been shown to regulate hippocampal network activity. Furthermore, we showed that the amplitude of the hippocampal oscillations is genetically correlated with hippocampal volume and several measures of novel environment exploration.
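The spectral traits used for QTL mapping reduce to power-spectrum features of the LFP. A minimal sketch of extracting the peak oscillation frequency from a trace (the synthetic 25 Hz signal is illustrative, not data from the study):

```python
import numpy as np

def peak_frequency(lfp, fs):
    """Frequency (Hz) at the maximum of the power spectrum of an LFP trace
    sampled at fs Hz; a minimal version of a spectral trait."""
    lfp = lfp - np.mean(lfp)                    # remove DC offset
    power = np.abs(np.fft.rfft(lfp)) ** 2       # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(lfp), d=1.0 / fs)
    return freqs[np.argmax(power)]

# Synthetic carbachol-like fast network oscillation at 25 Hz
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 25.0 * t)
```

Computing such a trait per slice and per strain, then correlating it with genotype across the 29 BXD strains, is the essence of the mapping step described above.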

    Dynamic Hebbian Cross-Correlation Learning Resolves the Spike Timing Dependent Plasticity Conundrum

    Spike timing-dependent plasticity (STDP) has been found to assume many different forms. The classic STDP curve, with one potentiating and one depressing window, is only one of many possible curves that describe synaptic learning using the STDP mechanism. It has been shown experimentally that STDP curves may contain multiple LTP and LTD windows of variable width, and even inverted windows. The underlying STDP mechanism that is capable of producing such an extensive, and apparently incompatible, range of learning curves is still under investigation. In this paper, it is shown that STDP originates from a combination of two dynamic Hebbian cross-correlations of local activity at the synapse. The correlation of the presynaptic activity with the local postsynaptic activity is a robust and reliable indicator of the discrepancy between the activities of the presynaptic and postsynaptic neurons. The second correlation, between the local postsynaptic activity and the dendritic activity, is a good indicator of matching local synaptic and dendritic activity. We show that this simple time-independent learning rule can give rise to many forms of the STDP learning curve. The rule regulates synaptic strength without the need for spike matching or other supervisory learning mechanisms. Local differences in dendritic activity at the synapse greatly affect the cross-correlation difference, which determines the relative contributions of different neural activity sources. Dendritic activity due to nearby synapses, forward- and back-propagating action potentials, and inhibitory synapses will dynamically modify the local activity at the synapse, and with it the resulting STDP learning rule. The dynamic Hebbian learning rule furthermore ensures that the resulting synaptic strength is dynamically stable, and that interactions between synapses do not result in local instabilities. The rule clearly demonstrates that synapses function as independent localized computational entities, each contributing to the global activity not in a simply linear fashion, but in a manner appropriate to achieve local and global stability of the neuron and the entire dendritic structure.
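Purely as an illustration of the idea, a weight update built from the two cross-correlations the abstract describes might take the following shape; the precise combination, signs, and normalization used in the paper may well differ:

```python
import numpy as np

def hebbian_crosscorr_update(pre, post_local, dendr, eta=0.01):
    """Hypothetical update combining two zero-lag cross-correlations:
    c1 = <pre * post_local>   (pre vs. local postsynaptic activity)
    c2 = <post_local * dendr> (local postsynaptic vs. dendritic activity)
    The weight change is driven by their difference."""
    c1 = np.mean(pre * post_local)
    c2 = np.mean(post_local * dendr)
    return eta * (c1 - c2)
```

In this toy form, the update vanishes when local and dendritic activity agree (c1 = c2), giving the kind of dynamically stable fixed point the abstract emphasizes.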

    The role of biases in on-line learning of two-layer networks

    The influence of biases on the learning dynamics of a two-layer neural network, a normalized soft-committee machine, is studied for on-line gradient descent learning. Within a statistical mechanics framework, numerical studies show that the inclusion of adjustable biases dramatically alters the learning dynamics found previously. The symmetric phase, which was often predominant in the original model, all but disappears for a non-degenerate bias task. The extended model furthermore exhibits much richer dynamical behavior, e.g. attractive suboptimal symmetric phases even for realizable cases and noiseless data.
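A minimal sketch of the on-line learning setup for a normalized soft committee machine with adjustable biases (tanh is substituted here for the erf activation common in this literature, and all parameter values are illustrative):

```python
import numpy as np

def online_step(W, b, x, y, eta=0.05):
    """One on-line gradient descent step for a normalized soft committee
    machine f(x) = (1/K) * sum_k tanh(w_k . x + b_k), updating both the
    K x N weight matrix W and the bias vector b in place."""
    K = W.shape[0]
    h = W @ x + b                                # hidden-unit fields
    err = np.tanh(h).sum() / K - y               # output error on this example
    grad_h = err * (1.0 - np.tanh(h) ** 2) / K   # backprop through tanh
    W -= eta * np.outer(grad_h, x)               # weight update
    b -= eta * grad_h                            # biases learn alongside weights
    return 0.5 * err ** 2                        # instantaneous squared error

# Two hidden units, three inputs; weights and biases start at zero
W = np.zeros((2, 3))
b = np.zeros(2)
```

The statistical mechanics analysis tracks order parameters (overlaps between student and teacher weight vectors) averaged over random inputs x rather than individual updates like this one, but each simulated trajectory is generated by exactly this kind of step.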