311 research outputs found
Turing Pattern Dynamics for Spatiotemporal Models with Growth and Curvature
Turing theory plays an important role in real biological pattern formation problems, such as solid tumor growth and animal coat patterns. To understand how patterns form and develop over time due to growth, we consider spatiotemporal patterns, in particular Turing patterns, for reaction-diffusion systems on growing surfaces with curvature. Of particular interest is isotropic growth of the sphere, where growth of the domain occurs in the same proportion in all directions. Applying a modified linear stability analysis and a separation of timescales argument, we derive the necessary and sufficient conditions for a diffusion-driven instability of the steady state and for the emergence of spatial patterns. Finally, we explore these results using numerical simulations
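The diffusion-driven instability conditions derived here can be illustrated in the simplest classical setting: Schnakenberg kinetics on a static domain. This is a sketch only, not the paper's growing-sphere analysis; the parameters `a`, `b`, `d` and the uniform dilution rate `rho` (a crude stand-in for slow growth) are illustrative.

```python
def turing_unstable(a, b, d, rho=0.0):
    """Check the classical Turing (diffusion-driven instability) conditions
    for Schnakenberg kinetics f = a - u + u^2 v, g = b - u^2 v with
    diffusion ratio d = D_v / D_u.  A uniform dilution rate rho shifts the
    Jacobian diagonal, mimicking slow isotropic growth.  Returns True if
    the homogeneous steady state is stable without diffusion but unstable
    to some spatial mode."""
    u, v = a + b, b / (a + b) ** 2          # homogeneous steady state
    fu = -1.0 + 2.0 * u * v - rho           # Jacobian entries (with dilution)
    fv = u ** 2
    gu = -2.0 * u * v
    gv = -u ** 2 - rho
    tr, det = fu + gv, fu * gv - fv * gu
    stable_without_diffusion = tr < 0 and det > 0
    h = d * fu + gv                         # controls the fastest-growing mode
    unstable_with_diffusion = h > 0 and h * h - 4.0 * d * det > 0
    return stable_without_diffusion and unstable_with_diffusion
```

With the classical choice `a = 0.1, b = 0.9` a sufficiently large diffusion ratio (e.g. `d = 10`) yields patterns, while equal diffusion (`d = 1`) does not, reflecting that differential diffusion is essential for the instability.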
A circuit mechanism for independent modulation of excitatory and inhibitory firing rates after sensory deprivation
Diverse interneuron subtypes shape sensory processing in mature cortical circuits. During development, sensory deprivation evokes powerful synaptic plasticity that alters circuitry, but how different inhibitory subtypes modulate circuit dynamics in response to this plasticity remains unclear. We investigate how deprivation-induced synaptic changes affect excitatory and inhibitory firing rates in a microcircuit model of the sensory cortex with multiple interneuron subtypes. We find that with a single interneuron subtype (parvalbumin-expressing [PV]), excitatory and inhibitory firing rates can only be comodulated, that is, increased or decreased together. Explaining the experimentally observed independent modulation, whereby one firing rate increases while the other decreases, requires strong feedback from a second interneuron subtype (somatostatin-expressing [SST]). Our model applies to the visual and somatosensory cortex, suggesting a general mechanism across sensory cortices. Therefore, we provide a mechanistic explanation for the differential role of interneuron subtypes in regulating firing rates, contributing to the already diverse roles they serve in the cortex
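The comodulation-versus-independence distinction can be sketched with a minimal linearized rate network. This is not the authors' model; the weight values and connectivity are illustrative assumptions (SST inhibiting both E and PV, and receiving drive from E).

```python
import numpy as np

# Steady states of a linear rate network  tau dr/dt = -r + W r + h,
# i.e.  r* = (I - W)^{-1} h.  All weights are illustrative, not fitted.

def steady_state(W, h):
    return np.linalg.solve(np.eye(len(h)) - W, h)

# E-PV circuit: extra drive to E moves both rates in the same direction.
W2 = np.array([[1.0, -2.0],            # E  <- E, PV
               [2.0, -1.0]])           # PV <- E, PV
dr2 = steady_state(W2, np.array([1.0, 0.0]))   # response to +1 input to E
# dr2 = [0.5, 0.5]: comodulation only.

# Adding SST (inhibits E and PV, driven by E): input to SST can push
# E up while PV goes down, i.e. independent modulation.
W3 = np.array([[1.0, -2.0, -1.0],      # E   <- E, PV, SST
               [2.0, -1.0, -2.0],      # PV  <- E, PV, SST
               [1.0,  0.0,  0.0]])     # SST <- E
dr3 = steady_state(W3, np.array([0.0, 0.0, 1.0]))  # +1 input to SST
# dr3 = [1.0, -1.0, 2.0]: E increases while PV decreases.
```

The design choice here mirrors the abstract's logic: with only one inhibitory population driven by E, the response to a single perturbation has a fixed sign pattern, whereas a second inhibitory pathway opens up opposite-sign responses.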
Turing Patterns on Growing Spheres: The Exponential Case
We consider Turing patterns for reaction-diffusion systems on the surface of a growing sphere. In particular, we are interested in the effect of dynamic growth on the pattern formation. We consider exponential isotropic growth of the sphere and perform a linear stability analysis and compare the results with numerical simulations
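For exponential isotropic growth, the reaction-diffusion system on the sphere's surface takes a standard form in which growth rescales the Laplace-Beltrami operator and adds a dilution term (notation illustrative; the specific nondimensionalization in the paper may differ):

```latex
% Sphere of radius r(t) = r_0 e^{\rho t} (exponential isotropic growth).
% The local area element grows at rate 2\rho, giving dilution terms:
\frac{\partial u}{\partial t}
  = \frac{D_u}{r(t)^2}\,\Delta_{S}\, u + f(u,v) - 2\rho\, u, \qquad
\frac{\partial v}{\partial t}
  = \frac{D_v}{r(t)^2}\,\Delta_{S}\, v + g(u,v) - 2\rho\, v
```

Because the diffusion terms decay as $e^{-2\rho t}$ while the dilution rate stays constant, the effective pattern-forming conditions change over time, which is what makes the dynamic-growth case richer than the static sphere.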
Implications of single-neuron gain scaling for information transmission in networks
Summary: 

Many neural systems are equipped with mechanisms to efficiently encode sensory information. To represent natural stimuli with time-varying statistical properties, neural systems should adjust their gain to the inputs' statistical distribution. Such matching of dynamic range to input statistics has been shown to maximize the information transmitted by the output spike trains (Brenner et al., 2000, Fairhall et al., 2001). Gain scaling has not only been observed as a system response property, but also in single neurons in developing somatosensory cortex stimulated with currents of different amplitude (Mease et al., 2010). While gain scaling holds for cortical neurons at the end of the first post-natal week, at birth these neurons lack this property. The observed improvement in gain scaling coincides with the disappearance of spontaneous waves of activity in cortex (Conheim et al., 2010).

We studied how single-neuron gain scaling affects the dynamics of signal transmission in networks, using the developing cortex as a model. In a one-layer feedforward network, we showed that the absence of gain control made the network relatively insensitive to uncorrelated local input fluctuations. As a result, these neurons selectively and synchronously responded to large, slowly-varying correlated input: the slow build-up of synaptic noise generated in pacemaker circuits, which most likely triggers waves. Neurons in gain-scaling networks were more sensitive to the small-scale input fluctuations and responded asynchronously to the slow envelope. Thus, gain scaling both increases information in individual neurons about private inputs and allows the population average to encode the slow fluctuations in the input. Paradoxically, the synchronous firing that corresponds to wave propagation is associated with low information transfer. We therefore suggest that the emergence of gain scaling may help the system to increase information transmission on multiple timescales as sensory stimuli become important later in development.

Methods:

Networks with one and two layers, consisting of hundreds of model neurons, were constructed. The ability of single neurons to gain scale was controlled by changing the ratio of sodium to potassium conductances in Hodgkin-Huxley neurons (Mainen et al., 1995). The response of single-layer networks was studied with ramp-like stimuli whose slopes varied over several hundred milliseconds. Fast fluctuations were superimposed on this slowly-varying mean. The networks' responses were then tested with continuous stimuli. Gain-scaling networks captured the slow fluctuations in the inputs, while non-scaling networks simply thresholded the input. Quantifying information transmission confirmed that gain-scaling neurons transmit more information about the stimulus. With the two-layer networks we simulated a cortical network where waves could spontaneously emerge, propagate and degrade, depending on the gain-scaling properties of the neurons in the network
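The defining property being manipulated above can be stated compactly: a gain-scaling neuron's input-output curve depends on the stimulus only through its amplitude-normalized value, so curves measured at different input amplitudes collapse onto one. The sketch below uses a sigmoid as an illustrative stand-in for the Hodgkin-Huxley dynamics; all parameters are assumptions.

```python
import math

def gain_scaling_rate(s, sigma):
    """Toy gain-scaling neuron: the firing rate depends on the stimulus s
    only through s / sigma (sigma = input amplitude), so I/O curves for
    different amplitudes collapse.  Sigmoid is illustrative, not HH."""
    return 1.0 / (1.0 + math.exp(-s / sigma))

def fixed_threshold_rate(s, theta=1.0):
    """Toy non-scaling neuron: fixed threshold, no normalization, so its
    response simply thresholds the raw input."""
    return 1.0 / (1.0 + math.exp(-(s - theta)))

# Scale the stimulus and its amplitude together: the gain-scaling
# neuron's response is unchanged, the fixed-threshold neuron's is not.
r_small = gain_scaling_rate(0.5, sigma=1.0)
r_large = gain_scaling_rate(2.0, sigma=4.0)   # same s / sigma = 0.5
```

This is the collapse-versus-thresholding contrast the methods describe: the non-scaling neuron's output tracks absolute input level, which is why it locks onto the large slow envelope rather than the fast fluctuations.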
Emergence of synaptic organization and computation in dendrites
Single neurons in the brain exhibit astounding computational capabilities, which gradually emerge throughout development and enable them to become integrated into complex neural circuits. These capabilities derive in part from the precise arrangement of synaptic inputs on the neurons’ dendrites. While the full computational benefits of this arrangement are still unknown, a picture emerges in which synapses organize according to their functional properties across multiple spatial scales. In particular, on the local scale (tens of microns), excitatory synaptic inputs tend to form clusters according to their functional similarity, whereas on the scale of individual dendrites or the entire tree, synaptic inputs exhibit dendritic maps where excitatory synapse function varies smoothly with location on the tree. The development of this organization is supported by inhibitory synapses, which are carefully interleaved with excitatory synapses and can flexibly modulate activity and plasticity of excitatory synapses. Here, we summarize recent experimental and theoretical research on the developmental emergence of this synaptic organization and its impact on neural computations
Regulation of circuit organization and function through inhibitory synaptic plasticity
Diverse inhibitory neurons in the mammalian brain shape circuit connectivity and dynamics through mechanisms of synaptic plasticity. Inhibitory plasticity can establish excitation/inhibition (E/I) balance, control neuronal firing, and affect local calcium concentration, hence regulating neuronal activity at the network, single neuron, and dendritic level. Computational models can synthesize multiple experimental results and provide insight into how inhibitory plasticity controls circuit dynamics and sculpts connectivity by identifying phenomenological learning rules amenable to mathematical analysis. We highlight recent studies on the role of inhibitory plasticity in modulating excitatory plasticity, forming structured networks underlying memory formation and recall, and implementing adaptive phenomena and novelty detection. We conclude with experimental and modeling progress on the role of interneuron-specific plasticity in circuit computation and context-dependent learning
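One phenomenological learning rule of the kind referred to here can be sketched as a rate-based caricature of homeostatic inhibitory plasticity (in the spirit of Vogels et al., 2011): the inhibitory weight grows when the excitatory rate exceeds a target and shrinks below it, driving the circuit toward E/I balance. All parameter values are illustrative.

```python
# Homeostatic inhibitory plasticity, rate-based caricature:
# dw/dt ~ eta * r_I * (r_E - rho0) drives r_E to the target rate rho0.
h_E, r_I = 5.0, 2.0        # excitatory drive and (fixed) inhibitory rate
rho0, eta = 1.0, 0.05      # target excitatory rate, learning rate
w = 0.0                    # inhibitory weight onto E

for _ in range(200):
    r_E = max(0.0, h_E - w * r_I)      # rectified-linear excitatory rate
    w += eta * r_I * (r_E - rho0)      # inhibitory plasticity step

# At the fixed point r_E = rho0 and w = (h_E - rho0) / r_I = 2.0:
# inhibition has grown exactly enough to cancel the excess drive.
```

The fixed point makes the homeostatic character explicit: whatever the excitatory drive `h_E`, the learned weight settles at the value that returns the excitatory rate to its target.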
Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance
Despite advances in experimental and theoretical neuroscience, we are still trying to identify key biophysical details that are important for characterizing the operation of brain circuits. Biological mechanisms at the level of single neurons and synapses can be combined as ‘building blocks’ to generate circuit function. We focus on the importance of capturing multiple timescales when describing these intrinsic and synaptic components. Whether inherent in the ionic currents, the neuron’s complex morphology, or the neurotransmitter composition of synapses, these multiple timescales prove crucial for capturing the variability and richness of circuit output and enhancing the information-carrying capacity observed across nervous systems
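One concrete instance of multiple synaptic timescales is short-term depression: transmission is fast, but the synaptic resource recovers slowly, so the synapse's efficacy carries a memory of recent activity. The sketch below follows the spirit of Tsodyks-Markram short-term plasticity with illustrative parameters.

```python
import math

def depressing_synapse(spike_times_ms, U=0.5, tau_rec_ms=500.0):
    """Short-term depression: each spike releases a fraction U of the
    available resource x, which recovers toward 1 with the slow time
    constant tau_rec.  Fast release + slow recovery = two timescales."""
    x, last_t, releases = 1.0, None, []
    for t in spike_times_ms:
        if last_t is not None:                  # recover since last spike
            x = 1.0 - (1.0 - x) * math.exp(-(t - last_t) / tau_rec_ms)
        releases.append(U * x)                  # fast synaptic release
        x -= U * x                              # resource depletion
        last_t = t
    return releases

fast = depressing_synapse([0.0, 20.0, 40.0])    # rapid train: depresses
slow = depressing_synapse([0.0, 2000.0])        # long gap: mostly recovered
```

Because efficacy depends on the interval history, the same presynaptic rate produces different postsynaptic drive at different times, which is one way such building blocks enrich circuit output.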
Optimal Sensory Coding By Populations Of ON And OFF Neurons
In many sensory systems the neural signal is coded by multiple parallel pathways, suggesting an evolutionary fitness benefit of a general nature. A common pathway splitting is that into ON and OFF cells, responding to stimulus increments and decrements, respectively. According to efficient coding theory, sensory neurons have evolved to an optimal configuration for maximizing information transfer given the structure of natural stimuli and circuit constraints. Using the efficient coding framework, we describe two aspects of neural coding: how to optimally split a population into ON and OFF pathways, and how to allocate the firing thresholds of individual neurons given realistic noise levels, stimulus distributions and optimality measures. We find that populations of ON and OFF neurons convey equal information about the stimulus regardless of the ON/OFF mixture, once the thresholds are chosen optimally, independent of stimulus statistics and noise. However, an equal ON/OFF mixture is the most efficient as it uses the fewest spikes to convey this information. The optimal thresholds and coding efficiency, however, depend on noise and stimulus statistics if information is decoded by an optimal linear readout. With non-negligible noise, mixed ON/OFF populations reap significant advantages compared to a homogeneous population. The best coding performance is achieved by a unique mixture of ON/OFF neurons tuned to stimulus asymmetries and noise. We provide a theory for how different cell types work together to encode the full stimulus range using a diversity of response thresholds. The optimal ON/OFF mixtures derived from the theory accord with certain biases observed experimentally
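The mixture-independence of information can be illustrated in the simplest noiseless case (a toy sketch, not the paper's full model): N binary neurons with thresholds at the stimulus quantiles partition the stimulus axis into N+1 equiprobable intervals, and flipping any neuron's polarity from ON to OFF merely relabels the response patterns without merging any of them.

```python
import math

def pattern_entropy(polarity):
    """Response entropy (bits) of N noiseless binary neurons whose
    thresholds sit at the N stimulus quantiles, so the stimulus falls in
    each of the N+1 intervals with equal probability.  polarity[k] = +1
    (ON: fire above threshold k) or -1 (OFF: fire below it)."""
    N = len(polarity)
    counts = {}
    for i in range(N + 1):                       # stimulus in interval i
        pattern = tuple((i > k) if polarity[k] > 0 else (i <= k)
                        for k in range(N))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = N + 1
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

Every polarity assignment yields the maximal log2(N+1) bits, because an OFF neuron's response is just the logical negation of the corresponding ON neuron's, a lossless relabeling.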
Benefits of Pathway Splitting in Sensory Coding
In many sensory systems, the neural signal splits into multiple parallel pathways. For example, in the mammalian retina, ∼20 types of retinal ganglion cells transmit information about the visual scene to the brain. The purpose of this profuse and early pathway splitting remains unknown. We examine a common instance of splitting into ON and OFF neurons excited by increments and decrements of light intensity in the visual scene, respectively. We test the hypothesis that pathway splitting enables more efficient encoding of sensory stimuli. Specifically, we compare a model system with an ON and an OFF neuron to one with two ON neurons. Surprisingly, the optimal ON–OFF system transmits the same information as the optimal ON–ON system, if one constrains the maximal firing rate of the neurons. However, the ON–OFF system uses fewer spikes on average to transmit this information. This superiority of the ON–OFF system is also observed when the two systems are optimized while constraining their mean firing rate. The efficiency gain for the ON–OFF split is comparable with that derived from decorrelation, a well known processing strategy of early sensory systems. The gain can be orders of magnitude larger when the ecologically important stimuli are rare but large events of either polarity. The ON–OFF system also provides a better code for extracting information by a linear downstream decoder. The results suggest that the evolution of ON–OFF diversification in sensory systems may be driven by the benefits of lowering average metabolic cost, especially in a world in which the relevant stimuli are sparse
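The spike-saving advantage of the ON-OFF split can be seen in the same toy noiseless setting (illustrative, not the authors' full analysis): with thresholds at equiprobable quantiles, both arrangements distinguish the same stimulus intervals, but the ON-OFF pair fires fewer spikes on average.

```python
def mean_spike_count(polarity):
    """Average spikes per stimulus for binary neurons with thresholds at
    the N stimulus quantiles (N+1 equiprobable intervals).  polarity[k]
    is +1 for ON (fire above threshold k) or -1 for OFF (fire below)."""
    N = len(polarity)
    total = sum(sum((i > k) if polarity[k] > 0 else (i <= k)
                    for k in range(N))
                for i in range(N + 1))
    return total / (N + 1)

on_on  = mean_spike_count([+1, +1])   # spike counts 0,1,2 across intervals
on_off = mean_spike_count([-1, +1])   # spike counts 1,0,1 across intervals
# on_on = 1.0 spikes/stimulus vs. on_off = 2/3: same discrimination,
# lower average spike cost for the ON-OFF arrangement.
```

This captures the abstract's core comparison in miniature: the information is unchanged while the metabolic (spike) cost drops, and the saving grows when relevant stimuli are rare, large excursions of either polarity.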
ON THE NEED OF MOVING THE BOUNDARIES OF THE PRINCIPLE OF PROCESS ECONOMY IN CIVIL LITIGATION
With the longest commercial airline flight from one end of the world to the other taking only 17 and a half hours, it is unthinkable for some litigation to take years. At a time when almost every second teenager has a mobile phone with a million applications, it is inadmissible for the procedural legislator to “fail” to regulate the legal regime for online trials. Hence, litigation in the XXI century must adapt its speed to the trends of modern times in order to survive. Exactly for this reason, the subject of analysis of this paper is the new frontiers of the principle of process economy in litigation under the new draft Law on Civil Litigation of North Macedonia, as well as the need for its amendment in order to comply with modern European procedural standards