
    Neural Decision Boundaries for Maximal Information Transmission

    We consider here how to separate multidimensional signals into two categories, such that the binary decision transmits the maximum possible information about those signals. Our motivation comes from the nervous system, where neurons process multidimensional signals into a binary sequence of responses (spikes). In a small-noise limit, we derive a general equation for the decision boundary that locally relates its curvature to the probability distribution of inputs. We show that for Gaussian inputs the optimal boundaries are planar, but for non-Gaussian inputs the curvature is nonzero. As an example, we consider exponentially distributed inputs, which are known to approximate a variety of signals from the natural environment.
    Comment: 5 pages, 3 figures
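    As a toy illustration of the objective (a sketch, not the paper's derivation), the code below estimates the information a noisy binary decision carries about its inputs, $I(X;Y) = H(Y) - \mathbb{E}_x[H(Y|x)]$, by Monte Carlo, and checks that for symmetric Gaussian inputs a planar boundary through the origin beats shifted planes. The soft threshold norm.cdf(f(x)/sigma) and all constants are assumptions for the demo.

        import numpy as np
        from scipy.stats import norm

        def binary_entropy(p):
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

        def mutual_information(f, x, sigma=0.2):
            # I(X;Y) = H(Y) - E_x[H(Y|x)] for the noisy decision
            # P(Y=1|x) = Phi(f(x)/sigma), averaged over samples of x.
            p = norm.cdf(f(x) / sigma)
            return binary_entropy(p.mean()) - binary_entropy(p).mean()

        rng = np.random.default_rng(0)
        x = rng.standard_normal((100_000, 2))   # 2-D Gaussian inputs

        # Planar boundaries f(x) = x_0 - b: information is largest when the
        # plane bisects the input density (b = 0, so P(Y=1) = 1/2).
        for b in (0.0, 0.5, 1.0):
            print(b, mutual_information(lambda z: z[:, 0] - b, x))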

    Exact Resummations in the Theory of Hydrodynamic Turbulence: III. Scenarios for Anomalous Scaling and Intermittency

    Elements of the analytic structure of anomalous scaling and intermittency in fully developed hydrodynamic turbulence are described. We focus here on the structure functions of velocity differences that satisfy inertial-range scaling laws $S_n(R)\sim R^{\zeta_n}$, and the correlation of energy dissipation $K_{\epsilon\epsilon}(R) \sim R^{-\mu}$. The goal is to understand the exponents $\zeta_n$ and $\mu$ from first principles. In paper II of this series it was shown that the existence of an ultraviolet scale (the dissipation scale $\eta$) is associated with a spectrum of anomalous exponents that characterize the ultraviolet divergences of correlations of gradient fields. The leading scaling exponent in this family was denoted $\Delta$. The exact resummation of ladder diagrams resulted in the calculation of $\Delta$, which satisfies the scaling relation $\Delta = 2 - \zeta_2$. In this paper we continue our analysis and show that nonperturbative effects may introduce multiscaling (i.e. $\zeta_n$ not being linear in $n$) with the renormalization scale being the infrared outer scale of turbulence $L$. It is shown that deviations from K41 scaling of $S_n(R)$ ($\zeta_n \neq n/3$) must appear if the correlation of dissipation is mixing (i.e. $\mu > 0$). We derive an exact scaling relation $\mu = 2\zeta_2 - \zeta_4$. We present analytic expressions for $\zeta_n$ for all $n$ and discuss their relation to experimental data. One surprising prediction is that the time decay constant $\tau_n(R) \propto R^{z_n}$ of $S_n(R)$ scales independently of $n$: the dynamic scaling exponent $z_n$ is the same for all $n$th-order quantities, $z_n = \zeta_2$.
    Comment: PRE submitted, 22 pages + 11 figures, REVTeX. The Eps files of figures will be FTPed by request to [email protected]
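    A quick numeric sanity check of the two bridge relations quoted above, $\Delta = 2 - \zeta_2$ and $\mu = 2\zeta_2 - \zeta_4$ (an illustration only; the anomalous values in the second call are hypothetical, not the paper's predictions):

        # Bridge relations from the abstract: Delta = 2 - zeta_2 and
        # mu = 2*zeta_2 - zeta_4.
        def bridge_relations(zeta2, zeta4):
            return {"Delta": 2 - zeta2, "mu": 2 * zeta2 - zeta4}

        # K41 scaling zeta_n = n/3 gives mu = 0: the dissipation correlation
        # is not mixing and S_n(R) shows no multiscaling.
        print(bridge_relations(zeta2=2/3, zeta4=4/3))    # Delta = 4/3, mu = 0

        # Any concave (multiscaling) spectrum gives mu > 0; made-up values:
        print(bridge_relations(zeta2=0.70, zeta4=1.28))  # mu = 0.12 > 0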

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs. current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. When the underlying system is fixed, we derive relationships between the change of gain with respect to both mean and variance and the receptive fields obtained by reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity.
    Comment: 24 pages, 4 figures, 1 supporting information
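    A minimal sketch of the empirical linear/nonlinear model estimated by reverse correlation on white noise, as referenced above: the filter is the spike-triggered average (STA) and the gain curve is the binned firing probability versus the filtered stimulus. The generating system (filter shape, sigmoidal nonlinearity, every constant) is invented for the demo, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        T, L = 200_000, 40                        # stimulus length, filter length
        stim = rng.standard_normal(T)             # white-noise stimulus

        # Hypothetical "true" system used only to generate spikes.
        t = np.arange(L)
        true_filter = np.exp(-t / 8.0) * np.sin(t / 3.0)
        drive = np.convolve(stim, true_filter)[:T]
        p_spike = 1.0 / (1.0 + np.exp(-(drive - 2.0)))   # sigmoidal gain curve
        spikes = rng.random(T) < p_spike

        # Spike-triggered average: mean stimulus history preceding each spike.
        idx = np.nonzero(spikes)[0]
        idx = idx[idx >= L]
        sta = np.mean([stim[i - L + 1:i + 1][::-1] for i in idx], axis=0)

        # Empirical gain curve: P(spike | filtered stimulus), by binned ratio.
        s = np.convolve(stim, sta)[:T]
        bins = np.linspace(s.min(), s.max(), 30)
        all_counts, _ = np.histogram(s, bins)
        spk_counts, _ = np.histogram(s[spikes], bins)
        gain_curve = spk_counts / np.maximum(all_counts, 1)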

    Renormalization group and anomalous scaling in a simple model of passive scalar advection in compressible flow

    Field theoretical renormalization group methods are applied to a simple model of a passive scalar quantity advected by the Gaussian non-solenoidal (``compressible'') velocity field with the covariance $\propto \delta(t-t')\,|x-x'|^{\epsilon}$. Convective-range anomalous scaling for the structure functions and various pair correlators is established, and the corresponding anomalous exponents are calculated to order $\epsilon^2$ of the $\epsilon$ expansion. These exponents are non-universal, as a result of the degeneracy of the RG fixed point. In contrast to the case of a purely solenoidal velocity field (Obukhov--Kraichnan model), the correlation functions in the case at hand exhibit nontrivial dependence on both the IR and UV characteristic scales, and the anomalous scaling appears already at the level of the pair correlator. The powers of the scalar field without derivatives, whose critical dimensions determine the anomalous exponents, exhibit multifractal behaviour. The exact solution for the pair correlator is obtained; it is in agreement with the result obtained within the $\epsilon$ expansion. The anomalous exponents for passively advected magnetic fields are also presented in the first order of the $\epsilon$ expansion.
    Comment: 31 pages, REVTeX file. More detailed discussion of the one-dimensional case and comparison to the previous paper [20] are given; references updated. Results and formulas unchanged
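    For readability, the model and the scaling form in display notation; the advection-diffusion equation is the standard passive-scalar form (an assumption here), while the covariance and exponents restate the abstract:

        % Passive scalar advected by a compressible Gaussian velocity field:
        \partial_t \theta + (v \cdot \nabla)\theta = \kappa\,\nabla^2 \theta + f,
        \qquad
        \langle v(t,x)\, v(t',x') \rangle \propto \delta(t-t')\,|x-x'|^{\epsilon}.
        % Convective-range anomalous scaling of the structure functions,
        % with exponents \zeta_n computed to order \epsilon^2:
        S_n(r) \equiv \langle [\theta(x+r) - \theta(x)]^n \rangle \sim r^{\zeta_n}.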

    Particles and fields in fluid turbulence

    The understanding of fluid turbulence has considerably progressed in recent years. The application of the methods of statistical mechanics to the description of the motion of fluid particles, i.e. to the Lagrangian dynamics, has led to a new quantitative theory of intermittency in turbulent transport. The first analytical description of anomalous scaling laws in turbulence has been obtained. The underlying physical mechanism reveals the role of statistical integrals of motion in non-equilibrium systems. For turbulent transport, the statistical conservation laws are hidden in the evolution of groups of fluid particles and arise from the competition between the expansion of a group and the change of its geometry. By breaking the scale-invariance symmetry, the statistically conserved quantities lead to the observed anomalous scaling of transported fields. Lagrangian methods also shed new light on some practical issues, such as mixing and turbulent magnetic dynamo.
    Comment: 165 pages, review article for Rev. Mod. Phys.

    Stimulus-dependent maximum entropy models of neural population codes

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. To be able to infer a model for this distribution from large-scale neural recordings, we introduce a stimulus-dependent maximum entropy (SDME) model---a minimal extension of the canonical linear-nonlinear model of a single neuron, to a pairwise-coupled neural population. The model is able to capture the single-cell response properties as well as the correlations in neural spiking due to shared stimulus and due to effective neuron-to-neuron connections. Here we show that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. As a result, the SDME model gives a more accurate account of single-cell responses and in particular outperforms uncoupled models in reproducing the distributions of codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like surprise and information transmission in a neural population.
    Comment: 11 pages, 7 figures
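    A small sketch of the SDME form described above, $P(\sigma|s) \propto \exp(\sum_i h_i(s)\sigma_i + \sum_{i<j} J_{ij}\sigma_i\sigma_j)$, with the fields $h_i(s)$ taken here as plain linear projections of the stimulus (a simplification) and a tiny population so the partition function can be summed exactly; all parameter values are made up.

        import itertools
        import numpy as np

        rng = np.random.default_rng(2)
        N, D = 6, 20                                   # neurons, stimulus dimensions
        W = rng.standard_normal((N, D)) / np.sqrt(D)   # hypothetical receptive fields
        J = np.triu(0.3 * rng.standard_normal((N, N)), 1)  # pairwise couplings

        def codeword_probabilities(s):
            """Exact P(sigma|s) over all 2^N binary codewords for stimulus s."""
            h = W @ s                                  # stimulus-dependent fields h_i(s)
            words = np.array(list(itertools.product([0, 1], repeat=N)))
            energies = words @ h + np.einsum('ki,ij,kj->k', words, J, words)
            logp = energies - np.logaddexp.reduce(energies)   # normalize by log Z(s)
            return words, np.exp(logp)

        s = rng.standard_normal(D)                     # a white-noise stimulus frame
        words, p = codeword_probabilities(s)
        print("most likely codeword:", words[p.argmax()], "p =", p.max())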

    The Natural Variation of a Neural Code

    The way information is represented by sequences of action potentials of spiking neurons is determined by the input each neuron receives, but also by its biophysics, and the specifics of the circuit in which it is embedded. Even the “code” of identified neurons can vary considerably from individual to individual. Here we compared the neural codes of the identified H1 neuron in the visual systems of two families of flies, blow flies and flesh flies, and explored the effect of the sensory environment that the flies were exposed to during development on the H1 code. We found that the two families differed considerably in the temporal structure of the code, its content and energetic efficiency, as well as the temporal delay of neural response. The differences in the environmental conditions during the flies' development had no significant effect. Our results may thus reflect an instance of a family-specific design of the neural code. They may also suggest that individual variability in information processing by this specific neuron, in terms of both form and content, is regulated genetically.

    Learning with a network of competing synapses

    Competition between synapses arises in some forms of correlation-based plasticity. Here we propose a game-theory-inspired model of synaptic interactions whose dynamics is driven by competition between synapses in their weak and strong states, which are characterized by different timescales. Learning of inputs and memory are meaningfully definable in an effective description of networked synaptic populations. We study, numerically and analytically, the dynamic responses of the effective system to various signal types, particularly with reference to an existing empirical motor adaptation model. The dependence of the system-level behavior on the synaptic parameters and the signal strength is brought out clearly, illuminating issues such as optimal performance and the functional role of multiple timescales.
    Comment: 16 pages, 9 figures; published in PLoS ONE
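    The "existing empirical motor adaptation model" is not specified in the abstract; one common candidate is a two-timescale linear state-space model, sketched below with representative published parameter values. This is an assumption for illustration, not the paper's model.

        import numpy as np

        A_f, B_f = 0.59, 0.21    # fast process: quick learning, quick forgetting
        A_s, B_s = 0.992, 0.02   # slow process: slow learning, strong retention

        def simulate(perturbation):
            x_f = x_s = 0.0
            out = []
            for p in perturbation:
                e = p - (x_f + x_s)          # motor error on this trial
                x_f = A_f * x_f + B_f * e    # each process learns from the error
                x_s = A_s * x_s + B_s * e    # ...at its own timescale
                out.append(x_f + x_s)
            return np.array(out)

        # Adaptation to a constant perturbation, then washout (reversal):
        pert = np.r_[np.ones(200), -np.ones(50)]
        adaptation = simulate(pert)
        print(adaptation[[0, 199, 249]])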

    Spatially uninformative sounds increase sensitivity for visual motion change

    It has recently been shown that spatially uninformative sounds can cause a visual stimulus to pop out from an array of similar distractor stimuli when that sound is presented in temporal proximity to a feature change in the visual stimulus. Until now, this effect has predominantly been demonstrated by using stationary stimuli. Here, we extended these results by showing that auditory stimuli can also improve the sensitivity of visual motion change detection. To accomplish this, we presented moving visual stimuli (small dots) on a computer screen. At a random moment during a trial, one of these stimuli could abruptly move in an orthogonal direction. Participants’ task was to indicate whether such an abrupt motion change occurred or not by making a corresponding button press. If a sound (a short 1,000 Hz tone pip) co-occurred with the abrupt motion change, participants were able to detect this motion change more frequently than when the sound was not present. Using measures derived from signal detection theory, we were able to demonstrate that the effect on accuracy was due to increased sensitivity rather than to changes in response bias.
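    The signal-detection measures used above, in a minimal sketch: sensitivity $d' = z(H) - z(F)$ and criterion $c = -\tfrac{1}{2}(z(H) + z(F))$ from hit rate $H$ and false-alarm rate $F$; the trial counts are invented for illustration.

        from scipy.stats import norm

        def dprime_criterion(hits, misses, fas, crs):
            # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
            H = (hits + 0.5) / (hits + misses + 1.0)
            F = (fas + 0.5) / (fas + crs + 1.0)
            zH, zF = norm.ppf(H), norm.ppf(F)
            return zH - zF, -(zH + zF) / 2.0

        # Hypothetical counts: detection with vs. without the co-occurring tone pip.
        print(dprime_criterion(hits=80, misses=20, fas=15, crs=85))   # with sound
        print(dprime_criterion(hits=60, misses=40, fas=15, crs=85))   # without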