
    The sign rule and beyond: Boundary effects, flexibility, and noise correlations in neural population codes

    Over repeat presentations of the same stimulus, sensory neurons show variable responses. This "noise" is typically correlated between pairs of cells, and a question with a rich history in neuroscience is how these noise correlations impact the population's ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem - neural tuning curves, etc. - held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR): if noise correlations between pairs of neurons have the opposite sign to their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity; the same generality applies to our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the set of possible noise correlations. (3) We provide a novel set of necessary and sufficient conditions under which coding performance in the presence of noise is as good as it would be if no noise were present at all. Comment: 41 pages, 5 figures
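
The sign rule in (1) can be made concrete with a toy calculation. The sketch below is an illustration in the spirit of the abstract, not code from the paper: it uses the standard linear Fisher information I = f'^T Sigma^{-1} f' for two neurons whose tuning-curve slopes have the same sign, so their signal correlation is positive, and the SR then predicts that negative noise correlation outperforms independent noise.

```python
# A minimal numerical illustration (not from the paper) of the sign rule,
# using the linear Fisher information I = f'^T Sigma^{-1} f' for two neurons
# with Gaussian noise.  The tuning-curve slopes share a sign, so the signal
# correlation is positive; the sign rule predicts that negative noise
# correlation improves coding relative to independent noise.
import numpy as np

fprime = np.array([1.0, 0.8])   # tuning-curve derivatives at the stimulus (same sign)
sigma = np.array([1.0, 1.0])    # single-neuron noise standard deviations

def linear_fisher_info(rho_noise: float) -> float:
    """Linear Fisher information for a given noise correlation coefficient."""
    cov = np.array([
        [sigma[0] ** 2,                   rho_noise * sigma[0] * sigma[1]],
        [rho_noise * sigma[0] * sigma[1], sigma[1] ** 2],
    ])
    return float(fprime @ np.linalg.solve(cov, fprime))

for rho in (-0.5, 0.0, 0.5):
    print(f"noise correlation {rho:+.1f}: Fisher information = {linear_fisher_info(rho):.3f}")
# The largest value occurs at rho = -0.5, i.e. when the noise correlation has
# the opposite sign to the (positive) signal correlation, as the SR predicts.
```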

    Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels

    Within the framework of linear vector Gaussian channels with arbitrary signaling, closed-form expressions for the Jacobians of the minimum mean square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research in which the minimum mean square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, closed-form expressions for the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and differential entropy under different channel conditions, and also to derive a multivariate version of the entropy power inequality due to Costa. Comment: 33 pages, 2 figures. A shorter version of this paper is to appear in IEEE Transactions on Information Theory
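
The differentiation step that the abstract builds on is easiest to see in the scalar special case, via the I-MMSE relation of Guo, Shamai and Verdu: dI/dsnr = mmse(snr)/2. The sketch below is a scalar sanity check under a Gaussian input, not the paper's matrix-valued Hessian expressions; it verifies the relation numerically and shows that the second derivative is negative, i.e. I(snr) is concave in snr.

```python
# A scalar sanity check (not the paper's vector/matrix result): for
# Y = sqrt(snr) * X + N with X, N standard Gaussian, mutual information and
# MMSE have closed forms, and the I-MMSE relation dI/dsnr = mmse/2 is the
# kind of differentiation identity the paper builds on.  The second
# derivative is negative, i.e. I(snr) is concave in snr for this input.
import numpy as np

def mutual_info(snr):   # mutual information in nats for a Gaussian input
    return 0.5 * np.log1p(snr)

def mmse(snr):          # minimum mean square error for a Gaussian input
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-5
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)                    # numerical dI/dsnr
d2I = (mutual_info(snr + h) - 2 * mutual_info(snr) + mutual_info(snr - h)) / h**2  # numerical Hessian (scalar)

print(np.isclose(dI, 0.5 * mmse(snr), rtol=1e-6))   # I-MMSE relation holds
print(d2I < 0)                                      # concavity of I(snr)
```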

    One-way quantum key distribution: Simple upper bound on the secret key rate

    We present a simple method to obtain an upper bound on the achievable secret key rate in quantum key distribution (QKD) protocols that use only unidirectional classical communication during the public-discussion phase. This method is based on a necessary precondition for one-way secret key distillation: the legitimate users need to prove that there exists no quantum state having a symmetric extension that is compatible with the available measurement results. The main advantage of the obtained upper bound is that it can be formulated as a semidefinite program, which can be efficiently solved. We illustrate our results by analysing two well-known qubit-based QKD protocols: the four-state protocol and the six-state protocol. Recent results by Renner et al., Phys. Rev. A 72, 012332 (2005), also show that the given precondition is only necessary but not sufficient for unidirectional secret key distillation. Comment: 11 pages, 1 figure
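
The symmetric-extension precondition mentioned above can be phrased directly as a feasibility semidefinite program. The sketch below is a minimal illustration, assuming numpy and cvxpy are available; it is not the paper's full key-rate bound. It checks whether a given two-qubit state rho_AB admits an extension to systems A, B, B' that is positive semidefinite, has rho_AB as its marginal, and is invariant under swapping B and B'; states that pass this test cannot yield a one-way secret key.

```python
# Minimal sketch (an assumed illustration, not the paper's method): a
# feasibility SDP testing whether a two-qubit state rho_AB has a symmetric
# extension to A, B, B'.  Requires numpy and cvxpy.
import numpy as np
import cvxpy as cp

def has_symmetric_extension(rho_ab: np.ndarray) -> bool:
    """Check whether rho_AB (4x4, qubits A and B) extends symmetrically to A, B, B'."""
    d = 8  # dim(A ⊗ B ⊗ B') = 2 * 2 * 2
    rho = cp.Variable((d, d), hermitian=True)

    # Kraus-like operators (I_A ⊗ I_B ⊗ <k|_B') that trace out the last qubit.
    kets = [np.array([[1.0, 0.0]]), np.array([[0.0, 1.0]])]
    trace_out_Bp = [np.kron(np.eye(4), k) for k in kets]          # each is 4x8
    partial = sum(E @ rho @ E.conj().T for E in trace_out_Bp)     # Tr_{B'}(rho)

    # Swap of the two B subsystems (qubits 2 and 3).
    SWAP = np.array([[1, 0, 0, 0],
                     [0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1]], dtype=float)
    S = np.kron(np.eye(2), SWAP)                                  # 8x8

    constraints = [
        rho >> 0,               # positive semidefinite
        cp.trace(rho) == 1,     # normalized
        partial == rho_ab,      # marginal on A, B matches the given state
        S @ rho @ S.T == rho,   # invariant under swapping B and B'
    ]
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve(solver=cp.SCS)
    return problem.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

# A maximally entangled state has no symmetric extension (monogamy of
# entanglement), while the maximally mixed state trivially has one.
phi = np.zeros((4, 1)); phi[0] = phi[3] = 1 / np.sqrt(2)
print(has_symmetric_extension(phi @ phi.T))     # expected: False
print(has_symmetric_extension(np.eye(4) / 4))   # expected: True
```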

    Some upper and lower bounds on PSD-rank

    Positive semidefinite rank (PSD-rank) is a relatively new quantity with applications to combinatorial optimization and communication complexity. We first study several basic properties of PSD-rank, and then develop new techniques for showing lower bounds on the PSD-rank. All of these bounds are based on viewing a positive semidefinite factorization of a matrix M as a quantum communication protocol. These lower bounds depend on the entries of the matrix and not only on its support (the zero/nonzero pattern), overcoming a limitation of some previous techniques. We compare these new lower bounds with known bounds, and give examples where the new ones are better. As an application, we determine the PSD-rank of (approximations of) some common matrices. Comment: 21 pages
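
A positive semidefinite factorization of a nonnegative matrix M writes M[i, j] = Tr(A_i B_j) with every A_i and B_j positive semidefinite of size r, and the PSD-rank is the smallest such r. The sketch below shows a standard small example, not taken from the paper: a 3x3 matrix of ordinary rank 3 that admits a size-2 PSD factorization, so its PSD-rank is at most 2.

```python
# A small illustration (a standard example, not from the paper) of a PSD
# factorization: M[i, j] = Tr(A_i B_j) with all A_i, B_j positive semidefinite.
# The matrix M[i, j] = (x_i - x_j)^2 on {0, 1, 2} has ordinary rank 3 but
# admits 2x2 rank-one PSD factors, so its PSD-rank is at most 2.
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
M = (xs[:, None] - xs[None, :]) ** 2              # M[i, j] = (x_i - x_j)^2

A = [np.outer([1.0, x], [1.0, x]) for x in xs]    # A_i = (1, x_i)(1, x_i)^T, PSD
B = [np.outer([y, -1.0], [y, -1.0]) for y in xs]  # B_j = (y_j, -1)(y_j, -1)^T, PSD

M_rec = np.array([[np.trace(Ai @ Bj) for Bj in B] for Ai in A])

print(np.linalg.matrix_rank(M))    # 3: ordinary rank
print(np.allclose(M, M_rec))       # True: size-2 PSD factorization, so PSD-rank <= 2
```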