915 research outputs found

    Cooperative Lattice Coding and Decoding

    A novel lattice coding framework is proposed for outage-limited cooperative channels. This framework provides practical implementations of the optimal cooperation protocols proposed by Azarian et al. In particular, for the relay channel we implement a variant of the dynamic decode-and-forward protocol, which uses orthogonal constellations to reduce the channel seen by the destination to a single-input single-output time-selective one, while inheriting the same diversity-multiplexing tradeoff. This simplification allows the receiver to be built from traditional belief propagation or tree search architectures. Our framework also generalizes the coding scheme of Yang and Belfiore in the context of amplify-and-forward cooperation. For the cooperative multiple access channel, a tree coding approach, matched to the optimal linear cooperation protocol of Azarian et al., is developed. For this scenario, the MMSE-DFE Fano decoder is shown to enjoy an excellent tradeoff between performance and complexity. Finally, the utility of the proposed schemes is established via a comprehensive simulation study. Comment: 25 pages, 8 figures
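    The receiver architectures mentioned above ultimately amount to searching for the lattice point closest to the received vector. The sketch below is a toy brute-force version of that search over a small generator matrix of our own choosing; it is not the paper's MMSE-DFE Fano or tree-search decoder, whose pruning strategies are precisely what make the search practical.

import itertools
import numpy as np

def closest_lattice_point(G, y, radius=3):
    """Brute-force search for argmin_z ||y - G z|| over z in {-radius..radius}^n."""
    n = G.shape[1]
    best_z, best_dist = None, np.inf
    for z in itertools.product(range(-radius, radius + 1), repeat=n):
        z = np.asarray(z)
        dist = np.linalg.norm(y - G @ z)
        if dist < best_dist:
            best_z, best_dist = z, dist
    return best_z, best_dist

G = np.array([[2.0, 1.0], [0.0, 1.0]])   # toy 2-D lattice generator (our choice)
y = np.array([3.4, 1.2])                 # noisy received point
z_hat, dist = closest_lattice_point(G, y)
print("decoded integer coordinates:", z_hat, "lattice point:", G @ z_hat)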

    Response variability in balanced cortical networks

    We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons driven by excitatory input from an external population. The high connectivity permits a mean-field description in which synaptic currents can be treated as Gaussian noise whose mean and autocorrelation function are calculated self-consistently from the firing statistics of single model neurons. Within this description, we find that the irregularity of spike trains is controlled mainly by the strength of the synapses relative to the difference between the firing threshold and the post-firing reset level of the membrane potential. For moderately strong synapses we find spike statistics very similar to those observed in primary visual cortex. Comment: 22 pages, 7 figures, submitted to Neural Computation
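    To make the qualitative claim concrete, the sketch below simulates a single leaky integrate-and-fire neuron driven by Gaussian current noise, which is the single-neuron picture the mean-field description reduces to. All parameter values are illustrative assumptions of ours, not the authors'; the point is only that the interspike-interval irregularity (CV) becomes large when the noise strength is comparable to the gap between threshold and reset.

import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.1, 20_000.0         # time step and duration (ms)
tau_m = 20.0                  # membrane time constant (ms)
V_th, V_reset = 20.0, 10.0    # firing threshold and post-spike reset (mV)
mu, sigma = 19.0, 4.0         # mean and std of the Gaussian input (mV)

V, spikes, t = V_reset, [], 0.0
while t < T:
    # Ornstein-Uhlenbeck-style update: leak toward mu plus Gaussian noise
    V += dt * (mu - V) / tau_m + sigma * np.sqrt(2.0 * dt / tau_m) * rng.standard_normal()
    if V >= V_th:
        spikes.append(t)
        V = V_reset
    t += dt

isi = np.diff(spikes)
print("firing rate (Hz):", 1000.0 * len(spikes) / T)
print("CV of interspike intervals:", isi.std() / isi.mean())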

    Languages of Quantum Information Theory

    This note introduces notation and definitions for information theoretic quantities in the context of quantum systems, such as (conditional) entropy and (conditional) mutual information. We employ the natural C*-algebra formalism, and it turns out that there is a pervasive dualism of language: everything can be defined for (compatible) observables, but also for (compatible) C*-subalgebras. The two approaches are unified in the formalism of quantum operations, and they are connected by a very satisfying inequality generalizing the well-known Holevo bound. We then turn to communication via (discrete memoryless) quantum channels: we formulate the Fano inequality, bound the capacity region of quantum multiway channels, and comment on the quantum broadcast channel. Comment: 16 pages, REVTEX, typos corrected, references added and extended
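    Since the Holevo bound is the pivot of the abstract above, the short sketch below evaluates the Holevo quantity chi = S(rho_bar) - sum_i p_i S(rho_i) numerically for a toy qubit ensemble. The ensemble is an arbitrary example of ours, not taken from the paper; chi upper-bounds the information accessible by any measurement on the ensemble.

import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 log 0 = 0)
    return float(-(evals * np.log2(evals)).sum())

# Toy ensemble: |0><0| and |+><+|, each with probability 1/2 (our choice)
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.array([[1.0], [1.0]]) / np.sqrt(2.0)
rho1 = plus @ plus.T
probs, states = [0.5, 0.5], [rho0, rho1]

rho_bar = sum(p * rho for p, rho in zip(probs, states))
chi = von_neumann_entropy(rho_bar) - sum(p * von_neumann_entropy(rho)
                                         for p, rho in zip(probs, states))
print("Holevo quantity chi (bits):", round(chi, 4))   # ~0.6009 for this ensemble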

    Lempel Ziv Welch data compression using associative processing as an enabling technology for real time application

    Data compression refers to reducing the storage and/or transmission requirements of a data representation. A commonly used algorithm for compression is the Lempel-Ziv-Welch (LZW) method proposed by Terry A. Welch [1]. LZW is an adaptive, dictionary-based, lossless algorithm, which makes it a general compression mechanism applicable to a broad range of inputs. Furthermore, the lossless nature of LZW means the process is reversible: the original file or message is fully recoverable from its compressed form. A variant of this algorithm is currently the foundation of the UNIX compress program, and LZW is also one of the compression schemes defined in the TIFF standard [12] as well as in the CCITT V.42bis standard. One of the challenges in designing an efficient compression mechanism such as LZW for real-time applications is the speed of the search into the data dictionary. In this paper an Associative Processing (ASP) implementation of the LZW algorithm is presented, which provides an efficient solution to this requirement. Additionally, it is shown that Associative Processing allows for rapid and elegant development of the LZW algorithm that will generally outperform standard approaches in complexity, readability, and performance.
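    For reference, the sketch below is a plain software version of the standard LZW dictionary construction that the paper accelerates; it is not the Associative Processing implementation itself. The membership test on the growing dictionary in the inner loop is exactly the search step whose cost the ASP approach is designed to remove.

def lzw_compress(data: bytes) -> list:
    """Return the list of LZW output codes for `data` (dictionary grows unbounded)."""
    dictionary = {bytes([i]): i for i in range(256)}   # codes 0-255 = single bytes
    w, codes = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                            # keep extending the current match
        else:
            codes.append(dictionary[w])       # emit code for the longest match found
            dictionary[wc] = len(dictionary)  # add the new string to the dictionary
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))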

    Entropy and Certainty in Lossless Data Compression

    Data compression is the art of using encoding techniques to represent data symbols in less storage space than the original data representation. The encoding process builds a relationship between the entropy of the data and the certainty of the system, and the theoretical limits of this relationship are defined by Claude Shannon's theory of entropy in information. Lossless data compression is uniquely tied to entropy theory because both the data and the system have a static definition; this static nature requires a mechanism that reduces the entropy without altering either of these key components. This dissertation develops the Map of Certainty and Entropy (MaCE) to illustrate the entropy and certainty contained within an information system, and uses this concept to generate the proposed methods for prefix-free, lossless compression of static data. The first method, the Select Level Method (SLM), increases the efficiency of creating Shannon-Fano-Elias codes in terms of CPU cycles; SLM is developed using a sideways view of the compression environment provided by MaCE. This view is also used for the second contribution, Sort Linear Method Nivellate (SLMN), which combines the concepts of SLM with midpoints and a fitting function to increase the compression efficiency of SLM to code lengths satisfying L(x) < H(x) + 1. Finally, the third contribution, Jacobs, Ali, Kolibal Encoding (JAKE), extends SLM and SLMN to bases larger than binary to increase the compression even further while maintaining the same relative computational efficiency.
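    The L(x) < H(x) + 1 bound cited above is the classical benchmark for prefix-free codes. The short sketch below checks it numerically for a toy source using plain Shannon code lengths ceil(log2(1/p)); this is not SLM, SLMN, or JAKE, only the baseline bound those methods are measured against.

import math

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # toy source distribution (our choice)

entropy = -sum(p * math.log2(p) for p in probs.values())                  # H(X)
avg_len = sum(p * math.ceil(math.log2(1.0 / p)) for p in probs.values())  # E[L(x)]

print(f"H(X)               = {entropy:.3f} bits/symbol")
print(f"avg Shannon length = {avg_len:.3f} bits/symbol")
assert entropy <= avg_len < entropy + 1   # the prefix-free coding bound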

    Gamma Oscillations of Spiking Neural Populations Enhance Signal Discrimination

    Selective attention is an important filter for complex environments in which distractions compete with signals. Attention increases both the gamma-band power of cortical local field potentials and the spike-field coherence within the receptive field of an attended object. However, the mechanisms by which gamma-band activity enhances the encoding of input signals, if it does at all, are not well understood. We propose that gamma oscillations induce binomial-like spike-count statistics across noisy neural populations. Using simplified models of spiking neurons, we show how the discrimination of static signals based on the population spike-count response is improved with gamma-induced binomial statistics. These results give an important mechanistic link between the neural correlates of attention and the discrimination tasks in which attention is known to enhance performance. Further, they show how rhythmicity of spike responses can enhance coding schemes that are not temporally sensitive.
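    The core claim, that binomial-like spike-count statistics improve discrimination, can be illustrated with the toy comparison below. Under assumptions of ours (a homogeneous population with one spike opportunity per gamma cycle), binomial counts have lower variance than Poisson counts with the same mean, so the discriminability d' between two static signals is larger.

import numpy as np

rng = np.random.default_rng(1)
N, cycles, trials = 100, 20, 5000      # neurons, gamma cycles per trial, trials
p_weak, p_strong = 0.10, 0.14          # per-cycle spike probabilities for the two signals

def dprime(a, b):
    """Discriminability of two spike-count distributions."""
    return (b.mean() - a.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

samplers = {
    "Poisson (asynchronous) ": lambda p: rng.poisson(N * cycles * p, trials),
    "binomial (gamma-locked)": lambda p: rng.binomial(N * cycles, p, trials),
}
for name, draw in samplers.items():
    print(f"{name}: d' = {dprime(draw(p_weak), draw(p_strong)):.2f}")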