High-throughput variable-to-fixed entropy codec using selective, stochastic code forests
Efficient high-throughput (HT) compression algorithms are paramount to meet the stringent constraints of present and upcoming data storage, processing, and transmission systems. In particular, latency, bandwidth, and energy requirements are critical for those systems. Most HT codecs are designed to maximize compression speed, and only secondarily to minimize compressed lengths. On the other hand, decompression speed is often equally or more critical than compression speed, especially in scenarios where decompression is performed multiple times and/or at critical parts of a system. In this work, an algorithm to design variable-to-fixed (VF) codes is proposed that prioritizes decompression speed. Stationary Markov analysis is employed to generate multiple, jointly optimized codes (denoted code forests). Their average compression efficiency is on par with the state of the art in VF codes, e.g., within 1% of Yamamoto et al.'s algorithm. The proposed code forest structure enables the implementation of highly efficient codecs, with decompression speeds 3.8 times faster than other state-of-the-art HT entropy codecs with equal or better compression ratios for natural data sources. Compared to these HT codecs, the proposed forests yield similar compression efficiency and speeds.
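The speed advantage of VF codes comes from the decoder side: because every codeword has the same fixed length, decompression reduces to direct table lookups. The following is a minimal illustrative sketch using a classic Tunstall dictionary, not the paper's code-forest algorithm; the alphabet, probabilities, and function names are invented for illustration.

```python
import heapq

def build_tunstall(probs, codeword_bits):
    """Greedy Tunstall dictionary (a classic VF code): repeatedly split the
    most probable phrase until the dictionary fills the fixed codeword space."""
    heap = [(-p, s) for s, p in probs.items()]  # max-heap of (-prob, phrase)
    heapq.heapify(heap)
    while len(heap) + len(probs) - 1 <= 2 ** codeword_bits:
        negp, phrase = heapq.heappop(heap)
        for s, p in probs.items():
            heapq.heappush(heap, (negp * p, phrase + s))
    return sorted(phrase for _, phrase in heap)

def vf_encode(text, phrases):
    """Parse the text into dictionary phrases (the variable-length side)."""
    index = {ph: i for i, ph in enumerate(phrases)}
    out, buf = [], ""
    for ch in text:
        buf += ch
        if buf in index:  # phrases are prefix-free, so a match is final
            out.append(index[buf])
            buf = ""
    return out, buf  # buf holds any unparsed tail

def vf_decode(codes, phrases):
    """Fixed-length codewords are direct table indices: one lookup per word.
    This is the operation that makes VF decompression fast."""
    return "".join(phrases[i] for i in codes)
```

A jointly optimized code forest generalizes this single dictionary to multiple coordinated dictionaries, but the decoder remains lookup-driven.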
Simulation techniques for cosmological simulations
Modern cosmological observations allow us to study in great detail the
evolution and history of the large scale structure hierarchy. The fundamental
problem of accurate constraints on the cosmological parameters, within a given
cosmological model, requires precise modelling of the observed structure. In
this paper we briefly review the current most effective techniques of large
scale structure simulations, emphasising both their advantages and
shortcomings. Starting with basics of the direct N-body simulations appropriate
to modelling cold dark matter evolution, we then discuss the direct-sum
technique GRAPE, particle-mesh (PM) and hybrid methods, combining the PM and
the tree algorithms. Simulations of baryonic matter in the Universe often use
hydrodynamic codes based on both particle methods that discretise mass, and
grid-based methods. We briefly describe Eulerian grid methods, and also some
variants of Lagrangian smoothed particle hydrodynamics (SPH) methods.
Comment: 42 pages, 16 figures, accepted for publication in Space Science Reviews, special issue "Clusters of galaxies: beyond the thermal view", Editor J.S. Kaastra, Chapter 12; work done by an international team at the International Space Science Institute (ISSI), Bern, organised by J.S. Kaastra, A.M. Bykov, S. Schindler & J.A.M. Bleeke
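The direct-sum technique evaluates all O(N^2) pairwise gravitational interactions exactly; GRAPE hardware accelerates precisely this inner loop. A minimal sketch of one direct-sum leapfrog step, assuming G = 1, Plummer softening, and arbitrary units (illustrative only, not any production simulation code):

```python
import numpy as np

def direct_sum_accel(pos, mass, eps=1e-2):
    """O(N^2) pairwise gravitational accelerations (G = 1), with Plummer
    softening eps to avoid the singularity at zero separation."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                      # vectors to all other bodies
        r2 = (d ** 2).sum(axis=1) + eps ** 2  # softened squared distances
        r2[i] = np.inf                        # exclude the self-interaction
        acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """Kick-drift-kick leapfrog, the standard symplectic N-body integrator."""
    vel = vel + 0.5 * dt * direct_sum_accel(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * direct_sum_accel(pos, mass)
    return pos, vel
```

Because the pairwise forces are antisymmetric, total momentum is conserved; tree and PM methods trade this exactness for an O(N log N) or O(N) force evaluation.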
General form of almost instantaneous fixed-to-variable-length codes
A general class of almost instantaneous fixed-to-variable-length (AIFV)
codes is proposed, which contains every binary code that can be constructed
when a finite number of bits of decoding delay is allowed. The contribution of the paper lies in
the following. (i) Introducing N-bit-delay AIFV codes, constructed from
multiple code trees with higher flexibility than the conventional AIFV codes.
(ii) Proving that the proposed codes can represent any uniquely-encodable and
uniquely-decodable variable-to-variable length codes. (iii) Showing how to
express codes as multiple code trees with minimum decoding delay. (iv)
Formulating the constraints of decodability as the comparison of intervals in
the real number line. The theoretical results in this paper are expected to be
useful for further study on AIFV codes.
Comment: submitted to IEEE Transactions on Information Theory. arXiv admin note: text overlap with arXiv:1607.07247 by other author
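Point (iv), expressing decodability constraints as interval comparisons, can be illustrated with the standard correspondence between a binary codeword and a dyadic interval in [0, 1): two codewords are confusable (one a prefix of the other) exactly when their intervals overlap. A small sketch of this check only, not the paper's full AIFV decodability conditions; the function names are invented for illustration.

```python
from fractions import Fraction

def interval(codeword):
    """Map a binary codeword c to its dyadic interval [0.c, 0.c + 2^-|c|)."""
    lo = Fraction(int(codeword, 2), 2 ** len(codeword))
    return lo, lo + Fraction(1, 2 ** len(codeword))

def overlaps(a, b):
    """Dyadic intervals overlap iff one codeword is a prefix of the other,
    so prefix-freeness reduces to comparisons on the real number line."""
    (alo, ahi), (blo, bhi) = interval(a), interval(b)
    return alo < bhi and blo < ahi
```

For example, '0' and '01' overlap (prefix relation), while '01' and '10' occupy disjoint intervals and can be distinguished instantaneously.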
Simulation of networks of spiking neurons: A review of tools and strategies
We review different aspects of the simulation of spiking neural networks. We
start by reviewing the different types of simulation strategies and algorithms
that are currently implemented. We next review the precision of those
simulation strategies, in particular in cases where plasticity depends on the
exact timing of the spikes. We overview different simulators and simulation
environments presently available (restricted to those freely available, open
source and documented). For each simulation tool, its advantages and pitfalls
are reviewed, with an aim to allow the reader to identify which simulator is
appropriate for a given task. Finally, we provide a series of benchmark
simulations of different types of networks of spiking neurons, including
Hodgkin-Huxley type, integrate-and-fire models, interacting with current-based
or conductance-based synapses, using clock-driven or event-driven integration
strategies. The same set of models is implemented on the different simulators,
and the codes are made available. The ultimate goal of this review is to
provide a resource to facilitate identifying the appropriate integration
strategy and simulation tool to use for a given modeling problem related to
spiking neural networks.
Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007)
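In a clock-driven strategy, all state variables advance on a fixed time grid, so spike times are aligned to that grid; event-driven schemes instead locate threshold crossings between events. A minimal clock-driven sketch of a single leaky integrate-and-fire neuron with forward-Euler integration (parameter values are illustrative assumptions, not taken from any benchmarked simulator):

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0, r_m=10.0):
    """Clock-driven leaky integrate-and-fire neuron (forward Euler).
    i_ext: input current (nA) sampled on the fixed dt (ms) grid.
    Returns the membrane-potential trace and grid-aligned spike times."""
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        dv = (-(v - v_rest) + r_m * i) / tau  # leak toward rest + drive
        v += dt * dv
        if v >= v_thresh:                     # threshold crossing on the grid
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes
```

The grid alignment is exactly the precision issue the review examines: when plasticity depends on exact spike timing, the O(dt) error in these crossing times can change the outcome, which motivates smaller steps, interpolation, or event-driven integration.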