Dynamical complexity of short and noisy time series: Compression-Complexity vs. Shannon entropy
Shannon entropy has been extensively used for characterizing the complexity of time series arising from chaotic dynamical systems and stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures which are based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two such Compression-Complexity Measures, namely Lempel-Ziv complexity (LZ) and Effort-To-Compress (ETC), on short time series from chaotic dynamical systems in the presence of noise. Both
LZ and ETC outperform Shannon entropy (H) in accurately characterizing the dynamical complexity of such systems. For very short binary sequences
(which arise in neuroscience applications),
ETC has a higher number of distinct complexity values than
LZ and H, thus enabling a finer resolution. For two-state ergodic Markov chains, we empirically show that ETC
converges to a steady state value faster than LZ.
Compression-Complexity measures are promising for applications that involve short and noisy time series.
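To make the compared measures concrete, the sketch below shows one way to compute Shannon entropy, an LZ-style phrase count, and an NSRPS-based Effort-To-Compress estimate for a short binary string. This is a minimal, illustrative Python implementation, not the authors' code: the exact parsing rule, pair-counting convention, and normalizations used in the paper may differ.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy H (bits/symbol) from the empirical symbol distribution."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

def lz_complexity(seq):
    """Phrase count from an LZ78-style incremental parsing (a common proxy
    for Lempel-Ziv complexity; the paper's exact variant may differ)."""
    phrases, i, n = set(), 0, len(seq)
    while i < n:
        j = i + 1
        # grow the current phrase until it has not been seen before
        while j <= n and seq[i:j] in phrases:
            j += 1
        phrases.add(seq[i:j])
        i = j
    return len(phrases)

def etc_complexity(seq):
    """Effort-To-Compress sketch: number of NSRPS iterations (replace the most
    frequent adjacent pair with a new symbol) needed to reach a constant
    sequence, normalized by L - 1. Assumes the standard ETC definition."""
    s, L, steps = list(seq), len(seq), 0
    while len(s) > 1 and len(set(s)) > 1:
        pair = Counter(zip(s, s[1:])).most_common(1)[0][0]
        fresh = object()                      # a symbol guaranteed not to appear in s
        out, i = [], 0
        while i < len(s):                     # replace occurrences left to right
            if i + 1 < len(s) and (s[i], s[i + 1]) == pair:
                out.append(fresh)
                i += 2
            else:
                out.append(s[i])
                i += 1
        s, steps = out, steps + 1
    return steps / (L - 1) if L > 1 else 0.0

if __name__ == "__main__":
    x = "0110100110010110"                    # a short binary sequence
    print(shannon_entropy(x), lz_complexity(x), etc_complexity(x))
```

The example string and normalizations above are only for illustration; the reported results depend on the specific variants defined in the paper.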
Three Perspectives on Complexity: Entropy, Compression, Subsymmetry
There is no single universally accepted definition of `Complexity'. There are several perspectives on complexity and on what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. In this paper, we explore the following perspectives on complexity: effort-to-describe (Shannon entropy H, Lempel-Ziv complexity LZ), effort-to-compress (ETC complexity), and degree-of-order (Subsymmetry or SubSym). While Shannon entropy and LZ are very popular and widely used, ETC is a relatively new complexity measure. In this paper, we also propose a novel normalized complexity measure, SubSym, based on the existing idea of counting the number of subsymmetries or palindromes within a sequence. We compare the performance of these complexity measures
on the following tasks: (A) characterizing complexity of short binary sequences of lengths 4 to 16, (B) distinguishing periodic and chaotic time series from 1D logistic map and 2D Henon map, (C) analyzing
the complexity of stochastic time series generated from 2-state Markov chains, and (D) distinguishing between tonic and irregular spiking patterns generated from the `Adaptive exponential integrate-and-fire' neuron model. Our study reveals that each perspective has its own advantages and uniqueness, while also overlapping with the others.
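As a companion to the degree-of-order perspective above, the sketch below illustrates one plausible reading of a SubSym-style measure: count the contiguous palindromic substrings ("subsymmetries") of length at least two and normalize by the maximum possible count. The counting convention and normalization here are assumptions for illustration; the measure proposed in the paper may define both differently.

```python
def subsym(seq):
    """Degree-of-order sketch: fraction of substrings of length >= 2 that are
    palindromes. Higher values indicate more internal symmetry (more order);
    the published SubSym normalization may differ."""
    n = len(seq)
    count = sum(
        1
        for i in range(n)
        for j in range(i + 2, n + 1)          # substrings of length >= 2
        if seq[i:j] == seq[i:j][::-1]
    )
    max_count = n * (n - 1) // 2              # total number of substrings of length >= 2
    return count / max_count if max_count else 0.0

if __name__ == "__main__":
    print(subsym("00000000"))                 # highly ordered: every substring is a palindrome
    print(subsym("01101001"))                 # less ordered: fewer subsymmetries
```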