
    Compressed Data Structures for Dynamic Sequences

    We consider the problem of storing a dynamic string $S$ over an alphabet $\Sigma=\{1,\ldots,\sigma\}$ in compressed form. Our representation supports insertions and deletions of symbols and answers three fundamental queries: $\mathrm{access}(i,S)$ returns the $i$-th symbol in $S$, $\mathrm{rank}_a(i,S)$ counts how many times a symbol $a$ occurs among the first $i$ positions in $S$, and $\mathrm{select}_a(i,S)$ finds the position where a symbol $a$ occurs for the $i$-th time. We present the first fully-dynamic data structure for arbitrarily large alphabets that achieves optimal query times for all three operations and supports updates with worst-case time guarantees. Ours is also the first fully-dynamic data structure that needs only $nH_k+o(n\log\sigma)$ bits, where $H_k$ is the $k$-th order entropy and $n$ is the string length. Moreover, our representation supports extraction of a substring $S[i..i+\ell]$ in optimal $O(\log n/\log\log n + \ell/\log_{\sigma}n)$ time.
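    To make the query semantics concrete, the following is a minimal, uncompressed reference sketch in Python. The class and method names (`DynamicString`, `access`, `rank`, `select`) are chosen here for illustration; the sketch reflects only the operation definitions above, not the paper's compressed $nH_k$-bit representation or its worst-case $O(\log n/\log\log n)$-time machinery.

```python
# Reference sketch of the access/rank/select semantics described above.
# A plain Python list stands in for S; this is NOT the paper's compressed,
# worst-case-efficient structure, only the query definitions (1-based positions).

class DynamicString:
    def __init__(self, symbols=()):
        self.s = list(symbols)          # S, stored explicitly

    def insert(self, i, a):             # insert symbol a before position i
        self.s.insert(i - 1, a)

    def delete(self, i):                # delete the i-th symbol
        del self.s[i - 1]

    def access(self, i):                # i-th symbol of S
        return self.s[i - 1]

    def rank(self, a, i):               # occurrences of a in S[1..i]
        return self.s[:i].count(a)

    def select(self, a, i):             # position of the i-th occurrence of a
        seen = 0
        for pos, sym in enumerate(self.s, start=1):
            if sym == a:
                seen += 1
                if seen == i:
                    return pos
        raise ValueError("symbol occurs fewer than i times")


# Example: rank and select are mutually inverse on occurrence positions.
ds = DynamicString("abracadabra")
assert ds.access(5) == "c"
assert ds.rank("a", 8) == 4
assert ds.select("a", 4) == 8
```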

    A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics

    From the output produced by a memoryless deletion channel from a uniformly random input of known length nn, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input which is conveyed by the output of length mm, and it is natural to ask for which outputs this is extremized. This question was posed in a previous work, where it was conjectured on the basis of experimental data that the entropy of the posterior is minimized and maximized by the constant strings 000\texttt{000}\ldots and 111\texttt{111}\ldots and the alternating strings 0101\texttt{0101}\ldots and 1010\texttt{1010}\ldots respectively. In the present work we confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics. We show how the analytic-combinatorial methods of Flajolet, Szpankowski and Vall\'ee for dealing with the hidden pattern matching problem can be applied to resolve the case of fixed output length and nn\rightarrow\infty, by obtaining estimates for the entropy in terms of the moments of the posterior distribution and establishing its minimization via a measure of autocorrelation.Comment: 11 pages, 2 figure
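    As an illustration of the quantity being extremized, here is a brute-force sketch (my own, not the paper's analytic-combinatorial method): for a memoryless deletion channel with a uniform prior over binary inputs $x \in \{0,1\}^n$ and a fixed observed output $y$ of length $m$, $P(y \mid x)$ equals the number of subsequence embeddings of $y$ in $x$ times a factor that depends only on $n$ and $m$, so the posterior is proportional to the embedding count and its Shannon entropy can be computed exactly for small $n$. Function names below are chosen for illustration.

```python
# Brute-force sketch (illustrative only): posterior entropy over channel inputs
# of length n given an observed output y of a memoryless deletion channel.
# With a uniform prior and fixed |y| = m, P(x | y) is proportional to the number
# of ways y embeds in x as a subsequence, so the deletion probability cancels.
from itertools import product
from math import log2

def embedding_count(x: str, y: str) -> int:
    """Number of distinct position sets in x whose subsequence equals y."""
    # dp[j] = number of embeddings of y[:j] seen so far; classic O(|x||y|) DP.
    dp = [1] + [0] * len(y)
    for xc in x:
        for j in range(len(y), 0, -1):
            if y[j - 1] == xc:
                dp[j] += dp[j - 1]
    return dp[len(y)]

def posterior_entropy(y: str, n: int) -> float:
    """Shannon entropy (bits) of the posterior over inputs x in {0,1}^n given y."""
    weights = [embedding_count("".join(x), y) for x in product("01", repeat=n)]
    total = sum(weights)
    return -sum(w / total * log2(w / total) for w in weights if w > 0)

# The conjectured extremizers (minimization proved asymptotically in the paper):
# constant outputs minimize the posterior entropy, alternating ones maximize it.
n, m = 10, 4
print(posterior_entropy("0" * m, n))   # constant output 0000
print(posterior_entropy("0101", n))    # alternating output 0101
```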