Most communication channels are subject to noise. One of the goals of Information Theory is to add redundancy to the transmitted information so that it is conveyed reliably while the amount of information sent through the channel remains as large as possible. The maximum rate at which reliable transmission is possible is called the capacity. If the channel keeps no memory of its past, the capacity is given by a simple optimization problem and can be computed efficiently. The situation for channels
with memory is less clear. Here we show that for channels with memory the
capacity cannot be computed to within precision 1/5. Our result holds even if we consider one of the simplest families of such channels (information-stable finite state machine channels), restrict the input and output of the channel to 4 bits and 1 bit respectively, and allow only 6 bits of memory.
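For comparison only (this sketch is not part of the paper), the memoryless-channel optimization C = max_p I(X;Y) mentioned above can be solved with the standard Blahut-Arimoto algorithm. The function name and the binary symmetric channel used as a test case below are illustrative placeholders.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-10, max_iter=10000):
    """Capacity (in bits) of a discrete memoryless channel.

    W[x, y] is the probability of output y given input x (rows sum to 1).
    Returns the capacity and a maximizing input distribution.
    """
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)                        # start from the uniform input
    for _ in range(max_iter):
        joint = p[:, None] * W                           # p(x) W(y|x)
        q = joint / joint.sum(axis=0, keepdims=True)     # posterior q(x|y)
        # Multiplicative update: p(x) proportional to exp(sum_y W(y|x) ln q(x|y)).
        with np.errstate(divide="ignore"):
            logq = np.where(W > 0, np.log(q), 0.0)
        p_new = np.exp((W * logq).sum(axis=1))
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information I(X;Y) at the final input distribution, in bits.
    joint = p[:, None] * W
    py = joint.sum(axis=0)
    ratio = np.divide(joint, p[:, None] * py[None, :],
                      out=np.ones_like(joint), where=(joint > 0))
    return float((joint * np.log2(ratio)).sum()), p

# Binary symmetric channel with crossover probability 0.1:
# the capacity is 1 - H(0.1), roughly 0.531 bits, achieved by the uniform input.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = blahut_arimoto(W)
print(C, p_opt)
```

No such simple iterative scheme is available for the channels with memory considered in the paper; the result stated above shows that their capacity cannot even be approximated to within 1/5.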