The major problem in information theoretic analysis of neural responses and
other biological data is the reliable estimation of entropy-like quantities
from small samples. We apply a recently introduced Bayesian entropy estimator
to synthetic data inspired by experiments, and to real experimental spike
trains. The estimator performs admirably even very deep in the undersampled
regime, where other techniques fail. This opens new possibilities for the
information theoretic analysis of experiments, and may be of general interest
as an example of learning from limited data.
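
The abstract does not describe the construction of the Bayesian estimator itself. Purely as a hedged illustration of the undersampling problem it targets, the following minimal Python sketch shows why naive entropy estimation fails when the number of samples N is small relative to the alphabet size K: the plug-in (maximum-likelihood) estimate is systematically biased low, and even a classical first-order correction (Miller-Madow) recovers only part of the gap. The distribution, K, N, and the choice of correction are illustrative assumptions, not the paper's method.

    # Minimal sketch (NOT the paper's estimator): the sampling problem
    # for entropy estimation in the undersampled regime N << K.
    import numpy as np

    rng = np.random.default_rng(0)

    K = 1000                       # alphabet size (e.g., distinct spike "words"); arbitrary
    N = 100                        # sample count: deeply undersampled, N << K; arbitrary
    p = rng.dirichlet(np.ones(K))  # an arbitrary "true" distribution over K symbols

    true_H = -np.sum(p * np.log2(p))  # true entropy in bits

    # Draw N samples and form the empirical distribution.
    counts = np.bincount(rng.choice(K, size=N, p=p), minlength=K)
    phat = counts / N
    nonzero = phat > 0

    # Plug-in (maximum-likelihood) estimate: biased downward for small N,
    # because unseen symbols contribute nothing.
    H_plugin = -np.sum(phat[nonzero] * np.log2(phat[nonzero]))

    # Miller-Madow correction: adds the leading-order bias term,
    # (K_observed - 1) / (2N) nats, converted here to bits.
    K_obs = np.count_nonzero(counts)
    H_mm = H_plugin + (K_obs - 1) / (2 * N * np.log(2))

    print(f"true entropy: {true_H:.3f} bits")
    print(f"plug-in:      {H_plugin:.3f} bits  (biased low)")
    print(f"Miller-Madow: {H_mm:.3f} bits  (partial correction)")

Running this sketch shows the plug-in estimate falling well below the true entropy, with the correction closing only some of the gap; the Bayesian approach evaluated in the paper is designed to remain reliable in exactly this regime.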