In this work, we investigate the statistical computation of the Boltzmann
entropy of statistical samples. For this purpose, we use both histogram and
kernel estimation to obtain the probability density function of the samples.
We find that, owing to coarse-graining, the entropy is a monotonically
increasing function of the bin width (for histograms) or the bandwidth (for
kernel estimation), which makes it difficult to select an optimal bin width or
bandwidth for computing the entropy. Fortunately, we notice that the first
derivative of the entropy has a minimum for both histogram and kernel
estimation, and that this minimum asymptotically points to the optimal bin
width or bandwidth.
We have verified these findings with a large number of numerical experiments.
Hence, we suggest that the minimum of the first derivative of the entropy be
used as a selector for the optimal bin width or bandwidth in density
estimation. Moreover, the optimal bandwidth selected by the minimum of the
first derivative of the entropy is purely data-based and independent of the
unknown underlying probability density distribution, which is clearly superior
to existing estimators. Our results are not restricted to the one-dimensional
case, but can also be extended to multivariate cases. It should be emphasized,
however, that we do not provide a rigorous mathematical proof of these
findings, and we leave these issues to
those who are interested in them.
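
To make the selection rule concrete, the following minimal Python sketch (not
taken from the paper; the plug-in entropy formula, the toy Gaussian sample,
and the grid of trial bin widths are illustrative assumptions) scans a range
of histogram bin widths, computes an entropy estimate for each, differentiates
the resulting curve numerically, and takes the bin width at which the first
derivative is smallest:

    import numpy as np

    def histogram_entropy(sample, width):
        # Plug-in entropy estimate S(width) = -sum_i p_i * ln(p_i / width),
        # where p_i are histogram cell probabilities; this particular formula
        # is an assumption for illustration, not necessarily the paper's.
        edges = np.arange(sample.min(), sample.max() + width, width)
        counts, _ = np.histogram(sample, bins=edges)
        p = counts[counts > 0] / sample.size
        return -np.sum(p * np.log(p / width))

    rng = np.random.default_rng(0)
    sample = rng.normal(size=10_000)          # toy Gaussian sample (assumed)
    widths = np.linspace(0.02, 1.0, 200)      # grid of trial bin widths
    entropy = np.array([histogram_entropy(sample, w) for w in widths])
    slope = np.gradient(entropy, widths)      # numerical first derivative dS/dw
    print("selected bin width:", widths[np.argmin(slope)])

The same scan could be repeated with a kernel density estimate, varying the
bandwidth instead of the bin width, in line with the selector described above.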