Computationally Efficient Implementation of Convolution-based Locally Adaptive Binarization Techniques
One of the most important steps of document image processing is binarization. The computational requirements of locally adaptive binarization techniques make them unsuitable for devices with limited computing facilities. In this paper, we present a computationally efficient implementation of convolution-based locally adaptive binarization techniques that keeps performance comparable to the original implementation. The computational complexity has been reduced from O(W²N²) to O(WN²), where W×W is the window size and N×N is the image size. Experiments over benchmark datasets show that the computation time is reduced by 5 to 15 times depending on the window size, while memory consumption remains the same with respect to the state-of-the-art algorithmic implementation.
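The abstract does not spell out the paper's incremental scheme, but the idea behind efficient locally adaptive binarization can be illustrated with a Niblack-style threshold (T = mean + k·std over a W×W window) computed from integral images, so each pixel's window statistics cost O(1) instead of O(W²). This is a hedged sketch, not the paper's method; the function name, parameters, and the integral-image shortcut are illustrative assumptions.

```python
import numpy as np

def niblack_binarize(img, w=15, k=-0.2):
    """Niblack-style local thresholding: T = mean + k*std over a w x w window.
    Window statistics come from integral images (summed-area tables), so each
    pixel costs O(1). Illustrative sketch only; the paper's incremental
    O(WN^2) scheme and parameter choices may differ."""
    img = img.astype(np.float64)
    pad = w // 2
    # Pad with edge values so every pixel has a full w x w window.
    p = np.pad(img, pad, mode='edge')
    # Integral images of values and squared values, with a zero border
    # so that s[i, j] = sum of p[0:i, 0:j].
    s1 = np.pad(np.cumsum(np.cumsum(p, axis=0), axis=1), ((1, 0), (1, 0)))
    s2 = np.pad(np.cumsum(np.cumsum(p * p, axis=0), axis=1), ((1, 0), (1, 0)))
    H, W = img.shape
    y, x = np.mgrid[0:H, 0:W]
    y1, x1 = y + w, x + w  # exclusive bottom-right corner of each window
    # Four-corner trick: window sum in O(1) per pixel.
    win_sum = s1[y1, x1] - s1[y, x1] - s1[y1, x] + s1[y, x]
    win_sq = s2[y1, x1] - s2[y, x1] - s2[y1, x] + s2[y, x]
    area = w * w
    mean = win_sum / area
    var = np.maximum(win_sq / area - mean * mean, 0.0)
    thresh = mean + k * np.sqrt(var)
    return (img > thresh).astype(np.uint8)
```

With integral images the per-pixel cost no longer depends on the window size at all, which is why reusing overlapping window sums is the standard route to breaking the O(W²N²) barrier.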
Normalized measures of entropy
A number of “normalized” measures of entropy have been obtained to measure the “intrinsic” uncertainty of a probability distribution. Their graphs have been drawn, and it has been shown that the normalized measures of a given probability distribution are much closer to one another than the corresponding absolute measures of entropy.
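The abstract does not specify which normalizations the paper studies; a common example is Shannon entropy divided by its maximum value log n, which maps any distribution over n outcomes into [0, 1]. The function names below are illustrative.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a probability distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def normalized_entropy(p):
    """Entropy divided by its maximum, log n, so the result lies in [0, 1].
    One common normalization; not necessarily the ones studied in the paper."""
    n = len(p)
    if n < 2:
        return 0.0  # a one-point distribution has no uncertainty
    return shannon_entropy(p) / math.log(n)
```

A uniform distribution attains the maximum value 1, while a degenerate (one-spike) distribution gives 0, so distributions over different alphabet sizes become directly comparable.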
Generalization of the entropy model for brand purchase behavior
J. D. Herniter's entropy model for brand purchase behavior has been generalized for A. Rényi's measure of entropy, a more general concept than the C. E. Shannon measure used by Herniter, which it includes as a limiting case. The generalized model considered here is more flexible than Herniter's model: it can give different marketing statistics for different products, and it can give these statistics even when only some of the brands are considered.
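The relationship between the two entropy measures can be sketched directly: the Rényi entropy H_α = log(Σ pᵢ^α) / (1 − α) is defined for any α > 0, α ≠ 1, and converges to Shannon's −Σ pᵢ log pᵢ as α → 1. This is a standard definition, not the marketing model itself; the limit handling below is an implementation choice.

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy (in nats): H_alpha = log(sum p_i^alpha) / (1 - alpha).
    As alpha -> 1 this converges to the Shannon entropy, which is returned
    directly when alpha is (numerically) 1."""
    if alpha <= 0:
        raise ValueError("alpha must be positive")
    if abs(alpha - 1.0) < 1e-9:
        # Limiting case: Shannon's measure.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)
```

For a uniform distribution every α gives the same value log n, while for skewed distributions the free parameter α weights common versus rare brands differently, which is the source of the extra flexibility the abstract refers to.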
Some normalized measures of directed divergence
In a recent paper, we discussed normalized measures of entropy, i.e. measures of entropy all of which lie between 0 and 1. This leads to standardized measures which can be compared with one another. In the present paper, we discuss normalized measures of directed divergence, all of which lie between 0 and 1 and as such are comparable with one another.
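The paper's specific normalizations are not given in the abstract. As an illustration only, the classical directed (Kullback–Leibler) divergence is unbounded above, and one simple transform that maps it into [0, 1) is D / (1 + D); both the transform and the function names below are assumptions, not the measures the paper proposes.

```python
import math

def kl_divergence(p, q):
    """Directed (Kullback-Leibler) divergence D(p || q) in nats.
    Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def normalized_divergence(p, q):
    """Map D(p || q) from [0, inf) into [0, 1) via D / (1 + D).
    An illustrative bounded transform, not the paper's normalizations."""
    d = kl_divergence(p, q)
    return d / (1.0 + d)
```

The transform preserves the ordering of divergences and keeps 0 exactly when p = q, which is the minimal property any normalization to [0, 1] must satisfy for the measures to remain comparable.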