This work first reviews an existing deterministic parallel algorithm that computes the complete histogram of an image in an optimal O(log n) number of steps on a hypercube architecture, using O(x^(1/2) log x) memory at each processing element, where x is the number of gray levels in the image. The paper then improves the algorithm's memory requirements by introducing randomization.
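To illustrate the general idea (not the paper's specific algorithm), the following sketch simulates merging per-processor histograms on a d-dimensional hypercube: each of the 2^d processors starts with the histogram of its local image block, and after d pairwise exchange rounds (logarithmically many steps in the number of processors) every processor holds the global histogram. All names and parameters here are illustrative assumptions.

```python
def hypercube_histogram(local_hists):
    """Simulate hypercube all-reduce of per-processor histograms."""
    p = len(local_hists)          # number of processors, must be a power of two
    assert p & (p - 1) == 0 and p > 0
    d = p.bit_length() - 1        # hypercube dimension
    hists = [list(h) for h in local_hists]
    for bit in range(d):
        # each processor exchanges with the neighbor whose id differs
        # in this bit, then sums the two histograms bin by bin
        new = [None] * p
        for pe in range(p):
            partner = pe ^ (1 << bit)
            new[pe] = [a + b for a, b in zip(hists[pe], hists[partner])]
        hists = new
    return hists

# example: 4 processors, 4 gray levels
local = [[1, 0, 0, 0], [0, 2, 0, 0], [0, 0, 3, 0], [0, 0, 0, 4]]
result = hypercube_histogram(local)
# every processor now holds the global histogram [1, 2, 3, 4]
```

Note that this simulation keeps whole histograms at each node for clarity; the memory-saving schemes discussed in the paper avoid exactly that cost.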