Steganalytic Methods for the Detection of Histogram Shifting Data Hiding Schemes
Peer-reviewed. In this paper, several steganalytic techniques designed to detect messages hidden with histogram shifting schemes are presented. First, three techniques to identify specific histogram shifting data hiding schemes, based on detectable visible alterations of the histogram or abnormal statistical distributions, are proposed. Then, a general technique capable of detecting all the analyzed histogram shifting data hiding methods is presented. This technique is based on the effect of histogram shifting on the "volatility" of the histogram of the difference image. The different behavior of the volatility whenever new data are hidden makes it possible to distinguish stego images from cover images.
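The general detector described above can be illustrated with a minimal sketch: compute the histogram of the difference image and look for an empty bin next to well-populated neighbours near the peak, the kind of gap that histogram-shifting embedding tends to create. The thresholds and window size below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def difference_histogram(img):
    """Histogram of horizontal pixel differences, one bin per integer value.

    Histogram-shifting embedding typically empties a bin next to the peak
    of this sharply peaked distribution, which a detector can look for.
    """
    diff = img[:, 1:].astype(int) - img[:, :-1].astype(int)
    hist, _ = np.histogram(diff, bins=np.arange(-255, 257))
    return hist

def has_suspicious_gap(hist, window=5, min_neighbour=50):
    """Flag an empty bin flanked by well-populated bins near the centre of
    the difference histogram (a crude stego indicator; window and
    min_neighbour are illustrative parameters, not from the paper)."""
    centre = len(hist) // 2
    region = hist[centre - window: centre + window + 1]
    for i in range(1, len(region) - 1):
        if region[i] == 0 and region[i - 1] > min_neighbour and region[i + 1] > min_neighbour:
            return True
    return False
```

On a smooth cover image the central bins fall off gradually, so no isolated empty bin is found; a shifted histogram with an emptied bin between two populated ones triggers the flag.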
Exact Histogram Specification Optimized for Structural Similarity
An exact histogram specification (EHS) method modifies its input image to
have a specified histogram. Applications of EHS include image (contrast)
enhancement (e.g., by histogram equalization) and histogram watermarking.
Performing EHS on an image, however, reduces its visual quality. Starting from
the output of a generic EHS method, we iteratively maximize the structural
similarity index (SSIM) between the original image (before EHS) and the result
of EHS. Essential in this process is the computationally simple and accurate
formula we derive for the SSIM gradient. As it is based on gradient ascent, the
proposed EHS method always converges. Experimental results confirm that
while obtaining the histogram exactly as specified, the proposed method
invariably outperforms the existing methods in terms of visual quality of the
result. The computational complexity of the proposed method is shown to be of
the same order as that of the existing methods.
Index terms: histogram modification, histogram equalization, optimization for
perceptual visual quality, structural similarity gradient ascent, histogram
watermarking, contrast enhancement
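The generic EHS starting point that the paper builds on can be sketched as follows: rank all pixels by intensity and hand out gray levels in order so that the output histogram matches the target exactly. This is only the baseline step; ties are broken arbitrarily here, whereas refined EHS methods (and the SSIM-optimized variant above) order tied pixels more carefully.

```python
import numpy as np

def exact_histogram_specification(img, target_hist):
    """Force an image to have exactly the target histogram by ranking
    pixels and assigning gray levels in rank order (a generic EHS
    baseline; the tie-breaking here is the stable sort order)."""
    flat = img.ravel()
    assert target_hist.sum() == flat.size, "target must account for every pixel"
    order = np.argsort(flat, kind="stable")          # darkest pixels first
    # Gray level g is given to the next target_hist[g] pixels in rank order.
    levels = np.repeat(np.arange(len(target_hist)), target_hist)
    out = np.empty_like(flat)
    out[order] = levels
    return out.reshape(img.shape)
```

Because every pixel receives a level drawn from the target counts, the output histogram matches the specification exactly, regardless of the input histogram.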
Histogram Tomography
In many tomographic imaging problems the data consist of integrals along
lines or curves. Increasingly we encounter "rich tomography" problems where the
quantity imaged is higher dimensional than a scalar per voxel, including
vectors, tensors, and functions. The data can also be higher dimensional and in
many cases consist of a one- or two-dimensional spectrum for each ray. In many
such cases the data contain not just integrals along rays but the distribution
of values along the ray. If this is discretized into bins we can think of this
as a histogram. In this paper we introduce the concept of "histogram
tomography". For scalar problems with histogram data this holds the possibility
of reconstruction with fewer rays. In vector and tensor problems it holds the
promise of reconstruction of images that are in the null space of related
integral transforms. For scalar histogram tomography problems we show how bins
in the histogram correspond to reconstructing level sets of the function, while
moments of the distribution are the x-ray transform of powers of the unknown
function. In the vector case we give a reconstruction procedure for potential
components of the field. We demonstrate how the histogram longitudinal ray
transform data can be extracted from Bragg edge neutron spectral data and
hence, using moments, a non-linear system of partial differential equations
derived for the strain tensor. In x-ray diffraction tomography of strain, the
transverse ray transform can be deduced from the diffraction pattern, but the
full histogram transverse ray transform cannot. We give an explicit example of
distributions of strain along a line that produce the same diffraction pattern,
and characterize the null space of the relevant transform.
Image enhancement using fuzzy intensity measure and adaptive clipping histogram equalization
Image enhancement aims at processing an input
image so that the visual content of the output image is more
pleasing or more useful for certain applications. Although
histogram equalization is widely used in image enhancement due
to its simplicity and effectiveness, it changes the mean brightness
of the enhanced image and introduces a high level of noise and
distortion. To address these problems, this paper proposes
image enhancement using fuzzy intensity measure and adaptive
clipping histogram equalization (FIMHE). FIMHE uses fuzzy
intensity measure to first segment the histogram of the original
image, and then clip the histogram adaptively in order to
prevent excessive image enhancement. Experiments on the
Berkeley database and the CVF-UGR-Image database show that FIMHE outperforms
state-of-the-art histogram-equalization-based methods.
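The clipping step that prevents excessive enhancement can be sketched in a simplified form: truncate histogram bins above a clip level and redistribute the excess before building the equalization mapping. FIMHE chooses the clip level adaptively via its fuzzy intensity measure and first segments the histogram; the fixed clip ratio and single global histogram below are illustrative stand-ins for that machinery.

```python
import numpy as np

def clipped_histogram_equalization(img, clip_ratio=2.0):
    """Histogram equalization with a simple clipping step: bins above
    clip_ratio * (mean bin count) are truncated and the excess mass is
    redistributed uniformly, limiting over-enhancement and noise
    amplification.  (FIMHE sets the clip level adaptively; the fixed
    ratio here is an assumption for illustration.)"""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    limit = clip_ratio * hist.mean()
    excess = np.maximum(hist - limit, 0.0).sum()
    hist = np.minimum(hist, limit) + excess / 256.0   # redistribute excess
    cdf = np.cumsum(hist)
    lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]
```

Because the lookup table is built from a cumulative sum, it is non-decreasing, so the relative ordering of gray levels is preserved while the contrast stretch is capped.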
A Convex Model for Edge-Histogram Specification with Applications to Edge-preserving Smoothing
The goal of edge-histogram specification is to find an image whose edge image
has a histogram that matches a given edge-histogram as much as possible.
Mignotte has proposed a non-convex model for the problem [M. Mignotte. An
energy-based model for the image edge-histogram specification problem. IEEE
Transactions on Image Processing, 21(1):379--386, 2012]. In his work, edge
magnitudes of an input image are first modified by histogram specification to
match the given edge-histogram. Then, a non-convex model is minimized to find
an output image whose edge-histogram matches the modified edge-histogram. The
non-convexity of the model hinders the computations and the inclusion of useful
constraints such as the dynamic range constraint. In this paper, instead of
considering edge magnitudes, we directly consider the image gradients and
propose a convex model based on them. Furthermore, we include additional
constraints in our model based on different applications. The convexity of our
model allows us to compute the output image efficiently using either the
Alternating Direction Method of Multipliers (ADMM) or the Fast Iterative
Shrinkage-Thresholding Algorithm (FISTA). We consider several applications in
edge-preserving smoothing, including image abstraction, edge extraction, detail
exaggeration, and document scan-through removal. Numerical results are given to
illustrate that our method efficiently produces good results.
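Why convexity plus a box constraint is computationally convenient can be seen in a toy 1D analogue of the model: fit a signal whose forward differences match a prescribed gradient sequence, subject to a dynamic range constraint, by projected gradient descent. This is not the paper's 2D model or its ADMM/FISTA solvers, just a sketch of the structure they exploit.

```python
import numpy as np

def fit_signal_to_gradients(g, lo=0.0, hi=255.0, iters=500, step=0.2):
    """Projected gradient descent on a convex 1D analogue of the model:
    find u minimizing ||D u - g||^2 subject to lo <= u <= hi, where D is
    forward differencing.  (Toy illustration only; the paper works with
    2D image gradients and solves the model with ADMM or FISTA.)"""
    n = len(g) + 1
    u = np.full(n, (lo + hi) / 2.0)        # start mid-range
    for _ in range(iters):
        r = np.diff(u) - g                 # residual D u - g
        grad = np.zeros(n)                 # D^T r, assembled by hand
        grad[:-1] -= r
        grad[1:] += r
        u = np.clip(u - step * grad, lo, hi)   # gradient step + box projection
    return u
```

Because the objective is convex and the feasible set is a box, every projected step stays feasible and the iterates converge to a minimizer; this is the property that makes the dynamic range constraint easy to include, in contrast to the non-convex model.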