Phase and TV Based Convex Sets for Blind Deconvolution of Microscopic Images
In this article, two closed and convex sets for the blind deconvolution problem
are proposed. Most blurring functions in microscopy are symmetric with respect
to the origin, so they do not modify the phase of the Fourier transform (FT) of
the original image. As a result, the blurred image and the original image have
the same FT phase, and the set of images with a prescribed FT phase can be used
as a constraint set in blind deconvolution problems. Another convex set that
can be used during the image reconstruction process is the epigraph set of the
Total Variation (TV) function. This set does not need a prescribed upper bound
on the total variation of the image; the bound is adjusted automatically
according to the current image of the restoration process. Both of these closed
and convex sets can be used as part of any blind deconvolution algorithm.
Simulation examples are presented.
Comment: Submitted to IEEE Selected Topics in Signal Processing
Hierarchical Bayesian sparse image reconstruction with application to MRFM
This paper presents a hierarchical Bayesian model to reconstruct sparse
images when the observations are obtained from linear transformations and
corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is
well suited to such naturally sparse image applications as it seamlessly
accounts for properties such as sparsity and positivity of the image via
appropriate Bayes priors. We propose a prior that is based on a weighted
mixture of a positive exponential distribution and a mass at zero. The prior
has hyperparameters that are tuned automatically by marginalization over the
hierarchical Bayesian model. To overcome the complexity of the posterior
distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be
used to estimate the image to be recovered, e.g., by maximizing the estimated
posterior distribution. In our fully Bayesian approach, the posteriors of all
the parameters are available; thus our algorithm provides more information than
previously proposed sparse reconstruction methods that give only a point
estimate. The performance of our hierarchical Bayesian sparse reconstruction
method is illustrated on synthetic and real data collected from a tobacco virus
sample using a prototype MRFM instrument.
Comment: v2: final version; IEEE Trans. Image Processing, 2009
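The prior described in the abstract, a weighted mixture of a mass at zero and a positive exponential distribution, is easy to sample from directly. The sketch below is illustrative only; the parameter names `w` and `a` are ours, not the paper's hyperparameters.

```python
import numpy as np

def sample_sparse_prior(n, w=0.1, a=1.0, rng=None):
    """Draw n pixel values from a Bernoulli-exponential mixture prior:
    with probability 1 - w a pixel is exactly zero; otherwise it is
    drawn from a positive exponential distribution with rate a.
    This enforces both sparsity (exact zeros) and positivity."""
    rng = rng if rng is not None else np.random.default_rng()
    active = rng.random(n) < w            # which pixels are nonzero
    values = rng.exponential(scale=1.0 / a, size=n)
    return np.where(active, values, 0.0)

image = sample_sparse_prior(100_000, w=0.1, a=2.0,
                            rng=np.random.default_rng(1))
```

In the full hierarchical model, `w` and `a` would not be fixed by hand as above but marginalized over, with the Gibbs sampler exploring the joint posterior.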
Perfectly Secure Steganography: Capacity, Error Exponents, and Code Constructions
An analysis of steganographic systems subject to the following perfect
undetectability condition is presented in this paper. Following embedding of
the message into the covertext, the resulting stegotext is required to have
exactly the same probability distribution as the covertext. Then no statistical
test can reliably detect the presence of the hidden message. We refer to such
steganographic schemes as perfectly secure. A few such schemes have been
proposed in recent literature, but they have vanishing rate. We prove that
communication performance can potentially be vastly improved; specifically, our
basic setup assumes independently and identically distributed (i.i.d.)
covertext, and we construct perfectly secure steganographic codes from public
watermarking codes using binning methods and randomized permutations of the
code. The permutation is a secret key shared between encoder and decoder. We
derive (positive) capacity and random-coding exponents for perfectly-secure
steganographic systems. The error exponents provide estimates of the code
length required to achieve a target low error probability. We address the
potential loss in communication performance due to the perfect-security
requirement. This loss is the same as the loss obtained under a weaker order-1
steganographic requirement that would just require matching of first-order
marginals of the covertext and stegotext distributions. Furthermore, no loss
occurs if the covertext distribution is uniform and the distortion metric is
cyclically symmetric; steganographic capacity is then achieved by randomized
linear codes. Our framework may also be useful for developing computationally
secure steganographic systems that have near-optimal communication performance.
Comment: To appear in IEEE Trans. on Information Theory, June 2008; ignore
Version 2 as the file was corrupted
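The uniform-covertext case admits a very simple toy instance, far short of the paper's binning-based code constructions: when the covertext is i.i.d. uniform over Z_q, adding an i.i.d. uniform secret key to the message produces a stegotext that is itself i.i.d. uniform, hence exactly distributed like the covertext. The one-time-pad-style sketch below is our own illustration, not the authors' construction.

```python
import numpy as np

Q = 256  # alphabet size (illustrative choice)

def embed(message, key):
    """Stegotext = message + key mod Q. For a uniform key the stegotext
    is uniform regardless of the message, so its distribution matches
    an i.i.d. uniform covertext exactly: perfect undetectability."""
    return (message + key) % Q

def extract(stego, key):
    """The decoder shares the secret key and inverts the embedding."""
    return (stego - key) % Q

rng = np.random.default_rng(2)
key = rng.integers(0, Q, size=32)    # secret key shared with decoder
msg = rng.integers(0, Q, size=32)    # hidden message
stego = embed(msg, key)
recovered = extract(stego, key)
```

The paper's contribution is the general case: non-uniform covertexts and distortion constraints, where matching the covertext distribution at positive rate requires the binning and permutation machinery described above.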
A signomial programming approach for binary image restoration by penalized least squares
The authors present a novel optimization approach, using signomial programming (SP), to restore noise-corrupted binary and grayscale images. The approach requires the minimization of a penalized least squares functional over binary variables, which has led to the design of various approximation methods in the past. In this brief, we minimize the functional as an SP problem, which is then converted into a reversed geometric programming (GP) problem and solved using standard GP solvers. Numerical experiments show that the proposed approach restores both degraded binary and grayscale images with good accuracy, and is over 20 times faster than the positive semidefinite programming approach. © 2007 IEEE.
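For a toy-sized image, a penalized least-squares functional of this kind can be written down and minimized exactly by exhaustive search over the binary variables, which makes clear why specialized machinery such as SP/GP is needed at realistic sizes. The objective below, a data-fit term plus a 4-neighbour smoothness penalty with weight `lam`, is our illustrative stand-in, not the paper's exact formulation.

```python
import itertools
import numpy as np

def penalized_ls(x, y, lam):
    """Penalized least-squares objective: squared data fit plus a
    pairwise penalty on horizontal and vertical neighbour differences."""
    fit = np.sum((y - x) ** 2)
    smooth = (np.sum(np.abs(np.diff(x, axis=0)))
              + np.sum(np.abs(np.diff(x, axis=1))))
    return fit + lam * smooth

def brute_force_restore(y, lam):
    """Exact minimization over all 2^(h*w) binary images; feasible only
    for tiny y, standing in for the SP-to-GP solver of the paper."""
    h, w = y.shape
    best, best_val = None, np.inf
    for bits in itertools.product((0.0, 1.0), repeat=h * w):
        x = np.asarray(bits).reshape(h, w)
        val = penalized_ls(x, y, lam)
        if val < best_val:
            best, best_val = x, val
    return best

# Noisy observation of an all-ones 3x3 binary image
y = np.array([[0.90, 0.80, 1.10],
              [0.70, 0.95, 0.60],
              [0.85, 0.90, 0.75]])
restored = brute_force_restore(y, lam=0.1)
```

The search space doubles with every added pixel, so exhaustive enumeration is hopeless beyond a few dozen pixels; the SP/GP relaxation is what makes the minimization tractable at image scale.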