A proximal iteration for deconvolving Poisson noisy images using sparse representations
We propose an image deconvolution algorithm when the data is contaminated by
Poisson noise. The image to restore is assumed to be sparsely represented in a
dictionary of waveforms such as the wavelet or curvelet transforms. Our key
contributions are: First, we handle the Poisson noise properly by using the
Anscombe variance-stabilizing transform, leading to a non-linear
degradation equation with additive Gaussian noise. Second, the deconvolution
problem is formulated as the minimization of a convex functional with a
data-fidelity term reflecting the noise properties and non-smooth
sparsity-promoting penalties over the image representation coefficients (e.g.
the ℓ1-norm). Third, a fast iterative forward-backward splitting algorithm is
proposed to solve the minimization problem. We derive existence and uniqueness
conditions of the solution, and establish convergence of the iterative
algorithm. Finally, a GCV-based model selection procedure is proposed to
objectively select the regularization parameter. Experiments are carried out
to show the striking benefits gained from taking into account the
Poisson statistics of the noise. These results also suggest that using
sparse-domain regularization may be tractable in many deconvolution
applications with Poisson noise, such as astronomy and microscopy.
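The abstract above hinges on the Anscombe transform, which turns (approximately) Poisson-distributed data into data with near-constant, near-unit Gaussian variance so that standard Gaussian-noise machinery applies. A minimal NumPy sketch of the transform and its naive algebraic inverse (the refined unbiased inverse used in practice is more involved):

```python
import numpy as np

def anscombe(x):
    """Anscombe variance-stabilizing transform:
    Poisson counts -> approximately unit-variance Gaussian."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Naive algebraic inverse of the Anscombe transform
    (biased at low counts; shown only for illustration)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

# For moderate-to-large Poisson intensities, the transformed samples
# have variance close to 1 regardless of the underlying mean.
rng = np.random.default_rng(0)
for lam in (5.0, 20.0, 100.0):
    samples = rng.poisson(lam, size=200_000)
    print(lam, np.var(anscombe(samples)))
```

After stabilization, the deconvolution step can use a Gaussian data-fidelity term; the non-linearity of the square root is what makes the resulting degradation equation non-linear, as the abstract notes.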
Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors
In this paper, we propose a Bayesian MAP estimator for solving the
deconvolution problems when the observations are corrupted by Poisson noise.
Towards this goal, a proper data fidelity term (log-likelihood) is introduced
to reflect the Poisson statistics of the noise. On the other hand, as a prior,
the images to restore are assumed to be positive and sparsely represented in a
dictionary of waveforms such as wavelets or curvelets. Both analysis and
synthesis-type sparsity priors are considered. Piecing together the data
fidelity and the prior terms, the deconvolution problem boils down to the
minimization of non-smooth convex functionals (for each prior). We establish
the well-posedness of each optimization problem, characterize the corresponding
minimizers, and solve them by means of proximal splitting algorithms
originating from the realm of non-smooth convex optimization theory.
Experiments are conducted to demonstrate the potential applicability
of the proposed algorithms to astronomical imaging datasets.
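A key ingredient when keeping the exact Poisson log-likelihood, as this abstract proposes, is that its proximity operator has a closed form, which is what makes proximal splitting schemes applicable. A minimal sketch of that operator for the pointwise negative log-likelihood f(x) = x - y·log(x) (constants in x dropped); in a full deconvolution the blur operator is handled by a separate step of the splitting algorithm, which this sketch omits:

```python
import numpy as np

def prox_poisson_nll(v, y, tau):
    """Elementwise proximity operator of tau * f with
    f(x) = x - y*log(x), the Poisson negative log-likelihood
    up to terms constant in x.

    Setting d/dx [tau*f(x) + 0.5*(x - v)^2] = 0 gives
    x^2 + (tau - v)*x - tau*y = 0; we take the positive root,
    which automatically enforces the positivity prior x > 0."""
    b = v - tau
    return 0.5 * (b + np.sqrt(b * b + 4.0 * tau * y))

# Stationarity check at a single point: the derivative of the
# prox objective should vanish at the returned x.
v, y, tau = 2.0, 3.0, 0.5
x = prox_poisson_nll(v, y, tau)
print(x, tau * (1.0 - y / x) + (x - v))
```

The positive root guarantees strictly positive output whenever y > 0, matching the positivity assumption on the images to restore.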
Sparse and stable Markowitz portfolios
We consider the problem of portfolio selection within the classical Markowitz
mean-variance framework, reformulated as a constrained least-squares regression
problem. We propose to add to the objective function a penalty proportional to
the sum of the absolute values of the portfolio weights. This penalty
regularizes (stabilizes) the optimization problem, encourages sparse portfolios
(i.e. portfolios with only a few active positions), and makes it possible to
account for transaction costs. Our approach recovers the no-short-positions
portfolio as a special case, while still allowing a limited number of short
positions. We implement this methodology on two benchmark data sets constructed by
Fama and French. Using only a modest amount of training data, we construct
portfolios whose out-of-sample performance, as measured by Sharpe ratio, is
consistently and significantly better than that of the naive evenly-weighted
portfolio which constitutes, as shown in recent literature, a very tough
benchmark.
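The ℓ1-penalized least-squares reformulation described in this abstract can be sketched with iterative soft-thresholding (ISTA, a forward-backward instance). This is a simplified illustration only: variable names, the target-return setup, and the omission of the paper's budget constraint sum(w) = 1 are assumptions of the sketch, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximity operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_portfolio(R, target, tau, n_iter=500):
    """ISTA sketch for min_w 0.5*||target*1 - R w||^2 + tau*||w||_1,
    where R is a T x N matrix of asset returns. Larger tau yields
    sparser portfolios (fewer active positions). The budget
    constraint sum(w) = 1 used in the paper is omitted here."""
    T, N = R.shape
    step = 1.0 / np.linalg.norm(R, 2) ** 2  # 1 / Lipschitz constant
    b = np.full(T, target)
    w = np.zeros(N)
    for _ in range(n_iter):
        grad = R.T @ (R @ w - b)          # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * tau)
    return w
```

The soft-thresholding step is what zeroes out small weights, producing the sparse, stabilized portfolios the abstract describes; it also naturally limits the number of short (negative) positions as tau grows.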