The University of Montana
Department of Mathematical Sciences

Technical report #17/2007

An Efficient Computational Method for Total Variation-Penalized Poisson Likelihood Estimation

Johnathan M. Bardsley
Department of Mathematical Sciences
University of Montana, Missoula, MT 59812-0864, USA
email: bardsleyj@mso.umt.edu

Abstract

Approximating non-Gaussian noise processes with Gaussian models is standard in data analysis, due in large part to the fact that Gaussian models yield parameter estimation problems of least squares form, which have been studied extensively from both the theoretical and computational points of view. In image processing applications, for example, data is often collected by a CCD camera, in which case the noise is a Gaussian/Poisson mixture, with the Poisson noise dominating when the signal is sufficiently strong. Even so, the standard approach in such cases is to use a Gaussian approximation, which leads to a negative-log likelihood function of weighted least squares type.
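
As a point of reference, the two fit-to-data terms mentioned above can be sketched in LaTeX as follows for image data z, forward (blurring) operator A, known background b, and Gaussian read-noise variance \sigma^2; the notation here is illustrative and may differ from that used in the report itself:

    % Poisson negative-log likelihood (additive constants dropped):
    T_{\mathrm{P}}(u) = \sum_i \left[ (Au + b)_i - z_i \ln (Au + b)_i \right],

    % weighted least squares (Gaussian) approximation:
    T_{\mathrm{WLS}}(u) = \frac{1}{2} \sum_i \frac{\left( (Au + b)_i - z_i \right)^2}{z_i + \sigma^2}.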

In the Bayesian point of view taken in this paper, a negative-log prior (or regularization) function is added to the negative-log likelihood function, and the resulting function is minimized. We focus on the case where the negative-log prior is the well-known total variation function and give it a statistical interpretation. Regardless of whether the least squares or Poisson negative-log likelihood is used, the total variation term yields a minimization problem that is computationally challenging. The primary result of this work is an efficient computational method for the solution of such problems, together with its convergence analysis. With the computational method in hand, we then perform experiments indicating that the Poisson negative-log likelihood yields a more computationally efficient method than does the least squares function. We also present results indicating that this may be the case even when the data noise is i.i.d. Gaussian, suggesting that, regardless of the noise statistics, using the Poisson negative-log likelihood can yield a more computationally tractable problem when total variation regularization is used.
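
Concretely, the minimization problem in question takes the form sketched below, where T(u) denotes either of the negative-log likelihoods above, \alpha > 0 is the regularization parameter, and \beta > 0 is a small smoothing parameter that makes the total variation term differentiable; as before, this notation is a sketch and may not match the report's exactly:

    % total variation regularized, nonnegatively constrained problem:
    \min_{u \ge 0} \; T(u) + \alpha J_\beta(u),
    \qquad
    J_\beta(u) = \int_\Omega \sqrt{ |\nabla u|^2 + \beta } \, dx.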

Keywords: total variation, nonnegatively constrained optimization, image reconstruction, Bayesian statistical methods.

AMS Subject Classification: 65J22, 65K10, 65F22.
