Expectation propagation for scalable inverse problems in imaging
The solutions of ill-posed inverse problems in imaging are usually non-unique, making it important to quantify the uncertainty associated with any estimate. Bayesian inference provides a powerful theoretical framework for deriving summaries of the posterior distribution of the unknown image, such as the posterior mean and covariance. However, computing such summaries requires integrating over high-dimensional image vectors. Markov chain Monte Carlo sampling methods are classically and widely used to draw samples from posterior distributions, but accurate evaluation of higher-order moments by sampling remains computationally expensive and not yet fully scalable for fast inference. Expectation Propagation offers a fast alternative to sampling and has recently become popular for approximate Bayesian inference. This thesis proposes a set of new Expectation Propagation algorithms that achieve scalable posterior approximation for high-dimensional imaging inverse problems, providing both point estimates and uncertainty quantification. By designing the factorization of the posterior distribution and tailoring the covariance structure of the approximating distributions, the resulting Expectation Propagation algorithms scale to high-dimensional imaging problems.
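As background on the approach, Expectation Propagation replaces each intractable factor of a posterior with a Gaussian "site" and refines the sites by moment matching against a cavity distribution. The following minimal one-dimensional sketch shows that loop on a toy Bayesian probit model; the model, data, and variance values are illustrative assumptions for exposition only, not one of the imaging models developed in this thesis.

```python
import numpy as np
from math import erf, sqrt, pi, exp

def Phi(z):  # standard normal CDF
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def phi(z):  # standard normal pdf
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

# Toy 1D posterior: p(theta | y) propto N(theta; 0, v0) * prod_i Phi(y_i * theta).
# Each non-Gaussian site Phi(y_i * theta) is approximated by a Gaussian.
v0 = 4.0                                   # illustrative prior variance
y = np.array([1.0, 1.0, -1.0, 1.0, 1.0])  # illustrative +/-1 observations
n = len(y)

# Site parameters in natural form: precision r_i and precision-mean b_i.
r = np.zeros(n)
b = np.zeros(n)

for _ in range(50):  # EP sweeps over the sites
    for i in range(n):
        # Global Gaussian approximation (precision and precision-mean).
        R = 1.0 / v0 + r.sum()
        B = b.sum()
        # Cavity distribution: the global approximation with site i removed.
        r_cav, b_cav = R - r[i], B - b[i]
        v_cav = 1.0 / r_cav
        m_cav = b_cav * v_cav
        # Moment-match the tilted distribution N(m_cav, v_cav) * Phi(y_i * theta),
        # using the standard closed-form Gaussian-probit moments.
        z = y[i] * m_cav / sqrt(1.0 + v_cav)
        alpha = phi(z) / Phi(z)
        m_new = m_cav + y[i] * v_cav * alpha / sqrt(1.0 + v_cav)
        v_new = v_cav - v_cav**2 * alpha * (z + alpha) / (1.0 + v_cav)
        # Update site i so the new global approximation has the tilted moments.
        r[i] = 1.0 / v_new - r_cav
        b[i] = m_new / v_new - b_cav

post_var = 1.0 / (1.0 / v0 + r.sum())
post_mean = post_var * b.sum()
print(post_mean, post_var)
```

Each site update is a cheap local computation, which is what makes EP attractive relative to sampling: the expensive global integral is replaced by repeated low-dimensional moment matching.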
The main novelty lies in three aspects: (1) a block-diagonal covariance matrix structure is, to the best of the author's knowledge, proposed for the first time when applying Expectation Propagation to Bayesian models with patch-based image priors; (2) factorizations over convex and non-convex gradient-based priors are designed to allow highly parallel computation when using Expectation Propagation to solve high-dimensional imaging inverse problems; and (3) the proposed Expectation Propagation algorithms are embedded within larger inference schemes in which the prior regularization parameters are unknown. Without significantly increasing the computational footprint, the resulting Expectation Propagation based algorithms combine greater scalability with unsupervised hyperparameter tuning.
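The scalability benefit of restricting the covariance structure can be illustrated with a simple count of stored entries. The sketch below uses a toy linear-Gaussian inverse problem (all dimensions, operators, and variances are illustrative assumptions, not the thesis's models) and compares the dense posterior covariance with a block-diagonal restriction that keeps one small block per patch.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 64, 8  # illustrative image dimension and patch size; real n can be ~1e6

# Toy linear-Gaussian inverse problem y = A x + noise, with Gaussian prior.
A = rng.standard_normal((n, n)) / np.sqrt(n)
sigma2, tau2 = 0.1, 1.0  # illustrative noise and prior variances
x_true = rng.standard_normal(n)
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(n)

# Exact posterior covariance is dense: O(n^2) storage, O(n^3) to form.
precision = A.T @ A / sigma2 + np.eye(n) / tau2
cov_full = np.linalg.inv(precision)

# A block-diagonal restriction keeps only the p x p blocks on the diagonal
# (one block per patch): storage drops from n^2 to n*p entries.
blocks = [cov_full[i:i + p, i:i + p] for i in range(0, n, p)]
full_entries = n * n
block_entries = sum(b.size for b in blocks)
print(full_entries, block_entries)  # 4096 vs 512 entries
```

The block structure still records correlations within each patch (unlike a purely diagonal approximation) while discarding the cross-patch entries whose storage cost dominates at high dimension; this is the kind of trade-off the patch-based design above exploits.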