
Markov Random Fields

We studied undirected probabilistic graphical models and present here some results in image processing.

We sampled Markov Random Field models (Ising model, Potts model) using the following algorithms:

  • Gibbs sampling
  • Metropolis sampling
  • Iterated Conditional Modes (ICM)

Finally, we applied these methods to an image denoising task.

Ising model

We aim to sample the following distribution on the state space {0,1}^n:

p(x) = (1/Z) · exp(β · Σ_{i∼j} 1{x_i = x_j})

where the sum runs over pairs of neighbouring pixels, β > 0 controls the coupling strength between neighbours, and Z is the normalizing constant.

We sampled the Ising model using 3 different algorithms: Gibbs sampling, Metropolis sampling, and ICM.
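One Gibbs sweep over the pixels can be sketched as follows. This is a minimal illustration, not the repository's code; the 4-neighbour convention and the indicator form of the conditional are assumptions.

```python
import numpy as np

def gibbs_sweep(x, beta, rng):
    """One in-place Gibbs sweep over an Ising configuration x in {0,1}^(h,w)."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            # Collect the 4-connected neighbours that exist inside the grid.
            neigh = [x[i2, j2] for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= i2 < h and 0 <= j2 < w]
            n1 = sum(neigh)            # neighbours in state 1
            n0 = len(neigh) - n1       # neighbours in state 0
            # Conditional: p(x_ij = 1 | rest) ∝ exp(beta * n1)
            p1 = np.exp(beta * n1) / (np.exp(beta * n1) + np.exp(beta * n0))
            x[i, j] = rng.random() < p1
    return x

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=(32, 32))
for _ in range(50):
    gibbs_sweep(x, beta=1.0, rng=rng)
```

Larger β makes neighbouring pixels more likely to agree, which is why the samples below look smoother as β grows.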

Example with β = 1:

Example with β = 3:

While the MCMC algorithms (Gibbs and Metropolis) have theoretical guarantees of convergence to the target distribution, ICM converges quickly to a local minimum. On the following graph we compare the convergence of these algorithms by measuring how fast they minimize the global energy of the distribution.
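The global energy being tracked can be computed directly from a configuration. This is a sketch under the assumption that the energy is −β times the number of agreeing 4-connected neighbour pairs:

```python
import numpy as np

def ising_energy(x, beta):
    """Global energy E(x) = -beta * #{4-connected neighbour pairs with equal states}."""
    agree_h = (x[:, :-1] == x[:, 1:]).sum()   # horizontal neighbour pairs
    agree_v = (x[:-1, :] == x[1:, :]).sum()   # vertical neighbour pairs
    return -beta * float(agree_h + agree_v)

# A constant image minimizes the energy: every neighbour pair agrees.
flat = np.zeros((4, 4), dtype=int)
print(ising_energy(flat, beta=1.0))   # -(12 + 12) = -24.0
```

Plotting this quantity after each sweep of Gibbs, Metropolis and ICM gives the convergence comparison described above.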

While ICM is faster, it requires a suitable initial solution; otherwise it may not converge at all. On the following plots we compare the ability of the ICM, Gibbs and Metropolis algorithms to converge to the Ising model distribution starting from an all-white image.
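An ICM sweep is a greedy local update: each pixel is set to its most probable state given its neighbours, which for an Ising model with β > 0 reduces to a majority vote among the 4-neighbours. A sketch, not the repository's implementation:

```python
import numpy as np

def icm_sweep(x):
    """One ICM sweep: greedily set each pixel to its most probable state
    given its neighbours (majority vote for an Ising model with beta > 0).
    Returns the number of pixels that changed."""
    h, w = x.shape
    changed = 0
    for i in range(h):
        for j in range(w):
            neigh = [x[i2, j2] for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= i2 < h and 0 <= j2 < w]
            n1 = sum(neigh)
            n0 = len(neigh) - n1
            if n1 != n0:                     # keep the current state on ties
                best = int(n1 > n0)
                if best != x[i, j]:
                    x[i, j] = best
                    changed += 1
    return changed
```

Sweeps are repeated until `icm_sweep` returns 0. The fixed point is a local minimum of the energy that depends strongly on the initial configuration, which is why the choice of starting image matters.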

Potts model

We also sampled the Potts model, a generalization of the Ising model in which the state space {0,1,2,...,q}^n is no longer binary.
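The Gibbs update generalizes directly: the conditional of a pixel is proportional to exp(β × number of neighbours in each state). A sketch, with the state values and neighbour convention assumed:

```python
import numpy as np

def potts_gibbs_sweep(x, beta, q, rng):
    """One Gibbs sweep for the Potts model on {0, ..., q}^(h, w)."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            neigh = [x[i2, j2] for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= i2 < h and 0 <= j2 < w]
            # p(x_ij = k | rest) ∝ exp(beta * #{neighbours equal to k})
            counts = np.array([sum(n == k for n in neigh) for k in range(q + 1)])
            p = np.exp(beta * counts)
            x[i, j] = rng.choice(q + 1, p=p / p.sum())
    return x

rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=(24, 24))    # q = 3: four states
for _ in range(30):
    potts_gibbs_sweep(x, beta=1.0, q=3, rng=rng)
```

Setting q = 1 recovers the binary Ising update above.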

In the following we sampled:

Image denoising

To perform image denoising using Markov Random Fields, we added random Gaussian noise to an original binary image and then aim to recover the original image. We show that using an Ising prior achieves better results than a naive independence prior.
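The corruption step can be reproduced as follows. The image, the noise level σ and the seed are hypothetical placeholders, not the values used in the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary image: a white square on a black background.
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0

# Corrupt it with additive Gaussian noise (sigma is an assumption).
sigma = 0.5
noisy = clean + rng.normal(0.0, sigma, size=clean.shape)
```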

We performed image denoising in the Bayesian framework, first using a naive prior assuming independence between pixels, then using a prior following the Ising distribution with 4-connectivity.

Naive prior (independent pixels): p(x) = Π_i p(x_i)

Ising prior with 4-connectivity: p(x) ∝ exp(β · Σ_{i∼j} 1{x_i = x_j})

The likelihood (data attachment) is modelled by a Gaussian. Hence we have: p(y | x) = Π_i N(y_i ; μ_{x_i}, σ²)

We use the MAP estimate to recover the original image: x̂ = argmax_x p(x | y) = argmax_x p(y | x) · p(x)

First we estimated the parameters of the Gaussian by computing the empirical mean and standard deviation on random patches of the image. We also computed confidence intervals to assess the accuracy of the estimators.
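A sketch of the estimation step on one hypothetical constant patch. The true values μ = 1.0 and σ = 0.5, the patch size, and the 95% Gaussian confidence interval are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical noisy observations of a constant region (true mean 1.0, sigma 0.5).
patch = 1.0 + rng.normal(0.0, 0.5, size=400)

mu_hat = patch.mean()             # empirical mean
sigma_hat = patch.std(ddof=1)     # empirical standard deviation

# 95% confidence interval for the mean (Gaussian approximation).
half_width = 1.96 * sigma_hat / np.sqrt(patch.size)
ci = (mu_hat - half_width, mu_hat + half_width)
```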

We then applied the ICM algorithm to solve the MAP problem. With the naive prior, this is in fact equivalent to simple thresholding.
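Indeed, with an independence prior and equal-variance Gaussian likelihoods, the pixelwise MAP picks the class whose mean is closest to the observation, i.e. thresholding at the midpoint of the two class means. A sketch, with hypothetical class means:

```python
import numpy as np

def map_naive(noisy, mu0, mu1):
    """Pixelwise MAP under an independence prior and equal-variance
    Gaussian likelihoods: threshold at the midpoint of the class means."""
    return (noisy > (mu0 + mu1) / 2.0).astype(int)

# Hypothetical class means 0.0 (black) and 1.0 (white): threshold at 0.5.
restored = map_naive(np.array([[-0.2, 0.3, 0.9]]), 0.0, 1.0)
```

Because each pixel is treated independently, isolated noise-induced errors survive this step; the Ising prior is what removes them.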

We then applied the ICM algorithm to solve the MAP problem with the Ising prior, and we see that the resulting image is closer to the original image than in the previous case.
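An ICM sweep for this posterior minimizes, pixel by pixel, the sum of a Gaussian data-attachment term and the Ising prior term. A sketch; the exact energy weighting used in the repository may differ:

```python
import numpy as np

def icm_denoise_sweep(x, noisy, mu, sigma, beta):
    """One ICM sweep minimizing the posterior energy
    E(x) = sum_i (y_i - mu[x_i])^2 / (2 sigma^2) - beta * #{agreeing neighbour pairs}."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            neigh = [x[i2, j2] for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= i2 < h and 0 <= j2 < w]
            costs = []
            for k in (0, 1):
                data = (noisy[i, j] - mu[k]) ** 2 / (2.0 * sigma ** 2)
                prior = -beta * sum(n == k for n in neigh)
                costs.append(data + prior)
            x[i, j] = int(costs[1] < costs[0])
    return x
```

Starting from the thresholded image and sweeping until no pixel changes, the prior term pulls isolated misclassified pixels towards the label of their neighbours, which is what makes the result closer to the original than thresholding alone.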

This observation is confirmed by the following plot, which shows how the two models converge to the original image.