Maximum Likelihood, Maximum A Posteriori and Bayesian

Definitions

  • x, z : random variables, usually the state and the observations respectively.
  • p(x) : probability distribution function for a random variable x.

Bayes Rule

Combining the rules for conditional probabilities p(x, z) = p(x | z) p(z) and p(x, z) = p(z | x) p(x), we get Bayes rule:

p(x | z) = p(z | x) p(x) / p(z)
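As a worked numeric instance of Bayes rule (the numbers below are hypothetical, chosen for illustration, not taken from the text), consider a robot's obstacle detector: x is "obstacle present", z is "detector fires".

```python
# Hypothetical numbers for a worked Bayes-rule example.
p_x = 0.01              # prior p(x): obstacle present
p_z_given_x = 0.9       # likelihood p(z|x): detector fires given an obstacle
p_z_given_not_x = 0.05  # false-alarm rate p(z|~x)

# Total probability: p(z) = p(z|x) p(x) + p(z|~x) p(~x)
p_z = p_z_given_x * p_x + p_z_given_not_x * (1 - p_x)

# Bayes rule: p(x|z) = p(z|x) p(x) / p(z)
p_x_given_z = p_z_given_x * p_x / p_z
print(round(p_x_given_z, 3))  # posterior is ~0.154, far above the 0.01 prior
```

Note that even a fairly reliable detector yields a modest posterior here, because the prior probability of an obstacle is so small.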

Examples:

  1. Likelihood - coin flipping
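A minimal sketch of the coin-flipping likelihood (the data below, 7 heads in 10 flips, is an assumed example): observing k heads in n flips, the binomial likelihood of a coin bias theta is L(theta) = C(n, k) theta^k (1 - theta)^(n - k), and a grid search recovers the maximum-likelihood estimate k/n.

```python
from math import comb

def likelihood(theta, k, n):
    # Binomial likelihood of bias theta given k heads in n flips.
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

n, k = 10, 7  # hypothetical data: 7 heads in 10 flips
grid = [i / 1000 for i in range(1001)]
theta_ml = max(grid, key=lambda t: likelihood(t, k, n))
print(theta_ml)  # the likelihood peaks at k/n = 0.7
```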

Parameter Estimation

Bayes rule is often used (in robotics) to do parameter estimation. There are two ways you can do this:

  1. Maximum Likelihood : find the x for which the likelihood p(z | x) is maximum.
  2. Maximum a Posteriori : find the x for which the posterior p(x | z) is maximum.

Maximising the likelihood produces an estimate based purely on the observations, whereas maximising the posterior also incorporates the prior belief, which usually encodes information from experts or from machine learning. This ML/MAP Tutorial Presentation has a couple of great, simple examples which highlight the situation. MAP estimation pulls the estimate towards the prior: the more focused the prior belief, the stronger the pull.
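The pull towards the prior can be sketched numerically (the data and priors below are illustrative assumptions, not from the text): for a coin with a Beta(a, b) prior on its bias, the MAP estimate after k heads in n flips has the closed form (k + a - 1) / (n + a + b - 2), while ML gives k/n.

```python
def map_estimate(k, n, a, b):
    # MAP estimate of a coin bias under a Beta(a, b) prior (mode of the
    # Beta(k + a, n - k + b) posterior).
    return (k + a - 1) / (n + a + b - 2)

n, k = 10, 7
theta_ml = k / n                         # 0.7, from the likelihood alone
weak_map = map_estimate(k, n, 2, 2)      # weak prior centred on 0.5
strong_map = map_estimate(k, n, 20, 20)  # sharp prior centred on 0.5
print(theta_ml, round(weak_map, 3), round(strong_map, 3))
```

Both MAP estimates sit between the ML estimate and the prior mode of 0.5, and the sharper Beta(20, 20) prior pulls the estimate much closer to 0.5 than the weak Beta(2, 2) prior does.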