Maximum Likelihood, Maximum A Posteriori and Bayesian Estimation
Definitions
- $x$, $z$: random variables, usually state and observations respectively.
- $p(x)$: probability distribution function for a random variable $x$.
Bayes Rule
Combining the two expansions of the joint probability, $p(x,z) = p(x|z)\,p(z)$ and $p(x,z) = p(z|x)\,p(x)$, we get Bayes rule:

$$p(x|z) = \frac{p(z|x)\,p(x)}{p(z)} \qquad (1)$$
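As a quick numeric check of (1), here is a minimal Python sketch (all numbers are made up for illustration): a robot's binary obstacle detector with known error rates.

```python
p_obstacle = 0.1                  # prior p(x): obstacle is present
p_detect_given_obstacle = 0.9     # likelihood p(z|x)
p_detect_given_clear = 0.2        # false-positive rate p(z|not x)

# Evidence p(z), by marginalising over both states of x.
p_detect = (p_detect_given_obstacle * p_obstacle
            + p_detect_given_clear * (1.0 - p_obstacle))

# Bayes rule (1): p(x|z) = p(z|x) p(x) / p(z)
p_obstacle_given_detect = p_detect_given_obstacle * p_obstacle / p_detect
print(p_obstacle_given_detect)    # ~0.333: one detection lifts 0.1 to 0.33
```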
Examples:
- Likelihood: coin flipping, i.e. $p(z|\theta)$, the probability of an observed sequence of flips given the coin's bias $\theta$ (sketched below).
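A minimal sketch of the coin-flipping likelihood, assuming a binomial model and made-up data (7 heads in 10 flips):

```python
import numpy as np

heads, flips = 7, 10
theta = np.linspace(0.0, 1.0, 101)   # candidate biases

# p(z|theta) up to the constant binomial coefficient, which does not
# affect where the maximum sits.
likelihood = theta**heads * (1.0 - theta)**(flips - heads)

theta_ml = theta[np.argmax(likelihood)]
print(theta_ml)   # 0.7, i.e. heads/flips: the maximum-likelihood estimate
```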
Parameter Estimation
Bayes rule is often used (in robotics) to do parameter estimation. There are two ways you can do this:
- Maximum Likelihood (ML): find the $x$ for which the likelihood $p(z|x)$ is maximum.
- Maximum a Posteriori (MAP): find the $x$ for which the posterior $p(x|z)$ is maximum.
Maximising the likelihood generates a result based purely on the observations, whereas maximising the posterior also incorporates the prior belief, which usually encodes information from experts or from machine learning. This ML/MAP Tutorial Presentation has a couple of great, simple examples which highlight the situation. MAP estimation pulls the estimate towards the prior: the more focused the prior belief, the stronger the pull.
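To make that pull concrete, here is a small sketch of ML versus MAP for the coin-bias example above, assuming a Beta prior (all numbers hypothetical):

```python
heads, flips = 7, 10

# ML: maximise p(z|theta) alone; for a binomial this is simply heads/flips.
theta_ml = heads / flips                              # 0.70

# MAP: maximise p(z|theta) * p(theta). With a Beta(a, b) prior the mode of
# the posterior has the closed form (heads + a - 1) / (flips + a + b - 2).
a, b = 20.0, 20.0                                     # focused prior on 0.5
theta_map = (heads + a - 1) / (flips + a + b - 2)     # ~0.54

print(theta_ml, theta_map)
```

The focused Beta(20, 20) prior pulls the estimate from 0.70 back towards 0.50, while a vaguer prior such as Beta(2, 2) would give roughly 0.67, barely moving it.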