Maximum Likelihood, Maximum A Posteriori and Bayesian
Definitions
- x, z: random variables, usually the state and the observations respectively
- p(x): probability distribution function for a random variable x.
Bayes Rule
Combining the rules for conditional probabilities

p(x, z) = p(x|z) p(z)

and

p(x, z) = p(z|x) p(x)

we get Bayes rule:

p(x|z) = p(z|x) p(x) / p(z)
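As a minimal numerical sketch of Bayes rule, consider a binary state x (say, a robot is at its goal or not) and a binary observation z (a sensor beeps or not). All the probabilities below are made up for illustration:

```python
# Hypothetical discrete example of Bayes rule.
p_x = {True: 0.2, False: 0.8}          # prior p(x): belief before observing
p_z_given_x = {True: 0.9, False: 0.1}  # likelihood p(z=beep | x)

# Evidence p(z=beep) by the law of total probability.
p_z = sum(p_z_given_x[x] * p_x[x] for x in p_x)

# Bayes rule: p(x=True | z=beep) = p(z|x) p(x) / p(z)
posterior = p_z_given_x[True] * p_x[True] / p_z
print(round(posterior, 3))  # 0.692
```

Note how a fairly reliable sensor (0.9 hit rate) lifts a weak prior of 0.2 to a posterior of about 0.69, but the prior still keeps it well below 0.9.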
Examples:
- Likelihood - coin flipping
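The coin-flipping likelihood can be sketched with a binomial model. The counts below (7 heads in 10 flips) are illustrative, not from the text:

```python
from math import comb

# Likelihood of a coin bias theta given k heads in n flips (binomial model).
def likelihood(theta, k, n):
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

k, n = 7, 10

# Scan candidate biases; the likelihood peaks at the sample mean k/n = 0.7,
# which is exactly the maximum-likelihood estimate.
thetas = [i / 100 for i in range(1, 100)]
ml_theta = max(thetas, key=lambda t: likelihood(t, k, n))
print(ml_theta)  # 0.7
```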
Parameter Estimation
Bayes rule is often used (in robotics) to do parameter estimation. There are two ways you can do this:
- Maximum Likelihood: find the x for which the likelihood p(z|x) is maximum.
- Maximum a Posteriori: find the x for which the posterior p(x|z) is maximum.
Maximising the likelihood produces an estimate based purely on the observations, whereas maximising the posterior also incorporates the prior belief, which is usually embedded via information from experts or machine learning. This ML/MAP Tutorial Presentation has a couple of great simple examples which highlight the situation. MAP estimation pulls the estimate towards the prior: the more focused the prior belief, the larger the pull.
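The pull towards the prior can be seen in the coin example with a Beta prior on the bias, for which both estimates have closed forms. The counts and prior parameters below are illustrative assumptions:

```python
# Compare ML and MAP estimates of a coin bias theta after k heads in n flips,
# using a conjugate Beta(a, b) prior.
k, n = 7, 10

# Closed-form estimates for the binomial/Beta model:
#   ML:  k / n
#   MAP: (k + a - 1) / (n + a + b - 2)   (mode of the Beta posterior)
ml = k / n

def map_estimate(a, b):
    return (k + a - 1) / (n + a + b - 2)

# A weak prior centred on 0.5 pulls the estimate slightly towards 0.5;
# a more focused prior (larger a, b) pulls it harder.
weak = map_estimate(2, 2)      # Beta(2, 2)
strong = map_estimate(20, 20)  # Beta(20, 20)
print(ml, weak, strong)  # 0.7, then ~0.667, then ~0.542
```

With no data at all, both MAP variants would sit at the prior mode 0.5; as n grows, all three estimates converge to the same value, since the data eventually swamps the prior.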