- Bayes' theorem converts a prior probability into a posterior probability by incorporating the evidence provided by the observed data.
- P(W | D) = P(D | W) P(W) / P(D), i.e. posterior ∝ likelihood × prior
- P(W | D): posterior probability
- P(D | W): likelihood function
- P(W): prior probability
- P(D): the normalization constant, which ensures that the posterior distribution on the left-hand side is a valid probability density and integrates to one.
- maximum likelihood: W is set to the value that maximizes the likelihood function P(D | W). This corresponds to choosing the value of W for which the probability of the observed data set is maximized (see the sketch below).
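As a concrete illustration of the quantities above, here is a minimal Python sketch of the Bayes update on a grid. The coin-flip model, the uniform prior, and the "7 heads out of 10 flips" data are assumptions chosen purely for the example, not part of the theorem itself.

```python
import numpy as np

# Discretize W (the unknown heads probability) on a grid so integrals become sums.
w = np.linspace(0.0, 1.0, 1001)

# P(W): a uniform prior over the grid.
prior = np.ones_like(w) / len(w)

# Observed data D: 7 heads out of 10 flips (assumed example data).
heads, flips = 7, 10

# P(D | W): likelihood of the observed data for each candidate value of W.
likelihood = w**heads * (1.0 - w)**(flips - heads)

# P(D): normalization constant, summing likelihood x prior over all W.
evidence = np.sum(likelihood * prior)

# P(W | D): posterior = likelihood x prior / evidence; sums to one by construction.
posterior = likelihood * prior / evidence

# Maximum likelihood: the value of W that maximizes P(D | W).
w_ml = w[np.argmax(likelihood)]  # 0.7 for 7 heads in 10 flips

print(f"posterior sums to {posterior.sum():.4f}")
print(f"maximum-likelihood estimate of W: {w_ml:.3f}")
```

Note how the posterior depends on both the prior and the likelihood, while the maximum-likelihood estimate ignores the prior entirely and just picks the peak of P(D | W).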