
Saturday, February 11, 2017

Bayesian Inference

Example:


"Prior Probability"
P(H1) = Probability of Computer Science (CS) major = (# of CS majors) / (# of students in total)
P(H1) = 0.03 (given)

P(H2) = Probability of Non-CS major = 1 - P(H1)
P(H2) = 1 - P(H1) = 1 - 0.03 = 0.97
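For concreteness, here is the prior computation as a minimal Python sketch. The enrollment counts are hypothetical (only P(H1) = 0.03 is given in the post) and are chosen so the ratio comes out to 0.03:

    # Hypothetical enrollment counts (not given in the post);
    # chosen only so that the prior comes out to the given 0.03.
    n_cs = 300             # number of CS majors
    n_total = 10000        # total number of students
    p_h1 = n_cs / n_total  # prior P(H1) = 0.03
    p_h2 = 1 - p_h1        # prior P(H2) = 0.97
    print(p_h1, p_h2)      # 0.03 0.97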

D: A person is meticulous and works in a well-organized manner.


"Posterior Probability"
P(H1|D): the probability of H1 after D has been observed for a person.

The probability that a person with D is a CS major follows from Bayes' theorem:
P(H1|D) * P(D) = P(D|H1) * P(H1)
P(H1|D) = P(D|H1) * P(H1) / P(D)

P(D) = P(H1) * P(D|H1) + P(H2) * P(D|H2)
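Before continuing the derivation, here is a quick numeric check of this total-probability step in Python. The likelihoods P(D|H1) = 0.2 and P(D|H2) = 0.05 are hypothetical (the post gives only their ratio, 1/4):

    p_h1, p_h2 = 0.03, 0.97
    # Hypothetical likelihoods; only their ratio 0.05 / 0.2 = 1/4 is given.
    p_d_h1, p_d_h2 = 0.2, 0.05
    p_d = p_h1 * p_d_h1 + p_h2 * p_d_h2   # P(D) by total probability
    print(round(p_d, 4))                  # 0.0545
    print(round(p_d_h1 * p_h1 / p_d, 4))  # posterior P(H1|D) = 0.1101

Any pair of likelihoods with the same 1/4 ratio yields the same posterior, which is exactly what the rewritten form below exploits.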

P(H1|D) = P(D|H1) * P(H1) / (P(H1) * P(D|H1) + P(H2) * P(D|H2))
P(H1|D) = P(H1) / (P(H1) + P(H2) * P(D|H2) / P(D|H1))   (dividing the numerator and denominator by P(D|H1))
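This last form needs only the prior and the likelihood ratio. A minimal helper in Python (the name posterior_from_ratio is mine, not from the post):

    def posterior_from_ratio(prior, likelihood_ratio):
        # P(H1|D) from the prior P(H1) and the ratio P(D|H2) / P(D|H1),
        # using the form derived above.
        return prior / (prior + (1.0 - prior) * likelihood_ratio)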

P(D|H2) / P(D|H1) = 1/4 (given). This means the probability of observing D for a non-CS major is one quarter of that for a CS major; in other words, a CS major is four times as likely as a non-CS major to be meticulous and well-organized.

P(H1|D) = 0.03 / (0.03 + 0.97 * 1/4) = 0.11009... ~ 0.11
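Plugging the given numbers into the helper above reproduces the hand calculation:

    p = posterior_from_ratio(0.03, 1 / 4)  # P(H1) = 0.03, P(D|H2)/P(D|H1) = 1/4
    print(round(p, 5))                     # 0.11009, i.e. ~ 0.11

Even though a CS major is four times as likely to show trait D, the posterior probability of being a CS major is still only about 11%, because the 3% prior is so small.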

