Probability Calculus
Chain rule and conditional probabilities
Suppose we have two random variables $X$ and $Y$ with joint distribution $P(X, Y)$. We understand $P(X = x, Y = y)$ as the probability that $X$ has outcome $x$ and $Y$ has outcome $y$. Intuitively, we can also understand this joint probability as follows: my belief that $X$ will take on outcome $x$ and $Y$ takes on outcome $y$ can be ‘decomposed’ into my belief that $X$ will take on $x$ in general, and then that $Y$ takes on value $y$ given that $X$ has taken on value $x$. For example, my belief in me studying well and passing the exam is simply my belief in me studying well in general combined with my belief that if I study well, I will pass the exam. Formally, we write this as

$$P(X = x, Y = y) = P(Y = y \mid X = x)\,P(X = x).$$
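To make the chain rule concrete, here is a minimal Python sketch of the studying/exam example; all probability values are illustrative assumptions, not taken from the text.

```python
# Chain rule P(X, Y) = P(Y | X) P(X), with X = "studied well", Y = "passed the exam".
# All numbers below are made up for illustration.

p_x = {True: 0.6, False: 0.4}                      # P(X): belief that I study well
p_y_given_x = {True:  {True: 0.9, False: 0.1},     # P(Y | X = studied well)
               False: {True: 0.3, False: 0.7}}     # P(Y | X = did not study well)

# The joint distribution follows from the chain rule: P(x, y) = P(y | x) * P(x).
p_joint = {(x, y): p_y_given_x[x][y] * p_x[x]
           for x in p_x for y in (True, False)}

print(p_joint[(True, True)])  # P(study well and pass) = 0.9 * 0.6 = 0.54 (up to float rounding)
```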
Marginalization and marginal probabilities
Another very important fact of probability theory is marginalization. It says that if we have two random variables $X$ and $Y$, we can also understand $P(X)$ as a ‘summed out’ probability of the joint distribution of $X$ and $Y$, i.e.

$$P(X = x) = \sum_{y} P(X = x, Y = y).$$
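As a companion to the sketch above, here is how marginalization recovers $P(X)$ from the joint table; again, the numbers are illustrative assumptions.

```python
# Marginalization: P(X = x) = sum over y of P(X = x, Y = y).
# Joint table from the chain-rule sketch above (made-up numbers).
p_joint = {(True, True): 0.54, (True, False): 0.06,
           (False, True): 0.12, (False, False): 0.28}

p_x = {}
for (x, y), p in p_joint.items():
    p_x[x] = p_x.get(x, 0.0) + p   # sum out Y

print(p_x)  # roughly {True: 0.6, False: 0.4}, matching the P(X) we started from
```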
Exercise: Suppose we have a joint distribution over several random variables; how can we use the above two rules to find the marginal or conditional distribution of one of them? The answer will be covered in the theory page on graphical models.
Bayes' rule
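Bayes' rule follows directly from the two rules above. Applying the chain rule in both orders gives $P(X, Y) = P(Y \mid X)\,P(X) = P(X \mid Y)\,P(Y)$, and dividing by $P(Y)$ yields

$$P(X \mid Y) = \frac{P(Y \mid X)\,P(X)}{P(Y)}, \qquad \text{where } P(Y = y) = \sum_{x} P(Y = y \mid X = x)\,P(X = x)$$

and the denominator is obtained by marginalization combined with the chain rule.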
Summary: In this theory page, you have learned how to answer queries such as marginals and conditionals given your joint distribution. For this, we have looked at the most important rules of probability theory.