
Probability theory provides a robust and well-understood framework for handling uncertainty. In addition to "prior" probability, it is also useful to master conditional probability, which sharpens our ability to reason about uncertain events. Can you explain how conditional probability works, and how to analyze the likelihood of events that appear to depend on one another?

Example

Suppose that somebody secretly rolls two fair six-sided dice, and we wish to compute the probability that the face-up value of the first one is 2, given the information that their sum is no greater than 5.

  • Let D1 be the value rolled on die 1.
  • Let D2 be the value rolled on die 2.


Probability that D1 = 2

Table 1 shows the sample space of 36 combinations of rolled values of the two dice, each of which occurs with probability 1/36. Each cell contains the sum D1 + D2, and the six outcomes with D1 = 2 are bracketed.

D1 = 2 in exactly 6 of the 36 outcomes; thus P(D1 = 2) = 6⁄36 = 1⁄6:

Table 1

 +        D2
          1     2     3     4     5     6
 D1   1   2     3     4     5     6     7
      2  [3]   [4]   [5]   [6]   [7]   [8]
      3   4     5     6     7     8     9
      4   5     6     7     8     9    10
      5   6     7     8     9    10    11
      6   7     8     9    10    11    12
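
The count can be verified by brute-force enumeration. The following sketch (a Python illustration, not part of the original example) lists all 36 equally likely outcomes and counts those in which D1 = 2:

    from itertools import product

    # All 36 equally likely (D1, D2) outcomes of rolling two fair dice.
    outcomes = list(product(range(1, 7), repeat=2))

    # Outcomes in which die 1 shows a 2.
    d1_is_2 = [(d1, d2) for d1, d2 in outcomes if d1 == 2]

    print(len(d1_is_2), len(outcomes))   # 6 36
    print(len(d1_is_2) / len(outcomes))  # 0.1666... = 1/6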

Probability that D1 + D2 ≤ 5

Table 2 shows that D1 + D2 ≤ 5 for exactly 10 of the 36 outcomes (bracketed); thus P(D1 + D2 ≤ 5) = 10⁄36:

Table 2

 +        D2
          1     2     3     4     5     6
 D1   1  [2]   [3]   [4]   [5]    6     7
      2  [3]   [4]   [5]    6     7     8
      3  [4]   [5]    6     7     8     9
      4  [5]    6     7     8     9    10
      5   6     7     8     9    10    11
      6   7     8     9    10    11    12
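
Again as a quick check, the same enumeration (a Python sketch in the style of the previous snippet) counts the outcomes whose sum is no greater than 5:

    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))

    # Outcomes whose sum is no greater than 5.
    sum_at_most_5 = [(d1, d2) for d1, d2 in outcomes if d1 + d2 <= 5]

    print(len(sum_at_most_5), len(outcomes))   # 10 36
    print(len(sum_at_most_5) / len(outcomes))  # 0.2777... ≈ 10/36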


Probability that D1 = 2 given that D1 + D2 ≤ 5

Table 3 shows that D1 = 2 for 3 of these 10 outcomes (bracketed).

Thus, the conditional probability P(D1 = 2 | D1+D2 ≤ 5) = 3⁄10 = 0.3:


Table 3

 +        D2
          1     2     3     4     5     6
 D1   1   2     3     4     5     6     7
      2  [3]   [4]   [5]    6     7     8
      3   4     5     6     7     8     9
      4   5     6     7     8     9    10
      5   6     7     8     9    10    11
      6   7     8     9    10    11    12

Here, in the earlier notation for the definition of conditional probability, the conditioning event B is that D1 + D2 ≤ 5, and the event A is D1 = 2. We have P(A | B) = P(A ∩ B)/P(B) = (3⁄36)/(10⁄36) = 3⁄10, as seen in the table.
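
As a final check, the conditional probability can be computed two ways in the same enumeration style (a Python sketch, not part of the original text): by the ratio P(A ∩ B)/P(B), and by restricting the sample space to the 10 outcomes satisfying B and counting those with D1 = 2.

    from itertools import product
    from fractions import Fraction

    outcomes = list(product(range(1, 7), repeat=2))
    n = len(outcomes)  # 36

    # Event A: D1 = 2.  Event B: D1 + D2 <= 5.
    p_b       = Fraction(sum(1 for d1, d2 in outcomes if d1 + d2 <= 5), n)
    p_a_and_b = Fraction(sum(1 for d1, d2 in outcomes if d1 == 2 and d1 + d2 <= 5), n)

    # Ratio definition: P(A | B) = P(A ∩ B) / P(B).
    print(p_a_and_b / p_b)  # 3/10

    # Equivalent: restrict the sample space to B and count A within it.
    b_outcomes = [(d1, d2) for d1, d2 in outcomes if d1 + d2 <= 5]
    print(Fraction(sum(1 for d1, d2 in b_outcomes if d1 == 2), len(b_outcomes)))  # 3/10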