Created by Boundless

Conditional Probability

The conditional probability of an event is the probability that an event will occur given that another event has occurred.

Learning Objective

  • Explain the significance of Bayes' theorem in manipulating conditional probabilities


Key Points

    • The conditional probability P(B|A) of an event B, given an event A, is defined by P(B|A) = P(A ∩ B) / P(A), when P(A) > 0.
    • If the knowledge that event A occurs does not change the probability that event B occurs, then A and B are independent events, and thus P(B|A) = P(B).
    • Mathematically, Bayes' theorem gives the relationship between the probabilities of A and B, P(A) and P(B), and the conditional probabilities of A given B and B given A, P(A|B) and P(B|A). In its most common form, it is P(A|B) = P(B|A)P(A) / P(B).

Terms

  • conditional probability

    The probability that an event will take place given the restrictive assumption that another event has taken place, or that a combination of other events has taken place.

  • independent

    Not dependent; not contingent or depending on something else; free.


Full Text

Probability of B Given That A Has Occurred

Our estimation of the likelihood of an event can change if we know that some other event has occurred. For example, the probability that a rolled die shows a 2 is 1/6 without any other information, but if someone looks at the die and tells you that it is an even number, the probability is now 1/3 that it is a 2. The notation P(B|A) indicates a conditional probability, meaning it indicates the probability of one event under the condition that we know another event has happened. The bar "|" can be read as "given", so that P(B|A) is read as "the probability of B given that A has occurred".

The conditional probability P(B|A) of an event B, given an event A, is defined by:

P(B|A) = P(A ∩ B) / P(A), when P(A) > 0.

Be sure to remember the distinct roles of B and A in this formula. The set after the bar is the one we are assuming has occurred, and its probability appears in the denominator of the formula.
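
To make the formula concrete, here is a minimal Python sketch (the function name cond_prob and the use of equally likely outcomes are our own illustrative choices, not part of the original text). For a finite sample space of equally likely outcomes, P(A ∩ B) / P(A) reduces to counting outcomes, which the sketch applies to the die example above.

def cond_prob(B, A):
    """Return P(B|A) for sets of equally likely outcomes; A must be nonempty."""
    return len(A & B) / len(A)

die = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}              # the die shows an even number
B = {2}                    # the die shows a 2

print(cond_prob(B, A))     # 0.333..., i.e. 1/3, matching the example above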

Example

Suppose that a coin is flipped 3 times, giving the sample space:

S = {HHH, HHT, HTH, THH, TTH, THT, HTT, TTT}

Each individual outcome has probability 1/8. Suppose that B is the event that at least one heads occurs and A is the event that all 3 coins are the same. Then the probability of B given A is 1/2, since A ∩ B = {HHH}, which has probability 1/8, and A = {HHH, TTT}, which has probability 2/8, and (1/8)/(2/8) = 1/2.
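
As an optional check, the short Python sketch below (our own illustration, with variable names of our choosing) enumerates the eight equally likely outcomes and recovers P(B|A) = 1/2.

from itertools import product

# Sample space for three coin flips; each of the 8 outcomes has probability 1/8.
S = [''.join(flips) for flips in product('HT', repeat=3)]

A = {s for s in S if len(set(s)) == 1}   # all three coins the same: {HHH, TTT}
B = {s for s in S if 'H' in s}           # at least one heads

p_A = len(A) / len(S)                    # P(A) = 2/8
p_A_and_B = len(A & B) / len(S)          # P(A ∩ B) = 1/8, since A ∩ B = {HHH}

print(p_A_and_B / p_A)                   # 0.5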

Independence

The conditional probability P(B|A) is not always equal to the unconditional probability P(B). The reason behind this is that the occurrence of event A may provide extra information that can change the probability that event B occurs. If the knowledge that event A occurs does not change the probability that event B occurs, then A and B are independent events, and thus P(B|A) = P(B).
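
For a concrete illustration (ours, not from the original text), consider two flips of a fair coin: learning that the first flip landed heads tells us nothing about the second. The brief Python sketch below verifies this by enumeration.

from itertools import product

# Two flips of a fair coin; all four outcomes (HH, HT, TH, TT) are equally likely.
S = [''.join(flips) for flips in product('HT', repeat=2)]

A = {s for s in S if s[0] == 'H'}        # first flip is heads
B = {s for s in S if s[1] == 'H'}        # second flip is heads

p_B = len(B) / len(S)                    # unconditional: P(B) = 1/2
p_B_given_A = len(A & B) / len(A)        # conditional:   P(B|A) = 1/2

print(p_B_given_A == p_B)                # True, so A and B are independent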

Bayes' Theorem

In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) is a result that is of importance in the mathematical manipulation of conditional probabilities. It can be derived from the basic axioms of probability.

Mathematically, Bayes' theorem gives the relationship between the probabilities of A and B, P(A) and P(B), and the conditional probabilities of A given B and B given A. In its most common form, it is:

P(A|B) = P(B|A)P(A) / P(B)

This may be easier to remember in this alternate symmetric form: 

P(A|B) / P(B|A) = P(A) / P(B)
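
The formula translates directly into code. The brief Python sketch below (the function name bayes and the numeric values are hypothetical choices of ours) applies the theorem and confirms the symmetric form numerically.

def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: returns P(A|B) = P(B|A) * P(A) / P(B); requires P(B) > 0."""
    return p_b_given_a * p_a / p_b

# Hypothetical probabilities, chosen only to illustrate the formula.
p_a, p_b, p_b_given_a = 0.3, 0.4, 0.6

p_a_given_b = bayes(p_b_given_a, p_a, p_b)                  # 0.45

# Symmetric form: P(A|B) / P(B|A) = P(A) / P(B)
print(abs(p_a_given_b / p_b_given_a - p_a / p_b) < 1e-12)   # True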

Example:

Suppose someone told you they had a nice conversation with someone on the train. Not knowing anything else about this conversation, the probability that they were speaking to a woman is 50%. Now suppose they also told you that this person had long hair. It is now more likely they were speaking to a woman, since women in this city are more likely to have long hair than men. Bayes' theorem can be used to calculate the probability that the person is a woman.

To see how this is done, let W represent the event that the conversation was held with a woman, and L denote the event that the conversation was held with a long-haired person. It can be assumed that women constitute half the population for this example. So, not knowing anything else, the probability that W occurs is P(W) = 0.5.

Suppose it is also known that 75% of women in this city have long hair, which we denote as P(L|W) = 0.75. Likewise, suppose it is known that 25% of men in this city have long hair, or P(L|M) = 0.25, where M is the complementary event of W, i.e., the event that the conversation was held with a man (assuming that every human is either a man or a woman).

Our goal is to calculate the probability that the conversation was held with a woman, given the fact that the person had long hair, or, in our notation, P(W|L). Using the formula for Bayes' theorem, we have:

P(W|L) = P(L|W)P(W) / P(L)
       = P(L|W)P(W) / [P(L|W)P(W) + P(L|M)P(M)]
       = (0.75 · 0.5) / (0.75 · 0.5 + 0.25 · 0.5)
       = 0.75
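
For readers who prefer to check the arithmetic in code, here is a short Python sketch (our own illustration, not part of the original text) that mirrors the derivation above, expanding P(L) with the law of total probability.

# Values from the example above.
p_W = 0.5                  # P(W): prior probability the speaker was a woman
p_M = 1 - p_W              # P(M): prior probability the speaker was a man
p_L_given_W = 0.75         # P(L|W): proportion of women with long hair
p_L_given_M = 0.25         # P(L|M): proportion of men with long hair

# Law of total probability: P(L) = P(L|W)P(W) + P(L|M)P(M)
p_L = p_L_given_W * p_W + p_L_given_M * p_M

# Bayes' theorem: P(W|L) = P(L|W)P(W) / P(L)
p_W_given_L = p_L_given_W * p_W / p_L

print(p_W_given_L)         # 0.75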

Except where noted, content and user contributions on this site are licensed under CC BY-SA 4.0 with attribution required.