Conditional Probability: Definition, Formula, Properties and Examples

Edited By Komal Miglani | Updated on Jul 02, 2025 07:54 PM IST

Probability is defined as the ratio of the number of favorable outcomes to the total number of outcomes. Conditional probability measures the probability of an event occurring given that another event has already occurred. It provides a way to update probabilities based on new information and is essential for understanding how events interact and depend on each other. Such situations arise frequently in everyday life, and conditional probability is the tool for analysing them.


Conditional Probability

Conditional probability is a measure of the probability of an event given that another event has already occurred. If A and B are two events associated with the same sample space of a random experiment, the conditional probability of the event A given that B has already occurred is written as $P(A \mid B), P(A / B)$ or $P\left(\frac{A}{B}\right)$.

The formula to calculate $P(A \mid B)$ is
$P(A \mid B)=\frac{P(A \cap B)}{P(B)}$, where $P(B)$ is greater than zero.
For example, suppose we toss one fair, six-sided die. The sample space $S=\{1,2,3,4,5,6\}$. Let $A=$ face is $2$ or $3$ and $B=$ face is an even number $(2,4,6)$.

Here, $P(A|B)$ means the probability of occurrence of face $2$ or $3$ when an even number has occurred which means that one of $2, 4$ and $6$ has occurred.

To calculate $P(A|B),$ we count the number of outcomes $2$ or $3$ in the modified sample space $B = \{2, 4, 6\}:$ meaning the common part in $A$ and $B$. Then we divide that by the number of outcomes in $B$ (rather than $S$).

$\begin{aligned} P(A \mid B) & =\frac{P(A \cap B)}{P(B)}=\frac{n(A \cap B)/n(S)}{n(B)/n(S)} \\ & =\frac{(\text{number of outcomes that are } 2 \text{ or } 3 \text{ and even})/6}{(\text{number of outcomes that are even})/6} \\ & =\frac{1/6}{3/6}=\frac{1}{3}\end{aligned}$
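The same calculation can be checked with a short Python sketch (an illustration, not part of the original text), counting outcomes in the restricted sample space:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of one fair die
A = {2, 3}               # event: face is 2 or 3
B = {2, 4, 6}            # event: face is even

# For equally likely outcomes, P(A | B) = n(A ∩ B) / n(B)
print(Fraction(len(A & B), len(B)))   # 1/3
```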

Properties of Conditional Probability

Let $A$ and $B$ be events of a sample space $S$ of an experiment. Then:

Property $1: \mathrm{P}(\mathrm{S} \mid \mathrm{A})=\mathrm{P}(\mathrm{A} \mid \mathrm{A})=1$
Proof:

$
P(S \mid A)=\frac{P(S \cap A)}{P(A)}=\frac{P(A)}{P(A)}=1
$

Also,

$
P(A \mid A)=\frac{P(A \cap A)}{P(A)}=\frac{P(A)}{P(A)}=1
$

Thus,

$
\mathrm{P}(\mathrm{S} \mid \mathrm{A})=\mathrm{P}(\mathrm{A} \mid \mathrm{A})=1
$

Property 2: If $A$ and $B$ are any two events of a sample space $S$ and $C$ is an event of $S$ such that $P(C) \neq 0$, then
$
P((A \cup B) \mid C)=P(A \mid C)+P(B \mid C)-P((A \cap B) \mid C)
$


In particular, if $A$ and $B$ are disjoint events, then

$
P((A \cup B) \mid C)=P(A \mid C)+P(B \mid C)
$


Proof:

$
\begin{aligned}
\mathrm{P}((\mathrm{A} \cup \mathrm{B}) \mid \mathrm{C}) & =\frac{\mathrm{P}[(\mathrm{A} \cup \mathrm{B}) \cap \mathrm{C}]}{\mathrm{P}(\mathrm{C})} \\
& =\frac{\mathrm{P}[(\mathrm{A} \cap \mathrm{C}) \cup(\mathrm{B} \cap \mathrm{C})]}{\mathrm{P}(\mathrm{C})}
\end{aligned}
$

(by distributive law of union of sets over intersection)

$
\begin{aligned}
& =\frac{P(A \cap C)+P(B \cap C)-P((A \cap B) \cap C)}{P(C)} \\
& =\frac{P(A \cap C)}{P(C)}+\frac{P(B \cap C)}{P(C)}-\frac{P[(A \cap B) \cap C]}{P(C)} \\
& =P(A \mid C)+P(B \mid C)-P((A \cap B) \mid C)
\end{aligned}
$


When $A$ and $B$ are disjoint events, then

$
\begin{aligned}
& \mathrm{P}((\mathrm{A} \cap \mathrm{B}) \mid \mathrm{C})=0 \\
& \Rightarrow \quad \mathrm{P}((\mathrm{A} \cup \mathrm{B}) \mid \mathrm{C})=\mathrm{P}(\mathrm{A} \mid \mathrm{C})+\mathrm{P}(\mathrm{B} \mid \mathrm{C})
\end{aligned}
$

Property $3: P\left(A^{\prime} \mid B\right)=1-P(A \mid B)$, if $P(B) \neq 0$
Proof:
From Property 1, we know that $\mathrm{P}(\mathrm{S} \mid \mathrm{B})=1$

$
\begin{array}{ll}
\Rightarrow \; P\left(\left(A \cup A^{\prime}\right) \mid B\right)=1 & \left(\text{as } A \cup A^{\prime}=S\right) \\
\Rightarrow \; P(A \mid B)+P\left(A^{\prime} \mid B\right)=1 & \left(\text{as } A \text{ and } A^{\prime} \text{ are disjoint events}\right) \\
\Rightarrow \; P\left(A^{\prime} \mid B\right)=1-P(A \mid B) &
\end{array}
$
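All three properties can be verified numerically for concrete die events. The sketch below is illustrative; the events $A$, $B$, $C$ are chosen only as an example:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # one roll of a fair die
A, B, C = {2, 3}, {2, 4, 6}, {1, 2, 3, 4}

def cond(X, Y):
    # P(X | Y) = n(X ∩ Y) / n(Y) for equally likely outcomes
    return Fraction(len(X & Y), len(Y))

assert cond(S, A) == cond(A, A) == 1                               # Property 1
assert cond(A | B, C) == cond(A, C) + cond(B, C) - cond(A & B, C)  # Property 2
assert cond(S - A, B) == 1 - cond(A, B)                            # Property 3
print("all three properties hold")
```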



Solved Examples Based on Conditional Probability:

Example 1: One Indian and four American men and their wives are to be seated randomly around a circular table. Then the conditional probability that the Indian man is seated adjacent to his wife given that each American man is seated adjacent to his wife is:

1) $\frac{1}{2}$
2) $\frac{1}{3}$
3) $\frac{2}{5}$
4) $\frac{1}{5}$

Solution

Conditional Probability
Let $A$ and $B$ be any two events such that $B \neq \phi$, i.e. $\mathrm{n}(\mathrm{B}) \neq 0$ and $\mathrm{P}(\mathrm{B}) \neq 0$. Then $P\left(\frac{A}{B}\right)$ denotes the conditional probability of occurrence of event $A$ when $B$ has already occurred.

Given that each of the four American couples sits together, there are five couple-blocks when the Indian couple is also together, giving $4!\cdot(2!)^5$ circular arrangements; with only the four American couples required together there are four blocks plus two individuals, giving $5!\cdot(2!)^4$ arrangements. Hence

$
P\left(\frac{I_M I_W}{A_M A_W}\right)=\frac{4!\cdot(2!)^5}{5!\cdot(2!)^4}=\frac{2}{5}
$

Hence, the answer is option 3.
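As a brute-force check (illustrative only, not the exam method), one can enumerate all distinct circular seatings of the ten people and count the favourable cases directly:

```python
from fractions import Fraction
from itertools import permutations

# Persons 2k and 2k+1 form couple k; couples 0-3 are American, couple 4 Indian.
def adjacent_couples(seating):
    ok = set()
    n = len(seating)
    for i in range(n):
        a, b = seating[i], seating[(i + 1) % n]
        if a // 2 == b // 2:          # neighbours from the same couple
            ok.add(a // 2)
    return ok

# Fixing person 0 in one seat enumerates all distinct circular arrangements.
american_together = indian_too = 0
for rest in permutations(range(1, 10)):
    ok = adjacent_couples((0,) + rest)
    if {0, 1, 2, 3} <= ok:            # every American man next to his wife
        american_together += 1
        if 4 in ok:                   # Indian couple also together
            indian_too += 1

print(Fraction(indian_too, american_together))   # 2/5
```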

Example 2: In a random experiment, a fair die is rolled until two fours are obtained in succession. The probability that the experiment will end in the fifth throw of the die is equal to:
1) $\frac{200}{6^5}$
2) $\frac{150}{6^5}$
3) $\frac{225}{6^5}$
4) $\frac{175}{6^5}$

Solution

Probability of occurrence of an event -

Let $S$ be the sample space then the probability of occurrence of an event $E$ is denoted by $P(E)$ and it is defined as

$
\begin{aligned}
& P(E)=\frac{n(E)}{n(S)} \\
& P(E) \leq 1 \\
& P(E)=\lim _{n \rightarrow \infty}\left(\frac{r}{n}\right)
\end{aligned}
$

where the experiment is repeated $n$ times and $E$ occurs $r$ times.
Conditional Probability -
Let $A$ and $B$ be any two events such that $B \neq \phi$, i.e. $\mathrm{n}(\mathrm{B}) \neq 0$ and $\mathrm{P}(\mathrm{B}) \neq 0$. Then $P\left(\frac{A}{B}\right)$ denotes the conditional probability of occurrence of event $A$ when $B$ has already occurred.

The experiment ends on the fifth throw exactly when the fourth and fifth throws are both $4$, the third throw is not $4$, and the first two throws are not both $4$. Splitting on whether the first throw is a $4$:

$
\begin{aligned}
P(\text{end at fifth throw}) & =P(4,\ \text{not } 4,\ \text{not } 4,\ 4,\ 4)+P(\text{not } 4,\ \text{any},\ \text{not } 4,\ 4,\ 4) \\
& =\frac{1}{6} \times \frac{5}{6} \times \frac{5}{6} \times \frac{1}{6} \times \frac{1}{6}+\frac{5}{6} \times 1 \times \frac{5}{6} \times \frac{1}{6} \times \frac{1}{6} \\
& =\frac{25}{6^5}+\frac{150}{6^5} \\
& =\frac{175}{6^5}
\end{aligned}
$

Hence, the answer is option 4.
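This count can also be verified by exhaustive enumeration of all $6^5$ roll sequences (a quick illustrative check, not part of the original solution):

```python
from itertools import product

# The experiment ends on throw k when throws k-1 and k are the first
# pair of consecutive 4s. Count length-5 sequences ending exactly at throw 5.
count = 0
for rolls in product(range(1, 7), repeat=5):
    first_end = next((i + 1 for i in range(1, 5)
                      if rolls[i - 1] == 4 and rolls[i] == 4), None)
    if first_end == 5:
        count += 1

print(count, 6 ** 5)   # 175 7776
```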

Example 3: Three numbers are chosen at random without replacement from $\{1,2,3, \ldots, 8\}$. The probability that their minimum is 3, given that their maximum is 6, is
1) $\frac{3}{8}$
2) $\frac{1}{5}$
3) $\frac{1}{4}$
4) $\frac{2}{5}$

Solution

Conditional Probability -

$
P\left(\frac{A}{B}\right)=\frac{P(A \cap B)}{P(B)}
$
and
$
P\left(\frac{B}{A}\right)=\frac{P(A \cap B)}{P(A)}
$
where $P\left(\frac{A}{B}\right)$ is the probability of $A$ when $B$ has already happened.
$A$ is the event that the maximum is 6 .
$B$ is the event that the minimum is 3 .

$
P(B \mid A)=\frac{P(A \cap B)}{P(A)}=\frac{\frac{1 \cdot 1 \cdot 2}{{ }^8 C_3}}{\frac{{ }^5 C_2}{{ }^8 C_3}}=\frac{2}{10}=\frac{1}{5}
$

Hence, the answer is option 2.
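The conditional probability can be confirmed by listing all three-element subsets of $\{1,\ldots,8\}$ (an illustrative check):

```python
from fractions import Fraction
from itertools import combinations

triples = list(combinations(range(1, 9), 3))   # all choices of 3 from {1,...,8}

max_is_6 = [t for t in triples if max(t) == 6]           # event A: maximum is 6
both = [t for t in max_is_6 if min(t) == 3]              # A ∩ B: minimum also 3

print(Fraction(len(both), len(max_is_6)))                # 1/5
```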

Example 4: It is given that the events $A$ and $B$ are such that $P(A)=\frac{1}{4}, P(A \mid B)=\frac{1}{2}$ and $P(B \mid A)=\frac{2}{3}$. Then $P(B)$ is:
1) $\frac{1}{2}$
2) $\frac{1}{6}$
3) $\frac{1}{3}$
4) $\frac{2}{3}$

Solution

Conditional Probability -

$
P\left(\frac{A}{B}\right)=\frac{P(A \cap B)}{P(B)}
$

and

$
P\left(\frac{B}{A}\right)=\frac{P(A \cap B)}{P(A)}
$

where $P\left(\frac{A}{B}\right)$ is the probability of $A$ when $B$ has already happened.

$
\begin{aligned}
& P(B \mid A)\, P(A)=P(A \mid B)\, P(B)=P(A \cap B) \\
& \Rightarrow \frac{2}{3} \times \frac{1}{4}=\frac{1}{2} \times P(B) \\
& \Rightarrow P(B)=\frac{1}{3}
\end{aligned}
$

Hence, the answer is option 3.
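The arithmetic can be replayed with exact fractions (a quick illustrative check):

```python
from fractions import Fraction

P_A = Fraction(1, 4)
P_A_given_B = Fraction(1, 2)
P_B_given_A = Fraction(2, 3)

# P(A ∩ B) = P(B|A)·P(A) = P(A|B)·P(B)  ⇒  P(B) = P(B|A)·P(A) / P(A|B)
P_B = P_B_given_A * P_A / P_A_given_B
print(P_B)   # 1/3
```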

Example 5: If C and D are two events such that $C \subset D$ and $P(D) \neq 0$, then the correct statement among the following is:
1) $P(C / D)<P(C)$
2) $P(C / D)=\frac{P(D)}{P(C)}$
3) $P(C / D)=P(C)$
4) $P(C / D) \geqslant P(C)$

Solution
Conditional Probability -
$
P\left(\frac{A}{B}\right)=\frac{P(A \cap B)}{P(B)}
$
and
$
P\left(\frac{B}{A}\right)=\frac{P(A \cap B)}{P(A)}
$

where $P\left(\frac{A}{B}\right)$ is the probability of $A$ when $B$ has already happened.

$
P\left(\frac{C}{D}\right)=\frac{P(C \cap D)}{P(D)}
$

Since $C \subset D$, we have $C \cap D=C$, so

$
P\left(\frac{C}{D}\right)=\frac{P(C)}{P(D)} \geqslant P(C) \quad \left(\text{as } 0<P(D) \leqslant 1\right)
$

Hence, the answer is option 4.
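A concrete instance with die events (chosen only as an illustration) shows the inequality in action, with $C=\{2\} \subset D=\{2,4,6\}$:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
C = {2}               # C is a subset of D
D = {2, 4, 6}

P_C = Fraction(len(C), len(S))               # 1/6
P_C_given_D = Fraction(len(C & D), len(D))   # 1/3
assert C <= D and P_C_given_D >= P_C
print(P_C, P_C_given_D)   # 1/6 1/3
```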

Summary
Conditional probability is a measure of the probability of an event given that another event has already occurred. It is a powerful tool in probability theory for updating the likelihood of events as new information arrives. By mastering this concept, complex problems can be solved effectively; it is crucial in fields such as statistics, finance, and machine learning.

Frequently Asked Questions (FAQs)

1. What is Probability?

Probability is defined as the ratio of the number of favorable outcomes to the total number of outcomes.

2. What is Conditional Probability?

Conditional probability is the likelihood of an event occurring given that another event has already occurred. It measures how the probability of one event changes when we have information about another related event.
4. How can conditional probability be calculated?

The formula to calculate $P(A \mid B)$ is
$P(A \mid B)=\frac{P(A \cap B)}{P(B)}$, where $P(B)$ is greater than zero.

5. How does conditional probability differ from regular probability?
Regular probability considers the likelihood of an event occurring without any additional information. Conditional probability, on the other hand, takes into account known information about a related event, potentially changing the likelihood of the event in question.
6. What is the formula for conditional probability?
The formula for conditional probability is P(A|B) = P(A ∩ B) / P(B), where P(A|B) is the probability of event A given that event B has occurred, P(A ∩ B) is the probability of both events A and B occurring, and P(B) is the probability of event B occurring.
7. Can you explain the meaning of the vertical bar "|" in P(A|B)?
The vertical bar "|" in P(A|B) is read as "given that" or "conditional on." It indicates that we are calculating the probability of event A, given that event B has already occurred or is known to be true.
8. Why is P(B) in the denominator of the conditional probability formula?
P(B) is in the denominator because we are restricting our sample space to only those outcomes where event B occurs. Dividing by P(B) normalizes the probability to account for this restriction.
9. How can tree diagrams be used to solve conditional probability problems?
Tree diagrams visually represent the sequence of events and their probabilities. Each branch represents a condition, and probabilities along a path are multiplied. This makes it easier to calculate conditional probabilities and see the relationships between events.
10. What is the "Monty Hall problem" and how does it relate to conditional probability?
The Monty Hall problem is a famous probability puzzle where conditional probability plays a key role. It demonstrates how additional information (a door being opened) changes the probabilities, often counter to intuition, highlighting the importance of considering all available information in probability calculations.
11. Can you explain the concept of conditional expectation using conditional probability?
Conditional expectation E[X|Y] is the expected value of X given information about Y. It uses conditional probability to calculate a weighted average of X for each possible value of Y, providing a way to predict one variable based on another.
12. What is the relationship between conditional probability and probabilistic graphical models?
Probabilistic graphical models, such as Bayesian networks, use conditional probabilities to represent relationships between variables. These models use graphs to visualize the conditional dependencies between variables, making it easier to reason about complex probability distributions.
13. How does conditional probability help in understanding and applying Bayes factors in statistical inference?
Bayes factors, used to compare the predictive accuracy of different models, are ratios of conditional probabilities. They compare the probability of the observed data under one model to the probability under another model, helping to quantify the evidence for different hypotheses.
14. How does conditional probability relate to independent events?
For independent events, conditional probability has no effect. If A and B are independent, then P(A|B) = P(A). In other words, knowing that B occurred doesn't change the probability of A occurring.
15. What is the difference between P(A|B) and P(B|A)?
P(A|B) is the probability of A occurring given that B has occurred, while P(B|A) is the probability of B occurring given that A has occurred. These are generally not equal unless A and B are independent or have a special relationship.
16. How can you use a Venn diagram to visualize conditional probability?
In a Venn diagram, conditional probability P(A|B) can be visualized as the area of intersection of A and B divided by the total area of B. This helps illustrate why we divide P(A ∩ B) by P(B) in the formula.
17. What is the multiplication rule of probability and how does it relate to conditional probability?
The multiplication rule states that P(A ∩ B) = P(A) * P(B|A). This rule is derived from the conditional probability formula and shows how to calculate the probability of two events occurring together using conditional probability.
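For the die events used earlier in this article, the multiplication rule can be confirmed directly (an illustrative sketch):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # one roll of a fair die
A = {2, 3}
B = {2, 4, 6}

P_A = Fraction(len(A), len(S))
P_B_given_A = Fraction(len(A & B), len(A))

# Multiplication rule: P(A ∩ B) = P(A) · P(B|A)
assert Fraction(len(A & B), len(S)) == P_A * P_B_given_A
print(P_A * P_B_given_A)   # 1/6
```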
18. How does conditional probability help in understanding Bayes' theorem?
Conditional probability is fundamental to Bayes' theorem, which allows us to update probabilities based on new evidence. Bayes' theorem uses conditional probabilities to relate P(A|B) to P(B|A), enabling us to reverse the condition and update our beliefs.
19. What is the law of total probability and how does it use conditional probability?
The law of total probability states that for mutually exclusive and exhaustive events B1, B2, ..., Bn, the probability of an event A is the sum of P(A|Bi) * P(Bi) for all i. This law uses conditional probabilities to calculate overall probabilities across different scenarios.
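The law can be checked on a small example, partitioning a die roll into even and odd faces (an illustration; the events are chosen only for demonstration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 3}
partition = [{2, 4, 6}, {1, 3, 5}]   # mutually exclusive and exhaustive

# Law of total probability: P(A) = sum over i of P(A|Bi) · P(Bi)
total = sum(Fraction(len(A & Bi), len(Bi)) * Fraction(len(Bi), len(S))
            for Bi in partition)
assert total == Fraction(len(A), len(S))
print(total)   # 1/3
```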
20. How can conditional probability be used in real-world decision making?
Conditional probability is crucial in decision making as it allows us to update our beliefs based on new information. For example, in medical diagnoses, we can calculate the probability of a disease given certain symptoms, helping doctors make more informed decisions.
21. What is a common misconception about conditional probability?
A common misconception is confusing P(A|B) with P(B|A). For instance, people might confuse the probability of having a disease given a positive test result with the probability of a positive test result given that someone has the disease.
22. How does conditional probability relate to the concept of correlation?
While correlation measures the strength and direction of a relationship between variables, conditional probability can indicate whether events are related. If P(A|B) ≠ P(A), it suggests a correlation between events A and B.
23. Can conditional probability ever be greater than 1?
No, conditional probability, like all probabilities, must be between 0 and 1 inclusive. If a calculation results in a conditional probability greater than 1, it indicates an error in the calculation or the given information.
24. How does sample space change when considering conditional probability?
When calculating conditional probability, the sample space effectively shrinks to only those outcomes where the given event (the condition) occurs. This is why we divide by the probability of the given event in the formula.
25. What is the difference between joint probability and conditional probability?
Joint probability P(A ∩ B) is the probability of both events A and B occurring together. Conditional probability P(A|B) is the probability of A occurring given that B has already occurred. They are related by the formula P(A ∩ B) = P(A|B) * P(B).
26. How does conditional probability apply in genetics, such as in calculating the probability of inheriting certain traits?
In genetics, conditional probability is used to calculate the likelihood of offspring having certain traits given information about their parents' genes. For example, we can calculate the probability of a child having blue eyes given that both parents carry the recessive blue eye gene.
27. What is a conditional probability distribution?
A conditional probability distribution gives the probabilities of different outcomes for one variable, given a specific value of another variable. It's a way of describing how the probability distribution of one variable changes based on the value of another variable.
28. How does conditional probability relate to the concept of statistical independence?
Two events A and B are statistically independent if P(A|B) = P(A). In other words, if knowing that B occurred doesn't change the probability of A occurring, then A and B are independent. This is a key application of conditional probability in understanding relationships between events.
29. How is conditional probability used in weather forecasting?
In weather forecasting, conditional probability is used to predict the likelihood of certain weather conditions given current observations. For example, the probability of rain tomorrow given today's atmospheric conditions is a conditional probability.
30. What is the difference between conditional probability and marginal probability?
Conditional probability P(A|B) is the probability of A given that B has occurred. Marginal probability P(A) is the overall probability of A occurring, regardless of other events. Marginal probability can be calculated from conditional probabilities using the law of total probability.
31. How does conditional probability apply to risk assessment in finance?
In finance, conditional probability is used to assess the risk of certain financial events given specific market conditions. For example, the probability of a stock price falling given a change in interest rates is a conditional probability used in risk management.
32. What is a conditional probability table and how is it used?
A conditional probability table lists the probabilities of an event occurring for each possible condition. It's often used in Bayesian networks to represent the relationships between variables and calculate probabilities of different outcomes given various conditions.
33. How does conditional probability relate to the concept of false positives in medical testing?
Conditional probability is crucial in understanding false positives. The probability of having a disease given a positive test result P(D|+) is different from the probability of a positive test result given the disease P(+|D). This distinction helps in interpreting test results accurately.
34. Can you explain how conditional probability is used in machine learning algorithms?
In machine learning, conditional probability is fundamental to many algorithms, especially in classification tasks. For example, Naive Bayes classifiers use conditional probabilities to predict the most likely class for a given input based on its features.
35. How does conditional probability help in understanding the concept of information gain in decision trees?
In decision trees, information gain measures how much a feature reduces uncertainty about the target variable. This is calculated using conditional probabilities – how the probability distribution of the target changes given different values of the feature.
36. What is the role of conditional probability in Bayesian inference?
Conditional probability is central to Bayesian inference. It allows us to update our beliefs (prior probabilities) based on new evidence, resulting in posterior probabilities. This process of updating probabilities is the core of Bayesian reasoning and statistical inference.
37. How can conditional probability be used to solve problems involving dependent events?
For dependent events, the probability of one event affects the probability of another. Conditional probability allows us to calculate these interrelated probabilities accurately by considering how one event changes the likelihood of another.
38. What is the difference between conditional probability and joint probability density functions in continuous probability distributions?
In continuous distributions, conditional probability density functions give the probability density of one variable given a specific value of another. Joint probability density functions give the probability density for both variables occurring together. They are related through integration and the chain rule of probability.
39. How does conditional probability apply to Markov chains?
In Markov chains, the probability of transitioning to a future state depends only on the current state, not on the past states. This is a direct application of conditional probability, where the probability of each state is conditional on the previous state.
40. Can you explain how conditional probability is used in natural language processing, particularly in language models?
In language models, conditional probability is used to predict the likelihood of a word or phrase given the preceding words. For example, in a bigram model, the probability of each word is conditional on the previous word, allowing the model to generate coherent text.
41. How does conditional probability help in understanding and calculating odds ratios?
Odds ratios compare the likelihood of an outcome in two groups. They can be expressed using conditional probabilities: the odds ratio is the ratio of the conditional probability of the outcome in one group to the conditional probability in the other group.
42. What is the role of conditional probability in hypothesis testing?
In hypothesis testing, conditional probability is used to calculate p-values, which are the probability of obtaining test results at least as extreme as the observed results, assuming the null hypothesis is true. This is a conditional probability: the probability of the data given the null hypothesis.
43. How can conditional probability be applied to problems involving competing risks in survival analysis?
In survival analysis with competing risks, conditional probability helps calculate the probability of one type of event occurring given that other competing events haven't occurred. This allows for more accurate risk assessments when multiple possible outcomes exist.
44. What is the relationship between conditional probability and likelihood functions in statistics?
Likelihood functions, central to many statistical methods, are often based on conditional probabilities. They represent the probability of observing the data given specific parameter values, which is a form of conditional probability.
45. How does conditional probability relate to the concept of mutual information in information theory?
Mutual information measures how much knowing one variable reduces uncertainty about another. It's calculated using conditional probabilities, comparing the joint probability distribution to what would be expected if the variables were independent.
46. Can you explain how conditional probability is used in credit scoring models?
In credit scoring, conditional probability helps assess the likelihood of loan default given various factors like income, credit history, and employment status. These conditional probabilities are combined to create an overall credit score.
47. What is the role of conditional probability in understanding Simpson's paradox?
Simpson's paradox occurs when a trend appears in several groups of data but disappears or reverses when these groups are combined. Conditional probability helps explain this by showing how the relationship between variables can change when conditioned on different factors.
48. How is conditional probability used in A/B testing?
In A/B testing, conditional probability helps calculate the likelihood of observed results given different hypotheses. For example, the probability of seeing a certain conversion rate difference between two versions, given that there is or isn't a real effect.
49. What is the connection between conditional probability and the concept of confounding variables in statistics?
Confounding variables can create misleading associations between other variables. Conditional probability helps identify and account for confounders by examining how probabilities change when conditioned on different variables, revealing true relationships between variables of interest.
50. How does conditional probability apply to the analysis of contingency tables?
In contingency table analysis, conditional probabilities are used to examine the relationship between categorical variables. By calculating probabilities of one variable's outcomes conditioned on another variable's categories, we can identify associations and dependencies between variables.
51. Can you explain how conditional probability is used in calculating sensitivity and specificity of diagnostic tests?
Sensitivity is the conditional probability of a positive test result given that the person has the disease: P(+|D). Specificity is the conditional probability of a negative test result given that the person doesn't have the disease: P(-|not D). These measures help evaluate the accuracy of diagnostic tests.
52. What is the role of conditional probability in understanding and applying the concept of conditional independence?
Conditional independence occurs when two events are independent given a third event. This is expressed using conditional probabilities: A and B are conditionally independent given C if P(A|B,C) = P(A|C). This concept is crucial in simplifying complex probability models.
53. Can you explain how conditional probability is used in the expectation-maximization (EM) algorithm?
The EM algorithm, used for finding maximum likelihood estimates of parameters in statistical models with latent variables, relies heavily on conditional probabilities. In the expectation step, it calculates the expected value of the log-likelihood function based on conditional probabilities of the latent variables given the observed data and current parameter estimates.
