Chapter 13 Probability

The Theory of probabilities is simply the science of logic quantitatively treated - C.S. PEIRCE

13.1 Introduction


In earlier classes, we studied probability as a measure of uncertainty of events in a random experiment. We discussed the axiomatic approach formulated by the Russian mathematician A.N. Kolmogorov (1903-1987) and treated probability as a function of outcomes of the experiment. We also established the equivalence between the axiomatic theory and the classical theory of probability in the case of equally likely outcomes. On the basis of this relationship, we obtained probabilities of events associated with discrete sample spaces. We also studied the addition rule of probability. In this chapter, we shall discuss the important concept of the conditional probability of an event given that another event has occurred, which will be helpful in understanding Bayes' theorem, the multiplication rule of probability and the independence of events. We shall also learn the important concept of a random variable and its probability distribution, together with the mean and variance of a probability distribution. In the last section of the chapter, we shall study an important discrete probability distribution called the binomial distribution. Throughout this chapter, we shall take up experiments having equally likely outcomes, unless stated otherwise.

13.2 Conditional Probability

Up to this point, we have discussed methods of finding the probability of events. If we have two events from the same sample space, does information about the occurrence of one of the events affect the probability of the other? Let us try to answer this question by taking up a random experiment in which the outcomes are equally likely to occur.

Consider the experiment of tossing three fair coins. The sample space of the experiment is

$$ \mathrm{S}=\{\mathrm{HHH}, \mathrm{HHT}, \mathrm{HTH}, \mathrm{THH}, \mathrm{HTT}, \mathrm{THT}, \mathrm{TTH}, \mathrm{TTT}\} $$

Since the coins are fair, we can assign the probability $\frac{1}{8}$ to each sample point. Let $E$ be the event ‘at least two heads appear’ and $F$ be the event ‘first coin shows tail’. Then

$\mathrm{E}=\{\mathrm{HHH}, \mathrm{HHT}, \mathrm{HTH}, \mathrm{THH}\}$

and $\mathrm{F}=\{ \mathrm{THH, THT, TTH, TTT} \}$

therefore $$ \mathrm{P}(\mathrm{E})=\mathrm{P}(\{\mathrm{HHH}\})+\mathrm{P}(\{\mathrm{HHT}\})+\mathrm{P}(\{\mathrm{HTH}\})+\mathrm{P}(\{\mathrm{THH}\}) $$

$$ =\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}=\frac{1}{2} \text{ (Why?) } $$

Similarly $$ \mathrm{P}(\mathrm{F})=\mathrm{P}(\{\mathrm{THH}\})+\mathrm{P}(\{\mathrm{THT}\})+\mathrm{P}(\{\mathrm{TTH}\})+\mathrm{P}(\{\mathrm{TTT}\}) $$

$$ =\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}=\frac{1}{2} $$

Also $\mathrm{E} \cap \mathrm{F}=\{\mathrm{THH}\}$

therefore $\quad \mathrm{P}(\mathrm{E} \cap \mathrm{F})=\mathrm{P}(\{\mathrm{THH}\})=\frac{1}{8}$

Now, suppose we are given that the first coin shows tail, i.e. F occurs; then what is the probability of occurrence of $E$? With the information of the occurrence of $F$, we are sure that the cases in which the first coin does not result in a tail should not be considered while finding the probability of $E$. This information reduces our sample space from the set $S$ to its subset $F$ for the event $E$. In other words, the additional information amounts to saying that the situation may be considered as a new random experiment whose sample space consists of only those outcomes which are favourable to the occurrence of the event $F$.

Now, the sample point of $F$ which is favourable to event $E$ is THH.

Thus, Probability of $E$ considering $F$ as the sample space $=\frac{1}{4}$,

or $\quad$ Probability of $E$ given that the event $F$ has occurred $=\frac{1}{4}$

This probability of the event $E$ is called the conditional probability of $E$ given that $F$ has already occurred, and is denoted by $P(E \mid F)$.

Thus $\quad P(E \mid F)=\frac{1}{4}$

Note that the elements of $F$ which favour the event $E$ are the common elements of $E$ and $F$, i.e. the sample points of $E \cap F$.

Thus, we can also write the conditional probability of $E$ given that $F$ has occurred as

$$ \begin{aligned} P(E \mid F) & =\frac{\text{ Number of elementary events favourable to } E \cap F}{\text{ Number of elementary events which are favourable to } F} \\ & =\frac{n(E \cap F)}{n(F)} \end{aligned} $$

Dividing the numerator and the denominator by the total number of elementary events of the sample space, we see that $P(E \mid F)$ can also be written as

$$ P(E \mid F)=\frac{\frac{n(E \cap F)}{n(S)}}{\frac{n(F)}{n(S)}}=\frac{P(E \cap F)}{P(F)} \tag{1} $$

Note that (1) is valid only when $P(F) \neq 0$ i.e., $F \neq \phi$ (Why?) Thus, we can define the conditional probability as follows :

Definition 1 If $E$ and $F$ are two events associated with the same sample space of a random experiment, the conditional probability of the event $E$ given that $F$ has occurred, i.e. $P(E \mid F)$ is given by

$$ P(E \mid F)=\frac{P(E \cap F)}{P(F)} \text{ provided } P(F) \neq 0 $$
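The counting form of this definition is easy to verify by brute force. Below is a minimal Python sketch (illustrative only, not part of the original text) that recomputes $P(E \mid F)$ for the three-coin experiment above:

```python
from itertools import product

# Sample space of tossing three fair coins: 8 equally likely outcomes
S = list(product("HT", repeat=3))

E = [s for s in S if s.count("H") >= 2]   # 'at least two heads appear'
F = [s for s in S if s[0] == "T"]         # 'first coin shows tail'
EF = [s for s in F if s in E]             # outcomes of E intersect F

# P(E|F) = n(E intersect F) / n(F) in an equally likely sample space
print(len(EF) / len(F))  # 0.25, i.e. 1/4
```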

13.2.1 Properties of conditional probability

Let $E$ and $F$ be events of a sample space $S$ of an experiment, then we have

Property 1 $P(S \mid F)=P(F \mid F)=1$

We know that $$ \mathrm{P}(\mathrm{S} \mid \mathrm{F})=\frac{\mathrm{P}(\mathrm{S} \cap \mathrm{F})}{\mathrm{P}(\mathrm{F})}=\frac{\mathrm{P}(\mathrm{F})}{\mathrm{P}(\mathrm{F})}=1 $$

Also $$ P(F \mid F)=\frac{P(F \cap F)}{P(F)}=\frac{P(F)}{P(F)}=1 $$

Thus $$ \mathrm{P}(\mathrm{S} \mid \mathrm{F})=\mathrm{P}(\mathrm{F} \mid \mathrm{F})=1 $$

Property 2 If $A$ and $B$ are any two events of a sample space $S$ and $F$ is an event of $S$ such that $P(F) \neq 0$, then

$$ P((A \cup B) \mid F)=P(A \mid F)+P(B \mid F)-P((A \cap B) \mid F) $$

In particular, if $A$ and $B$ are disjoint events, then

$$ P((A \cup B) \mid F)=P(A \mid F)+P(B \mid F) $$

We have $$ \begin{aligned} P((A \cup B) \mid F) & =\frac{P[(A \cup B) \cap F]}{P(F)} \\ & =\frac{P[(A \cap F) \cup(B \cap F)]}{P(F)} \end{aligned} $$

(by distributive law of union of sets over intersection)

$$ \begin{aligned} & =\frac{P(A \cap F)+P(B \cap F)-P(A \cap B \cap F)}{P(F)} \\ & =\frac{P(A \cap F)}{P(F)}+\frac{P(B \cap F)}{P(F)}-\frac{P[(A \cap B) \cap F]}{P(F)} \\ & =P(A \mid F)+P(B \mid F)-P((A \cap B) \mid F) \end{aligned} $$

When $A$ and $B$ are disjoint events, then

$$ \begin{matrix}
& P((A \cap B) \mid F)=0 \\ \Rightarrow \quad & P((A \cup B) \mid F)=P(A \mid F)+P(B \mid F) \end{matrix} $$


Property $3 P(E^{\prime} \mid F)=1-P(E \mid F)$

From Property 1, we know that $P(S \mid F)=1$

$$ \begin{matrix} \Rightarrow & P(E \cup E^{\prime} \mid F)=1 & \text{ since } S=E \cup E^{\prime} \\ \Rightarrow & P(E \mid F)+P(E^{\prime} \mid F)=1 & \text{ since } E \text{ and } E^{\prime} \text{ are disjoint events } \\ \text{ Thus, } & P(E^{\prime} \mid F)=1-P(E \mid F) & \end{matrix} $$

Let us now take up some examples.

Example 1 If $P(A)=\frac{7}{13}, P(B)=\frac{9}{13}$ and $P(A \cap B)=\frac{4}{13}$, evaluate $P(A \mid B)$.

Solution We have $P(A \mid B)=\frac{P(A \cap B)}{P(B)}=\frac{\frac{4}{13}}{\frac{9}{13}}=\frac{4}{9}$

Example 2 A family has two children. What is the probability that both the children are boys given that at least one of them is a boy?

Solution Let $b$ stand for boy and $g$ for girl. The sample space of the experiment is

$$ S=\{(b, b),(g, b),(b, g),(g, g)\} $$

Let $E$ and $F$ denote the following events :

E : ‘both the children are boys’

$F$ : ‘at least one of the children is a boy’

Then $E=\{(b, b)\}$ and $F=\{(b, b),(g, b),(b, g)\}$

Now $$E \cap F=\{(b, b)\}$$

Thus $$ P(F)=\frac{3}{4} \text{ and } P(E \cap F)=\frac{1}{4} $$

Therefore $$ P(E \mid F)=\frac{P(E \cap F)}{P(F)}=\frac{\frac{1}{4}}{\frac{3}{4}}=\frac{1}{3} $$

Example 3 Ten cards numbered 1 to 10 are placed in a box, mixed up thoroughly and then one card is drawn randomly. If it is known that the number on the drawn card is more than 3, what is the probability that it is an even number?

Solution Let A be the event ‘the number on the card drawn is even’ and B be the event ‘the number on the card drawn is greater than 3’. We have to find $P(A \mid B)$.

Now, the sample space of the experiment is $S=\{1,2,3,4,5,6,7,8,9,10\}$

Then $$ A=\{2,4,6,8,10\}, B=\{4,5,6,7,8,9,10\} $$

and $$ A \cap B=\{4,6,8,10\} $$

Also $$ P(A)=\frac{5}{10}, P(B)=\frac{7}{10} \text{ and } P(A \cap B)=\frac{4}{10} $$

Then $$ P(A \mid B)=\frac{P(A \cap B)}{P(B)}=\frac{\frac{4}{10}}{\frac{7}{10}}=\frac{4}{7} $$

Example 4 In a school, there are 1000 students, out of which 430 are girls. It is known that out of $430,10 \%$ of the girls study in class XII. What is the probability that a student chosen randomly studies in Class XII given that the chosen student is a girl?

Solution Let E denote the event that a student chosen randomly studies in Class XII and $F$ be the event that the randomly chosen student is a girl. We have to find $P(E \mid F)$.

Now $\quad P(F)=\frac{430}{1000}=0.43$ and $P(E \cap F)=\frac{43}{1000}=0.043$(Why?)

Then $$\quad P(E \mid F)=\frac{P(E \cap F)}{P(F)}=\frac{0.043}{0.43}=0.1$$

Example 5 A die is thrown three times. Events A and B are defined as below:

A : 4 on the third throw

B : 6 on the first and 5 on the second throw

Find the probability of A given that B has already occurred.

Solution The sample space has 216 outcomes.

Now

$$ \mathrm{A}=\left\{\begin{array}{cccc} (1,1,4) & (1,2,4) & \ldots & (1,6,4) \\ (2,1,4) & (2,2,4) & \ldots & (2,6,4) \\ \vdots & \vdots & & \vdots \\ (6,1,4) & (6,2,4) & \ldots & (6,6,4) \end{array}\right\} $$

$$ \begin{aligned} & B=\{(6,5,1),(6,5,2),(6,5,3),(6,5,4),(6,5,5),(6,5,6)\} \end{aligned} $$

and $$ A \cap B=\{(6,5,4)\} . $$

Now $$ P(B)=\frac{6}{216} \text{ and } P(A \cap B)=\frac{1}{216} $$

Then $$ P(A \mid B)=\frac{P(A \cap B)}{P(B)}=\frac{\frac{1}{216}}{\frac{6}{216}}=\frac{1}{6} $$

Example 6 A die is thrown twice and the sum of the numbers appearing is observed to be 6. What is the conditional probability that the number 4 has appeared at least once?

Solution Let $E$ be the event that ‘the number 4 appears at least once’ and $F$ be the event that ‘the sum of the numbers appearing is 6’.

Then, $$ \begin{aligned} & E=\{(4,1),(4,2),(4,3),(4,4),(4,5),(4,6),(1,4),(2,4),(3,4),(5,4),(6,4)\} \\ & F=\{(1,5),(2,4),(3,3),(4,2),(5,1)\} \end{aligned} $$

We have $$ P(E)=\frac{11}{36} \text{ and } P(F)=\frac{5}{36} $$

Also $$ E \cap F=\{(2,4),(4,2)\} $$

Therefore $$ P(E \cap F)=\frac{2}{36} $$

Hence, the required probability $$ P(E \mid F)=\frac{P(E \cap F)}{P(F)}=\frac{\frac{2}{36}}{\frac{5}{36}}=\frac{2}{5} $$

For the conditional probability discussed above, we have considered the elementary events of the experiment to be equally likely and the corresponding definition of the probability of an event was used. However, the same definition can also be used in the general case where the elementary events of the sample space are not equally likely, the probabilities $P(E \cap F)$ and $P(F)$ being calculated accordingly. Let us take up the following example.

Example 7 Consider the experiment of tossing a coin. If the coin shows head, toss it again but if it shows tail, then throw a die. Find the conditional probability of the event that ’the die shows a number greater than 4 ’ given that ’there is at least one tail’.

Solution The outcomes of the experiment can be represented in the following diagrammatic manner, called a ‘tree diagram’ (Fig 13.2).

The sample space of the experiment may be described as

$ S=\{(H, H),(H, T),(T, 1),(T, 2),(T, 3),(T, 4),(T, 5),(T, 6)\} $

where $(H, H)$ denotes that both tosses result in heads, and $(T, i)$ denotes that the first toss results in a tail and the number $i$ appears on the die, for $i=1,2,3,4,5,6$. Thus, the probabilities assigned to the 8 elementary events

$(H, H),(H, T),(T, 1),(T, 2),(T, 3),(T, 4),(T, 5),(T, 6)$ are $\frac{1}{4}, \frac{1}{4}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}$ respectively,

as is clear from Fig 13.2.

Let $F$ be the event that ’there is at least one tail’ and $E$ be the event ’the die shows a number greater than 4 ‘.

Then $$ \begin{aligned} & F=\{(H, T),(T, 1),(T, 2),(T, 3),(T, 4),(T, 5),(T, 6)\} \\ & E=\{(T, 5),(T, 6)\} \text{ and } E \cap F=\{(T, 5),(T, 6)\} \end{aligned} $$

Now $$ \begin{aligned} P(F)= & P(\{(H, T)\})+P(\{(T, 1)\})+P(\{(T, 2)\})+P(\{(T, 3)\}) \\ & +P(\{(T, 4)\})+P(\{(T, 5)\})+P(\{(T, 6)\}) \\ = & \frac{1}{4}+\frac{1}{12}+\frac{1}{12}+\frac{1}{12}+\frac{1}{12}+\frac{1}{12}+\frac{1}{12}=\frac{3}{4} \end{aligned} $$

and $\quad P(E \cap F)=P(\{(T, 5)\})+P(\{(T, 6)\})=\frac{1}{12}+\frac{1}{12}=\frac{1}{6}$

Hence $\quad P(E \mid F)=\frac{P(E \cap F)}{P(F)}=\frac{\frac{1}{6}}{\frac{3}{4}}=\frac{2}{9}$
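The same formula works when the elementary events are not equally likely: sum the assigned probabilities instead of counting outcomes. A minimal Python sketch of this computation for Example 7 (illustrative only, using the probabilities assigned above):

```python
from fractions import Fraction as Fr

# Elementary events of Example 7 with their assigned probabilities
prob = {("H", "H"): Fr(1, 4), ("H", "T"): Fr(1, 4)}
prob.update({("T", i): Fr(1, 12) for i in range(1, 7)})

F = [w for w in prob if "T" in w]                        # at least one tail
E = [w for w in prob if w[0] == "T" and w[1] in (5, 6)]  # die shows > 4

p_F = sum(prob[w] for w in F)           # P(F) = 3/4
p_EF = sum(prob[w] for w in E if w in F)  # P(E intersect F) = 1/6
print(p_EF / p_F)  # 2/9
```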

13.3 Multiplication Theorem on Probability

Let $E$ and $F$ be two events associated with a sample space $S$. Clearly, the set $E \cap F$ denotes the event that both $E$ and $F$ have occurred. In other words, $E \cap F$ denotes the simultaneous occurrence of the events $E$ and $F$. The event $E \cap F$ is also written as $EF$.

Very often we need to find the probability of the event EF. For example, in the experiment of drawing two cards one after the other, we may be interested in finding the probability of the event ‘a king and a queen’. The probability of the event EF is obtained by using conditional probability as shown below:

We know that the conditional probability of event $E$ given that $F$ has occurred is denoted by $P(E \mid F)$ and is given by

$$ P(E \mid F)=\frac{P(E \cap F)}{P(F)}, P(F) \neq 0 $$

From this result, we can write

$$ P(E \cap F)=P(F) . P(E \mid F) \tag{1} $$

Also, we know that

$$ \begin{aligned} & P(F \mid E)=\frac{P(F \cap E)}{P(E)}, P(E) \neq 0 \\ & P(F \mid E)=\frac{P(E \cap F)}{P(E)}(\text{ since } E \cap F=F \cap E) \end{aligned} $$

Thus, $$ P(E \cap F)=P(E) . P(F \mid E) \tag{2} $$

Combining (1) and (2), we find that

$$ \begin{aligned} P(E \cap F) & =P(E) P(F \mid E) \\ & =P(F) P(E \mid F) \text{ provided } P(E) \neq 0 \text{ and } P(F) \neq 0 . \end{aligned} $$

The above result is known as the multiplication rule of probability.

Let us now take up an example.

Example 8 An urn contains 10 black and 5 white balls. Two balls are drawn from the urn one after the other without replacement. What is the probability that both drawn balls are black?

Solution Let $E$ and $F$ denote respectively the events that first and second ball drawn are black. We have to find $P(E \cap F)$ or $P(EF)$.

Now $$ P(E)=P(\text{ black ball in first draw })=\frac{10}{15} $$

Also given that the first ball drawn is black, i.e., event $E$ has occurred, now there are 9 black balls and five white balls left in the urn. Therefore, the probability that the second ball drawn is black, given that the ball in the first draw is black, is nothing but the conditional probability of $F$ given that $E$ has occurred.

i.e. $$ P(F \mid E)=\frac{9}{14} $$

By multiplication rule of probability, we have

$$ \begin{aligned} \mathrm{P}(\mathrm{E} \cap \mathrm{F}) & =\mathrm{P}(\mathrm{E}) \mathrm{P}(\mathrm{F} \mid \mathrm{E}) \\ & =\frac{10}{15} \times \frac{9}{14}=\frac{3}{7} \end{aligned} $$
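The multiplication rule can also be checked by simulating the two draws without replacement. This Monte Carlo sketch (illustrative only; the function name is ours) should return a value close to $\frac{3}{7} \approx 0.4286$:

```python
import random

def both_black(trials=100_000):
    """Estimate P(first and second balls black) without replacement."""
    urn = ["B"] * 10 + ["W"] * 5
    hits = 0
    for _ in range(trials):
        first, second = random.sample(urn, 2)  # ordered draw, no replacement
        hits += first == "B" and second == "B"
    return hits / trials

print(both_black())
```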

Multiplication rule of probability for more than two events If $E, F$ and $G$ are three events of sample space, we have

$$ P(E \cap F \cap G)=P(E) P(F \mid E) P(G \mid(E \cap F))=P(E) P(F \mid E) P(G \mid E F) $$

Similarly, the multiplication rule of probability can be extended for four or more events.

The following example illustrates the extension of multiplication rule of probability for three events.

Example 9 Three cards are drawn successively, without replacement from a pack of 52 well shuffled cards. What is the probability that first two cards are kings and the third card drawn is an ace?

Solution Let $K$ denote the event that the card drawn is a king and $A$ be the event that the card drawn is an ace. Clearly, we have to find $P(KKA)$.

Now $$ P(K)=\frac{4}{52} $$

Also, $P(K \mid K)$ is the probability of second king with the condition that one king has already been drawn. Now there are three kings in $(52-1)=51$ cards.

Therefore $$ P(K \mid K)=\frac{3}{51} $$

Lastly, $P(A \mid KK)$ is the probability that the third card drawn is an ace, given that two kings have already been drawn. Now there are four aces in the remaining 50 cards.

Therefore $$ P(A \mid KK)=\frac{4}{50} $$

By multiplication law of probability, we have

$$ \begin{aligned} P(KKA) & =P(K) \quad P(K \mid K) \quad P(A \mid KK) \\ & =\frac{4}{52} \times \frac{3}{51} \times \frac{4}{50}=\frac{2}{5525} \end{aligned} $$
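Exact arithmetic with fractions reproduces this chain of conditional probabilities; a minimal sketch, assuming Python's fractions module:

```python
from fractions import Fraction as Fr

# P(KKA) = P(K) * P(K|K) * P(A|KK): the deck shrinks at each draw
p_kka = Fr(4, 52) * Fr(3, 51) * Fr(4, 50)
print(p_kka)  # 2/5525
```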

13.4 Independent Events

Consider the experiment of drawing a card from a deck of 52 playing cards, in which the elementary events are assumed to be equally likely. If $E$ and $F$ denote the events ’the card drawn is a spade’ and ’the card drawn is an ace’ respectively, then

$$ P(E)=\frac{13}{52}=\frac{1}{4} \text{ and } P(F)=\frac{4}{52}=\frac{1}{13} $$

Also $E \cap F$ is the event ‘the card drawn is the ace of spades’, so that

Hence $$ \begin{aligned} & P(E \cap F)=\frac{1}{52} \\ & P(E \mid F)=\frac{P(E \cap F)}{P(F)}=\frac{\frac{1}{52}}{\frac{1}{13}}=\frac{1}{4} \end{aligned} $$

Since $P(E)=\frac{1}{4}=P(E \mid F)$, we can say that the occurrence of event $F$ has not affected the probability of occurrence of the event $E$.

We also have $$ P(F \mid E)=\frac{P(E \cap F)}{P(E)}=\frac{\frac{1}{52}}{\frac{1}{4}}=\frac{1}{13}=P(F) $$

Again, $P(F)=\frac{1}{13}=P(F \mid E)$ shows that occurrence of event $E$ has not affected the probability of occurrence of the event $F$.

Thus, $E$ and $F$ are two events such that the probability of occurrence of one of them is not affected by occurrence of the other.

Such events are called independent events.

Definition 2 Two events $E$ and $F$ are said to be independent, if

$$ \begin{aligned} & P(F \mid E)=P(F) \text{ provided } P(E) \neq 0 \\ & P(E \mid F)=P(E) \text{ provided } P(F) \neq 0 \end{aligned} $$

Thus, in this definition we need to have $P(E) \neq 0$ and $P(F) \neq 0$.

Now, by the multiplication rule of probability, we have

$$ P(E \cap F)=P(E) \cdot P(F \mid E) \tag{1} $$

If $E$ and $F$ are independent, then (1) becomes

$$ P(E \cap F)=P(E) \cdot P(F) \tag{2} $$

Thus, using (2), the independence of two events is also defined as follows:

Definition 3 Let $E$ and $F$ be two events associated with the same random experiment, then $E$ and $F$ are said to be independent if

$$ P(E \cap F)=P(E) . P(F) $$

Remarks

(i) Two events $E$ and $F$ are said to be dependent if they are not independent, i.e. if $ P(E \cap F) \neq P(E) . P(F) $

(ii) Sometimes there is confusion between independent events and mutually exclusive events. The term ‘independent’ is defined in terms of probabilities of events, whereas ‘mutually exclusive’ is defined in terms of the events themselves (subsets of the sample space). Moreover, mutually exclusive events never have a common outcome, but independent events may have outcomes in common. Clearly, ‘independent’ and ‘mutually exclusive’ do not have the same meaning.

In other words, two independent events having nonzero probabilities of occurrence cannot be mutually exclusive; conversely, two mutually exclusive events having nonzero probabilities of occurrence cannot be independent.

(iii) Two experiments are said to be independent if for every pair of events $E$ and $F$, where $E$ is associated with the first experiment and $F$ with the second experiment, the probability of the simultaneous occurrence of the events $E$ and $F$ when the two experiments are performed is the product of $P(E)$ and $P(F)$ calculated separately on the basis of two experiments, i.e., $P(E \cap F)=P(E)$. $P(F)$

(iv) Three events A, B and C are said to be mutually independent, if

$$ \begin{aligned} P(A \cap B) & =P(A) P(B) \\ P(A \cap C) & =P(A) P(C) \\ P(B \cap C) & =P(B) P(C) \end{aligned} $$

$$ \text{ and } \quad P(A \cap B \cap C)=P(A) P(B) P(C) $$

If at least one of the above is not true for three given events, we say that the events are not independent.

Example 10 A die is thrown. If $E$ is the event ‘the number appearing is a multiple of 3’ and $F$ is the event ‘the number appearing is even’, find whether $E$ and $F$ are independent.

Solution We know that the sample space is $S=\{1,2,3,4,5,6\}$

Now $$ E=\{3,6\}, F=\{2,4,6\} \text{ and } E \cap F=\{6\} $$

Then $$ P(E)=\frac{2}{6}=\frac{1}{3}, P(F)=\frac{3}{6}=\frac{1}{2} \text{ and } P(E \cap F)=\frac{1}{6} $$

Clearly $$ P(E \cap F)=P(E) \cdot P(F) $$

Hence $\quad E$ and $F$ are independent events.
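Definition 3 suggests a direct test for independence over an equally likely sample space; a small helper sketch (hypothetical, not from the text), applied to the events of Example 10:

```python
from fractions import Fraction as Fr

def independent(E, F, S):
    """Check P(E intersect F) == P(E) * P(F) for equally likely outcomes in S."""
    p = lambda A: Fr(len(A & S), len(S))
    return p(E & F) == p(E) * p(F)

S = set(range(1, 7))          # one throw of a die
E = {3, 6}                    # multiple of 3
F = {2, 4, 6}                 # even number
print(independent(E, F, S))   # True: E and F are independent
```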

Example 11 An unbiased die is thrown twice. Let the event A be ‘odd number on the first throw’ and B the event ‘odd number on the second throw’. Check the independence of the events A and B.

Solution If all the 36 elementary events of the experiment are considered to be equally likely, we have

$$ P(A)=\frac{18}{36}=\frac{1}{2} \text{ and } P(B)=\frac{18}{36}=\frac{1}{2} $$

Also $$ P(A \cap B)=P(\text{ odd number on both throws }) $$

$$ =\frac{9}{36}=\frac{1}{4} $$

Now $$ P(A) P(B)=\frac{1}{2} \times \frac{1}{2}=\frac{1}{4} $$

Clearly $$ P(A \cap B)=P(A) \times P(B) $$

Thus, $\quad A$ and $B$ are independent events

Example 12 Three coins are tossed simultaneously. Consider the events $E$ ‘three heads or three tails’, $F$ ‘at least two heads’ and $G$ ‘at most two heads’. Of the pairs $(E, F)$, $(E, G)$ and $(F, G)$, which are independent? Which are dependent?

Solution The sample space of the experiment is given by

$$ S=\{HHH, HHT, HTH, THH, HTT, THT, TTH, TTT\} $$

Clearly $\quad E=\{HHH, TTT\}, F=\{HHH, HHT, HTH, THH\}$

and $$ G=\{HHT, HTH, THH, HTT, THT, TTH, TTT\} $$

Also $E \cap F=\{HHH\}, E \cap G=\{TTT\}, F \cap G=\{HHT, HTH, THH\}$

Therefore $$ \begin{array}{r} \mathrm{P}(\mathrm{E})=\frac{2}{8}=\frac{1}{4}, \mathrm{P}(\mathrm{F})=\frac{4}{8}=\frac{1}{2}, \mathrm{P}(\mathrm{G})=\frac{7}{8} \\ \mathrm{P}(\mathrm{E} \cap \mathrm{F})=\frac{1}{8}, \mathrm{P}(\mathrm{E} \cap \mathrm{G})=\frac{1}{8}, \mathrm{P}(\mathrm{F} \cap \mathrm{G})=\frac{3}{8} \end{array} $$

as well as $$ P(E) \cdot P(F)=\frac{1}{4} \times \frac{1}{2}=\frac{1}{8}, P(E) \cdot P(G)=\frac{1}{4} \times \frac{7}{8}=\frac{7}{32} $$ $$ P(F) \cdot P(G)=\frac{1}{2} \times \frac{7}{8}=\frac{7}{16} $$

Thus $$ P(E \cap F)=P(E) \cdot P(F) $$

$$ P(E \cap G) \neq P(E) \cdot P(G) $$

and $$ P(F \cap G) \neq P(F) . P(G) $$

Hence, the events ( $E$ and $F$ ) are independent, and the events $(E$ and $G)$ and $(F$ and $G)$ are dependent.

Example 13 Prove that if $E$ and $F$ are independent events, then so are the events $E$ and $F^{\prime}$.

Solution Since $E$ and $F$ are independent, we have

$$ P(E \cap F)=P(E) . P(F) \tag{1} $$

From the Venn diagram in Fig 13.3, it is clear that $E \cap F$ and $E \cap F^{\prime}$ are mutually exclusive events and also $E=(E \cap F) \cup(E \cap F^{\prime})$.

Therefore $$ P(E)=P(E \cap F)+P(E \cap F^{\prime}) $$

$$ \begin{aligned} P(E \cap F^{\prime}) & =P(E)-P(E \cap F) \\ & =P(E)-P(E) \cdot P(F) \quad (\text{by } (1)) \\ & =P(E)(1-P(F)) \\ & =P(E) \cdot P(F^{\prime}) \end{aligned} $$

Fig 13.3

Hence, $E$ and $F^{\prime}$ are independent

Note In a similar manner, it can be shown that if the events $E$ and $F$ are independent, then

(a) $E^{\prime}$ and $F$ are independent,

(b) $E^{\prime}$ and $F^{\prime}$ are independent

Example 14 If $A$ and $B$ are two independent events, show that the probability of occurrence of at least one of $A$ and $B$ is given by $1-P(A^{\prime}) P(B^{\prime})$.

Solution We have

$$ \begin{aligned} P(\text{ at least one of } A \text{ and } B) &=P(A \cup B) \\ &=P(A)+P(B)-P(A \cap B) \\ &=P(A)+P(B)-P(A) P(B) \\ &=P(A)+P(B)[1-P(A)] \\ &=P(A)+P(B) \cdot P(A^{\prime}) \\ &=1-P(A^{\prime})+P(B) P(A^{\prime}) \\ &=1-P(A^{\prime})[1-P(B)] \\ &=1-P(A^{\prime}) P(B^{\prime}) \end{aligned} $$
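A quick numerical check of this identity, with two assumed probabilities for independent events:

```python
pA, pB = 0.3, 0.7   # assumed probabilities of two independent events
lhs = pA + pB - pA * pB          # P(A union B) under independence
rhs = 1 - (1 - pA) * (1 - pB)    # 1 - P(A')P(B')
print(abs(lhs - rhs) < 1e-12)    # True
```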

13.5 Bayes’ Theorem

Consider that there are two bags I and II. Bag I contains 2 white and 3 red balls and Bag II contains 4 white and 5 red balls. One ball is drawn at random from one of the bags. We can find the probability of selecting any of the bags (i.e. $\frac{1}{2}$) or the probability of drawing a ball of a particular colour (say white) from a particular bag (say Bag I). In other words, we can find the probability that the ball drawn is of a particular colour, if we are given the bag from which the ball is drawn. But can we find the probability that the ball drawn is from a particular bag (say Bag II), if the colour of the ball drawn is given? Here, we have to find the ‘reverse’ probability that Bag II was selected, given that a known event occurred after the selection. The famous mathematician Thomas Bayes solved this problem of finding reverse probability by using conditional probability. The formula developed by him is known as ‘Bayes’ theorem’, which was published posthumously in 1763. Before stating and proving Bayes’ theorem, let us first take up a definition and some preliminary results.

13.5.1 Partition of a sample space

A set of events $E_1, E_2, \ldots, E_n$ is said to represent a partition of the sample space $S$ if

(a) $E_i \cap E_j=\phi, i \neq j, i, j=1,2,3, \ldots, n$

(b) $E_1 \cup E_2 \cup \ldots \cup E_n=S$ and

(c) $P(E_i)>0$ for all $i=1,2, \ldots, n$.

In other words, the events $E_1, E_2, \ldots, E_n$ represent a partition of the sample space $S$ if they are pairwise disjoint, exhaustive and have nonzero probabilities.

As an example, we see that any event $E$ with $0<P(E)<1$ and its complement $E^{\prime}$ form a partition of the sample space $S$, since they satisfy $E \cap E^{\prime}=\phi$ and $E \cup E^{\prime}=S$.

From the Venn diagram in Fig 13.3, one can easily observe that if $E$ and $F$ are any two events associated with a sample space $S$, then the set $\{E \cap F^{\prime}, E \cap F, E^{\prime} \cap F, E^{\prime} \cap F^{\prime}\}$ is a partition of the sample space $S$. It may be mentioned that the partition of a sample space is not unique; there can be several partitions of the same sample space. We shall now prove a theorem known as the theorem of total probability.

13.5.2 Theorem of total probability

Let $\{E_1, E_2, \ldots, E_n\}$ be a partition of the sample space $S$, and suppose that each of the events $E_1, E_2, \ldots, E_n$ has nonzero probability of occurrence. Let $A$ be any event associated with $S$, then

$ \begin{aligned} P(A) & =P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)+\ldots+P(E_n) P(A \mid E_n) \\ & =\sum _{j=1}^{n} P(E_j) P(A \mid E_j) \end{aligned} $

Proof Given that $E_1, E_2, \ldots, E_n$ is a partition of the sample space $S$ (Fig 13.4). Therefore,

$$ S=E_1 \cup E_2 \cup \ldots \cup E_n $$

and $$ E_i \cap E_j=\phi, i \neq j, i, j=1,2, \ldots, n $$

Now, we know that for any event $A$,

$$ \begin{aligned} A & =A \cap S \\ & =A \cap(E_1 \cup E_2 \cup \ldots \cup E_n) \\ & =(A \cap E_1) \cup(A \cap E_2) \cup \ldots \cup(A \cap E_n) \end{aligned} $$

Fig 13.4

Also $A \cap E_i$ and $A \cap E_j$ are respectively the subsets of $E_i$ and $E_j$. We know that $E_i$ and $E_j$ are disjoint, for $i \neq j$, therefore, $A \cap E_i$ and $A \cap E_j$ are also disjoint for all $i \neq j, i, j=1,2, \ldots, n$.

Thus, $$ \begin{aligned} P(A) & =P[(A \cap E_1) \cup(A \cap E_2) \cup \ldots . . \cup(A \cap E_n)] \\ & =P(A \cap E_1)+P(A \cap E_2)+\ldots+P(A \cap E_n) \end{aligned} $$

Now, by multiplication rule of probability, we have

$$ P(A \cap E_i)=P(E_i) P(A \mid E_i) \text{ as } P(E_i) \neq 0, \forall i=1,2, \ldots, n $$

Therefore, $$ P(A)=P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)+\ldots+P(E_n) P(A \mid E_n) $$

or $$ P(A)=\sum _{j=1}^{n} P(E_j) P(A \mid E_j) $$
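The theorem is simply a weighted sum of conditional probabilities. A minimal Python sketch (the function name is ours; the numbers come from the two-bag setting at the start of this section):

```python
from fractions import Fraction as Fr

def total_probability(priors, likelihoods):
    """P(A) = sum_j P(E_j) * P(A|E_j) over a partition {E_j}."""
    return sum(p * l for p, l in zip(priors, likelihoods))

# Bag I: 2 white of 5 balls; Bag II: 4 white of 9; a bag chosen at random
p_white = total_probability([Fr(1, 2), Fr(1, 2)], [Fr(2, 5), Fr(4, 9)])
print(p_white)  # 19/45
```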

Example 15 A person has undertaken a construction job. The probabilities are 0.65 that there will be a strike, 0.80 that the construction job will be completed on time if there is no strike, and 0.32 that the construction job will be completed on time if there is a strike. Determine the probability that the construction job will be completed on time.

Solution Let A be the event that the construction job will be completed on time, and B be the event that there will be a strike. We have to find $P(A)$.

We have $ \begin{aligned} P(B) & =0.65, P(\text{ no strike })=P(B^{\prime})=1-P(B)=1-0.65=0.35 \\ P(A \mid B) & =0.32, P(A \mid B^{\prime})=0.80 \end{aligned} $

Since the events B and B′ form a partition of the sample space S, by the theorem of total probability, we have

$$ \begin{aligned} \mathrm{P}(\mathrm{A}) & =\mathrm{P}(\mathrm{B}) \cdot \mathrm{P}(\mathrm{A} \mid \mathrm{B})+\mathrm{P}\left(\mathrm{B}^{\prime}\right) \mathrm{P}\left(\mathrm{A} \mid \mathrm{B}^{\prime}\right) \\ & =0.65 \times 0.32+0.35 \times 0.8 \\ & =0.208+0.28=0.488 \end{aligned} $$

Thus, the probability that the construction job will be completed on time is 0.488.

We shall now state and prove the Bayes’ theorem.

Bayes’ Theorem If $E_1, E_2, \ldots, E_n$ are $n$ non empty events which constitute a partition of sample space $S$, i.e. $E_1, E_2, \ldots, E_n$ are pairwise disjoint and $E_1 \cup E_2 \cup \ldots \cup E_n=S$ and A is any event of nonzero probability, then

$$ P(E_i \mid A)=\frac{P(E_i) P(A \mid E_i)}{\sum _{j=1}^{n} P(E_j) P(A \mid E_j)} \text{ for any } i=1,2,3, \ldots, n $$

Proof By formula of conditional probability, we know that

$$ \begin{aligned} P(E_i \mid A) & =\frac{P(A \cap E_i)}{P(A)} \\ & =\frac{P(E_i) P(A \mid E_i)}{P(A)} \text{ (by multiplication rule of probability) } \\ & =\frac{P(E_i) P(A \mid E_i)}{\sum _{j=1}^{n} P(E_j) P(A \mid E_j)} \text{ (by the theorem of total probability) } \end{aligned} $$
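In code, Bayes' theorem is one more line on top of the total-probability denominator; a hypothetical sketch, reusing the two-bag setting (the same numbers reappear in Example 16 below):

```python
from fractions import Fraction as Fr

def bayes(priors, likelihoods, i):
    """Posterior P(E_i | A) = P(E_i) P(A|E_i) / sum_j P(E_j) P(A|E_j)."""
    denom = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / denom

# Probability the red ball came from Bag II (index 1)
print(bayes([Fr(1, 2), Fr(1, 2)], [Fr(3, 7), Fr(5, 11)], 1))  # 35/68
```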

Remark The following terminology is generally used when Bayes’ theorem is applied. The events $E_1, E_2, \ldots, E_n$ are called hypotheses. The probability $P(E_i)$ is called the a priori probability of the hypothesis $E_i$. The conditional probability $P(E_i \mid A)$ is called the a posteriori probability of the hypothesis $E_i$.

Bayes’ theorem is also called the formula for the probability of “causes”. Since the $E_i$ ’s are a partition of the sample space $S$, one and only one of the events $E_i$ occurs (i.e. one of the events $E_i$ must occur and only one can occur). Hence, the above formula gives us the probability of a particular $E_i$ (i.e. a “Cause”), given that the event $A$ has occurred.

Bayes’ theorem has applications in a variety of situations, a few of which are illustrated in the following examples.

Example 16 Bag I contains 3 red and 4 black balls while another Bag II contains 5 red and 6 black balls. One ball is drawn at random from one of the bags and it is found to be red. Find the probability that it was drawn from Bag II.

Solution Let $E_1$ be the event of choosing the bag I, $E_2$ the event of choosing the bag II and $A$ be the event of drawing a red ball.

Then $$ P(E_1)=P(E_2)=\frac{1}{2} $$

Also $ P(A \mid E_1)=P(\text{ drawing a red ball from Bag I })=\frac{3}{7} $

and $$ P(A \mid E_2)=P(\text{ drawing a red ball from Bag II })=\frac{5}{11} $$

Now, the probability that the ball is drawn from Bag II, given that it is red, is $P(E_2 \mid A)$. By Bayes’ theorem,

$$ P(E_2 \mid A)=\frac{P(E_2) P(A \mid E_2)}{P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)}=\frac{\frac{1}{2} \times \frac{5}{11}}{\frac{1}{2} \times \frac{3}{7}+\frac{1}{2} \times \frac{5}{11}}=\frac{35}{68} $$

Example 17 Given three identical boxes I, II and III, each containing two coins. In box I, both coins are gold coins, in box II, both are silver coins and in the box III, there is one gold and one silver coin. A person chooses a box at random and takes out a coin. If the coin is of gold, what is the probability that the other coin in the box is also of gold?

Solution Let $E_1, E_2$ and $E_3$ be the events that boxes I, II and III are chosen, respectively.

Then $$ P(E_1)=P(E_2)=P(E_3)=\frac{1}{3} $$

Also, let A be the event that ’the coin drawn is of gold’

Then $$ \begin{aligned} & P(A \mid E_1)=P(\text{ a gold coin from box I })=\frac{2}{2}=1 \\ & P(A \mid E_2)=P(\text{ a gold coin from box II })=0 \\ & P(A \mid E_3)=P(\text{ a gold coin from box III })=\frac{1}{2} \end{aligned} $$

Now, the probability that the other coin in the box is of gold $$ \begin{aligned} & =\text{ the probability that the gold coin is drawn from box I } \\ & =P(E_1 \mid A) \end{aligned} $$

By Bayes’ theorem, we know that

$ \begin{aligned} P(E_1 \mid A) & =\frac{P(E_1) P(A \mid E_1)}{P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)+P(E_3) P(A \mid E_3)} \\ & =\frac{\frac{1}{3} \times 1}{\frac{1}{3} \times 1+\frac{1}{3} \times 0+\frac{1}{3} \times \frac{1}{2}}=\frac{2}{3} \end{aligned} $

Example 18 Suppose that the reliability of a HIV test is specified as follows:

Of people having HIV, the test detects the disease in $90 \%$ of cases, but $10 \%$ go undetected. Of people free of HIV, $99 \%$ are judged HIV-ive by the test, but $1 \%$ are diagnosed as HIV+ive. From a large population of which only $0.1 \%$ have HIV, one person is selected at random, given the HIV test, and the pathologist reports him/her as HIV+ive. What is the probability that the person actually has HIV?

Solution Let $E$ denote the event that the person selected is actually having HIV and A the event that the person’s HIV test is diagnosed as +ive. We need to find $P(E \mid A)$.

Also $E^{\prime}$ denotes the event that the person selected is actually not having HIV.

Clearly, $\{E, E^{\prime}\}$ is a partition of the sample space of all people in the population. We are given that

$$ \begin{aligned} & P(E)=0.1 \%=\frac{0.1}{100}=0.001 \\ & P\left(E^{\prime}\right)=1-P(E)=0.999 \end{aligned} $$

$ P(A \mid E)=P(\text{person tested as HIV+ive given that he/she is actually having HIV})=90 \%=\frac{90}{100}=0.9 $

and $P(A \mid E^{\prime})=P($ Person tested as HIV +ive given that he/she is actually not having HIV) $=1 \%=0.01$

Now, by Bayes’ theorem

$ \begin{aligned} P(E \mid A) & =\frac{P(E) P(A \mid E)}{P(E) P(A \mid E)+P(E^{\prime}) P(A \mid E^{\prime})} \\ & =\frac{0.001 \times 0.9}{0.001 \times 0.9+0.999 \times 0.01}=\frac{90}{1089} \\ & =0.083 \text{ approx. } \end{aligned} $

Thus, the probability that a person selected at random is actually having HIV given that he/she is tested HIV+ive is 0.083 .
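The low base rate dominates this result; a short sketch (illustrative only) reproducing the computation of Example 18:

```python
p_hiv = 0.001          # prevalence P(E)
sens = 0.9             # P(+|E), the test's detection rate
false_pos = 0.01       # P(+|E'), the test's false positive rate

p_pos = p_hiv * sens + (1 - p_hiv) * false_pos   # total probability of +ive
posterior = p_hiv * sens / p_pos                  # Bayes' theorem
print(round(posterior, 3))  # 0.083, despite a highly accurate test
```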

Example 19 In a factory which manufactures bolts, machines A, B and C manufacture respectively $25 \%, 35 \%$ and $40 \%$ of the bolts. Of their outputs, 5, 4 and 2 percent are respectively defective bolts. A bolt is drawn at random from the product and is found to be defective. What is the probability that it is manufactured by the machine $B$ ?

Solution Let events $B_1, B_2, B_3$ be the following :

$B_1$ : the bolt is manufactured by machine $A$

$B_2$ : the bolt is manufactured by machine $B$

$B_3$ : the bolt is manufactured by machine $C$

Clearly, $B_1, B_2, B_3$ are mutually exclusive and exhaustive events and hence, they represent a partition of the sample space.

Let the event $E$ be ’the bolt is defective’. The event $E$ occurs with $B_1$ or with $B_2$ or with $B_3$. Given that,

$$ \mathrm{P}\left(\mathrm{B} _{1}\right)=25 \%=0.25, \mathrm{P}\left(\mathrm{B} _{2}\right)=0.35 \text{ and } \mathrm{P}\left(\mathrm{B} _{3}\right)=0.40 $$

Again $P(E \mid B_1)=$ Probability that the bolt drawn is defective given that it is manufactured by machine $A=5 \%=0.05$

Similarly, $\mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{2}\right)=0.04, \mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{3}\right)=0.02$

Hence, by Bayes’ Theorem, we have

$$ \begin{aligned} \mathrm{P}\left(\mathrm{B} _{2} \mid \mathrm{E}\right) & =\frac{\mathrm{P}\left(\mathrm{B} _{2}\right) \mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{2}\right)}{\mathrm{P}\left(\mathrm{B} _{1}\right) \mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{1}\right)+\mathrm{P}\left(\mathrm{B} _{2}\right) \mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{2}\right)+\mathrm{P}\left(\mathrm{B} _{3}\right) \mathrm{P}\left(\mathrm{E} \mid \mathrm{B} _{3}\right)} \\ & =\frac{0.35 \times 0.04}{0.25 \times 0.05+0.35 \times 0.04+0.40 \times 0.02}=\frac{0.0140}{0.0345}=\frac{28}{69} \end{aligned} $$

Example 20 A doctor is to visit a patient. From the past experience, it is known that the probabilities that he will come by train, bus, scooter or by other means of transport are respectively $\frac{3}{10}, \frac{1}{5}, \frac{1}{10}$ and $\frac{2}{5}$. The probabilities that he will be late are $\frac{1}{4}, \frac{1}{3}$, and $\frac{1}{12}$, if he comes by train, bus and scooter respectively, but if he comes by other means of transport, then he will not be late. When he arrives, he is late. What is the probability that he comes by train?

Solution Let $E$ be the event that the doctor visits the patient late and let $T_1, T_2, T_3, T_4$ be the events that the doctor comes by train, bus, scooter, and other means of transport respectively.

Then $$ P(T_1)=\frac{3}{10}, P(T_2)=\frac{1}{5}, P(T_3)=\frac{1}{10} \text{ and } P(T_4)=\frac{2}{5} \quad \text{ (given) } $$

$ P(E \mid T_1)=\text{ Probability that the doctor is late, given that he comes by train }=\frac{1}{4} $

Similarly, $P(E \mid T_2)=\frac{1}{3}, P(E \mid T_3)=\frac{1}{12}$ and $P(E \mid T_4)=0$, since he is not late if he comes by other means of transport.

Therefore, by Bayes’ Theorem, we have

$P(T_1 \mid E)=$ Probability that the doctor arriving late comes by train

$$ \begin{aligned} & =\frac{P(T_1) P(E \mid T_1)}{P(T_1) P(E \mid T_1)+P(T_2) P(E \mid T_2)+P(T_3) P(E \mid T_3)+P(T_4) P(E \mid T_4)} \\ & =\frac{\frac{3}{10} \times \frac{1}{4}}{\frac{3}{10} \times \frac{1}{4}+\frac{1}{5} \times \frac{1}{3}+\frac{1}{10} \times \frac{1}{12}+\frac{2}{5} \times 0}=\frac{3}{40} \times \frac{120}{18}=\frac{1}{2} \end{aligned} $$

Hence, the required probability is $\frac{1}{2}$.

Example 21 A man is known to speak truth 3 out of 4 times. He throws a die and reports that it is a six. Find the probability that it is actually a six.

Solution Let $E$ be the event that the man reports that six occurs in the throwing of the die and let $S_1$ be the event that six occurs and $S_2$ be the event that six does not occur.

Then $ \begin{aligned} & P(S_1)=\text{ Probability that six occurs }=\frac{1}{6} \end{aligned} $

$ \begin{aligned} & P(S_2)=\text{ Probability that six does not occur }=\frac{5}{6} \end{aligned} $

$P(E \mid S_1)=$ Probability that the man reports that six occurs when six has actually occurred on the die

= Probability that the man speaks the truth $=\frac{3}{4}$

$P(E \mid S_2)=$ Probability that the man reports that six occurs when six has not actually occurred on the die

= Probability that the man does not speak the truth $=1-\frac{3}{4}=\frac{1}{4}$

Thus, by Bayes’ theorem, we get

$P(S_1 \mid E)=$ Probability that the report of the man that six has occurred is actually a six

$ \begin{aligned} & =\frac{P(S_1) P(E \mid S_1)}{P(S_1) P(E \mid S_1)+P(S_2) P(E \mid S_2)}=\frac{\frac{1}{6} \times \frac{3}{4}}{\frac{1}{6} \times \frac{3}{4}+\frac{5}{6} \times \frac{1}{4}}=\frac{1}{8} \times \frac{24}{8}=\frac{3}{8} \end{aligned} $

Hence, the required probability is $\frac{3}{8}$.

Remark A random variable is a real valued function whose domain is the sample space of a random experiment.

For example, let us consider the experiment of tossing a coin two times in succession. The sample space of the experiment is

$$S=\{HH, HT, TH, TT\}$$.

If $X$ denotes the number of heads obtained, then $X$ is a random variable and for each outcome, its value is as given below :

$$ X(HH)=2, X(HT)=1, X(TH)=1, X(TT)=0 . $$

More than one random variable can be defined on the same sample space. For example, let $Y$ denote the number of heads minus the number of tails for each outcome of the above sample space $S$. Then

$$ Y(HH)=2, Y(HT)=0, Y(TH)=0, Y(TT)=-2 $$

Thus, $X$ and $Y$ are two different random variables defined on the same sample space $S$.
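Since a random variable is just a function on the sample space, X and Y above translate directly into code; a minimal sketch (illustrative only):

```python
from itertools import product

S = list(product("HT", repeat=2))  # {HH, HT, TH, TT}

X = {w: w.count("H") for w in S}                 # number of heads
Y = {w: w.count("H") - w.count("T") for w in S}  # heads minus tails

print(X)  # X(HH)=2, X(HT)=1, X(TH)=1, X(TT)=0
print(Y)  # Y(HH)=2, Y(HT)=0, Y(TH)=0, Y(TT)=-2
```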

Miscellaneous Examples

Example 22 Coloured balls are distributed in four boxes as shown in the following table:

$$ \begin{array}{c|cccc} \text{Box} & \text{Black} & \text{White} & \text{Red} & \text{Blue} \\ \hline \text{I} & 3 & 4 & 5 & 6 \\ \text{II} & 2 & 2 & 2 & 2 \\ \text{III} & 1 & 2 & 3 & 1 \\ \text{IV} & 4 & 3 & 1 & 5 \end{array} $$

A box is selected at random and then a ball is randomly drawn from the selected box. If the colour of the ball drawn is black, what is the probability that the ball is drawn from box III?

Solution Let $A, E_1, E_2, E_3$ and $E_4$ be the events as defined below :

$ \begin{matrix} A: \text{ a black ball is selected } & E_1: \text{ box I is selected } \\ E_2: \text{ box II is selected } & E_3: \text{ box III is selected } \\ E_4: \text{ box IV is selected } & \end{matrix} $

Since the boxes are chosen at random,

Therefore $ P(E_1)=P(E_2)=P(E_3)=P(E_4)=\frac{1}{4} $

Also $ P(A \mid E_1)=\frac{3}{18}, P(A \mid E_2)=\frac{2}{8}, P(A \mid E_3)=\frac{1}{7} \text{ and } P(A \mid E_4)=\frac{4}{13} $

$P($ box III is selected, given that the drawn ball is black $)=P(E_3 \mid A)$. By Bayes’ theorem,

$ \begin{aligned} P(E_3 \mid A) & =\frac{P(E_3) \cdot P(A \mid E_3)}{P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)+P(E_3) P(A \mid E_3)+P(E_4) P(A \mid E_4)} \\ & =\frac{\frac{1}{4} \times \frac{1}{7}}{\frac{1}{4} \times \frac{3}{18}+\frac{1}{4} \times \frac{2}{8}+\frac{1}{4} \times \frac{1}{7}+\frac{1}{4} \times \frac{4}{13}} \approx 0.165 \end{aligned} $

Example 23 $A$ and $B$ throw a die alternately till one of them gets a ‘6’ and wins the game. Find their respective probabilities of winning, if $A$ starts first.

Solution Let S denote the success (getting a ‘6’) and $F$ denote the failure (not getting a ‘6’).

Thus, $$ P(S)=\frac{1}{6}, P(F)=\frac{5}{6} $$

$P(A$ wins in the first throw $)=P(S)=\frac{1}{6}$

A gets the third throw, when the first throw by $A$ and second throw by $B$ result into failures.

Therefore, $\quad P(A$ wins in the 3rd throw $)=P(FFS)=P(F) P(F) P(S)=\frac{5}{6} \times \frac{5}{6} \times \frac{1}{6}$ $ =(\frac{5}{6})^{2} \times \frac{1}{6} $

$P(A$ wins in the 5th throw $)=P(FFFFS)=(\frac{5}{6})^{4}(\frac{1}{6})$ and so on.

Hence, $ \begin{aligned} P(\text{ A wins }) & =\frac{1}{6}+(\frac{5}{6})^{2}(\frac{1}{6})+(\frac{5}{6})^{4}(\frac{1}{6})+\ldots \\ \end{aligned} $

$$\begin{aligned} & =\frac{\frac{1}{6}}{1-\frac{25}{36}}=\frac{6}{11} \\ P(B \text{ wins }) & =1-P(\text{ A wins })=1-\frac{6}{11}=\frac{5}{11} \end{aligned}$$

Remark If $a+a r+a r^{2}+\ldots+a r^{n-1}+\ldots$ is an infinite G.P. with $|r|<1$, then its sum is given by $\frac{a}{1-r}$. (Refer to A.1.3 of the Class XI textbook.)
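Both the series sum and a direct simulation confirm $P(A \text{ wins})=\frac{6}{11}$; a minimal simulation sketch (illustrative only, function name ours):

```python
import random

def a_wins(trials=100_000):
    """Estimate P(A wins) when A and B throw a die alternately, A first."""
    wins = 0
    for _ in range(trials):
        turn_a = True
        while True:
            if random.randint(1, 6) == 6:   # current player throws a '6'
                wins += turn_a
                break
            turn_a = not turn_a             # pass the die to the other player
    return wins / trials

print(a_wins())  # close to 6/11, approximately 0.5455
```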

Example 24 If a machine is correctly set up, it produces $90 \%$ acceptable items. If it is incorrectly set up, it produces only $40 \%$ acceptable items. Past experience shows that $80 \%$ of the set ups are correctly done. If after a certain set up, the machine produces 2 acceptable items, find the probability that the machine is correctly set up.

Solution Let A be the event that the machine produces 2 acceptable items. Also let $B_1$ represent the event of correct set up and $B_2$ represent the event of incorrect setup.

Now $$ P(B_1)=0.8, P(B_2)=0.2 $$ $$ P(A \mid B_1)=0.9 \times 0.9 \text{ and } P(A \mid B_2)=0.4 \times 0.4 $$

Therefore $$ \begin{aligned} \mathrm{P}\left(\mathrm{B} _{1} \mid \mathrm{A}\right) & =\frac{\mathrm{P}\left(\mathrm{B} _{1}\right) \mathrm{P}\left(\mathrm{A} \mid \mathrm{B} _{1}\right)}{\mathrm{P}\left(\mathrm{B} _{1}\right) \mathrm{P}\left(\mathrm{A} \mid \mathrm{B} _{1}\right)+\mathrm{P}\left(\mathrm{B} _{2}\right) \mathrm{P}\left(\mathrm{A} \mid \mathrm{B} _{2}\right)} \\ & =\frac{0.8 \times 0.9 \times 0.9}{0.8 \times 0.9 \times 0.9+0.2 \times 0.4 \times 0.4}=\frac{648}{680}=0.95 \end{aligned} $$

Summary

The salient features of the chapter are -

  • The conditional probability of an event $E$, given the occurrence of the event $F$ is given by

$P(E \mid F)=\frac{P(E \cap F)}{P(F)}, P(F) \neq 0$

  • $0 \leq P(E \mid F) \leq 1, \quad P(E^{\prime} \mid F)=1-P(E \mid F)$

$P((E \cup F) \mid G)=P(E \mid G)+P(F \mid G)-P((E \cap F) \mid G)$

  • $P(E \cap F)=P(E) P(F \mid E), P(E) \neq 0$

$P(E \cap F)=P(F) P(E \mid F), P(F) \neq 0$

  • If E and F are independent, then

$P(E \cap F)=P(E) P(F)$

$P(E \mid F)=P(E), P(F) \neq 0$

$P(F \mid E)=P(F), P(E) \neq 0$

  • Theorem of total probability

Let $\{E_1, E_2, \ldots, E_n\}$ be a partition of a sample space $S$ and suppose that each of $E_1, E_2, \ldots, E_n$ has nonzero probability. Let $A$ be any event associated with $S$, then $P(A)=P(E_1) P(A \mid E_1)+P(E_2) P(A \mid E_2)+\ldots+P(E_n) P(A \mid E_n)$

  • Bayes’ theorem If $E_1, E_2, \ldots, E_n$ are events which constitute a partition of sample space $S$, i.e. $E_1, E_2, \ldots, E_n$ are pairwise disjoint and $E_1 \cup E_2 \cup \ldots \cup E_n=S$ and $A$ be any event with nonzero probability, then

$$ P(E_i \mid A)=\frac{P(E_i) P(A \mid E_i)}{\sum _{j=1}^{n} P(E_j) P(A \mid E_j)} $$

Historical Note

The earliest indication of the measurement of chances in games of dice appeared in 1477 in a commentary on Dante’s Divine Comedy. A treatise on gambling named Liber de Ludo Aleae, by Girolamo Cardano (1501-1576), was published posthumously in 1663. In this treatise, he gives the number of favourable cases for each event when two dice are thrown. Galileo (1564-1642) gave casual remarks concerning the correct evaluation of chance in a game of three dice. Galileo analysed that when three dice are thrown, the sum of the numbers that appear is more likely to be 10 than 9, because the number of cases favourable to 10 is greater than the number of cases favourable to 9.

Apart from these early contributions, it is generally acknowledged that the true origin of the science of probability lies in the correspondence between two great men of the seventeenth century, Pascal (1623-1662) and Pierre de Fermat (1601-1665). A French gambler, the Chevalier de Méré, asked Pascal to explain some seeming contradiction between his theoretical reasoning and the observations gathered from gambling. In a series of letters written around 1654, Pascal and Fermat laid the first foundation of the science of probability. Pascal solved the problem in an algebraic manner while Fermat used the method of combinations.

The great Dutch scientist Huygens (1629-1695) became acquainted with the content of the correspondence between Pascal and Fermat and published the first book on probability, “De Ratiociniis in Ludo Aleae”, containing solutions of many interesting, rather than difficult, problems on probability in games of chance. The next great work on probability theory is by Jacob Bernoulli (1654-1705), in the form of a great book, “Ars Conjectandi”, published posthumously in 1713 by his nephew, Nicholas Bernoulli. To him is due the discovery of one of the most important probability distributions, known as the binomial distribution. The next remarkable development came in 1933: A. N. Kolmogorov (1903-1987) is credited with the axiomatic theory of probability. His book, “Foundations of Probability”, published in 1933, introduces probability as a set function and is considered a classic.
