
STATISTICS FOR ECONOMISTS

ECON 2042
Addis Ababa University

College of Business and


Economics

Department of Economics
May, 2022
CHAPTER ONE
OVERVIEW OF BASIC PROBABILITY
THEORY
Introduction
 There are many happenings in nature and in the realm of human
activity that are associated with uncertainty.
 Uncertainty plays an important role in our daily lives and
activities as well as in business.
 It is often necessary to "guess" about the outcome of an event
in order to make a decision.
 In our daily lives we are faced with a lot of decision-making
situations that involve uncertainty.
For example:
 In a business context: investment, stock prices
 Game Wins, Lottery wins, Election win, Traffic
jam
 Market Conditions: effectiveness of an ad
campaign or the eventual success of a new product.
 Weather forecast: the appearance of clouds in the sky the next
morning is not certain.
 The sex of a baby to be born some months hence is
again not known for certain.
 In general, we often do not know an outcome with certainty;
instead, we are forced to guess, to estimate, to hedge our bets.
 In each of such happenings there are two or more outcomes
leading to uncertainty.
 If an experiment is repeated under essentially similar conditions,
we generally come across two types of situations, namely:
1) the result or outcome is unique or certain - known as a
deterministic phenomenon,
2) the result is not unique but may be any one of several
possible outcomes - known as an unpredictable, non-
deterministic or probabilistic phenomenon.
 An experiment (model) related to a non-deterministic phenomenon
is called a random experiment or non-deterministic model.
 In this course we will deal with random experiments, or non-
deterministic models.
Examples:
Exp1: Toss a die and observe the number that shows up.
Exp2: Toss a coin n times and observe the total number of
heads obtained.
Exp3: Manufacture items on a production line and count the
number of defective items produced during a 24 hr. period.
Exp4: Testing the life length of a manufactured light bulb
by inserting it into a socket and recording the time elapsed (in hrs.)
until it burns out.
Exp5: Measuring blood pressure of a group of individuals,
Exp6: Checking an automobile’s petrol mileage,
Exp7 : Measuring daily rainfall, and so on.
So what is probability theory ?
 Probability is the science of uncertainty.
 It provides precise mathematical rules for understanding and
analyzing our own ignorance.
 It does not tell us tomorrow’s weather or next week’s stock prices;
rather, it gives us a framework for working with our limited
knowledge and for making sensible decisions based on what we
do and do not know.
 It is a measure of uncertainty, with values between zero and one
inclusive, describing the relative possibility (chance or likelihood)
of occurrence of an event.
 In sum, probability theory provides us with a precise
understanding of uncertainty; and this understanding can help us
make predictions, make better decisions, assess risks, and even
make money.
1.1 Sample Space, Sample Points, Events and Event
Space
i) Experiment: An activity or measurement that results in an
outcome.
ii) Sample space: is the set of all possible outcomes of an experiment
denoted by S or Ω.
iii) Sample points: Each element of the sample space S is called a
sample point and usually denoted by ω𝑖 .
iv) Events: One or more of the possible outcomes; an event is a subset of the
sample space, S.
v) Event space: is a class or collection of all events associated with a
given experiment OR sample space. We use E to denote event
space.
EXAMPLE: Let the experiment be rolling a well-balanced single
die one time.
Given this:
 The sample space of the experiment is S = {1, 2, 3, 4, 5, 6}
which contains all the possible outcomes of the experiment.
 Each outcome is a sample point, i.e., 𝝎𝒊 ∈ S.
Let A = {2, 4, 6}.
 Then A is a subset of S and defines the event of obtaining an
even outcome.
Let B = {4}.
 Then B defines the event of obtaining a number 4.
 Both A and B are subsets of the sample space and thus are
events.
 We may be interested in the following events:
a) the outcome is the number 1
b) the outcome is even but less than 3
c) the outcome is not even, and so on.
 When an event contains only one element of S, like B above, it is
called a simple (elementary) event.
 If we collect all subsets of S in one set and denote it by E, this new set
is called the Event Space.
Types of Events
 We have already defined an event as any subset of the
outcomes of an experiment.
1) Mutually Exclusive Events: If two or more events cannot occur
simultaneously in a single trial of an experiment, then such events
are called mutually exclusive events or disjoint events.
 In other words, two events are mutually exclusive if the occurrence
of one of them prevents or rules out the occurrence of the other.
For example, the numbers 2 and 3 cannot occur simultaneously on a single
roll of a die.
 Symbolically, a set of events {𝑨𝟏 , 𝑨𝟐 , . . ., 𝑨𝒏 } is mutually
exclusive if 𝑨𝒊 ∩ 𝑨𝒋 = ∅ for i ≠ j.
 This means the intersection of two events is a null set (∅); it is
impossible to observe an event that is common in both 𝑨𝒊 and 𝑨𝒋 .
2) Collectively Exhaustive Events: A list of events is said to be
collectively exhaustive when it includes every possible outcome of
the experiment.
 That is, two or more events are said to be collectively exhaustive
if one of the events must occur.
 Symbolically, a set of events {𝑨𝟏 , 𝑨𝟐 , . . ., 𝑨𝒏 } is collectively exhaustive if
the union of these events is identical with the sample space S.
 That is, S = 𝑨𝟏 ∪ 𝑨𝟐 ∪ . . . ∪ 𝑨𝒏 .
For example, being male and being female, or the event of an even number and
the event of an odd number in rolling a die, are mutually exclusive and collectively
exhaustive events.
3) Independent and Dependent Events: Two events are said to be
independent if information about one tells nothing about the occurrence of the
other.
 In other words, outcome of one event does not affect, and is not
affected by, the other event.
 The outcome of each toss of a coin is independent of the
preceding tosses.
Example: Increase in the population (in per cent) per year in India is
independent of increase in wheat production (in per cent) per year in the USA.
 However, two or more events are said to be dependent if
information about one tells something about the other.
 That is, dependence between characteristics implies that a
relationship exists, and therefore, knowledge of one characteristic
is useful in assessing the occurrence of the other.
For example, drawing of a card (say a queen) from a pack of playing
cards without replacement reduces the chances of drawing a queen
in the subsequent draws.
4) Compound Events: When two or more events occur in connection
with each other, their simultaneous occurrence is called a
compound event.
 Here we are interested in finding the probability of more than one event
occurring at the same time.
 These events may be (i) independent, or (ii) dependent.
5) Equally Likely Events: Two or more events are said to be equally
likely if each has an equal chance to occur.
 That is, one of them cannot be expected to occur in preference to
the other.
For example, each number may be expected to occur on the
uppermost face of a rolling die the same number of times in the long
run.
6) Complementary Events: If A is any subset of the sample space (S),
then its complement, denoted by Ā (read as "A-bar"), contains all the
elements of the sample space that are not part of A.
 If S denotes the sample space then,
Ā = S − A
= {All sample elements not in A}
 Obviously such events must be mutually exclusive and collectively
exhaustive.
1.2 Approaches of Probability
 A general definition of probability states that probability is a
numerical measure (between 0 and 1 inclusively) of the likelihood
or chance of occurrence of an uncertain event.
 However, it does not tell us how to compute the probability.
 In this section, we shall discuss different conceptual approaches of
calculating the probability of an event.
 There are three widely used or applied definitions or interpretations
of probability.
 These definitions are:
a) Classical or a priori definition of probability
b) Relative frequency or posterior definition of probability
c) Subjective definition of probability
A) Classical or a priori definition of Probability
 This approach of defining the probability is based on the assumption that all
the possible outcomes (finite in number) of an experiment are mutually
exclusive and equally likely.
Definition: If a random experiment can result in N mutually exclusive and
equally likely outcomes, i.e. if the sample space S consists of N mutually
exclusive and equally likely outcomes, and if NA of these outcomes have an
attribute A.
 Then the probability of A is given by the ratio:
P(A) = N_A / N
= (Number of favorable outcomes with attribute A) / (Total number of possible outcomes) ………….……(1)
 Since the probability of occurrence of an event is based on prior
knowledge of the process involved, therefore this approach is often
called a priori classical probability approach.
 This means, we do not have to perform random experiments to find the
probability of occurrence of an event.
 This also implies that no experimental data are required for
computation of probability.
 Since the assumption of equally likely simple events can rarely be
verified with certainty, therefore this approach is not used often other
than in games of chance.
Example 1: Let the random experiment be tossing of a single die.
S = {1, 2, 3, 4, 5, 6}.
These six outcomes are mutually exclusive since two or more
faces can not turn up at once or simultaneously.
And if the die is fair, or unbiased, the six outcomes are equally
likely that is, each face is expected to appear with about equal
relative frequency in the long run.
In this example the probability of obtaining any single outcome,
say 2, on a single toss is 1/N = 1/6.
Example 2: Similarly for the process of selecting a card at random, each
event or card is mutually exclusive, exhaustive, and equiprobable.
 The probability of selecting any one card on a trial is equal to 1/52,
since there are 52 cards.
Hence, in general, for a random experiment with N mutually
exclusive, exhaustive, equiprobable events, the probability of any of
the events is equal to 1/N.
 For instance, if an event A is an elementary event, i.e. it contains
a single outcome, then its probability P(A) is 1/N.
Example 3: Let the experiment be tossing a single coin 3 times: or
tossing 3 coins simultaneously.
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
 If the coin is well balanced, or fair, these 8 possible outcomes are
mutually exclusive and equally likely outcomes.
𝟏
 Each single outcome, or sample point, has the probability of =
𝑵
𝟏
.
𝟖
 Now suppose that we want the probability that the result of a toss
be at least two heads and represent it by B.
B = {HHH, HHT, HTH, THH}
 Then, the probability of the event B occurring is given by
𝑁(𝐵) 4
P B = = = 0.5
𝑁 8
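The classical calculation above can be checked by brute-force enumeration. The following Python sketch (the variable names are illustrative, not part of the course material) lists the 8 equally likely outcomes of three tosses and counts those favorable to B:

from itertools import product
from fractions import Fraction

# Sample space for tossing a fair coin 3 times: 2**3 = 8 equally likely outcomes
S = list(product("HT", repeat=3))

# Event B: at least two heads
B = [w for w in S if w.count("H") >= 2]

# Classical probability: favorable outcomes divided by total outcomes
p_B = Fraction(len(B), len(S))
print(len(B), len(S), p_B)   # 4 8 1/2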
 To sum up, according to the classical definition, the probability P(A) of an
event A is determined a priori without actual experimentation, i.e. no
actual experiment need ever take place.
 In other words, it is possible to conceive the experiment, say, of throwing a
coin 3 times without actually doing it and proceed logically on the
assumption of an unbiased coin.
 That is why it is called a priori probability.
NOTE: By the classical definition the probability of event A is a number
between 0 and 1 inclusive, i.e. P(A) takes values between 0 and 1:
0 ≤ P(A) ≤ 1.
i. P(A) ≤ 1, because the total number of possible outcomes cannot be less
than the number of outcomes with the specified attribute, i.e. N ≥
N(A).
ii. If an event is certain to happen, its probability is 1
iii. If it is certain not to happen, its probability is 0.
Critiques:
 The classical definition breaks down in the following cases.
(a) If outcomes of an experiment are not mutually exclusive and equally
likely.
Example:
 What is the probability of the event of rolling a number 4, with a single
toss of a die if the die is biased?
 To put it differently, if the die is loaded and the probability of 4 equals,
say 0.2, the number 0.2 can not be calculated from the ratio given by
equation (1).
 The classical definition will not help us when we try to answer
questions such as: What is the probability that
 a child born in Addis will be a boy? a male will die before age
50? a candidate will pass a certain test? it will rain tomorrow?
 Outcomes of such experiments are not equally likely, so the classical
approach cannot answer these questions.
 For example, it is not possible to state in advance, without repetitive
trials(empirical test) of the experiment, the probabilities in cases like
(i) whether a number greater than 3 will appear when die is rolled or
(ii) if 100 items will include 10 defective items.
b) When the number of possible outcomes is infinite
 So, the alternative is to use relative frequency or posterior definition of
probability.
2) Relative Frequency (posterior) definition of probability
 This approach of computing probability is based on the assumption that a
random experiment can be repeated a large number of times under
identical conditions where trials are independent to each other.
 While conducting a random experiment, we may or may not observe the
desired event.
 But as the experiment is repeated many times, that event may occur some
proportion of time.
 Thus, the approach calculates the proportion of the time (i.e.
the relative frequency) with which the event occurs over an
infinite number of repetitions of the experiment under identical
conditions.
 Let’s say N(A) is the number of times that A occurs in N trials;
the ratio N(A)/N appears to converge to a constant limit as N
increases indefinitely.
 We can take the ultimate value of this ratio as the P(A).
For example, if a die is tossed N times and N(A) denotes the
number of times the event A (say, the number 4, 5, or 6) occurs, then the
ratio P(A) = N(A)/N gives the proportion of times the event A
occurs in N trials; this is also called the relative frequency of the
event in N trials.
 Although our estimate of P(A) may change after every trial,
we will find that the proportion N(A)/N tends to cluster
around a unique central value as the number of trials N
becomes larger.
 This unique central value (also called the probability of event A under
the relative frequency definition) is given by
P(A) = lim (N→∞) N(A)/N ………………………………….…(2)
 This assumes that a series of experiments can be made
keeping the initial conditions as equal as possible.
 An observation of a random experiment is made, then the
experiment is repeated and another observation is taken.
 When the experiment is repeated a sufficiently large number of times, in
many cases the observations fall into certain classes wherein
the relative frequencies are quite stable.
 This stable relative frequency can be taken as the probability
(approximate) of events.
Example_1: Consider tossing a coin (balanced or unbalanced) 1000 times.
A result like the following might be obtained from this experiment.
OUTCOME   Observed Frequency   Observed Relative Frequency   Long-run Expected Frequency
T              540                  0.54                         0.5
H              460                  0.46                         0.5
Total         1000                  1.00                         1.00
 Thus, the probability of observing T is P(T) = 540/1000 ≈ 0.5, and the
probability of observing H is P(H) = 460/1000 ≈ 0.5.
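The convergence of the relative frequency N(A)/N can be illustrated with a small simulation. The following Python sketch (the random seed and the assumed head probability of 0.5 are arbitrary choices) repeats the coin-toss experiment for increasing N and prints the observed relative frequency of heads:

import random

random.seed(2022)  # arbitrary seed so the run is reproducible

def relative_freq_of_heads(n_tosses, p_head=0.5):
    """Toss a coin n_tosses times and return the relative frequency N(A)/N of heads."""
    heads = sum(random.random() < p_head for _ in range(n_tosses))
    return heads / n_tosses

# As N grows, the relative frequency settles near the true probability (0.5 here).
for n in (10, 100, 1000, 100000):
    print(n, relative_freq_of_heads(n))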
Example_2: If we find upon examination of large records that about 51%
of births, in one locality, are male, it might be reasonable to postulate that
the probability of a male birth in this locality is equal to P with P ≈ 0.51, i.e.
P(male birth in that locality) = P ≈ 0.51.
Example_3: A study of 8,000 economics graduates of A.A.U was
conducted. The study revealed that 400 of these students were not employed
in their major areas of study.
What is the probability that a particular economics graduate will be
employed in an area other than his/her field of study?
P(a graduate will be employed in an area other than his/her major) ≈ 400/8000 = 0.05
 What we learn from the above examples is that the probability of an event
happening in the long run is determined by observing what fraction of
the time similar events happened in the past.
P(an event happening) = (Number of times the event occurred in the past) / (Total number of observations)
NOTE: the relative frequency definition does not
a) require events to be equally likely
b) necessarily require the objects of the experiment to be unbiased.

C) Subjective Definition
 If there is little or no past experience on which to base probability it may be
guessed subjectively.
 Essentially, this means evaluating the available information and then estimating
the probability of an event.
 The subjective approach is based on degrees of belief, conviction, and
experience concerning the likelihood of occurrence of a random event.
Example: Estimating:
i) The probability that a new product of a firm will be successful in the market.
ii) The probability that a student will score an A in the course, and
iii) When a person says that the probability of rain tomorrow is, say 70% - he is
expressing his personal degree of belief.
1.3. Axioms of Probability: The Rule of probability
 Before giving axiomatic definition of probability let’s first see, the summary of what
we know about set theory which are relevant to our interest.
De Morgan’s Laws
I) (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
II) (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ
Example: Suppose Mr. X is interviewed for two jobs (bank trainee and
salesman).
Let A = Mr. X will be offered the bank trainee job
B = Mr. X will be offered the salesman job
P(A) = 0.30; P(B) = 0.50 and P(A ∩ B) = 0.10
Questions: What is the probability that Mr. X
i) will be offered neither Bank Trainee nor Salesman?
ii) will not be offered both jobs at the same time?
(Use Venn Diagram for illustration)
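One possible computation of these two probabilities, assuming only the standard addition and complement rules, is sketched below in Python (the variable names are illustrative only):

# Given values from the example
p_A = 0.30          # offered the bank trainee job
p_B = 0.50          # offered the salesman job
p_A_and_B = 0.10    # offered both jobs

# General addition rule: P(A or B)
p_A_or_B = p_A + p_B - p_A_and_B          # 0.70

# (i) Neither job: complement of (A or B), i.e. De Morgan's first law
p_neither = 1 - p_A_or_B                  # 0.30

# (ii) Not both jobs at the same time: complement of (A and B)
p_not_both = 1 - p_A_and_B                # 0.90

print(round(p_neither, 2), round(p_not_both, 2))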
Axiomatic Definition of Probability:
 The three Axioms of probability are:
i) P(A) ≥ 0 for every event A ⊆ S (or Ω).
ii) P(S) or P(Ω) = 1
iii) P(𝐴1 ∪ 𝐴2 ∪ 𝐴3 ∪...𝐴𝑛 ) = P(𝐴1 ) + P(𝐴2 ) + P(𝐴3 ) +…P(𝐴𝑛 )
given 𝐴1 , 𝐴2 ,..,𝐴𝑛 are mutually exclusive or disjoint events i.e.
𝑨𝒊 ∩ 𝑨𝒋 = ∅.
OR P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).
Illustration: If S or Ω is a finite sample space, then if each outcome is
equally likely, we define the probability of A as the fraction of outcomes
that are in A.
P(A) = N_A / N
 Thus, computing P(A) means counting the number of outcomes in the
event A and the number of outcomes in the sample space S or Ω and
dividing.
 This simple formula satisfies the first two axioms.
Questions for Exercise
Find the probabilities under equally likely outcomes.
i) Probability of a head in a single coin toss?
ii) Probability of at least two heads in tossing a coin three times?
iii) Probability that the sum is 9 in rolling a die twice?
iv) Probability of exactly three heads in tossing a coin four times?
v) Probability of exactly four heads in tossing a coin four times?
vi) Probability of at least three heads in tossing a coin four times?
NOTE: The answers to questions iv to vi illustrate the third
axiom.
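The note above can be checked numerically. The following Python sketch enumerates the 16 equally likely outcomes of four coin tosses and verifies that the probability of the union of the two disjoint events "exactly three heads" and "exactly four heads" equals the sum of their probabilities, as the third axiom requires:

from itertools import product
from fractions import Fraction

S = list(product("HT", repeat=4))        # 16 equally likely outcomes of four tosses
N = len(S)

def prob(event):
    return Fraction(len(event), N)

exactly_3 = [w for w in S if w.count("H") == 3]   # disjoint from exactly_4
exactly_4 = [w for w in S if w.count("H") == 4]
at_least_3 = [w for w in S if w.count("H") >= 3]  # the union of the two events above

# Third axiom for disjoint events: P(union) = sum of the individual probabilities
print(prob(at_least_3))                            # 5/16
print(prob(exactly_3) + prob(exactly_4))           # 5/16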
Rules(Properties) of Probability
 Other properties that we associate with a probability can be derived from the axioms.
1) The Complement Rule:
 Sometimes it is easier to calculate the probability of an event happening by determining
the probability of it not happening and subtracting the result from 1.
 Because A and its complement 𝐴𝐶 = {ω; ω ∉ A} are mutually exclusive and
collectively exhaustive, the probabilities of A and 𝐴𝐶 sum to 1.
P(A) + P(𝑨𝑪 ) = P(A ∪ 𝑨𝑪 ) = P(Ω) = 1
or P(𝑨𝑪 ) = 1 − P(A).
Example_1: if we toss a biased coin, we may want to say that P{heads} = p where p is not
necessarily equal to 1/2.
By necessity, P{tails} = 1 − p.
Example_2: Toss a coin 4 times.
P{fewer than 3 heads} = 1 − P{at least 3 heads} = 1 − 5/16 = 11/16
2) The Difference Rule:
 We write B\A to denote the outcomes that are in B but not in A.
 If A ⊂ B, then
P(B \ A) = P(B) − P(A)
(The symbol ⊂ denotes “is contained in”.) A and B\A are mutually exclusive and their
union is B.
Thus, P(B) = P(A) + P(B\A).
Example_1: In A single die roll: let
A: an event of odd number
B: a number less than or equal to five
Solution:
A = {1, 3, 5} B = {1, 2, 3, 4, 5}
B\A = {2, 4}
P(B) = P(A) + P(B\A)

Thus, 5/6 = 1/2 + 1/3
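The same difference-rule check can be done with Python sets; a minimal sketch:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}            # odd numbers
B = {1, 2, 3, 4, 5}      # numbers less than or equal to five

B_minus_A = B - A        # outcomes in B but not in A: {2, 4}

def p(event):
    return Fraction(len(event), len(S))

# Difference rule check: P(B) = P(A) + P(B \ A), because A is contained in B
print(p(B), p(A) + p(B_minus_A))   # 5/6 5/6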


Rules of Addition
3) Special Rule of Addition
 The special rule of addition is used when events are mutually exclusive.
 Recall that mutually exclusive means that when one event occurs, none of
the other events can occur at the same time.
 An illustration of mutually exclusive events in the die-tossing experiment is
the events “a number 4 or larger” and “a number 2 or smaller.” If the
outcome is in the first group {4, 5, and 6}, then it cannot also be in the
second group {1 and 2}.
𝑷(𝑨 𝒐𝒓 𝑩) = 𝑷(𝑨) + 𝑷(𝑩)
 For three mutually exclusive events designated A, B, and C, the rule is
written:
P(A or B or C) = P(A) + P(B) + P(C)
Example_1: What is the probability of either a head or a tail in a single toss of a
coin?
Solution:
P(Head or Tail) = P(Head) + P(Tail) = ½ + ½ = 1 = P(S)
4) The General Rule of Addition:
 The outcomes of an experiment may not be mutually exclusive.
 For example, let’s assume that the Ethiopian tourism bureau reports that 400
tourists visited two Ethiopian tourist sites during the year 2020. The report also
revealed that 240 tourists went to Lalibela and 200 went to Sof Omar caves.
Question: What is the probability that a person selected visited either Lalibela or
Sof Omar Caves?
Solution:
 If the special rule of addition is used, the probability of selecting a tourist who
went to Lalibela is 0.60, found by 240/400.
 Similarly, the probability of a tourist going to Sof Omar caves is .50.
 The sum of these probabilities is 1.10.
 We know, however, that this probability cannot be greater than 1.
 The explanation is that many tourists visited both sites and are being counted
twice!
 A check of the survey responses revealed that 120 out of 400 sampled did,
in fact, visit both attractions.
 To answer our question, “What is the probability a selected person
visited either Lalibela or Sof Omar caves?” (1) add the probability that a
tourist visited Lalibela and the probability he or she visited Sof Omar caves,
and (2) subtract the probability of visiting both.
 Thus: P(Lalibela or Sof Omar caves) = P(Lalibela) + P(Sof Omar caves)
- P(both Lalibela and Sof Omar caves)
= 0.6 + 0.50 – 0.30
= 0.8
 When two events both occur, the probability is called a joint probability.
 The probability (.30) that a tourist visits both sites is an example of a joint
probability.
 So the general rule of addition, which is used to compute the probability
of two events(A, B) that are not mutually exclusive, is:
𝑷(𝑨 𝒐𝒓 𝑩) = 𝑷(𝑨) + 𝑷(𝑩) – 𝑷(𝑨 𝒂𝒏𝒅 𝑩)
Example_1: What is the probability that a card chosen at random from a standard
deck of cards will be either a king or a heart?
Solution:
 There are 4 kings (K) in a deck of 52 cards.
 There are 13 hearts in a deck of 52 cards.
 Among the thirteen (13) heart cards there is one king (the king of hearts).
 Thus, P(King) = 4/52; P(Heart) = 13/52 and P(King and Heart) = 1/52
P(either a King or a Heart) = P(King) + P(Heart) – P(King and Heart)
= 4/52 + 13/52 – 1/52 = 16/52 = 0.3077
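A quick computation of the same result with exact fractions (a minimal Python sketch):

from fractions import Fraction

p_king = Fraction(4, 52)             # four kings in the deck
p_heart = Fraction(13, 52)           # thirteen hearts in the deck
p_king_and_heart = Fraction(1, 52)   # the king of hearts is counted in both

# General rule of addition for events that are not mutually exclusive
p_king_or_heart = p_king + p_heart - p_king_and_heart
print(p_king_or_heart, float(p_king_or_heart))   # 4/13, roughly 0.3077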

Exercises: The events X and Y are mutually exclusive. Suppose P(X) = .05
and P(Y) = .02.
i) What is the probability of either X or Y occurring?
ii) What is the probability that neither X nor Y will happen?
Rules of Multiplication
5) The Special Rule of Multiplication
 The special rule of multiplication requires that two events A and B are
independent.
 Two events are independent if the occurrence of one event does not alter
the probability of the occurrence of the other event.
Example: suppose two coins are tossed. The outcome of a coin toss (head or
tail) is unaffected by the outcome of any other prior coin toss (head or tail).
 For two independent events A and B, the probability that A and B will both
occur is found by multiplying the two probabilities.
 This is the special rule of multiplication and is written symbolically as:
𝑷(𝑨 𝒂𝒏𝒅 𝑩) = 𝑷(𝑨)𝑷(𝑩)
 For three independent events, A, B, and C, the special rule of multiplication
used to determine the probability that all three events will occur is:
𝑷(𝑨 𝒂𝒏𝒅 𝑩 𝒂𝒏𝒅 𝑪) = 𝑷(𝑨)𝑷(𝑩)𝑷(𝑪)
Example_1: A survey by the American Automobile Association (AAA)
revealed 60% of its members made airline reservations last year. Two members
are selected at random. What is the probability both made airline reservations
last year?
Solution:
 The probability the first member made an airline reservation last year is .60,
written P(𝑅1 ) = 0.60, where 𝑅1 refers to the fact that the first member made a
reservation. The probability that the second member selected made a
reservation is also .60, so P(𝑅2) = 0.60.
 Because the number of AAA members is very large, you may assume that 𝑹𝟏
and 𝑹𝟐 are independent. Consequently, using the previous formula, the
probability they both make a reservation is 0.36, found by:
P(R1 and R2) = P(R1)P(R2) = (0.60)(0.60) = 0.36
 Furthermore, recall that R indicates that a reservation is made and R̄ indicates
no reservation is made.
 The complement rule is applied to compute the probability that a member
does not make a reservation, P(R̄) = 0.40.
 Using this information, the probability that neither member makes a
reservation is P(R̄1)P(R̄2) = (0.40)(0.40) = 0.16.
6) The General Rule of Multiplication:
 If two events are not independent, they are referred to as dependent.
Example: A player has 12 golf shirts in his closet. Suppose nine of these
shirts are white and the others blue. He gets dressed in the dark, so he just
grabs a shirt and puts it on. He plays golf two days in a row and does not
launder and return the used shirt to the closet (i.e., no replacement). What is the
likelihood that both shirts selected are white?
Solution:
The event that the first shirt selected is white is W1. The probability is
P(W1) = 9/12 because nine of the 12 shirts are white. The event that
the second shirt selected is also white is identified as W2. The
conditional probability that the second shirt selected is white, given
that the first shirt selected is also white, is P(W2 | W1) = 8/11. Why is
this so?
 Because after the first shirt is selected, there are only 11 shirts
remaining in the closet and eight of these are white.
 To determine the probability of two white shirts being selected, we
use the general rule of multiplication:
P(W1 and W2) = P(W1)P(W2 | W1) = (9/12)(8/11) ≈ 0.55
 Thus, the general rule of multiplication for two events is given as:
P(A and B) = P(A)P(B/A)
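A short Python sketch of the shirt calculation with exact fractions (the variable names are illustrative):

from fractions import Fraction

# 12 shirts in the closet: 9 white, 3 blue; two draws without replacement
p_W1 = Fraction(9, 12)            # first shirt white
p_W2_given_W1 = Fraction(8, 11)   # second shirt white, given the first was white

# General rule of multiplication: P(W1 and W2) = P(W1) * P(W2 | W1)
p_both_white = p_W1 * p_W2_given_W1
print(p_both_white, float(p_both_white))   # 6/11, roughly 0.55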
1.4 Counting Procedures
 If the number of possible outcomes in an experiment is small, it is
relatively easy to list and count all the possible events and assign
probability to all events.
 However, if the size of the event, say N(A), and the size of the sample
space, N(S), are large for a given random experiment with a finite
number of equally likely outcomes, the counting procedure becomes a
difficult problem.
 Such counting is usually handled by use of counting procedures known as
combinatorial formulas.
 There are three widely used counting (Enumeration) methods namely,
1) Multiplication Principle
2) Permutations
3) Combinations
1) Multiplication Principle
Suppose we have two sets F and T, if F has m distinct objects f1, f2, . . . fm
and T has n distinct objects, t1, t2, . . . tn, then the number of pairs ( fi, tj) that
can be formed by taking one object from set F and a second from the set T is
(m)(n).
Example 1: Let the random experiment be throwing a balanced coin and
fair die and record the paired outcomes. What is the total possible outcome of the
experiment?
Solution: If set F contains the total possible outcomes of throwing a coin, its
elements are {H, T} and N(F) = 2; and if set T contains the total possible
outcomes of throwing a die, its elements are {1, 2, 3, 4, 5, 6} and N(T) = 6. The
outcomes of our random experiment are obtained by the Cartesian cross product
of the two sets, that is {H, T} × {1, 2, 3, 4, 5, 6}.
 In general, if there are m ways of doing one thing and n ways of
doing another thing, there are (m)(n) ways of doing both.
Multiplication formula: Number of arrangements = (m)(n)
Note: This principle obviously can be extended to any number of
procedures.
 That is, if there are m ways of doing one thing, n ways of doing
another thing, and r ways of doing a third thing, then the total number of
arrangements is given by (m)(n)(r).
Example: A manufactured item must pass through three control
stations. At each station the item is inspected for particular
characteristics and marked accordingly. At the first station, 3 ratings are
possible while at the last two stations 4 ratings are possible.
In how many possible ways may the item be marked?
Solution: 3 x 4 x 4 = 48 possible ways
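The multiplication principle can be verified by enumerating the Cartesian product directly; a minimal Python sketch in which the rating labels (a1, b1, c1, ...) are made up purely for illustration:

from itertools import product

# Hypothetical rating labels: 3 at station 1, 4 at each of stations 2 and 3
station1 = ["a1", "a2", "a3"]
station2 = ["b1", "b2", "b3", "b4"]
station3 = ["c1", "c2", "c3", "c4"]

# Every possible marking is one triple from the Cartesian product of the three sets
markings = list(product(station1, station2, station3))
print(len(markings))   # 48 = 3 * 4 * 4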
2) Permutation Rule
 This rule of counting involves ordering or permutations. This rule helps us to
compute the number of ways in which n distinct objects can be arranged,
taking r of them at a time.
Suppose that we have n different objects. Then the question is in how many
ways may these objects be arranged (Permuted ) where the order of
arrangement is important. That is ABC and CBA are considered as two
different arrangements.
 Let’s consider the following scheme. Arranging n objects is equivalent to
filling them into a box with n compartments in some specified order.
n n-1 … 2 1
 We have n choices to fill the first compartment. Once we choose any one of the n
objects to fill the first compartment we will have (n − 1) options to fill the
second compartment, etc., and for the last compartment we have exactly one
option. The total number of arrangements, denoted by nPn, is given by:
nPn = n(n − 1)(n − 2)(n − 3) … (2)(1) = n!
Example: Five students can line up in 5! = 120 different ways.
 Besides, we may be interested in the number of permutations possible
when we choose r (< n) objects from n, that is, an arrangement
(permutation) of n objects taking r objects at a time. As in the above
case, the first compartment can be filled in n ways, the second
in (n − 1) ways, etc., and the last compartment in (n − r + 1) ways.
 Hence the total number of permutations of n objects taken r at a time is:
nPr = n(n − 1)(n − 2)(n − 3) … (n − r + 1) = n!/(n − r)!
Example: A manufacturer uses a color code to identify lots of
manufactured items. The code consists of stamping seven colored stripes
on the container. The order of the color is significant and each
identification uses all 7 colors.
Suppose that the manufacturer wishes to use seven stripes but has 15
colors available. How many distinct markings can he get?
Solution:
15P7 = 15!/(15 − 7)! = 15!/8! = 32,432,400
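For counts like these, Python's math module (version 3.8 and later) provides the permutation formula directly; a minimal sketch:

import math

# nPr = n! / (n - r)!
print(math.perm(5, 5))      # 120: ways five students can line up
print(math.perm(15, 7))     # 32432400: distinct 7-stripe colour codes from 15 colours
print(math.factorial(15) // math.factorial(15 - 7))   # same value straight from the formula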

3) Combination Rule
 Sometimes the ordering or arrangement of objects is not important, but
only the objects that are chosen. For example, we may not care in what
order the books are placed on the shelf, but only which books you are able
to shelve. In addition, when a five-person committee is chosen from a
group of 10 students, the order of choice is not important because all 5
students will be equal members of committee.
 This counting rule for combinations allows us to select r (say) number of
outcomes from a collection of n distinct outcomes without caring in what
order they are arranged. This rule is denoted by
nCr or (n choose r) = n!/[(n − r)! r!]
Example 1: From eight persons, how many committees of 3 members
may be chosen?
Solution: Since two committees are the same if they are made up of the same
members we have,
8C3 = 8!/[(8 − 3)! 3!] = 8!/(5! 3!) = 56 different committees
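The same committee count can be obtained, and cross-checked by enumeration, with a short Python sketch:

import math
from itertools import combinations

# nCr = n! / ((n - r)! r!): committees of 3 chosen from 8 people, order irrelevant
print(math.comb(8, 3))                           # 56
print(len(list(combinations(range(8), 3))))      # 56 again, by direct enumeration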
Exercise 1: A class consists of eight PhD students, 5 males and 3 females. A
committee consisting of two males and one female is required by an office.
How many possible committees are there?
Exercise 2: In how many possible ways can a three-digit car plate be
made from the Arabic numerals (0 to 9), given that the plate ends with the single letter A?
Exercise 3: Three items out of ten are defective but it is not known
which are defective. In how many ways can three items be selected? How
many of these selections will include at least one defective item?
1.5 Conditional Probability and Independence
 When two events happen, the outcome of the first event may or may
not have an effect on the outcome of the second event. That is, the
events may be either dependent or independent.
Definition: Statistical independence is the case when the occurrence
of an event has no effect on the probability of the occurrence of any other
event. On the other hand, statistical dependence exists when the
probability of some event is dependent upon or affected by the occurrence
of some other event.
 In this section, we examine events that are statistically independent.
 There are 3 types of probabilities under statistical independence:
1) Marginal(unconditional) Probability
2) Joint Probability
3) Conditional Probability
1.5.1 Conditional Probability
 An experiment is repeated N times and on each occasion we observe the
occurrence or non-occurrence of two events, say, A and B.
 Given these two events we are interested to know the probability of
event A given that event B has occurred.
 This probability is known as conditional probability of event A, given
that B has occurred and denoted by P(A/B). Similarly conditional
probability of event B, given that A has occurred is expressed as
P(B/A).
 The conditional probability of events can be computed under statistical
dependence and statistical independence.
Conditional Probability Under Statistical Independence
 For statistically independent events, the conditional probability of event
B given that event A has occurred is simply the probability of event B:
𝑷 (𝑩/𝑨) = 𝑷 (𝑩).
 Thus, events A and B are defined to be independent if and only if any of the
following conditions are satisfied.
1) P(A ∩ B) = P(A) ∗ P(B)
2) 𝑷 (𝑨/𝑩) = 𝑷 (𝑨)
3) 𝑷 (𝑩/𝑨) = 𝑷 (𝑩)
Example 1: What is the probability that the 𝟐𝒏𝒅 toss of a fair coin will result
in heads, given that heads resulted on the first toss?
Solution: In this case the two events are independent.
Symbolically, the question is written as: P(H2/H1) = ?
 Using conditional probability under statistical independence,
P(H2/H1) = P(H2)
 Thus, P(H2/H1) = 0.5
Implication: In the case of independent events the probability of occurrence
of either of the events does not depend or affect the occurrence of the others.
 Therefore in the coin tossing example, the probability of a head occurrence in
the second toss, given that head resulted in the first toss, is still 0.5.
Example 2: Suppose we have two red and three white balls in a bag.
Draw a ball with replacement. Are these events independent?
Solution: Let A = the event that the first draw is red.
B = the event that the second draw is red.
P(A/B) = P(A) = 2/5 = 0.4
P(B/A) = P(B) = 2/5 = 0.4
 Thus, events A and B are independent events.
Example 3: Two computers A and B are to be marketed. A salesman who
is assigned the job of finding customers for them has 60 percent and 40 percent
chances, respectively of succeeding for computers A and B. The two
computers can be sold independently. Given that he was able to sell at least
one computer, what is the probability that computer A has been sold?
Solution: Let us define the events as:
A = Computer A is marketed and
B = Computer B is marketed.
 It is given that P(A) = 0.60, P(B) = 0.40 and
P(A and B) or P(A ∩ B) = P(A)P(B) = 0.60 × 0.40 = 0.24
 Hence, the probability that computer A has been sold given that the
salesman was able to sell at least one computer is given by:
P(A/(A ∪ B)) = P{A ∩ (A ∪ B)}/P(A ∪ B) = P(A)/P(A ∪ B) = P(A)/[P(A) + P(B) − P(A ∩ B)]
= 0.60/(0.60 + 0.40 − 0.24) = 0.789
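The same calculation can be written out in a few lines of Python (a sketch of this example only, not a general routine):

p_A = 0.60                          # computer A is sold
p_B = 0.40                          # computer B is sold
p_A_and_B = p_A * p_B               # 0.24, because the two sales are independent

p_A_or_B = p_A + p_B - p_A_and_B    # P(at least one computer sold) = 0.76

# A is a subset of (A or B), so P(A | A or B) = P(A) / P(A or B)
print(round(p_A / p_A_or_B, 3))     # 0.789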
Exercise 1: A bag contains 3 red balls, 2 blue balls and 5 white
balls. A ball is selected and its color noted. Then it is replaced. A second
ball is selected and its color noted. Find the probability of each of these.
i) Selecting 2 blue balls
ii) Selecting 1 blue ball and then 1 white ball
iii) Selecting 1 red ball and then 1 blue ball
Exercise 2: Approximately 9% of men have a red-green color
blindness. If 3 men are selected at random, find the probability that all
of them will have this type of red-green color blindness.
Exercise 3: What is the probability that a couple’s second child
will be:
a) A boy, given that their first child was a girl?
b) A girl, given that their first child was a girl?
Conditional Probability Under Statistical Dependence
Definition: Suppose two events A and B are in the sample space S. If it is known
that an element randomly drawn from S belongs to B, then the probability that it also
belongs to A is defined to be the conditional probability of A given B.
 In this case, the occurrence of event A depends on the occurrence of event B.
 Thus, this concept answers the question: what is the probability that event A
occurs, knowing that event B has already occurred.
 Symbolically, P(A/B) = P(A ∩ B)/P(B) if P(B) > 0.
 Using a Venn diagram, this conditional probability corresponds to the share of the
area of B that also lies in A.
Example 1: A box contains black chips and white chips. A person selects two chips
without replacement. If the probability of selecting a black chip and a white chip is 𝟎. 𝟐𝟔𝟖,
and the probability of selecting a black chip on the first draw is 𝟎. 𝟑𝟕𝟓, find the probability of
selecting the white chip on the second draw, given that the first chip selected was a black
chip.
Solution: Let B = selecting a black chip
W = selecting a white chip, then,
P(W/B) = P(B and W)/P(B) = 0.268/0.375 ≈ 0.71
Example 2: A market survey was conducted in four cities to find out the preference for a
soap brand marked as XX. The responses are shown below:
Response City A City B City C City D
Yes 45 55 60 50
No 35 45 35 45
No opinion 5 5 5 5
Given the information, what is the probability that
A) a consumer selected at random, preferred brand XX?
B) a consumer preferred brand XX and was from city C?
C) a consumer preferred brand XX given that he was from city C?
D) a consumer was from city D given that consumer preferred brand XX?
Solution: Let's first summarize the given information as follows.
Response     City A   City B   City C   City D   Total
Yes            45       55       60       50      210
No             35       45       35       45      160
No opinion      5        5        5        5       20
Total          85      105      100      100      390
 Now, Let E denote the event that a consumer selected at random preferred brand XX.
Then,
A) The probability that a consumer selected at random preferred brand XX is:
P(E) = 210/390 = 0.5385
B) The probability that a consumer preferred brand XX and was from city C
is:
P(E ∩ C) = 60/390 = 0.1538
C) The probability that a consumer preferred brand XX, given that he was
from city C:
P(E/C) = P(E and C)/P(C) = 0.1538/(100/390) = 0.1538/0.2564 = 0.6
D) The probability that the consumer belongs to city D, given that he
preferred brand XX,
P(D/E) = P(D and E)/P(E) = (50/390)/(210/390) = 50/210 = 0.238
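The whole table-based calculation can be reproduced with exact fractions; in the following Python sketch the dictionary layout is just one convenient way to hold the survey counts:

from fractions import Fraction

# Survey counts: rows are responses, columns are the four cities
counts = {
    "Yes":        {"A": 45, "B": 55, "C": 60, "D": 50},
    "No":         {"A": 35, "B": 45, "C": 35, "D": 45},
    "No opinion": {"A": 5,  "B": 5,  "C": 5,  "D": 5},
}
total = sum(sum(row.values()) for row in counts.values())        # 390 respondents

p_E = Fraction(sum(counts["Yes"].values()), total)               # (A) P(E) = 210/390
p_E_and_C = Fraction(counts["Yes"]["C"], total)                  # (B) P(E and C) = 60/390
p_C = Fraction(sum(row["C"] for row in counts.values()), total)  # P(C) = 100/390
p_E_given_C = p_E_and_C / p_C                                    # (C) = 3/5
p_D_given_E = Fraction(counts["Yes"]["D"],
                       sum(counts["Yes"].values()))              # (D) = 50/210

print(float(p_E), float(p_E_and_C), float(p_E_given_C), float(p_D_given_E))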
Exercise 1: Two events, A and B, are statistically dependent, If 𝑷 (𝑨) = 𝟎. 𝟑𝟗,
𝑷 (𝑩) = 𝟎. 𝟐𝟏, and 𝑷(𝑨 𝒐𝒓 𝑩) = 𝟎. 𝟒𝟕, find the probability that,
i. Neither A nor B will Occur?
ii. Both A and B will occur?
iii. B will occur given that A has occurred?
iv. A will occur, given that B has occurred?
Exercise 2: The personnel department of a given company has records of its 200
engineers as below.
Age          BA Degree   MSc Degree
Under 30         90          10
30 to 40         20          30
Over 40          40          10

 If one engineer is selected at random from the company, find the probability that,
A) he has only a bachelor’s degree.
B) he has a master’s degree, given that he is over 40.
C) he is under 30, given that he has only a bachelor’s degree.
1.6 Bayes' Theorem
 Thomas Bayes was an 18th-century British minister and clergyman.
He was interested in the question: Does God really exist?
 To answer this question, he attempted to develop a formula to
determine the probability that God does exist based on evidence that
was available to him on earth.
 Later Laplace refined Bayes’ work and gave it the name Bayes’
Theorem.
 The Bayes’ theorem is useful in revising the original probability
estimates of known outcomes as we gain additional information about
these outcomes. The prior probabilities, when changed in the light of
new information, are called revised or posterior probabilities.
 It is a method to compute posterior probabilities (conditional
probabilities under statistical dependence); it is based on conditional
probabilities.
 Suppose 𝑨𝟏 , 𝑨𝟐 , . . . , 𝑨𝒏 represent n mutually exclusive and collectively
exhaustive events with prior marginal probabilities 𝐏(𝑨𝟏 ), 𝐏(𝑨𝟐 ). . . ,𝐏(𝑨𝒏 ). Let B
be an arbitrary event with 𝑷(𝑩) ≠ 𝟎 for which conditional probabilities 𝐏(𝐁|𝑨𝟏 ),
𝐏(𝐁|𝑨𝟐 ), . . . , 𝐏(𝐁|𝑨𝒏 ), are also known.
 Given the information that outcome B has occurred, the revised (or posterior)
probabilities 𝐏(𝑨𝒊 |𝑩), are determined with the help of Bayes’ theorem as:
P(A_i|B) = P(A_i ∩ B)/P(B) …………………………………………………..(a)
Where P(A_i|B) is read as the posterior probability of event A_i given event B (thus, a
conditional probability).
 Since events A_1, A_2, . . . , A_n are mutually exclusive and collectively exhaustive, the
event B is bound to occur with one of A_1, A_2, . . . , A_n. That is,
B = (A_1 ∩ B) ∪ (A_2 ∩ B) ∪ ⋯ ∪ (A_n ∩ B)
 Since (A_1 ∩ B), (A_2 ∩ B), …, (A_n ∩ B) are mutually exclusive,
P(B) = P(A_1 ∩ B) + P(A_2 ∩ B) + ⋯ + P(A_n ∩ B)
= Σ_{i=1}^n P(A_i ∩ B)
= P(A_1)P(B|A_1) + P(A_2)P(B|A_2) + ⋯ + P(A_n)P(B|A_n)
= Σ_{i=1}^n P(A_i)P(B|A_i)
 From formula (a) given previously, for a fixed i, we have:
P(A_i|B) = P(A_i ∩ B)/P(B)
= P(A_i)P(B|A_i) / [P(A_1)P(B|A_1) + P(A_2)P(B|A_2) + ⋯ + P(A_n)P(B|A_n)]
(Figure omitted: the steps in the probability revision process, from prior probabilities through new information to posterior probabilities.)
Illustration
Example 1: Suppose 5 percent of the population of X, a fictional Third
World country, have a disease that is peculiar to that country. We will let A_1
refer to the event “has the disease” and A_2 refer to the event “does not have
the disease.”
 Thus, we know that if we select a person from X at random, the
probability the individual chosen has the disease is 0.05, or P(A_1) = 0.05.
 This probability, P(A_1) = P(has the disease) = 0.05, is called the
Prior Probability. It is given this name because the probability is
assigned before any empirical data are obtained.
Prior Probability is the initial probability based on the present level of
information.
 Thus, the prior probability a person is not afflicted with the disease is
0.95, or P(A_2) = 0.95.
Additional Information
 There is a diagnostic technique to detect the disease, but it is not very
accurate. Let B denote the event “test shows the disease is present.”
 Assume that historical evidence shows that if a person actually has the disease,
the probability that the test will indicate the presence of the disease is 0.90.
 Using the conditional probability definitions developed, this statement is written
as:
P(B|A_1) = 0.90
 Assume the probability is 0.15 that for a person who actually does not have the
disease the test will indicate the disease is present.
P(B|A_2) = 0.15
 Let’s randomly select a person from X and perform the test. The test results indicate
the disease is present. What is the probability the person actually has the
disease?
 That is 𝑷(𝒉𝒂𝒔 𝒕𝒉𝒆 𝒅𝒊𝒔𝒆𝒂𝒔𝒆|𝒕𝒉𝒆 𝒕𝒆𝒔𝒕 𝒓𝒆𝒔𝒖𝒍𝒕𝒔 𝒂𝒓𝒆 𝒑𝒐𝒔𝒊𝒕𝒊𝒗𝒆)
 The probability P(A_1|B) is called a Posterior Probability.
Posterior Probability is a revised probability based on additional
information.
 Now, the remaining step will be to determine the posterior probability, and
with the help of Bayes’ theorem we can determine the posterior probability as
follow:
P(A_1|B) = P(A_1)P(B|A_1) / [P(A_1)P(B|A_1) + P(A_2)P(B|A_2)]
= (0.05 × 0.90) / (0.05 × 0.90 + 0.95 × 0.15) = 0.24
 So the probability that a person has the disease, given that he or she tested
positive, is 0.24.
How is the result interpreted? If a person is selected at random
from the population, the probability that he or she has the disease is 0.05.
 If the person is tested and the test result is positive, the probability that
the person actually has the disease is increased about fivefold, from 0.05
to 0.24.
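A small Python sketch of this Bayes' theorem computation; the helper function name bayes_posterior is illustrative, not a library routine:

def bayes_posterior(priors, likelihoods):
    """Return the posterior probabilities P(A_i | B) given priors P(A_i) and likelihoods P(B | A_i)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]   # P(A_i and B)
    p_B = sum(joints)                                       # total probability of B
    return [j / p_B for j in joints]

# Disease example: A_1 = has the disease, A_2 = does not; B = test is positive
priors = [0.05, 0.95]
likelihoods = [0.90, 0.15]
posterior = bayes_posterior(priors, likelihoods)
print(round(posterior[0], 2))   # 0.24 = P(has the disease | positive test)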
Example 2: Suppose an item is manufactured by three machines X, Y, and Z having
equal capacity and operational rate. It is known that the percentages of defective items
produced by X, Y, and Z are 2, 7, and 12 percent, respectively. All the items produced by
X, Y, and Z are put into one bin. From this bin, one item is drawn at random and is found
to be defective. What is the probability that this item was produced on Y?
Solution: Let A be the event that the drawn item is defective. We know the prior probability
of an item being produced on X, Y, and Z, that is, P(X) = 1/3; P(Y) = 1/3 and P(Z) = 1/3.
 We also know that 𝑃(𝐴|𝑋) = 0.02, 𝑷(𝑨|𝒀) = 𝟎. 𝟎𝟕, 𝑷(𝑨|𝒁) = 𝟎. 𝟏𝟐.
P(Y|A) = ?
 Then,
P(Y|A) = P(Y)P(A|Y) / [P(X)P(A|X) + P(Y)P(A|Y) + P(Z)P(A|Z)]
= (1/3)(0.07) / [(1/3)(0.02) + (1/3)(0.07) + (1/3)(0.12)]
= 0.07/(0.02 + 0.07 + 0.12) = 0.07/0.21
= 0.333, i.e. there is a 33.3% chance that the defective item was produced on machine Y.
Exercise 1: A factory produces certain types of output by three machines.
Daily production for Machine 𝑨 = 3000 units; Machine 𝑩 = 2500 units;
and Machine 𝑪 = 4500 units, respectively. Past experience shows that 1%,
1.2% and 2% of the outputs from machines A, B and C, respectively, are
defective. An item is drawn at random from the day’s production and is found
to be defective. What is probability that it comes from the output of
a) Machine A?
b) Machine B? and
c) Machine C ?
Exercise 2: In a town, 10% of all adults over 50 years have diabetes. If a
doctor correctly diagnoses 90% of all persons with diabetes as having the
disease and incorrectly diagnoses 2% of all persons without diabetes as having
the disease, what is the probability that an adult over 50 who is diagnosed by this
doctor as having diabetes actually has the disease?
END !