MAT 311 – THEORY OF PROBABILITY
Notes

FALL 2021
NOTRE DAME OF MARYLAND UNIVERSITY
Dr. K. Erickson
Table of Contents
Chapter 2 – Foundations of Probability
Section 2.2 – Sample Space and Events
Section 2.3 – Definition of Probability
Section 2.4 – Counting Rules Useful in Probability
Chapter 3 – Conditional Probability & Independence
Section 3.1 – Conditional Probability
Section 3.2 – Independence
Section 3.3 – Theorem of Total Probability and Bayes’ Theorem
Section 3.4 – Odds, Odds Ratio, and Relative Risk
Chapter 4 – Discrete Probability Distributions
Section 4.1 – Random Variables and their Probability Distributions
Section 4.2 – Expected Values of Random Variables
Section 4.3 – The Bernoulli Distribution
Section 4.4 – The Binomial Distribution
Section 4.5 – The Geometric Distribution
Section 4.6 – The Negative Binomial Distribution
Section 4.7 – The Poisson Distribution
Section 4.8 – The Hypergeometric Distribution
Section 4.9 – The Moment-Generating Function
Chapter 5 – Continuous Probability Distributions
Section 5.1 – Continuous Random Variables and their Probability Distributions
Section 5.2 – Expected Values of Continuous Random Variables
Section 5.3 – The Uniform Distribution
Section 5.4 – The Exponential Distribution

Chapter 2 – Foundations of Probability


Section 2.2 – Sample Space and Events

Football games typically start with a coin flip. Under identical conditions, the coin lands on
heads ½ of the time and tails ½ of the time.
Dice – If we toss a die, the result of the experiment is one of the numbers in the set
{1, 2, 3, 4, 5, 6}.
Bolts – If we are making bolts with a machine, some may be defective. Thus, when a bolt is
made, it will be a member of the set {defective, nondefective}.
Definition: Sample Space – The sample space S is the set of all possible outcomes of a random experiment.

Examples
1. Toss 1 die, a sample space is given by S1 = {1, 2, 3, 4, 5, 6} or S2 = {even, odd}. Which
one makes more sense?
2. Toss a coin twice. S = {HH, HT, TH, TT}
Definition: Event – An event is a subset of the sample space S. An event occurs if the outcome of the experiment belongs to that subset.
Example
1. Toss a coin twice. S = {HH, HT, TH, TT}. What is the event that only one head comes
up?
Set operations.
1. A ∪ B – The union of two sets is the event that either A or B or both occur.
   Think marriage…they become one.
2. A ∩ B = AB – The intersection is where two sets overlap.
3. Ā, A^c, or A′ – The event "not A". This is called the complement.
4. Disjoint sets – A ∩ B = ∅. These events are mutually exclusive.


The operations of sets obey certain rules that are similar to the rules of algebra.

Commutative Laws: A ∪ B = B ∪ A and A ∩ B = B ∩ A
Associative Laws: (A ∪ B) ∪ C = A ∪ (B ∪ C) and (A ∩ B) ∩ C = A ∩ (B ∩ C)
Distributive Laws: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C) and A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)

DeMorgan’s Laws
1. (A ∪ B)^c = A^c ∩ B^c
2. (A ∩ B)^c = A^c ∪ B^c

Example – Page 20 # 2.8: In an isolated housing area consisting of 50 households, the residents
are allowed to have at most one dog and at most one cat per household. Currently 25
households have a dog, 7 have both a cat and a dog, and 18 have neither a cat nor a
dog. Using D to denote the event of all households that have a dog and C to denote
the event of those that have a cat, symbolically denote the following events and
identify the number of households in each.
a. The event of all households with a dog but no cat.

b. The event of all households with a cat.

c. The event of all households with a cat but no dog.

d. The event of all households that have a cat or a dog, but not both.


Section 2.3 – Definition of Probability

Definition: Probability
Suppose that a random experiment has a sample space S. A probability is a numerically valued
function that assigns a number P(A) to every event A so that the following axioms hold:

1. P(A) ≥ 0
2. P(S) = 1
3. If A₁, A₂, ... is a sequence of mutually exclusive events, then P(∪_{i=1}^∞ Aᵢ) = Σ_{i=1}^∞ P(Aᵢ).
Example: A single fair die is tossed. Find the probability of a 2 or a 5 turning up.

What happens if A and B are not mutually exclusive or disjoint (that is, their outcomes can
overlap)?

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Proof: The event A ∪ B can be represented as a union of mutually exclusive events, namely
A ∪ B = A ∪ (A^c ∩ B). Likewise, B = (A ∩ B) ∪ (A^c ∩ B). By Axiom 3,
P(A ∪ B) = P(A) + P(A^c ∩ B) and P(B) = P(A ∩ B) + P(A^c ∩ B).
Subtracting the second equation from the first gives P(A ∪ B) − P(B) = P(A) − P(A ∩ B),
which rearranges to P(A ∪ B) = P(A) + P(B) − P(A ∩ B).


Example: A card is drawn at random from an ordinary deck of 52 playing cards. Use D, S, H, C
to denote diamond, spade, heart, and club, and A, 2, 3, …, 9, 10, J, Q, K to denote the
cards. Find the probability that the card drawn is
a. An ace

b. A jack of hearts

c. A 3 of clubs or a 6 of diamonds

d. A heart

e. Any suit except hearts

f. A ten or a spade

g. Neither a four nor a club
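Because the 52 cards are equally likely, each of these probabilities reduces to counting cards. A
minimal Python sketch (an illustration added to these notes; the deck encoding is made up) that
checks parts a, d, and f by brute-force enumeration:

    from itertools import product

    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['D', 'S', 'H', 'C']
    deck = list(product(ranks, suits))      # 52 equally likely outcomes

    p_ace = sum(r == 'A' for r, s in deck) / 52              # 4/52
    p_heart = sum(s == 'H' for r, s in deck) / 52            # 13/52
    # part f illustrates inclusion-exclusion: 4/52 + 13/52 - 1/52 = 16/52
    p_ten_or_spade = sum(r == '10' or s == 'S' for r, s in deck) / 52

    print(p_ace, p_heart, p_ten_or_spade)   # 0.0769..., 0.25, 0.3077...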


Section 2.4 – Counting Rules Useful in Probability

Theorem: Fundamental Principle of Counting – If one thing can be accomplished in n₁
different ways, and after this, a second thing can be accomplished in n₂ different
ways, …, and finally, a kth thing can be accomplished in nₖ different ways, then all k
things can be accomplished in the specified order in n₁ · n₂ · ... · nₖ ways.

Example: A person has 2 flavors of ice cream and 3 toppings available. How many ways of
choosing an ice cream flavor and topping are there?

This Fundamental Principle of Counting only helps us to identify the number of elements in our
sample space. We still need to determine the probabilities.
Example: Using the example above, if all 6 flavor–topping combinations are equally likely, find
the probability that topping 2 is selected.

In general, the FPC will help us to determine a few types of counting. All of the ways involve
counting the number of ways to select r items from n objects. See chart below:
                      Order is Important         Order is NOT Important
With Replacement
Without Replacement


Example: How many possible codes can be made using the numbers 0, 1, 2,…, 9 if numbers
may be repeated?

So in the above example, we have that order is important (or we don’t get the correct code) and
we can repeat numbers, so with replacement.
                      Order is Important         Order is NOT Important
With Replacement      n^r
Without Replacement

Example: How many possible codes can be made using the numbers 0, 1, 2, …, 9 if numbers
may not be repeated?

What we used in the last example is called a permutation.


Definitions
Each of the n! arrangements (in a row) of n different objects is called a permutation
of the n objects.

Each of the nPr arrangements is called a permutation of n objects taken r at
a time. We define it as follows:

nPr = [n(n − 1)...(n − r + 1)(n − r)...(3)(2)(1)] / [(n − r)...(3)(2)(1)] = n!/(n − r)!
                      Order is Important         Order is NOT Important
With Replacement      n^r
Without Replacement   nPr = n!/(n − r)!


Example: How many ordered samples of 5 cards can be drawn without replacement from a
standard deck of 52 playing cards?

What if we are not concerned with order? For example, A, K, Q, J, 10 of spades is the same as
Q, 10, A, J, K of spades.

Definition
Each of the unordered subsets is called a combination of n objects taken r at a time,
where

nCr = C(n, r) = n!/(r!(n − r)!)

Example: How many possible 5-card hands (in 5-card poker) can be drawn from a standard deck
of 52 playing cards?

                      Order is Important         Order is NOT Important
With Replacement      n^r                        Section 2.5
Without Replacement   nPr = n!/(n − r)!          nCr = C(n, r) = n!/(r!(n − r)!)


Definition – Partitions: The number of ways of partitioning n distinct objects into k groups
containing n₁, n₂, …, nₖ objects is n!/(n₁! n₂! ··· nₖ!).

Example: In how many ways can Mississippi be arranged?


Chapter 3 – Conditional Probability & Independence


Section 3.1 – Conditional Probability

Suppose that we are given 20 tulip bulbs that are similar in appearance and told that 8 will bloom
early, 12 will bloom late, 13 will be red, and 7 will be yellow as shown below.
Early (E) Late (L) Totals
Red (R) 5 8 13
Yellow (Y) 3 4 7
Totals 8 12 20

If one bulb is selected at random, the probability that it will be red is P(R) = 13/20 (equally likely
assumption).
What if we wanted to consider the probability that the bulb is red given that it is an early
bloomer?

Definition – Conditional Probability: If A and B are any two events, then the conditional
probability of A given B, denoted P(A | B), is given as follows, provided P(B) > 0:

P(A | B) = P(A ∩ B) / P(B)

NOTE: You could also have P(B | A) = P(A ∩ B) / P(A), where P(A) > 0.

Now we can solve for P(A ∩ B) and we get the multiplication rule.
Definition – Multiplication Rule: The probability that two events, A and B, both occur is
given by the multiplication rule:
P(A ∩ B) = P(B) P(A | B), where P(B) > 0, or
P(A ∩ B) = P(A) P(B | A), where P(A) > 0


Example: A bowl contains 7 blue chips and 3 red chips. Two chips are to be drawn successively
at random and without replacement. Find the probability that the first draw results
in a red chip (A) and the second draw results in a blue chip (B).

NOTE: In many instances, it is possible to compute a probability by two seemingly different


methods.
Example: A bowl contains 7 blue chips and 3 red chips. Two chips are to be drawn successively
at random and without replacement. Find the probability that the first draw results
in a red chip (A) and the second draw results in a red chip (R).
Method 1 Method 2

Example – Page 67 # 3.6: The probability that Elise studies for a science test and passes it is
0.8. The probability that she studies is 0.9. If Elise studies, what is the probability
that she will pass the science test?


Section 3.2 – Independence

Definition – Independent: Two events A and B are said to be independent if
P(A ∩ B) = P(A) P(B), which is equivalent to stating that P(A | B) = P(A) or
P(B | A) = P(B) if the conditional probabilities exist. Otherwise, A and B are
dependent.
Example: A fair coin is tossed three times. Let event A = {first toss is heads}, event B =
{second toss is heads}, and event C = {exactly 2 heads in a row}. Determine which
of the three events, if any, are independent.


Example: The probability that person A hits a target is ¼, and the probability that person B hits
the target is 2/5. Both shoot at the target. Find the probability that at least one of
them hits the target.

Example: Die A has orange on one face and blue on five faces, Die B has orange on two faces
and blue on four faces, Die C has orange on three faces and blue on three faces. If
the three dice are rolled, find the probability that exactly two of the three dice come
up orange.


Example: The dice game Craps is played as follows. The player throws two fair dice, and there
are 3 possible outcomes:

a. If the sum is 7 or 11 on the first throw, the shooter wins; this event is called
a natural.
b. If the sum is 2, 3, or 12 on the first throw, the shooter loses; this event is
called craps.
c. If the sum is 4, 5, 6, 8, 9, or 10 on the first throw, this number becomes the
shooter's point. The shooter continues rolling the dice until either she rolls the
point again (in which case she wins) or rolls a 7 (in which case she loses).
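A Monte Carlo sketch of the game as described above (an illustration added to these notes;
the exact win probability, 244/495 ≈ 0.493, can also be derived with the conditioning tools
of this chapter):

    import random

    def play_craps(rng):
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        if roll in (7, 11):          # natural: win
            return True
        if roll in (2, 3, 12):       # craps: lose
            return False
        point = roll                 # roll until the point (win) or a 7 (lose)
        while True:
            roll = rng.randint(1, 6) + rng.randint(1, 6)
            if roll == point:
                return True
            if roll == 7:
                return False

    rng = random.Random(0)
    n = 200_000
    print(sum(play_craps(rng) for _ in range(n)) / n)   # about 0.493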



Section 3.3 – Theorem of Total Probability and Bayes’ Theorem

Sometimes, it is possible to partition an event, say A, into the union of two or more mutually
exclusive events. To partition the event A, we begin by partitioning the sample space S. Events
B1, B2, … , Bk are said to partition a sample space S if the following two conditions are satisfied:
1. Bᵢ ∩ Bⱼ = ∅ for any pair i and j such that i ≠ j. (Mutually exclusive)
2. S = B₁ ∪ B₂ ∪ ... ∪ Bₖ

Then we can define the Theorem of Total Probability.

Theorem – Total Probability: If B₁, B₂, …, Bₖ is a collection of mutually exclusive and
exhaustive events, then for any event A, P(A) = Σ_{i=1}^{k} P(Bᵢ) P(A | Bᵢ).

Example: Bowl A contains 2 red chips; bowl B contains two white chips; and bowl C contains 1
red chip and 1 white chip. A bowl is selected at random, and one chip is taken at
random from that bowl. What is the probability of selecting a white chip?


Theorem – Bayes’ Rule: If the events B₁, B₂, …, Bₘ form a partition of the sample space S, and A
is any event in S, then

P(Bₖ | A) = P(Bₖ) P(A | Bₖ) / Σ_{i=1}^{m} P(Bᵢ) P(A | Bᵢ), for k = 1, 2, ..., m

Example: In a certain factory, machines A, B, and C are all producing springs of the same
length. Of their production, machines A, B, and C produce 2%, 1%, and 3%
defective springs, respectively. Of the total production of springs in the factory,
machine A produces 35%, machine B produces 25%, and machine C produces 40%.
If a randomly selected spring is defective, find the probability that it was produced by
machine C. (This means C given Defective).
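A quick numerical check of this example (illustrative sketch): the denominator is the Theorem
of Total Probability, and the ratio is Bayes' rule.

    priors = {'A': 0.35, 'B': 0.25, 'C': 0.40}    # P(machine)
    defect = {'A': 0.02, 'B': 0.01, 'C': 0.03}    # P(defective | machine)

    p_defective = sum(priors[m] * defect[m] for m in priors)   # total probability: 0.0215
    print(priors['C'] * defect['C'] / p_defective)             # P(C | defective) ≈ 0.558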


Example: A store sells four brands of DVD players. The least expensive brand, B1, accounts for
40% of the sales. The other brands (in order of their price) have the following
percentages of sales: B2, 30%; B3, 20%; and B4, 10%. The respective probabilities of
needing repair during warranty are 0.10 for B1, 0.05 for B2, 0.03 for B3, and 0.02 for
B4. A randomly selected purchaser has a DVD player that needs repair under
warranty. What are the four conditional probabilities of being brand Bi, where
i = 1, 2, 3, 4?


Section 3.4 – Odds, Odds Ratio, and Relative Risk

Definition – Odds Ratio: The odds in favor of an event A is the ratio of the probability of A to
the probability of the complement of A.

Odds in favor of A = P(A) / P(A^c)

Definition – Relative Risk: The relative risk (or risk ratio) is an intuitive way to compare the
risks for two groups: divide the cumulative incidence in the exposed group by the
cumulative incidence in the unexposed group.

Relative Risk of A = P(A | exposed) / P(A | unexposed)

Example: Consider the table below.


Had Appendectomy?   Incidental Wound Infection   No Wound Infection   Total
Yes                 7                            124                  131
No                  1                            78                   79
Total               8                            202                  210
Calculate the following:
a. The odds in favor of suffering a wound infection for the Yes group.

b. The odds in favor of suffering a wound infection for the No group.

c. Calculate the risk ratio of wound infection.
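A sketch (added for illustration) that computes all three quantities from the table:

    from fractions import Fraction

    yes_infected, yes_total = 7, 131
    no_infected, no_total = 1, 79

    odds_yes = Fraction(yes_infected, yes_total - yes_infected)   # 7/124
    odds_no = Fraction(no_infected, no_total - no_infected)       # 1/78
    # risk ratio: incidence in the Yes group over incidence in the No group
    rr = Fraction(yes_infected, yes_total) / Fraction(no_infected, no_total)

    print(odds_yes, odds_no, float(rr))   # 7/124, 1/78, about 4.22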


Chapter 4 – Discrete Probability Distributions


Section 4.1 – Random Variables and their Probability Distributions

Definition – Random Variable: A random variable X is a real-valued function defined on a sample space; it assigns a numerical value to each outcome of the experiment.

Random variables are denoted by uppercase letters, usually towards the end of the alphabet, such
as X, Y, and Z. The actual values that random variables can assume are denoted by
lowercase letters such as x, y, and z.

Definition – Discrete: A random variable is discrete if it can assume only a finite or countably infinite number of distinct values.

Definition – Probability mass function (pmf): The pmf or probability function of X, denoted
p(x), assigns probability to each value x of X so that the following conditions are
satisfied:
1. P(X = x) = p(x) ≥ 0.
2. ∑ P(X = x) = 1, where the sum is over all possible values of x.
Example – Six boxes of components are ready to be shipped by a certain supplier. The number
of defective components in each box is as follows:

Box 1 2 3 4 5 6
Defectives 0 2 0 1 2 0
One of these boxes is to be randomly selected for shipment to a particular customer.
Let X be the number of defectives in the selected box. The three possible x values
are 0, 1, and 2. Find the pmf.


Definition – Cumulative distribution function (cdf): The cdf or distribution function F(b) for a
random variable X is F(b) = P(X ≤ b). If X is discrete, F(b) = Σ_{x ≤ b} p(x), where
p(x) is the pmf.

Example – Page 100 # 4.1: Circuit boards from two assembly lines set up to produce identical
boards are mixed in one storage tray. As inspectors examine boards, they find that it
is difficult to determine whether the board comes from line A or line B. A
probabilistic assessment of this question is often helpful. Suppose that a storage tray
contains 10 boards, of which 6 come from line A and four from line B. An inspector
selects two of these identical-looking boards for inspection. He is interested in X,
the number of inspected boards from line A.
a. Find the probability function for X.


b. Graph the probability function of X.

c. Find the distribution function of X.

d. Graph the distribution function of X.


Example – Page 101 # 4.5: A commercial building has four entrances, numbered I, II, III, and
IV. Three people enter the building at 9:00 am. Let X denote the number of people
who select entrance I. Assuming that the people choose entrances independently and
at random, find the probability distribution for X. Were any additional assumptions
necessary for your answer?
Solution:


Example – Page 101 # 4.7: Daily sales records for a car dealership show that it will sell 0, 1, 2,
or 3 cars, with probabilities as listed:
Number of Sales 0 1 2 3
Probability 0.5 0.3 0.15 0.05

a. Find the probability distribution for X, the number of sales in a 2-day period
assuming that the sales are independent from day to day.

b. Find the probability that two or more sales are made in the next 2 days.


Section 4.2 – Expected Values of Random Variables

The expected value of X is the long-run average value that you would see if you were to observe
X over many repetitions of the experiment.
Definition – Expected Value: The expected value of a discrete random variable X with
probability distribution p(x) is given by E(X) = μ = Σₓ x p(x).

Theorem – If X is a discrete random variable with probability distribution p(x) and if g(x) is
any real-valued function of X, then E[g(X)] = Σₓ g(x) p(x).

Definition – Variance: The variance of a random variable X with expected value μ is given
by V(X) = σ² = E[(X − μ)²] = E(X²) − μ² = E(X²) − [E(X)]².

Definition – Standard Deviation: The standard deviation of a random variable X is the square
root of the variance and is given by σ = √V(X).
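These definitions translate directly into a small helper (an illustrative sketch added to these
notes) that can be reused for the examples below; Box I's net gain (pay $1, then win the ticket
value 0, 1, or 2) is shown as a usage example.

    def mean_var(pmf):
        """pmf: dict mapping each value x to p(x); probabilities must sum to 1."""
        mu = sum(x * p for x, p in pmf.items())
        var = sum((x - mu) ** 2 * p for x, p in pmf.items())
        return mu, var

    # Box I net gain: each ticket 0, 1, 2 with probability 1/3, minus the $1 fee
    print(mean_var({-1: 1/3, 0: 1/3, 1: 1/3}))   # (0.0, 0.666...)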

Example – Page 116 #4.17: You are to pay $1 to play a game that consists of drawing one ticket
at random from a box of numbered tickets. You win the amount in dollars of the
number on the ticket you draw. The following two boxes of numbered tickets are
available.
Box I: 0, 1, 2 Box II: 0, 0, 0, 1, 4
a. Find the expected value and variance of your net gain per play with Box I.


b. Repeat part (a) for Box II.

c. Given that you have decided to play, which box would you choose and why?


Tchebysheff’s Theorem – Let X be a random variable with mean μ and variance σ². Then for
any positive k, P(|X − μ| < kσ) ≥ 1 − 1/k².
Example – Page 118 #4.27: The number of equipment breakdowns in a manufacturing plant is
closely monitored by the supervisor of operations because it is critical to the
production process. The number averages five per week with a standard deviation of
0.8 per week.
a. Find an interval that includes at least 90% of the weekly figures for the number
of breakdowns.

b. The supervisor promises that the number of breakdowns will rarely exceed 8 in a
1-week period. Is the supervisor safe in making this claim?


Section 4.3 – The Bernoulli Distribution

Distribution – The Bernoulli Distribution

One outcome of a Bernoulli trial is identified as a success and the other as a failure.
We can define the random variable X as follows:
X = 1 if the outcome of the trial is a success
X = 0 if the outcome of the trial is a failure
p = the probability of observing a success
1 – p = the probability of observing a failure

p(x) = p^x (1 − p)^(1−x), x = 0, 1, for 0 ≤ p ≤ 1
E(X) = μ = p
V(X) = σ² = p(1 − p)

Example – In the Michigan daily lottery, the probability of winning when placing a 6-way boxed
bet is 0.006. A bet placed on each of the 12 successive days would correspond to 12
Bernoulli trials with p = 0.006.
Example – Suppose that the probability of germination of a beet seed is 0.8 and the germination
of a seed is called a success. If we plant 10 seeds and can assume that the
germination of one seed is independent of the germination of another seed, this would
correspond to 10 Bernoulli trials with p = 0.8.
Example – Out of millions of instant lottery tickets, suppose that 20% are winners. If five such
tickets are purchased in which the 4th ticket is a winner and the rest are losers, then
(0 0 0 1 0) is a possible observed sequence. Assuming independence among
winning and losing tickets, find the probability of this outcome.

Example - If five beet seeds are planted in a row in which the first, third, and fifth seeds
germinated and the other two did not, a possible sequence would be (1 0 1 0 1). If
the probability of germination is 0.8, then find the probability of this outcome
assuming independence.

This leads us to the Binomial Distribution.


Section 4.4 – The Binomial Distribution

In a sequence of Bernoulli trials, we are often interested in the total number of successes but not
the actual order of their occurrences. We can let the random variable X = the number of
observed successes in n Bernoulli trials so the possible values for X are 0, 1, 2, …, n. If x
successes occur, where x = 0, 1, 2, …, n, then we have n – x failures.
The number of ways of selecting the positions for the x successes in the n trials is

C(n, x) = n!/(x!(n − x)!)

Since the trials are independent, the probability of a success is p and the probability of a failure is
1 − p, so the probability of each of these arrangements is p^x (1 − p)^(n−x). We can define the
Binomial distribution as follows.

Distribution – The Binomial Distribution b(n, p)

p(x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, 2, ..., n, for 0 ≤ p ≤ 1
E(X) = μ = np
V(X) = σ² = np(1 − p)

The random variable X possesses a binomial distribution if the following conditions are
satisfied:
1. The experiment consists of a fixed number of n identical trials.
2. Each trial is a Bernoulli trial meaning we can have only 1 of 2 possible
outcomes, either a success or failure.
3. The probability of a success, p, is constant trial to trial.
4. The trials are independent.
5. X is defined to be the number of successes among the n trials.
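As shown in the sketch below (scipy assumed available), these quantities are direct one-liners;
n = 7 and p = 0.15 anticipate the duck example that follows.

    from scipy.stats import binom

    n, p = 7, 0.15
    X = binom(n, p)
    print(X.pmf(1))              # P(X = 1)
    print(X.cdf(2))              # P(X <= 2)
    print(X.sf(2))               # P(X >= 3), since sf(2) = 1 - P(X <= 2)
    print(X.mean(), X.var())     # np = 1.05 and np(1 - p) = 0.8925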


Example - It is claimed that 15% of the ducks in a particular region have patent schistosome
infection. Suppose that 7 ducks are selected at random. Let X = the number of
ducks that are infected.
a. Assuming independence, how is X distributed?

b. Find P(X ≤ 2)

c. Find P(X = 1)

d. Find P(X ≥ 3)

Group Work – pg. 134 # 4.43 and 4.44: Turn in one page per group.


Example – It is claimed that for a particular lottery, 1/10 of the 50 million tickets will win a
prize. Find the smallest number of tickets that must be purchased so that the
probability of winning at least one prize is greater than (a) 0.50; (b) 0.95.

a.

b.

1 − 0.9^n = 0.95
0.9^n = 0.05
n ln(0.9) = ln(0.05)
n = ln(0.05)/ln(0.9) ≈ 28.43
n ≥ 29


Section 4.5 – The Geometric Distribution

Example - A representative from the National Football League's Marketing Division randomly
selects people on a random street in Philadelphia, PA until he finds a person who
attended the last home football game. Let p, the probability that he succeeds in
finding such a person, equal 0.20. And, let X denote the number of people he selects
until he finds his first success. What is the probability mass function of X?
Solution:

Distribution – Geometric Distribution:


Assume Bernoulli trials — that is, (1) there are two possible outcomes, (2) the trials
are independent, and (3) p, the probability of success, remains the same from trial to
trial. Let X denote the number of trials until the first success. Then, the probability
mass function of X is f(x) = P(X = x) = (1 − p)^(x−1) p for x = 1, 2, ... In this case, we
say that X follows a geometric distribution.

Theorem – The cumulative distribution function of a geometric random variable X is

F(x) = P(X ≤ x) = 1 − (1 − p)^x

Distribution – Geometric Distribution:

f(x) = P(X = x) = (1 − p)^(x−1) p
F(x) = P(X ≤ x) = 1 − (1 − p)^x
E(X) = μ = 1/p
V(X) = σ² = (1 − p)/p²

Geometric series fact: Σ_{k=0}^∞ a rᵏ = Σ_{k=1}^∞ a r^(k−1) = a/(1 − r) for |r| < 1.


Example – Continuing from the previous example…A representative from the National
Football League's Marketing Division randomly selects people on a random street in
Philadelphia, PA until he finds a person who attended the last home football game.
Let p, the probability that he succeeds in finding such a person, equal 0.20. And,
let X denote the number of people he selects until he finds his first success. What is
the probability that the marketing representative must select 4 people before he finds
one who attended the last home football game?
Solution.

Example – What is the probability that the marketing representative must select more than 6
people before he finds one who attended the last home football game?
Solution:


Example – How many people should we expect (that is, what is the average number) the
marketing representative needs to select before he finds one who attended the last
home football game? And, while we're at it, what is the variance?
Solution:

Example – Find the probability that in successive tosses of a fair die, a 3 will come up for the
first time on the fifth toss.
Solution:


The geometric distribution is the only discrete distribution that has the Memoryless Property.
This means that if we have observed j straight failures, then the probability of observing at least k
more failures (at least j + k total failures) before a success is the same as if we were just
beginning and wanted to determine the probability of observing at least k failures prior to the
first success: P(X > j + k | X > j) = P(X > k). (See proof on page 142.)

Example – page 150 # 4.65: Let X denote a random variable that has a geometric distribution
with probability of success on any trial denoted by p. Let p = 0.1.
Find (a) P(X > 2) and (b) P(X > 4 | X > 2).

Solution:
a.

b.


Section 4.6 – The Negative Binomial Distribution

Example – A representative from the National Football League's Marketing Division randomly
selects people on a random street in Kansas City, Kansas until he finds a person who
attended the last home football game. Let p, the probability that he succeeds in finding
such a person, equal 0.20. Now, let X denote the number of people he selects until he
finds r = 3 who attended the last home football game. What is the probability that
X = 10?
Solution:

Definition – Negative Binomial Distribution:


Assume Bernoulli trials — that is, (1) there are two possible outcomes, (2) the trials
are independent, and (3) p, the probability of success, remains the same from trial to
trial. Let X denote the number of trials until the rth success. Then, the probability
mass function of X is f(x) = P(X = x) = C(x − 1, r − 1) (1 − p)^(x−r) p^r for x = r, r + 1, ...
In this case, we say that X follows a negative binomial distribution.
Distribution – Negative Binomial Distribution:

f(x) = P(X = x) = C(x − 1, r − 1) (1 − p)^(x−r) p^r, where x = r, r + 1, r + 2, ... and r = 1, 2, 3, ...
E(X) = μ = r/p
V(X) = σ² = r(1 − p)/p²

NOTE: This text makes the expected value more difficult than it needs to be. Please ignore
the expected value in the text and follow these notes.
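A numerical sketch (scipy assumed) for the r = 3, p = 0.2 example above; be aware that scipy's
nbinom counts failures before the rth success, so the trial count x maps to x − r:

    from math import comb
    from scipy.stats import nbinom

    r, p, x = 3, 0.2, 10
    direct = comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r   # pmf from these notes
    via_scipy = nbinom.pmf(x - r, r, p)                         # failures = x - r = 7

    print(direct, via_scipy)            # both ≈ 0.0604
    print(r / p, r * (1 - p) / p**2)    # E(X) = 15 and V(X) = 60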


Example – An oil company conducts a geological study that indicates that an exploratory oil
well should have a 20% chance of striking oil.
a. What is the probability that the first strike comes on the third well drilled?

b. What is the probability that the third strike comes on the seventh well drilled?

c. What is the mean and variance of the number of wells that must be drilled if the
oil company wants to set up three producing wells?


An alternative way to express the Negative Binomial Distribution, letting X denote the number
of failures prior to the rth success, is as follows: f(x) = P(X = x) = C(x + r − 1, r − 1) (1 − p)^x p^r
for x = 0, 1, 2, ..., with E(X) = μ = r(1 − p)/p.
Example – Page 155 # 4.77: A large lot of tires contains 5% defectives. Four tires are to be
chosen from the lot and placed on a car.
a. Find the probability that two defectives are found before four good ones.

b. Find the expected value and the variance of the number of selections that must be
made to get four good tires.


Section 4.7 – The Poisson Distribution

Situation
Let the discrete random variable X denote the number of times an event occurs in an interval of
time (or space). Then X may be a Poisson random variable with x = 0, 1, 2, ...

Examples
1. Let X equal the number of typos on a printed page. (This is an example of an interval of
space — the space being the printed page.)
2. Let X equal the number of cars passing through the intersection of Allen Street and
College Avenue in one minute. (This is an example of an interval of time — the time
being one minute.)
3. Let X equal the number of Alaskan salmon caught in a squid driftnet. (This is again an
example of an interval of space — the space being the squid driftnet.)
4. Let X equal the number of customers at an ATM in 10-minute intervals.
5. Let X equal the number of students arriving during office hours.

Suppose we are given an interval (this could be time, length, area or volume) and we are
interested in the number of “successes” in that interval. Assume that the interval can be divided
into very small subintervals such that:
1. the probability of more than one success in any subinterval is zero;
2. the probability of one success in a subinterval is constant for all subintervals and is
proportional to its length;
3. subintervals are independent of each other.
We assume the following.
The random variable X denotes the number of successes in the whole interval.
λ is the mean number of successes in the interval.
Then X has a Poisson Distribution with parameter λ and we can state the Poisson Distribution.
Distribution – Poisson Distribution:

f(x) = P(X = x) = λ^x e^(−λ) / x!, where x = 0, 1, 2, ...
F(a) = P(X ≤ a) = Σ_{x=0}^{a} λ^x e^(−λ) / x!
E(X) = μ = λ
V(X) = σ² = λ


Example – Births in a hospital occur randomly at an average rate of 1.8 births per hour.
a. What is the probability of observing 4 births in a given hour at the hospital?

b. What is the probability of observing 2 or more births in a given hour at
the hospital?


We can also change the size of the interval.

Example – Suppose we know that births in a hospital occur randomly at an average rate of 1.8
births per hour. What is the probability that we observe 5 births in a given 2-hour
interval?

This example illustrates the following rule:

If X ~ Po(λ) on a 1-unit interval, then
X ~ Po(kλ) on a k-unit interval.

Let’s try some more problems!


Example – page 199 # 4.89: Let X denote a random variable that has a Poisson distribution with
λ = 4. Find the following probabilities.
a. P(X = 5)

b. P(X < 5)

c. P(X ≥ 5)

d. P(X ≥ 5 | X ≥ 2)


Example – page 160 #4.93: Customer arrivals at a checkout counter in a department store have
a Poisson distribution with an average of 7 per hour. For a given hour, find the
probabilities of the following events.
a. Exactly 7 customers arrive.

b. No more than 2 customers arrive.

c. At least 2 customers arrive.


Example – page 160 # 4.95: Referring to exercise 4.93, suppose that it takes 10 minutes to
service each customer. Assume that an unlimited number of servers are available so
that no customer has to wait for service.
a. Find the mean and variance of the total service time connected to the customer
arrivals for 1-hour.

b. Is total service time highly likely to exceed 200 minutes?


Example – page 160 # 4.96: Referring to exercise 4.93, find the probabilities that exactly 5
customers will arrive in the following 2-hour periods.
a. Between 2:00 pm and 4:00 pm (one continuous 2-hour period).

b. Between 1:00 pm and 2:00 pm and between 3:00 pm and 4:00 pm (two separate
1-hour periods for a total of 2 hours).

Group work – Page 161 # 4.102: Turn in one page per group.


Section 4.8 – The Hypergeometric Distribution


So far, we have discussed distributions that all used the building block series of independent
Bernoulli trials. Now we have to develop a distribution that involves dependent trials.
In 4.1, we looked at an example using the Hypergeometric Distribution. We will now define that
slightly differently.
Suppose we have a lot of N items of which k are successes and N – k are failures. Suppose that n
items are sampled randomly and sequentially from the lot, and suppose that none of the sampled
items is replaced (sampling without replacement). Then we have the hypergeometric
distribution.

Distribution – Hypergeometric Distribution:

f(x) = P(X = x) = C(k, x) C(N − k, n − x) / C(N, n), x = 0, 1, ..., k, where C(a, b) = 0 if b > a
E(X) = μ = n(k/N)
V(X) = σ² = n(k/N)(1 − k/N)((N − n)/(N − 1))

Example – In a small pond there are 50 fish, 10 of which have been tagged. If a fisherman’s
catch consists of 7 fish selected at random and without replacement, and X denotes
the number of tagged fish, find the probability that exactly 2 tagged fish are caught.


Example – page 167 # 4.109: From a box containing five white and four red balls, two balls are
selected at random without replacement. Find the probabilities of the following
events.
a. Exactly one white ball is selected.

b. At least one white ball is selected.



c. Two white balls are selected, given that at least one white ball is selected.


d. The second ball drawn is white.


Example – page 168 # 4.112: The pool of qualified jurors called for a high-profile case has
12 whites, 9 blacks, 4 Hispanics, and 2 Asians. From these, 12 will be selected to
serve on the jury. Assume that all qualified jurors meet the criteria for serving. Find
the probabilities of the following events.
a. No white juror is on the jury.

b. All nine of the black jurors serve on the jury.

c. No Hispanics or Asians serve on the jury.


Example – page 168 # 4.117: Two assembly lines (I and II) have the same rate of defectives in
their production of voltage regulators. Five regulators are sampled from each line and
tested. Among the total of 10 tested regulators, 4 are defective. Find the probability
that exactly two of the defectives came from line I.


Section 4.9 – The Moment-Generating Function

A special function with many theoretical uses in probability theory is the expected value of e^(tX),
for a random variable X, and this expected value is called the moment generating function (mgf).
Definition – Moment generating function (mgf): The mgf or moment generating function of a
random variable X is denoted by M(t) and defined to be M(t) = E(e^(tX)).

The expected values of powers of a random variable are often called moments. Thus, E(X) is the
first moment of X, and E(X²) is the second moment of X.
Let’s take the derivatives!

In general,

M′(0) = Σ_{x∈S} x f(x) = E[X]

M″(0) = Σ_{x∈S} x² f(x) = E[X²]

M^(k)(0) = Σ_{x∈S} x^k f(x) = E[X^k]


3 2 1


Example – If X has the mgf of M ( t ) = et   + e 2t   + e3t   , −   t   then find the
6 6 6
associated probabilities.


Example – If X has the mgf M(t) = (2/5)e^t + (1/5)e^(2t) + (2/5)e^(3t), then find the mean, variance,
and pmf of X.
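A sympy check of this example (an illustrative sketch): differentiate M(t) and evaluate at t = 0.

    import sympy as sp

    t = sp.symbols('t')
    M = (sp.Rational(2, 5)*sp.exp(t) + sp.Rational(1, 5)*sp.exp(2*t)
         + sp.Rational(2, 5)*sp.exp(3*t))

    mean = sp.diff(M, t).subs(t, 0)          # E[X] = M'(0)
    second = sp.diff(M, t, 2).subs(t, 0)     # E[X^2] = M''(0)
    print(mean, second - mean**2)            # 2 and 4/5
    # the pmf is read off the coefficients: p(1) = 2/5, p(2) = 1/5, p(3) = 2/5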


Example – page 175 # 4.127: Show that the moment generating function for the Poisson random
variable with mean λ is given by M(t) = e^(λ(e^t − 1)).

Let X ~ Poisson(λ). Then the pmf of X is given by p(x) = λ^x e^(−λ)/x!, x = 0, 1, 2, ...

Before we derive the mgf for X, we recall from calculus the Taylor series expansion of the
exponential function e^y: e^y = Σ_{k=0}^∞ y^k/k!. Then

M(t) = E(e^(tX)) = Σ_{x=0}^∞ e^(tx) λ^x e^(−λ)/x! = e^(−λ) Σ_{x=0}^∞ (λe^t)^x/x! = e^(−λ) e^(λe^t) = e^(λ(e^t − 1)).

Example – page 175 # 4.131: Derive the mean and variance of the Poisson random variable
using the moment generating function derived above.


Chapter 5 – Continuous Probability Distributions


Section 5.1 – Continuous Random Variables and their Probability Distributions

A continuous random variable differs from a discrete random variable in that it takes on an
uncountably infinite number of possible outcomes. For example, if we let X denote the height (in
meters) of a randomly selected maple tree, then X is a continuous random variable. The goal is to
extend much of what we learned about discrete random variables to the case in which a random
variable is continuous.

Our specific goals include:


1. Finding the probability that X falls in some interval, that is finding P(a < X < b),
where a and b are some constants.
We'll do this by using f(x), the probability density function ("pdf") of X, and F(x),
the cumulative distribution function ("cdf") of X.
2. Finding the mean μ, variance σ2, and standard deviation of X.
We'll do this through the definitions E(X) and Var(X) extended for a continuous random
variable, as well as through the moment generating function M(t) extended for a continuous
random variable.

Example – Even though a fast-food chain might advertise a hamburger as weighing a quarter-
pound, you can well imagine that it is not exactly 0.25 pounds. One randomly selected
hamburger might weigh 0.23 pounds while another might weigh 0.27 pounds. What is
the probability that a randomly selected hamburger weighs between 0.20 and 0.30
pounds? That is, if we let X denote the weight of a randomly selected quarter-pound
hamburger in pounds, what is P(0.20 < X < 0.30)?


Definition – Probability density function (pdf): A random variable X is said to be continuous
if there is a function f(x), called the pdf, such that

1. f(x) ≥ 0 for all x

2. ∫_{−∞}^{∞} f(x) dx = 1
(Note: we integrate over the support.)

3. P(a ≤ X ≤ b) = ∫_a^b f(x) dx

Example - Let X be a continuous random variable whose probability density function is
f(x) = 3x² for 0 < x < 1.

NOTE: f(x) ≠ P(X = x). For example, f(0.9) = 3(0.9)² = 2.43, which is clearly not
a probability! In the continuous case, f(x) is instead the height of the curve
at X = x, so that the total area under the curve is 1. In the continuous case, it is
areas under the curve that define the probabilities.
a. Verify that f (x) is a valid pdf.
1.

2.

b. What is the probability that X falls between ½ and 1?

c. What is P(X = ½)?


An implication of the fact that P(X = x) = 0 for all x when X is continuous is that you can be
careless about the endpoints of intervals when finding probabilities of continuous random
variables. That is:
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)

Definition – Cumulative distribution function (cdf): The cdf for a random variable X is
defined as F(b) = P(X ≤ b). If X is continuous with pdf f(x), then F(b) = ∫_{−∞}^{b} f(x) dx.
Notice that F′(x) = f(x).

Example - Let X be a continuous random variable whose probability density function is
f(x) = 3x² for 0 < x < 1. What is the cdf?


Example – Page 199 # 5.3: Suppose that a random variable X has a probability density function
given by

f(x) = x²/3, −1 ≤ x ≤ 2
f(x) = 0, otherwise

a. Find the probability that −1 < X < 1.

b. Find the probability that 1 < X < 3.


c. Find the probability that X < 1 given that X < 1.5.

d. Find the distribution function (cdf) of X.


Example – Page 199 # 5.5: The distance X between trees in a given forest has the pdf given by

f(x) = c e^(−x/10), x > 0
f(x) = 0, otherwise

with measurements in feet.
a. Find the value of c that makes this function a valid pdf.

b. Find and sketch the cdf of X.


c. What is the probability that the distance from a randomly selected tree to its nearest
neighbor is at least 15 feet?

d. What is the probability that the distance from a randomly selected tree to its nearest
neighbor is at least 20 feet given that it is at least 5 feet?


Example – Page 200 # 5.7: The cdf of a random variable X is as follows:

F(x) = 0, x < 0
F(x) = x³/2, 0 ≤ x < 1
F(x) = x/2, 1 ≤ x < 2
F(x) = 1, x ≥ 2
a. Graph the cdf.

b. Find the probability that X is between 0.25 and 0.75.


c. Find the pdf of X.

d. Graph the pdf of X.


Section 5.2 – Expected Values of Continuous Random Variables

Definition – Expected Value: The expected value of a continuous random variable X that has a
probability density function f(x) is given by E(X) = μ = ∫_{−∞}^{∞} x f(x) dx.

NOTE: We assume the absolute convergence of all integrals so that the expectations exist.

Definition – Variance: The variance of a continuous random variable X that has a probability
density function f(x) is given by V(X) = σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx = E(X²) − μ².

For constants a and b, E(aX + b) = a E(X) + b and V(aX + b) = a² V(X).


Example – page 207 # 5.13: The effectiveness of solar-energy heating units depends on the
amount of radiation available from the sun. During a typical October day, daily solar
radiation in Tampa, Florida, approximately follows the following probability density
function (units are in hundreds of calories).
3
 ( x − 2 )( 6 − x ) 2 x6
f ( x ) =  32
0 otherwise
Find the mean, variance, and standard deviation of the distribution of the daily total solar
radiation in Tampa in October.
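A sympy check of this example (illustrative sketch):

    import sympy as sp

    x = sp.symbols('x')
    f = sp.Rational(3, 32) * (x - 2) * (6 - x)

    mu = sp.integrate(x * f, (x, 2, 6))                # mean
    var = sp.integrate((x - mu)**2 * f, (x, 2, 6))     # variance
    print(mu, var, sp.sqrt(var))                       # 4, 4/5, ≈ 0.894 (hundreds of calories)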


Example – page 208 #5.15: The weekly repair cost, X for a certain machine has a probability
density function given by
f(x) = 6x(1 − x), 0 ≤ x ≤ 1
f(x) = 0, otherwise
with measurements in the $100s.
a. Find the mean and variance of the distribution of repair costs.

b. Find an interval within which these weekly repair costs should lie at least 75% of the
time using Tchebysheff’s Theorem.


c. Find an interval within which these weekly repair costs lie exactly 75% of the time
with exactly half of those not lying in the interval above the upper limit and the other
half below the lower limit. Compare this interval to the one obtained in part (b).


Section 5.3 – The Uniform Distribution

Definition – Uniform Distribution: A continuous random variable X has a uniform distribution,
denoted U(a, b), if its probability density function is

f(x) = 1/(b − a), a ≤ x ≤ b
f(x) = 0, elsewhere

A graph of the pdf looks like this:

Definition – The cumulative distribution function of a uniform random variable X is

F(x) = 0, x < a
F(x) = ∫_a^x 1/(b − a) dt = (x − a)/(b − a), a ≤ x < b
F(x) = 1, x ≥ b

A graph of the cdf looks like this:


Distribution – The Uniform Distribution:

f(x) = 1/(b − a) for a ≤ x ≤ b; 0 elsewhere
F(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x < b; 1 for x ≥ b
E(X) = μ = (a + b)/2
V(X) = σ² = (b − a)²/12
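A sketch (scipy assumed) for the launch-window example below, taking the window as U(0, 120)
in minutes; scipy parameterizes the uniform as U(loc, loc + scale):

    from scipy.stats import uniform

    X = uniform(loc=0, scale=120)
    print(X.cdf(30))               # first 30 minutes: 30/120 = 0.25
    print(X.sf(110))               # last 10 minutes: 10/120 ≈ 0.0833
    print(X.cdf(70) - X.cdf(50))   # within 10 minutes of the center: 20/120 = 1/6
    print(X.mean(), X.var())       # (a + b)/2 = 60 and (b - a)^2/12 = 1200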

Example – page 214 # 5.29: The space shuttle has a 2-hour window during which it can launch
for an upcoming mission. Launch time is uniformly distributed in the launch window.
Find the probability that the launch will occur as follows:
a. During the first 30 minutes of the launch window.

b. During the last 10 minutes of the launch window.

c. Within 10 minutes of the center of the launch window.


Example – page 214 # 5.33: A researcher has been observing a certain volcano for a long time.
He knows that an eruption is imminent and is equally likely to occur any time in the next
24 hours.
a. What is the probability that the volcano will not erupt for at least 15 hours?

b. Find a time such that there is only a 10% chance that the volcano would not have
erupted by that time.

Some continuous random variables in the physical, management, and biological sciences have
approximately uniform probability distributions. For example, suppose we are counting events
that have a Poisson distribution, such as telephone calls coming into a switchboard. If it is known
that exactly one such event has occurred in a given interval, say (0, t), then the actual time of
occurrence is distributed uniformly over this interval.


Example – page 215 # 5.35: Arrivals of customers at a bank follow a Poisson distribution.
During the first hour that the bank is open, one customer arrives at the bank.
a. Find the probability that he arrives during the first 15 minutes that the bank is open.

b. Find the probability that he arrives after the bank has been open 30 minutes.


Section 5.4 – The Exponential Distribution

Suppose X, following an (approximate) Poisson process, equals the number of customers arriving at a
bank in an interval of length 1. If λ, the mean number of customers arriving in an interval of length 1,
is 6, say, then we might observe something like this:

As the picture suggests, however, we could alternatively be interested in the continuous random
variable W, the waiting time until the first customer arrives. Let's push this a bit further to see if
we can find F(w), the cumulative distribution function of W. For w > 0, the event {W > w} occurs
exactly when there are no arrivals in the interval (0, w], so
F(w) = P(W ≤ w) = 1 − P(W > w) = 1 − P(no arrivals in (0, w]) = 1 − e^(−λw).

Now let’s find the probability density function: f(w) = F′(w) = λ e^(−λw) for w > 0. Writing θ = 1/λ
for the mean waiting time gives the form used below.


Definition – Exponential Distribution: A continuous random variable X follows an exponential
distribution if its probability density function is

f(x) = (1/θ) e^(−x/θ), for x > 0 and θ > 0
f(x) = 0, elsewhere

The moment generating function of an exponential random variable X with parameter θ is

M(t) = 1/(1 − θt), for t < 1/θ

The mean of an exponential random variable X with parameter θ is μ = E(X) = θ.

Proof:

The variance of an exponential random variable X with parameter θ is σ² = V(X) = θ².

Proof:


Distribution – The Exponential Distribution:

f(x) = (1/θ) e^(−x/θ) for x > 0 and θ > 0; 0 elsewhere
F(x) = 0 for x < 0; F(x) = P(X ≤ x) = 1 − e^(−x/θ) for x ≥ 0
E(X) = μ = θ
V(X) = σ² = θ²

Example – Students arrive at a local bar and restaurant according to an approximate Poisson
process at a mean rate of 30 students per hour. What is the probability that the bouncer
has to wait more than 3 minutes to card the next student?


Example – Telephone calls arrive at a doctor’s office according to a Poisson process on the
average of two every 3 minutes. Let X denote the waiting time until the first call that
arrives after 10 am.
a. What is the pdf of X?

b. Find P(X > 2).


Example – Let X have an exponential distribution with mean θ > 0. Show that
P(X > x + y | X > x) = P(X > y).


Example – page 224 #5.49: The interaccident times (times between accidents) for all fatal
accidents on scheduled American domestic passenger airplane flights for the period 1948
to 1961 were found to follow an exponential distribution with a mean of approximately
44 days (Pyke 1965).
a. If one of those accidents occurred on July 1, find the probability that another one
occurred in that same (31-day) month.

b. Find the variance of the interaccident times.

c. What does this information suggest about the clumping of airline accidents?
