Exponential Distribution
2.1 Introduction
In establishing a probability model for a real-world phenomenon, it is always
necessary to make certain simplifying assumptions to render the mathematics
tractable. On the other hand, however, we cannot make too many simplifying
assumptions, for then our conclusions, obtained from the probability model,
would not be applicable to the real-world phenomenon. Thus, we must make
enough simplifying assumptions to enable us to handle the mathematics but
not so many that the model no longer resembles the real-world phenomenon.
The reliability of instruments and systems can be measured by survival probabilities such as P[X > x]. During the early 1950's, Epstein and Sobel, and Davis analyzed statistical data on the operating time of an instrument up to failure, and found that in many cases the lifetime has an exponential distribution. Consequently, the exponential distribution became the underlying life distribution in research on reliability and life expectancy in the 1950's. Although further research revealed that the exponential distribution is inappropriate for modeling life expectancy in a number of problems in reliability theory, it remains useful as a first approximation (see the reference by Barlow and Proschan (1975)). Exponential distributions are also used in modeling the length of telephone calls and the time between successive impulses in the spinal cords of various mammals.
This chapter is devoted to the study of the exponential distribution, its properties and characterizations, and models which lead to it and illustrate its applications.
Property 1.
$$E(X) = \frac{1}{\lambda}, \qquad \mathrm{Var}(X) = \frac{1}{\lambda^2},$$
and the moment generating function is given by
$$M(t) = E[e^{tX}] = \frac{\lambda}{\lambda - t}, \qquad \text{for } t < \lambda.$$
In fact,
$$M(t) = E[e^{tX}] = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - t}\int_0^\infty (\lambda - t)e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t},$$
since the last integrand is an exponential density with parameter $\lambda - t$, so the integral equals 1.
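The computation can also be checked numerically. The following Python sketch (the values λ = 2 and t = 0.5 are chosen only for illustration) estimates E[e^{tX}] by simulation and compares it with λ/(λ − t).

    import numpy as np

    rng = np.random.default_rng(0)
    lam, t = 2.0, 0.5
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)  # Exp(lambda) samples
    print(np.mean(np.exp(t * x)))   # Monte Carlo estimate of E[e^{tX}]
    print(lam / (lam - t))          # closed form lambda/(lambda - t) = 4/3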
The first two moments of $X$ can be calculated as
$$E[X] = \frac{d}{dt}M(t)\Big|_{t=0} = \frac{\lambda}{(\lambda - t)^2}\Big|_{t=0} = \frac{1}{\lambda},$$
$$E[X^2] = \frac{d^2}{dt^2}M(t)\Big|_{t=0} = \frac{2\lambda}{(\lambda - t)^3}\Big|_{t=0} = \frac{2}{\lambda^2},$$
and thus
$$\mathrm{Var}(X) = E[X^2] - (E[X])^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.$$
Property 2. (Constant Failure Rate) The failure rate of the exponential distribution is constant, $r(t) = \lambda$. In fact,
$$r(t) = \frac{f(t)}{\bar F(t)} = \frac{\lambda e^{-\lambda t}}{e^{-\lambda t}} = \lambda.$$
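As a quick numerical illustration (λ = 0.5 is an arbitrary choice), the following Python sketch verifies Property 1 and the constant failure rate using scipy.stats.expon, parametrized by scale = 1/λ.

    import numpy as np
    from scipy import stats

    lam = 0.5
    dist = stats.expon(scale=1.0 / lam)
    print(dist.mean(), 1.0 / lam)     # both 2.0
    print(dist.var(), 1.0 / lam**2)   # both 4.0
    t = np.linspace(0.1, 10.0, 5)
    print(dist.pdf(t) / dist.sf(t))   # failure rate r(t): constant, equal to lam = 0.5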
Property 3. (Lack of Memory) The residual lifetime $(X - t) \mid X > t$ has the same distribution as $X$. That is, for all $x, t \ge 0$,
$$P[X > x + t \mid X > t] = \frac{\bar F(x+t)}{\bar F(t)} = \frac{e^{-\lambda(x+t)}}{e^{-\lambda t}} = e^{-\lambda x} = P[X > x].$$
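The memoryless property can be checked directly from the survival function; the Python sketch below uses the illustrative values λ = 0.1, x = 3 and t = 10.

    from scipy import stats

    lam, x, t = 0.1, 3.0, 10.0
    dist = stats.expon(scale=1.0 / lam)
    print(dist.sf(x + t) / dist.sf(t))   # P[X > x + t | X > t]
    print(dist.sf(x))                    # P[X > x]; both equal exp(-0.3) ~ 0.7408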
Property 4. If $X_1, \dots, X_n$ are independent copies of $X$, then $n\min(X_1, \dots, X_n)$ has the same distribution as $X$; equivalently, $\min(X_1, \dots, X_n)$ is exponential with parameter $n\lambda$. In fact,
$$P[n\min(X_1, \dots, X_n) > x] = P\Big[\min(X_1, \dots, X_n) > \frac{x}{n}\Big] = P\Big[X_1 > \frac{x}{n}\Big]\cdots P\Big[X_n > \frac{x}{n}\Big] = \Big(P\Big[X > \frac{x}{n}\Big]\Big)^n = (e^{-\lambda x/n})^n = e^{-\lambda x}.$$
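A short simulation sketch in Python (λ = 1 and n = 5 are arbitrary illustrative choices) confirms that n min(X_1, ..., X_n) behaves like an Exp(λ) variable.

    import numpy as np

    rng = np.random.default_rng(1)
    lam, n, reps = 1.0, 5, 200_000
    samples = rng.exponential(scale=1.0 / lam, size=(reps, n))
    scaled_min = n * samples.min(axis=1)
    print(scaled_min.mean(), scaled_min.var())       # both close to 1 (= 1/lam and 1/lam^2)
    print(np.mean(scaled_min > 2.0), np.exp(-2.0))   # empirical vs exact survival at x = 2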
Example 2.1 Suppose the lifetime of certain items follows an exponential distribution with parameter $\lambda = 0.1$.
(i) From Property 1, $\mu = \frac{1}{0.1} = 10$ and $\mathrm{Var}(X) = \frac{1}{0.1^2} = 100$.
(ii) The density function is
$$f(x) = 0.1e^{-0.1x},$$
the survival function is
$$\bar F(x) = e^{-0.1x},$$
and the failure rate is
$$r(t) = \frac{f(t)}{\bar F(t)} = 0.1.$$
(iii) Suppose one item has survived 10 units of time. Then, from Property 3, the conditional survival function given $X > 10$ is
$$P[X > x + 10 \mid X > 10] = P[X > x] = e^{-0.1x}.$$
In particular, the mean residual life is $\mu(t) = \mu = 10$.
(iv) (Serial system) Consider a serial system consisting of five identical items which work independently. The system fails as soon as one item fails. Then the lifetime of the system is $X = \min(X_1, X_2, \dots, X_5)$, where $X_i$ denotes the lifetime of the $i$-th item. From Property 4, $X$ follows an exponential distribution with parameter $5 \times 0.1 = 0.5$, so the mean system lifetime is $\frac{1}{0.5} = 2$.
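The calculations of Example 2.1 can be reproduced numerically; the Python sketch below is only illustrative and uses scipy.stats.expon with scale = 1/λ = 10.

    from scipy import stats

    lam = 0.1
    dist = stats.expon(scale=1.0 / lam)
    print(dist.mean(), dist.var())                       # (i)   10.0 and 100.0
    print(dist.pdf(5.0) / dist.sf(5.0))                  # (ii)  failure rate 0.1
    print(dist.sf(20.0) / dist.sf(10.0), dist.sf(10.0))  # (iii) both equal exp(-1)
    print(stats.expon(scale=1.0 / (5 * lam)).mean())     # (iv)  serial system mean 2.0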
$$\frac{\bar F(x+t)}{\bar F(t)} = \bar F(x), \qquad \text{for } x, t \ge 0,$$
or equivalently,
$$\bar F(x+t) = \bar F(x)\bar F(t), \qquad \text{for } x, t \ge 0.$$
Following the lines of the proof of Theorem 1.2, this implies that for any rational number $r = m/n$,
Remark. A more delicate analysis shows that we only have to check the memoryless property at two points $t_1$ and $t_2$ such that $t_1/t_2$ is irrational and
In fact, from
$$\bar F(x + t_i) = \bar F(x)\bar F(t_i), \qquad \text{for } i = 1, 2,$$
we see that
$$t_2 = n_0 t_1 + x_1,$$
$$t_1 = n_1 x_1 + x_2,$$
$$x_i = n_{i+1} x_{i+1} + x_{i+2}, \qquad \text{for } i = 1, 2, \dots$$
Note that each $x_i$ is of the form $m t_1 + n t_2$ by reading the equations backward, and, more importantly,
$$x_1 > x_2 > \cdots \to 0.$$
Thus, for any given $\varepsilon > 0$, there exists a $k$ such that $0 < x_k < \varepsilon$. This implies that for any $x > 0$, there exists a positive integer $j$ such that
$$0 \le z_j - x \le x_k < \varepsilon,$$
$$z_1 > z_2 > \cdots \to x.$$
then X is exponential.
$$\frac{1}{\mu} = \frac{\bar F(t)}{\int_t^\infty \bar F(x)\,dx} = -\frac{d}{dt}\ln\int_t^\infty \bar F(x)\,dx, \qquad \text{for } \bar F(t) > 0.$$
Thus,
$$\ln\int_t^\infty \bar F(x)\,dx = C - \frac{t}{\mu}.$$
By taking $t = 0$, we get $C = \ln\mu$. Thus,
$$\int_t^\infty \bar F(x)\,dx = \mu e^{-t/\mu},$$
and hence, differentiating both sides with respect to $t$,
$$\bar F(t) = e^{-t/\mu}.$$
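A small numerical check (with the illustrative value μ = 10) confirms that for $\bar F(t) = e^{-t/\mu}$ the mean residual life $\int_t^\infty \bar F(x)\,dx / \bar F(t)$ equals μ at every t.

    import numpy as np
    from scipy import integrate

    mu = 10.0

    def sbar(x):
        return np.exp(-x / mu)   # survival function exp(-x/mu)

    for t in (0.0, 5.0, 25.0):
        tail, _ = integrate.quad(sbar, t, np.inf)
        print(tail / sbar(t))    # mean residual life: equals mu = 10 for every t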
Proof.
$$P[nX_{1,n} > x] = P\Big[X_{1,n} > \frac{x}{n}\Big] = P\Big[\min(X_1, \dots, X_n) > \frac{x}{n}\Big] = P\Big[X_1 > \frac{x}{n}, \dots, X_n > \frac{x}{n}\Big] = \Big(P\Big[X > \frac{x}{n}\Big]\Big)^n.$$
If $nX_{1,n}$ has the same distribution as $X$, then
$$\bar F\Big(\frac{x}{n}\Big) = (\bar F(x))^{1/n}, \qquad \text{for } n = 1, 2, \dots$$
By a similar argument as in Theorem 2.1, $\bar F(x) = e^{-\lambda x}$ for some $\lambda > 0$.
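The distributional identity can also be examined by simulation; the following Python sketch (λ = 0.7 and n = 4 are arbitrary) compares samples of X and nX_{1,n} with a two-sample Kolmogorov-Smirnov test.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    lam, n, reps = 0.7, 4, 100_000
    x = rng.exponential(scale=1.0 / lam, size=reps)
    scaled_min = n * rng.exponential(scale=1.0 / lam, size=(reps, n)).min(axis=1)
    print(stats.ks_2samp(x, scaled_min))   # KS statistic near 0, p-value not small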
Remark 1. Similar to the remark after Theorem 2.1, we can show that if
Theorem 2.4 Suppose that for some $n$, $nX_{1,n}$ has the same distribution as $X$. If
$$0 < \lim_{x \to 0^+} \frac{F(x)}{x} = \lambda < \infty,$$
then $F(x) = 1 - e^{-\lambda x}$.
Proof. For this sample size $n$, we have $\bar F(x) = (\bar F(x/n))^n$. Inductively, this implies that for any integer $k$,
$$\bar F(x) = (\bar F(x/n^k))^{n^k}, \qquad \text{for } x \ge 0.$$
That means
$$F(x) = 1 - \bar F(x) = 1 - (\bar F(x/n^k))^{n^k} = 1 - (1 - F(x/n^k))^{n^k}.$$
Since
$$\lim_{x \to 0^+} \frac{F(x)}{x} = \lambda > 0,$$
we have
$$F(x/n^k) = \lambda x/n^k + o(1/n^k)$$
as $k \to \infty$. Consequently,
$$F(x) = 1 - \lim_{k \to \infty}\Big(1 - \frac{\lambda x}{n^k}\Big)^{n^k} = 1 - e^{-\lambda x}.$$
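The limit used in the last step can be illustrated numerically (λ = 0.3, x = 2, n = 2 are illustrative choices):

    import numpy as np

    lam, x, n = 0.3, 2.0, 2
    for k in (1, 5, 10, 20):
        m = n**k
        print(k, (1 - lam * x / m) ** m)   # approaches exp(-lam*x)
    print(np.exp(-lam * x))                # the limit, about 0.5488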
Thus, there are in total $\frac{n!}{1!\,(k-1)!\,(n-k)!}$ such arrangements of the observations, and each occurs with the same probability $f(y)[F(y)]^{k-1}[1 - F(y)]^{n-k}$.
More generally, by a similar argument, the joint density of
Theorem 2.5 Let $d_{r,n}$ be the $r$-th spacing statistic from the exponential distribution $F(x) = 1 - e^{-\lambda x}$. Then
(2) $E(d_{r,n}) = \frac{1}{(n-r)\lambda}$, $\mathrm{Var}(d_{r,n}) = \frac{1}{(n-r)^2\lambda^2}$, for $r = 1, 2, \dots, n-1$;
The joint density of $X_{r,n}$ and $X_{r+1,n}$ is
$$f_{r,r+1}(s,t) = n!\,f(s)f(t)\,\frac{(F(s))^{r-1}}{(r-1)!}\,\frac{(1-F(t))^{n-r-1}}{(n-r-1)!} = n!\,\lambda e^{-\lambda s}\,\lambda e^{-\lambda t}\,\frac{(1-e^{-\lambda s})^{r-1}}{(r-1)!}\,\frac{e^{-(n-r-1)\lambda t}}{(n-r-1)!}, \qquad 0 < s < t.$$
From the transformation
for $0 < x, y < \infty$. Since the form is separable in $x$ and $y$, the marginal density function for $d_{r,n}$ is
It follows that $X_{1,n}$ and $d_{1,n}, \dots, d_{n-1,n}$ are mutually independent.
Corollary 2.2 For the exponential distribution, the normalized spacings $D_{r,n} = (n-r)d_{r,n}$, for $r = 1, 2, \dots, n-1$, are independently and identically distributed with common distribution function $F(x) = 1 - e^{-\lambda x}$.
Proof. The independence of the $D_{r,n}$ follows from the last theorem. Since the density function of $d_{r,n}$ is $(n-r)\lambda e^{-(n-r)\lambda x}$, the density function of $D_{r,n} = (n-r)d_{r,n}$ is thus $\lambda e^{-\lambda x}$.
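The corollary can be illustrated by simulation. The Python sketch below (λ = 1 and n = 6 chosen for illustration, and taking $d_{r,n} = X_{r+1,n} - X_{r,n}$ as the $r$-th spacing) checks that each normalized spacing has mean 1/λ and variance 1/λ².

    import numpy as np

    rng = np.random.default_rng(3)
    lam, n, reps = 1.0, 6, 100_000
    order = np.sort(rng.exponential(scale=1.0 / lam, size=(reps, n)), axis=1)
    spacings = np.diff(order, axis=1)               # d_{r,n} = X_{r+1,n} - X_{r,n}, r = 1, ..., n-1
    normalized = spacings * (n - np.arange(1, n))   # D_{r,n} = (n - r) d_{r,n}
    print(normalized.mean(axis=0))   # each entry close to 1/lam = 1
    print(normalized.var(axis=0))    # each entry close to 1/lam^2 = 1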
Example 2.3 Consider the life testing of $n$ identical components which follow the exponential distribution with parameter $\lambda$.
(i) $X_{1,n}$ is exponential with mean $\frac{\mu}{n} = \frac{1}{n\lambda}$.
(ii) In general, we can write
Remark. If we know that the normalized spacings are all exponential with
F (x) = 1 − e−λx , does it imply that the population distribution is also expo-
nential without additional assumptions? The answer is no.
Theorem 2.7 Let the population distribution $F(x)$ be such that $F(0^+) = 0$ and $F(x) > 0$ for all $x > 0$. Assume that
$$u_r(s) = \int_0^\infty e^{-sx}\,dF^r(x) \neq 0$$
Denote by R1 < R2 < ... the successive record values. The following
simple result is due to Tata (1969).
This is exactly the Memoryless Property. In this case, the insurance company does not need to guess the risk of such a driver, but can set the premium using a well-defined model based on the driver's record.
$$P[N = k] = (1-p)p^{k-1},$$
as
In particular, when $\lambda_1 = \lambda_2 = \lambda$,
$$P[X_1 + X_2 \le y] = 1 - (1 + \lambda y)e^{-\lambda y}.$$
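The formula for P[X1 + X2 ≤ y] can be checked by a short Monte Carlo sketch (λ = 0.4 and y = 3 are arbitrary illustrative values):

    import numpy as np

    rng = np.random.default_rng(4)
    lam, y, reps = 0.4, 3.0, 1_000_000
    s = rng.exponential(scale=1.0 / lam, size=(reps, 2)).sum(axis=1)
    print(np.mean(s <= y))                        # empirical P[X1 + X2 <= y]
    print(1 - (1 + lam * y) * np.exp(-lam * y))   # closed form, about 0.3374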
Problems
1. Suppose the lifetime of a certain model of car battery follows an exponential distribution with a mean lifetime of 5 years.
a) Write down the survival function.
b) What is the probability that the lifetime will be over 2 years?
c) What is the probability that the battery will work for more than 4 years given that it has worked at least 2 years?
2. The time to repair a machine is an exponentially distributed random variable
with parameter λ = 1/2. What is
a) the probability that the repair time exceeds 2 hours;
b) the conditional probability that a repair takes at least 10 hours, given
that its duration exceeds 9 hours?
3. Let X and Y be independent Exp(λ). Prove that the density function of Z = X/Y is given by
$$h(z) = (1+z)^{-2}, \qquad z > 0.$$
8. Suppose that independent trials, each having a success probability p (0 < p < 1), are performed until a success occurs. Denote by X the number of trials needed. Then
$$P[X = k] = (1-p)^{k-1}p, \qquad k = 1, 2, \dots,$$
10. By conditioning on the value of the first record value R1 , find the distribution
function of the second record value R2 .
12. If X is uniformly distributed over (0, 1), find the density function of Y = eX .
14. The duration of pauses (and the duration of vocalizations) that occur in a monologue follows an exponential distribution with mean 0.70 seconds.
(i) What is the variance of the duration of pauses?
(ii) Given that the duration of a pause is longer than 1.0 second, what is the expected total duration of the pause?
15. Suppose the time X between two successive arrivals at the drive-up window
of a local bank is an exponential random variable with λ = 0.2.
(i) What is the expected time between two successive arrivals?
(ii) Find the probability P [X > 4].
(iii) Find the probability P [2 < X < 6].
16. Consider a parallel system of two similar units. That is, the two units follow the same exponential distribution with parameter λ.
(i) What is the distribution function of the system's lifetime?
(ii) What is the density function?
(iii) Calculate the failure rate.
17. Consider a cold standby system of two similar units. That is, the two units have the same exponential distribution with parameter λ.
(i) What is the distribution function of the system's lifetime?
(ii) What is the density function of the system's lifetime?
(iii) Calculate the failure rate and verify that it is monotone increasing.
18. Suppose X and Y are two independent exponential random variables with parameters λ and δ, respectively.
(i) Show that min(X, Y) is exponential with parameter λ + δ.
(ii) Show that $P[X > Y] = \frac{\delta}{\lambda + \delta}$.
(iii) Show by the memoryless property that, given X > Y, Y and X − Y are independent. Thus,
19. Suppose a bank branch has two tellers whose service times for each customer are exponential with parameters 0.2 and 0.25, respectively. When a customer arrives at the branch, he finds both tellers busy and no customers waiting.
(i) What is the distribution of his waiting time?
(ii) What is the probability that he will be served by the first teller?
$$T_l = E[L \mid L \ge l].$$
$$N = \inf\{n \ge 2 : X_1 \ge X_2 \ge \cdots \ge X_{n-1} \le X_n\}.$$