Homework Assignment 6
The process is not WSS since the autocorrelation is not a function of t2 − t1 only.
Hence, the process is also not SSS.
2. A random process Z(t) takes values 0 and 1. A transition from 0 to 1 or from 1 to
0 occurs randomly, and the probability of having n transitions in a time interval of
duration τ (τ > 0) is given by
pN(n) = (1/(1 + ατ)) (ατ/(1 + ατ))^n,   n = 0, 1, 2, · · ·
where α > 0 is a constant. Assume that at t = 0, Z(0) is equally likely to be 0 or 1.
a. Find µZ (t).
b. Find RZ,Z (t + τ, t).
c. Is Z(t) WSS?
Solution:
We find first the probability of an even number of transitions in the interval (0, τ ].
pN(n = even) = pN(0) + pN(2) + pN(4) + · · ·
= (1/(1 + ατ)) Σ_{l=0}^{∞} (ατ/(1 + ατ))^{2l}
= (1/(1 + ατ)) · 1/(1 − (ατ)²/(1 + ατ)²)
= (1 + ατ)/(1 + 2ατ)
The probability pN(n = odd) is simply 1 − pN(n = even) = ατ/(1 + 2ατ). The random process
Z(t) takes the value of 1 (at time instant t ) if an even number of transitions occurred
given that Z(0) = 1, or if an odd number of transitions occurred given that Z(0) = 0.
Thus,
µZ(t) = E[Z(t)] = 1 · p(Z(t) = 1) + 0 · p(Z(t) = 0)
= p(Z(t) = 1 | Z(0) = 1)p(Z(0) = 1) + p(Z(t) = 1 | Z(0) = 0)p(Z(0) = 0)
= pN(n = even) · (1/2) + pN(n = odd) · (1/2)
= 1/2
To determine RZZ(t + τ, t), note that Z(t + τ) = 1 if Z(t) = 1 and an even number of
transitions occurred in the interval (t, t + τ], or if Z(t) = 0 and an odd number of
transitions have taken place in (t, t + τ]. Hence,
RZZ(t + τ, t) = E[Z(t + τ)Z(t)]
= 1 · p(Z(t + τ) = 1, Z(t) = 1) + 0 · p(Z(t + τ) = 1, Z(t) = 0)
+ 0 · p(Z(t + τ) = 0, Z(t) = 1) + 0 · p(Z(t + τ) = 0, Z(t) = 0)
= p(Z(t + τ) = 1, Z(t) = 1) = p(Z(t + τ) = 1 | Z(t) = 1)p(Z(t) = 1)
= (1/2) · (1 + ατ)/(1 + 2ατ)
Since the mean is constant and the autocorrelation depends on τ only, Z(t) is WSS.
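Because pN is geometric in n, the even-transition sum is easy to verify numerically. A minimal sketch (α and τ below are arbitrary test values, not from the problem):

```python
alpha, tau = 1.3, 0.7          # arbitrary test values (alpha > 0, tau > 0)
at = alpha * tau

# p_N(n) = (1/(1+at)) * (at/(1+at))**n, summed over even n (tail truncated)
p_even = sum((1 / (1 + at)) * (at / (1 + at)) ** n for n in range(0, 200, 2))

closed_form = (1 + at) / (1 + 2 * at)
print(p_even, closed_form)     # the two agree to high precision
```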
3. Which one of the following functions can be the autocorrelation of a random process
and explain why?
a. f(τ) = sin(2πf0 τ).
b. f(τ) = τ².
c. f(τ) = 1 − τ for |τ| ≤ 1, and f(τ) = 1 + τ for |τ| > 1.
Solution:
Solution:
Only the notation is a bit advanced; the solution is easy to follow. EX means the same
as E(X).
6. We know that i.i.d. noise is white noise. However, white noise is not necessarily i.i.d.
Suppose that {Wt } and {Zt } are independent and identically distributed (i.i.d.) se-
quences, with P (Wt = 0) = P (Wt = 1) = 1/2 and P (Zt = −1) = P (Zt = 1) = 1/2.
Define the time series model
Xt = Wt (1 − Wt−1 ) Zt
Show that {Xt } is white but not i.i.d.
Solution:
To check that {Xt } is white noise, we need to compute its means and covariances.
For the means, EXt = E[Wt (1 − Wt−1) Zt] = (EWt)(1 − EWt−1)(EZt) = 0. For the
covariances,
γ(s, t) = E (Ws (1 − Ws−1 ) Zs Wt (1 − Wt−1 ) Zt )
= E (Ws (1 − Ws−1 ) Wt (1 − Wt−1 )) · EZs Zt .
If s ̸= t then the last term is EZs Zt = EZs · EZt = 0. Therefore {Xt } is uncorrelated.
If s = t then EZs Zt = EZt² = 1 and so
γ(t, t) = EWt²(1 − Wt−1)² = 1/4
Thus, {Xt } has constant variance. Hence it is white noise. To show that {Xt } is not
i.i.d, note that Xt−1 = 1 implies that Wt−1 = 1, which implies that Xt = 0. Therefore
P (Xt−1 = 1, Xt = 1) = 0.
Since this is not equal to P (Xt−1 = 1) P (Xt = 1) = 1/64, Xt and Xt−1 are not independent.
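These claims can be checked by simulation; a minimal sketch (the seed and sample size are arbitrary choices):

```python
import random

random.seed(0)
N = 200_000

# i.i.d. driving sequences, as in the problem statement
W = [random.randint(0, 1) for _ in range(N)]
Z = [random.choice([-1, 1]) for _ in range(N)]

# X_t = W_t (1 - W_{t-1}) Z_t, starting at t = 1 so that W_{t-1} exists
X = [W[t] * (1 - W[t - 1]) * Z[t] for t in range(1, N)]

n = len(X)
mean = sum(X) / n                                              # ≈ 0
var = sum(x * x for x in X) / n                                # ≈ 1/4
lag1 = sum(X[t] * X[t + 1] for t in range(n - 1)) / (n - 1)
both_one = sum(1 for t in range(n - 1) if X[t] == 1 and X[t + 1] == 1)

print(mean, var, lag1, both_one)   # both_one is exactly 0, as proved above
```

In fact lag1 is exactly zero here: Xt ≠ 0 requires Wt = 1 while Xt+1 ≠ 0 requires Wt = 0, so consecutive values are never both nonzero.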
7. Let the autocorrelation function of any real function f (t) be defined as:
Rf,f(τ) = ∫_{−∞}^{∞} f(t) f(t + τ) dt
Prove that the autocorrelation function of any real function f (t) has a maximum value
at τ = 0 or, mathematically,
Rf,f (0) ≥ Rf,f (τ ) for all τ
Solution:
Now, let ϵ be any real number. Then
I = ∫_{−∞}^{∞} [f(t) + ϵf(t + τ)]² dt ≥ 0
The value of I must be greater than or equal to zero, since it’s the integral of the square
of a real function. In fact, it will only be exactly equal to zero if f (t) = 0 for all t.
Expanding, we have
I = ∫_{−∞}^{∞} ( f²(t) + 2ϵf(t)f(t + τ) + ϵ²f²(t + τ) ) dt
Now look at the first and last terms in the above. Since the range of the integral is
infinite, it must be true that
∫_{−∞}^{∞} f²(t) dt = ∫_{−∞}^{∞} f²(t + τ) dt
Let us define
A = ∫_{−∞}^{∞} f²(t) dt = ∫_{−∞}^{∞} f²(t + τ) dt
from which we see, for the same reason that I ≥ 0, that A ≥ 0. Hence,
I = A + 2ϵRf,f(τ) + ϵ²A ≥ 0
This is a quadratic expression in ϵ. If in general we have aϵ² + bϵ + c ≥ 0 for all real ϵ,
this means that the quadratic equation aϵ² + bϵ + c = 0 has either one (repeated) real
root, or no real roots at all. (Think of a parabola that either just touches the x-axis, or
is entirely above it.) This can only happen if b² − 4ac ≤ 0. Applying this to the
inequality above gives
4[Rf,f (τ )]2 − 4A2 ≤ 0
or A² ≥ [Rf,f(τ)]². Now recall how A was defined above: clearly, A = Rf,f(0),
from the definition of the autocorrelation. Since A ≥ 0, it follows that A ≥ |Rf,f(τ)|, and hence
Rf,f(0) ≥ Rf,f(τ)
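The inequality can also be sanity-checked numerically for a concrete function; in the sketch below the Gaussian pulse f(t) = e^{−t²} is an arbitrary test choice, and the defining integral is approximated by a Riemann sum:

```python
import math

def f(t):
    # arbitrary test function (a Gaussian pulse, not from the problem)
    return math.exp(-t * t)

def R(tau, lo=-10.0, hi=10.0, n=4000):
    # Riemann-sum approximation of R_{f,f}(tau) = ∫ f(t) f(t + tau) dt
    h = (hi - lo) / n
    return h * sum(f(lo + k * h) * f(lo + k * h + tau) for k in range(n))

R0 = R(0.0)
for tau in (0.1, 0.5, 1.0, 2.5, -1.3):
    assert R(tau) <= R0 + 1e-9   # R(0) is the maximum
print(R0)   # for this f, R(0) = sqrt(pi/2) ≈ 1.2533
```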
8. Show that a Gaussian process is strongly stationary if and only if it is weakly stationary.
Solution:
If X is Gaussian and strongly stationary, then it is weakly stationary since it has a finite
variance. Conversely, suppose X is Gaussian and weakly stationary. Then c(s, t) =
cov(X(s), X(t)) depends on t−s only. The joint distribution of X (t1 ) , X (t2 ) , . . . , X (tn )
depends only on the common mean and the covariances c (ti , tj ). Now c (ti , tj ) depends
on tj − ti only, whence X (t1 ) , X (t2 ) , . . . , X (tn ) have the same joint distribution as
X (s + t1 ) , X (s + t2 ) , . . . , X (s + tn ). Therefore X is strongly stationary.
9. Let {Xn } be a Markov chain on the state space S = {0, 1} with transition matrix
P = ( 1 − α     α   )
    (   β     1 − β )
where α + β > 0.
a. Find the correlation ρ (Xm , Xm+n ), and its limit as m → ∞ with n remaining fixed.
b. Find lim_{n→∞} (1/n) Σ_{r=1}^{n} P (Xr = 1).
Solution:
With ai(n) = P (Xn = i), we have that
cov (Xm, Xm+n) = P (Xm+n = 1 | Xm = 1) P (Xm = 1) − P (Xm+n = 1) P (Xm = 1)
= a1(m)p11(n) − a1(m)a1(m + n),
and therefore, since var Xk = a1(k)(1 − a1(k)) for a {0, 1}-valued variable,
ρ (Xm, Xm+n) = [a1(m)p11(n) − a1(m)a1(m + n)] / √( a1(m)(1 − a1(m)) · a1(m + n)(1 − a1(m + n)) ).
As m → ∞, a1(m) → α/(α + β) and p11(n) = α/(α + β) + (β/(α + β))(1 − α − β)^n, so
ρ (Xm, Xm+n) → (1 − α − β)^n.
b. Since a1(r) → α/(α + β) as r → ∞, the Cesàro averages have the same limit:
lim_{n→∞} (1/n) Σ_{r=1}^{n} P (Xr = 1) = α/(α + β).
The process is strictly stationary if and only if X0 has the stationary distribution.
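As a sanity check, a short simulation of the two-state chain confirms that the long-run fraction of time spent in state 1 approaches the stationary probability α/(α + β); a sketch with assumed test values α = 0.3, β = 0.5:

```python
import random

random.seed(1)
alpha, beta = 0.3, 0.5     # assumed test values with alpha + beta > 0
n = 200_000

x, visits = 0, 0
for _ in range(n):
    if x == 0:
        x = 1 if random.random() < alpha else 0
    else:
        x = 0 if random.random() < beta else 1
    visits += x

frac = visits / n
print(frac, alpha / (alpha + beta))   # both ≈ 0.375
```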
10. Let {N (t) : t ≥ 0} be a Poisson process of intensity λ, and let T0 be an independent
random variable such that P (T0 = ±1) = 1/2. Define T (t) = T0 (−1)^{N (t)}.
a. Show that {T (t) : t ≥ 0} is stationary.
b. Find ρ(T (s), T (s + t)).
c. Find the mean and variance of X(t) = ∫_0^t T (s) ds.
Solution:
a. We have that E(T (t)) = 0 and var(T (t)) = var (T0) = 1 for all t; since the increments of N are stationary, the joint distributions of {T (t)} are invariant under time shifts, so {T (t) : t ≥ 0} is stationary.
b. ρ(T (s), T (s + t)) = E(T (s)T (s + t)) = E( (−1)^{N (t+s)−N (s)} ) = e^{−2λt} for t ≥ 0.
c. E X(t) = ∫_0^t E T (s) ds = 0, and var X(t) = ∫_0^t ∫_0^t E(T (u)T (v)) du dv = 2 ∫_0^t (t − u)e^{−2λu} du = t/λ + (e^{−2λt} − 1)/(2λ²).
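Part b can be verified by Monte Carlo: since T0² = 1 cancels in the product T (s)T (s + t), only the Poisson increment N (s + t) − N (s) matters. A sketch with assumed test values λ = 1 and t = 0.6:

```python
import math
import random

random.seed(2)
lam, t = 1.0, 0.6          # assumed test values (rate and lag)
trials = 200_000

acc = 0.0
for _ in range(trials):
    # Sample the Poisson(lam * t) increment via exponential inter-event gaps,
    # then accumulate (-1)**increment; T0**2 = 1 drops out of the product.
    k, total = 0, random.expovariate(lam)
    while total <= t:
        k += 1
        total += random.expovariate(lam)
    acc += (-1) ** k

est = acc / trials
print(est, math.exp(-2 * lam * t))   # both near e^{-1.2} ≈ 0.301
```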
11. Customers arrive at a desk according to a Poisson process of intensity λ. There is
one clerk, and the service times are independent and exponentially distributed with
parameter µ. At time 0 there is exactly one customer, currently in service. Show that
the probability that the next customer arrives before time t and finds the clerk busy is
(λ/(λ + µ)) (1 − e^{−(λ+µ)t})
Solution:
The given event occurs if the time X to the next arrival is less than t, and also less
than the time Y of service of the customer present. Now,
P(X ≤ t, X ≤ Y ) = ∫_0^t λe^{−λx} e^{−µx} dx = (λ/(λ + µ)) (1 − e^{−(λ+µ)t})
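A quick Monte Carlo check of this probability; λ, µ, and t below are arbitrary test values:

```python
import math
import random

random.seed(3)
lam, mu, t = 2.0, 1.0, 0.8   # assumed test values
trials = 200_000

hits = 0
for _ in range(trials):
    x = random.expovariate(lam)   # time to next arrival
    y = random.expovariate(mu)    # remaining service time
    if x <= t and x <= y:
        hits += 1

est = hits / trials
exact = (lam / (lam + mu)) * (1 - math.exp(-(lam + mu) * t))
print(est, exact)   # both ≈ 0.606
```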
12. Customers enter a shop at the instants of a Poisson process of rate 2. At the door,
two representatives separately demonstrate a new corkscrew. This typically occupies
the time of a customer and the representative for a period which is exponentially distributed
with parameter 1, independently of arrivals and other demonstrators. If both
representatives are busy, customers pass directly into the shop. No customer passes a
free representative without being stopped, and all customers leave by another door. If
both representatives are free at time 0, show that the probability that both are busy at time
t is 2/5 − (2/3)e^{−2t} + (4/15)e^{−5t}.
Solution:
By considering possible transitions during the interval (t, t + h), the probability pi (t)
that exactly i demonstrators are busy at time t satisfies:
p2 (t + h) = p1 (t)2h + p2 (t)(1 − 2h) + o(h),
p1 (t + h) = p0 (t)2h + p1 (t)(1 − h)(1 − 2h) + p2 (t)2h + o(h),
p0 (t + h) = p0 (t)(1 − 2h) + p1 (t)h + o(h).
Hence,
p′2 (t) = 2p1 (t) − 2p2 (t), p′1 (t) = 2p0 (t) − 3p1 (t) + 2p2 (t), p′0 (t) = −2p0 (t) + p1 (t),
and therefore p2 (t) = a + be−2t + ce−5t for some constants a, b, c. By considering the
values of p2 and its derivatives at t = 0, the boundary conditions are found to be
a + b + c = 0, −2b − 5c = 0, 4b + 25c = 4, and the claim follows.
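The closed form can be checked against a direct numerical integration of the three differential equations; a forward-Euler sketch (the step size is an arbitrary numerical choice):

```python
import math

h, T = 1e-4, 3.0               # step size and horizon are arbitrary choices
p0, p1, p2 = 1.0, 0.0, 0.0     # both representatives free at time 0

for _ in range(int(T / h)):
    # the ODE system derived above
    d0 = -2 * p0 + p1
    d1 = 2 * p0 - 3 * p1 + 2 * p2
    d2 = 2 * p1 - 2 * p2
    p0, p1, p2 = p0 + h * d0, p1 + h * d1, p2 + h * d2

closed = 2 / 5 - (2 / 3) * math.exp(-2 * T) + (4 / 15) * math.exp(-5 * T)
print(p2, closed)   # the two agree closely
```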
13. Let {Xn : n ≥ 1} be independent identically distributed integer-valued random variables.
Let Sn = Σ_{r=1}^{n} Xr with S0 = 0, Yn = Xn + Xn−1 with X0 = 0, and Zn = Σ_{r=0}^{n} Sr.
Which of the following constitute Markov chains?
a. Sn
b. Yn
c. Zn
d. The sequence of pairs (Sn , Zn )
Solution:
a. Since Sn+1 = Sn + Xn+1, a sum of independent random variables, S is a Markov chain.
b. We have that