
School of Engineering and Applied Science (SEAS), Ahmedabad University

Probability and Stochastic Processes (MAT 277)

Homework Assignment-6 Solutions

Deadline : 06-04-2023 11:59 PM

1. A random process X(t) has the following member functions:

• x1(t) = 2 ∫_{π/2}^{t} sin(u) du
• x2(t) = −2 ∫_{0}^{t} cos(u) du
• x3(t) = 2[1 + ∫_{0}^{t} (cos(u) − sin(u)) du]
• x4(t) = 1 − ∫_{0}^{t} (cos(u) + sin(u)) du
• x5(t) = 1 + ∫_{π/2}^{t} (sin(u) + cos(u)) du

a. Find the mean function, µX (t).


b. Find the autocorrelation function, RX,X (t1 , t2 ).
c. Is the process WSS or SSS?
Solution:
Member functions are: x1 (t) = −2cos(t), x2 (t) = −2sin(t), x3 (t) = 2[cos(t) + sin(t)],
x4 (t) = [cos(t) − sin(t)], and x5 (t) = [sin(t) − cos(t)].
µX(t) = (1/5)[−2 cos(t) − 2 sin(t) + 2(cos(t) + sin(t)) + (cos(t) − sin(t)) − (cos(t) − sin(t))]
= 0

RX,X(t1, t2) = E[X(t1) X(t2)]
= (1/5)[4 cos(t1) cos(t2) + 4 sin(t1) sin(t2) + 4(cos(t1) + sin(t1))(cos(t2) + sin(t2)) + 2(cos(t1) − sin(t1))(cos(t2) − sin(t2))]
= (1/5)[10 cos(t1) cos(t2) + 10 sin(t1) sin(t2) + 2 cos(t1) sin(t2) + 2 sin(t1) cos(t2)]
= 2 cos(t2 − t1) + (2/5) sin(t1 + t2)

The process is not WSS since the autocorrelation is not a function of t2 − t1 only.
Hence, the process is also not SSS.
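As a quick sanity check, the mean and autocorrelation can be verified numerically by averaging over the five equally likely member functions (a sketch; the time points t1, t2 are arbitrary):

```python
import numpy as np

# Evaluate the five member functions (from the solution above) at a time t.
def members(t):
    return np.array([
        -2 * np.cos(t),
        -2 * np.sin(t),
        2 * (np.cos(t) + np.sin(t)),
        np.cos(t) - np.sin(t),
        np.sin(t) - np.cos(t),
    ])

t1, t2 = 0.7, 1.9
mean_t1 = members(t1).mean()             # ensemble mean, should be 0
R = np.mean(members(t1) * members(t2))   # E[X(t1) X(t2)]
R_formula = 2 * np.cos(t2 - t1) + (2 / 5) * np.sin(t1 + t2)
print(mean_t1, R, R_formula)
```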

2. A random process Z(t) takes values 0 and 1. A transition from 0 to 1 or from 1 to
0 occurs randomly, and the probability of having n transitions in a time interval of
duration τ (τ > 0) is given by

pN(n) = (1/(1 + ατ)) (ατ/(1 + ατ))^n,   n = 0, 1, 2, · · ·

where α > 0 is a constant. Assume that at t = 0, Z(0) is equally likely to be 0 or 1.
a. Find µZ (t).
b. Find RZ,Z (t + τ, t).
c. Is Z(t) WSS?
Solution:
We first find the probability of an even number of transitions in the interval (0, τ]:

pN(n = even) = pN(0) + pN(2) + pN(4) + · · ·
= (1/(1 + ατ)) Σ_{l=0}^{∞} (ατ/(1 + ατ))^{2l}
= (1/(1 + ατ)) · 1/(1 − (ατ)²/(1 + ατ)²)
= (1 + ατ)/(1 + 2ατ)
The probability pN(n = odd) is simply 1 − pN(n = even) = ατ/(1 + 2ατ). The random process
Z(t) takes the value 1 (at time instant t) if an even number of transitions occurred
given that Z(0) = 1, or if an odd number of transitions occurred given that Z(0) = 0.
Thus,

µZ(t) = E[Z(t)] = 1 · p(Z(t) = 1) + 0 · p(Z(t) = 0)
= p(Z(t) = 1 | Z(0) = 1) p(Z(0) = 1) + p(Z(t) = 1 | Z(0) = 0) p(Z(0) = 0)
= pN(n = even) · (1/2) + pN(n = odd) · (1/2)
= 1/2
To determine RZZ(t + τ, t), note that Z(t + τ) = 1 if Z(t) = 1 and an even number of
transitions occurred in the interval (t, t + τ], or if Z(t) = 0 and an odd number of
transitions have taken place in (t, t + τ]. Hence,

RZZ(t + τ, t) = E[Z(t + τ) Z(t)]
= 1 · p(Z(t + τ) = 1, Z(t) = 1) + 0 · p(Z(t + τ) = 1, Z(t) = 0)
+ 0 · p(Z(t + τ) = 0, Z(t) = 1) + 0 · p(Z(t + τ) = 0, Z(t) = 0)
= p(Z(t + τ) = 1, Z(t) = 1) = p(Z(t + τ) = 1 | Z(t) = 1) p(Z(t) = 1)
= (1/2) · (1 + ατ)/(1 + 2ατ)

As observed, RZZ(t + τ, t) depends only on τ, and µZ(t) is constant; thus the process is WSS.
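The closed form for pN(n = even) can be checked numerically by summing the pmf over even n (a quick sketch with an arbitrary value of ατ):

```python
# Partial sum of p_N(n) over even n, compared with (1 + ατ)/(1 + 2ατ).
a_tau = 0.8  # arbitrary value of alpha * tau

def p_N(n):
    return (1 / (1 + a_tau)) * (a_tau / (1 + a_tau)) ** n

p_even = sum(p_N(n) for n in range(0, 400, 2))
closed_form = (1 + a_tau) / (1 + 2 * a_tau)
print(p_even, closed_form)
```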

3. Which of the following functions can be the autocorrelation function of a random
process? Explain why.
a. f(τ) = sin(2πf0 τ).
b. f(τ) = τ².
c. f(τ) = 1 − τ for |τ| ≤ 1, and f(τ) = 1 + τ for |τ| > 1.
Solution:

a. f(τ) cannot be the autocorrelation function of a random process because f(0) = 0 <
f(1/(4f0)) = 1. Thus the maximum absolute value of f(τ) is not achieved at the
origin τ = 0.
b. f(τ) cannot be the autocorrelation function of a random process because f(0) = 0
whereas f(τ) ≠ 0 for τ ≠ 0. The maximum absolute value of f(τ) is not achieved
at the origin.
c. f(0) = 1 whereas f(τ) > f(0) for τ > 1. Thus f(τ) cannot be the autocorrelation
function of a random process.
4. Let the random process X(t) be defined by X(t) = A + Bt3 where A and B are
independent random variables, and each uniformly distributed on [0, 2]. Find µX (t)
and RXX (t1 , t2 ).
Solution:

µX (t) = E[A + Bt3 ] = E[A] + E[B]t3 = 1 + t3


where the last equality follows from the fact that A, B are uniformly distributed over
[0, 2] so that E[A] = E[B] = 1.
RXX(t1, t2) = E[X(t1) X(t2)] = E[(A + B t1³)(A + B t2³)]
= E[A²] + E[AB] t2³ + E[BA] t1³ + E[B²] t1³ t2³

The random variables A, B are independent so that E[AB] = E[A]E[B] = 1. Similarly
E[BA] = 1. Furthermore,

E[A²] = E[B²] = ∫_{0}^{2} x² · (1/2) dx = [x³/6]_{0}^{2} = 4/3

Thus

RXX(t1, t2) = 4/3 + t2³ + t1³ + (4/3) t1³ t2³
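Both moments can be confirmed with a short Monte Carlo sketch (sample size and time points are arbitrary):

```python
import numpy as np

# X(t) = A + B t^3 with A, B independent Uniform[0, 2].
rng = np.random.default_rng(0)
n = 1_000_000
A = rng.uniform(0, 2, n)
B = rng.uniform(0, 2, n)

t1, t2 = 0.5, 1.5
X1 = A + B * t1**3
X2 = A + B * t2**3

mean_est = X1.mean()      # should approach 1 + t1^3
R_est = np.mean(X1 * X2)  # should approach 4/3 + t1^3 + t2^3 + (4/3) t1^3 t2^3
print(mean_est, R_est)
```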
5. Determine if each of the following is a stationary process. If they are, give the mean
and autocovariance functions. Here, {Wt } is i.i.d. N(0, 1).
a. Xt = W3
b. Xt = t + W3
c. Xt = Wt²

Solution:
The notation is slightly advanced, but the solution is easy to follow: EX is the same
as E(X).

a. Xt = W3 is a stationary process because EXt = EW3 = 0 and EXs Xt = EW3² = 1.


b. Xt = W3 +t is not a stationary process because its mean is not constant: EXt = t.

c. Xt = Wt² is a stationary process: EXt = EWt² = 1 and

EXs Xt = EWs² Wt² = 3 if s = t, and 1 if s ≠ t,

using EWt⁴ = 3 for a standard normal.
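The moment facts used in part (c), EW² = 1 and EW⁴ = 3 for W ~ N(0, 1), can be checked by simulation (sample size arbitrary):

```python
import numpy as np

# For W ~ N(0, 1): E[W^2] = 1 and E[W^4] = 3, giving EX_s X_t above.
rng = np.random.default_rng(1)
w = rng.standard_normal(2_000_000)
print(np.mean(w**2), np.mean(w**4))
```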

6. We know that i.i.d. noise is white noise. However, white noise is not necessarily i.i.d.
Suppose that {Wt } and {Zt } are independent and identically distributed (i.i.d.) se-
quences, with P (Wt = 0) = P (Wt = 1) = 1/2 and P (Zt = −1) = P (Zt = 1) = 1/2.
Define the time series model
Xt = Wt (1 − Wt−1 ) Zt
Show that {Xt } is white but not i.i.d.
Solution:
To check that {Xt} is white noise, we need to compute its mean and covariances.
For the mean, EXt = E[Wt(1 − Wt−1)Zt] = (EWt)(1 − EWt−1)(EZt) = 0. For the
covariances,
covariances,
γ(s, t) = E (Ws (1 − Ws−1 ) Zs Wt (1 − Wt−1 ) Zt )
= E (Ws (1 − Ws−1 ) Wt (1 − Wt−1 )) · EZs Zt .
If s ̸= t then the last term is EZs Zt = EZs · EZt = 0. Therefore {Xt } is uncorrelated.
If s = t then EZs Zt = EZt² = 1 and so

γ(t, t) = E[Wt²(1 − Wt−1)²] = EWt² · E(1 − Wt−1)² = (1/2)(1/2) = 1/4
Thus, {Xt} has constant variance. Hence it is white noise. To show that {Xt} is not
i.i.d., note that Xt−1 = 1 implies that Wt−1 = 1, which implies that Xt = 0. Therefore
P(Xt−1 = 1, Xt = 1) = 0.
Since this is not equal to P(Xt−1 = 1) P(Xt = 1) = (1/8)(1/8) = 1/64, Xt and Xt−1 are not
independent.
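A simulation sketch of {Xt} illustrates both properties: zero mean, constant variance 1/4, and the impossible pattern Xt−1 = Xt = 1 (sample size arbitrary):

```python
import numpy as np

# X_t = W_t (1 - W_{t-1}) Z_t with W_t ~ Bernoulli(1/2), Z_t = +/-1.
rng = np.random.default_rng(2)
n = 1_000_000
W = rng.integers(0, 2, n)
Z = rng.choice([-1, 1], size=n)
X = W[1:] * (1 - W[:-1]) * Z[1:]

mean_est = X.mean()                          # should be near 0
var_est = X.var()                            # should be near 1/4
p11 = np.mean((X[:-1] == 1) & (X[1:] == 1))  # exactly 0: X_{t-1}=1 forces X_t=0
print(mean_est, var_est, p11)
```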
7. Let the autocorrelation function of any real function f(t) be defined as:

Rf,f(τ) = ∫_{−∞}^{∞} f(t) f(t + τ) dt

Prove that the autocorrelation function of any real function f (t) has a maximum value
at τ = 0 or, mathematically,
Rf,f (0) ≥ Rf,f (τ ) for all τ

Solution:
Let ϵ be any real number. Then

I = ∫_{−∞}^{∞} [f(t) + ϵ f(t + τ)]² dt ≥ 0

The value of I must be greater than or equal to zero, since it’s the integral of the square
of a real function. In fact, it will only be exactly equal to zero if f (t) = 0 for all t.
Expanding, we have
I = ∫_{−∞}^{∞} [f²(t) + 2ϵ f(t) f(t + τ) + ϵ² f²(t + τ)] dt

Now look at the first and last terms in the above. Since the integral extends over the
entire real line, the change of variable u = t + τ shows that

∫_{−∞}^{∞} f²(t) dt = ∫_{−∞}^{∞} f²(t + τ) dt

Let us define

A = ∫_{−∞}^{∞} f²(t) dt = ∫_{−∞}^{∞} f²(t + τ) dt
from which we see, for the same reason that I ≥ 0, that A ≥ 0. Hence,

I = A + 2ϵ Rf,f(τ) + ϵ² A ≥ 0

This is a quadratic in ϵ. In general, if aϵ² + bϵ + c ≥ 0 for all real ϵ (with a ≥ 0),
then the equation aϵ² + bϵ + c = 0 has either one (repeated) real root or no real
roots at all. (Think of a parabola that either just touches the ϵ-axis, or is entirely
above it.) This can only happen if b² − 4ac ≤ 0. Applying this to the inequality above
gives

4[Rf,f(τ)]² − 4A² ≤ 0

or A² ≥ [Rf,f(τ)]², and since A ≥ 0 this gives A ≥ |Rf,f(τ)|. Now remember how A was
defined above: clearly A = Rf,f(0), from the definition of the autocorrelation. Hence we
deduce that

Rf,f(0) ≥ Rf,f(τ)
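A numerical illustration with one arbitrarily chosen square-integrable f, using a Riemann-sum approximation of the defining integral:

```python
import numpy as np

# Approximate R_{f,f}(tau) for the sample function f(t) = exp(-t^2) cos(3t).
t = np.linspace(-20, 20, 40001)
dt = t[1] - t[0]

def f(x):
    return np.exp(-x**2) * np.cos(3 * x)

def R(tau):
    # Riemann sum for the integral of f(t) f(t + tau) over the real line
    return np.sum(f(t) * f(t + tau)) * dt

taus = np.linspace(-3, 3, 121)
vals = [R(x) for x in taus]
print(max(vals) <= R(0.0) + 1e-12)  # the maximum is attained at tau = 0
```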

8. Show that a Gaussian process is strongly stationary if and only if it is weakly stationary.
Solution:
If X is Gaussian and strongly stationary, then it is weakly stationary since it has a fi-
nite variance. Conversely, suppose X is Gaussian and weakly stationary. Then c(s, t) =
cov(X(s), X(t)) depends on t−s only. The joint distribution of X (t1 ) , X (t2 ) , . . . , X (tn )
depends only on the common mean and the covariances c (ti , tj ). Now c (ti , tj ) depends
on tj − ti only, whence X (t1 ) , X (t2 ) , . . . , X (tn ) have the same joint distribution as
X (s + t1 ) , X (s + t2 ) , . . . , X (s + tn ). Therefore X is strongly stationary.
9. Let {Xn} be a Markov chain on the state space S = {0, 1} with transition matrix

P = [ 1 − α     α   ]
    [   β     1 − β ]

where α + β > 0.

a. Find the correlation ρ(Xm, Xm+n), and its limit as m → ∞ with n remaining fixed.
b. Find lim_{n→∞} (1/n) Σ_{r=1}^{n} P(Xr = 1).

c. Under what condition is the process strongly stationary?

Solution:
With ai (n) = P (Xn = i), we have that
cov (Xm , Xm+n ) = P (Xm+n = 1 | Xm = 1) P (Xm = 1) − P (Xm+n = 1) P (Xm = 1)
= a1 (m)p11 (n) − a1 (m)a1 (m + n),
and therefore,

ρ(Xm, Xm+n) = [a1(m) p11(n) − a1(m) a1(m + n)] / √(a1(m)(1 − a1(m)) · a1(m + n)(1 − a1(m + n)))

Now, a1(m) → α/(α + β) as m → ∞, and

p11(n) = α/(α + β) + (β/(α + β))(1 − α − β)^n
whence ρ(Xm, Xm+n) → (1 − α − β)^n as m → ∞. Finally,

lim_{n→∞} (1/n) Σ_{r=1}^{n} P(Xr = 1) = α/(α + β)

The process is strictly stationary if and only if X0 has the stationary distribution.
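The limit in part (a) can be sketched numerically with matrix powers (α, β, the initial distribution, and m are arbitrary choices; large m approximates the limit):

```python
import numpy as np

# Two-state chain: corr(X_m, X_{m+n}) -> (1 - alpha - beta)^n as m -> infinity.
alpha, beta = 0.3, 0.5
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

a0 = np.array([0.9, 0.1])     # arbitrary initial distribution
m, n = 200, 3
a_m = a0 @ np.linalg.matrix_power(P, m)        # law of X_m
a_mn = a0 @ np.linalg.matrix_power(P, m + n)   # law of X_{m+n}
p11 = np.linalg.matrix_power(P, n)[1, 1]       # P(X_{m+n}=1 | X_m=1)

cov = a_m[1] * p11 - a_m[1] * a_mn[1]
rho = cov / np.sqrt(a_m[1] * (1 - a_m[1]) * a_mn[1] * (1 - a_mn[1]))
print(rho, (1 - alpha - beta) ** n)
```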
10. Let {N(t) : t ≥ 0} be a Poisson process of intensity λ, and let T0 be an independent
random variable such that P(T0 = 1) = P(T0 = −1) = 1/2. Define T(t) = T0(−1)^N(t).
a. Show that {T (t) : t ≥ 0} is stationary.
b. Find ρ(T (s), T (s + t)).
c. Find the mean and variance of X(t) = ∫_{0}^{t} T(s) ds.
Solution:
a. We have that E(T(t)) = 0 and var(T(t)) = var(T0) = 1; together with part (b), which shows the covariance depends only on the time lag, {T(t) : t ≥ 0} is stationary.

b. ρ(T(s), T(s + t)) = E(T(s) T(s + t)) = E[(−1)^{N(t+s)−N(s)}] = e^{−2λt}.
c. Evidently, E(X(t)) = 0, and

E(X(t)²) = ∫_{0}^{t} ∫_{0}^{t} E(T(u) T(v)) du dv
= 2 ∫∫_{0<u<v<t} E(T(u) T(v)) du dv = 2 ∫_{v=0}^{t} ∫_{u=0}^{v} e^{−2λ(v−u)} du dv
= (1/λ)[t − 1/(2λ) + (1/(2λ)) e^{−2λt}] ∼ t/λ  as t → ∞
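The key expectation in part (b), E[(−1)^M] = e^{−2λt} for M ~ Poisson(λt), can be checked by simulation (λ, t, and the sample size are arbitrary):

```python
import numpy as np

# N(s+t) - N(s) is Poisson(lam * t), so T(s) T(s+t) = (-1)^(N(s+t) - N(s)).
rng = np.random.default_rng(3)
lam, t = 1.0, 0.4
m = rng.poisson(lam * t, 500_000)
est = np.mean((-1.0) ** m)
print(est, np.exp(-2 * lam * t))
```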

11. Customers arrive at a desk according to a Poisson process of intensity λ. There is
one clerk, and the service times are independent and exponentially distributed with
parameter µ. At time 0 there is exactly one customer, currently in service. Show that
the probability that the next customer arrives before time t and finds the clerk busy is

(λ/(λ + µ))(1 − e^{−(λ+µ)t})

Solution:
The given event occurs if the time X to the next arrival is less than t, and also less
than the time Y of service of the customer present. Since X and Y are independent
with P(Y > x) = e^{−µx},

P(X ≤ t, X ≤ Y) = ∫_{0}^{t} λ e^{−λx} e^{−µx} dx = (λ/(λ + µ))(1 − e^{−(λ+µ)t})
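A Monte Carlo sketch of the same probability (λ, µ, t, and the sample size are arbitrary):

```python
import numpy as np

# X ~ Exp(lambda): time to next arrival; Y ~ Exp(mu): residual service time.
rng = np.random.default_rng(4)
lam, mu, t = 2.0, 3.0, 0.5
n = 1_000_000
X = rng.exponential(1 / lam, n)
Y = rng.exponential(1 / mu, n)

est = np.mean((X <= t) & (X <= Y))
exact = lam / (lam + mu) * (1 - np.exp(-(lam + mu) * t))
print(est, exact)
```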

12. Customers enter a shop at the instants of a Poisson process of rate 2. At the door,
two representatives separately demonstrate a new corkscrew. This typically occupies
the time of a customer and the representative for a period which is exponentially dis-
tributed with parameter 1, independently of arrivals and other demonstrators. If both
representatives are busy, customers pass directly into the shop. No customer passes a
free representative without being stopped, and all customers leave by another door. If
both representatives are free at time 0, show that the probability that both are busy at
time t is

2/5 − (2/3) e^{−2t} + (4/15) e^{−5t}
Solution:
By considering possible transitions during the interval (t, t + h), the probability pi (t)
that exactly i demonstrators are busy at time t satisfies:
p2 (t + h) = p1 (t)2h + p2 (t)(1 − 2h) + o(h),
p1 (t + h) = p0 (t)2h + p1 (t)(1 − h)(1 − 2h) + p2 (t)2h + o(h),
p0 (t + h) = p0 (t)(1 − 2h) + p1 (t)h + o(h).
Hence,
p′2 (t) = 2p1 (t) − 2p2 (t), p′1 (t) = 2p0 (t) − 3p1 (t) + 2p2 (t), p′0 (t) = −2p0 (t) + p1 (t),
and therefore (the coefficient matrix having eigenvalues 0, −2, and −5) p2(t) = a + b e^{−2t} + c e^{−5t} for some
constants a, b, c. By considering the values of p2 and its first two derivatives at t = 0,
the boundary conditions are found to be a + b + c = 0, −2b − 5c = 0, 4b + 25c = 4, and
the claim follows.
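The forward equations can be integrated numerically to confirm the closed form for p2(t) (a simple Euler sketch; the step size and horizon are arbitrary):

```python
import numpy as np

# Integrate p0' = -2 p0 + p1, p1' = 2 p0 - 3 p1 + 2 p2, p2' = 2 p1 - 2 p2
# from (p0, p1, p2)(0) = (1, 0, 0) and compare p2 with the claimed formula.
def p2_closed(t):
    return 2/5 - (2/3) * np.exp(-2 * t) + (4/15) * np.exp(-5 * t)

p = np.array([1.0, 0.0, 0.0])
h, T = 1e-4, 1.0
for _ in range(int(T / h)):           # forward Euler steps
    dp = np.array([-2 * p[0] + p[1],
                   2 * p[0] - 3 * p[1] + 2 * p[2],
                   2 * p[1] - 2 * p[2]])
    p = p + h * dp
print(p[2], p2_closed(T))
```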
13. Let {Xn : n ≥ 1} be independent identically distributed integer-valued random vari-
ables. Let Sn = Σ_{r=1}^{n} Xr with S0 = 0, Yn = Xn + Xn−1 with X0 = 0, and Zn = Σ_{r=0}^{n} Sr.
Which of the following constitute Markov chains?
a. Sn
b. Yn
c. Zn
d. The sequence of pairs (Sn , Zn )
Solution:

a. Since Sn+1 = Sn + Xn+1, where Xn+1 is independent of the past, S is a Markov chain.
b. We have that

P(Yn+1 = k | Yi = xi + xi−1 for 1 ≤ i ≤ n) = P(Yn+1 = k | Xn = xn)

by the Markov property of X. However, conditioning on Xn is not generally equivalent
to conditioning on Yn = Xn + Xn−1, so Y does not generally constitute a Markov chain.
c. Zn = nX1 + (n − 1)X2 + · · · + Xn, so Zn+1 = Zn + Sn + Xn+1 = 2Zn − Zn−1 + Xn+1.
Since this depends on Zn−1 as well as Zn, Z cannot be Markovian.
d. Since Sn+1 = Sn + Xn+1 and Zn+1 = Zn + Sn + Xn+1, where Xn+1 is independent of
X1, . . . , Xn, the pair (Sn, Zn) is a Markov chain.
14. Find the n-step transition probabilities pij(n) for the chain X having transition matrix

P = [  0    1/2   1/2  ]
    [ 1/3   1/4   5/12 ]
    [ 2/3   1/4   1/12 ]

Using it, derive the infinite-step transition probabilities.

Hint: use the eigendecomposition of P!


Solution:
|P − λI| = (λ − 1)(λ + 1/2)(λ + 1/6), so the eigenvalues are 1, −1/2, and −1/6. Tedious
computation yields the eigenvectors, and thus

P^n = (1/3) [ 1 1 1 ]  + (−1/2)^n (1/12) [   8  −4  −4 ]  + (−1/6)^n (1/4) [  0   0   0 ]
            [ 1 1 1 ]                    [   2  −1  −1 ]                   [ −2   3  −1 ]
            [ 1 1 1 ]                    [ −10   5   5 ]                   [  2  −3   1 ]

P^∞ = lim_{n→∞} P^n = (1/3) [ 1 1 1 ]
                            [ 1 1 1 ]
                            [ 1 1 1 ]
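The spectral formula can be verified against direct matrix powers (a small numpy check of the three terms above):

```python
import numpy as np

# Transition matrix and the three spectral terms from the solution above.
P = np.array([[0, 1/2, 1/2],
              [1/3, 1/4, 5/12],
              [2/3, 1/4, 1/12]])
J = np.ones((3, 3)) / 3
M1 = np.array([[8, -4, -4], [2, -1, -1], [-10, 5, 5]]) / 12
M2 = np.array([[0, 0, 0], [-2, 3, -1], [2, -3, 1]]) / 4

def Pn(n):
    return J + (-1/2)**n * M1 + (-1/6)**n * M2

ok = all(np.allclose(Pn(n), np.linalg.matrix_power(P, n)) for n in range(8))
print(ok)                   # the formula matches P^n
print(np.round(Pn(60), 6))  # near the limit, all entries approach 1/3
```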
