EC484 CLASSES: WEEK 2
KAMILA NOWAKOWICZ
Question 2, Problem Set 1
This question is concerned with set algebra. You may find it useful to draw a Venn diagram.
Note:
For any events A and B, we have:

    Pr(A) = Pr(A ∩ B) + Pr(A ∩ B^c)    (1)
Solution.
    Pr(B) = Pr(B ∩ C) + Pr(B ∩ C^c)    [by result (1)]
          = Pr(C) + Pr(B ∩ C^c)        [since C ⊆ B, so B ∩ C = C]
          ≥ Pr(C)                      [since Pr(·) ≥ 0]

Useful fact. Notice that if A ⇒ B then A ⊆ B. Hence we have the following result: A ⇒ B implies Pr(A) ≤ Pr(B).
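As a sanity check (my own illustration, not part of the problem set), a quick Monte Carlo can verify both identity (1) and the monotonicity result; the specific events B and C below are arbitrary choices:

```python
import numpy as np

# Draw a uniform variable and define nested events C ⊆ B (arbitrary choice),
# then check Pr(B) = Pr(B ∩ C) + Pr(B ∩ C^c) and Pr(B) ≥ Pr(C) numerically.
rng = np.random.default_rng(0)
u = rng.uniform(size=1_000_000)

B = u < 0.5          # event B = {U < 0.5}
C = u < 0.2          # event C = {U < 0.2}, so C ⊆ B

pr_B = B.mean()
pr_C = C.mean()
split = (B & C).mean() + (B & ~C).mean()   # Pr(B ∩ C) + Pr(B ∩ C^c)

print(pr_B, split, pr_C)
```

The split is exact here because B ∩ C and B ∩ C^c partition the sampled occurrences of B.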
Question 3, Problem Set 1
In this question we practice using Slutzky’s Theorem and its Corollaries. It’s a very useful
theorem which we will use a lot in this course.
Slutzky's Theorem:
Let X_n, X be k × 1 vectors. Assume that X_n →^p X, and let g be a continuous function on the domain of X. Then:

    g(X_n) →^p g(X).
Solution.
These solutions are adapted from solutions by Chen Qiu, which were based on Prof Hidalgo’s notes and
solutions for EC484. Their aim is to fill some gaps between notes and exercises. Prof Hidalgo’s notes should
always be the final reference. If you spot any mistakes in this file please contact me: [Link]@[Link].
(i). We are given X_n →^p c for some constant c and Y_n − X_n →^p 0. We want to show that Y_n →^p c.

Let x_n = (X_n, Y_n − X_n)' and x = (c, 0)', so that x_n →^p x.

Define g((a, b)') = a + b (notice that g is continuous). Then by Slutzky's Theorem:

    Y_n = X_n + (Y_n − X_n)
        = g((X_n, Y_n − X_n)')
        = g(x_n)
        →^p g(x)
        = g((c, 0)')
        = c + 0
        = c
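The argument in (i) can be illustrated numerically (a sketch under my own choice of sequences: X_n = c + Z/√n and Y_n − X_n = W/n, which converge as required):

```python
import numpy as np

# X_n -> c in probability and Y_n - X_n -> 0, so Y_n should concentrate at c.
rng = np.random.default_rng(1)
c = 2.0

def sample_Yn(n, reps=100_000):
    Xn = c + rng.standard_normal(reps) / np.sqrt(n)   # X_n ->p c
    Yn = Xn + rng.standard_normal(reps) / n           # Y_n - X_n ->p 0
    return Yn

for n in (10, 1_000, 100_000):
    Yn = sample_Yn(n)
    # the fraction of draws farther than 0.1 from c shrinks with n
    print(n, np.mean(np.abs(Yn - c) > 0.1))
```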
(ii). First, note that a_n → a implies a_n →^p a as well (why?). Then, since a_n X_n is continuous in both a_n and X_n, the conclusion follows by Slutzky's Theorem.
(iii). For X_n →^p 0 we want to show that sin(X_n)/X_n →^p 1. Unfortunately, the function f(x) = sin(x)/x is not defined (hence also not continuous) at x = 0.

[Figure: plot of sin(x)/x.]

However, as x → 0 the value of the function approaches 1. By L'Hopital's Rule:

    lim_{x→0} sin(x)/x = cos(0)/1 = 1/1 = 1.

We can define a continuous version of f as:

    g(x) = { sin(x)/x   if x ≠ 0
           { 1          if x = 0

and use it in combination with Slutzky's Theorem:

    sin(X_n)/X_n = g(X_n) →^p g(0) = 1.
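The continuity fix in (iii) can also be checked by simulation (my own sketch, taking X_n = Z/n →^p 0; note that numpy's `np.sinc(x)` computes sin(πx)/(πx), so `np.sinc(x/np.pi)` is exactly the continuous version g, including g(0) = 1):

```python
import numpy as np

# With X_n = Z/n -> 0 in probability, g(X_n) = sin(X_n)/X_n should
# collapse onto g(0) = 1 as n grows.
rng = np.random.default_rng(2)

def ratio(n, reps=100_000):
    Xn = rng.standard_normal(reps) / n
    return np.sinc(Xn / np.pi)   # g(x): sin(x)/x for x != 0, and 1 at x = 0

for n in (1, 10, 1_000):
    r = ratio(n)
    print(n, np.mean(np.abs(r - 1.0) > 0.01))
```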
Question 4, Problem Set 1
This question is a good exercise to get you started in basic regression manipulation in matrix
form. All you need is Slutzky’s Theorem, matrix algebra, and conditions provided in the
question.
Solution. Notice 1/t →^p 0 ⟺ t →^p ∞. Hence, it suffices to show t →^p ∞ under H1: β_j > 0. Take a closer look at t:

    t = √n · β̂_j / √(σ̂² m̂(j)).

Since n → ∞ (hence also √n → ∞), all we need to show is β̂_j / √(σ̂² m̂(j)) →^p c for some positive constant c. By Slutzky's Theorem, it suffices to find the probability limit of each of the three elements β̂_j, σ̂², m̂(j) under H1: β_j > 0.
Step 1: Find the probability limit of m̂(j).
We are given that M̂ →^p M > 0 (M is positive definite, so also invertible), and the function g(A) = A^{-1} is continuous on the set of invertible matrices. By Slutzky's Theorem, M̂^{-1} →^p M^{-1}, and M^{-1} must also be positive definite¹. Since m̂(j) is the jth diagonal element of M̂^{-1}, m̂(j) →^p m(j), where m(j) is the jth diagonal element of the deterministic matrix M^{-1}.

Since M^{-1} is positive definite, for any x ∈ R^{dim(M)} \ {0} we have x'M^{-1}x > 0. In particular, for the vector e_j which has 1 in the jth position and zeros everywhere else,

    m(j) = e_j' M^{-1} e_j > 0,

so m(j) is well-defined.
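Step 1 can be checked in miniature (an illustration with an arbitrary positive definite matrix of my own construction):

```python
import numpy as np

# For a positive definite M, the inverse exists, is positive definite,
# and every diagonal entry e_j' M^{-1} e_j is strictly positive.
rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
M = A @ A.T + 5 * np.eye(5)        # positive definite by construction

M_inv = np.linalg.inv(M)
eigs = np.linalg.eigvalsh(M_inv)   # eigenvalues of M^{-1} are 1/lambda_i

print(np.diag(M_inv))              # the diagonal elements m_(j)
print(np.all(eigs > 0))            # M^{-1} is positive definite
```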
Step 2: Find the probability limit of β̂_j.
It is convenient to first find the probability limit of β̂ using matrix form. Let

    y = Zβ + U,
¹ One way to see this is to notice that M is positive definite iff all eigenvalues λ_i of M are strictly positive. The eigenvalues of M^{-1} are given by 1/λ_i, hence they are also strictly positive.
then

    β̂ = (Z'Z)^{-1} (Z'y)
       = β + (Z'Z)^{-1} (Z'U)         [plug in the true value of y]
       = β + (Z'Z/n)^{-1} (Z'U/n)
       = β + M̂^{-1} (Z'U/n),   where M̂^{-1} →^p M^{-1} and Z'U/n →^p 0,
       →^p β + M^{-1} · 0             [by Slutzky's Theorem]
       = β

We can conclude that also β̂_j →^p β_j > 0 under H1.
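Step 2 can be illustrated with a small simulation (a toy data-generating process of my own choosing, with y = Zβ + U and i.i.d. standard normal regressors and errors):

```python
import numpy as np

# OLS consistency: beta_hat = (Z'Z)^{-1} Z'y ->p beta as n grows.
rng = np.random.default_rng(4)
beta = np.array([1.0, 0.5])        # true coefficients (each beta_j > 0)

def beta_hat(n):
    Z = rng.standard_normal((n, 2))
    U = rng.standard_normal(n)
    y = Z @ beta + U
    return np.linalg.solve(Z.T @ Z, Z.T @ y)   # (Z'Z)^{-1} (Z'y)

for n in (100, 10_000, 1_000_000):
    print(n, beta_hat(n))
```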
Step 3: Find the probability limit of σ̂².
Define the residual û_i = y_i − β̂'z_i and Û = (û_1, ..., û_i, ..., û_n)' = y − Zβ̂. Then σ̂² can be written as

    σ̂² = (1/(n−k)) Σ_{i=1}^{n} (y_i − β̂'z_i)²
        = (1/(n−k)) Σ_{i=1}^{n} û_i²
        = (1/(n−k)) Û'Û
But:

    (1/(n−k)) Û'Û = (1/(n−k)) (y − Zβ̂)'(y − Zβ̂)
                  = (1/(n−k)) [U + Z(β − β̂)]'[U + Z(β − β̂)]    [plug in the true value of y]
                  = (1/(n−k)) [U'U + 2(β − β̂)'Z'U + (β − β̂)'Z'Z(β − β̂)]
                  = (n/(n−k)) [U'U/n + 2(β − β̂)'(Z'U/n) + (β − β̂)'(Z'Z/n)(β − β̂)]
Since

    n/(n−k) → 1;
    U'U/n →^p σ² > 0;
    β − β̂ →^p 0          [by Step 2];
    Z'U/n →^p 0;
    Z'Z/n →^p M > 0,

by Slutzky's Theorem, we have

    (1/(n−k)) Û'Û →^p 1 · (σ² + 2 · 0' · 0 + 0' · M · 0)
                  = σ²

To conclude, σ̂² →^p σ².
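Similarly for Step 3 (same kind of toy setup as before, with my own choice of σ² = 2):

```python
import numpy as np

# sigma2_hat = U_hat'U_hat / (n - k) should converge to the true sigma^2.
rng = np.random.default_rng(5)
beta, sigma2 = np.array([1.0, 0.5]), 2.0

def sigma2_hat(n, k=2):
    Z = rng.standard_normal((n, k))
    U = np.sqrt(sigma2) * rng.standard_normal(n)
    y = Z @ beta + U
    b = np.linalg.solve(Z.T @ Z, Z.T @ y)
    resid = y - Z @ b                  # U_hat
    return resid @ resid / (n - k)

for n in (100, 10_000, 1_000_000):
    print(n, sigma2_hat(n))
```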
Step 4: Summing up:

    β̂_j / √(σ̂² m̂(j)) →^p β_j / √(σ² m(j)) = c.

Since β_j > 0 under H1, c is a positive constant. The conclusion follows by noting

    t = √n · β̂_j / √(σ̂² m̂(j)) →^p ∞ · c = ∞.
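The divergence of t under H1 is easy to see numerically (toy design of my own choosing; t should grow roughly like √n · c):

```python
import numpy as np

# Under H1 (beta_j > 0) the t statistic diverges as n grows.
rng = np.random.default_rng(6)
beta = np.array([1.0, 0.5])
j = 1                                   # test the second coefficient

def t_stat(n, k=2):
    Z = rng.standard_normal((n, k))
    U = rng.standard_normal(n)
    y = Z @ beta + U
    ZZ_inv = np.linalg.inv(Z.T @ Z)
    b = ZZ_inv @ Z.T @ y
    resid = y - Z @ b
    s2 = resid @ resid / (n - k)        # sigma2_hat
    m_j = (n * ZZ_inv)[j, j]            # jth diagonal of (Z'Z/n)^{-1}
    return np.sqrt(n) * b[j] / np.sqrt(s2 * m_j)

for n in (100, 10_000, 1_000_000):
    print(n, t_stat(n))
```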
Useful fact. Let M_Z = I − Z(Z'Z)^{-1}Z', the residual maker. It can also be shown that:

    σ̂² = (1/(n−k)) U' M_Z U.