Eigen values and eigen vectors

Characteristic roots or Eigen values:


The roots of the characteristic equation |𝜆𝐼 − 𝐴| = 0 are called characteristic roots, characteristic values, eigen values, latent roots, or proper values of the matrix A.
Characteristic vectors or Eigen vectors:
A non-zero vector X is called a characteristic vector or eigen vector of a square matrix A if there exists a scalar 𝜆 (an eigen value of A) such that 𝐴𝑋 = 𝜆𝑋.
Properties of eigen vectors
1) The eigen vector X of a square matrix A is not unique: any non-zero scalar multiple of X is again an eigen vector for the same eigen value.
2) No two distinct eigen values correspond to the same eigen vector X of a square matrix A.
3) Different eigen vectors may correspond to the same eigen value 𝜆 of a square matrix A (properties 1 and 3 are illustrated in the sketch after this list).
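These properties can also be checked numerically. The following is a minimal sketch using NumPy; the matrix B and the vectors X, Y in it are illustrative choices made only for this sketch, not part of Example 1 below.

```python
import numpy as np

# Illustrative matrix with the repeated eigen value lam = 3 (chosen for this sketch)
B = np.array([[3.0, 0.0],
              [0.0, 3.0]])
lam = 3.0

X = np.array([1.0, 0.0])   # an eigen vector: B X = 3 X
Y = np.array([0.0, 1.0])   # a different eigen vector for the same eigen value

# Definition: B X = lam X
print(np.allclose(B @ X, lam * X))               # True

# Property 1: any non-zero scalar multiple of X is again an eigen vector
print(np.allclose(B @ (5 * X), lam * (5 * X)))   # True

# Property 3: different eigen vectors may share the same eigen value
print(np.allclose(B @ Y, lam * Y))               # True
```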
Example 1: Find the eigen values and eigen vectors of the matrix
A = \begin{bmatrix} 2 & 0 & 0 \\ -1 & 1 & 0 \\ 1 & -4 & 0 \end{bmatrix}
Solution: The characteristic equation is
|𝜆𝐼 − 𝐴| = 0
=> \begin{vmatrix} \lambda-2 & 0 & 0 \\ 1 & \lambda-1 & 0 \\ -1 & 4 & \lambda \end{vmatrix} = 0
=> (\lambda-2)(\lambda-1)\lambda = 0
∴ 𝜆 = 0, 1, 2 (eigen values)
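These roots can be cross-checked numerically; the sketch below uses NumPy's built-in eigvals routine on the same matrix A, as a check rather than as part of the hand computation.

```python
import numpy as np

A = np.array([[ 2.0,  0.0, 0.0],
              [-1.0,  1.0, 0.0],
              [ 1.0, -4.0, 0.0]])

# Numerical roots of the characteristic equation |lam*I - A| = 0
print(np.sort(np.linalg.eigvals(A)))   # eigen values 0, 1, 2 (up to rounding)
```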
Now by definition X = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} is an eigen vector of A corresponding to the eigen value 𝜆 if and only if X is a non-trivial solution of [𝜆𝐼 − 𝐴]𝑋 = 0, i.e.
\begin{bmatrix} \lambda-2 & 0 & 0 \\ 1 & \lambda-1 & 0 \\ -1 & 4 & \lambda \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} ------(1)
Case 1: Eigen vector corresponding to the eigen value 𝜆 = 0. Putting 𝜆 = 0 in (1), we get
\begin{bmatrix} -2 & 0 & 0 \\ 1 & -1 & 0 \\ -1 & 4 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
Forming the linear system, we have
-2x_1 = 0 ------(2)
x_1 - x_2 = 0 ------(3)
-x_1 + 4x_2 = 0 ------(4)
Solving these we get 𝑥1 = 0 and 𝑥2 = 0
Hence 𝑥3 is a free variable, say 𝑥3 = 𝑎.
Therefore, the eigen vectors of A corresponding to the eigen value 𝜆 = 0 are the non-zero vectors of the form
X_1 = \begin{bmatrix} 0 \\ 0 \\ a \end{bmatrix}
In particular, let a = 1; then X_1 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} is an eigen vector corresponding to the eigen value 𝜆 = 0.
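This case can be verified directly: with a = 1 we should have A X1 = 0 · X1. A minimal NumPy check:

```python
import numpy as np

A = np.array([[ 2.0,  0.0, 0.0],
              [-1.0,  1.0, 0.0],
              [ 1.0, -4.0, 0.0]])

X1 = np.array([0.0, 0.0, 1.0])   # the eigen vector found for lam = 0 (a = 1)

print(A @ X1)                          # [0. 0. 0.]
print(np.allclose(A @ X1, 0.0 * X1))   # True
```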
Case 2: Eigen vector corresponding to the eigen value 𝜆 = 1. Putting 𝜆 = 1 in (1), we get
\begin{bmatrix} -1 & 0 & 0 \\ 1 & 0 & 0 \\ -1 & 4 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
Performing i) R_{13}, ii) R_{32}(1), iii) R_1(-1), we get
\begin{bmatrix} 1 & -4 & -1 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
Now, the rank of the coefficient matrix and that of the augmented matrix are the same, i.e. 𝜌(𝐴) = 2 = 𝜌(𝐴_𝐻), so the system is consistent.
Since the system has 3 unknowns while the rank is 2, it has infinitely many solutions; the general solution contains n - r = 3 - 2 = 1 arbitrary constant.
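The rank statement can be confirmed numerically; the sketch below applies NumPy's matrix_rank (an SVD-based, tolerance-aware rank estimate) to the coefficient matrix 𝜆𝐼 − 𝐴 with 𝜆 = 1.

```python
import numpy as np

A = np.array([[ 2.0,  0.0, 0.0],
              [-1.0,  1.0, 0.0],
              [ 1.0, -4.0, 0.0]])
lam = 1.0

# Coefficient matrix of the homogeneous system (lam*I - A)X = 0
M = lam * np.eye(3) - A
r = np.linalg.matrix_rank(M)
print(r)        # 2
print(3 - r)    # 1 free variable, as stated above
```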

Solving these we get 𝑥1 = 0 and 4𝑥2 + 𝑥3 = 0


Hence 𝑥3 is a free variable, say 𝑥3 = 𝑎.
∴ x_2 = -\frac{x_3}{4} = -\frac{a}{4}

Thus X_2 = \begin{bmatrix} 0 \\ -a/4 \\ a \end{bmatrix} is an eigen vector corresponding to the eigen value 𝜆 = 1. In particular, let a = -4; then X_2 = \begin{bmatrix} 0 \\ 1 \\ -4 \end{bmatrix}.
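As in Case 1, the result can be checked by multiplying out A X2; the sketch below uses the particular choice a = -4 made above.

```python
import numpy as np

A = np.array([[ 2.0,  0.0, 0.0],
              [-1.0,  1.0, 0.0],
              [ 1.0, -4.0, 0.0]])

X2 = np.array([0.0, 1.0, -4.0])   # a = -4 in the general solution

print(A @ X2)                          # [ 0.  1. -4.]
print(np.allclose(A @ X2, 1.0 * X2))   # True
```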
Case 3: Eigen vector corresponding to the eigen value 𝜆 = 2. Putting 𝜆 = 2 in (1), we get
\begin{bmatrix} 0 & 0 & 0 \\ 1 & 1 & 0 \\ -1 & 4 & 2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
Performing i) R_{13}, ii) R_{21}(1), iii) R_1(-1), R_2(1/5), we get
\begin{bmatrix} 1 & -4 & -2 \\ 0 & 1 & 2/5 \\ 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
Obviously, the rank of the coefficient matrix and that of the augmented matrix are again the same, i.e. 𝜌(𝐴) = 2 = 𝜌(𝐴_𝐻), so the system is consistent.
Since the system has 3 unknowns while the rank is 2, it has infinitely many solutions; the general solution contains n - r = 3 - 2 = 1 arbitrary constant.

Hence 𝑥3 is a free variable, say 𝑥3 = 𝑎.


∴ x_2 = -\frac{2x_3}{5} = -\frac{2a}{5}
∴ x_1 = 4x_2 + 2x_3 = \frac{2a}{5}

Thus X_3 = \begin{bmatrix} 2a/5 \\ -2a/5 \\ a \end{bmatrix} is an eigen vector corresponding to the eigen value 𝜆 = 2. In particular, let a = 5; then X_3 = \begin{bmatrix} 2 \\ -2 \\ 5 \end{bmatrix}.
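Finally, X3 can be checked in the same way, and all three eigen pairs can be cross-checked at once with NumPy's eig routine (its eigen vectors come back normalised, so they are scalar multiples of X1, X2, X3 above).

```python
import numpy as np

A = np.array([[ 2.0,  0.0, 0.0],
              [-1.0,  1.0, 0.0],
              [ 1.0, -4.0, 0.0]])

X3 = np.array([2.0, -2.0, 5.0])   # a = 5 in the general solution

print(np.allclose(A @ X3, 2.0 * X3))   # True

# Cross-check: each column of V is a unit eigen vector for the eigen value in w
w, V = np.linalg.eig(A)
print(w)   # eigen values 0, 1, 2 (order may vary)
for k in range(3):
    print(np.allclose(A @ V[:, k], w[k] * V[:, k]))   # True for every k
```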
