
Lecture 4: Spectral Decomposition
Eigenvalue problem
A question: for an arbitrary operator $\hat{A}$, will it always produce a vector different from the one it operates on? Or is it possible that
$$\hat{A}\,|a\rangle = a\,|a\rangle\ ?$$
Here $|a\rangle$ is an eigenvector of the operator $\hat{A}$, and $a$ is called an eigenvalue. (Note the notation: the same letter is used to identify the eigenvalue and the eigenvector, the latter inside the ket sign.)
In the matrix representation, let's rewrite this as
$$A\,\vec{a} = a\,\vec{a},$$
and expand this equation in some basis $\{|i\rangle\}$:
$$\sum_j A_{ij}\,a_j = a\,a_i .$$
This is a system of linear equations. Non-trivial solutions are given by the condition $\det(A - a\,\mathbb{1}) = 0$, so this is your standard matrix eigenvalue equation.
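A minimal numerical sketch of this (not part of the original slides; the matrix is an arbitrary example): np.linalg.eig returns all eigenvalue/eigenvector pairs at once, and each pair satisfies $A\vec{v} = a\vec{v}$.

```python
import numpy as np

# An arbitrary example matrix (any square matrix works here).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig solves A v = a v; eigenvectors are the columns of `eigvecs`.
eigvals, eigvecs = np.linalg.eig(A)

for a, v in zip(eigvals, eigvecs.T):
    # Each pair satisfies the eigenvalue equation.
    print(a, np.allclose(A @ v, a * v))   # expected: True for every pair
```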
Eigenvalue problem
$\det(A - a\,\mathbb{1}) = 0$ is the characteristic equation; it is an algebraic equation for $a$, the left-hand side being a polynomial of degree $n$ in $a$.
The fundamental theorem of algebra tells us this algebraic equation has precisely $n$ solutions (counted with multiplicity) when the coefficients are real or complex; thus for a given operator in a given vector space of dimensionality $n$ there are $n$ eigenvalues and $n$ corresponding eigenvectors.
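As a sketch of this characteristic-equation viewpoint (my own illustration, not from the slides; the 3x3 matrix is an arbitrary example): numpy can produce the degree-n characteristic polynomial of a matrix, and its n roots reproduce the n eigenvalues.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# Coefficients of the characteristic polynomial det(a*1 - A), a polynomial of degree n in a.
coeffs = np.poly(A)

# Its n roots are exactly the n eigenvalues (fundamental theorem of algebra).
roots = np.sort(np.roots(coeffs).real)        # imaginary parts are numerical noise here
eigvals = np.sort(np.linalg.eigvals(A).real)
print(np.allclose(roots, eigvals))            # expected: True
```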
Definition: If some eigenvalues coincide, this is called degeneracy.
Note that the norm of an eigenvector is undefined:
$$\hat{A}\,(\alpha|a\rangle) = \alpha\,\hat{A}|a\rangle = a\,(\alpha|a\rangle),$$
i.e. if $|a\rangle$ is an eigenvector, then any vector $\alpha|a\rangle$ is also an eigenvector.


Definition: The set of all eigenvalues of an operator is called its "spectrum".
Example: the rotation operator (not the best example, but it illustrates other points).
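A small sketch of that rotation example (my own, not from the slides): the 2D rotation matrix has no real eigenvectors, but over the complex numbers its eigenvalues are $e^{\pm i\theta}$, both of magnitude 1.

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals, eigvecs = np.linalg.eig(R)
print(eigvals)           # approximately exp(+i*theta) and exp(-i*theta)
print(np.abs(eigvals))   # both have magnitude 1
```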
Adjoint operators
Definition: The adjoint operator $\hat{A}^\dagger$ is defined through the inner product by $\langle \hat{A}^\dagger u \,|\, v\rangle = \langle u \,|\, \hat{A} v\rangle$.
In some basis, the RHS means $\sum_{ij} u_i^*\,A_{ij}\,v_j$.
The LHS, in the same basis, means $\sum_{ij} \big((A^\dagger)_{ji}\,u_i\big)^*\,v_j$; thus $(A^\dagger)_{ij} = A_{ji}^*$.
Thus the rules to operate to the left (to make the adjoint of an operator) in the matrix language are:
1. Transpose the matrix representing the operator.
2. Complex-conjugate the matrix.
Show that $(\hat{A}\hat{B})^\dagger = \hat{B}^\dagger\hat{A}^\dagger$ (exercise); a numerical check follows below.
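A numerical check of these rules and of the product exercise (my sketch, assuming the exercise is the reversed-order product rule; A, B, u, v are arbitrary complex examples): the adjoint is the conjugate transpose, it satisfies the defining inner-product relation, and the adjoint of a product reverses the order.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u = rng.normal(size=3) + 1j * rng.normal(size=3)
v = rng.normal(size=3) + 1j * rng.normal(size=3)

A_dag = A.conj().T   # rule 1 + rule 2: transpose, then complex-conjugate

# Defining property: <A^dag u | v> = <u | A v>  (np.vdot conjugates its first argument)
print(np.isclose(np.vdot(A_dag @ u, v), np.vdot(u, A @ v)))      # expected: True

# Product rule: (A B)^dag = B^dag A^dag
print(np.allclose((A @ B).conj().T, B.conj().T @ A.conj().T))    # expected: True
```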

For dual vector spaces we can define a "dual extension" $\hat{A}'$ of the operator $\hat{A}$ in the linear space $V^*$ of functionals: $(\hat{A}'F)[\,|A\rangle\,] = F[\,\hat{A}|A\rangle\,]$. Brackets here indicate the functional $F$ acting on the vector $|A\rangle$. The dual extension can be proven to be unique if $\hat{A}$ is "continuous" (we have not talked about the latter property). $\hat{A}'$ operates in the space of the functionals. For spaces where $V^* = V$, the dual extension is the adjoint operator.
For every algebraic expression in the vector space $V$, there is a corresponding expression in the dual vector space. Together with the product rule above, the dual (adjoint, conjugated) expression has all scalars conjugated, vectors replaced with dual vectors (kets with bras), and all operators replaced with adjoint operators in reversed order.
Generalized eigenvalue problem
We will also need this notion, since for some interesting operators the eigenvalue problem does not have a solution (within the space of interest), even though we would really like it to.
For the dual space, the generalized eigenvalue problem for the operator $\hat{A}$ is defined by the equation
$$\langle a|\,\hat{A}\,|V\rangle = a\,\langle a|V\rangle$$
for every vector $|V\rangle$ from the space $V$. Here $\langle a|$ identifies a functional from the dual vector space, and it is a generalized eigenvector, while $a$ is a generalized eigenvalue. Since this must be true for an arbitrary vector, the vector is often omitted, and the eigenvalue equation looks like
$$\langle a|\,\hat{A} = a\,\langle a| ,$$
which is almost indistinguishable from the standard eigenvalue problem. BE CAREFUL.
For "nice" vector spaces, when $V^* = V$, one gets by the rules of writing adjoint expressions:
$$\hat{A}^\dagger\,|a\rangle = a^*\,|a\rangle .$$
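A numerical sketch of eigenvectors that "act to the left" (my own illustration, for a finite-dimensional space where this coincides with left/row eigenvectors; the matrix is an arbitrary non-symmetric example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # non-symmetric, so left and right eigenvectors differ

# Right eigenvectors: A |a> = a |a>
w_right, vr = np.linalg.eig(A)

# Left eigenvectors <a| A = a <a| follow from the adjoint:
# A^dag |l> = a* |l>   is equivalent to   <l| A = a <l|
w_dag, vl = np.linalg.eig(A.conj().T)

for a, l in zip(w_dag.conj(), vl.T):
    bra = l.conj()                          # the row vector <l|
    print(np.allclose(bra @ A, a * bra))    # expected: True
```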
Unitary operators {maybe not here: I did not need it yet}
Definition: If $\hat{U}^\dagger\hat{U} = \hat{U}\hat{U}^\dagger = \hat{1}$, then such an operator is called a unitary operator ($\hat{U}^{-1} = \hat{U}^\dagger$).
Theorem: Eigenvalues of a unitary operator are complex numbers of magnitude 1, while eigenvectors belonging to different eigenvalues are orthogonal.
Proof: Let $|u_i\rangle$, $|u_j\rangle$ be eigenvectors of $\hat{U}$; then
$$\hat{U}\,|u_i\rangle = u_i\,|u_i\rangle .$$
The same thing with the adjoint operator:
$$\langle u_j|\,\hat{U}^\dagger = u_j^*\,\langle u_j| .$$
Scalar product of the two:
$$\langle u_j|\,\hat{U}^\dagger\hat{U}\,|u_i\rangle = u_j^*\,u_i\,\langle u_j|u_i\rangle = \langle u_j|u_i\rangle .$$
Thus if $i = j$, then $|u_i|^2 = 1$; and if $u_i \neq u_j$, then $\langle u_j|u_i\rangle = 0$.


Theorem: Unitary operators preserve the inner product.
They can be viewed as responsible for a change of basis (think rotation…).
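A quick numerical illustration of both theorems (my own sketch; the unitary here is just the 2D rotation viewed as a complex matrix):

```python
import numpy as np

theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)

print(np.allclose(U.conj().T @ U, np.eye(2)))      # U is unitary

w, v = np.linalg.eig(U)
print(np.abs(w))                                   # eigenvalues have magnitude 1
print(np.isclose(np.vdot(v[:, 0], v[:, 1]), 0))    # eigenvectors are orthogonal

# Unitaries preserve the inner product: <U x | U y> = <x | y>
rng = np.random.default_rng(1)
x = rng.normal(size=2) + 1j * rng.normal(size=2)
y = rng.normal(size=2) + 1j * rng.normal(size=2)
print(np.isclose(np.vdot(U @ x, U @ y), np.vdot(x, y)))   # expected: True
```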
Hermitian operators
Definition: If $\hat{A}^\dagger = \hat{A}$, then such an operator is called self-adjoint, or Hermitian (there is some difference in infinite dimensions).
Theorem: Eigenvalues of a Hermitian operator are real.
Proof: Let $\hat{A}|a\rangle = a|a\rangle$, with $a$ a non-degenerate eigenvalue. Then:
$$\langle a|\,\hat{A}\,|a\rangle = a\,\langle a|a\rangle .$$
The same thing with the adjoint operator:
$$\langle a|\,\hat{A}^\dagger\,|a\rangle = a^*\,\langle a|a\rangle .$$
Taking the difference, while keeping in mind that $\hat{A}^\dagger = \hat{A}$:
$$0 = (a - a^*)\,\langle a|a\rangle ,$$
thus $a$ is real. Note that if I add labels to the eigenvectors, the same argument shows that $\langle a_i|a_j\rangle = 0$ for $i \neq j$, if there are no degeneracies.
Theorem: For every Hermitian operator, there exists at least one orthogonal basis formed by its eigenvectors; the matrix of the Hermitian operator in that basis is diagonal, with the eigenvalues on the diagonal. (This is also known as the spectral theorem.)
Proof: by construction, most difficulties are due to the degeneracies.
Note 1: “diagonalization” really means finding all eigenvalues….
Note 2: the matrix formed by the eigenvectors is unitary (check) and is the transformation matrix to the basis where the Hermitian operator is diagonal.
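A numerical sketch of the spectral theorem and of Note 2 (my own illustration; numpy's eigh is specialized for Hermitian matrices, and the matrix here is an arbitrary Hermitian example):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (M + M.conj().T) / 2                  # build an arbitrary Hermitian matrix

w, V = np.linalg.eigh(H)                  # eigenvalues (real) and orthonormal eigenvectors

print(w)                                              # real eigenvalues
print(np.allclose(V.conj().T @ V, np.eye(4)))         # eigenvector matrix is unitary (Note 2)
print(np.allclose(V.conj().T @ H @ V, np.diag(w)))    # diagonal in the eigenbasis
```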
Spectral decomposition
On that second theorem and Note 2:
Recall projection operators and the resolution of the identity in terms of projectors onto some orthonormal basis:
$$\hat{1} = \sum_i |i\rangle\langle i| .$$
Let's consider the basis formed by the eigenvectors of a Hermitian operator $\hat{A}$. Then:
$$\hat{A} = \hat{A}\,\hat{1} = \sum_i \hat{A}\,|a_i\rangle\langle a_i| = \sum_i a_i\,|a_i\rangle\langle a_i| ,$$
since the operator is diagonal in that basis. A representation of this type is known as a spectral decomposition.
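The same statement numerically (my sketch, in the same style as the previous example): rebuilding a Hermitian matrix from its spectral decomposition $\sum_i a_i |a_i\rangle\langle a_i|$.

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
H = (M + M.conj().T) / 2

w, V = np.linalg.eigh(H)

# Spectral decomposition: H = sum_i a_i |a_i><a_i|, with |a_i><a_i| = outer(v_i, v_i*)
H_rebuilt = sum(a * np.outer(v, v.conj()) for a, v in zip(w, V.T))
print(np.allclose(H, H_rebuilt))   # expected: True
```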
Eigenvectors of commuting Hermitian operators
• Theorem: For two commuting Hermitian operators there exists at least one basis that diagonalizes them simultaneously.
• Proof: Let $\hat{A}$, $\hat{B}$ be Hermitian operators with $[\hat{A},\hat{B}] = 0$. Assume non-degeneracy, for a moment. Consider the vector $\hat{B}|a\rangle$, where $\hat{A}|a\rangle = a|a\rangle$. Due to commutativity, $\hat{A}\,\hat{B}|a\rangle = \hat{B}\,\hat{A}|a\rangle = a\,\hat{B}|a\rangle$; thus $\hat{B}|a\rangle$ is an eigenvector of $\hat{A}$, with eigenvalue $a$. But we assumed non-degeneracy, so the eigenvector is unique up to a scale; thus $\hat{B}|a\rangle = b\,|a\rangle$, meaning that $|a\rangle$ is an eigenvector of $\hat{B}$. The same is true for all eigenvectors… (a numerical sketch follows below).
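A numerical sketch of the theorem (my own construction: two Hermitian matrices are built with a shared eigenbasis, so they commute, and the eigenvectors of one diagonalize the other):

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
V, _ = np.linalg.qr(M)                                  # a random unitary matrix

A = V @ np.diag([1.0, 2.0, 3.0]) @ V.conj().T           # Hermitian, non-degenerate spectrum
B = V @ np.diag([5.0, -1.0, 0.5]) @ V.conj().T          # Hermitian, same eigenvectors

print(np.allclose(A @ B, B @ A))                        # [A, B] = 0

wA, U = np.linalg.eigh(A)
B_in_A_basis = U.conj().T @ B @ U
print(np.allclose(B_in_A_basis, np.diag(np.diag(B_in_A_basis))))   # B is diagonal in A's eigenbasis
```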
Exercises
• Demonstrate that:
• Verify that $(\hat{A}\hat{B})^\dagger = \hat{B}^\dagger\hat{A}^\dagger$ and show it explicitly through their matrix elements:
• Prove that: From the definition: ;
• :

• Notice that in Dirac notation, this means very nice things: in $\langle u|\,\hat{A}\hat{B}\,|v\rangle$, when I act to the right first, I apply $\hat{B}$ and then $\hat{A}$. But if I act to the left (I have to use the adjoint operators), I automatically use $\hat{A}^\dagger$ and then $\hat{B}^\dagger$ (see the sketch below).
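A numerical check of that last observation (my sketch; A, B, u, v are arbitrary complex examples): computing $\langle u|\hat{A}\hat{B}|v\rangle$ by acting to the right (B, then A) agrees with acting to the left via the adjoints (A†, then B†).

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
u = rng.normal(size=3) + 1j * rng.normal(size=3)
v = rng.normal(size=3) + 1j * rng.normal(size=3)

# Acting to the right: apply B first, then A.
right = np.vdot(u, A @ (B @ v))

# Acting to the left: the bra <u| A B corresponds to the ket (A B)^dag |u> = B^dag A^dag |u>,
# i.e. apply A^dag first, then B^dag.
left = np.vdot(B.conj().T @ (A.conj().T @ u), v)

print(np.isclose(right, left))   # expected: True
```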
