Lecture4_SpectralDecomposition
Spectral decomposition
Eigenvalue problem
A question: for an arbitrary operator $\hat{A}$, will it always produce a vector different from the one it operates on? Or is it possible that
$$\hat{A}\,|a\rangle = a\,|a\rangle\,?$$
Here $|a\rangle$ is an eigenvector of the operator $\hat{A}$, and the scalar $a$ is called its eigenvalue. (Note the notation: the same letter is used to identify the eigenvalue and the eigenvector, the latter written inside the vector sign, i.e. the ket.)
In the matrix representation (expanding over some basis), let's rewrite this as
$$\sum_j A_{ij}\,a_j = a\,a_i, \qquad\text{i.e.}\qquad (A - a\,\mathbb{1})\,\vec a = 0.$$
This is a homogeneous system of linear equations. Non-trivial solutions are given by the condition $\det(A - a\,\mathbb{1}) = 0$, so this is your standard matrix eigenvalue problem.
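A concrete numerical sketch (NumPy; the 2x2 matrix below is an arbitrary illustrative choice, not one from the lecture): the eigenvalues returned by the solver are the roots of $\det(A - a\,\mathbb{1}) = 0$, and each eigenvector is a non-trivial solution of the homogeneous system.

import numpy as np

# Arbitrary illustrative 2x2 matrix (assumption, not from the lecture)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors

for a, v in zip(eigvals, eigvecs.T):
    # det(A - a*1) vanishes at every eigenvalue ...
    print(f"a = {a:+.3f},  det(A - a*1) = {np.linalg.det(A - a*np.eye(2)):+.2e}")
    # ... and the eigenvector is a non-trivial solution of (A - a*1) v = 0
    print("  (A - a*1) v =", (A - a*np.eye(2)) @ v)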
Eigenvalue problem
$\det(A - a\,\mathbb{1}) = 0$ is the characteristic equation: it is an algebraic equation for $a$, the left-hand side being a polynomial of degree $n$ in $a$.
The fundamental theorem of algebra tells us that this algebraic equation has precisely $n$ solutions (counted with multiplicity) when the coefficients are real or complex; thus, for a given operator in a given vector space of dimensionality $n$, there are $n$ eigenvalues and $n$ corresponding eigenvectors.
Definition: If some eigenvalues are the same, this is called degeneracy….
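A minimal sketch of degeneracy (NumPy; the diagonal matrix is an arbitrary illustrative choice): when an eigenvalue repeats, any linear combination of the corresponding eigenvectors is again an eigenvector with the same eigenvalue.

import numpy as np

B = np.diag([1.0, 1.0, 3.0])        # the eigenvalue 1 is doubly degenerate
vals, vecs = np.linalg.eigh(B)
print("eigenvalues:", vals)          # -> [1. 1. 3.]

# Any combination of the two degenerate eigenvectors still satisfies B w = 1*w
w = 0.6 * vecs[:, 0] + 0.8 * vecs[:, 1]
print(np.allclose(B @ w, 1.0 * w))   # -> True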
Note that the norm of an eigenvector is undefined: if $\hat{A}|a\rangle = a|a\rangle$, then also $\hat{A}\,(c|a\rangle) = a\,(c|a\rangle)$ for any scalar $c$.
For the dual vector space we can define a "dual extension" $\hat{A}^{*}$ of the operator $\hat{A}$, acting in the linear space of functionals on $V$:
$$(\hat{A}^{*}F)[\,\vec a\,] = F[\,\hat{A}\vec a\,].$$
Brackets here indicate the functional $F$ acting on the vector $\vec a$. The dual extension can be proven to be unique if $\hat{A}$ is "continuous" (we have not talked about the latter property). $\hat{A}^{*}$ operates in the space of the functionals. For spaces where $V^{*} = V$ (identified through the inner product), the dual extension is the adjoint operator $\hat{A}^{\dagger}$.
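A quick numerical check of the defining property of the adjoint (NumPy; the matrix and vectors are random illustrative choices): with the standard inner product on a finite-dimensional complex space the adjoint is the conjugate transpose, and $\langle y,\hat{A}x\rangle = \langle \hat{A}^{\dagger}y, x\rangle$.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
x = rng.normal(size=3) + 1j * rng.normal(size=3)
y = rng.normal(size=3) + 1j * rng.normal(size=3)

A_dag = A.conj().T                 # adjoint = conjugate transpose

lhs = np.vdot(y, A @ x)            # <y | A x>   (vdot conjugates its first argument)
rhs = np.vdot(A_dag @ y, x)        # <A_dag y | x>
print(np.isclose(lhs, rhs))        # -> True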
For every algebraic expression in the vector space $V$ there is a corresponding expression in the dual vector space. Together with the rule for the product above, the dual (adjoint, conjugated) expression has all scalars conjugated, all vectors replaced with dual vectors (kets with bras), and all operators replaced with adjoint operators in reversed order; for example,
$$\big(\lambda\,\hat{A}\hat{B}\,|x\rangle\big)^{\dagger} = \lambda^{*}\,\langle x|\,\hat{B}^{\dagger}\hat{A}^{\dagger}.$$
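A minimal numerical sketch of this conjugation rule (NumPy; the matrices and the scalar are arbitrary illustrative choices): scalars get conjugated and operators get adjointed in reversed order, $(\lambda\hat{A}\hat{B})^{\dagger} = \lambda^{*}\hat{B}^{\dagger}\hat{A}^{\dagger}$.

import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
lam = 2.0 - 0.5j

lhs = (lam * A @ B).conj().T                    # adjoint of the whole expression
rhs = np.conj(lam) * B.conj().T @ A.conj().T    # conjugated scalar, reversed adjoints
print(np.allclose(lhs, rhs))                    # -> True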
Generalized Eigenvalue problem
We will also need this notion, since for some interesting operators the eigenvalue problem does not have a solution (within the space of interest), even though we would really like it to.
For the dual space, the generalized eigenvalue problem for the operator $\hat{A}$ is defined by the equation
$$\langle a|\,\hat{A}\,|x\rangle = a\,\langle a|x\rangle$$
for every vector $|x\rangle$ from the space $V$. Here $\langle a|$ identifies a functional from the dual vector space; it is a generalized eigenvector, while $a$ is a generalized eigenvalue. Since this must be true for an arbitrary vector, $|x\rangle$ is often omitted, and the eigenvalue equation looks like
$$\langle a|\,\hat{A} = a\,\langle a|.$$
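A minimal sketch of the generalized (left) eigenvalue problem (NumPy; the matrix is an arbitrary illustrative choice): a bra $\langle a|$ satisfying $\langle a|\hat{A} = a\langle a|$ can be obtained from the right eigenvectors of the adjoint $\hat{A}^{\dagger}$.

import numpy as np

A = np.array([[0.0, 1.0],
              [2.0, 1.0]])          # non-Hermitian, arbitrary illustrative choice

mu, W = np.linalg.eig(A.conj().T)   # right eigenvectors of A^dag: A^dag w = mu w
for m, w in zip(mu, W.T):
    bra = w.conj()                  # the bra <a| as a row vector
    a = np.conj(m)                  # its generalized eigenvalue
    print(np.allclose(bra @ A, a * bra))   # <a| A = a <a|  -> True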
For a Hermitian operator ($\hat{A}^{\dagger} = \hat{A}$), comparing the eigenvalue equation with its dual gives $a = a^{*}$; thus $a$ is real. Note that if I add labels to the eigenvectors, $\hat{A}|a_i\rangle = a_i|a_i\rangle$, then this means that $\langle a_i|a_j\rangle = 0$ for $i \neq j$, if there are no degeneracies.
Theorem: For every Hermitian operator there exists at least one orthogonal basis formed by its eigenvectors; the matrix of the Hermitian operator in that basis is diagonal, with the eigenvalues on the diagonal. (This is also known as the spectral theorem.)
Proof: by construction; most of the difficulties are due to the degeneracies.
Note 1: “diagonalization” really means finding all eigenvalues….
Note 2: the matrix whose columns are the (orthonormalized) eigenvectors is unitary (check!) and is the transformation matrix to the basis where the Hermitian operator is diagonal.
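A minimal numerical illustration of the spectral theorem and Note 2 (NumPy; the Hermitian matrix is an arbitrary illustrative choice): the eigenvalues are real, the matrix $U$ built from orthonormal eigenvectors is unitary, and $U^{\dagger} H U$ is diagonal with the eigenvalues on the diagonal.

import numpy as np

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])   # H = H^dag, arbitrary illustrative choice

vals, U = np.linalg.eigh(H)         # columns of U: orthonormal eigenvectors
print("eigenvalues (real):", vals)
print("U is unitary:", np.allclose(U.conj().T @ U, np.eye(2)))
print("U^dag H U is diagonal:\n", np.round(U.conj().T @ H @ U, 10))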
Spectral decomposition
On that second theorem (the spectral theorem) and Note 2:
Recall projection operators and the resolution of the identity in terms of projectors onto some orthonormal basis:
$$\hat{P}_i = |e_i\rangle\langle e_i|, \qquad \hat{1} = \sum_i |e_i\rangle\langle e_i|.$$
• Notice that in Dirac notation this means very nice things: for an expression like $\langle y|\,\hat{A}\hat{B}\,|x\rangle$, when I act to the right first, I apply $\hat{B}$ and then $\hat{A}$. But if I act to the left (and have to use the conjugated operators), I automatically use $\hat{A}^{\dagger}$ and then $\hat{B}^{\dagger}$.
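A minimal numerical sketch tying this together (NumPy; reusing the same illustrative Hermitian matrix as above): the projectors onto the eigenvectors resolve the identity, and the operator is recovered as the spectral decomposition $\hat{H} = \sum_i a_i\,|a_i\rangle\langle a_i|$.

import numpy as np

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])   # same illustrative Hermitian matrix as above
vals, U = np.linalg.eigh(H)

# Projectors P_i = |a_i><a_i| built as outer products of the eigenvectors
projectors = [np.outer(U[:, i], U[:, i].conj()) for i in range(len(vals))]

print("sum_i P_i = 1:", np.allclose(sum(projectors), np.eye(2)))
print("sum_i a_i P_i = H:",
      np.allclose(sum(a * P for a, P in zip(vals, projectors)), H))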