
Matrix Preconditioning Techniques

The document discusses preconditioning methods for solving linear systems of equations iteratively. Preconditioning transforms the original matrix A into an equivalent matrix A' with a smaller condition number, allowing iterative solvers to converge faster. Common preconditioning methods include left preconditioning, right preconditioning, and split preconditioning. The Jacobi/diagonal preconditioner uses the diagonal of A as the preconditioner P, which is effective for diagonally dominant matrices. Preconditioning is needed when iterative methods make little progress or the residual stagnates, such as for ill-conditioned or highly scaled matrices. Incomplete factorizations are often used as preconditioners when natural preconditioners are absent.


9/7/2020

Computational Science:
Computational Methods in Engineering

Preconditioning

Condition Number
The condition number of a matrix $A$ measures how sensitive the solution of a linear algebra problem is to small changes in $A$.

It can be computed from matrix norms and is defined as $\operatorname{cond}(A) = \|A\|\,\|A^{-1}\|$.

A large condition number means the system is ill conditioned and a small
condition number means the system is well conditioned.
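As a quick check, the condition number can be computed directly with NumPy; the two small matrices below are made up purely for illustration.

```python
# A minimal check of cond(A) = ||A|| * ||A^-1|| with NumPy (2-norm by default).
import numpy as np

A_good = np.array([[4.0, 1.0],
                   [1.0, 3.0]])      # strongly diagonal, well conditioned
A_bad  = np.array([[1.0, 1.0],
                   [1.0, 1.0001]])   # rows nearly parallel, ill conditioned

print(np.linalg.cond(A_good))        # small (close to 1)
print(np.linalg.cond(A_bad))         # large (about 4e4)
```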


Problem with ill‐Conditioned Matrices
The number of iterations needed for an iterative solver to find a solution to $Ax = b$ depends heavily on the condition number of the matrix $A$.
[Figure: number of iterations plotted against condition number; the iteration count grows as the condition number increases.]
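One way to see this trend is to count conjugate gradient iterations on two synthetic SPD matrices with different condition numbers; the matrices and sizes below are arbitrary illustrative choices.

```python
# Counting CG iterations on two SPD test matrices whose condition numbers
# differ by a factor of ~100 (matrices are synthetic, chosen only for illustration).
import numpy as np
from scipy.sparse.linalg import cg

def cg_iterations(A, b):
    iters = 0
    def count(xk):                      # called once per CG iteration
        nonlocal iters
        iters += 1
    cg(A, b, callback=count)
    return iters

n = 200
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal basis
b = rng.standard_normal(n)

# SPD matrices with prescribed eigenvalues, hence prescribed condition numbers
A_well = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T   # cond ~ 10
A_ill  = Q @ np.diag(np.linspace(1.0, 1e3, n)) @ Q.T    # cond ~ 1000

print(cg_iterations(A_well, b))   # converges in a handful of iterations
print(cg_iterations(A_ill, b))    # needs many more iterations
```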

Preconditioning
Suppose $Ax = b$ is to be solved iteratively, but $A$ has a high condition number.

A preconditioner $P$ is a nonsingular matrix chosen so that any of the following has a smaller condition number than $A$ alone:

$P^{-1}A$   (left preconditioned)

$AP^{-1}$   (right preconditioned)

$P^{-1}AP^{-T}$   (split preconditioned)

The ideal preconditioner transforms  𝐴 into an identity matrix and makes any iterative 
method converge in one iteration.
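A minimal sketch of the three forms, using an invented badly scaled test matrix and a simple diagonal choice of $P$; for the split form the square root of the diagonal is used, since $P$ is applied twice there. The last line illustrates the "ideal preconditioner" remark with $P = A$.

```python
# Comparing cond(A) with the left-, right-, and split-preconditioned operators.
# The test matrix and the diagonal choice of P are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100
A = np.diag(np.logspace(0, 4, n)) + 0.01 * rng.standard_normal((n, n))  # badly scaled

P_inv  = np.diag(1.0 / np.diag(A))            # inverse of P = diag(A)
Ps_inv = np.diag(1.0 / np.sqrt(np.diag(A)))   # inverse of the "square-root" diagonal

print(np.linalg.cond(A))                       # original: roughly 1e4
print(np.linalg.cond(P_inv @ A))               # left:  P^-1 A
print(np.linalg.cond(A @ P_inv))               # right: A P^-1
print(np.linalg.cond(Ps_inv @ A @ Ps_inv.T))   # split: P^-1 A P^-T
print(np.linalg.cond(np.linalg.inv(A) @ A))    # ideal P = A: identity, cond is essentially 1
```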


Right Preconditioning
Formulation

Original equation: $Ax = b$

Insert the preconditioner: $AP^{-1}Px = b$

Preconditioned equation: $A'y = b$, where $A' = AP^{-1}$ and $y = Px$

Implementation
1. Solve $A'y = b$ for $y$.
2. Calculate $x = P^{-1}y$.

Iterative Solvers
1. Least-squares (LSQR)
2. Transpose-free quasi-minimal residual (TFQMR)
3. Stabilized biconjugate gradients (BICGSTAB)
4. Stabilized biconjugate gradients (l) (BICGSTABL)
5. Conjugate gradient squared (CGS)
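Below is a sketch of the two-step right-preconditioned solve using BICGSTAB from SciPy, one of the solvers listed above; the test matrix and the diagonal choice of $P$ are assumptions made for illustration.

```python
# Two-step right-preconditioned solve with BICGSTAB.
import numpy as np
from scipy.sparse.linalg import LinearOperator, bicgstab

rng = np.random.default_rng(0)
n = 200
A = np.diag(np.logspace(0, 3, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

p = np.diag(A)                                   # diagonal preconditioner P

# A' = A P^-1, applied as an operator so A' is never formed explicitly
A_prime = LinearOperator((n, n), matvec=lambda v: A @ (v / p))

y, info = bicgstab(A_prime, b)                   # 1. solve A' y = b for y
x = y / p                                        # 2. recover x = P^-1 y

print(info, np.linalg.norm(A @ x - b))           # info == 0 means it converged
```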


Left Preconditioning
Formulation

Original equation: $Ax = b$

Insert the preconditioner: $P^{-1}Ax = P^{-1}b$

Preconditioned equation: $A'x = b'$, where $A' = P^{-1}A$ and $b' = P^{-1}b$

Implementation
1. Solve $A'x = b'$ for $x$.

Iterative Solvers
1. Generalized minimum residual (GMRES)
2. Quasi-minimal residual (QMR)
3. Biconjugate gradients (BICG)


Split Preconditioning
Formulation

Original equation: $Ax = b$

Insert the preconditioner: $P^{-1}AP^{-T}P^{T}x = P^{-1}b$

Preconditioned equation: $A'y = b'$, where $A' = P^{-1}AP^{-T}$, $b' = P^{-1}b$, and $y = P^{T}x$

Implementation
1. Solve $A'y = b'$ for $y$.
2. Calculate $x = P^{-T}y$.

Iterative Solvers
1. Preconditioned conjugate gradients (PCG)
2. Minimum residual (MINRES)
3. Symmetric LQ (SYMMLQ)
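A sketch of an explicit split-preconditioned CG solve on an assumed SPD test matrix; $P$ is taken as the square root of the diagonal so that $A' = P^{-1}AP^{-T}$ remains symmetric positive definite. In practice, preconditioned solvers such as PCG usually accept the preconditioner directly and perform this transformation internally.

```python
# Explicit split-preconditioned solve with CG on an assumed SPD test matrix.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n = 200
S = 0.01 * rng.standard_normal((n, n))
A = np.diag(np.logspace(0, 3, n)) + 0.5 * (S + S.T)   # SPD, badly scaled
b = rng.standard_normal(n)

p = np.sqrt(np.diag(A))                               # P = diag(sqrt(a_ii))

A_prime = LinearOperator((n, n), matvec=lambda v: (A @ (v / p)) / p)  # P^-1 A P^-T
b_prime = b / p                                       # b' = P^-1 b

y, info = cg(A_prime, b_prime)                        # 1. solve A' y = b' for y
x = y / p                                             # 2. recover x = P^-T y

print(info, np.linalg.norm(A @ x - b))                # info == 0 means it converged
```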


Jacobi (or Diagonal) Preconditioner
Perhaps the simplest preconditioner is the Jacobi preconditioner.  In this case, the 
preconditioner  𝑃 is simply the diagonal of the matrix  𝐴 .
$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{bmatrix} \quad\Longrightarrow\quad P = \operatorname{diag}(A) = \begin{bmatrix} a_{11} & & & \\ & a_{22} & & \\ & & \ddots & \\ & & & a_{NN} \end{bmatrix}$$

Jacobi preconditioning is particularly effective for diagonally dominant matrices.
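A minimal sketch of building the Jacobi preconditioner for a test matrix with a strongly dominant diagonal (the matrix itself is an arbitrary example); the main appeal is that $P = \operatorname{diag}(A)$ is trivial to invert.

```python
# Building the Jacobi (diagonal) preconditioner P = diag(A).
import numpy as np

rng = np.random.default_rng(0)
n = 100
A = np.diag(np.logspace(0, 3, n)) + 0.005 * rng.standard_normal((n, n))

P     = np.diag(np.diag(A))          # keep only the diagonal of A
P_inv = np.diag(1.0 / np.diag(A))    # inverting a diagonal matrix costs O(N)

print(np.linalg.cond(A))             # before preconditioning
print(np.linalg.cond(P_inv @ A))     # after Jacobi (left) preconditioning
```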


Notes About Preconditioning


• The preconditioner  𝑃 is chosen to accelerate the convergence of an 
iterative method. 
• Preconditioning is needed when little progress is made between 
iterations or the residual error of an iterative solution stagnates.
• When the entries of $A$ differ greatly in magnitude (e.g., by two to
three orders of magnitude or more), $A$ will typically require a
preconditioner.
• Incomplete factorizations, such as the incomplete LU factorization or
the incomplete Cholesky factorization, are often used as
preconditioners when natural preconditioners are absent (a short
sketch follows below).
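As a sketch of the last point, SciPy's spilu computes an incomplete LU factorization that can be wrapped as a preconditioner for GMRES; the sparse tridiagonal test system below is an arbitrary example.

```python
# Incomplete LU (scipy.sparse.linalg.spilu) used as a preconditioner for GMRES.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, spilu

rng = np.random.default_rng(0)
n = 400
main = 2.0 + rng.random(n)                      # perturbed main diagonal
A = sp.diags([main, -np.ones(n - 1), -np.ones(n - 1)], [0, -1, 1], format="csc")
b = rng.standard_normal(n)

ilu = spilu(A, drop_tol=1e-4)                   # incomplete LU factors of A
M = LinearOperator((n, n), matvec=ilu.solve)    # applies the approximate inverse of A

x, info = gmres(A, b, M=M)                      # GMRES with the ILU preconditioner
print(info, np.linalg.norm(A @ x - b))          # info == 0 means it converged
```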

