CG -- The Method of Conjugate Directions

High-level idea

CG -- The Method of Steepest Descent often finds itself taking steps in the same direction as earlier steps. This suggests an idea for faster convergence:

Let’s pick a set of n orthogonal search directions d(0), d(1), ..., d(n−1). In each search direction, we'll take exactly one step, and that step will be just the right length to line up evenly with x. After n steps, we'll be done.


Pasted image 20221229225619.png

Update procedure

As in Steepest Descent, each step moves along the current search direction:

$$x_{(i+1)} = x_{(i)} + \alpha_{(i)} d_{(i)}, \qquad e_{(i+1)} = e_{(i)} + \alpha_{(i)} d_{(i)}$$

Compute \alpha

To avoid ever stepping in the direction d(i) again, the next error term e(i+1) should be orthogonal to d(i). So,

$$d_{(i)}^T e_{(i+1)} = d_{(i)}^T \left( e_{(i)} + \alpha_{(i)} d_{(i)} \right) = 0
\quad\Longrightarrow\quad
\alpha_{(i)} = -\frac{d_{(i)}^T e_{(i)}}{d_{(i)}^T d_{(i)}}$$

The catch: we cannot compute α(i) without knowing e(i), and if we knew e(i), the problem would already be solved.

A-orthogonal instead of orthogonal


Two vectors d(i) and d(j) are A-orthogonal, or conjugate, if

$$d_{(i)}^T A\, d_{(j)} = 0$$

Requiring e(i+1) to be A-orthogonal (rather than orthogonal) to d(i), the α from #Compute \alpha becomes

$$\alpha_{(i)} = -\frac{d_{(i)}^T A\, e_{(i)}}{d_{(i)}^T A\, d_{(i)}} = \frac{d_{(i)}^T r_{(i)}}{d_{(i)}^T A\, d_{(i)}}$$

Unlike the earlier expression, this one is computable, since r(i) = −A e(i) = b − A x(i).
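As a quick sanity check of this formula, here is a minimal Python sketch of a single exact line step (the 2×2 SPD system and the starting direction are hypothetical example values): after the step, the new residual is orthogonal to the direction just used.

```python
import numpy as np

# Hypothetical small SPD system A x = b
A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])

x = np.zeros(2)           # starting guess x_(0)
d = np.array([1.0, 0.0])  # an arbitrary search direction d_(0)

r = b - A @ x                  # residual r_(i) = b - A x_(i) = -A e_(i)
alpha = (d @ r) / (d @ A @ d)  # alpha_(i) = d^T r / (d^T A d)
x = x + alpha * d              # exact line step along d

# The new residual is orthogonal to d: we never need this direction again.
print(np.isclose((b - A @ x) @ d, 0.0))  # → True
```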

Prove we can compute x in n steps

Express the initial error in the basis of search directions, $e_{(0)} = \sum_{j=0}^{n-1} \delta_{(j)} d_{(j)}$. Left-multiplying by $d_{(k)}^T A$ and using A-orthogonality isolates each coefficient:

$$\delta_{(k)} = \frac{d_{(k)}^T A\, e_{(0)}}{d_{(k)}^T A\, d_{(k)}} = \frac{d_{(k)}^T A\, e_{(k)}}{d_{(k)}^T A\, d_{(k)}}$$

(the second equality holds because the first k steps only add multiples of d(0), ..., d(k−1), all A-orthogonal to d(k)). Comparing with the formula for α above, we conclude that α(i) = −δ(i): each step cancels exactly one component of the error,

$$e_{(i)} = \sum_{j=i}^{n-1} \delta_{(j)} d_{(j)},$$

so after n steps, e(n) = 0.
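The n-step claim is easy to check numerically. A sketch with a hypothetical random SPD system: the eigenvectors of a symmetric A are mutually orthogonal and therefore also A-orthogonal, so they serve as a ready-made set of conjugate directions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)     # random symmetric positive-definite A
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)  # exact solution, for comparison only

# Eigenvectors of a symmetric A are orthogonal, hence also A-orthogonal.
_, V = np.linalg.eigh(A)

x = np.zeros(n)
for i in range(n):
    d = V[:, i]
    r = b - A @ x
    x = x + ((d @ r) / (d @ A @ d)) * d  # one exact step per direction

print(np.allclose(x, x_star))  # → True: exact after n steps
```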

Construct d(i) using Gram-Schmidt Conjugation

CG -- Gram-Schmidt Conjugation

Properties when using the Method of Conjugate Directions

  1. The error term e(i) remains A-orthogonal to all the old search directions, since we never step in a previous direction again (also from equation 35): $d_{(j)}^T A\, e_{(i)} = 0$ for $j < i$.
  2. From 1, since r(i) = −A e(i), the residual remains orthogonal to all the old search directions: $d_{(j)}^T r_{(i)} = 0$ for $j < i$.
  3. From 2, because the search directions d(i) are constructed from the u vectors, the residual r(i) is also orthogonal to the previous u vectors: $u_{(j)}^T r_{(i)} = 0$ for $j < i$.
  4. From 2 and 3, we have $d_{(i)}^T r_{(i)} = u_{(i)}^T r_{(i)}$.
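Property 2 can also be checked numerically. A sketch under the same hypothetical setup as before (eigenvectors of a symmetric A as the conjugate directions): at every step, the current residual is tested against all previous search directions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)  # random SPD matrix
b = rng.standard_normal(n)

_, V = np.linalg.eigh(A)     # eigenvectors: one valid A-orthogonal set

x = np.zeros(n)
for i in range(n):
    r = b - A @ x
    # Property 2: r_(i) is orthogonal to every old direction d_(j), j < i.
    for j in range(i):
        assert abs(r @ V[:, j]) < 1e-8
    d = V[:, i]
    x = x + ((d @ r) / (d @ A @ d)) * d

print("all orthogonality checks passed")
```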

Pasted image 20230102165357.png

New update formulas

Pasted image 20230102165444.png