Lab: Preconditioned Conjugate Gradient Method to Minimize Rosenbrock Function
by Wei Dai (Oct. 8, 2011)
Objective:
- Learn to implement the preconditioned conjugate gradient method to solve linear and non-linear optimization problems.
Introduction:
The goal is to find the minimum of the Rosenbrock function f(x1,x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2, a non-linear optimization problem whose minimum lies at (x1, x2) = (1, 1).
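For reference, a minimal Matlab sketch of the function and its analytic gradient (which any gradient-based method such as conjugate gradient needs) is given below; the handle names rose and grose are illustrative, not necessarily those used in rose_pre.m.

    % Rosenbrock function and its analytic gradient (illustrative handle names)
    rose  = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    grose = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); ...
                   200*(x(2) - x(1)^2)];

    % Sanity check: at the minimum (1,1) the function is 0 and the gradient vanishes
    disp(rose([1; 1]))       % prints 0
    disp(grose([1; 1]))      % prints [0; 0]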
Platform:
Matlab.
Download:
Please download the file: rose_pre.m.
Exercises:
- Read the script and understand the algorithm.
Figure 1. Convergence path of the conjugate gradient method.
- Execute rose_pre.m and reproduce the figure above.
Figure 2. Convergence path of the preconditioned conjugate gradient method.
- Set the control parameter pre=1, execute rose_pre.m again, and reproduce the figure above (a sketch of such a preconditioned loop is given after this list).
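Since the contents of rose_pre.m are not reproduced in this handout, the following is only a minimal sketch of how a preconditioned non-linear conjugate gradient loop with a pre switch and a beta reset every other iteration might look. The starting point, the diagonal-of-Hessian preconditioner, the Fletcher-Reeves beta, and the backtracking line search are assumptions and may differ from the choices made in rose_pre.m.

    % Sketch: preconditioned non-linear CG for the Rosenbrock function
    rose  = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    grose = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); ...
                   200*(x(2) - x(1)^2)];

    pre   = 1;                 % 0 = plain CG, 1 = preconditioned CG
    x     = [-1.5; 1.0];       % assumed starting point
    g     = grose(x);
    niter = 200;
    tol   = 1e-8;

    for k = 1:niter
        % Diagonal-of-Hessian preconditioner (one simple, assumed choice)
        if pre
            M = [max(1200*x(1)^2 - 400*x(2) + 2, 1e-6); 200];
        else
            M = [1; 1];
        end
        z = g ./ M;            % preconditioned gradient M^{-1}*g

        % Reset beta to zero every other iteration (restart), as in the exercise
        if mod(k, 2) == 1
            beta = 0;
        else
            beta = (g' * z) / (gold' * zold);   % Fletcher-Reeves formula
        end
        if k == 1
            d = -z;
        else
            d = -z + beta * d;
        end

        % Simple backtracking (Armijo) line search along d
        alpha = 1;
        f0 = rose(x);
        while rose(x + alpha*d) > f0 + 1e-4 * alpha * (g' * d) && alpha > 1e-12
            alpha = alpha / 2;
        end

        x    = x + alpha * d;
        gold = g;
        zold = z;
        g    = grose(x);
        if norm(g) < tol
            break
        end
    end
    fprintf('stopped after %d iterations at x = (%g, %g)\n', k, x(1), x(2));

Setting pre=0 in this sketch removes the preconditioning, and fixing beta=0 at every iteration reduces it to steepest descent, which is the comparison asked for in the questions below.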
Questions:
- Examine the number of iterations required by the conjugate gradient method and by the preconditioned conjugate gradient method.
- Compare the result with the steepest descent method (the algorithm reduces to steepest descent when beta is fixed at zero).
- Read the script and understand why beta is reset to zero every other iteration. Try running the script without resetting beta.
- Solve the problem Lx = t by the preconditioned conjugate gradient method, where L = [4 6; 2 5; 0 3; 1 4] and t = [10 7 3 5]'. Since L is 4-by-2, the system is solved in the least-squares sense; because the problem is linear with two unknowns, the conjugate gradient method should converge within 2 iterations (see the sketch below).
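As a rough check on the last question, here is a minimal sketch of linear preconditioned CG applied to the normal equations A x = b with A = L'*L and b = L'*t. The Jacobi (diagonal) preconditioner and the variable names are assumptions; your own script may use a different preconditioner.

    % Sketch: preconditioned CG on the normal equations of Lx = t
    L = [4 6; 2 5; 0 3; 1 4];
    t = [10; 7; 3; 5];

    A = L' * L;            % 2x2 symmetric positive definite matrix
    b = L' * t;
    M = diag(diag(A));     % Jacobi (diagonal) preconditioner, an assumed choice

    x = zeros(2, 1);
    r = b - A * x;         % initial residual
    z = M \ r;             % preconditioned residual
    p = z;                 % initial search direction

    for k = 1:2            % at most 2 iterations for 2 unknowns (exact arithmetic)
        Ap    = A * p;
        alpha = (r' * z) / (p' * Ap);
        x     = x + alpha * p;
        rnew  = r - alpha * Ap;
        if norm(rnew) < 1e-12
            break
        end
        znew  = M \ rnew;
        beta  = (rnew' * znew) / (r' * z);
        p     = znew + beta * p;
        r     = rnew;
        z     = znew;
    end
    fprintf('x = (%g, %g) after %d iteration(s)\n', x(1), x(2), k);

The resulting least-squares solution can be checked against Matlab's backslash operator, L \ t.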