2D Newton's Method: Rosenbrock Function
Figure 1. Contours of the Rosenbrock function with iterates denoted by asterisks.
The green asterisk is the starting model and the red one is the final model after
10 iterations.
Objective:
Find the global minimum of the Rosenbrock function using the 2D version
of Newton's method.
Introduction:
Use the iterative Newton method to find the minimizer of the Rosenbrock
function f = 100.*(x2-x1.^2).^2 + (1-x1).^2 (MATLAB notation).
The minimizer is the point (x1, x2) = (1, 1).
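As a concrete illustration, the Newton iteration can be sketched in Python/NumPy (RosenbrockJ.m itself is a MATLAB program; the function names and the 10-iteration default below are assumptions for illustration, not its actual contents):

```python
import numpy as np

def rosenbrock(v):
    """f = 100*(x2 - x1^2)^2 + (1 - x1)^2, minimized at (1, 1)."""
    x1, x2 = v
    return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

def gradient(v):
    """Analytic gradient of the Rosenbrock function."""
    x1, x2 = v
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def hessian(v):
    """Analytic Hessian of the Rosenbrock function."""
    x1, x2 = v
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

def newton(v0, niter=10):
    """Full-step Newton iteration: v <- v - H(v)^-1 g(v)."""
    v = np.asarray(v0, dtype=float)
    for _ in range(niter):
        v = v - np.linalg.solve(hessian(v), gradient(v))
    return v
```

Starting from the classic point (-1.2, 1), the iterates follow the curved valley toward (1, 1), the only stationary point of the function.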
Procedure:
- Load program RosenbrockJ.m.
- Run the program RosenbrockJ.m for different starting points. Why does
the convergence rate depend on the starting point? Plot the
residual |f(X,Y) - 0| as a function of iteration number
to display the convergence rate.
- Add a regularization term such as a penalty function
that penalizes solutions far from the prior point (0.8, 0.8). That is,
the penalty term is lambda ||(X,Y) - (0.8, 0.8)||^2. Does this speed up
the convergence rate?
- Try different values of lambda, and determine whether the convergence
rate depends on the value of lambda.
- Start with a large positive value of lambda and gradually reduce it to
zero as the iterations proceed. Does this help increase the convergence rate?
Why?
- Contour the iterative solutions for your choice of a quadratic objective
function. How many minima are there? Are you guaranteed to converge
to the global minimum here?
- Same as the previous question, except use a quintic objective function.
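The residual-plotting step can be sketched as follows (a Python/NumPy stand-in for the MATLAB program; `residual_history` is a hypothetical helper, not part of RosenbrockJ.m):

```python
import numpy as np

def rosenbrock(v):
    x1, x2 = v
    return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

def gradient(v):
    x1, x2 = v
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def hessian(v):
    x1, x2 = v
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

def residual_history(v0, niter=10):
    """Record the residual |f(X,Y) - 0| at each Newton iterate."""
    v = np.asarray(v0, dtype=float)
    hist = [abs(rosenbrock(v))]
    for _ in range(niter):
        v = v - np.linalg.solve(hessian(v), gradient(v))
        hist.append(abs(rosenbrock(v)))
    return np.array(hist)

# A semilog plot makes the convergence rate visible, e.g.:
#   import matplotlib.pyplot as plt
#   plt.semilogy(residual_history([-1.2, 1.0])); plt.show()
```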
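For the penalty step, adding lambda ||(X,Y) - (0.8, 0.8)||^2 to the objective adds 2*lambda*((X,Y) - (0.8, 0.8)) to the gradient and 2*lambda*I to the Hessian. A sketch under the same assumptions as above (derivatives repeated so the snippet is self-contained):

```python
import numpy as np

def gradient(v):
    x1, x2 = v
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def hessian(v):
    x1, x2 = v
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

def regularized_newton(v0, prior, lam, niter=10):
    """Newton on f(v) + lam*||v - prior||^2.  The penalty contributes
    2*lam*(v - prior) to the gradient and 2*lam*I to the Hessian, so a
    large lam also makes the Hessian better conditioned."""
    v = np.asarray(v0, dtype=float)
    p = np.asarray(prior, dtype=float)
    for _ in range(niter):
        g = gradient(v) + 2.0 * lam * (v - p)
        H = hessian(v) + 2.0 * lam * np.eye(2)
        v = v - np.linalg.solve(H, g)
    return v
```

Note that for lam > 0 the minimizer of the penalized objective is pulled away from (1, 1) toward the prior, so any speedup trades against a biased answer.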
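The shrinking-lambda step is a continuation (homotopy) scheme: the heavily penalized problem is nearly convex and easy at first, and the penalty fades so the final iterations solve the original problem. A sketch, again with assumed names; the shrink factor 0.5 is an arbitrary choice:

```python
import numpy as np

def gradient(v):
    x1, x2 = v
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def hessian(v):
    x1, x2 = v
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

def continuation_newton(v0, prior, lam0=100.0, shrink=0.5, niter=20):
    """Newton with a penalty weight that starts at lam0 and is halved each
    iteration, so lam -> 0 and the iteration ends on the unpenalized problem."""
    v = np.asarray(v0, dtype=float)
    p = np.asarray(prior, dtype=float)
    lam = lam0
    for _ in range(niter):
        g = gradient(v) + 2.0 * lam * (v - p)
        H = hessian(v) + 2.0 * lam * np.eye(2)
        v = v - np.linalg.solve(H, g)
        lam *= shrink
    return v
```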
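For the quadratic case, Newton's method lands on the unique stationary point in a single step, so a positive-definite quadratic guarantees convergence to its one (global) minimum. A quintic behaves differently; the hypothetical separable example below, f(x, y) = g(x) + g(y) with g'(t) = (t^2 - 1)(t^2 - 4), has local minima at t = -1 and t = 2 in each coordinate, i.e. four local minima in 2D:

```python
import numpy as np

def f(v):
    """Separable quintic objective g(x) + g(y),
    with g(t) = t^5/5 - 5t^3/3 + 4t so that g'(t) = (t^2 - 1)(t^2 - 4)."""
    def g(t):
        return t**5 / 5.0 - 5.0 * t**3 / 3.0 + 4.0 * t
    return g(v[0]) + g(v[1])

def gradient(v):
    def gp(t):
        return (t**2 - 1.0) * (t**2 - 4.0)
    return np.array([gp(v[0]), gp(v[1])])

def hessian(v):
    def gpp(t):
        return 4.0 * t**3 - 10.0 * t
    return np.diag([gpp(v[0]), gpp(v[1])])

def newton(v0, niter=30):
    """Full-step Newton; converges to a stationary point near the start."""
    v = np.asarray(v0, dtype=float)
    for _ in range(niter):
        v = v - np.linalg.solve(hessian(v), gradient(v))
    return v
```

Starts near (2, 2) converge to the local minimum at (2, 2), while starts near (-1, -1) converge to (-1, -1), which here has the lower objective value; so with a quintic there is no guarantee of reaching the global minimum.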