2D Newton's Method: Rosenbrock Function

Figure 1. Contours of Rosenbrock function with iterates denoted by asterisks. The green asterisk is the starting model and the red one is the final model after 10 iterations.

Objective: Find the global minimum of the Rosenbrock function using the 2D version of Newton's method.

Introduction: Use an iterative Newton method to find the minimizer of the Rosenbrock function f = 100.*(x2 - x1.^2).^2 + (1 - x1).^2. The minimizer is the point (1,1).
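The listing of RosenbrockJ.m is not reproduced here; as an illustrative sketch of the iteration it performs (a Python translation under assumed structure, not the original MATLAB code), the 2D Newton update uses the analytic gradient and Hessian of the Rosenbrock function:

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function f = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    """Analytic gradient [df/dx1, df/dx2]."""
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def hess(x):
    """Analytic 2x2 Hessian of the Rosenbrock function."""
    return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

def newton(x0, niter=10):
    """Run niter Newton steps from x0; return iterates and residuals |f - 0|."""
    x = np.asarray(x0, dtype=float)
    path, resid = [x.copy()], [abs(rosenbrock(x))]
    for _ in range(niter):
        x = x - np.linalg.solve(hess(x), grad(x))  # Newton step: solve H dx = g
        path.append(x.copy())
        resid.append(abs(rosenbrock(x)))
    return np.array(path), np.array(resid)
```

For example, `path, resid = newton([0.5, 0.5])` gives the iterates for the contour plot and the residuals to plot on a semilog axis versus iteration number; once an iterate lands in the valley near (1,1) the residual should drop quadratically.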

Procedure:

  1. Load the program RosenbrockJ.m.
  2. Run RosenbrockJ.m for different starting points. Why does the convergence rate depend on the starting point? Plot the residual |f(X,Y) - 0| as a function of iteration number to display the convergence rate.
  3. Add a regularization term, such as a penalty function that penalizes solutions far from (0.8,0.8); that is, the penalty term is lambda*||(X,Y) - (0.8,0.8)||^2. Does this speed up the convergence rate?
  4. Try different values of lambda, and determine whether the convergence rate depends on the value of lambda.
  5. Start with a large positive value of lambda and gradually reduce it to zero as the iterations proceed. Does this increase the convergence rate? Why?
  6. Contour the iterative solutions for your choice of a quadratic objective function. How many minima are there? Are you guaranteed to converge to the global minimum here?
  7. Repeat the previous question, except use a quintic objective function.
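The regularized iteration in steps 3-5 amounts to adding the penalty's gradient and Hessian to the Newton system. A minimal Python sketch (not part of RosenbrockJ.m; the anchor point xp and the lambda schedule are parameters you choose, with (0.8,0.8) taken from step 3):

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function f = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    """Analytic gradient [df/dx1, df/dx2]."""
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def hess(x):
    """Analytic 2x2 Hessian of the Rosenbrock function."""
    return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

def penalized_newton(x0, xp, lambdas):
    """Newton on f(x) + lam*||x - xp||^2, one lam value per iteration.

    A constant sequence of lambdas corresponds to steps 3-4; a sequence
    decaying toward zero corresponds to the continuation of step 5.
    """
    x = np.asarray(x0, dtype=float)
    xp = np.asarray(xp, dtype=float)
    resid = [abs(rosenbrock(x))]
    for lam in lambdas:
        g = grad(x) + 2.0 * lam * (x - xp)   # penalty gradient: 2*lam*(x - xp)
        H = hess(x) + 2.0 * lam * np.eye(2)  # penalty Hessian: 2*lam*I
        x = x - np.linalg.solve(H, g)
        resid.append(abs(rosenbrock(x)))
    return x, np.array(resid)
```

For instance, `penalized_newton(x0, [0.8, 0.8], [1.0] * 10)` runs a fixed lambda, while a decaying schedule such as `5.0 * 0.5**np.arange(10)` tries step 5. One point worth noticing: with a fixed nonzero lambda the stationary point is biased away from (1,1) unless xp is the true minimizer, whereas shrinking lambda to zero removes that bias while the early, strongly convexified iterations keep the steps well behaved.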