This example solves a nonlinear minimization problem with a tridiagonal Hessian matrix H(x), first by computing the Hessian explicitly and then by providing only the Hessian's sparsity structure to the finite-differencing routine.
The problem is to find x to minimize

f(x) = sum for i = 1 to n-1 of (x_i^2)^(x_(i+1)^2 + 1) + (x_(i+1)^2)^(x_i^2 + 1),

where n = 1000.
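As a rough illustration, the objective alone can be sketched in a few lines of MATLAB (the actual brownfgh file also returns the gradient and sparse Hessian; the name brownObjective here is hypothetical):

```matlab
function f = brownObjective(x)
% Objective only: sum of coupled power terms over adjacent pairs.
n = length(x);
i = 1:(n-1);
f = sum((x(i).^2).^(x(i+1).^2 + 1) + (x(i+1).^2).^(x(i).^2 + 1));
end
```

The pairwise coupling between x(i) and x(i+1) is what makes the Hessian tridiagonal.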
The file is lengthy, so it is not included here. View the code with the command

type brownfgh
Because brownfgh computes the gradient and Hessian values as well as the objective function, you need to use optimoptions to indicate that this information is available in brownfgh, using the GradObj and Hessian options.
n = 1000;
xstart = -ones(n,1);
xstart(2:2:n,1) = 1;
options = optimoptions(@fminunc,'GradObj','on','Hessian','on');
[x,fval,exitflag,output] = fminunc(@brownfgh,xstart,options);
This 1000-variable problem is solved in about 7 iterations and 7 conjugate gradient iterations with a positive exitflag indicating convergence. The final function value and measure of optimality at the solution x are both close to zero. For fminunc, the first-order optimality is the infinity norm of the gradient of the function, which is zero at a local minimum:
fval,exitflag,output

fval =
   2.8709e-17

exitflag =
     1

output =
         iterations: 7
          funcCount: 8
       cgiterations: 7
      firstorderopt: 4.7948e-10
          algorithm: 'large-scale: trust-region Newton'
            message: 'Local minimum found. Optimization completed because the size of the grad...'
    constrviolation:
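You can confirm the reported first-order optimality yourself; a sketch, assuming brownfgh returns the gradient as its second output:

```matlab
% Evaluate the objective and gradient at the computed solution x,
% then take the infinity norm of the gradient.
[f,g] = brownfgh(x);
norm(g,Inf)   % should match output.firstorderopt
```

At a local minimum the gradient vanishes, so this norm should be near zero, consistent with the firstorderopt value above.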