Two-step methods are secant-like techniques of the quasi-Newton type that,
unlike the classical methods, construct nonlinear alternatives to the
quantities used in the secant equation. They incorporate data from the two
most recent iterations to build a replacement for the secant equation, with
the aim of producing better Hessian approximations and, in turn, faster
convergence to the minimizer of the objective function. Numerical results
reported in several papers on the subject show that such methods yield
substantial savings in both iteration and function-evaluation counts.
Encouraged by this performance, we employ two-step methods in this paper to
develop a new Conjugate Gradient (CG) algorithm. CG methods are popular for
large-scale problems and in situations where memory is scarce. Our numerical
experiments with the new method are encouraging and open an avenue for
further investigation of such techniques and their merits in a multitude of
applications.
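For context, the classical quasi-Newton condition and its two-step counterpart can be written as follows. The two-step form shown is only a representative sketch in the spirit of the multi-step quasi-Newton literature: the scalar \delta_k stands for a method-dependent weighting of the older step and is an illustrative placeholder, not this paper's specific choice.

\[ B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k \]

\[ B_{k+1} r_k = w_k, \qquad r_k = s_k - \delta_k\, s_{k-1}, \quad w_k = y_k - \delta_k\, y_{k-1} \]

Setting \delta_k = 0 recovers the classical secant equation, while a nonzero \delta_k folds curvature information from the previous step into the condition imposed on B_{k+1}.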
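To make the CG setting concrete, the following is a minimal Python sketch of the classical nonlinear CG loop into which two-step information could be injected. It uses the well-known PR+ update (Polak-Ribiere with a nonnegativity restart) and a simple Armijo backtracking line search; the function name cg_minimize, the line-search constants, and the Rosenbrock test problem are illustrative assumptions, not the algorithm proposed in the paper. A two-step variant would replace the gradient difference y below with a vector assembled from the two most recent iterations, as in the relation above.

import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    # Illustrative sketch only, not the paper's method.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d.
        alpha, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Standard gradient difference; a two-step variant would build
        # this vector from the two most recent iterations instead.
        y = g_new - g
        beta = max(0.0, g_new.dot(y) / g.dot(g))   # PR+ formula
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:                    # restart on non-descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on the 2-D Rosenbrock function.
f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
                           200.0 * (x[1] - x[0]**2)])
print(cg_minimize(f, grad, np.array([-1.2, 1.0])))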