Applications of Mathematics, first online, pp. 1-20


A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Yongjin Kim, Yunchol Jong, Yong Kim

Received October 30, 2023.   Published online November 15, 2024.

Abstract:  Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require storing matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager, H. Zhang (2005) and Y. Dai, C. Kou (2013), respectively, and both perform more efficiently than the SSML-BFGS method itself. Motivated by this, C. Kou, Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method under which the sufficient descent condition holds. To further improve the modified SSML-BFGS method, in this paper we present an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation in place of the usual secant equation. The method is shown to be globally convergent under certain assumptions. Numerical results comparing the proposed method with methods based on the usual secant equations are reported.
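For orientation, the standard (one-step) secant condition that quasi-Newton updates such as BFGS impose on the Hessian approximation is the following; the multi-step approach of Ford and Moghrabi [7, 8] replaces the most recent step and gradient difference by combinations of several past iterates obtained via polynomial interpolation (this display shows only the classical condition, not the paper's specific multi-step variant):

```latex
B_{k+1} s_k = y_k, \qquad
s_k = x_{k+1} - x_k, \qquad
y_k = g_{k+1} - g_k,
```

where $B_{k+1}$ approximates the Hessian of the objective at $x_{k+1}$ and $g_k$ denotes the gradient at $x_k$.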
Keywords:  unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search
Classification MSC:  65K05, 90C06
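To illustrate the general structure of a nonlinear conjugate gradient iteration of the kind discussed in the abstract, the following is a minimal sketch: a Polak-Ribière-type update with restarts and a simple Armijo backtracking line search. It is an assumption-laden simplification for illustration only, not the paper's SSML-BFGS-based three-term method, which uses a different search direction and an improved Wolfe line search.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic nonlinear CG skeleton (illustrative only).

    Uses a Polak-Ribiere-type beta with non-negativity restart and
    Armijo backtracking; the paper's method differs in both respects.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying the Armijo condition.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Polak-Ribiere beta, truncated at zero (restart).
        beta = max(g_new.dot(y) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # Fall back to steepest descent if d is not a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic $f(x) = \tfrac{1}{2}x^{\top}Ax - b^{\top}x$ the iteration converges to the unique minimizer $A^{-1}b$, which gives a quick sanity check of the skeleton.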

PDF available at:  Institute of Mathematics CAS

References:
[1] Y. Chen, M. Cao, Y. Yang: A new accelerated conjugate gradient method for large-scale unconstrained optimization. J. Inequal. Appl. 2019 (2019), Article ID 300, 13 pages. DOI 10.1186/s13660-019-2238-9 | MR 4036512 | Zbl 1499.90216
[2] Y.-H. Dai, C.-X. Kou: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23 (2013), 296-320. DOI 10.1137/100813026 | MR 3033109 | Zbl 1266.49065
[3] Y.-H. Dai, L.-Z. Liao: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43 (2001), 87-101. DOI 10.1007/s002450010019 | MR 1804396 | Zbl 0973.65050
[4] E. D. Dolan, J. J. Moré: Benchmarking optimization software with performance profiles. Math. Program. 91 (2002), 201-213. DOI 10.1007/s101070100263 | MR 1875515 | Zbl 1049.90004
[5] X.-L. Dong, Z.-F. Dai, R. Ghanbari, X.-L. Li: An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Oper. Res. Soc. China 9 (2021), 411-425. DOI 10.1007/s40305-019-00257-w | MR 4263215 | Zbl 1488.49058
[6] X. Dong, D. Han, Z. Dai, L. Li, J. Zhu: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 179 (2018), 944-961. DOI 10.1007/s10957-018-1377-3 | MR 3869490 | Zbl 1402.90176
[7] J. A. Ford, I. A. Moghrabi: Alternative parameter choices for multi-step quasi-Newton methods. Optim. Methods Softw. 2 (1993), 357-370. DOI 10.1080/10556789308805550
[8] J. A. Ford, I. A. Moghrabi: Multi-step quasi-Newton methods for optimization. J. Comput. Appl. Math. 50 (1994), 305-323. DOI 10.1016/0377-0427(94)90309-3 | MR 1284271 | Zbl 0807.65062
[9] J. A. Ford, Y. Narushima, H. Yabe: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40 (2008), 191-216. DOI 10.1007/s10589-007-9087-z | MR 2390824 | Zbl 1181.90221
[10] W. W. Hager, H. Zhang: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005), 170-192. DOI 10.1137/030601880 | MR 2177774 | Zbl 1093.90085
[11] C. X. Kou, Y. H. Dai: A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165 (2015), 209-224. DOI 10.1007/s10957-014-0528-4 | MR 3327422 | Zbl 1319.49042
[12] D. Li, M. Fukushima: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129 (2001), 15-35. DOI 10.1016/S0377-0427(00)00540-9 | MR 1823208 | Zbl 0984.65055
[13] X. Li, J. Shi, X. Dong, J. Yu: A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization. J. Comput. Appl. Math. 350 (2019), 372-379. DOI 10.1016/j.cam.2018.10.035 | MR 3877400 | Zbl 1524.90295
[14] P. Mtagulwa, P. Kaelo: An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems. Appl. Numer. Math. 145 (2019), 111-120. DOI 10.1016/j.apnum.2019.06.003 | MR 3963704 | Zbl 1477.65095
[15] Y. Narushima, H. Yabe, J. A. Ford: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21 (2011), 212-230. DOI 10.1137/080743573 | MR 2765496 | Zbl 1250.90087
[16] J. Nocedal, S. J. Wright: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006). DOI 10.1007/978-0-387-40065-5 | MR 2244940 | Zbl 1104.65059
[17] A. Perry: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269 (1977), 16 pages. Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.
[18] D. F. Shanno: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15 (1978), 1247-1257. DOI 10.1137/0715085 | MR 0512697 | Zbl 0408.90071
[19] Z. Wei, G. Li, L. Qi: New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175 (2006), 1156-1188. DOI 10.1016/j.amc.2005.08.027 | MR 2220320 | Zbl 1100.65054
[20] H. Yabe, M. Takano: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28 (2004), 203-225. DOI 10.1023/B:COAP.0000026885.81997.88 | MR 2072533 | Zbl 1056.90130
[21] G. Yuan, Z. Wei, G. Li: A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255 (2014), 86-96. DOI 10.1016/j.cam.2013.04.032 | MR 3093406 | Zbl 1291.90315
[22] Y. Zheng, B. Zheng: Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems. J. Optim. Theory Appl. 175 (2017), 502-509. DOI 10.1007/s10957-017-1140-1 | MR 3717341 | Zbl 1380.65108
[23] Q. Zhou, D. Hang: Nonmonotone adaptive trust region method with line search based on new diagonal updating. Appl. Numer. Math. 91 (2015), 75-88. DOI 10.1016/j.apnum.2014.12.009 | MR 3312325 | Zbl 1310.65070
[24] W. Zhou, L. Zhang: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21 (2006), 707-714. DOI 10.1080/10556780500137041 | MR 2238653 | Zbl 1112.90096

Affiliations:   Yongjin Kim, Yunchol Jong (corresponding author), Yong Kim, Department of Mathematics, University of Sciences, Unjong District 355, 950003, Pyongyang, DPR Korea, e-mail: jyc1963@star-co.net.kp

