A new modified conjugate gradient method under the strong Wolfe line search for solving unconstrained optimization problems
dc.citation.epage | 118 | |
dc.citation.issue | 1 | |
dc.citation.spage | 111 | |
dc.contributor.affiliation | Університет Путра Малайзія | |
dc.contributor.affiliation | Universiti Putra Malaysia | |
dc.contributor.author | Ішак, М. І. | |
dc.contributor.author | Маржуги, С. М. | |
dc.contributor.author | Джун, Л. В. | |
dc.contributor.author | Ishak, M. I. | |
dc.contributor.author | Marjugi, S. M. | |
dc.contributor.author | June, L. W. | |
dc.coverage.placename | Львів | |
dc.coverage.placename | Lviv | |
dc.date.accessioned | 2023-12-13T09:10:58Z | |
dc.date.available | 2023-12-13T09:10:58Z | |
dc.date.created | 2021-03-01 | |
dc.date.issued | 2021-03-01 | |
dc.description.abstract | The conjugate gradient (CG) method is well known for its efficiency in solving unconstrained optimization problems owing to its convergence properties and low computational cost. To date, the method has been widely developed to compete with existing methods in terms of efficiency. This paper proposes a modification of the CG method under the strong Wolfe line search. A new CG coefficient is presented based on the idea of reusing parts of previously existing CG methods so as to retain their advantages. Numerical testing clearly indicates that the proposed method performs better in solving unconstrained optimization problems than the other methods under the inexact strong Wolfe–Powell line search. | |
dc.description.abstract | The conjugate gradient (CG) method is well known for its efficiency in solving unconstrained optimization problems, owing to its convergence properties and low computational cost. The method continues to be developed extensively to compete with existing methods in terms of efficiency. In this paper, a modification of the CG method is proposed under the strong Wolfe line search. A new CG coefficient is presented based on the idea of reusing parts of previously existing CG methods in order to retain their advantages. The proposed method guarantees that the sufficient descent condition holds and is globally convergent under inexact line search. Numerical testing provides strong indication that the proposed method has better capability in solving unconstrained optimization problems than the other methods under an inexact line search, specifically the strong Wolfe–Powell line search. | |
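Since this record only names the ingredients of the method (a conjugate gradient iteration, a new conjugacy coefficient, and the strong Wolfe conditions), the sketch below is offered purely for orientation. It is a generic nonlinear CG loop that assumes SciPy's line_search as the strong-Wolfe step-length routine; the function name cg_strong_wolfe, its default parameters, and the use of the classical Fletcher–Reeves coefficient in place of the paper's new coefficient (not reproduced in this record) are illustrative assumptions, not the authors' algorithm.

```python
# A minimal sketch (not the authors' method): a generic nonlinear CG loop whose
# step length is required to satisfy the strong Wolfe conditions.  The new CG
# coefficient proposed in the paper is not given in this record, so the classical
# Fletcher-Reeves formula is used here purely as a placeholder.
import numpy as np
from scipy.optimize import line_search  # SciPy's strong-Wolfe line search


def cg_strong_wolfe(f, grad, x0, c1=1e-4, c2=0.1, tol=1e-6, max_iter=1000):
    """Minimize f starting from x0 with the update x_{k+1} = x_k + alpha_k * d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:          # gradient-norm stopping test
            break
        # alpha_k with f(x+a d) <= f(x) + c1*a*g'd  and  |grad f(x+a d)'d| <= c2*|g'd|
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                     # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves coefficient (placeholder)
        d = -g_new + beta * d                 # conjugate search direction update
        g = g_new
    return x
```

With the strong Wolfe parameter c2 kept small (0.1 is a common choice for CG methods), the Fletcher–Reeves placeholder keeps d a descent direction, which mirrors the sufficient descent property claimed in the abstract. For example, cg_strong_wolfe(lambda x: float((x**2).sum()), lambda x: 2.0 * x, np.ones(4)) should converge to a point near the origin within a few iterations.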
dc.format.extent | 111-118 | |
dc.format.pages | 8 | |
dc.identifier.citation | Ishak M. I. A new modified conjugate gradient method under the strong Wolfe line search for solving unconstrained optimization problems / M. I. Ishak, S. M. Marjugi, L. W. June // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2022. — Vol 9. — No 1. — P. 111–118. | |
dc.identifier.doi | 10.23939/mmc2022.01.111 | |
dc.identifier.uri | https://ena.lpnu.ua/handle/ntb/60541 | |
dc.language.iso | en | |
dc.publisher | Видавництво Львівської політехніки | |
dc.publisher | Lviv Politechnic Publishing House | |
dc.relation.ispartof | Mathematical Modeling and Computing, 1 (9), 2022 | |
dc.relation.references | [1] Hestenes M. R., Stiefel E. Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards. 49 (6), 409–436 (1952). | |
dc.relation.references | [2] Fletcher R., Reeves C. M. Function minimization by conjugate gradients. The Computer Journal. 7 (2), 149–154 (1964). | |
dc.relation.references | [3] Polak E., Ribière G. Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle. Série Rouge. 3 (16), 35–43 (1969). | |
dc.relation.references | [4] Polyak B. T. The conjugate gradient method in extremal problems. USSR Computational Mathematics and Mathematical Physics. 9 (4), 94–112 (1969). | |
dc.relation.references | [5] Fletcher R. Practical Methods of Optimization. Wiley-Interscience, New York, NY, USA (1987). | |
dc.relation.references | [6] Zoutendijk G. Some algorithms based on the principle of feasible directions. Nonlinear programming. Proceedings of a Symposium Conducted by the Mathematics Research Center, the University of Wisconsin–Madison, May 4–6, 1970. 93–121 (1970). | |
dc.relation.references | [7] Powell M. J. D. Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths D. F. (ed.) Numerical Analysis. Lecture Notes in Mathematics, vol. 1066. Springer, Berlin, Heidelberg. 122–141 (1984). | |
dc.relation.references | [8] Powell M. J. D. Convergence Properties of Algorithms for Nonlinear Optimization. SIAM Review. 28 (4), 487–500 (1986). | |
dc.relation.references | [9] Powell M. J. D. Restart procedures for the conjugate gradient method. Mathematical Programming. 12 (1), 241–254 (1977). | |
dc.relation.references | [10] Al-Baali M. Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA Journal of Numerical Analysis. 5 (1), 121–124 (1985). | |
dc.relation.references | [11] Touati-Ahmed D., Storey C. Efficient hybrid conjugate gradient techniques. Journal of Optimization Theory and Applications. 64 (2), 379–397 (1990). | |
dc.relation.references | [12] Gilbert J. C., Nocedal J. Global Convergence Properties of Conjugate Gradient Methods for Optimization. SIAM Journal on Optimization. 2 (1), 21–42 (1992). | |
dc.relation.references | [13] Jiang X., Jian J. Improved Fletcher–Reeves and Dai–Yuan conjugate gradient methods with the strong Wolfe line search. Journal of Computational and Applied Mathematics. 348, 525–534 (2019). | |
dc.relation.references | [14] Mtagulwa P., Kaelo P. An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems. Applied Numerical Mathematics. 145, 111–120 (2019). | |
dc.relation.references | [15] Rivaie M., Mamat M., June L. W., Mohd I. A new class of nonlinear conjugate gradient coefficients with global convergence properties. Applied Mathematics and Computation. 218 (22), 11323–11332 (2012). | |
dc.relation.references | [16] Pytlak R. Conjugate gradient algorithms in nonconvex optimization. Springer Science & Business Media. Vol. 89 (2008). | |
dc.relation.references | [17] Hamoda M., Mamat M., Rivaie M., Salleh Z. A conjugate gradient method with Strong Wolfe–Powell line search for unconstrained optimization. Applied Mathematical Sciences. 10 (15), 721–734 (2016). | |
dc.relation.references | [18] Rivaie M., Mamat M., Abashar A. A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Applied Mathematics and Computation. 268, 1152–1163 (2015). | |
dc.relation.references | [19] Andrei N. An Unconstrained Optimization Test Functions Collection. Advanced Modelling and Optimization. 10 (1), 147–161 (2008). | |
dc.relation.references | [20] Hillstrom K. E. A Simulation Test Approach to the Evaluation of Nonlinear Optimization Algorithms. ACM Transactions on Mathematical Software. 3 (4), 305–315 (1977). | |
dc.relation.references | [21] Dolan E. D., Moré J. J. Benchmarking optimization software with performance profiles. Mathematical Programming. Series B. 91 (2), 201–213 (2002). | |
dc.rights.holder | © Lviv Polytechnic National University, 2022 | |
dc.subject | спряжений градієнт | |
dc.subject | глобальна збіжність | |
dc.subject | неточний лінійний пошук | |
dc.subject | сильний лінійний пошук Вульфа–Пауелла | |
dc.subject | необмежена оптимізація | |
dc.subject | conjugate gradient | |
dc.subject | global convergence | |
dc.subject | inexact line search | |
dc.subject | strong Wolfe–Powell line search | |
dc.subject | unconstrained optimization | |
dc.title | A new modified conjugate gradient method under the strong Wolfe line search for solving unconstrained optimization problems | |
dc.title.alternative | Новий модифікований метод спряженого градієнта при сильному лінійному пошуку Вульфа для розв’язання проблеми необмеженої оптимізації | |
dc.type | Article |