A NEW MODIFIED LINE SEARCH ALGORITHM TO SOLVE LARGE-SCALE NON-SMOOTH NON-CONVEX OPTIMIZATION PROBLEMS

Document Type : Original Article

Authors
1 Department of Mathematics, Bu-Ali Sina University, Hamedan, Iran
2 Department of Mathematics, Bu-Ali Sina University, Hamedan, Iran
Abstract
In this paper, a new modified Armijo line search is incorporated into the diagonal discrete gradient bundle method to solve large-scale non-smooth optimization problems. The new rule permits longer steps at each iteration, which reduces the number of iterations, the number of function evaluations, and the computational time; in other words, it improves the efficiency and performance of the method. We prove that the diagonal discrete gradient bundle method equipped with the proposed monotone line search rule converges for semi-smooth functions, which are not necessarily differentiable or convex. In addition, numerical results confirm the efficiency of the proposed modification.
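The abstract does not give the exact form of the modified rule, so the sketch below implements only the classical monotone Armijo backtracking condition that such modifications build on: accept a step length t along a descent direction d once f(x + t d) ≤ f(x) + c t ∇f(x)ᵀd, otherwise shrink t. The function and parameter names (`c`, `beta`, `armijo_backtracking`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, t0=1.0, c=1e-4, beta=0.5, max_iter=50):
    """Classical (monotone) Armijo backtracking line search.

    Note: this is the standard Armijo rule, not the paper's modified
    version, whose exact form is not stated in the abstract.
    """
    fx = f(x)
    slope = np.dot(grad(x), d)  # directional derivative; negative for a descent direction
    t = t0
    for _ in range(max_iter):
        # sufficient-decrease (Armijo) condition
        if f(x + t * d) <= fx + c * t * slope:
            return t
        t *= beta  # shrink the trial step
    return t

# Toy example: minimize f(x) = ||x||^2 from x = (3, 4)
# along the steepest descent direction.
f = lambda x: float(np.dot(x, x))
grad = lambda x: 2.0 * x
x = np.array([3.0, 4.0])
d = -grad(x)
t = armijo_backtracking(f, grad, x, d)
```

In a bundle or discrete-gradient setting the directional derivative is replaced by a subgradient or discrete-gradient surrogate, but the accept/shrink logic above is the same; the paper's contribution is a modified acceptance rule that keeps steps longer.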

Keywords

