A Modified Hestenes-Stiefel Conjugate Gradient Method and its Global Convergence for Unconstrained Optimization

In this paper, we propose a modified Hestenes-Stiefel (HS) conjugate gradient method. The method achieves high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki [1], and we also derive a nonquadratic conjugate gradient model. An important property of the suggested method is that it satisfies the descent property and converges globally independently of the accuracy of the line search. In addition, we prove global convergence under some suitable conditions and report numerical results obtained under these conditions.
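
For reference, the following is a minimal Python/NumPy sketch of a conjugate gradient loop that uses the classical Hestenes-Stiefel coefficient beta_k = g_{k+1}^T y_k / (d_k^T y_k) together with a backtracking line search. It illustrates only the baseline HS scheme; the modified secant condition of Babaie-Kafaki and the nonquadratic model used in the proposed method are not reproduced here.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Textbook Hestenes-Stiefel CG with a backtracking (Armijo) line search.

    This is the classical HS scheme, not the modified secant-condition
    variant proposed in the paper (whose formulas are not given in the
    abstract above)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                     # safeguard: restart if d is not a descent direction
            d = -g
        alpha, c, rho = 1.0, 1e-4, 0.5        # backtracking (Armijo) line search
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                         # gradient difference
        beta_hs = g_new.dot(y) / d.dot(y)     # classical HS coefficient
        d = -g_new + beta_hs * d              # new conjugate direction
        x, g = x_new, g_new
    return x

# Example: minimize the two-dimensional Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(hs_conjugate_gradient(f, grad, [-1.2, 1.0]))
```
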

Publication Date
Wed Jul 29 2020
Journal Name
Iraqi Journal Of Science
A Descent Modification of Conjugate Gradient Method for Optimization Models

In this paper, we suggest a descent modification of the conjugate gradient method which converges globally provided that the exact minimization condition is satisfied. Preliminary numerical experiments on some benchmark problems show that the method is efficient and promising.  
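
Under the exact minimization condition, the step size is the exact minimizer of the objective along the search direction, so the new gradient is orthogonal to that direction. A minimal sketch of one such step, assuming NumPy and SciPy (the bounded search interval for the step size is an arbitrary illustrative choice):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def exact_minimization_step(f, grad, x, d):
    """One line-search step under the exact minimization condition (sketch only):
    the step size is the minimizer of f along d, so the new gradient is
    orthogonal to the old direction, grad(x_new).dot(d) ~ 0."""
    assert grad(x).dot(d) < 0, "d must be a descent direction"
    phi = lambda alpha: f(x + alpha * d)          # restriction of f to the ray x + alpha*d
    alpha = minimize_scalar(phi, bounds=(0.0, 10.0), method="bounded").x
    return x + alpha * d
```
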

Scopus (5)
Crossref (2)
Publication Date
Mon Oct 30 2023
Journal Name
Iraqi Journal Of Science
New Class of Conjugate Gradient Methods for Removing Impulse Noise Images

The choice of the conjugate coefficient is the foundation of the various conjugate gradient methods. This paper proposes a new class of conjugate gradient (CG) coefficients for impulse noise removal, based on the quadratic model. The proposed method ensures descent independently of the accuracy of the line search and is globally convergent under some conditions. Numerical experiments on impulse noise removal in images are also presented.
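
As a rough illustration of how a CG method can be applied to impulse noise removal, the sketch below restores only the pixels flagged as noise candidates by minimizing a smooth edge-preserving functional. The functional, the fixed step size, and the classical HS coefficient used here are illustrative assumptions, not the new coefficient class proposed in the paper.

```python
import numpy as np

def restore_impulse_noise(img, noisy_mask, eps=1e-2, iters=100, step=0.2):
    """Sketch of two-phase impulse-noise removal: pixels flagged in noisy_mask
    are restored by minimizing the smooth edge-preserving functional
        F(u) = sum_p phi(u_p - u_below(p)) + phi(u_p - u_right(p)),  phi(t) = sqrt(t^2 + eps),
    with a CG-type iteration using the classical Hestenes-Stiefel coefficient.
    Non-noisy pixels stay fixed at their observed values; boundaries are treated
    as periodic (np.roll) for brevity."""
    u = img.astype(float).copy()
    dphi = lambda t: t / np.sqrt(t * t + eps)     # derivative of phi

    def gradient(u):
        dv = u - np.roll(u, -1, axis=0)           # differences with the pixel below
        dh = u - np.roll(u, -1, axis=1)           # differences with the pixel to the right
        g = dphi(dv) + dphi(dh) \
            - np.roll(dphi(dv), 1, axis=0) - np.roll(dphi(dh), 1, axis=1)
        g[~noisy_mask] = 0.0                      # only noise-candidate pixels are free
        return g

    g = gradient(u)
    d = -g
    for _ in range(iters):
        u_new = u + step * d                      # fixed small step for brevity
        g_new = gradient(u_new)
        y = (g_new - g).ravel()
        denom = d.ravel().dot(y)
        beta = g_new.ravel().dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + max(beta, 0.0) * d           # HS coefficient with a non-negativity safeguard
        u, g = u_new, g_new
    return u
```
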

Publication Date
Sun Sep 01 2019
Journal Name
Baghdad Science Journal
Symmetric and Positive Definite Broyden Update for Unconstrained Optimization

The Broyden update is a rank-one update for solving the unconstrained optimization problem, but it does not guarantee the positive definiteness and symmetry of the Hessian matrix approximation.

In this paper, positive definiteness and symmetry of the Hessian approximation are guaranteed by updating the vector that represents the difference between the next gradient and the current gradient of the objective function, which is assumed to be twice continuously differentiable. Numerical results comparing the proposed method with the Broyden method on standard test problems are reported.
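
For context, here is a minimal NumPy sketch of the classical Broyden rank-one update together with a check of the two properties (symmetry and positive definiteness) that the proposed modification is designed to guarantee; the paper's modification of the gradient-difference vector itself is not shown.

```python
import numpy as np

def broyden_rank_one_update(B, s, y):
    """Classical Broyden rank-one update of the Hessian approximation:
        B+ = B + (y - B s) s^T / (s^T s),
    with s = x_{k+1} - x_k and y = g_{k+1} - g_k.  As noted above, B+ is in
    general neither symmetric nor positive definite; the paper recovers both
    properties by modifying the vector y (modification not reproduced here)."""
    r = y - B @ s
    return B + np.outer(r, s) / s.dot(s)

def is_symmetric_positive_definite(B, tol=1e-10):
    """Check the two properties the modification is designed to guarantee."""
    return np.allclose(B, B.T, atol=tol) and np.all(np.linalg.eigvalsh((B + B.T) / 2) > tol)
```
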

Scopus (9)
Crossref (2)
Publication Date
Sat Feb 26 2022
Journal Name
Iraqi Journal Of Science
New Class of Rank 1 Update for Solving Unconstrained Optimization Problem

     The focus of this article is to add a new class of rank-one modified quasi-Newton techniques for solving the unconstrained optimization problem by updating the inverse Hessian matrix with a rank-1 update in which a diagonal matrix is the first component of the next inverse Hessian approximation. The inverse Hessian matrix generated by the proposed method is symmetric and satisfies the modified quasi-Newton condition, so global convergence is retained. In addition, it is positive definite, which guarantees the existence of a minimizer at every iteration of the objective function. We use the MATLAB program to implement the algorithm in order to introduce the feasibility of …

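A minimal sketch of one generic rank-one inverse-Hessian update whose first component is a diagonal matrix, in the spirit described above; the specific diagonal matrix and the modified quasi-Newton condition of the proposed class are not given in the abstract, so the diagonal is left as an input (a hypothetical choice).

```python
import numpy as np

def diagonal_rank_one_inverse_update(d_diag, s, y):
    """Generic rank-one update of the inverse Hessian whose first component is a
    diagonal matrix D:
        H+ = D + (s - D y)(s - D y)^T / ((s - D y)^T y).
    H+ is symmetric and satisfies the secant condition H+ y = s.  The particular
    diagonal matrix and the modified quasi-Newton condition used in the paper are
    not reproduced here (d_diag is a hypothetical input)."""
    D = np.diag(d_diag)                      # diagonal first component of the new approximation
    r = s - D @ y
    denom = r.dot(y)
    if abs(denom) < 1e-12 * np.linalg.norm(r) * np.linalg.norm(y):
        return D                             # skip the rank-one term when the denominator is tiny
    return D + np.outer(r, r) / denom
```
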
Scopus (4)
Crossref (1)
Publication Date
Tue Sep 08 2020
Journal Name
Baghdad Science Journal
Modified BFGS Update (H-Version) Based on the Determinant Property of Inverse of Hessian Matrix for Unconstrained Optimization

The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function), via an update of the vector s (the difference between the next solution and the current solution), such that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Moreover, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-vers…

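For reference, a NumPy sketch of the standard BFGS inverse-Hessian (H-version) update that the study modifies; the rescaling of the vector s that keeps the determinant constant is not reproduced, but the determinant of each iterate can be monitored with np.linalg.det.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard BFGS update of the inverse Hessian (H-version):
        H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,   rho = 1 / (y^T s).
    The modification described above rescales the vector s so that
    det(H+) = det(H) at every iteration; that rescaling is not reproduced here."""
    rho = 1.0 / y.dot(s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# The quantity the modification keeps constant can be monitored as:
# det_next = np.linalg.det(bfgs_inverse_update(H, s, y))
```
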
Scopus (10)
Crossref (10)
Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Positive Definiteness of Symmetric Rank 1 (H-Version) Update for Unconstrained Optimization

Several attempts have been made to modify the quasi-Newton condition in order to obtain rapid convergence while retaining the full properties (symmetry and positive definiteness) of the inverse of the Hessian matrix (the second derivative of the objective function). Many unconstrained optimization methods do not generate a positive definite inverse Hessian approximation. One of them is the symmetric rank-1 (H-version) update (SR1 update): this update satisfies the quasi-Newton condition and preserves the symmetry of the inverse Hessian matrix, but it does not preserve positive definiteness even when the initial inverse Hessian matrix is positive definite. The positive definite prope…

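A minimal NumPy sketch of the standard SR1 (H-version) update together with a Cholesky-based positive-definiteness check, which makes the property discussed above easy to verify numerically; the paper's remedy for the loss of positive definiteness is not reproduced.

```python
import numpy as np

def sr1_inverse_update(H, s, y):
    """Symmetric rank-1 update of the inverse Hessian (SR1, H-version):
        H+ = H + (s - H y)(s - H y)^T / ((s - H y)^T y).
    H+ is symmetric and satisfies the quasi-Newton condition H+ y = s, but, as
    the abstract notes, it need not remain positive definite even when H is."""
    r = s - H @ y
    return H + np.outer(r, r) / r.dot(y)

def is_positive_definite(H):
    """Cholesky succeeds exactly when the symmetrized matrix is positive definite."""
    try:
        np.linalg.cholesky((H + H.T) / 2)
        return True
    except np.linalg.LinAlgError:
        return False
```
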
Scopus (9)
Crossref (7)
Publication Date
Thu Sep 26 2019
Journal Name
Processes
Fine-Tuning Meta-Heuristic Algorithm for Global Optimization

This paper proposes a novel meta-heuristic optimization algorithm called the fine-tuning meta-heuristic algorithm (FTMA) for solving global optimization problems. In this algorithm, the solutions are fine-tuned using the fundamental steps of meta-heuristic optimization, namely exploration, exploitation, and randomization, in such a way that if one step improves the solution, the remaining steps need not be executed. The performance of the proposed FTMA has been compared with that of five other optimization algorithms over ten benchmark test functions. Nine of these functions are well known from the literature, while the tenth is proposed by the authors and introduced in this article. One test trial was shown t…

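The control flow described above (skip the remaining steps once one of them improves the solution) can be sketched as follows; the exploration, exploitation, and randomization operators shown are hypothetical placeholders rather than the exact ones defined in the paper, rng is a NumPy random Generator (e.g. np.random.default_rng()), and lb/ub are scalar bounds of the search space.

```python
import numpy as np

def ftma_refine(x, f, best, lb, ub, rng):
    """One fine-tuning pass over a single solution: try exploration, then
    exploitation, then randomization, and stop as soon as one of them improves
    the objective.  The three operators below are simple placeholders."""
    def exploration(x):                      # jump toward a random point in the search space
        return x + rng.uniform(-1.0, 1.0, x.size) * (rng.uniform(lb, ub, x.size) - x)

    def exploitation(x):                     # move toward the best solution found so far
        return x + rng.random(x.size) * (best - x)

    def randomization(x):                    # small random perturbation
        return x + 0.1 * (ub - lb) * rng.standard_normal(x.size)

    fx = f(x)
    for step in (exploration, exploitation, randomization):
        candidate = np.clip(step(x), lb, ub)
        if f(candidate) < fx:                # improvement: the remaining steps are unnecessary
            return candidate
    return x
```
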
Scopus (24)
Crossref (19)
Publication Date
Sun Jun 05 2022
Journal Name
Network
A Computationally Efficient Gradient Algorithm for Downlink Training Sequence Optimization in FDD Massive MIMO Systems

Future wireless networks will require advanced physical-layer techniques to meet the requirements of Internet of Everything (IoE) applications and massive communication systems. To this end, the massive MIMO (m-MIMO) system is to date considered one of the key technologies for future wireless networks, owing to its capability to bring significant improvements in spectral efficiency and energy efficiency. However, designing an efficient downlink (DL) training sequence for fast channel state information (CSI) estimation, i.e., with limited coherence time, in a frequency division duplex (FDD) m-MIMO system when users exhibit different correlation patterns, i.e., span distinct channel covariance matrices, is to date ve…

Scopus (2)
Publication Date
Thu Nov 02 2023
Journal Name
Journal Of Engineering
An Improved Adaptive Spiral Dynamic Algorithm for Global Optimization

This paper proposes a new strategy to enhance the performance and accuracy of the spiral dynamic algorithm (SDA) for use in solving real-world problems by hybridizing the SDA with the bacterial foraging optimization algorithm (BFA). The dynamic step size of the SDA makes it a useful exploitation approach; however, it has limited exploration during the diversification phase, which results in it getting trapped at local optima. The optimal initialization positions for the SDA have therefore been determined with the help of the chemotactic strategy of the BFA, which is utilized to improve the exploration behaviour of the SDA. The proposed Hybrid Adaptive Spiral Dynamic Bacterial Foraging (HASDBF)…

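For context, the basic two-dimensional spiral dynamics move that the SDA applies to each search point is sketched below, assuming NumPy; the chemotaxis-based initialization that HASDBF adds, and any adaptive radius/angle settings, are not reproduced here.

```python
import numpy as np

def spiral_step_2d(x, x_best, r=0.95, theta=np.pi / 4):
    """Basic two-dimensional spiral dynamics move: rotate the point x about the
    current best point by angle theta and contract it toward that point by a
    factor r,
        x_new = x_best + r * R(theta) @ (x - x_best).
    The HASDBF hybrid described above would first place the initial points using
    BFA's chemotactic search (not reproduced here)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return x_best + r * (R @ (x - x_best))
```
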
Publication Date
Wed Mar 10 2021
Journal Name
Periodicals Of Engineering And Natural Sciences (pen)
A hybrid Grey Wolf optimizer with multi-population differential evolution for global optimization problems

Scopus (2)
Crossref (1)