The focus of this article is to introduce a new class of rank-one modified quasi-Newton techniques for solving the unconstrained optimization problem. The inverse Hessian matrix is updated with a rank-one term in which a diagonal matrix forms the first component of the next inverse Hessian approximation. The inverse Hessian approximation generated by the proposed method is symmetric and satisfies the modified quasi-Newton condition, so global convergence is retained. In addition, it is positive definite, which guarantees the existence of a minimizer of the objective function at every iteration. The proposed algorithm is implemented in MATLAB to demonstrate the feasibility of the procedure, and various numerical examples are given.
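As an illustration of the structure described above (not the authors' exact formula), a rank-one inverse-Hessian update whose first component is a diagonal matrix D_k can be written in SR1-like form, chosen so that a (possibly modified) secant condition holds; here s_k and the modified gradient difference \tilde{y}_k are illustrative notation:

% Hypothetical sketch only: SR1-type rank-one update with a diagonal first component D_k.
% s_k = x_{k+1} - x_k; \tilde{y}_k denotes a modified gradient difference (assumption).
\[
H_{k+1} \;=\; D_k \;+\;
\frac{(s_k - D_k \tilde{y}_k)(s_k - D_k \tilde{y}_k)^{T}}
     {(s_k - D_k \tilde{y}_k)^{T}\tilde{y}_k},
\qquad
H_{k+1}\tilde{y}_k = s_k .
\]

Applying the update to \tilde{y}_k confirms the stated secant-type condition, since the correction term maps \tilde{y}_k exactly onto s_k - D_k \tilde{y}_k.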
Several attempts have been made to modify the quasi-Newton condition in order to obtain rapid convergence while preserving the full properties (symmetry and positive definiteness) of the inverse of the Hessian matrix (the second derivative of the objective function). Many unconstrained optimization methods do not preserve positive definiteness of the inverse Hessian. One such method is the symmetric rank-one (H-version) update (SR1 update): this update satisfies the quasi-Newton condition and the symmetry of the inverse Hessian, but it does not preserve positive definiteness even when the initial inverse Hessian approximation is positive definite. The positive definite prope…
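For reference, the classical SR1 update in its H-version (inverse-Hessian) form, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, is

% Classical SR1 (H-version) update, stated as background, not as this paper's modification.
\[
H_{k+1} \;=\; H_k \;+\;
\frac{(s_k - H_k y_k)(s_k - H_k y_k)^{T}}{(s_k - H_k y_k)^{T} y_k},
\]

which satisfies H_{k+1} y_k = s_k and keeps H_{k+1} symmetric; the denominator (s_k - H_k y_k)^T y_k, however, can be negative or vanish, which is why positive definiteness may be lost.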
The Broyden update is one of the rank-one updates used to solve the unconstrained optimization problem, but it guarantees neither positive definiteness nor symmetry of the Hessian matrix.
In this paper, positive definiteness and symmetry of the Hessian matrix are guaranteed by updating the vector that represents the difference between the next gradient and the current gradient of the objective function, which is assumed to be twice continuously differentiable. Numerical results are reported comparing the proposed method with the Broyden method on standard test problems.
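For context, the classical Broyden rank-one update of the Hessian approximation B_k (stated here as background, not as the paper's modification), with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, is

% Classical Broyden rank-one update; the correction term is generally neither
% symmetric nor positive definite, which is the deficiency addressed above.
\[
B_{k+1} \;=\; B_k \;+\; \frac{(y_k - B_k s_k)\, s_k^{T}}{s_k^{T} s_k},
\]

and it satisfies the secant condition B_{k+1} s_k = y_k.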
The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function), via an update of the vector s (the difference between the next solution and the current solution), such that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-version)…
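The standard BFGS update in its H-version form is given below as background; the study's modification replaces the vector s_k with an updated vector so that the determinant of the inverse Hessian approximation is preserved from one iteration to the next.

% Classical BFGS (H-version) update with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
\[
H_{k+1} \;=\; \left(I - \rho_k s_k y_k^{T}\right) H_k \left(I - \rho_k y_k s_k^{T}\right) + \rho_k s_k s_k^{T},
\qquad
\rho_k = \frac{1}{y_k^{T} s_k}.
\]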
The Meerkat Clan Algorithm (MCA) is a nature-inspired metaheuristic algorithm that imitates the intelligent behavior of the meerkat. This paper presents an improvement of the MCA based on a chaotic map and a crossover strategy (MCA-CC). These two strategies increase the diversification and intensification of the proposed algorithm and boost its searching ability to find higher-quality solutions. The 0-1 knapsack problem was solved by the basic MCA and by the improved version of the algorithm (MCA-CC). The performance of these algorithms was tested on low- and high-dimensional problems. The experimental results demonstrate that the proposed algorithm overcame the basic algorithm in terms of solution quality, speed a…
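A minimal Python sketch of the two ingredients named above, a chaotic map and a crossover operator, applied to 0-1 knapsack candidates; this is not the MCA-CC algorithm itself, and the logistic map, one-point crossover, and toy instance are assumptions chosen for illustration.

# Hypothetical sketch of a chaotic map + crossover for 0-1 knapsack candidates;
# NOT the MCA-CC algorithm, only the two named ingredients in isolation.
import random

def logistic_map(x, r=4.0):
    """One step of the logistic chaotic map on (0, 1)."""
    return r * x * (1.0 - x)

def chaotic_bits(n, x0=0.7):
    """Generate an n-bit knapsack candidate driven by the chaotic sequence."""
    bits, x = [], x0
    for _ in range(n):
        x = logistic_map(x)
        bits.append(1 if x > 0.5 else 0)
    return bits

def one_point_crossover(parent_a, parent_b):
    """Classic one-point crossover between two bit strings."""
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:]

def knapsack_value(bits, values, weights, capacity):
    """Total value of a candidate, or 0 if it violates the capacity."""
    w = sum(wi for bi, wi in zip(bits, weights) if bi)
    v = sum(vi for bi, vi in zip(bits, values) if bi)
    return v if w <= capacity else 0

# Toy usage: cross a chaotic candidate with a random one and score the child.
values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
child = one_point_crossover(chaotic_bits(3), [random.randint(0, 1) for _ in range(3)])
print(child, knapsack_value(child, values, weights, capacity))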
In this paper we investigate several heuristic methods for solving the travelling salesman problem. The discussed methods are the Minimizing Distance Method (MDM), the Branch and Bound Method (BABM), the Tree Type Heuristic Method (TTHM), and the Greedy Method (GRM).
The weak points of the MDM are addressed in this paper. The Improved MDM (IMDM) gives better results than the classical MDM and the other discussed methods, while the GRM gives the best running time for 5 ≤ n ≤ 500, where n is the number of visited cities.
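The abstract does not spell out the GRM rule; one common greedy construction for the TSP is the nearest-neighbour rule, sketched below in Python purely for illustration (the exact greedy variant used in the paper may differ).

# Illustrative nearest-neighbour greedy tour construction for the TSP.
# Assumption: this is one common "greedy" variant, not necessarily the paper's GRM.
import math

def greedy_tour(cities):
    """cities: list of (x, y) points. Returns a tour (index list) and its length."""
    n = len(cities)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(cities[last], cities[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    # Close the tour and compute its total length.
    length = sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]]) for i in range(n))
    return tour, length

print(greedy_tour([(0, 0), (0, 3), (4, 3), (4, 0)]))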
In this paper, two local search algorithms (the genetic algorithm and particle swarm optimization) are used to schedule a number of products (n jobs) on a single machine so as to minimize a multi-objective function composed of the total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in appropriate times.
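A short Python sketch of how the four objective components are typically computed for a single-machine sequence, using the standard definitions C_j (completion time), T_j = max(0, C_j - d_j), E_j = max(0, d_j - C_j), and late work V_j = min(p_j, T_j); summing the four totals with equal weight is an assumption, since the abstract does not state the aggregation.

# Sketch of the four objective components for a single-machine sequence.
# Assumptions: jobs given as processing times p and due dates d; the combined
# objective is taken here as the plain (unweighted) sum of the four totals.
def multi_objective(sequence, p, d):
    """sequence: job indices in processing order; p, d: processing times, due dates."""
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in sequence:
        t += p[j]                         # completion time C_j of job j
        tardiness = max(0, t - d[j])      # T_j
        earliness = max(0, d[j] - t)      # E_j
        late_work = min(p[j], tardiness)  # V_j, the late part of job j's work
        total_C += t
        total_T += tardiness
        total_E += earliness
        total_V += late_work
    return total_C + total_T + total_E + total_V

# Toy usage with 3 jobs.
p, d = [3, 2, 4], [4, 6, 7]
print(multi_objective([0, 1, 2], p, d))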
In this paper, we propose a modified Hestenes-Stiefel (HS) conjugate gradient method. It achieves high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki [1], and we also derive a non-quadratic conjugate gradient model. An important property of the suggested method is that it satisfies the descent property and is globally convergent independently of the accuracy of the line search. In addition, we prove global convergence under some suitable conditions and report numerical results obtained under these conditions.
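For background, the classical Hestenes-Stiefel conjugate gradient direction (not the modified formula derived in the paper) is d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k with

% Classical HS parameter; y_k = g_{k+1} - g_k.
\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},
\qquad y_k = g_{k+1} - g_k .
\]

A typical way such modifications are built is to replace y_k by a modified vector satisfying a modified secant condition, such as the one of Babaie-Kafaki [1] used here.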
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve a single-objective optimization problem. It first runs GSA, followed by BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of falling into local optima, because the lack of a local search mechanism increases the intensity of the search while diversity remains high and the search easily falls into a local optimum. The improvement applies to the velocity of the original BAT, so the speed of reaching the best solution is increased. All solutions in the population are updated before the end of the run of the proposed algorithm. The diversification f…
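A minimal Python sketch of the standard BAT velocity/position update that the second phase of such a hybrid relies on; this is the classical rule, not the paper's modified one, and the sphere function and parameter values are placeholder assumptions.

# Classical BAT frequency-driven velocity/position update (background sketch only).
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def bat_step(positions, velocities, best, f_min=0.0, f_max=2.0):
    """One synchronous BAT move: frequency-driven velocity toward the best bat."""
    new_pos, new_vel = [], []
    for x, v in zip(positions, velocities):
        freq = f_min + (f_max - f_min) * random.random()        # pulse frequency
        v = [vi + (xi - bi) * freq for vi, xi, bi in zip(v, x, best)]
        x = [xi + vi for xi, vi in zip(x, v)]
        new_pos.append(x)
        new_vel.append(v)
    return new_pos, new_vel

# Toy usage: 4 bats in 2 dimensions, moved one step toward the current best.
pos = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(4)]
vel = [[0.0, 0.0] for _ in range(4)]
pos, vel = bat_step(pos, vel, min(pos, key=sphere))
print(min(sphere(x) for x in pos))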
In this article, the nonlinear problem of Jeffery-Hamel flow is solved analytically and numerically using reliable iterative and numerical methods. Approximate solutions are obtained using the Daftardar-Jafari method (DJM), the Temimi-Ansari method (TAM), and the Banach contraction method (BCM). The obtained solutions are discussed numerically in comparison with other numerical solutions obtained from the fourth-order Runge-Kutta method (RK4), the Euler method, and previous analytic methods available in the literature. In addition, the convergence of the proposed methods is established based on the Banach fixed point theorem. The results reveal that the presented methods are reliable, effective, and applicable to solving other nonlinear problems.
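A commonly used dimensionless form of the Jeffery-Hamel problem is the third-order ODE f''' + 2*alpha*Re*f*f' + 4*alpha^2*f' = 0 with f(0) = 1, f'(0) = 0, f(1) = 0. The Python sketch below only shows an RK4 integration of the corresponding first-order system for an assumed value of f''(0); in practice that value would be found by shooting, and the channel half-angle alpha and Reynolds number Re used here are placeholder assumptions, not the paper's data.

# Hedged sketch: RK4 integration of a common dimensionless Jeffery-Hamel form
#   f''' + 2*alpha*Re*f*f' + 4*alpha**2*f' = 0,  f(0)=1, f'(0)=0, f(1)=0.
# The initial curvature f''(0) below is a guessed placeholder; a shooting step
# would be needed to enforce f(1) = 0. alpha and Re are example values.
import math

alpha, Re = math.radians(5), 50.0

def rhs(y):
    """First-order system for (f, f', f'')."""
    f, f1, f2 = y
    f3 = -2.0 * alpha * Re * f * f1 - 4.0 * alpha**2 * f1
    return [f1, f2, f3]

def rk4(y, h):
    k1 = rhs(y)
    k2 = rhs([yi + 0.5 * h * ki for yi, ki in zip(y, k1)])
    k3 = rhs([yi + 0.5 * h * ki for yi, ki in zip(y, k2)])
    k4 = rhs([yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6.0 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

y, h = [1.0, 0.0, -2.0], 0.01    # f''(0) = -2.0 is only a guess
for _ in range(100):             # integrate eta from 0 to 1
    y = rk4(y, h)
print("f(1) with the guessed f''(0):", y[0])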