In this paper, a compact genetic algorithm (CGA) is enhanced by integrating its selection strategy with a steepest descent algorithm (SDA) as a local search method, giving I-CGA-SDA. The system aims to avoid the high CPU time and computational complexity of the standard genetic algorithm: the CGA dramatically reduces the number of bits required to store the population and converges faster. The integrated system is then used to optimize the maximum likelihood function ln L(φ1, θ1) of the mixed model. Simulation results based on MSE were compared with those obtained from the SDA and showed that both the hybrid genetic algorithm (HGA) and I-CGA-SDA give good estimators of (φ1, θ1) for the ARMA(1,1) model. A further comparison shows that I-CGA-SDA requires fewer function evaluations, searches a smaller percentage of the space, converges faster, and attains higher optimal precision than the HGA.
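As a rough illustration of the mechanics behind such a hybrid (not the authors' implementation), the sketch below couples a compact GA, which keeps only a per-bit probability vector instead of an explicit population, with a crude finite-difference steepest descent refinement of the winning candidate. The ARMA(1,1) likelihood is the standard conditional form; all names and parameter settings are illustrative assumptions.

```python
import numpy as np

def neg_loglik(params, x):
    """Conditional negative log-likelihood of an ARMA(1,1) model."""
    phi, theta = params
    e = np.zeros_like(x)
    for t in range(1, len(x)):
        e[t] = x[t] - phi * x[t - 1] - theta * e[t - 1]
    sigma2 = np.mean(e[1:] ** 2)
    return 0.5 * len(x) * np.log(sigma2)

def decode(bits, lo=-0.99, hi=0.99):
    """Map each half of the bit string to a parameter in (lo, hi)."""
    half = len(bits) // 2
    vals = []
    for chunk in (bits[:half], bits[half:]):
        frac = int("".join(map(str, chunk)), 2) / (2 ** half - 1)
        vals.append(lo + frac * (hi - lo))
    return np.array(vals)

def cga_sda(x, n_bits=32, virtual_pop=100, iters=2000, seed=0):
    """Compact GA: a probability vector replaces the explicit population;
    the best candidate found is then refined by a crude descent step."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                 # probability of each bit being 1
    best, best_f = None, np.inf
    for _ in range(iters):
        a = (rng.random(n_bits) < p).astype(int)
        b = (rng.random(n_bits) < p).astype(int)
        fa, fb = neg_loglik(decode(a), x), neg_loglik(decode(b), x)
        win, lose = (a, b) if fa < fb else (b, a)
        p += (win - lose) / virtual_pop      # shift toward the tournament winner
        np.clip(p, 1e-3, 1 - 1e-3, out=p)
        if min(fa, fb) < best_f:
            best, best_f = decode(win), min(fa, fb)
    # local refinement by numerical steepest descent on the winner
    theta = best.copy()
    for _ in range(200):
        g = np.zeros(2)
        for i in range(2):
            d = np.zeros(2); d[i] = 1e-5
            g[i] = (neg_loglik(theta + d, x) - neg_loglik(theta - d, x)) / 2e-5
        theta = np.clip(theta - 0.01 * g, -0.99, 0.99)
    return theta
```

The decoding maps each half of the bit string into (-0.99, 0.99), the usual stationarity/invertibility box for an ARMA(1,1).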
The grey system model GM(1,1) is the model for time-series prediction and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). The methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was adopted to select the best of the four methods, which was then applied to real data representing the consumption rate of two types of oils …
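For orientation, a minimal sketch of the classical least-squares formulation of GM(1,1) fitting is given below; it is a baseline rather than any of the four estimation methods compared in the paper, and the sample series is invented for illustration.

```python
import numpy as np

def gm11_fit(x0):
    """Classical least-squares parameter estimation for GM(1,1)."""
    x1 = np.cumsum(x0)                        # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b                               # development coefficient, grey input

def gm11_predict(x0, a, b, horizon=0):
    """Fitted values plus horizon-step forecasts via the inverse AGO."""
    n = len(x0)
    k = np.arange(1, n + horizon)
    x1_hat = np.concatenate([[x0[0]], (x0[0] - b / a) * np.exp(-a * k) + b / a])
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])

x0 = np.array([2.87, 3.28, 3.34, 3.67, 3.46, 3.82])   # illustrative series
a, b = gm11_fit(x0)
x0_hat = gm11_predict(x0, a, b, horizon=2)            # fit + 2-step-ahead forecast
mse = np.mean((x0[1:] - x0_hat[1:len(x0)]) ** 2)
```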
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it yields efficient model estimation. The problem considered here arises when the response variable takes one of two values, either 0 (no response) or 1 (response), which leads to the logistic regression model.
We compare two estimation methods, the Bayesian method and a competing method, and the results were compared using the MSE criterion. A simulation study was used to examine the empirical behaviour of the logistic model with different sample sizes and variances. The results show that the Bayesian method is better than the competing method at small sample sizes.
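A minimal sketch of such a simulation is shown below, assuming the Bayesian estimate is taken as the posterior mode under a Gaussian prior (a MAP estimate) and the competitor is plain maximum likelihood; the prior variance `tau2`, sample sizes, and true coefficients are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def neg_logpost(beta, X, y, tau2=None):
    """Negative log-likelihood of the logistic model; adding a Gaussian
    prior with variance tau2 gives a MAP-style Bayesian estimate."""
    eta = X @ beta
    nll = np.sum(np.logaddexp(0.0, eta) - y * eta)
    if tau2 is not None:
        nll += 0.5 * np.sum(beta ** 2) / tau2
    return nll

def simulate_mse(n, beta_true, reps=200, tau2=10.0):
    err_ml, err_bayes = [], []
    for _ in range(reps):
        X = np.column_stack([np.ones(n), rng.normal(size=n)])
        y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))
        b0 = np.zeros(len(beta_true))
        ml = minimize(neg_logpost, b0, args=(X, y)).x          # maximum likelihood
        mapb = minimize(neg_logpost, b0, args=(X, y, tau2)).x  # Bayesian (MAP)
        err_ml.append(np.sum((ml - beta_true) ** 2))
        err_bayes.append(np.sum((mapb - beta_true) ** 2))
    return np.mean(err_ml), np.mean(err_bayes)

for n in (20, 50, 200):
    print(n, simulate_mse(n, beta_true=np.array([0.5, 1.0])))
```

At small n, the prior's shrinkage typically gives the MAP estimate the lower MSE, which is consistent with the stated conclusion.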
DeepFake is a concern for celebrities and for everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. Video-manipulation detection, on the other hand, is more accessible than image detection, and many state-of-the-art systems address it; moreover, detecting manipulation in video depends entirely on detecting it in the individual images. Much prior work on DeepFake detection in images involves complex mathematical calculations in the preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with the teeth visible, etc. Also, the accuracy of their counterfeit detection …
Steganography is an important class of security technique that is widely used in computer and network security nowadays. In this research, a new algorithm is proposed that treats steganography as an algorithmic secret-key technique, analogous to a stream-cipher cryptographic system. The proposed algorithm is a secret-key system suggested for use in communications for the steganographic transmission of messages.
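The abstract does not detail the algorithm, but the stream-cipher analogy can be illustrated roughly as follows: a secret key seeds a PRNG that selects embedding positions and produces a keystream that whitens each message bit before LSB embedding. This is a hedged sketch of the concept, not the paper's method.

```python
import numpy as np

def embed(cover, message_bits, key):
    """Keyed LSB embedding in a uint8 cover image: a key-seeded PRNG chooses
    pixel positions, and a keystream bit XORs each message bit."""
    rng = np.random.default_rng(key)
    stego = cover.copy().ravel()
    positions = rng.choice(stego.size, size=len(message_bits), replace=False)
    keystream = rng.integers(0, 2, size=len(message_bits))
    for pos, m, k in zip(positions, message_bits, keystream):
        stego[pos] = (stego[pos] & 0xFE) | (m ^ k)   # overwrite the LSB
    return stego.reshape(cover.shape)

def extract(stego, n_bits, key):
    """Regenerate the same positions and keystream, then undo the XOR."""
    rng = np.random.default_rng(key)
    flat = stego.ravel()
    positions = rng.choice(flat.size, size=n_bits, replace=False)
    keystream = rng.integers(0, 2, size=n_bits)
    return [(int(flat[p]) & 1) ^ int(k) for p, k in zip(positions, keystream)]

cover = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert extract(embed(cover, bits, key=1234), len(bits), key=1234) == bits
```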
With the wide use of private-information exchange in various communication applications, securing it has become a top priority. In this research, a new approach to encrypting text messages based on genetic-algorithm operators is proposed. The approach generates an 8-bit chromosome to encrypt the plain text after randomly selecting a crossover point; the resulting child code is then flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering encryption/decryption execution time and throughput. The simulation results demonstrate the robustness of the proposed approach, which produces better performance on all evaluation metrics with res…
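One way the described operators could compose into an invertible cipher is sketched below. Here crossover is interpreted as swapping the two bit segments of each plaintext byte around the crossover point (a rotation), and both the crossover point and the mutated bit position are drawn from a key-seeded PRNG so the receiver can undo them; the key scheme is an assumption, since the abstract does not specify it.

```python
import random

def rotl8(b, r):
    """Rotate an 8-bit value left by r: crossover interpreted as swapping
    the two segments of the chromosome around the crossover point."""
    return ((b << r) | (b >> (8 - r))) & 0xFF

def encrypt(text, key):
    rng = random.Random(key)                # key-seeded PRNG (assumed key scheme)
    out = []
    for ch in text.encode():
        point = rng.randint(1, 7)           # random crossover point
        flip = rng.randrange(8)             # mutation: flip one bit of the child
        out.append(rotl8(ch, point) ^ (1 << flip))
    return bytes(out)

def decrypt(data, key):
    rng = random.Random(key)                # same key replays the same choices
    out = []
    for b in data:
        point = rng.randint(1, 7)
        flip = rng.randrange(8)
        out.append(rotl8(b ^ (1 << flip), 8 - point))   # undo mutation, then crossover
    return bytes(out).decode()

assert decrypt(encrypt("hello world", key=42), key=42) == "hello world"
```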
Integrating decision-making leads to more robust decisions and to determining the optimum inventory level of the materials required for production, reducing total cost through cooperation between the purchasing department, the inventory department, and the company's other departments. Two models are suggested to determine the Optimum Inventory Level (OIL): the first model (OIL-model 1) assumes that the inventory level of material quantities equals the required materials, while the second model (OIL-model 2) assumes that the inventory level of material quantities exceeds the materials required for the next period.
In this research, we propose two local search methods (LSMs), particle swarm optimization (PSO) and the bees algorithm (BA), to solve the multi-criteria travelling salesman problem (MCTSP) and obtain the best efficient solutions. The population of the proposed LSMs may be generated randomly or seeded with initial solutions obtained from efficient heuristic methods. The solutions obtained by PSO and BA are compared with those of exact methods (complete enumeration and branch and bound) and some heuristic methods. The results prove the efficiency of the PSO and BA methods for a large number of nodes ( ). The proposed LSMs give the best efficient solutions for the MCTSP for …
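As a rough, hedged sketch of how the bees algorithm might be applied here (the paper's exact operators and multi-criteria handling are not shown), the code below scalarises two criteria into a weighted sum and lets elite sites recruit 2-opt neighbours while the remaining bees scout randomly; all parameter names and settings are illustrative.

```python
import random

def tour_cost(tour, d1, d2, w=0.5):
    """Weighted sum of two criteria (e.g. distance and time), one simple
    scalarisation of the multi-criteria objective."""
    n = len(tour)
    c1 = sum(d1[tour[i]][tour[(i + 1) % n]] for i in range(n))
    c2 = sum(d2[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return w * c1 + (1 - w) * c2

def two_opt_neighbour(tour, rng):
    """Reverse a random segment of the tour (a 2-opt move)."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def bees_tsp(d1, d2, n_scout=20, n_elite=5, n_recruits=10, iters=500, seed=0):
    rng = random.Random(seed)
    n = len(d1)
    sites = [rng.sample(range(n), n) for _ in range(n_scout)]
    for _ in range(iters):
        sites.sort(key=lambda t: tour_cost(t, d1, d2))
        new_sites = []
        for site in sites[:n_elite]:        # elite sites get 2-opt recruits
            best = min((two_opt_neighbour(site, rng) for _ in range(n_recruits)),
                       key=lambda t: tour_cost(t, d1, d2))
            new_sites.append(min([site, best], key=lambda t: tour_cost(t, d1, d2)))
        # remaining bees scout random new tours
        new_sites += [rng.sample(range(n), n) for _ in range(n_scout - n_elite)]
        sites = new_sites
    return min(sites, key=lambda t: tour_cost(t, d1, d2))
```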
Association rule mining (ARM) is a fundamental and widely used data-mining technique for extracting useful information from data. Traditional ARM algorithms lose computational efficiency by mining too many association rules that are not appropriate for a given user. Recent ARM research therefore investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases, and the results indicate that it outperforms existing metaheuristic methods.
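A hedged sketch of a discrete cuckoo search for rule mining is given below, assuming one common encoding (per item: absent, antecedent, or consequent), a support/confidence fitness with illustrative weights, and a heavy-tailed mutation as the discrete analogue of the Lévy flight; it is not the DCS-ARM algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(rule, db):
    """Rule encoded per item: 0 = absent, 1 = in antecedent, 2 = in consequent.
    Fitness mixes support and confidence (weights are illustrative)."""
    ante, cons = rule == 1, rule == 2
    if not ante.any() or not cons.any():
        return 0.0
    match_a = db[:, ante].all(axis=1)
    match_ac = match_a & db[:, cons].all(axis=1)
    support = match_ac.mean()
    confidence = match_ac.sum() / max(match_a.sum(), 1)
    return 0.5 * support + 0.5 * confidence

def levy_step(rule, n_items):
    """Discrete analogue of a Levy flight: mutate a heavy-tailed number of positions."""
    k = min(n_items, max(1, int(rng.pareto(1.5)) + 1))
    new = rule.copy()
    idx = rng.choice(n_items, size=k, replace=False)
    new[idx] = rng.integers(0, 3, size=k)
    return new

def cuckoo_arm(db, n_nests=25, iters=300, pa=0.25):
    n_items = db.shape[1]
    nests = rng.integers(0, 3, size=(n_nests, n_items))
    for _ in range(iters):
        for i in range(n_nests):
            cand = levy_step(nests[i], n_items)
            j = rng.integers(n_nests)       # random nest to challenge
            if fitness(cand, db) > fitness(nests[j], db):
                nests[j] = cand
        # abandon a fraction pa of the worst nests
        order = np.argsort([fitness(n, db) for n in nests])
        for i in order[: int(pa * n_nests)]:
            nests[i] = rng.integers(0, 3, size=n_items)
    return max(nests, key=lambda r: fitness(r, db))
```

Here `db` is a boolean transactions-by-items matrix; the returned vector decodes into one high-quality rule rather than the exhaustive rule set of traditional ARM.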
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when outliers are present in the studied phenomenon: the MLE loses its advantages because of the bad influence of the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers; such methods possess robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can further be used to increase the efficiency of the resulting estimators and the strength of the estimate via the maximum weighted trimmed likelihood (MWTL). In order to perform t…
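A minimal sketch of the MTL idea for a normal location-scale model is shown below: alternate between fitting on the h points with the largest likelihood contributions and reselecting those points. The trimming fraction and data are illustrative assumptions; the weighted variant (MWTL) would simply weight each retained contribution.

```python
import numpy as np

def mtl_normal(x, h, iters=50):
    """Maximum trimmed likelihood for a normal mean/scale: iterate between
    fitting on the h best-fitting points and reselecting those points."""
    keep = np.argsort(np.abs(x - np.median(x)))[:h]    # robust starting subset
    for _ in range(iters):
        mu, sigma = x[keep].mean(), x[keep].std(ddof=0) + 1e-12
        loglik = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
        new_keep = np.argsort(loglik)[-h:]             # h largest contributions
        if set(new_keep) == set(keep):
            break
        keep = new_keep
    return x[keep].mean(), x[keep].std(ddof=0)

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(10, 1, 90), rng.normal(60, 1, 10)])  # 10% outliers
print(mtl_normal(x, h=int(0.8 * len(x))))  # close to (10, 1); plain MLE is pulled toward 60
```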