Solving problems with artificial intelligence techniques has become widespread across many domains, yet applying artificial intelligence optimization algorithms to NP-hard problems remains challenging. In this manuscript, we implement the Naked Mole-Rat Algorithm (NMRA) to solve the N-queens problem and overcome the challenge of applying NMRA to a discrete search space. NMRA is improved with the local-search aspect of the Variable Neighborhood Search algorithm (VNS), using 2-opt and 3-opt moves, yielding the Naked Mole-Rat Algorithm based on Variable Neighborhood Search (NMRAVNS) for solving N-queens problems of different sizes. The main goal of NMRAVNS is to find the best solution, or set of solutions, within a reasonable amount of time. The proposed improvement boosts the exploitation capability of the basic NMRA and, with the added search strategies, increases the likelihood of finding the global best solution. A detailed comparison is performed and the results are reported with the relevant figures: NMRA and NMRAVNS are compared first; then NMRAVNS is tested against the Meerkat Clan Algorithm, the Genetic Algorithm, and Particle Swarm Optimization; finally, NMRAVNS is evaluated against an examined genetic-based algorithm to demonstrate the success of the proposed approach. NMRAVNS outperformed previous findings and scored competitive results for large numbers of queens, with an average time reduction of about 87% compared with previous findings.
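As a rough illustration of the local-search component mentioned above, the sketch below (a minimal assumed example, not the authors' implementation) encodes the board as a permutation of column indices, counts diagonal conflicts as the fitness, and applies a 2-opt style swap neighborhood; the random restart stands in for the NMRA population moves, and all names are illustrative.

```python
import random

def conflicts(perm):
    """Count pairs of queens attacking each other on a diagonal.
    Row i holds a queen in column perm[i]; a permutation already rules out
    row/column clashes, so only diagonals need checking."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(perm[i] - perm[j]) == abs(i - j))

def two_opt_step(perm):
    """One pass of a 2-opt style neighborhood: try swapping every pair of
    columns and keep the first swap that reduces the number of conflicts."""
    best = conflicts(perm)
    n = len(perm)
    for i in range(n):
        for j in range(i + 1, n):
            perm[i], perm[j] = perm[j], perm[i]
            c = conflicts(perm)
            if c < best:
                return perm, c                        # first-improvement move
            perm[i], perm[j] = perm[j], perm[i]       # undo the swap
    return perm, best                                 # local optimum reached

if __name__ == "__main__":
    n = 8
    board = random.sample(range(n), n)                # random permutation start
    cost = conflicts(board)
    while cost > 0:
        board, new_cost = two_opt_step(board)
        if new_cost == cost:                          # stuck: restart (stand-in for NMRA moves)
            board = random.sample(range(n), n)
            new_cost = conflicts(board)
        cost = new_cost
    print("solution:", board)
```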
The traveling salesman problem is addressed in this paper by introducing a distributed multi-ant colony algorithm implemented on a Raspberry Pi cluster. The central component of this novel technique is a master and eight workers, each running on a Raspberry Pi node. Each worker manages its own colony of ants, while the master coordinates communication among the worker nodes and assesses the best solution found. To put the newly built cluster through its paces, several traveling salesman problem datasets are used. The findings of the experiment indicate that a single-board-computer cluster running a multi-ant colony algorithm is a viable …
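A minimal sketch of the master-worker exchange described above (an assumed structure, not the authors' code): each worker runs its own colony for a batch of iterations and reports its best tour, while the master keeps the global best. Python's multiprocessing stands in here for the inter-node communication a Raspberry Pi cluster would actually use, and the random-tour worker is a placeholder for the real ant construction and pheromone updates.

```python
import random
from multiprocessing import Process, Queue

def tour_length(tour, dist):
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

def worker(worker_id, dist, iterations, results):
    """Stand-in for one ant colony: builds random tours and reports its best.
    A real worker would run ant construction + local pheromone updates."""
    n = len(dist)
    best_tour, best_len = None, float("inf")
    for _ in range(iterations):
        tour = random.sample(range(n), n)
        length = tour_length(tour, dist)
        if length < best_len:
            best_tour, best_len = tour, length
    results.put((worker_id, best_len, best_tour))

if __name__ == "__main__":
    random.seed(0)
    cities = [(random.random(), random.random()) for _ in range(12)]
    dist = [[((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 for (x2, y2) in cities]
            for (x1, y1) in cities]
    results = Queue()
    workers = [Process(target=worker, args=(i, dist, 2000, results))
               for i in range(8)]                     # eight workers, as in the paper
    for p in workers:
        p.start()
    # master role: gather each colony's best and keep the global optimum
    best = min((results.get() for _ in workers), key=lambda r: r[1])
    for p in workers:
        p.join()
    print("best length %.3f from worker %d" % (best[1], best[0]))
```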
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimal median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error are used to test the performance of the suggested filters (original and improved median filter) for removing noise from images. The simulation is carried out in MATLAB R2019b, and the results …
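The core of such a filter can be sketched as follows (a simplified Python/NumPy illustration for a grayscale image, with the crow-search layer omitted): flag extreme-valued pixels as salt-and-pepper candidates, replace only those with a local median, and score the result with PSNR, the fitness criterion mentioned above. Function names and the fixed 3x3 window are assumptions for illustration.

```python
import numpy as np

def detect_salt_pepper(gray):
    """Flag candidate noise pixels: salt-and-pepper corruption drives values
    to the extremes of the 8-bit range (0 or 255)."""
    return (gray == 0) | (gray == 255)

def median_restore(gray, noisy_mask, k=3):
    """Replace only flagged pixels with the median of their k x k window.
    A crow-search layer would tune the replacement to maximize the fitness."""
    pad = k // 2
    padded = np.pad(gray, pad, mode="edge")
    out = gray.copy()
    for r, c in zip(*np.nonzero(noisy_mask)):
        window = padded[r:r + k, c:c + k]
        out[r, c] = np.median(window)
    return out

def psnr(clean, restored):
    """Peak signal-to-noise ratio, the quality measure used as fitness above."""
    mse = np.mean((clean.astype(float) - restored.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```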
The control of prostheses and its complexity is one of the greatest challenges limiting amputees' wide use of upper-limb prostheses. The main challenges include the difficulty of extracting signals for controlling the prostheses, the limited number of degrees of freedom (DoF), and the prohibitive cost of complex control systems. In this study, a real-time hybrid control system based on electromyography (EMG) and voice commands (VC) is designed to render the prosthesis more dexterous, with the ability to accomplish an amputee's daily activities proficiently. The voice and EMG systems were combined in three proposed hybrid strategies, each with a different number of movements depending on the combination protocol between voice …
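Purely as an illustration of what a voice-plus-EMG combination rule could look like (the thresholds, movement labels, and decision logic below are assumptions, not the paper's protocol), one simple reading is that the voice command selects the movement while an EMG burst confirms and triggers it:

```python
def classify_emg(rms_value, threshold=0.12):
    """Very simplified EMG trigger: a contraction above an RMS threshold acts
    as the 'execute' signal (the threshold is an assumed value)."""
    return rms_value > threshold

def hybrid_command(voice_word, emg_rms, grips=("open", "close", "pinch", "point")):
    """Illustrative combination rule: voice selects which movement to perform,
    EMG confirms/executes it."""
    if voice_word in grips and classify_emg(emg_rms):
        return voice_word          # perform the selected movement
    return "rest"                  # no confirmed command

print(hybrid_command("pinch", 0.30))   # -> 'pinch'
print(hybrid_command("pinch", 0.05))   # -> 'rest'
```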
Human beings are greatly inspired by nature. Nature has the ability to solve very complex problems in its own distinctive way. The problems around us are becoming more and more complex, and at the same time nature guides us toward solving them, offering logical and effective ways to find solutions; nature thus acts as an optimization source for solving complex problems. Decomposition is a basic strategy in traditional multi-objective optimization; however, it has not yet been widely used in multi-objective evolutionary optimization.
Although computational strategies for handling Multi-objective Optimization Problems (MOPs) h…
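For readers unfamiliar with decomposition in evolutionary multi-objective optimization, the sketch below shows the standard textbook idea (not necessarily the exact formulation used in this work): a MOP is split into scalar subproblems, one per weight vector, here using the common Tchebycheff scalarization relative to the ideal point.

```python
import numpy as np

def tchebycheff(f, weights, z_star):
    """Tchebycheff decomposition: turn an objective vector f into a scalar
    cost for one weight vector, relative to the ideal point z_star."""
    return float(np.max(np.asarray(weights) * np.abs(f - z_star)))

def decompose(objective_vectors, weight_vectors):
    """Score each candidate solution against every scalar subproblem."""
    F = np.asarray(objective_vectors, dtype=float)
    z_star = F.min(axis=0)                      # ideal point estimated from the population
    return np.array([[tchebycheff(f, w, z_star) for w in weight_vectors] for f in F])

# tiny example: two objectives, three candidate solutions, two subproblems
F = [[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]]
W = [[0.8, 0.2], [0.2, 0.8]]
print(decompose(F, W))                          # each column is one scalar subproblem
```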
Face recognition is nowadays one of the vital areas of computer vision, owing to the availability and accessibility of the technology and its commercial applications. In brief, face recognition means automatically recognizing a person from an image or video frame. In this paper, an efficient face recognition algorithm is proposed that exploits wavelet decomposition to extract the most important and distinctive features of the face, and the eigenface method to classify faces according to the minimum distance between feature vectors. The Faces94 database is used to test the method. Excellent recognition with minimal computation time is obtained, with accuracy reaching 100% and recognition time decrease…
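A compact sketch of that pipeline is given below (an assumed illustration, not the paper's implementation): the wavelet step is approximated by the low-low (LL) subband of a Haar transform, which for this purpose is just a 2x2 block average, followed by eigenface projection and minimum-distance classification.

```python
import numpy as np

def approx_subband(img):
    """Stand-in for one level of wavelet decomposition: the LL approximation
    subband of a Haar transform is simply a 2x2 block average."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w].astype(float)
    return (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def train_eigenfaces(face_images, n_components=20):
    """Project training faces onto the top principal components (eigenfaces)."""
    X = np.stack([approx_subband(im).ravel() for im in face_images])
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:n_components]                    # eigenfaces
    return mean, basis, (X - mean) @ basis.T     # projected gallery

def recognize(query, mean, basis, gallery, labels):
    """Minimum-distance classification in eigenface space."""
    q = (approx_subband(query).ravel() - mean) @ basis.T
    d = np.linalg.norm(gallery - q, axis=1)
    return labels[int(np.argmin(d))]

# tiny demo on synthetic "faces"
rng = np.random.default_rng(0)
faces = [rng.integers(0, 256, (32, 32)) for _ in range(6)]
labels = ["p1", "p1", "p2", "p2", "p3", "p3"]
mean, basis, gallery = train_eigenfaces(faces, n_components=4)
print(recognize(faces[2], mean, basis, gallery, labels))   # -> 'p2'
```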
Background and Aim: Due to the rapid growth of data communication and multimedia system applications, security has become a critical issue in the communication and storage of images. This study aims to improve encryption and decryption for various types of images by decreasing time consumption and strengthening security. Methodology: An algorithm is proposed for encrypting images based on the Carlisle Adams and Stafford Tavares (CAST) block cipher algorithm with 3D and 2D logistic maps. A chaotic function that increases the randomness in the encrypted data and images, thereby breaking the relation sequence through the encryption procedure, is introduced. The time is decreased by using three secure and private S-boxes rather than using si…
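To illustrate the chaotic-map ingredient only (the paper uses 2D and 3D logistic maps combined with CAST; the sketch below uses the basic 1D logistic map and an assumed quantization rule, purely for illustration), a chaotic orbit seeded by the key can be quantized into a byte keystream that whitens pixel data around the block-cipher stage:

```python
def logistic(x, r=3.99):
    """Basic 1D logistic map; with r close to 4 and x0 in (0, 1) the orbit is
    chaotic and stays inside the unit interval."""
    return r * x * (1 - x)

def keystream(length, x0=0.3731, burn_in=500, r=3.99):
    """Quantize the chaotic orbit into a byte keystream (illustrative only,
    not the paper's exact scheme)."""
    x = x0
    for _ in range(burn_in):                 # discard the transient
        x = logistic(x, r)
    out = bytearray()
    for _ in range(length):
        x = logistic(x, r)
        out.append(int(x * 1e6) % 256)       # quantize to a byte
    return bytes(out)

ks = keystream(8)
masked = bytes(b ^ k for b, k in zip(b"pixels!!", ks))   # simple XOR demo
print(masked.hex())
```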
In this paper, an algorithm for binary codebook design is used within the vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. VQ is used to compress the bitmap, i.e., the output of the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of an image is matched against this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates…
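A small sketch of the two stages (an assumed minimal example, not the authors' code): AMBTC encodes each block as a mean-split bitmap plus two reconstruction levels, and the bitmap is then replaced by the closest binary codeword, with the per-bit mismatch count standing in for the bitmap replacement error criterion.

```python
import numpy as np

def ambtc_block(block):
    """AMBTC: encode a block as its mean-split bitmap plus two levels."""
    mean = block.mean()
    bitmap = (block >= mean).astype(np.uint8)
    high = block[bitmap == 1].mean() if bitmap.any() else mean
    low = block[bitmap == 0].mean() if (bitmap == 0).any() else mean
    return bitmap, high, low

def quantize_bitmap(bitmap, codebook):
    """Replace the bitmap with the closest binary codeword; the per-bit
    mismatch count plays the role of the bitmap replacement error."""
    errors = [np.count_nonzero(bitmap != cw) for cw in codebook]
    best = int(np.argmin(errors))
    return best, errors[best]

# tiny demo with one 4x4 block and a two-word codebook of 4x4 binary patterns
rng = np.random.default_rng(1)
block = rng.integers(0, 256, (4, 4))
codebook = [np.zeros((4, 4), np.uint8), np.tril(np.ones((4, 4), np.uint8))]
bm, hi, lo = ambtc_block(block)
idx, err = quantize_bitmap(bm, codebook)
print(idx, err, round(hi, 1), round(lo, 1))
```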
... Show MoreClustering algorithms have recently gained attention in the related literature since
they can help current intrusion detection systems in several aspects. This paper
proposes genetic algorithm (GA) based clustering, serving to distinguish patterns
incoming from network traffic packets into normal and attack. Two GA based
clustering models for solving intrusion detection problem are introduced. The first
model coined as handles numeric features of the network packet, whereas
the second one coined as concerns all features of the network packet.
Moreover, a new mutation operator directed for binary and symbolic features is
proposed. The basic concept of proposed mutation operator depends on the most
frequent value
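One plausible reading of that operator (an assumption for illustration only, since the abstract does not give the exact rule) is that a mutated symbolic or binary gene is reset to the most frequent value of that feature in the current population, as sketched below.

```python
import random
from collections import Counter

def most_frequent_values(population):
    """Column-wise mode of the symbolic/binary genes across the population."""
    return [Counter(col).most_common(1)[0][0] for col in zip(*population)]

def mutate_symbolic(chromosome, modes, p_mut=0.05):
    """Illustrative mutation: with probability p_mut, a symbolic gene is reset
    to the feature's most frequent value (an assumed rule, not the paper's)."""
    return [modes[i] if random.random() < p_mut else gene
            for i, gene in enumerate(chromosome)]

# toy population of symbolic network-packet features
population = [["tcp", "http", 1], ["udp", "dns", 0], ["tcp", "http", 0], ["tcp", "ftp", 0]]
modes = most_frequent_values(population)          # ['tcp', 'http', 0]
print(mutate_symbolic(["udp", "dns", 1], modes, p_mut=0.5))
```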
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the problems faced in nonparametric estimation, where kernel estimators are the most common choice. In this paper, the copula density function is estimated using the probit-transformation nonparametric method in order to eliminate the boundary bias problem that kernel estimators suffer from. A simulation study of three nonparametric methods for estimating the copula density function is carried out, and we propose a new method that is better than the others across five types of copulas, with different sample sizes, different levels of correlation between the copula variables, and different parameters of the function. The …
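The probit-transformation idea can be sketched as follows (a standard textbook form of the estimator, not the paper's exact variant; the bandwidth and demo data are assumptions): map the pseudo-observations to the real line with the normal quantile function, fit a kernel density there where no boundaries exist, and transform the estimate back to the unit square by dividing by the normal densities.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

def probit_copula_density(u_samples, v_samples):
    """Probit-transformation kernel estimator of a bivariate copula density."""
    s = norm.ppf(u_samples)                       # probit transform of the pseudo-observations
    t = norm.ppf(v_samples)
    kde = gaussian_kde(np.vstack([s, t]))         # default bandwidth; a real study would tune it

    def c_hat(u, v):
        x, y = norm.ppf(u), norm.ppf(v)
        return kde(np.vstack([x, y])) / (norm.pdf(x) * norm.pdf(y))

    return c_hat

# demo on pseudo-observations from a positively dependent Gaussian pair
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
u, v = norm.cdf(z[:, 0]), norm.cdf(z[:, 1])
c_hat = probit_copula_density(u, v)
print(c_hat(np.array([0.5]), np.array([0.5])))    # estimated density near the centre
```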
Plagiarism has grown rapidly with the explosive growth of the Internet, where a massive volume of information offered with effortless use and access makes plagiarism, the act of taking someone else's work (ideas or even words) and presenting it as one's own, easy to perform. To ensure originality, plagiarism detection has become highly necessary in various areas, so that people who aim to plagiarize must expend considerable effort to produce work centered on their own research.
In this paper, work is proposed for improving the detection of textual plagiarism through a model for can…