A new human-based heuristic optimization method, named the Snooker-Based Optimization Algorithm (SBOA), is introduced in this study. The method draws its inspiration from the traits of sales elites: the qualities every salesperson aspires to possess. Typically, salespersons strive to enhance their skills through autonomous learning or by seeking guidance from others, and they communicate regularly with customers to win approval for their products or services. Building on this concept, SBOA searches a given space for the optimal solution. To assess the feasibility and effectiveness of SBOA against other algorithms, we tested it on ten single-objective functions from the CEC (Congress on Evolutionary Computation) 2019 benchmark suite, twenty-four single-objective functions from the CEC 2022 benchmark suite, and four engineering problems. Seven algorithms were used for comparison: the Differential Evolution algorithm (DE), Sparrow Search Algorithm (SSA), Sine Cosine Algorithm (SCA), Whale Optimization Algorithm (WOA), Butterfly Optimization Algorithm (BOA), Lion Swarm Optimization (LSO), and Golden Jackal Optimization (GJO). The results of these experiments were compared in terms of solution accuracy and convergence speed. The findings suggest that SBOA is a straightforward and viable approach that, overall, outperforms the aforementioned algorithms.
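The abstract does not reproduce SBOA's update equations, but the behaviour it describes (self-learning plus guidance from the best performer) can be sketched as a generic population-based search. Everything below, including the function name, the decaying noise schedule, and the greedy acceptance rule, is an illustrative assumption, not the paper's actual method:

```python
import random

def sboa_style_search(f, dim, lo, hi, pop_size=20, iters=200, seed=0):
    """Toy population-based search in the spirit described above: each
    agent moves toward the current best (guidance) and adds a decaying
    random perturbation (self-learning).  NOT the real SBOA."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for t in range(iters):
        sigma = 0.1 * (hi - lo) * (1.0 - t / iters)  # decaying step size
        for i, x in enumerate(pop):
            cand = [min(max(xi + rng.random() * (bi - xi)
                            + rng.gauss(0.0, sigma), lo), hi)
                    for xi, bi in zip(x, best)]
            if f(cand) < f(x):                       # greedy acceptance
                pop[i] = cand
                if f(cand) < f(best):
                    best = cand
    return best, f(best)

sphere = lambda x: sum(xi * xi for xi in x)          # classic test function
best, val = sboa_style_search(sphere, dim=5, lo=-5.0, hi=5.0)
print("best value:", val)
```

The pull toward the best agent plus greedy acceptance is the common skeleton of such human-inspired metaheuristics; the benchmarked SBOA would differ in its specific update rules.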
This research conducts a linguistic analysis of the translation of the novel "The Corpse Washer" by the Iraqi author Sinan Antoon. Its main objective is to explore the challenges and strategies involved in translating this literary work, particularly the difficulty of rendering the Baghdadi dialect and the obstacles it poses for non-native speakers. Employing a descriptive methodology, the study examines the linguistic aspects of the translation, focusing on selected conversational passages in the novel. It identifies the difficulties translators face in preserving the essence of the original and presents instances of errors in translating vocabulary, conversational expressions, proverbs, and idioms.
The human mind has known philosophy and logic since ancient times, and historiography after them, whereas the concept of semiotics emerged in the modern era and has become a field of knowledge like the others. In its various conceptions and frames of reference, it deals with the processes that lead to and reveal meaning, through what is hidden as well as what is disclosed; it is the product of human activity in both its pragmatic and cognitive dimensions. The concept of the semiotic sign has become a key that opens every field of study, research, and investigation, owing to its capacity for description, explanation, and analysis. The paper is divided into two sections preceded by a the…
Many attempts have been made to modify the surface of orthodontic micro-implants and to prevent microbial colonization by coating them with antimicrobial nanoparticles (NPs). The purpose of the present study was to evaluate the cytotoxicity of different NPs, namely titanium dioxide (TiO2) and zinc oxide (ZnO) NPs, used to coat titanium orthodontic micro-implants.
Thirty orthodontic micro-implants were included in this study and divided into three groups: an uncoated control group, TiO2-coated micro-implants, and TiO2- and ZnO-coated micro-implants. Scann…
In lifetime processes, the data from some systems cannot be regarded as coming from a single population; in fact, they may represent several subpopulations. In such cases a single known distribution cannot model the data; instead, a mixture distribution is used to model them and to classify observations into subgroups. The Rayleigh mixture is well suited to lifetime processes. This paper infers the model parameters with the expectation-maximization (EM) algorithm applied to the maximum likelihood function. The technique is applied to simulated data under several scenarios, and estimation accuracy is examined via the average mean square error (AMSE) and the average classification success rate (ACSR).
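The EM updates for a Rayleigh mixture have a closed form, which makes the scheme described above easy to sketch. The following minimal illustration uses a moment-based initialization and a 60/40 two-component test mixture; both are assumptions of this sketch, not the paper's simulation design:

```python
import math, random

def rayleigh_pdf(x, sigma):
    """Rayleigh density f(x; sigma) = (x / sigma^2) exp(-x^2 / (2 sigma^2))."""
    return (x / sigma ** 2) * math.exp(-x * x / (2 * sigma ** 2))

def em_rayleigh_mixture(data, k=2, iters=100):
    """EM for a k-component Rayleigh mixture (textbook scheme)."""
    n = len(data)
    # deterministic init: overall Rayleigh MLE, spread out per component
    m = math.sqrt(sum(x * x for x in data) / (2 * n))
    sig = [m * (0.5 + j) for j in range(k)]
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibilities of each component per point
        resp = []
        for x in data:
            dens = [w[j] * rayleigh_pdf(x, sig[j]) for j in range(k)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: closed-form updates (sigma_j^2 = sum r x^2 / (2 sum r))
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            sig[j] = math.sqrt(sum(r[j] * x * x
                                   for r, x in zip(resp, data)) / (2 * nj))
    return w, sig

# simulate a 60/40 mixture of Rayleigh(sigma=1) and Rayleigh(sigma=4)
rng = random.Random(42)
data = [(1.0 if rng.random() < 0.6 else 4.0)
        * math.sqrt(-2.0 * math.log(1.0 - rng.random())) for _ in range(500)]
weights, sigmas = em_rayleigh_mixture(data)
```

The M-step follows from setting the derivative of the weighted Rayleigh log-likelihood with respect to each sigma to zero; classification into subgroups then uses the final responsibilities.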
In this paper, an algorithm for binary codebook design is used within a vector quantization (VQ) technique to improve the performance of the absolute moment block truncation coding (AMBTC) method. VQ compresses the bitmap produced by the AMBTC stage. The binary codebook is generated for many images by randomly selecting code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. Each image bitmap is matched against the codebook according to the average bitmap replacement error (ABPRE) criterion. The approach is well suited to reducing bit rates.
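AMBTC itself is standard: each pixel block is reduced to a mean-threshold bitmap plus two reconstruction levels derived from the first absolute central moment, and the VQ stage then replaces each bitmap with a codeword from a binary codebook. A minimal sketch follows; the 1x4 block and two-word codebook are toy values, and plain Hamming distance stands in for the paper's ABPRE criterion:

```python
def ambtc_encode(block):
    """Absolute moment BTC for one pixel block: keep a bitmap plus
    two reconstruction levels (low/high)."""
    m = len(block)
    mean = sum(block) / m
    alpha = sum(abs(x - mean) for x in block) / m  # first absolute central moment
    bitmap = [1 if x >= mean else 0 for x in block]
    q = sum(bitmap)
    low = mean - m * alpha / (2 * (m - q)) if q < m else mean
    high = mean + m * alpha / (2 * q) if q > 0 else mean
    return bitmap, low, high

def ambtc_decode(bitmap, low, high):
    """Reconstruct the block from the bitmap and the two levels."""
    return [high if b else low for b in bitmap]

def nearest_codeword(bitmap, codebook):
    """VQ step: map the bitmap to the closest binary codeword
    (Hamming distance used here in place of ABPRE)."""
    return min(codebook, key=lambda c: sum(b != cb for b, cb in zip(bitmap, c)))

block = [10, 12, 200, 210]                  # toy 1x4 pixel block
bitmap, low, high = ambtc_encode(block)
codebook = [[1, 1, 0, 0], [0, 0, 1, 1]]     # toy binary codebook
coded = nearest_codeword(bitmap, codebook)
```

Transmitting the codeword index instead of the full bitmap is what lowers the bit rate relative to plain AMBTC.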
To obtain good estimates with accurate results, an appropriate estimation method must be chosen. Most of the estimating equations in the classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are therefore inefficient. In this paper, we estimate the survival function of censored data using the genetic algorithm, one of the best-known artificial intelligence algorithms, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution and, in turn, optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted least squares method.
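To make the idea concrete, here is a hedged sketch of a genetic algorithm fitting a two-parameter Weibull model. For brevity it maximizes the log-likelihood on complete (uncensored) simulated data rather than reproducing the paper's moment or least-squares criteria and censoring scheme; all operator settings are assumptions of this sketch:

```python
import math, random

def weibull_loglik(shape, scale, data):
    """Log-likelihood of a two-parameter Weibull sample (complete data)."""
    if shape <= 0 or scale <= 0:
        return float("-inf")
    ll = 0.0
    for t in data:
        z = t / scale
        ll += math.log(shape / scale) + (shape - 1.0) * math.log(z) - z ** shape
    return ll

def ga_fit_weibull(data, pop_size=30, gens=60, seed=1):
    """Plain GA (tournament selection, blend crossover, gaussian
    mutation) maximizing the Weibull log-likelihood."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 10.0), rng.uniform(0.1, 10.0))
           for _ in range(pop_size)]
    for _ in range(gens):
        fits = [weibull_loglik(s, c, data) for s, c in pop]
        def pick():                                  # 3-way tournament
            return pop[max(rng.sample(range(pop_size), 3),
                           key=lambda j: fits[j])]
        nxt = []
        for _ in range(pop_size):
            (s1, c1), (s2, c2) = pick(), pick()
            u = rng.random()                         # blend crossover
            child = (u * s1 + (1 - u) * s2, u * c1 + (1 - u) * c2)
            if rng.random() < 0.3:                   # gaussian mutation
                child = (max(0.1, child[0] + rng.gauss(0, 0.2)),
                         max(0.1, child[1] + rng.gauss(0, 0.2)))
            nxt.append(child)
        pop = nxt
    return max(pop, key=lambda p: weibull_loglik(p[0], p[1], data))

# simulate Weibull(shape=2, scale=3) lifetimes via inverse-CDF sampling
rng = random.Random(7)
data = [3.0 * (-math.log(1.0 - rng.random())) ** (1.0 / 2.0)
        for _ in range(200)]
shape_hat, scale_hat = ga_fit_weibull(data)
```

The fitted parameters give the survival function estimate S(t) = exp(-(t/scale)^shape); the paper's variants swap the fitness function for moment or (weighted) least-squares criteria and adjust the likelihood for censoring.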
... Show MoreRA Ali, LK Abood, Int J Sci Res, 2017 - Cited by 2