A new human-based heuristic optimization method, named the Sales-Based Optimization Algorithm (SBOA), is introduced in this study. The method draws its inspiration from the traits of sales elites, qualities every salesperson aspires to possess: salespersons typically strive to enhance their skills through autonomous learning or by seeking guidance from others, and they communicate regularly with customers to gain approval for their products or services. Building on this concept, SBOA searches a given space for the optimal solution by exploring candidate positions across it. To assess the feasibility and effectiveness of SBOA against other algorithms, we tested it on ten single-objective functions from the IEEE Congress on Evolutionary Computation (CEC) 2019 benchmark suite, twenty-four single-objective functions from the CEC 2022 benchmark suite, and four engineering problems. Seven comparison algorithms were used: the Differential Evolution algorithm (DE), Sparrow Search Algorithm (SSA), Sine Cosine Algorithm (SCA), Whale Optimization Algorithm (WOA), Butterfly Optimization Algorithm (BOA), Lion Swarm Optimization (LSO), and Golden Jackal Optimization (GJO). The results of these experiments were compared in terms of accuracy and convergence speed. The findings suggest that SBOA is a straightforward and viable approach that, overall, outperforms the algorithms above.
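The three behaviors the abstract describes (autonomous learning, guidance from better performers, and customer feedback) can be illustrated with a generic population-based sketch in Python. This is our own minimal interpretation of those behaviors, not the published SBOA update equations; all rates and step sizes are arbitrary assumptions.

```python
import numpy as np

def sboa_sketch(f, dim, pop=30, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Generic population optimizer mimicking the three behaviors in the
    abstract: autonomous learning, learning from a better performer, and
    small feedback-driven adjustments. Not the published SBOA equations."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        best = X[fit.argmin()]
        for i in range(pop):
            r = rng.random()
            if r < 0.4:      # autonomous learning: random local exploration
                cand = X[i] + rng.normal(0, 0.1 * (ub - lb), dim)
            elif r < 0.8:    # guidance: move toward the current best performer
                cand = X[i] + rng.random(dim) * (best - X[i])
            else:            # customer feedback: small perturbation of the best
                cand = best + rng.normal(0, 0.01 * (ub - lb), dim)
            cand = np.clip(cand, lb, ub)
            fc = f(cand)
            if fc < fit[i]:  # greedy selection keeps only improvements
                X[i], fit[i] = cand, fc
    return X[fit.argmin()], fit.min()

# Usage on the sphere function, a standard smoke test.
x_best, f_best = sboa_sketch(lambda x: np.sum(x ** 2), dim=10)
print(f"best objective: {f_best:.2e}")
```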
Purpose: This research seeks to provide a perspective based on the creation of sustainable value for bank customers in the context of total quality management and relationship marketing. It aims to develop a model for measuring sustainable customer value under the pillars of total quality management (administrative leadership, employee involvement, continuous improvement, process improvement, staff training), mediated by relationship marketing, and to explore which of these variables and dimensions is most influential in creating sustainable value for the customer.
We present the exponentiated expanded power function (EEPF) distribution with four parameters. The distribution was constructed with the exponentiation method introduced by Gupta, which expands a baseline distribution by adding a new shape parameter to its cumulative distribution function, yielding a new distribution; a distribution obtained this way belongs to the exponential family. We also derived the survival and failure (hazard) rate functions of this distribution, along with some of its mathematical properties, and then used the maximum likelihood (ML) method and the developed least squares (LSD) method to estimate its parameters.
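For reference, the Gupta-type exponentiation the abstract refers to raises a baseline CDF to a new shape parameter. In generic notation (with F and f the baseline power-function CDF and density, and α > 0 the added shape parameter, our own symbols):

$$G(x) = [F(x)]^{\alpha}, \qquad g(x) = \alpha\, f(x)\, [F(x)]^{\alpha - 1},$$
$$S(x) = 1 - [F(x)]^{\alpha}, \qquad h(x) = \frac{g(x)}{S(x)} = \frac{\alpha\, f(x)\, [F(x)]^{\alpha - 1}}{1 - [F(x)]^{\alpha}}.$$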
Interest in the topic of prediction has increased in recent years, and modern methods such as artificial neural network models have appeared; these methods are able to learn and adapt themselves to any model and do not require assumptions about the nature of the time series. On the other hand, the classical methods currently used for prediction, such as Box-Jenkins, can make series identification and modeling difficult because they assume strict conditions.
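A minimal illustration of the contrast the abstract draws (a classical Box-Jenkins fit versus a neural network that learns the lag structure itself), sketched in Python with statsmodels and scikit-learn; the series, model orders, and network size are arbitrary stand-ins, not taken from the study:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic AR(1) series as a stand-in for the time series in the study.
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + rng.normal()

train, test = y[:180], y[180:]

# Box-Jenkins route: an ARIMA model with an assumed (p, d, q) order.
arima_fc = ARIMA(train, order=(1, 0, 0)).fit().forecast(steps=len(test))

# Neural-network route: an MLP learns the lag structure from the data itself.
p = 3  # number of lagged inputs (arbitrary choice)
X = np.column_stack([y[i:len(train) - p + i] for i in range(p)])
target = train[p:]
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X, target)

# One-step-ahead MLP forecasts over the test window.
history = list(train)
mlp_fc = []
for t in range(len(test)):
    mlp_fc.append(mlp.predict(np.array(history[-p:]).reshape(1, -1))[0])
    history.append(test[t])

print("ARIMA MSE:", np.mean((arima_fc - test) ** 2))
print("MLP   MSE:", np.mean((np.array(mlp_fc) - test) ** 2))
```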
The factors affecting the number of hours of electricity supply in the city of Baghdad were studied. The study sample consisted of (365) daily observations for the year 2018, represented by six variables used in the study. The main objective was to examine the relationships among these variables and to estimate the effects of the predictor variables on the dependent variable (the number of hours of electricity supply in the city of Baghdad). To achieve this, structural equation modeling / path analysis was used, employing the AMOS program.
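For readers without access to AMOS, a path-analysis model of this general shape can be sketched in Python with the semopy package. The variable names below are hypothetical placeholders, since the study's six variables are not listed in the abstract, and the synthetic data merely makes the snippet self-contained:

```python
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(5)
n = 365  # one observation per day, as in the study

# Hypothetical placeholder variables; these names are purely illustrative.
temperature = rng.normal(30, 8, n)
demand = 0.5 * temperature + rng.normal(0, 2, n)
fuel_supply = rng.normal(100, 10, n)
supply_hours = 20 - 0.3 * demand + 0.05 * fuel_supply + rng.normal(0, 1, n)
data = pd.DataFrame({"temperature": temperature, "demand": demand,
                     "fuel_supply": fuel_supply, "supply_hours": supply_hours})

# Path model in lavaan-style syntax: supply hours regressed on predictors,
# with demand itself driven by temperature (an AMOS-style path diagram).
desc = """
supply_hours ~ demand + fuel_supply
demand ~ temperature
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path coefficients, standard errors, p-values
```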
Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using simulation, with different sample sizes (n = 25, 50, 100) and (r = 1000) replications; the Matlab program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
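A minimal Python analogue of such a simulation (the abstract's experiment was run in Matlab): generate Poisson-distributed responses, fit a Poisson GLM, and score it with MSE and AIC over replications. The Conway-Maxwell-Poisson fit is omitted here because it requires a custom likelihood; the design and coefficients are illustrative:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

def one_replication(n):
    # Illustrative design: one covariate, true coefficients (0.5, 0.3).
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    mu = np.exp(0.5 + 0.3 * x)
    y = rng.poisson(mu)
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mse = np.mean((fit.fittedvalues - mu) ** 2)
    return mse, fit.aic

for n in (25, 50, 100):
    results = np.array([one_replication(n) for _ in range(1000)])
    print(f"n={n}: mean MSE={results[:, 0].mean():.4f}, "
          f"mean AIC={results[:, 1].mean():.2f}")
```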
In this study, different methods were used for estimating the location and scale parameters of the extreme value distribution, which is one of the exponential-family distributions: maximum likelihood estimation (MLE), the method of moments (ME), and percentile-based approximation estimators known as the White method. Ordinary least squares estimation (OLS), weighted least squares estimation (WLS), ridge regression estimation (Rig), and adjusted ridge regression estimation (ARig) were also used. Two parameters for the expected value of the percentile as an estimate for the distribution f
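For concreteness, the two most standard of these estimators can be sketched in Python for the Gumbel (extreme value) distribution: scipy's built-in numerical MLE, and the closed-form method-of-moments estimates (scale = s·√6/π, location = x̄ − γ·scale, with γ the Euler-Mascheroni constant). The sample here is simulated, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated Gumbel sample with true location 3 and scale 2.
sample = stats.gumbel_r.rvs(loc=3, scale=2, size=500, random_state=rng)

# Maximum likelihood estimation (numerical, built into scipy).
loc_mle, scale_mle = stats.gumbel_r.fit(sample)

# Method of moments: closed-form from sample mean and standard deviation.
gamma = np.euler_gamma                      # Euler-Mascheroni constant
scale_mom = sample.std(ddof=1) * np.sqrt(6) / np.pi
loc_mom = sample.mean() - gamma * scale_mom

print(f"MLE: loc={loc_mle:.3f}, scale={scale_mle:.3f}")
print(f"MoM: loc={loc_mom:.3f}, scale={scale_mom:.3f}")
```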
The current research aims at identifying which of the dimensions of organizational learning capabilities are more influential on the university's knowledge capital, and the extent to which they can be applied effectively at Wasit University. The research treated organizational learning capabilities as an explanatory variable with four dimensions (experimentation and openness, sharing and transfer of knowledge, dialogue, interaction with the external environment), and knowledge capital as a response variable with four dimensions (human capital, structural capital, client capital, operational capital). The research problem is posed in the following question
The common normal distribution was transformed through the Kummer Beta generator into the Kummer Beta Generalized Normal Distribution (KBGND). The distribution's parameters and hazard function were then estimated with the maximum likelihood (MLE) method, and these estimates were refined by employing the genetic algorithm. Simulation was used, assuming a number of models and different sample sizes. The main finding was that the common MLE method is best for estimating the parameters of the KBGND, compared with the genetic algorithm, according to the Mean Squared Error (MSE) criterion, and for estimating the hazard function according to the Integrated Mean Squared Error (IMSE) criterion. While the pr
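The abstract's idea of refining likelihood estimates with an evolutionary search can be sketched generically in Python. Here scipy's differential_evolution stands in for the genetic algorithm, and a plain normal likelihood stands in for the KBGND likelihood, which has no off-the-shelf implementation:

```python
import numpy as np
from scipy import stats
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)
data = rng.normal(loc=5, scale=1.5, size=300)  # stand-in sample

# Negative log-likelihood; for the KBGND this would be its own density.
def nll(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    return -np.sum(stats.norm.logpdf(data, loc=mu, scale=sigma))

# Evolutionary search over box bounds around rough starting values.
bounds = [(data.mean() - 3, data.mean() + 3), (1e-3, 5 * data.std())]
result = differential_evolution(nll, bounds, seed=3, tol=1e-8)
mu_hat, sigma_hat = result.x
print(f"evolutionary MLE: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}")
```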
The grey system model GM(1,1) is the prediction model for time series and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the Particle Swarm Optimization method (PSO). These methods were compared on the basis of the mean squared error (MSE) and the mean absolute percentage error (MAPE), and simulation was used to select the best of the four methods, which was then applied to real data. These data represent the consumption rates of two types of fuel, heavy fuel oil (HFO) and diesel oil (D.O), and tests were used to conf
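A baseline GM(1,1), fitted here with ordinary least squares (the classical estimator, standing in for the four estimation methods the abstract compares), can be sketched in Python; the input series is illustrative, not the study's fuel data:

```python
import numpy as np

def gm11(x0, horizon=3):
    """Fit a GM(1,1) grey model to series x0 and forecast `horizon` steps."""
    n = len(x0)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # background (mean) sequence
    # Least-squares solution of x0(k) = -a*z1(k) + b for parameters a, b.
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    # Time response: x1_hat(k+1) = (x0(1) - b/a) * exp(-a*k) + b/a
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])   # inverse AGO
    return x0_hat[:n], x0_hat[n:]

series = np.array([120.0, 124.5, 131.0, 137.2, 142.9, 150.1])  # illustrative
fitted, forecast = gm11(series)
mape = np.mean(np.abs((fitted - series) / series)) * 100
print("forecast:", forecast.round(2), f"in-sample MAPE={mape:.2f}%")
```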
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and these were employed with the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
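A minimal sketch of the pipeline this abstract describes: smooth each subject's profile with a cubic smoothing spline, then cluster the smoothed curves into subgroups. The data, smoothing factor, and number of clusters are illustrative assumptions:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 20)          # common time grid (balanced design)

# Illustrative profiles: two latent patterns plus noise, 30 subjects.
profiles = np.array([
    np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size) if i % 2 == 0
    else np.cos(2 * np.pi * t) + rng.normal(0, 0.2, t.size)
    for i in range(30)
])

# Smooth each profile with a cubic (k=3) smoothing spline, then evaluate
# on a fine grid so clustering operates on the smoothed curves.
grid = np.linspace(0, 1, 50)
smoothed = np.array([
    UnivariateSpline(t, y, k=3, s=0.5)(grid) for y in profiles
])

# Group co-expressed profiles into subgroups with k-means.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print("cluster sizes:", np.bincount(labels))
```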