This paper proposes a novel meta-heuristic optimization algorithm called the fine-tuning meta-heuristic algorithm (FTMA) for solving global optimization problems. In this algorithm, the solutions are fine-tuned using the fundamental steps of meta-heuristic optimization, namely exploration, exploitation, and randomization, in such a way that if one step improves the solution, the remaining steps need not be executed. The performance of the proposed FTMA is compared with that of five other optimization algorithms over ten benchmark test functions; nine of them are well known in the literature, while the tenth is proposed by the authors and introduced in this article. A single trial was used to illustrate the behavior of each algorithm, and a further 30 trials were run to measure the statistical performance of the proposed algorithm against the others. The results confirm that the proposed FTMA global optimization algorithm performs competitively with its counterparts in terms of speed and escaping local minima.
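A minimal sketch of the early-stopping fine-tuning idea described in the abstract follows: try exploration, then exploitation, then randomization, and stop at the first step that improves the solution. The operator definitions and parameter names below are assumptions for illustration, not the authors' exact formulation.

```python
# Hypothetical sketch of the FTMA fine-tuning loop: exploration, exploitation,
# and randomization are tried in order, stopping at the first improving step.
# Operators and parameters are our assumptions, not the paper's formulation.
import random

def fine_tune(solution, best, bounds, fitness, step=0.1):
    base = fitness(solution)
    candidates = (
        # Exploration: perturb toward a random point in the search space.
        [x + step * (random.uniform(lo, hi) - x) for x, (lo, hi) in zip(solution, bounds)],
        # Exploitation: move toward the best solution found so far.
        [x + step * (b - x) for x, b in zip(solution, best)],
        # Randomization: resample the solution uniformly within the bounds.
        [random.uniform(lo, hi) for lo, hi in bounds],
    )
    for cand in candidates:          # stop at the first improving step
        if fitness(cand) < base:     # assuming minimization
            return cand
    return solution

# Toy usage: minimize the 2-D sphere function.
sphere = lambda v: sum(x * x for x in v)
bounds = [(-5.0, 5.0)] * 2
pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(20)]
for _ in range(100):
    best = min(pop, key=sphere)
    pop = [fine_tune(s, best, bounds, sphere) for s in pop]
print(min(map(sphere, pop)))
```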
This paper describes a new finishing process in which newly made magnetic abrasives are used to effectively finish brass plate, a material that is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness achieved by magnetic abrasive polishing. The process parameters are the current applied to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. Analysis of variance (ANOVA) was performed using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models based on statistical m
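The abstract does not name the statistical software or the factor levels used. As a rough illustration of the kind of ANOVA described, a one-way test of a single Taguchi factor (applied current) against surface roughness can be run with scipy; the levels and Ra readings below are invented for this sketch.

```python
# Illustrative one-way ANOVA for one Taguchi factor (applied current) against
# surface roughness Ra. Factor levels and readings are invented; the paper's
# actual design, levels, and software are not specified in the abstract.
from scipy.stats import f_oneway

# Ra (um) measured at three hypothetical current levels, 4 replicates each.
ra_2A = [0.42, 0.45, 0.40, 0.44]
ra_3A = [0.33, 0.31, 0.35, 0.32]
ra_4A = [0.28, 0.30, 0.27, 0.29]

f_stat, p_value = f_oneway(ra_2A, ra_3A, ra_4A)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p => current level matters
```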
In this paper, the class of semi
A substantial portion of today's multimedia data exists in the form of unstructured text. However, the unstructured nature of text poses a significant challenge in meeting users' information requirements. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing. However, accurately categorizing texts becomes challenging due to the increasing presence of non-informative features within the corpus. Several reviews on TC, encompassing various feature selection (FS) approaches to eliminate non-informative features, have been published previously. However, these reviews do not adequately cover the recently explored FS approaches to the TC problem, such as optimization techniques.
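To make the FS-for-TC pipeline concrete, here is a minimal scikit-learn sketch using a baseline chi-square filter to drop non-informative terms before classification. This is a classical filter method, not one of the optimization-based FS techniques the review focuses on, and the dataset is a stand-in.

```python
# Minimal filter-based feature selection for text classification: vectorize,
# keep only the most class-informative terms by chi-square score, classify.
# A baseline filter method, not an optimization-based FS technique.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
test = fetch_20newsgroups(subset="test", categories=["sci.space", "rec.autos"])

clf = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    SelectKBest(chi2, k=1000),   # retain the 1,000 most informative features
    MultinomialNB(),
)
clf.fit(train.data, train.target)
print("accuracy:", clf.score(test.data, test.target))
```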
The goal of this experimental study is to determine the effects of different parameters (flow rate, cuttings density, cuttings size, and hole inclination angle) on hole cleaning efficiency. Fresh water was used as the drilling fluid in this experiment. The experiments were conducted using a flow loop approximately 14 m (46 ft) long with a transparent glass test section 3 m (9.84 ft) long and 4 inches (101.6 mm) in ID; the inner metal drill pipe, 2 inches (50.8 mm) in OD, was set at a positive eccentricity of 0.5. The results obtained from this study show that hole cleaning efficiency improves at a high flow rate (21 m3/hr) and increases as the hole inclination angle increases from 60 to 90 degrees due to dominated
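For context on the geometry quoted above, the mean annular velocity implied by the reported flow rate can be computed from the hole and pipe diameters. This back-of-the-envelope calculation is ours, not a result reported in the abstract.

```python
# Back-of-the-envelope annular velocity for the quoted flow loop geometry
# (4 in hole ID, 2 in pipe OD, 21 m3/hr). Our calculation, for context only.
import math

q = 21 / 3600                     # flow rate, m3/s
d_hole, d_pipe = 0.1016, 0.0508   # diameters, m
area = math.pi / 4 * (d_hole**2 - d_pipe**2)   # annular cross-section, m2
print(f"mean annular velocity = {q / area:.2f} m/s")  # ~0.96 m/s
```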
This paper aims to study the star formation rate (SFR) in luminous infrared galaxies at different wavelengths using distance measurement techniques (dl, dm) and to determine which methods are the most accurate for estimating the star formation rate. Through this research we present the results of a statistical analysis (descriptive statistics) for a sample of luminous infrared galaxies. The data used in this research were collected from the NASA Extragalactic Database (NED) and HYPERLEDA, then used to calculate the star formation rate and to assess the accuracy of the distance methods used (dl, dm). The two methods were tested on Hα, OII, FIR, the radio continuum at 1.4 GHz, FUV, NUV, and total (FUV + FIR) indicators. The results showed that the dl
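As one standard route from observables to an SFR, a distance modulus (dm) can be converted to a luminosity distance, an Hα flux to a luminosity, and the Kennicutt (1998) calibration applied. The sketch below uses invented input values; the paper's exact pipeline and calibrations may differ.

```python
# Standard SFR route: distance modulus -> luminosity distance -> H-alpha
# luminosity -> Kennicutt (1998) calibration. Input values are invented.
import math

MPC_IN_CM = 3.086e24

def dl_from_dm(dm):
    """Luminosity distance in Mpc from a distance modulus."""
    return 10 ** ((dm - 25) / 5)

def sfr_halpha(flux_cgs, dm):
    """SFR (Msun/yr) from H-alpha flux (erg s^-1 cm^-2), Kennicutt (1998)."""
    d_cm = dl_from_dm(dm) * MPC_IN_CM
    luminosity = 4 * math.pi * d_cm**2 * flux_cgs   # erg/s
    return 7.9e-42 * luminosity

# Hypothetical galaxy: dm = 35 mag (~100 Mpc), H-alpha flux 1e-13 erg/s/cm2.
print(f"SFR ~ {sfr_halpha(1e-13, 35.0):.2f} Msun/yr")
```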
It is an established fact that substantial amounts of oil usually remain in a reservoir after primary and secondary recovery processes; there is therefore an ongoing effort to sweep that remaining oil. Field optimization includes many techniques, and horizontal wells are one of the most motivating factors for field optimization. The selection of new horizontal wells must be accompanied by the right selection of well locations. However, locating horizontal wells by trial and error is time consuming. Therefore, an Artificial Neural Network (ANN) method has been employed to help predict the optimum performance of proposed new well locations by incorporating
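A hedged sketch of the ANN idea follows: learn a mapping from candidate well-location parameters to a performance measure, then score new locations without a full simulation per trial. The features, targets, and network sizes below are invented; the paper's actual inputs and architecture are not specified in the abstract.

```python
# Sketch of ANN-based well-placement screening: fit a regressor from location
# parameters to production, then rank candidate locations. All data invented;
# in practice training targets would come from reservoir-simulation runs.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Hypothetical training set: (x, y, lateral length) -> cumulative oil produced.
X = rng.uniform([0, 0, 200], [5000, 5000, 1500], size=(300, 3))
y = np.sin(X[:, 0] / 900) + np.cos(X[:, 1] / 1100) + X[:, 2] / 2000 + rng.normal(0, 0.05, 300)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Score a grid of candidate locations and pick the most promising one.
candidates = rng.uniform([0, 0, 200], [5000, 5000, 1500], size=(1000, 3))
best = candidates[np.argmax(model.predict(candidates))]
print("best candidate (x, y, length):", best.round(1))
```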
This paper applies queuing theory together with the particle swarm algorithm (also known as swarm intelligence) to solve the queuing problem at the General Commission for Taxes / Karkh branch center, in the service stage of the calculators department, which comprises six employees. A single-service-channel queuing model, M/M/1, was chosen according to the nature of the department's work mentioned above, with the queue divided according to the letters system for each employee, and the collected data comprised times (arrival time, service time, departure time)
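For reference, the standard steady-state measures of the M/M/1 model named above follow directly from the arrival rate λ and service rate μ. These are textbook results; the rates in the example are invented, not taken from the paper's data.

```python
# Textbook steady-state formulas for the M/M/1 queue. Example rates invented.
def mm1_measures(lam, mu):
    """Return utilization, mean counts, and mean waits for an M/M/1 queue."""
    assert lam < mu, "system is unstable unless lambda < mu"
    rho = lam / mu                 # server utilization
    L = rho / (1 - rho)            # mean number in system
    W = 1 / (mu - lam)             # mean time in system
    Wq = rho / (mu - lam)          # mean waiting time in queue
    Lq = lam * Wq                  # mean queue length (Little's law)
    return rho, L, Lq, W, Wq

# Example: 10 taxpayers/hour arriving, one employee serving 12/hour.
rho, L, Lq, W, Wq = mm1_measures(10, 12)
print(f"rho={rho:.2f}, L={L:.2f}, Lq={Lq:.2f}, W={W:.2f} h, Wq={Wq:.2f} h")
```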
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
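The abstract does not give the authors' model or priors. As a building block, a minimal Gibbs sampler for Bayesian linear regression, alternating draws of the coefficients β and the noise variance σ², is sketched below; Gibbs-based variable selection methods typically add per-coefficient inclusion indicators to this same loop. Priors and data here are ours.

```python
# Minimal Gibbs sampler for Bayesian linear regression: alternate draws from
# beta | sigma2, y  ~  N(V X'y / sigma2, V),  V = (X'X/sigma2 + I/tau2)^-1
# sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + ||y - X beta||^2 / 2)
# Variable selection variants add inclusion indicators; model here is ours.
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 4
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0])   # two truly inactive predictors
y = X @ beta_true + rng.normal(scale=0.5, size=n)

tau2, a0, b0 = 10.0, 2.0, 1.0   # beta ~ N(0, tau2*I), sigma2 ~ InvGamma(a0, b0)
sigma2, draws = 1.0, []
for _ in range(2000):
    V = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
    beta = rng.multivariate_normal(V @ X.T @ y / sigma2, V)
    resid = y - X @ beta
    sigma2 = 1 / rng.gamma(a0 + n / 2, 1 / (b0 + resid @ resid / 2))
    draws.append(beta)

print("posterior mean of beta:", np.mean(draws[500:], axis=0).round(2))
```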
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. For a higher level of secure communication, the key plays an important role: both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make this algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to
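The two combined algorithms are not named in the truncated abstract. As one common illustration of strengthening a weak 3DES key, a standard-library PBKDF2 derivation is sketched below; this derivation is our choice, not necessarily the paper's method.

```python
# One common way to strengthen a weak key before using it with Triple DES:
# stretch a low-entropy secret into a 24-byte (3 x 8) 3DES key via PBKDF2
# (Python standard library; DES parity bits ignored for brevity). This is
# our illustration, not the combination the paper actually proposes.
import hashlib, os

def derive_3des_key(secret: bytes, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Derive a 24-byte key for 3-key Triple DES from a shared secret."""
    salt = salt or os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", secret, salt, iterations=200_000, dklen=24)
    return key, salt

key, salt = derive_3des_key(b"weak shared secret")
print(len(key), key.hex())  # 24 bytes, suitable as a 3-key Triple DES key
```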
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all the data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies; the algorithm can thus detect abnormal points that lie beyond a certain set threshold (outliers). However, not all anomalies are of this kind, abnormal, unusual, or far from a specific group: there is a type of data that does not occur repeatedly yet is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the
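The baseline DBSCAN behavior described above, where unclustered points (label -1 in scikit-learn) are flagged as noise/anomalies, can be sketched as follows. This is plain DBSCAN, not the CFG-strengthened variant, and the data is invented.

```python
# Baseline DBSCAN anomaly detection: points labeled -1 belong to no cluster
# and are treated as noise/anomalies. Plain DBSCAN only; data is invented.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
cluster_a = rng.normal([0, 0], 0.3, size=(100, 2))
cluster_b = rng.normal([5, 5], 0.3, size=(100, 2))
outliers = rng.uniform(-2, 7, size=(5, 2))   # sparse, isolated points
X = np.vstack([cluster_a, cluster_b, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
print("anomalies found:", X[labels == -1])
```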