Among metaheuristic algorithms, population-based algorithms are explorative search methods that are superior to local search in exploring the search space for globally optimal solutions. However, their primary downside is low exploitative capability, which prevents refining the neighborhood of the search space for better solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the ...
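For orientation, a minimal sketch of the standard FA movement step (a synchronous variant with illustrative parameter defaults, not the modified FA discussed in the abstract; `fitness` is assumed to be minimized):

```python
import numpy as np

def firefly_step(pop, fitness, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One synchronous FA iteration over an (n, d) float population.

    Brighter fireflies are those with lower fitness (minimization);
    alpha, beta0 and gamma are illustrative defaults, not tuned values.
    """
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    light = np.array([fitness(x) for x in pop])    # lower value = brighter
    new_pop = pop.copy()
    for i in range(n):
        for j in range(n):
            if light[j] < light[i]:                # j is brighter: i moves toward j
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                new_pop[i] += (beta * (pop[j] - pop[i])
                               + alpha * (rng.random(d) - 0.5))  # random exploration term
    return new_pop
```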
This paper proposes an improved approach to EEG-based brain-language signal classification using machine learning and optimization algorithms, aiming to improve brain-signal classification for language-processing tasks in both accuracy and processing speed. Feature extraction is performed with a modified Discrete Wavelet Transform (DWT), which decomposes the EEG signals into significant frequency components and thereby captures signal characteristics more effectively. A Gray Wolf Optimization (GWO) algorithm is then applied to select the optimal features, achieving more accurate results by retaining impactful features with maximum relevance ...
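A minimal sketch of DWT-based feature extraction for a single EEG channel, assuming the PyWavelets library; the wavelet family (`db4`), decomposition level, and per-band statistics are illustrative choices, not the paper's modified DWT:

```python
import numpy as np
import pywt  # PyWavelets

def dwt_features(eeg_signal, wavelet="db4", level=4):
    """Decompose a 1-D EEG channel into sub-bands and summarize each
    with simple statistics (mean, spread, energy)."""
    coeffs = pywt.wavedec(eeg_signal, wavelet, level=level)  # [cA4, cD4, cD3, cD2, cD1]
    feats = []
    for band in coeffs:
        feats += [band.mean(), band.std(), np.sum(band ** 2)]
    return np.array(feats)
```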
Color image compression is an effective way to encode digital images by decreasing the number of bits needed to represent the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current work, a simple, effective methodology is proposed for compressing color art digital images at a low bit rate: the matrix resulting from scalar quantization (reducing each pixel from 24 to 8 bits) is compressed with displacement coding, and the remainder is then compressed with the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and ...
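A hedged sketch of one common way to scalar-quantize 24-bit RGB down to 8 bits per pixel (a 3-3-2 bit allocation; the paper's exact quantizer may differ). The resulting byte plane would then feed the displacement-coding and LZW stages:

```python
import numpy as np

def quantize_24_to_8(rgb):
    """Map each 24-bit RGB pixel to one 8-bit code (3-3-2 allocation).
    rgb: (H, W, 3) uint8 array; returns (H, W) uint8 codes."""
    r = rgb[..., 0] >> 5            # keep 3 most significant bits of red
    g = rgb[..., 1] >> 5            # 3 bits of green
    b = rgb[..., 2] >> 6            # 2 bits of blue
    return (r << 5) | (g << 2) | b  # pack into a single byte

def dequantize_8_to_24(codes):
    """Approximate inverse: expand each 8-bit code back to 24-bit RGB."""
    r = (codes >> 5) & 0b111
    g = (codes >> 2) & 0b111
    b = codes & 0b11
    return np.stack([r << 5, g << 5, b << 6], axis=-1).astype(np.uint8)
```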
Scheduling course timetables for large university departments is a very hard problem that has often been tackled in previous work, though with only partially optimal results. This work applies an evolutionary algorithm, using genetic operators, to the timetabling problem, producing a randomized yet fully optimized timetable and generating multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the constraint space, yielding an optimal and flexible schedule with no redundancy through changes to a feasible course timetable. The main contribution of this work is increased flexibility in generating optimal ...
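As a sketch of the general GA loop such an approach implies (a toy encoding with uniform crossover and random-reset mutation; all names and rates are illustrative, not the paper's design):

```python
import random

def genetic_timetable(courses, slots, conflicts, pop_size=50, generations=200):
    """Toy GA for timetabling: a chromosome maps each course (list of
    names) to a time slot; `conflicts` is a set of course pairs that
    must not share a slot. Fitness counts violated clash constraints."""
    def fitness(chrom):                      # fewer clashes = better
        return -sum(1 for a, b in conflicts if chrom[a] == chrom[b])

    pop = [{c: random.choice(slots) for c in courses} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 0:             # feasible timetable found
            break
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = {c: random.choice((p1[c], p2[c])) for c in courses}  # uniform crossover
            if random.random() < 0.1:                                    # random-reset mutation
                child[random.choice(courses)] = random.choice(slots)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```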
The research material was prune plums (...
Waste recycling is one of the modern means of treating waste and minimizing the harmful effects it has caused for all countries of the world, by disposing of waste in a safe and healthy manner while also achieving economic and social benefits. The United Nations, through the Sustainable Development Goals (2015-2030), seeks to solve the environmental problems facing the various peoples of the world through projects and programs that include waste recycling. This raises the question of whether there is a relationship between waste recycling and the goals of sustainable development, which the research seeks to answer through five categories that determine the type of relationship between waste recycling and the goals ...
Classical principal component analysis is sensitive to outliers because the components are computed from the eigenvalues and eigenvectors of a non-robust correlation or covariance matrix, which yields incorrect results when the data contain outlying values. To treat this problem, we resort to robust methods, several of which are discussed here.
The robust estimators include direct robust estimation of the eigenvalues by using the eigenvectors, without relying on robust estimators of the variance and covariance matrices. Also, the analysis of the principal ...
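One concrete instance of this idea, sketched under the assumption that a robust scatter estimate (here scikit-learn's Minimum Covariance Determinant) replaces the classical covariance before eigen-decomposition; the direct eigenvalue estimators mentioned above differ from this:

```python
import numpy as np
from sklearn.covariance import MinCovDet

def robust_principal_components(X, n_components=2):
    """Robust PCA sketch: eigen-decompose a robust (MCD) covariance
    estimate so outliers do not distort eigenvalues/eigenvectors.
    X: (n_samples, n_features) array."""
    mcd = MinCovDet().fit(X)                      # robust location and scatter
    eigvals, eigvecs = np.linalg.eigh(mcd.covariance_)
    order = np.argsort(eigvals)[::-1]             # largest variance first
    components = eigvecs[:, order[:n_components]]
    scores = (X - mcd.location_) @ components     # robust PC scores
    return eigvals[order[:n_components]], components, scores
```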
The adsorption of chromium (VI) on Fe3O4 is investigated in batch-scale experiments following a statistical Box-Behnken design built in the software package Minitab 17. Experiments were carried out per the Box-Behnken design with four input parameters: pH (2-8), initial concentration (50-150 mg/L), adsorbent dosage (0.05-0.3 g), and adsorption time (10-60 min). The best conditions were pH 2, contact time 60 min, chromium concentration 50 mg/L, and magnetite dosage 0.3 g, giving a maximum chromium (VI) removal of 98.95% with an error of 1.08%. Three isotherm models (Freundlich, Langmuir, and Temkin) were fitted to the experimental data; the Langmuir isotherm gave the better ...
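For illustration, fitting the Langmuir isotherm qe = qmax*KL*Ce / (1 + KL*Ce) to equilibrium data with SciPy; the numbers below are placeholders, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

# Illustrative data only, not the study's measurements.
Ce = np.array([5.0, 12.0, 25.0, 48.0, 90.0])   # equilibrium concentration (mg/L)
qe = np.array([8.1, 14.9, 21.3, 26.0, 28.7])   # uptake at equilibrium (mg/g)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.05])
print(f"qmax = {qmax:.2f} mg/g, KL = {KL:.4f} L/mg")
```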
This paper aims to solve the problem of choosing the most suitable project among several service projects for the Iraqi Martyrs Foundation, or ranking them by preference against the targeted criteria. This is done with a Multi-Criteria Decision-Making (MCDM) method, Multi-Objective Optimization by Ratio Analysis (MOORA), which measures the composite performance score each alternative obtains and the maximum benefit accruing to the beneficiary, according to criteria and weights calculated with the Analytic Hierarchy Process (AHP). The most important finding, relying on expert opinion, is that the second project is the best alternative, with the remaining projects ranked according ...
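A minimal sketch of the MOORA ratio system described above (vector normalization, weighting, then benefit-minus-cost scoring); the matrix, weights, and benefit/cost split are hypothetical, with weights assumed to come from AHP:

```python
import numpy as np

def moora(decision_matrix, weights, benefit_mask):
    """MOORA ratio system: normalize each criterion column by its
    Euclidean norm, weight it, then score = sum(benefit) - sum(cost)."""
    X = np.asarray(decision_matrix, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)           # vector normalization per criterion
    weighted = norm * weights
    scores = np.where(benefit_mask, weighted, -weighted).sum(axis=1)
    return scores  # higher composite score = preferred alternative

# Hypothetical example: 3 projects, 3 criteria (first two benefit, last cost).
scores = moora([[7, 9, 4], [8, 8, 3], [6, 7, 5]],
               weights=[0.5, 0.3, 0.2],
               benefit_mask=[True, True, False])
print(scores.argsort()[::-1])  # ranking of alternatives, best first
```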