Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting while improving classification accuracy. It has demonstrated its efficacy in many domains, including text classification (TC), text mining, and image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was conducted by systematically exploring the available studies of different metaheuristic algorithms used for FS to improve TC. This paper contributes to the body of existing knowledge by answering four research questions (RQs): 1) What are the different FS approaches that apply metaheuristic algorithms to improve TC? 2) Does applying metaheuristic algorithms for TC lead to better accuracy than typical FS methods? 3) How effective are modified, hybridized metaheuristic algorithms for text FS problems? 4) What are the gaps in current studies and their future directions? These RQs led to a study of recent works on metaheuristic-based FS methods, their contributions, and their limitations. A final list of thirty-seven (37) related articles was extracted and investigated against our RQs to generate new knowledge in the domain of study. Most of the reviewed papers addressed TC with metaheuristic algorithms based on wrapper and hybrid FS approaches. Future research should focus on hybrid FS approaches, as they handle complex optimization problems well and could provide new research opportunities in this rapidly developing field.
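To make the wrapper idea concrete, the following is a minimal sketch of one possible metaheuristic wrapper for text FS: a binary genetic algorithm that scores candidate feature subsets with a Naive Bayes classifier. The dataset, GA parameters, and subset-size penalty are illustrative assumptions, not the method of any particular surveyed paper.

```python
# Illustrative wrapper FS for text classification: a binary genetic algorithm
# (one of many possible metaheuristics) searches feature subsets, and a Naive
# Bayes classifier scores each subset. Dataset and GA settings are assumptions.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB

rng = np.random.default_rng(0)

docs = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
X = TfidfVectorizer(max_features=500).fit_transform(docs.data)
y = docs.target
n_features, pop_size, generations = X.shape[1], 20, 15

def fitness(mask):
    """Wrapper fitness: cross-validated accuracy on the selected subset,
    lightly penalised by subset size."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(MultinomialNB(), X[:, np.flatnonzero(mask)], y, cv=3).mean()
    return acc - 0.01 * mask.mean()

pop = rng.random((pop_size, n_features)) < 0.5          # random initial subsets
for _ in range(generations):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]   # keep the best half
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_features)
        child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
        flip = rng.random(n_features) < 0.02              # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents] + children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(f"selected {best.sum()} of {n_features} features")
```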
The researcher focused on the importance of physical abilities in the game of tennis, a game characterized by the specificity of its performance, its continuous movement, and the need to deal with varying elements. It therefore requires the development of muscular strength, which plays an important role in skill performance in tennis. There are several methods for developing strength, including the flat hierarchical technique, which is one of the most common forms of training for developing muscular strength. As for the research problem, the researcher found a method that has an effect on the development of strength; therefore, the researcher tried to diversify a …
Authentic materials are among the most important tools a teacher can use in class to make teaching go smoothly and effectively in transmitting the necessary knowledge to all students. This research experimentally investigated the effect of using authentic materials in teaching English as a foreign language, because a number of studies point out that the use of authentic materials is a useful means of motivating learners, arousing their interest, and exposing them to the real language they will face in real-life situations.
It is hypothesized that there is no statistically significant difference between the experimental group, who were taught English as a foreign language using authentic materials, and those …
The study aims to discuss the relationship between imported inflation and the international trade of the Iraqi economy for the period 1990-2015 using annual data. To achieve this aim, statistical and econometric methods are applied through the NARDL model, a model designed to measure non-linear relationships (as most economic relationships are non-linear) and to separate the positive and negative effects of imported inflation. A deductive approach was adopted, using the descriptive method to describe and delimit the phenomenon, alongside an inductive approach using statistical and econometric tools to obtain the econometric model that explains the …
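As an illustration of the asymmetry idea behind NARDL, the sketch below decomposes changes in an inflation series into positive and negative partial sums and regresses a trade series on both components. The series are synthetic stand-ins, and the specification omits the lag structure and cointegration testing of a full NARDL estimation.

```python
# Sketch of the asymmetric (NARDL-style) decomposition: split changes in
# imported inflation into positive and negative partial sums, then regress
# trade on both components with OLS. Variable names and data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 26                                           # e.g. annual data, 1990-2015
infl = pd.Series(rng.normal(0, 2, n)).cumsum()   # stand-in for imported inflation
trade = 0.6 * infl + rng.normal(0, 1, n)         # stand-in for the trade series

d = infl.diff().fillna(0.0)
pos = d.clip(lower=0).cumsum()                   # cumulative positive changes
neg = d.clip(upper=0).cumsum()                   # cumulative negative changes

X = sm.add_constant(pd.DataFrame({"infl_pos": pos, "infl_neg": neg}))
model = sm.OLS(trade, X).fit()
print(model.params)   # differing coefficients indicate asymmetric (non-linear) effects
```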
The objective of this research is to employ special cases of the trapezoidal membership function in the composition of fuzzy sets for decision making within the framework of classical game theory, in order to determine the best strategy for the mobile phone networks in the provinces of Baghdad and Basra. Different intervals of the membership functions were adopted to observe the changes occurring in the game's payoff matrix and their impact on the strategies and decisions available to each player, as well as the impact on society …
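To illustrate the mechanics, the sketch below defuzzifies a hypothetical trapezoidal fuzzy payoff matrix via centroids and then solves the resulting zero-sum game by linear programming. The payoff values and the two-strategy setup are assumptions for illustration, not the study's data.

```python
# Sketch: trapezoidal fuzzy payoffs defuzzified to a crisp matrix, then the
# zero-sum game solved by linear programming. Payoff values are hypothetical.
import numpy as np
from scipy.optimize import linprog

def trapezoid_centroid(a, b, c, d):
    """Centroid (defuzzified value) of a trapezoidal fuzzy number a <= b <= c <= d."""
    denom = (c + d) - (a + b)
    if denom == 0:                    # degenerate case: a == b == c == d
        return float(a)
    return ((c**2 + c*d + d**2) - (a**2 + a*b + b**2)) / (3.0 * denom)

# Fuzzy payoff matrix for the row player (two strategies each), as trapezoids.
fuzzy = [[(2, 3, 4, 5), (0, 1, 2, 3)],
         [(1, 2, 2, 3), (3, 4, 5, 6)]]
A = np.array([[trapezoid_centroid(*cell) for cell in row] for row in fuzzy])

# Row player: maximise the game value v over mixed strategies x.
m, n = A.shape
c = np.zeros(m + 1)
c[-1] = -1.0                                      # minimise -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])         # v - x^T A[:, j] <= 0 for each column j
b_ub = np.zeros(n)
A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)  # probabilities sum to 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * m + [(None, None)], method="highs")
print("mixed strategy:", res.x[:m], "game value:", res.x[-1])
```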
A phytoremediation experiment was carried out with kerosene as a model for total petroleum hydrocarbons. A constructed wetland of barley was exposed to kerosene pollutants at varying concentrations (1, 2, and 3% v/v) in a subsurface flow (SSF) system. After 42 days of exposure, the average kerosene removal ranged from 56.5% to 61.2%, with the highest removal obtained at a kerosene concentration of 1% v/v. Analysis of kerosene at varying initial concentrations showed that its kinetics were fitted more closely by the Grau model than by the zero-order, first-order, or second-order kinetic models. The experimental study showed that the barley plant designed in a subsurface …
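The following sketch shows how such a kinetic comparison can be set up: zero-, first-, and second-order models fitted by nonlinear least squares, and the Grau model in its commonly cited linearized second-order form. The concentration series is a synthetic placeholder, not the study's measurements.

```python
# Sketch: fit zero-, first-, and second-order kinetics plus the linearised
# Grau second-order form to concentration-time data and compare R^2.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

t = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)   # days
S = np.array([10.0, 8.1, 6.9, 5.9, 5.1, 4.6, 4.2])       # kerosene conc. (illustrative)
S0 = S[0]

models = {
    "zero order":   lambda t, k: S0 - k * t,
    "first order":  lambda t, k: S0 * np.exp(-k * t),
    "second order": lambda t, k: S0 / (1 + k * S0 * t),
}

def r2(obs, pred):
    return 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

for name, f in models.items():
    k, _ = curve_fit(f, t, S, p0=[0.01])
    print(f"{name:13s} k = {k[0]:.4f}, R^2 = {r2(S, f(t, *k)):.3f}")

# Grau second-order model, linearised: S0*t / (S0 - S) = a + b*t
mask = t > 0
y = S0 * t[mask] / (S0 - S[mask])
fit = linregress(t[mask], y)
print(f"Grau          a = {fit.intercept:.3f}, b = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.3f}")
```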
In this golden age of rapid development, surgeons have realized that AI can contribute to healthcare in all aspects, especially surgery. The study aims to incorporate Convolutional Neural Networks and Constrained Local Models (CNN-CLM), which can improve the assessment of laparoscopic cholecystectomy (LC) surgery; this cutting-edge technology not only brings opportunities for surgery but also brings challenges on the way forward. The problem with the current surgical practice is the lack of safety and the specific complications and problems associated with safety in each laparoscopic cholecystectomy procedure. When CLM is utilized within CNN models, it is effective at predicting time-series tasks such as iden…
A simple, fast, inexpensive, and sensitive method has been proposed to screen and optimize the experimental factors affecting the determination of phenylephrine hydrochloride (PHE.HCl) in pure form and in pharmaceutical formulations. The method is based on the development of a brown-colored charge transfer (CT) complex with p-bromanil (p-Br) in an alkaline medium (pH 9) within 1.07 min after heating at 80 °C. Design of Experiments (DOE) employing a Central Composite Face-Centered (CCF) design and Response Surface Methodology (RSM) was applied as an improvement over the traditional One Variable at a Time (OVAT) approach to evaluate the effects of variations in the selected factors (volume of 5×10⁻³ M p-Br, heating time, and temperature) on …
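To illustrate how a CCF design and an RSM fit go together, the sketch below builds the coded face-centered design for three factors and fits a quadratic response surface. The number of center points and the simulated response values are assumptions, not the paper's measured data.

```python
# Sketch: build a face-centred central composite (CCF) design for three coded
# factors (p-Br volume, heating time, temperature) and fit a quadratic
# response-surface model. The response values are simulated placeholders.
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded levels: factorial corners, face-centred axial points, centre points.
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([v * row for v in (-1.0, 1.0) for row in np.eye(3)])
center = np.zeros((3, 3))
design = np.vstack([factorial, axial, center])        # 8 + 6 + 3 = 17 runs

# Placeholder response (e.g. absorbance); replace with measured values.
rng = np.random.default_rng(2)
y = (1.0 + 0.3 * design[:, 0] + 0.2 * design[:, 1] - 0.15 * design[:, 2] ** 2
     + rng.normal(0, 0.02, len(design)))

quad = PolynomialFeatures(degree=2, include_bias=False)
X = quad.fit_transform(design)
rsm = LinearRegression().fit(X, y)
print(dict(zip(quad.get_feature_names_out(["vol", "time", "temp"]), rsm.coef_.round(3))))
```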
In recent years, researchers have shown increasing interest in determining the optimum sample size that yields sufficient accuracy of estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated, for the sample size given by each method, using artificial intelligence in the form of an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data …
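As a worked example of turning a Bennett-type bound into a sample-size requirement, the sketch below uses the standard one-sided form of the inequality for bounded observations. The tolerance, confidence level, and variance are illustrative values, not those of the study.

```python
# Sketch: sample size n from a one-sided Bennett bound. For i.i.d. observations
# bounded by b with variance sigma^2, P(mean deviation >= eps) <= delta once
#   n >= b^2 * ln(1/delta) / (sigma^2 * h(b*eps / sigma^2)),  h(u) = (1+u)ln(1+u) - u.
# The eps/delta/variance values below are illustrative assumptions.
import math

def bennett_sample_size(eps, delta, sigma2, b):
    h = lambda u: (1 + u) * math.log(1 + u) - u
    return math.ceil(b**2 * math.log(1 / delta) / (sigma2 * h(b * eps / sigma2)))

# Example: observations bounded by b = 1, variance 0.1, tolerance 0.05, 95% confidence.
print(bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.1, b=1.0))   # about 277 samples
```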
A study on the treatment and reuse of oily wastewater generated from the fuel oil treatment process of a gas turbine power plant was performed. The feasibility of using a hollow-fiber ultrafiltration (UF) membrane and a polyamide thin-film composite reverse osmosis (RO) membrane in a pilot plant was investigated. Three variables were employed in the UF process: pressure (0.5, 1, 1.5, and 2 bar), oil content (10, 20, 30, and 40 ppm), and temperature (15, 20, 30, and 40 °C), while TDS was kept constant at 150 ppm. Four variables: pressure (5, 6, 7, and 8 bar), oil content (2.5, 5, 7.5, and 10 ppm), total dissolved solids (TDS) (100, 200, 300, and 400 ppm), and temperature (15, 20, 30, and 40 °C), were manipulated …