Feature selection (FS) comprises the processes used to decide which relevant features/attributes to include and which irrelevant features to exclude in predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy across many domains, including text classification (TC), text mining, and image recognition. While many traditional FS methods exist, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews on this topic for TC. Therefore, a comprehensive, systematic overview was conducted of the available studies of different metaheuristic algorithms used for FS to improve TC. This paper contributes to the existing body of knowledge by answering four research questions (RQs): 1) What are the different FS approaches that apply metaheuristic algorithms to improve TC? 2) Does applying metaheuristic algorithms for TC lead to better accuracy than typical FS methods? 3) How effective are modified and hybridized metaheuristic algorithms for text FS problems? 4) What are the gaps in the current studies and their future directions? These RQs led to a study of recent works on metaheuristic-based FS methods, their contributions, and their limitations. A final list of thirty-seven (37) related articles was extracted and investigated in line with our RQs to generate new knowledge in the domain of study. Most of the reviewed papers addressed TC with metaheuristic algorithms based on the wrapper and hybrid FS approaches. Future research should focus on hybrid-based FS approaches, as they handle complex optimization problems well and can potentially open new research opportunities in this rapidly developing field.
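For readers unfamiliar with the wrapper approach mentioned above: a wrapper FS method scores candidate feature subsets by actually evaluating a classifier on them. The sketch below uses a simple greedy forward search with a toy nearest-centroid scorer; in the surveyed papers, a metaheuristic search would replace the greedy loop. All function names and data here are illustrative, not taken from any reviewed study.

```python
def forward_select(X, y, score):
    """Greedy forward wrapper: repeatedly add the single feature that
    most improves the classifier score; stop when nothing improves."""
    n_feats = len(X[0])
    chosen, best_score = [], float("-inf")
    while len(chosen) < n_feats:
        best_f = None
        for f in range(n_feats):
            if f in chosen:
                continue
            s = score(X, y, chosen + [f])
            if s > best_score:
                best_score, best_f = s, f
        if best_f is None:                  # no remaining feature helps
            break
        chosen.append(best_f)
    return chosen, best_score

def centroid_accuracy(X, y, feats):
    """Toy wrapper objective: training-set accuracy of a
    nearest-centroid classifier restricted to the features in `feats`."""
    classes = sorted(set(y))
    cent = {c: [sum(X[i][f] for i in range(len(X)) if y[i] == c) /
                sum(1 for lab in y if lab == c) for f in feats]
            for c in classes}
    correct = sum(
        min(classes, key=lambda c: sum((row[feats[k]] - cent[c][k]) ** 2
                                       for k in range(len(feats)))) == y[i]
        for i, row in enumerate(X))
    return correct / len(X)
```

In practice the scorer would be a real classifier with held-out evaluation; the point is only that the subset search and the learner are coupled, which is what distinguishes wrapper methods from filter methods.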
The conjunctive "and" and its Arabic counterpart "و" are discourse markers that express certain meanings and presuppose the presence of other elements in discourse. They are indispensable aids to both text writers and readers. The present study aims to show that such cohesive ties help the writer organize his main argument and communicate his ideas vividly and smoothly. They also serve as explicit signals that help readers unfold the text and follow its threads as realized in the progression of context. The researcher has utilized the Quirk Model of Semantic Implication for data analysis. A total of 42 political texts (22 English and 20 Arabic) were selected from different elite newspapers in both languages for the analysis.
This paper proposes an improved solution for EEG-based brain language signal classification using machine learning and optimization algorithms. The project aims to improve brain-signal classification for language-processing tasks by achieving higher accuracy and faster processing. Feature extraction is performed using a modified Discrete Wavelet Transform (DWT), which increases the capability to capture signal characteristics appropriately by decomposing EEG signals into significant frequency components. A Gray Wolf Optimization (GWO) algorithm is then applied to improve the results and select the optimal features, achieving more accurate results by selecting impactful features with maximum relevance.
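The paper's modified DWT + GWO pipeline is not reproduced here, but the core GWO update rule the abstract relies on can be sketched in a few lines: wolves move toward the three best solutions found so far (alpha, beta, delta), with an exploration factor that decays over time. Everything below (function names, parameters, the toy sphere objective) is illustrative, not the authors' implementation; feature-selection variants typically binarize the positions.

```python
import random

def gwo_minimize(fitness, dim, n_wolves=20, iters=300, lo=-10.0, hi=10.0, seed=0):
    """Minimal Grey Wolf Optimizer sketch: follower wolves move toward
    the average of the alpha, beta, and delta leaders."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=fitness)                  # best three become leaders
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 - 2.0 * t / iters                 # exploration factor decays to 0
        for i in range(3, n_wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = 2 * a * r1 - a
                    C = 2 * r2
                    x += leader[d] - A * abs(C * leader[d] - wolves[i][d])
                new.append(min(hi, max(lo, x / 3.0)))
            wolves[i] = new
    wolves.sort(key=fitness)
    return wolves[0]

# Toy objective: minimize the sphere function, optimum at the origin.
best = gwo_minimize(lambda w: sum(v * v for v in w), dim=5)
```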
Combating the COVID-19 epidemic has emerged as one of the greatest healthcare challenges the world has ever seen. COVID-19 cases must be diagnosed accurately and quickly so that patients receive proper medical treatment and the pandemic is limited. Chest radiography imaging approaches have proven more successful at detecting coronavirus than the reverse transcription-polymerase chain reaction (RT-PCR) approach. Transfer learning is well suited to categorizing patterns in medical images, since the number of available medical images is limited. This paper illustrates a convolutional neural network (CNN) and recurrent neural network (RNN) hybrid architecture for the diagnosis of COVID-19 from chest X-rays. The deep transfer models used include VGG19 and DenseNet121.
The multiple linear regression model is an important regression model that has attracted many researchers in different fields, including applied mathematics, business, medicine, and the social sciences. Linear regression models involving a large number of independent variables perform poorly due to large variation and can lead to inaccurate conclusions. One of the most important problems in regression analysis is multicollinearity, which is well known to many researchers, as are its effects on the multiple linear regression model. In addition to multicollinearity, the presence of outliers in the data is one of the difficulties in constructing the regression model.
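The multicollinearity problem the abstract describes is commonly diagnosed with variance inflation factors (VIF): VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor j on the remaining predictors; values well above 10 are a conventional warning sign. A minimal pure-Python sketch (the data in the usage below is synthetic and illustrative, not from the paper):

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[c][c] != 0:
                f = M[r][c] / M[c][c]
                M[r] = [M[r][k] - f * M[c][k] for k in range(n + 1)]
    return [M[i][n] / M[i][i] for i in range(n)]

def r_squared(X, y):
    """OLS R^2 of y on the columns of X (intercept included)."""
    n, Xi = len(y), [[1.0] + row for row in X]
    p = len(Xi[0])
    XtX = [[sum(Xi[r][i] * Xi[r][j] for r in range(n)) for j in range(p)]
           for i in range(p)]
    Xty = [sum(Xi[r][i] * y[r] for r in range(n)) for i in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * v for b, v in zip(beta, row)) for row in Xi]
    ybar = sum(y) / n
    ss_res = sum((y[r] - yhat[r]) ** 2 for r in range(n))
    ss_tot = sum((v - ybar) ** 2 for v in y)
    return 1 - ss_res / ss_tot

def vif(X):
    """Variance inflation factor for each predictor column of X."""
    out = []
    for j in range(len(X[0])):
        others = [[row[k] for k in range(len(row)) if k != j] for row in X]
        r2 = r_squared(others, [row[j] for row in X])
        out.append(1.0 / (1.0 - r2) if r2 < 1 else float("inf"))
    return out
```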
In this work, results of an optical technique (the laser speckle technique) for measuring surface roughness were obtained using statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object's surface can be evaluated within the calibrated range.
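The first calibration quantity mentioned, speckle intensity contrast, is simply the ratio of the standard deviation to the mean of the recorded intensity; a calibration curve then maps this contrast to roughness. A minimal sketch (the array values in the usage are illustrative, not measurement data):

```python
import math

def speckle_contrast(intensity):
    """Speckle contrast C = sigma_I / mean_I over a 2-D intensity array.
    Fully developed speckle from a rough surface drives C toward 1;
    smoother surfaces give lower contrast."""
    vals = [v for row in intensity for v in row]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return math.sqrt(var) / mean
```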
We used to think of grammar as the bones of the language and vocabulary as the flesh to be added, given that language consists largely of chunks of lexis generated in real-life use. This "skeleton image" has been proverbially used to refer to that central feature of lexis named collocation, an idea that, for the first 15 years of language study and analysis, was given barely a moment's thought in English classroom materials and methodology.
The work of John Sinclair, Dave Willis, Ron Carter, Michael McCarthy, Michael Lewis, and many others has contributed to the way teachers today approach the area of lexis and what it means in the teaching/learning process of the language. This also seems to have incorporated lexical ideas into teaching mechanisms.
The purpose of this paper is to solve the unbalanced transportation problem with stochastic demand using heuristic algorithms to obtain the optimum solution, minimizing the costs of transporting the gasoline product for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion reached is that the results prove the possibility of solving the random transportation problem, when demand is uncertain, with a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to address the problems of unbalanced transport, and that the approved model can be applied by the oil products distribution company in the Iraqi Ministry of Oil to minimize transportation costs.
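An unbalanced transportation problem of the kind described above is conventionally balanced with a zero-cost dummy source or destination before any solver runs. The sketch below pairs that balancing step with a simple least-cost allocation heuristic; this is not the paper's genetic algorithm, and the supply/demand/cost figures in the usage are invented for illustration.

```python
def balance(supply, demand, cost):
    """Balance an unbalanced transportation problem by adding a
    zero-cost dummy source (row) or destination (column)."""
    supply, demand = supply[:], demand[:]
    cost = [row[:] for row in cost]
    s, d = sum(supply), sum(demand)
    if s < d:                                   # dummy source absorbs the shortage
        supply.append(d - s)
        cost.append([0] * len(demand))
    elif d < s:                                 # dummy destination absorbs the surplus
        demand.append(s - d)
        for row in cost:
            row.append(0)
    return supply, demand, cost

def least_cost_allocation(supply, demand, cost):
    """Greedy least-cost heuristic: ship as much as possible along the
    cheapest remaining route until supply and demand are exhausted."""
    supply, demand, cost = balance(supply, demand, cost)
    alloc = [[0] * len(demand) for _ in supply]
    for c, i, j in sorted((cost[i][j], i, j)
                          for i in range(len(supply))
                          for j in range(len(demand))):
        q = min(supply[i], demand[j])
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
    total = sum(cost[i][j] * alloc[i][j]
                for i in range(len(alloc)) for j in range(len(alloc[0])))
    return alloc, total
```

A metaheuristic such as the paper's genetic algorithm would search over allocations of the balanced problem; the greedy rule here just produces a feasible (not necessarily optimal) starting solution.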
In this paper, a cognitive system based on a nonlinear neural controller and an intelligent algorithm is proposed to guide an autonomous mobile robot during continuous path tracking and to navigate around solid obstacles. The goal of the proposed structure is to plan and track the reference path equation for the autonomous mobile robot in a mining environment, avoiding obstacles and reaching the target position by using intelligent optimization algorithms. Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) algorithms are used to find solutions to the mobile robot navigation problems in the mine by searching for optimal paths and finding the reference path equation of the optimal path.
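A minimal PSO sketch of the kind the abstract applies, here tuning the coefficients of a toy quadratic reference-path equation around a single circular obstacle. The coordinates, bounds, penalty weights, and path parameterization are assumptions for illustration, not the paper's setup.

```python
import random

def pso_minimize(f, dim, n=30, iters=150, lo=-5.0, hi=5.0, seed=1):
    """Minimal global-best PSO: particles are pulled toward their own
    best position and the swarm's best position."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pval = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v < gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

# Toy reference-path task (assumed, not the paper's): choose (a, b) in
# y = a*x + b*x**2 so the path ends near GOAL while staying clear of a
# circular obstacle of radius 2 centred at OBST.
GOAL = (10.0, 5.0)
OBST = (5.0, 2.5)

def path_cost(coef):
    a, b = coef
    cost = 0.0
    for k in range(11):                        # sample the path at x = 0..10
        x = float(k)
        y = a * x + b * x * x
        dx, dy = x - OBST[0], y - OBST[1]
        if dx * dx + dy * dy < 4.0:            # inside the safety radius
            cost += 50.0
    y_end = a * GOAL[0] + b * GOAL[0] ** 2
    return cost + (y_end - GOAL[1]) ** 2       # end near the goal height

best, val = pso_minimize(path_cost, dim=2)
```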
This paper presents an approach for electrocardiography (ECG) signal processing based on linear and nonlinear adaptive filtering using the Recursive Least Squares (RLS) algorithm to remove two kinds of noise affecting the ECG signal: high-frequency noise (HFN) and low-frequency noise (LFN). Simulation is performed in Matlab. The ECG, HFN, and LFN signals used in this study were downloaded from ftp://ftp.ieee.org/uploads/press/rangayyan/, and the filtering was carried out using an adaptive finite impulse response (FIR) filter, which gave better results than infinite impulse response (IIR) filters.
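The RLS adaptive FIR filter at the heart of such an approach can be sketched in pure Python: given a noisy observation d[n] and a reference input x[n] correlated with the noise, RLS adapts the FIR weights so the filter output tracks the noise path, and the error e[n] approximates the clean signal. The signals and parameters below are synthetic stand-ins, not the IEEE dataset used by the paper.

```python
def rls_filter(x, d, order=4, lam=0.99, delta=100.0):
    """Recursive least squares adaptive FIR noise canceller.

    x : reference input correlated with the noise
    d : noisy observation (desired signal + noise)
    Returns the error sequence e[n] = d[n] - y[n], which approximates
    the clean signal once the filter has learned the noise path.
    """
    w = [0.0] * order                           # FIR weights
    P = [[(delta if i == j else 0.0) for j in range(order)]
         for i in range(order)]                 # inverse correlation matrix
    buf = [0.0] * order                         # regressor u[n] = [x[n], x[n-1], ...]
    err = []
    for n in range(len(d)):
        buf = [x[n]] + buf[:-1]
        Pu = [sum(P[i][j] * buf[j] for j in range(order)) for i in range(order)]
        denom = lam + sum(buf[i] * Pu[i] for i in range(order))
        k = [v / denom for v in Pu]             # gain vector
        e = d[n] - sum(w[i] * buf[i] for i in range(order))
        w = [w[i] + k[i] * e for i in range(order)]
        P = [[(P[i][j] - k[i] * Pu[j]) / lam for j in range(order)]
             for i in range(order)]             # rank-1 update of P
        err.append(e)
    return err
```

The forgetting factor `lam` trades tracking speed against steady-state accuracy; values close to 1 average over a longer effective window.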
Extensive studies were made of the distribution of a steady, non-uniform electric field within intense-field regions containing metal rings: the field rises along a positive gradient to a maximum value at the midpoint and then declines symmetrically toward the other end, with the concentrated field distributed sequentially. Earlier empirical studies focused on the molecules that pass through these regions during transit.