A substantial portion of today’s multimedia data exists in the form of unstructured text. However, the unstructured nature of text poses a significant challenge in meeting users’ information needs. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing. However, accurately categorizing texts becomes difficult due to the increasing presence of non-informative features within the corpus. Several reviews on TC, encompassing various feature selection (FS) approaches to eliminate non-informative features, have been published previously. However, these reviews do not adequately cover the recently explored approaches to TC problem-solving utilizing FS, such as optimization techniques. This study comprehensively analyzes different FS approaches based on optimization algorithms for TC. We begin by introducing the primary phases involved in implementing TC. Subsequently, we explore a wide range of FS approaches for categorizing text documents and organize the existing works into four fundamental approaches: filter, wrapper, hybrid, and embedded. Furthermore, we review four families of optimization algorithms utilized in solving text FS problems: swarm intelligence-based, evolutionary-based, physics-based, and human behavior-related algorithms. We discuss the advantages and disadvantages of state-of-the-art studies that employ optimization algorithms for text FS. Additionally, we consider several aspects of each proposed method and thoroughly discuss the challenges associated with datasets, FS approaches, optimization algorithms, machine learning classifiers, and the evaluation criteria used to assess new and existing techniques. Finally, by identifying research gaps and proposing future directions, our review provides valuable guidance to researchers in developing and situating further studies within the current body of literature.
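The filter approach mentioned above scores each feature independently of any classifier and discards low-scoring terms. A minimal sketch with scikit-learn, using a tiny made-up corpus and an illustrative k=2 (both assumptions, not from the review):

```python
# Filter-style feature selection for text: rank terms by a chi-squared score
# against the class labels and keep only the top-k informative features.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2

docs = ["cheap loans offer", "meeting agenda today",
        "cheap offer now", "project meeting notes"]
labels = [1, 0, 1, 0]  # 1 = spam-like, 0 = work-like (toy labels)

X = CountVectorizer().fit_transform(docs)          # bag-of-words matrix
selector = SelectKBest(chi2, k=2).fit(X, labels)   # score terms, keep top 2
X_reduced = selector.transform(X)                  # non-informative terms dropped
```

Wrapper, hybrid, and embedded approaches differ mainly in whether a classifier's performance drives the search for the retained subset; a filter like this one is the cheapest, classifier-agnostic baseline.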
In this paper, we used four classification methods to classify objects and compared among these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for object classification and detection; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
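The pipeline described above (grayscale, histogram equalization, 20 x 20 resize, PCA features, then a classifier) can be sketched as follows. Random images and labels stand in for the real dataset, and only the KNN classifier is shown; sizes and parameters are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(60, 40, 40))   # stand-in 8-bit gray images
labels = rng.integers(0, 2, size=60)               # stand-in class labels

def hist_equalize(img):
    # classic histogram equalization on an 8-bit image
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) * 255 / (cdf.max() - cdf.min())
    return cdf[img.astype(np.uint8)]

# equalize, downsample 40x40 -> 20x20, and flatten each image
X = np.array([hist_equalize(im)[::2, ::2].ravel() for im in images])
X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.3, random_state=0)       # the 7:3 split from the text

pca = PCA(n_components=10).fit(X_tr)                # feature extraction
knn = KNeighborsClassifier().fit(pca.transform(X_tr), y_tr)
acc = knn.score(pca.transform(X_te), y_te)
```

The other three classifiers (SGD, LR, MLP) drop into the same pipeline by swapping the final estimator.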
In many video and image processing applications, frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is based on constructing auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated the proposed algorithm experimentally.
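A simplified sketch of what "projections on separable 2D basis functions" means for overlapping blocks. The basis here (degree-0 and degree-1 polynomials sampled on the block) and the sizes are illustrative assumptions; the paper's auxiliary-matrix speedup is not reproduced, only the separability it exploits:

```python
import numpy as np

def block_features(image, basis, block):
    """Project every overlapping `block`x`block` window onto the basis rows."""
    h, w = image.shape
    k = basis.shape[0]
    feats = np.empty((h - block + 1, w - block + 1, k, k))
    for i in range(h - block + 1):
        for j in range(w - block + 1):
            win = image[i:i + block, j:j + block]
            # separability: project along rows, then columns, instead of a
            # full 2D inner product for each basis pair
            feats[i, j] = basis @ win @ basis.T
    return feats

image = np.arange(36.0).reshape(6, 6)      # toy 6x6 image
x = np.linspace(-1, 1, 4)
basis = np.vstack([np.ones_like(x), x])    # constant and linear basis rows
f = block_features(image, basis, block=4)  # features for all 3x3 window positions
```

Because the degree-0 basis row is all ones, `f[i, j, 0, 0]` is simply the sum of the window at position (i, j); the paper's contribution is removing the explicit double loop via pre-calculated matrices.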
In this paper, a new method of variable selection is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, working with a dataset of this kind is difficult due to its large size. The modified model is compared to some standard variable selection methods; perfect classification is achieved by the modified Elastic Net model because it has the best performance. All the calculations that have been done for this paper are in
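The core idea, Elastic-Net-based variable selection, can be sketched as follows: fit a penalized model and keep the variables with non-zero coefficients. Synthetic data stands in for the Leukemia set, the penalty values are illustrative, and the paper's specific modification to the Elastic Net is not reproduced:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)
X = rng.normal(size=(72, 300))    # 72 samples, many more variables than samples
# only the first two variables truly drive the response
y = X[:, 0] * 3 - X[:, 1] * 2 + rng.normal(scale=0.1, size=72)

# l1_ratio close to 1 behaves like the lasso and zeroes out weak variables
model = ElasticNet(alpha=0.1, l1_ratio=0.9).fit(X, y)
selected = np.flatnonzero(model.coef_)   # indices of the retained variables
```

The L1 part of the penalty drives most coefficients exactly to zero, which is what makes the fitted model usable as a variable selector on gene-expression-sized data.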
In this paper, two local search algorithms, the genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in reasonable time.
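A toy sketch of the setup above: a genetic algorithm searching job permutations on a single machine. For brevity the objective keeps only total completion time plus total tardiness (earliness and late work are omitted), and the processing times, due dates, and GA parameters are all made up:

```python
import random

jobs = [(3, 6), (2, 4), (4, 10), (1, 3), (5, 12)]  # (processing time, due date)

def cost(seq):
    t, total = 0, 0
    for j in seq:
        p, d = jobs[j]
        t += p                        # completion time of job j
        total += t + max(0, t - d)    # completion time + tardiness
    return total

def ga(generations=200, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(range(len(jobs)), len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        pop = pop[:pop_size // 2]                   # elitist selection
        while len(pop) < pop_size:
            child = rng.choice(pop[:5])[:]          # copy a strong parent
            i, j = rng.sample(range(len(jobs)), 2)  # swap mutation
            child[i], child[j] = child[j], child[i]
            pop.append(child)
    return min(pop, key=cost)

best = ga()   # best job sequence found
```

For an instance this small the optimum can be checked by enumerating all permutations, which is essentially what the BAB comparison in the paper does more cleverly for n up to 18.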
Schiff bases, named after Hugo Schiff, are aldehyde- or ketone-like compounds in which the carbonyl group is replaced by an imine or azomethine group. They are widely used for industrial purposes and also have a broad range of applications as antioxidants. This review gives an overview of the antioxidant applications of Schiff bases and their complexes, along with a brief history of their synthesis and reactivity. Factors affecting antioxidant activity are illustrated and discussed. Copyright © 2016 John Wiley & Sons, Ltd.
The research (Anthropology and Representations of Magic in the Arab Theatrical Text: Harut and Marut's Play as a Model) is concerned with studying magic and the forms of its presence in the theatrical text across the different human cultures to which it belongs. The research consists of four chapters.
The first chapter includes the research problem, which revolves around the following question: what is the mechanism of employing the anthropology of magic and its representations in the Arab theatrical text, with Harut and Marut's play as a model? It also presents the importance of the research, which is attributed to the necessity of studying (magic) in the Arab theatrical text, as it is considered the inauguration of one of the social phenomena that many researchers in the field of
Life insurance companies need a sound system for selecting insurable risks so that they can avoid, or reduce to a minimum level, the possible losses that may be insured. However, practice within the Iraqi Insurance Company shows that it still depends on traditional procedures for selecting those risks.
This research represents an attempt to offer acceptable suggestions for developing the system currently used by the Iraqi Insurance Company to select insurable risks, by recognizing the risks of life insurance, determining the kinds of risks that can be defined as normal and abnormal, and rectification of the
Drama is one of the means of transmitting human experience: it presents to the spectator ideas and visions of life, and he is subject to their influence, robbed of his will before its charm and its various arts of display, which invade him with their dimensions and act upon his references. This art form is based on stories revolving around characters involved in events that grow out of the struggle of two conflicting opponents, or of two opposing forces or emotions generated by a voluntary conflict. This dramatic conflict represents the most important element of those events, as it is embodied in an inevitable scene that emerges from the other scenes, and this scene is sometimes subject to the
Deconstruction theory is a theory that appeared after structuralism, and through some key principles it seeks to reach the purposive, central meaning of the text by means of different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to reach the exact concept of the text, and this is achieved through reading and analyzing it. Deconstruction has therefore specified certain principles through which the exact meaning of the text can be reached.
Introduction:
Deconstruction theory is a theory that emerged after structuralism and seeks to