Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison with English, only a few studies have been conducted on categorizing and classifying Arabic text. Arabic text representation is a difficult task for a variety of applications, such as text classification and clustering, because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of studies from the last five years, organized by dataset, year, algorithms, and reported accuracy. Deep Learning (DL) and Machine Learning (ML) models have been used to enhance text classification for the Arabic language, and remarks for future work are provided.
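As a concrete illustration of the three phases mentioned above, the following is a minimal sketch of an Arabic text classification pipeline, assuming scikit-learn; the tiny corpus, the light diacritic-stripping preprocessing, and the TF-IDF plus linear SVM choices are illustrative and not taken from any of the surveyed papers.

```python
# Minimal three-phase sketch: preprocessing -> feature extraction -> classification.
# Assumes scikit-learn; corpus and labels below are hypothetical examples.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import Pipeline

def preprocess(text: str) -> str:
    """Very light Arabic preprocessing: strip diacritics and tatweel."""
    text = re.sub(r"[\u064B-\u0652\u0640]", "", text)   # harakat + tatweel
    return re.sub(r"\s+", " ", text).strip()

docs = ["خبر رياضي عن المباراة", "مقال اقتصادي عن الأسواق"]   # hypothetical documents
labels = ["sports", "economy"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(preprocessor=preprocess, analyzer="word")),  # feature extraction
    ("clf", LinearSVC()),                                                  # classification
])
model.fit(docs, labels)
print(model.predict(["تحليل مالي جديد"]))
```

In practice the classifier stage could equally be a DL model; the pipeline structure stays the same.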
There is evidence that channel estimation in communication systems plays a crucial role in recovering the transmitted data. In recent years, there has been increasing interest in solving the problems of channel estimation and equalization, especially when the channel impulse response follows a fast time-varying Rician fading distribution, meaning the channel impulse response changes rapidly. Therefore, optimal channel estimation and equalization are needed to recover the transmitted data. This paper compares the epsilon normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by evaluating their ability to track multiple fast time-varying Rician fading channels with different values of the Doppler frequency.
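For reference, below is a minimal NumPy sketch of the two adaptive estimators being compared (ε-NLMS and RLS); the complex-valued filter formulation and the step size, regularizer, and forgetting factor values are generic illustrative choices, not the simulation settings of the paper.

```python
# One-step updates for the two adaptive channel estimators. Assumes NumPy only;
# w is the current channel estimate, x the input (regressor) vector, d the
# received sample. Parameters mu, eps and lam are illustrative.
import numpy as np

def nlms_step(w, x, d, mu=0.5, eps=1e-6):
    """One ε-NLMS update: step size normalised by the input energy."""
    e = d - np.vdot(w, x)                                  # a-priori error
    w = w + (mu / (eps + np.vdot(x, x).real)) * x * np.conj(e)
    return w, e

def rls_step(w, P, x, d, lam=0.99):
    """One RLS update with forgetting factor lam (enables tracking of time variation)."""
    Px = P @ x
    k = Px / (lam + np.vdot(x, Px).real)                   # gain vector
    e = d - np.vdot(w, x)                                  # a-priori error
    w = w + k * np.conj(e)
    P = (P - np.outer(k, np.conj(x)) @ P) / lam            # inverse-correlation update
    return w, P, e

# Hypothetical initialisation for an L-tap channel:
# L = 4; w = np.zeros(L, complex); P = 100.0 * np.eye(L, dtype=complex)
```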
The aim of this work is to create a power control system for wind turbines based on fuzzy logic. Three power control loops were considered: changing the pitch angle of the blades, changing the blade length, and turning the nacelle. A stochastic law was assumed for the changes in wind conditions and for their instantaneous, inexact assessment. Two different algorithms were used for fuzzy inference in the control loop, the Mamdani and Larsen algorithms. Both algorithms are realized and developed in this study in the MATLAB Fuzzy Logic Toolbox and practically implemented as the necessary intelligent control system in electrical engineering.
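To make the difference between the two inference schemes concrete, here is a hand-rolled toy comparison of Mamdani (min) and Larsen (product) implication for a single pitch-angle rule base, assuming NumPy; the universes, membership functions, and firing strengths are illustrative and do not reproduce the MATLAB Fuzzy Logic Toolbox controller described above.

```python
# Toy Mamdani-vs-Larsen comparison on one output variable (pitch angle).
# Assumes NumPy only; all sets, rules and firing strengths are hypothetical.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on universe x."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

pitch = np.linspace(0.0, 30.0, 301)                 # output universe (degrees)
out_sets = {"low":  tri(pitch, 0, 5, 10),
            "mid":  tri(pitch, 8, 15, 22),
            "high": tri(pitch, 20, 25, 30)}

def infer(firing, implication):
    """Apply the implication per rule, aggregate by max, defuzzify by centroid."""
    agg = np.zeros_like(pitch)
    for label, strength in firing.items():
        agg = np.maximum(agg, implication(strength, out_sets[label]))
    return np.sum(agg * pitch) / (np.sum(agg) + 1e-12)

# Rule firing strengths for one wind-speed reading (illustrative numbers).
firing = {"low": 0.2, "mid": 0.7, "high": 0.1}

mamdani = infer(firing, lambda s, mf: np.minimum(s, mf))   # min implication
larsen  = infer(firing, lambda s, mf: s * mf)              # product implication
print(f"Mamdani pitch: {mamdani:.2f} deg, Larsen pitch: {larsen:.2f} deg")
```

The two schemes differ only in how each rule's firing strength shapes its output set (clipping versus scaling), which is why they can be swapped inside the same control loop.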
The field of autonomous robotic systems has advanced tremendously in the last few years, allowing them to perform complicated tasks in various contexts. One of the most important and useful applications of guide robots is supporting the blind. Successful implementation of this study requires an accurate and powerful self-localization system for guide robots in indoor environments. This paper proposes a self-localization system for guide robots. To carry out this study, images were collected from the perspective of a robot inside a room, and a deep learning model, a convolutional neural network (CNN), was used. An image-based image-classification system delivers more accurate self-localization for the guide robot.
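The following is a minimal sketch of the kind of CNN image classifier such a self-localization system relies on, assuming tf.keras; the input size, the number of location classes, and the layer sizes are illustrative, not the paper's actual model.

```python
# Small CNN that classifies a room image into one of several location classes.
# Assumes TensorFlow/Keras; NUM_LOCATIONS and the architecture are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_LOCATIONS = 8          # hypothetical number of positions/zones in the room

model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_LOCATIONS, activation="softmax"),  # one class per location
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_location_labels, epochs=10)  # hypothetical data
```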
This study aimed to explore self- and public stigma towards mental illness and associated factors among university students from 11 Arabic-speaking countries. This cross-sectional study included 4241 university students recruited from Oman, Saudi Arabia, the United Arab Emirates (UAE), Syria, Sudan, Bahrain, Iraq, Jordan, Lebanon, Palestine and Egypt. The participants completed three self-administered online questionnaires: a Demographic Proforma (age, gender, family income, etc.), the Peer Mental Health Stigmatization Scale and the Mental Health Knowledge Questionnaire. There was a significant difference in mean scores between the 11 countries.
Products’ quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. With traditional inspection, the process relies on manual methods that generate various costs and consume a large amount of time. In contrast, today’s inspection systems that use modern techniques such as computer vision are more accurate and efficient. However, the amount of work needed to build a computer vision system based on classic techniques is relatively large, owing to the need to manually select and extract features from digital images, which also produces labor costs for the system engineers. In this research, we propose …
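As one possible illustration of avoiding hand-crafted features in visual inspection, here is a minimal sketch of a learned-feature defect classifier, assuming tf.keras and a pretrained MobileNetV2 backbone; the class count, input size, and data are hypothetical and do not represent the approach proposed in the paper.

```python
# Transfer-learning sketch: pretrained features replace manual feature extraction.
# Assumes TensorFlow/Keras; "ok" vs "defective" classes and dataset are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers, models

backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False,
    weights="imagenet", pooling="avg")
backbone.trainable = False                      # reuse the learned features as-is

model = models.Sequential([
    backbone,
    layers.Dense(64, activation="relu"),
    layers.Dense(2, activation="softmax"),      # e.g. "ok" vs "defective"
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(product_images, defect_labels, epochs=5)   # hypothetical dataset
```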
 
... Show MoreText science presented a set of criteria to make the text essentially a project to create
texts and use. Me and means of cohesion script text scientists, two standard foundries and
knitting. Find this means their equivalent in the Arab rhetorical Heritage has been found, it
means foundries find Accompanying represented (link grammar in the classroom and link),
and referrals represented by (Baldmair, Ldefinition, and the name of the signal), and
deletion, and repetition, and presentation delays. As in the standard knitting it has confirmed
Albulagjun Arabs on the semantic consistency between the text components, as reflected in
the moral link in Chapter interfaces, as well as in moral coherence between parts of the te
Deconstruction theory is a theory that appeared after structuralism, and it tends, through some key principles, to reach the purposive and main meaning of the text by means of different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to reach the exact concept of the text, and this is achieved through reading and analyzing the text. Therefore, deconstruction has specified certain principles so as to reach the exact meaning of the text through these different principles.
Introduction:
Deconstruction theory is a theory that emerged after structuralism, and it seeks, through …