The support vector machine (SVM) is a supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with very large datasets, however, can cause a variety of problems, including reduced accuracy and long computation times. In this research, SVM was extended by applying kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm based on the kernel trick. The proposed method was examined using three simulated datasets with different sample sizes (50, 100, 200). Non-linear SVM was compared with two standard classification methods across several evaluation criteria. The study shows that the non-linear SVM method gives better results in terms of sensitivity, specificity, accuracy, and computation time. © 2024 Author(s).
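A minimal sketch (not the paper's code) of the comparison the abstract describes: fitting SVM classifiers with the four kernels on a simulated two-class dataset and reporting sensitivity, specificity, accuracy, and run time. The "multi-layer" kernel is approximated here by scikit-learn's sigmoid kernel, and the simulated data stand in for the paper's datasets.

```python
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

# Simulated two-class data as a stand-in for the paper's simulation datasets.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "poly", "rbf", "sigmoid"):  # sigmoid ~ "multi-layer" kernel
    start = time.perf_counter()
    model = SVC(kernel=kernel).fit(X_tr, y_tr)
    elapsed = time.perf_counter() - start
    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    acc = (tp + tn) / (tp + tn + fp + fn)
    print(f"{kernel}: sens={sens:.2f} spec={spec:.2f} acc={acc:.2f} time={elapsed:.4f}s")
```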
Despite the rapid growth of neural machine translation (NMT), there is still little insight into how these models perform on semantically and culturally rich texts, especially between linguistically distant languages such as Arabic and English. In this paper, we investigate the performance of two state-of-the-art AI translation systems (ChatGPT, DeepSeek) when translating Arabic texts into English in three genres: journalistic, literary, and technical. The study uses a mixed-methods evaluation based on a balanced corpus of 60 Arabic source texts drawn from the three genres. Objective measures, including BLEU and TER, and subjective evaluations from human translators were employed to assess the semantic, contextual an…
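A hedged sketch of the automatic part of such an evaluation: scoring candidate English translations against human references with BLEU and TER. The sacrebleu library and the placeholder sentences are assumptions for illustration; the abstract does not name the tooling used.

```python
from sacrebleu.metrics import BLEU, TER

# Placeholder system outputs and references, not the study's corpus.
hypotheses = ["The committee approved the new budget yesterday."]
references = [["The committee approved the new budget yesterday."]]  # one reference stream

bleu = BLEU().corpus_score(hypotheses, references)  # higher is better
ter = TER().corpus_score(hypotheses, references)    # lower is better
print(bleu)
print(ter)
```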
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics, but they require complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff…
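An illustrative sketch, not the chapter's implementation: iterative Levenberg-Marquardt refinement of a few control-point displacements so that each warped signal moves toward the current group mean, with a simple stiffness penalty on neighbouring displacements. The 1-D signals, control-point count, and stiffness weight are assumptions chosen to keep the example small and runnable.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# Toy "dataset": the same bump shifted by random offsets.
signals = [np.exp(-((x - 0.5 - s) ** 2) / 0.01) for s in rng.normal(0, 0.03, 5)]
ctrl = np.linspace(0.0, 1.0, 6)   # small number of control points
stiffness = 5.0                   # weight of the stiffness constraint (assumed)

def warp(signal, disp):
    """Resample a signal along x shifted by a displacement field defined at the
    control points and linearly interpolated in between."""
    dense_disp = np.interp(x, ctrl, disp)
    return np.interp(x + dense_disp, x, signal)

def residuals(disp, signal, mean):
    data_term = warp(signal, disp) - mean        # distance to the current mean
    smooth_term = stiffness * np.diff(disp)      # penalise uneven displacements
    return np.concatenate([data_term, smooth_term])

disps = [np.zeros_like(ctrl) for _ in signals]
for _ in range(5):   # alternate: re-estimate the mean, refine each warp with LM
    mean = np.mean([warp(s, d) for s, d in zip(signals, disps)], axis=0)
    disps = [least_squares(residuals, d, args=(s, mean), method="lm").x
             for s, d in zip(signals, disps)]
```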
With the rapid development of computers and network technologies, information security on the internet has become compromised, and many threats can affect the integrity of such information. Many researchers have focused their work on providing solutions to these threats. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed. Two levels of feature selection and classification are used. In the first level, a global feature vector is selected for detecting the basic attack categories (DoS, U2R, R2L, and Probe). In the second level, four local feature vect…
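A minimal sketch of the two-level idea, with assumed interfaces rather than the paper's code: a first classifier trained on a global feature subset predicts the broad attack category, then a per-category classifier trained on a local feature subset refines the decision. The feature indices, base classifier, and category labels are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

GLOBAL_FEATURES = [0, 1, 2, 3, 4]                   # assumed global feature subset
LOCAL_FEATURES = {"DoS": [5, 6], "Probe": [7, 8],   # assumed per-category subsets
                  "R2L": [9, 10], "U2R": [11, 12]}

def fit_hierarchy(X, y_category, y_subclass):
    """Level 1: predict the attack category from global features.
    Level 2: one classifier per category, trained on its local features."""
    level1 = RandomForestClassifier().fit(X[:, GLOBAL_FEATURES], y_category)
    level2 = {}
    for cat, cols in LOCAL_FEATURES.items():
        mask = y_category == cat
        if mask.any():
            level2[cat] = RandomForestClassifier().fit(X[mask][:, cols], y_subclass[mask])
    return level1, level2

def predict_hierarchy(level1, level2, X):
    cats = level1.predict(X[:, GLOBAL_FEATURES])
    out = cats.astype(object)
    for cat, clf in level2.items():
        mask = cats == cat
        if mask.any():
            out[mask] = clf.predict(X[mask][:, LOCAL_FEATURES[cat]])
    return out
```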
The general health of palm trees, encompassing the roots, stems, and leaves, significantly affects palm oil production; therefore, meticulous attention is needed to achieve optimal yield. One of the challenges in sustaining productive crops is the prevalence of pests and diseases afflicting oil palm plants. These diseases can detrimentally influence growth and development, leading to decreased productivity. Oil palm productivity is closely related to the condition of its leaves, which play a vital role in photosynthesis. This research employed a comprehensive dataset of 1,230 images, consisting of 410 showing leaves, another 410 depicting bagworm infestations, and an additional 410 displaying caterpillar infestations. Furthe…
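The abstract is cut off before the model is described, so the following is only a generic, hedged sketch of how a three-class leaf-image classifier (leaf, bagworm, caterpillar) could be set up with transfer learning. The directory layout, backbone, and hyperparameters are assumptions, not the paper's setup.

```python
import tensorflow as tf

# Assumed directory layout: one sub-folder per class (leaf/, bagworm/, caterpillar/).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "oil_palm_leaves/train", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg",
                                         input_shape=(224, 224, 3))
base.trainable = False  # reuse ImageNet features as-is

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dense(3, activation="softmax"),      # 3 classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```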
Information from 54 Magnetic Resonance Imaging (MRI) brain tumor images (27 benign and 27 malignant) was collected and fed into the multilayer perceptron artificial neural network available in the well-known IBM SPSS 17 software (Statistical Package for the Social Sciences). After many attempts, the automatic architecture option was adopted in this research work. Thirteen shape and statistical characteristics of the images were considered. The neural network achieved 89.1% correct classification for the training sample and 100% correct classification for the test sample. The normalized importance of the considered characteristics showed that kurtosis accounted for 100%, which means that this variable has a substantial effect…
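A hedged sketch of the classification step outside SPSS: a multilayer perceptron trained on 13 features per image to separate benign from malignant cases. The random feature matrix stands in for the 54 MRI-derived feature vectors, and scikit-learn defaults stand in for the SPSS automatic architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(54, 13))        # 54 images x 13 shape/statistical features (placeholder)
y = np.array([0] * 27 + [1] * 27)    # 27 benign (0), 27 malignant (1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
mlp = MLPClassifier(max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("training accuracy:", mlp.score(X_tr, y_tr))
print("test accuracy:", mlp.score(X_te, y_te))
```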
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson model. This study aimed to compare the two models and choose the better one using simulation at different sample sizes (n = 25, 50, 100) with r = 1000 replications. MATLAB was used to conduct the simulation experiment, and the results showed the superiority of the Poisson model according to the mean square error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
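A hedged illustration of the comparison pipeline in Python rather than MATLAB: simulate Poisson-distributed responses at the abstract's sample sizes, fit a Poisson regression, and report AIC and MSE. The Conway-Maxwell-Poisson fit is omitted because it needs a dedicated implementation not described in the abstract; the coefficients and predictor are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
for n in (25, 50, 100):                         # sample sizes from the abstract
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(0.5 + 0.8 * x))      # assumed true log-link Poisson model
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mse = np.mean((y - fit.fittedvalues) ** 2)
    print(f"n={n}: AIC={fit.aic:.2f}, MSE={mse:.2f}")
```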