The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including poor accuracy and long computation times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined using three simulated datasets with different sample sizes (50, 100, 200). A comparison between non-linear SVM and two standard classification methods was carried out on several evaluation criteria. Our study shows that the non-linear SVM method gives better results in terms of sensitivity, specificity, accuracy, and computation time. © 2024 Author(s).
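A minimal sketch of the comparison described above, assuming a synthetic two-class dataset generated with scikit-learn; the 'sigmoid' kernel stands in for the paper's "multi-layer" kernel, and the sample sizes mirror the simulation sizes (50, 100, 200). This is illustrative only, not the paper's exact experiment.

```python
# Hedged sketch: comparing SVM kernels on synthetic two-class data.
# 'sigmoid' is used as a stand-in for the paper's "multi-layer" kernel.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

for n in (50, 100, 200):
    X, y = make_classification(n_samples=n, n_features=4, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    for kernel in ("linear", "poly", "rbf", "sigmoid"):
        start = time.perf_counter()
        clf = SVC(kernel=kernel).fit(X_tr, y_tr)
        elapsed = time.perf_counter() - start
        tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + tn + fp + fn)
        print(f"n={n} kernel={kernel}: sens={sensitivity:.2f} "
              f"spec={specificity:.2f} acc={accuracy:.2f} time={elapsed:.4f}s")
```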
The research comprises five sections. The first section covers the introduction of the research and its importance, addressing the importance of gymnastics, the parallel bars event, and the importance of learning it. The research problem is that there is variation in learning this skill and difficulty in acquiring it; among the most important reasons are fear of falling and injury, and a weak kinesthetic sense of the movement, which is one of the obstacles to completing the skill. The aim of the research is to design a device that helps in learning the skill of the facing descent (Almtor) with a half turn according to the typical locomotor track on the men's parallel bars apparatus.
Preventing bankruptcy not only prolongs the economic life of a company and increases its financial performance, but also helps improve the general economic well-being of the country. Therefore, forecasting the financial shortfall can affect various factors and different aspects of the company, including dividends. In this regard, this study examines the prediction of the financial deficit of companies using the logistic regression method and its impact on the earnings per share of companies listed on the Iraqi Stock Exchange. The research period runs from 2015 to 2020; 33 companies listed on the Iraqi Stock Exchange were selected as the sample, and the res...
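A minimal sketch of a logistic-regression distress classifier of the kind described above, assuming hypothetical financial-ratio features and synthetic labels; these are placeholders, not the study's actual variables or data.

```python
# Hedged sketch: logistic regression for financial-distress classification.
# Feature names and labels are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 198  # stand-in panel size, e.g. 33 firms x 6 years
df = pd.DataFrame({
    "current_ratio": rng.normal(1.5, 0.5, n),
    "debt_to_assets": rng.uniform(0.1, 0.9, n),
    "return_on_assets": rng.normal(0.05, 0.08, n),
})
# Synthetic label: highly leveraged, unprofitable firms are marked distressed.
df["distressed"] = ((df["debt_to_assets"] > 0.6) & (df["return_on_assets"] < 0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    df.drop(columns="distressed"), df["distressed"], test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te), zero_division=0))
```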
Water/oil emulsion is considered the most refractory mixture to separate because of the interference of the two immiscible liquids, water and oil. This research presents a study of dewatering a water/kerosene emulsion using a hydrocyclone. The effects of feed flow rate (3, 5, 7, 9, and 11 L/min), inlet water concentration of the emulsion (5%, 7.5%, 10%, 12.5%, and 15% by volume), and split ratio (0.1, 0.3, 0.5, 0.7, and 0.9) on the separation efficiency and pressure drop were studied. Dimensional analysis using the Buckingham Pi theorem was applied for the first time to model the hydrocyclone based on the experimental data. It was shown that the maximum separation efficiency, at a split ratio of 0.1, was 94.3% at 10% co...
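A minimal sketch of the usual end product of a Buckingham Pi analysis: fitting a power-law correlation between dimensionless groups by linear regression in log space. The groups and the numbers below are illustrative placeholders, not the paper's measured values, and in a real study the groups would be varied independently rather than together.

```python
# Hedged sketch: power-law correlation between dimensionless groups,
# fitted in log space. All values are placeholders for illustration.
import numpy as np

Re  = np.array([8e3, 1.3e4, 1.9e4, 2.4e4, 3.0e4])   # Reynolds number (placeholder)
S   = np.array([0.1, 0.3, 0.5, 0.7, 0.9])            # split ratio
phi = np.array([0.05, 0.075, 0.10, 0.125, 0.15])     # inlet water volume fraction
eff = np.array([0.94, 0.88, 0.81, 0.74, 0.66])       # separation efficiency (placeholder)

# log(eff) = log(a) + b*log(Re) + c*log(S) + d*log(phi)
A = np.column_stack([np.ones_like(Re), np.log(Re), np.log(S), np.log(phi)])
coef, *_ = np.linalg.lstsq(A, np.log(eff), rcond=None)
a, b, c, d = np.exp(coef[0]), coef[1], coef[2], coef[3]
print(f"eff ~ {a:.3g} * Re^{b:.3f} * S^{c:.3f} * phi^{d:.3f}")
```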
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp...
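A minimal sketch of wavelet-based speech compression by coefficient thresholding, assuming a synthetic stand-in signal and the PyWavelets library. Only the standard DWT path is shown; the MCT/GHM multiwavelet transform from the paper is not a standard library routine.

```python
# Hedged sketch: DWT-based speech compression by hard-thresholding coefficients.
# The signal is synthetic; only the DWT path (not the paper's MCT) is sketched.
import numpy as np
import pywt

fs = 8000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 200 * t) * np.exp(-2 * t)   # synthetic stand-in for speech

coeffs = pywt.wavedec(speech, "db4", level=5)
flat = np.concatenate(coeffs)
threshold = np.percentile(np.abs(flat), 90)              # keep the largest ~10% of coefficients
compressed = [pywt.threshold(c, threshold, mode="hard") for c in coeffs]

reconstructed = pywt.waverec(compressed, "db4")[: len(speech)]
kept = sum(np.count_nonzero(c) for c in compressed)
cr = flat.size / kept                                     # compression ratio
snr = 10 * np.log10(np.sum(speech**2) / np.sum((speech - reconstructed) ** 2))
print(f"compression ratio ~ {cr:.1f}, SNR ~ {snr:.1f} dB")
```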
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its key properties are high prediction accuracy and the ability to perform estimation and variable selection simultaneously. The penalized least squares method gives a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
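A minimal sketch of the idea, assuming synthetic high-dimensional data with injected outliers: an ordinary penalized fit (Lasso) compared with one possible robust variant, a Huber loss plus an L1 penalty minimized directly. The robust estimator shown is an illustrative choice, not necessarily the one studied in the paper.

```python
# Hedged sketch: penalized least squares (Lasso) vs. a Huber-loss + L1 fit
# on synthetic high-dimensional data (p > n) with outliers in the response.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 100                      # more variables than observations
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:5] = 3.0
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y[:5] += 20                         # inject outlying observations

lasso = Lasso(alpha=0.1).fit(X, y)

def huber_l1(beta, delta=1.0, lam=0.1):
    r = y - X @ beta
    loss = np.where(np.abs(r) <= delta, 0.5 * r**2, delta * (np.abs(r) - 0.5 * delta))
    return loss.mean() + lam * np.abs(beta).sum()

res = minimize(huber_l1, np.zeros(p), method="L-BFGS-B")
print("nonzero Lasso coefficients:", np.count_nonzero(lasso.coef_))
print("large robust coefficients:", np.count_nonzero(np.abs(res.x) > 0.1))
```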
Eye detection is used in many applications such as pattern recognition, biometrics, surveillance systems, and many other systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, an observed geometric shape is perceptually "meaningful" if its number of occurrences in an image with random distribution is very small. To achieve this goal, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera...
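For orientation only, a minimal sketch of a conventional eye detector (an OpenCV Haar cascade); this is a common baseline, not the Helmholtz/Gestalt method proposed in the paper. The input path 'face.jpg' is a placeholder.

```python
# Hedged sketch: baseline Haar-cascade eye detection with OpenCV.
# Not the paper's Helmholtz/Gestalt method; 'face.jpg' is a placeholder path.
import cv2

img = cv2.imread("face.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

for (x, y, w, h) in eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)   # mark detected eye region

cv2.imwrite("eyes_detected.jpg", img)
```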
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which has gone through repeated crises and suffers from a severe shortage of electric power because of the wars and calamities it has endured. The impact of that period is still evident in all aspects of the daily life of Iraqis because of the remnants of wars, siege, terrorism, the wrong policies of earlier and later governments, and regional interventions and their consequences, such as the destruction of electric power stations, along with population growth, which must be matched by an increase in electric power stations, ...
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared between the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is able to synchronize with either of the two parties during the training process.
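A minimal sketch of mutual learning between two Tree Parity Machines, the standard construction in neural key exchange; the parameters (K, N, L) and the Hebbian update rule are textbook choices, not necessarily those of the paper. Synchronization is detected when both weight matrices become identical, and a key would then be derived from the shared weights.

```python
# Hedged sketch: mutual learning of two Tree Parity Machines (TPMs).
# K hidden units, N inputs per unit, weights bounded in [-L, L].
import numpy as np

K, N, L = 3, 10, 3
rng = np.random.default_rng(0)

def init_weights():
    return rng.integers(-L, L + 1, size=(K, N))

def output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1              # break ties consistently
    return sigma, int(np.prod(sigma))

def hebbian(w, x, sigma, tau):
    for k in range(K):
        if sigma[k] == tau:             # only units that agree with the output learn
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA, wB = init_weights(), init_weights()
steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))        # public random input vector
    sA, tauA = output(wA, x)
    sB, tauB = output(wB, x)
    if tauA == tauB:                            # outputs are exchanged over the public channel
        hebbian(wA, x, sA, tauA)
        hebbian(wB, x, sB, tauB)
    steps += 1

print(f"synchronized after {steps} exchanged inputs; key derived from shared weights")
```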
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface. Secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
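As a point of reference, a minimal sketch of a simple shadow mask obtained by thresholding the lightness channel in LAB color space, a common baseline for outdoor images; it is illustrative only and not the segmentation/function method described in the paper. The input path 'outdoor.jpg' is a placeholder.

```python
# Hedged sketch: baseline shadow detection and crude removal in LAB space.
# Not the paper's method; 'outdoor.jpg' is a placeholder path.
import cv2
import numpy as np

img = cv2.imread("outdoor.jpg")
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
L_chan, A_chan, B_chan = cv2.split(lab)

# Shadow pixels tend to have low lightness; threshold L below mean minus std.
thresh = L_chan.mean() - L_chan.std()
shadow_mask = (L_chan < thresh).astype(np.uint8) * 255
shadow_mask = cv2.morphologyEx(shadow_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Crude removal: scale shadowed lightness toward the non-shadow mean lightness.
mean_lit = L_chan[shadow_mask == 0].mean()
L_corr = L_chan.astype(np.float32)
L_corr[shadow_mask == 255] *= mean_lit / max(L_chan[shadow_mask == 255].mean(), 1)
result = cv2.cvtColor(
    cv2.merge([np.clip(L_corr, 0, 255).astype(np.uint8), A_chan, B_chan]),
    cv2.COLOR_LAB2BGR)
cv2.imwrite("shadow_removed.jpg", result)
```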
Objective(s): To assess the factors associated with prolonged prehospital delay in patients with acute myocardial infarction.
Methodology: A descriptive study was conducted at the Coronary Care Units (CCU) in Al-Yarmok Teaching Hospital, Ibn AL-Nafis Hospital for Cardiovascular Diseases, AL-Kadumia Teaching Hospital, Baghdad Teaching Hospital, and AL-Kindy Teaching Hospital during the study period from February 2nd, 2009 to October 30th, 2009. A random sample of (160) patients who were admitted to these hospitals was selected one by one. A questionnaire was constructed for the purpose of the study, comprising four parts: (1) sociodemographic data; (2) prehospital d...