Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, an SVM is widely used to select an optimal hyperplane that separates two classes. SVMs achieve very good accuracy and are extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and naïve Bayes. However, large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM is modified using a stochastic gradient descent procedure. The modified method, stochastic gradient descent SVM (SGD-SVM), is evaluated on two simulated datasets. Since the classification of different cancer types is important for cancer diagnosis and drug discovery, SGD-SVM is then applied to a dataset of the most common leukemia cancer types. The results obtained with SGD-SVM are more accurate than those of many studies that used the same leukemia datasets.
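The abstract does not give implementation details, but the general idea of training a linear SVM with stochastic gradient descent can be sketched with scikit-learn's SGDClassifier, whose hinge loss yields a linear SVM objective. The synthetic dataset and hyperparameters below are illustrative assumptions, not the paper's.

```python
# Minimal sketch of an SGD-trained linear SVM (SGD-SVM). The simulated
# dataset and hyperparameters are illustrative, not those of the paper.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# A two-class synthetic dataset standing in for the paper's simulation data.
X, y = make_classification(n_samples=2000, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# loss="hinge" makes SGDClassifier optimize a linear SVM objective via SGD,
# processing one sample at a time, which is what makes it scale to large data.
model = make_pipeline(
    StandardScaler(),
    SGDClassifier(loss="hinge", alpha=1e-4, max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```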
In this research, a group of gray texture images from the Brodatz database was studied by building a feature database for the images using the gray-level co-occurrence matrix (GLCM), with a pixel distance of one unit and four angles (0°, 45°, 90°, 135°). A k-means classifier was used to group the images into classes, from two up to eight classes, for every angle used in the co-occurrence matrix. The distribution of images across the classes was compared pairwise (projecting one class onto another); the distribution was uneven, with one class being dominant. The classification results were studied for all cases using a confusion matrix between every pair of classifications.
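A minimal sketch of such a GLCM-plus-k-means pipeline, assuming scikit-image and scikit-learn; random grayscale patches stand in for the Brodatz images, and the chosen Haralick properties are illustrative choices, not necessarily the paper's.

```python
# Sketch: GLCM features at distance 1 and angles 0/45/90/135 degrees,
# clustered with k-means. Random patches stand in for Brodatz images.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
images = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(20)]

angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]  # 0, 45, 90, 135 degrees
props = ["contrast", "homogeneity", "energy", "correlation"]

def glcm_features(img):
    glcm = graycomatrix(img, distances=[1], angles=angles,
                        levels=256, symmetric=True, normed=True)
    # One value per (property, angle) pair -> a flat feature vector.
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

X = np.array([glcm_features(img) for img in images])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(labels)
```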
Many problems were encountered during drilling operations in the Zubair oilfield. Stuck pipe, wellbore instability, and breakouts and washouts that exceeded critical limits were observed in many wells in this field, adding non-productive time to the total drilling time and hence extra cost. A 1D mechanical earth model (1D MEM) was built to suggest solutions to these types of problems. An overpressured zone was identified, and an alternative mud weight window was predicted based on the results of the 1D MEM. Using the 1D MEM, wellbore instability problems are diagnosed and predicted efficiently, and suitable alternative solutions are presented.
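The abstract does not state the 1D MEM equations, but the notion of a safe mud weight window can be illustrated under a common simplifying assumption: the lower bound is the larger of the pore-pressure and collapse gradients, and the upper bound is the fracture gradient. All names and values below are hypothetical, not from the Zubair study.

```python
# Hypothetical sketch of a safe mud weight window check, assuming the lower
# bound is the larger of the pore-pressure and collapse gradients and the
# upper bound is the fracture gradient (all as equivalent density, ppg).
def mud_weight_window(pore_grad, collapse_grad, fracture_grad):
    lower = max(pore_grad, collapse_grad)  # avoid a kick / borehole collapse
    upper = fracture_grad                  # avoid losses / fracturing
    if lower >= upper:
        raise ValueError("no safe mud weight window at this depth")
    return lower, upper

# Illustrative values only.
low, high = mud_weight_window(pore_grad=9.2, collapse_grad=9.8,
                              fracture_grad=12.5)
print(f"safe mud weight window: {low:.1f}-{high:.1f} ppg")
```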
The research aims to analyze sovereign credit indicators, with and without oil, to identify the challenges facing the Iraqi economy that would impede growth and economic development over the period 2004-2015.
The research also tries to draw lessons from those indicators. Among the most important conclusions are the acceptance of the research hypothesis and the weakness of Iraq's sovereign credit capacity to bear sovereign debt and its burden while working to achieve sustainable economic and social development, in an economy in which oil is neutralized as the single commodity relied upon to meet the requirements of efficiency and effectiveness.
In this study, the CR-39 detector technique was used to estimate uranium concentrations in soil at the Midland Refineries Company (Doura refinery), Baghdad, Iraq. Uranium concentrations in soil samples were measured using the solid-state nuclear track detector type CR-39. Nine soil samples were collected from different areas within the Doura refinery, and further soil samples were collected from Abu Tayara Street and Al-Shortaa District outside the refinery for comparison. The results showed variable uranium concentrations: the average value was 0.37 ppm inside the Doura refinery, while for areas outside the refinery it was 0.008 ppm.
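The abstract does not describe the conversion from CR-39 tracks to concentration, but a commonly used approach compares a sample's track density against a standard of known uranium content. The sketch below assumes that method; the track densities are hypothetical numbers chosen only so the example reproduces the reported 0.37 ppm average.

```python
# Hypothetical sketch of the standard-comparison method often used with
# CR-39: C_x = rho_x * (C_s / rho_s), where rho is track density and C_s is
# the known concentration of a standard. All numbers are illustrative only.
def uranium_ppm(track_density_sample, track_density_std, conc_std_ppm):
    return track_density_sample * conc_std_ppm / track_density_std

# e.g. track densities in tracks/mm^2 against a 1 ppm standard
print(uranium_ppm(track_density_sample=74.0,
                  track_density_std=200.0,
                  conc_std_ppm=1.0))  # -> 0.37 ppm
```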
This paper considers the design of a hybrid retina-matching algorithm for use in identification systems. Retina-based recognition is regarded as one of the most secure identification methods used to differentiate persons.
The characteristics of the Speeded-Up Robust Features (SURF) and Binary Robust Invariant Scalable Keypoints (BRISK) algorithms were combined to produce a faster matching algorithm than the classical ones; such characteristics are important for real-time applications, which usually require quick processing of a growing quantity of data. The algorithm is divided into three stages: retinal image processing and segmentation, extraction of local features, and matching.
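As a rough illustration of the feature-extraction and matching stages, the sketch below detects and matches BRISK keypoints with OpenCV. SURF is omitted because it requires the opencv-contrib build (cv2.xfeatures2d), and the retina image paths are hypothetical.

```python
# Sketch of BRISK keypoint detection and brute-force Hamming matching,
# i.e. the matching stage of such a pipeline. File paths are hypothetical
# placeholders for pre-segmented retinal images.
import cv2

img1 = cv2.imread("retina_enrolled.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("retina_probe.png", cv2.IMREAD_GRAYSCALE)

brisk = cv2.BRISK_create()
kp1, des1 = brisk.detectAndCompute(img1, None)
kp2, des2 = brisk.detectAndCompute(img2, None)

# BRISK descriptors are binary, so Hamming distance is the right metric.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
# Illustrative distance threshold for counting "good" matches.
print("good matches:", sum(m.distance < 60 for m in matches))
```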
Texture recognition is used in various pattern recognition and texture classification applications involving textures with a characteristic appearance. This paper aims to provide an improved scheme that yields better classification decisions while significantly reducing processing time. The discriminating characteristics of textures were studied by extracting them from various texture images using the discrete Haar transform (DHT) and the discrete Fourier transform (DFT). Two sets of features are proposed: the first was extracted using the traditional DFT, the second using the DHT. The features from the Fourier domain are calculated from the radial distribution of the spectrum, while those extracted from the Haar wavelet are based on statistical measures of the sub-bands.
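A hedged sketch of the two feature sets: a ring-averaged radial distribution of the DFT magnitude spectrum, and mean/standard-deviation statistics of single-level Haar sub-bands via PyWavelets. The binning and statistics here are illustrative choices, not necessarily the paper's exact ones.

```python
# Sketch of the two feature sets: radial distribution of the DFT spectrum
# and simple statistics of single-level Haar (DHT) sub-bands.
import numpy as np
import pywt

def fourier_radial_features(img, n_rings=8):
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)
    edges = np.linspace(0, r.max(), n_rings + 1)
    # Mean spectral energy inside each concentric ring around the center.
    return np.array([spectrum[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def haar_stat_features(img):
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), "haar")
    # Mean and standard deviation of each sub-band as simple statistics.
    return np.hstack([(b.mean(), b.std()) for b in (cA, cH, cV, cD)])

img = np.random.default_rng(0).random((64, 64))  # stand-in texture patch
print(fourier_radial_features(img).shape, haar_stat_features(img).shape)
```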
Automatic speaker profiling (ASP) is concerned with estimating the physical traits of a person from their voice, including gender, age, ethnicity, and physical parameters. Reliable ASP has a wide range of applications such as mobile shopping, customer service, robotics, forensics, security, and surveillance systems. Research in ASP has gained interest in the last decade; however, it has focused on individual tasks such as age, height, or gender. In this work, existing studies on the different speaker-profiling tasks are reviewed, including age estimation and classification, gender detection, and height and weight estimation. The study aims to provide insight into ASP work and the available datasets.
Anthropogenic activities cause soil pollution with serious pollutants such as polycyclic aromatic hydrocarbon (PAH) compounds. This study assessed PAH contamination in soil samples collected from 30 sites divided into eight groups (residential areas, oil areas, agricultural areas, roads, petrol stations, power plants, public parks, and electrical generators) in Basrah city, Iraq, during 2019-2020. Soil characteristics including moisture, pH, EC, and TOC were measured, with the following ranges: soil moisture 0.03-0.18%, pH 6.90-8.16, EC 2.48-104.80 mS/cm, and TOC 9.90-20.50%. Gas chromatography (GC) was used to measure PAHs in the extracted soil samples. Total PAHs ranged from 499.96 to 5864.86 ng/g dry weight.
Classification of network traffic is an important topic for network management, traffic routing, safe traffic discrimination, and better service delivery. Traffic examination is the overall process of analyzing traffic data, from intercepting it to discovering patterns, relationships, misconfigurations, and anomalies in a network. Within this field, traffic classification is a sub-domain whose purpose is to assign network traffic to predefined classes, such as normal or abnormal traffic, or application type. Most Internet applications encrypt data in transit, and classifying encrypted traffic is not possible with traditional payload-based methods; statistical and intelligence methods, however, can find and model traffic characteristics without inspecting payloads.
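As an illustration of the statistical approach, the sketch below trains a random forest on payload-agnostic per-flow features; the feature set, labels, and data are synthetic placeholders rather than the study's.

```python
# Hypothetical sketch: classifying encrypted flows from payload-agnostic
# statistical features (packet sizes, inter-arrival times) with a random
# forest. Features and labels are synthetic placeholders, so the reported
# accuracy here is meaningless except as a demonstration of the pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Per-flow features: mean/std packet size, mean/std inter-arrival time,
# flow duration, packet count -- none of which require payload inspection.
X = rng.random((1000, 6))
y = rng.integers(0, 2, 1000)  # 0 = normal traffic, 1 = abnormal traffic

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy on held-out flows:", clf.score(X_test, y_test))
```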