Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, an SVM selects an optimal hyperplane that separates the two classes. SVMs achieve very good accuracy and are extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, large datasets can make training time-consuming and the results inefficient. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), is evaluated on two simulated datasets. Since the classification of different cancer types is important for cancer diagnosis and drug discovery, SGD-SVM is applied to classify the most common leukemia cancer type dataset. The results obtained with SGD-SVM are more accurate than those of many studies that used the same leukemia datasets.
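The core idea of SGD-SVM can be illustrated with a linear SVM whose hinge-loss objective is optimized by stochastic gradient descent. The following is a minimal sketch using scikit-learn's `SGDClassifier` as a stand-in; the synthetic dataset is illustrative, not the paper's simulation or leukemia data.

```python
# Minimal sketch of an SVM trained by stochastic gradient descent:
# loss="hinge" gives the linear-SVM objective, optimized one sample at a time.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic two-class data (not the paper's datasets).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SGDClassifier(loss="hinge", alpha=1e-4, max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.3f}")
```

Because each update touches only one sample, this scales to datasets where fitting a standard kernel SVM would be impractical.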
The bootstrap is an important resampling technique that has recently received the attention of researchers. The presence of outliers in the original dataset may cause serious problems for the classical bootstrap when the percentage of outliers in a resample is higher than in the original data. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper examines the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The accuracy criterion is based on the RMSE value, since the method that provides the smaller RMSE value is con…
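For reference, the classical bootstrap that DRBLTS and WBP robustify can be sketched as follows; the sample and the statistic (the mean) are illustrative assumptions, not the paper's setup.

```python
# Classical bootstrap sketch: resample with replacement, recompute the
# statistic, and estimate its bias and standard error from the resamples.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=50)  # toy sample

B = 2000  # number of bootstrap resamples
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(B)
])

bias = boot_means.mean() - data.mean()       # bootstrap bias estimate
se = boot_means.std(ddof=1)                  # bootstrap standard error
print(f"bootstrap bias: {bias:.4f}, SE: {se:.4f}")
```

A single gross outlier in `data` can appear several times in a resample, which is exactly the failure mode the robust variants above are designed to handle.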
Objective: To evaluate knowledge of smoking and its relationship with lung cancer among members of Baghdad Nursing College.
Methodology: The study comprised 100 affiliates of the College of Nursing / University of Baghdad, including students, teaching staff, and employees. All data were collected through a structured questionnaire prepared by the National Cancer Research Center, which was answered during a scientific symposium on lung cancer awareness organized by the center in March 2016. The data were analyzed using SPSS, version 22.
Results: The age of the respondents ranged from 19 to 64 years; 76% were females and only 4% were smokers. The results showed that the mean score for the level of knowledge…
Companies today compete intensely with each other, so they need to focus on innovation to develop their products and make them competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is a proven lean product development mechanism that builds on the creation of a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker choices are removed gradually until the optimum solution is finally reached. SBCE has been implemented extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use of…
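The set-narrowing process described above can be sketched as a simple elimination loop; the design names, scores, and keep-the-best-half pruning rule are toy assumptions for illustration, not the paper's method.

```python
# Toy sketch of set-based narrowing: score alternative subsystem designs,
# then prune the weaker half each evaluation round until one design remains.
import random

random.seed(0)
# Hypothetical alternative designs with illustrative fitness scores.
designs = {f"design_{i}": random.uniform(0.0, 1.0) for i in range(8)}

while len(designs) > 1:
    keep = max(1, len(designs) // 2)  # keep at least one design per round
    ranked = sorted(designs.items(), key=lambda kv: kv[1], reverse=True)
    designs = dict(ranked[:keep])

winner = next(iter(designs))
print("selected:", winner)
```

In real SBCE the "scores" come from parallel testing of each alternative, and pruning is deferred until evidence clearly rules a design out.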
This paper deals with constructing a fuzzy linear programming model with an application to the fuel products of the Dura refinery, which consist of seven products that have a direct effect on daily consumption. The model consists of an objective function representing the selling prices of the products, fuzzy production constraints, and fuzzy demand constraints, in addition to production-requirement constraints. After building the model, the WinQSB program was used to find the optimal solution.
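The structure of such a model can be illustrated with a crisp (defuzzified) product-mix LP; the prices, capacity, and demand bounds below are hypothetical, not the refinery's data, and in the fuzzy model the bounds would be fuzzy numbers rather than fixed values.

```python
# Illustrative crisp version of a refinery product-mix LP:
# maximize selling revenue subject to a capacity limit and demand bounds.
from scipy.optimize import linprog

prices = [0.9, 1.2, 1.1]       # hypothetical selling prices per unit
A_ub = [[1, 1, 1]]             # total production capacity constraint
b_ub = [100]                   # capacity: at most 100 units in total
bounds = [(10, 60), (5, 50), (0, 40)]  # hypothetical demand bounds per product

# linprog minimizes, so negate the prices to maximize revenue.
res = linprog(c=[-p for p in prices], A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal plan:", res.x, "revenue:", -res.fun)
```

With these toy numbers the solver fills the two highest-priced products to their demand limits and uses the remaining capacity on the cheapest one.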
This research applies a fuzzy goal programming model for aggregate production planning in the General Company for Hydraulic Industries / plastic factory to obtain an optimal production plan that copes with fluctuations in demand and employs all available resources, using two strategies: the inventory strategy and the strategy of changing the level of the workforce. The costs of these strategies are usually imprecise (fuzzy). The plant administration tries to minimize total production costs, carrying costs, and changes in labour levels. Depending on the data gained from the…
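The goal-programming idea of minimizing deviations from targets can be sketched with a single cost goal; the unit cost, goal value, and demand figure are hypothetical, not the plant's data, and the fuzzy version would attach membership functions to the goals.

```python
# Tiny goal-programming sketch: minimize deviation from a production-cost goal.
# Decision vector: [x, d_minus, d_plus] where x = units produced and
# d_minus/d_plus are under-/over-achievement of the cost goal.
from scipy.optimize import linprog

cost_per_unit = 4.0
cost_goal = 400.0    # hypothetical target total production cost
demand = 90.0        # hypothetical minimum units to produce

# Goal as an equality: cost_per_unit*x + d_minus - d_plus = cost_goal
A_eq = [[cost_per_unit, 1.0, -1.0]]
b_eq = [cost_goal]
# Demand: x >= demand, written as -x <= -demand for linprog.
A_ub = [[-1.0, 0.0, 0.0]]
b_ub = [-demand]

# Objective: minimize the total deviation d_minus + d_plus.
res = linprog(c=[0.0, 1.0, 1.0], A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
print("production:", res.x[0], "total deviation:", res.x[1] + res.x[2])
```

Here producing 100 units meets the cost goal exactly, so both deviation variables are driven to zero; with several conflicting goals, the objective weights the deviations by priority.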
This study proposes a mathematical approach and a numerical experiment for a simple solution of blood flow in the heart's blood vessels. A mathematical model of human blood flow through arterial branches was studied and solved using the Navier-Stokes partial differential equations with a finite element analysis (FEA) approach. Furthermore, FEA is applied to the steady flow of two-dimensional viscous liquids through different geometries. The validity of the computational method is determined by comparing the numerical experiments with analytical results for different functions. Numerical analysis showed that the highest blood flow velocity of 1.22 cm/s occurred at the center of the vessel, where the flow tends to be laminar and is influenced…
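For reference, the governing equations for the steady, incompressible, viscous flow described above are the Navier-Stokes momentum equation and the continuity equation; this is their standard form, not a reproduction of the paper's formulation.

```latex
\rho\,(\mathbf{u}\cdot\nabla)\mathbf{u} = -\nabla p + \mu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0,
```

where $\mathbf{u}$ is the velocity field, $p$ the pressure, $\rho$ the blood density, and $\mu$ the dynamic viscosity. The parabolic velocity profile implied by the reported peak velocity at the vessel center is the laminar (Poiseuille-like) solution of these equations in a straight segment.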
Deep learning algorithms have recently achieved a great deal of success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple image types (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning. Pre-trained architectures trained on the well-known ImageNet database were used. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: the SAR image class (houses) and the non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv…
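The described pipeline, VGG16 as a frozen feature extractor with a new five-class head, can be sketched in Keras as follows. To keep the sketch self-contained, `weights=None` is used instead of downloading the ImageNet weights the paper relies on, and the dummy input only demonstrates the tensor shapes.

```python
# Sketch of VGG16 transfer learning: freeze the convolutional base and
# train only a small new classifier head for the 5 classes.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# weights=None avoids the ImageNet download; the paper uses pretrained weights.
base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the feature extractor

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(5, activation="softmax"),  # houses, cats, dogs, horses, humans
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# One dummy image to show the shapes flowing through the network.
probs = model.predict(np.zeros((1, 224, 224, 3), dtype="float32"))
print(probs.shape)
```

Fine-tuning, as mentioned in the abstract, would then unfreeze the top convolutional blocks and continue training with a small learning rate.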