Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, an SVM is widely used to select an optimal hyperplane that separates two classes. SVMs achieve very good accuracy and are extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different cancer types is important for cancer diagnosis and drug discovery, SGD-SVM was applied to classify a dataset of the most common leukemia cancer type. The results obtained using SGD-SVM are more accurate than those of many studies that used the same leukemia datasets.
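The core idea can be sketched in a few lines: a linear SVM trained by stochastic sub-gradient descent on the regularized hinge loss. This is a minimal illustration, not the paper's exact algorithm; the Pegasos-style step size eta_t = 1/(lambda*t) and the toy two-blob data are assumptions.

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Linear SVM via stochastic (sub)gradient descent on the regularized
    hinge loss; Pegasos-style step size eta_t = 1/(lam*t) is assumed."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                       # point violates the margin
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                                # only the regularizer acts
                w = (1 - eta * lam) * w
    return w, b

# Invented toy data: two well-separated Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w, b = sgd_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

Each update touches a single sample, so the cost per step is independent of the dataset size, which is the motivation for using SGD on large datasets.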
This research gives an introduction to the multiple intelligences (MI)
theory and its application in the classroom. It presents a unit plan based upon the
MI theory, followed by a report which explains the application of the plan by the
researcher to the first-year class of the computer department in the College of Sciences /
University of Al-Mustansiryia, and the teacher's and the students' reactions to it.
The research starts with a short introduction to the MI theory, which could help
students learn better in a relaxed learning situation. It was first
presented by Howard Gardner when he published his book "Frames of
Mind" in 1983, in which he describes how the brain has multiple intelligences.
The Stochastic Network Calculus Methodology. Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng. Chapter in Computer and Information Science, 2009; part of the Studies in Computational Intelligence book series (SCI, volume 208). Abstract: The stochastic network calculus is an evolving new methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad
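For intuition only, the deterministic analogue of such service-curve bounds is easy to compute: for a token-bucket arrival curve alpha(t) = b + rt and a latency-rate service curve beta(t) = R(t - T)+, the classical bounds are delay T + b/R and backlog b + rT. The sketch below assumes this deterministic setting, not the chapter's probabilistic bounds; the numeric values are invented.

```python
def delay_backlog_bounds(b, r, R, T):
    """Deterministic network-calculus bounds for a token-bucket arrival
    curve alpha(t) = b + r*t served under a latency-rate service curve
    beta(t) = R * max(t - T, 0), assuming stability (r <= R)."""
    assert r <= R, "system must be stable: arrival rate <= service rate"
    delay = T + b / R      # max horizontal deviation between the curves
    backlog = b + r * T    # max vertical deviation between the curves
    return delay, backlog

# A flow with a 5 Mb burst at 10 Mb/s through a server guaranteeing
# 50 Mb/s after 2 ms of latency:
d, q = delay_backlog_bounds(b=5.0, r=10.0, R=50.0, T=0.002)
# d = 0.102 s, q = 5.02 Mb
```

The stochastic calculus in the chapter replaces these worst-case curves with probabilistic envelopes, trading the hard bound for a bound that holds with high probability.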
This research includes the application of non-parametric methods to estimating the conditional survival function, represented by the Turnbull method and the generalized Turnbull method, using interval-censored data on breast cancer with two types of treatment (chemotherapy and radiotherapy) and age as a continuous variable. The estimation algorithms were implemented in MATLAB, and the mean square error (MSE) was then used as a criterion to compare the estimates. The results favored the generalized Turnbull method in estimating the conditional survival function for both treatments; the estimated survival of the patients does not show very large differences
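A minimal sketch of the self-consistency (EM) idea behind Turnbull's estimator for interval-censored data. This is a simplification, not the paper's MATLAB implementation: probability mass is placed on the observed interval endpoints rather than on the exact Turnbull intervals, and the patient intervals below are invented for illustration.

```python
import numpy as np

def turnbull_em(L, R, tol=1e-8, max_iter=1000):
    """Self-consistency (EM) estimate of the event-time distribution from
    interval-censored observations (L_i, R_i]. Simplified sketch: support
    is the set of observed endpoints, not the exact Turnbull intervals."""
    L, R = np.asarray(L, float), np.asarray(R, float)
    s = np.unique(np.concatenate([L, R]))        # candidate support points
    A = (s >= L[:, None]) & (s <= R[:, None])    # A[i, j]: s_j inside interval i
    p = np.full(s.size, 1.0 / s.size)            # uniform starting masses
    for _ in range(max_iter):
        num = A * p                              # unnormalized expected masses
        cond = num / num.sum(axis=1, keepdims=True)
        p_new = cond.mean(axis=0)                # E-step + M-step combined
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    surv = 1.0 - np.cumsum(p)                    # S(s_j) = P(T > s_j)
    return s, p, surv

# Five hypothetical patients with interval-censored event times (months)
s, p, surv = turnbull_em(L=[0, 2, 4, 1, 3], R=[3, 6, 8, 5, 7])
```

Each iteration redistributes every observation's unit mass over the support points inside its interval, then averages; the fixed point is the nonparametric MLE on this support.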
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method has been used to estimate the coefficient functions by using the former technique. Since the two-
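The local linear kernel smoother at the heart of such an approach can be sketched briefly: at each target point a weighted straight line is fitted, and its intercept is the fitted value. This is a generic one-dimensional illustration with a Gaussian kernel and invented data, not the paper's two-step longitudinal estimator; the bandwidth h = 0.4 is an arbitrary choice.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear kernel estimate of m(x0) = E[y | x = x0]: fit a
    weighted straight line centered at x0 (Gaussian kernel, bandwidth h)
    and return its intercept."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                # intercept = fit at x0

# Noisy sine curve as stand-in repeated measurements
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.1, x.size)
fit = np.array([local_linear(t, x, y, h=0.4) for t in x])
err = np.max(np.abs(fit - np.sin(x)))
```

Unlike a plain kernel average, the local linear fit corrects the well-known boundary bias, which matters when coefficient functions are evaluated near the first and last time points.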
Entropy, defined as a measure of uncertainty, has been transformed by using the cumulative distribution function and the reliability function of the Burr Type-XII distribution, in order to build a probability model for data that suffer from volatility, based on every failure in a sample, after verifying the conditions of a probability distribution. A formula for the probability distribution of the new entropy transformation applied to the continuous Burr Type-XII distribution has been derived; the new function was tested and found to satisfy the conditions of a probability density function, and its mean and cumulative distribution function were derived in order to be used in generating data for the purpose of implementing the simulation
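As a hedged numerical illustration of entropy for this distribution: the Burr Type-XII density is f(x) = c k x^(c-1) (1 + x^c)^-(k+1), and its differential entropy is H = -∫ f(x) ln f(x) dx. The sketch below computes H numerically (it is not the paper's transformed entropy); the shape values c = 2, k = 3 are arbitrary.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

c, k = 2.0, 3.0  # arbitrary Burr Type-XII shape parameters

def burr12_pdf(x):
    """Burr XII density: f(x) = c*k*x^(c-1) * (1 + x^c)^(-(k+1)), x > 0."""
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-(k + 1))

def neg_f_log_f(x):
    """Integrand -f ln f, with the convention 0*ln(0) = 0."""
    fx = burr12_pdf(x)
    return -fx * np.log(fx) if fx > 0 else 0.0

# Sanity check: the density integrates to one over (0, inf)
total, _ = quad(burr12_pdf, 0, np.inf)

# Differential (Shannon) entropy by numerical integration
H, _ = quad(neg_f_log_f, 0, np.inf)

# Cross-check against SciPy's built-in Burr XII entropy
H_scipy = float(stats.burr12(c, k).entropy())
```

Agreement between the direct integral and SciPy's value is a quick way to validate any derived entropy formula before using it to generate simulation data.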
Acute lymphoblastic leukemia (ALL) is one of the most common diseases, so in this study the serum level of malondialdehyde and its relationship with metanephrine were investigated in acute lymphoblastic leukemia patients over one month of treatment. Some biochemical parameters (serum glucose, total serum protein, malondialdehyde, vitamin C, and metanephrine), as well as white blood cell count and blood hemoglobin levels, were analyzed in sixty patients diagnosed with acute lymphoblastic leukemia over one month of treatment and compared to a healthy control group. Statistically significant increases in white blood cell (WBC) count (p < 0.01) and in mean concentrations of malondialdehyde (MDA) (p < 0.05) and metanephrine (p < 0.001) were observed in
Improving the "Jackknife Instrumental Variable Estimation" method using a class of immune algorithms, with a practical application
Multicollinearity is a problem that occurs whenever two or more predictor variables are correlated with each other. It constitutes a breach of one of the basic assumptions of the ordinary least squares method and results in biased estimates. Several methods have been proposed to handle this problem. In this research, comparisons are made between a biased method and an unbiased method, together with a Bayesian method using the Gamma distribution, in addition to the ordinary least squares method
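One standard biased remedy compared in such studies is ridge regression. A minimal sketch with invented, nearly collinear predictors shows why OLS becomes unstable under multicollinearity while the ridge solution stays well-behaved; the penalty lam = 1.0 is an arbitrary choice, not a value from the research.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly collinear with x1
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLS: solve (X'X) beta = X'y -- unstable because X'X is near-singular
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: solve (X'X + lam*I) beta = X'y -- biased but far more stable
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

cond = np.linalg.cond(X.T @ X)               # huge condition number
err_ols = np.linalg.norm(beta_ols - beta_true)
err_ridge = np.linalg.norm(beta_ridge - beta_true)
```

The ridge penalty lifts the near-zero eigenvalue of X'X, which is exactly the direction along which the OLS coefficients blow up; this stability is bought at the price of a small, controlled bias.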
In recent decades, genetic methods have developed into a potent tool in a number of life-science applications. In research looking at demographic genetic diversity, QTL detection, marker-assisted selection, and food traceability, DNA-based technologies such as PCR are being employed more and more. These approaches call for extraction procedures that provide efficient nucleic acid recovery and the elimination of PCR inhibitors. The first and most important stage in molecular biology is the extraction of DNA from cells. For a molecular scientist, the high quality and integrity of the isolated DNA, as well as the extraction method's ease of use and affordability, are crucial factors. The present study was designed to establish a simple, fast