Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used because it selects an optimal hyperplane that separates two classes. SVM achieves very good accuracy and is extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM is modified using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked on two simulated datasets. Since the classification of different cancer types is important for cancer diagnosis and drug discovery, SGD-SVM was applied to classify the most common leukemia cancer type dataset. The results obtained with SGD-SVM are more accurate than the results of many studies that used the same leukemia datasets.
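As a minimal sketch of the core idea (not the authors' exact implementation), the Python snippet below trains a linear SVM on the hinge loss with stochastic gradient descent; the Pegasos-style decaying learning rate and the regularization strength `lam` are illustrative assumptions.

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Linear SVM trained with stochastic gradient descent on the hinge loss.

    X: (n, d) feature matrix, y: labels in {-1, +1}.
    lam and the 1/(lam*t) step size are illustrative (Pegasos-style)
    choices, not settings taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decaying learning rate
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                     # point violates the margin
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                              # only shrink the weights
                w = (1 - eta * lam) * w
    return w, b

# usage: predictions are np.sign(X @ w + b)
```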
Background: Type 2 diabetes mellitus is a condition characterized by elevated oxidative stress, which has been implicated in diabetic progression and its vascular complications. Aim: To assess the impact of gliclazide modified release (MR) versus glimepiride on oxidative stress markers, glycemic indices, lipid profile, and estimated glomerular filtration rate in uncontrolled type 2 diabetic patients on metformin monotherapy. Methods: This was an observational comparative study conducted in the Thi-Qar specialized diabetic, endocrine, and metabolism center. Sixty-six patients were randomized into two groups based on the added sulfonylurea (SU). Group 1 (33 patients) was on gliclazide MR, whereas Group 2 (33 patients)
This study concerns the removal of a trihydrate antibiotic (amoxicillin) from synthetically contaminated water by adsorption on modified bentonite. The bentonite was modified using hexadecyl trimethyl ammonium bromide (HTAB), which turned it from a hydrophilic into a hydrophobic material. The effects of different parameters were studied in batch experiments: contact time, solution pH, agitation speed, initial contaminant concentration (C0), and adsorbent dosage. Maximum removal of amoxicillin (93%) was achieved at a contact time of 240 min, pH of 10, agitation speed of 200 rpm, initial concentration of 30 ppm, and adsorbent dosage of 3 g of bentonite per 1 L of pollutant solution. The characterization of the adsorbent, modi
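The abstract does not spell out how removal is quantified; the standard batch-adsorption definitions below are assumed, where C0 and Ce are the initial and equilibrium concentrations (mg/L), V the solution volume (L), and m the adsorbent mass (g).

```latex
R(\%) = \frac{C_0 - C_e}{C_0}\times 100, \qquad
q_e = \frac{(C_0 - C_e)\,V}{m}
```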
The problem of multicollinearity is one of the most common problems; it concerns, to a large extent, the internal correlation between explanatory variables and appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear
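For reference, the two shrinkage estimators named here are commonly written as follows for a generalized linear model fit, with \(\hat{W}\) the weight matrix from the iteratively reweighted fit and \(\hat{\beta}_{ML}\) the maximum-likelihood estimate; these are the standard textbook forms, not necessarily the exact variants implemented in the study.

```latex
\hat{\beta}_{RR}(k)  = \bigl(X^{\top}\hat{W}X + kI\bigr)^{-1} X^{\top}\hat{W}X\,\hat{\beta}_{ML}, \qquad k > 0
\hat{\beta}_{Liu}(d) = \bigl(X^{\top}\hat{W}X + I\bigr)^{-1}\bigl(X^{\top}\hat{W}X + dI\bigr)\hat{\beta}_{ML}, \qquad 0 < d < 1
```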
Objectives: This study aims to assess and compare the micro-shear bond strength (μSBS) of a novel resin-modified glass-ionomer luting cement functionalized with a methacrylate co-monomer containing a phosphoric acid group, 30 wt% 2-(methacryloxy)ethyl phosphate (2-MEP), to different substrates (dentin, enamel, zirconia, and base metal alloy). The assessment is made in comparison with a conventional resin-modified glass-ionomer cement and a self-adhesive resin cement. Materials and methods: In this in vitro study, ninety-six specimens were prepared and categorized into four groups: enamel (A), dentin (B), zirconia (C), and base metal alloy (D). Enamel (E) and dentin (D) specimens were obtained from 30 human maxillary first premolars e
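As a point of reference (the abstract itself does not state the calculation), micro-shear bond strength is conventionally obtained by dividing the load at failure by the bonded cross-sectional area, with r the radius of the bonded cylinder:

```latex
\mu\text{SBS (MPa)} = \frac{F_{\text{failure}}\ (\text{N})}{A\ (\text{mm}^{2})}, \qquad A = \pi r^{2}
```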
Consistent with tax theory, the unified income tax is a natural evolution: its application brings all branches and sources of income under the tax and, by deducting part of that income through progressive rates, it constitutes a tax reform measure. Taxes on total income are characterized by giving a clear picture of the taxpayer's total income, financial situation, and family burden, which allows granting exemptions and deductions and applying rates that fit each case. This requires reconsidering the structure of the tax system in force and the transition from a system of specific taxes to a tax system on total income that integrates income from the rental of re
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits from DES with 64 bits from AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process; moreover, the operation is shifted by only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with
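The root-key construction described in the abstract (64 DES bits concatenated with 64 AES bits, then a one-bit right shift to derive the remaining 15 keys) might look roughly like the Python sketch below; the bit layout and key count are read directly from the abstract, while everything else is an assumption for illustration, not the authors' published W-method code.

```python
def derive_round_keys(des_bits: int, aes_bits: int, n_keys: int = 15):
    """Illustrative sketch of the key schedule described in the abstract.

    des_bits, aes_bits: 64-bit integers of DES and AES key material.
    They are concatenated into a 128-bit root key, and each remaining
    key is obtained by rotating the previous key one bit to the right.
    This is an assumption-based reading of the abstract.
    """
    mask = (1 << 128) - 1
    root = ((des_bits & ((1 << 64) - 1)) << 64) | (aes_bits & ((1 << 64) - 1))
    keys = [root]
    k = root
    for _ in range(n_keys):
        k = ((k >> 1) | ((k & 1) << 127)) & mask   # rotate right by one bit
        keys.append(k)
    return keys   # root key followed by 15 derived keys

# usage: keys = derive_round_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
```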
Lung cancer is among the most common and dangerous diseases; if treated late, it can lead to death. It is more likely to be treated successfully if discovered at an early stage, before it worsens. Distinguishing the size, shape, and location of lymphatic nodes can identify the spread of the disease around these nodes; thus, identifying lung cancer at an early stage is remarkably helpful for doctors. Lung cancer can be diagnosed successfully by expert doctors; however, limited experience may lead to misdiagnosis and cause medical issues in patients. In the line of computer-assisted systems, many methods and strategies can be used to predict the cancer malignancy level, which plays a significant role in providing precise abnormality detection
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) for the experiments. The mean square error (MSE) was adopted to compare the estimation methods and to choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best for representing the maternal mortality data, after relying on the value of the param
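For reference, a two-level hierarchical Poisson regression of the kind compared here, together with the mean-square-error criterion over r replications, can be written as below; the exact covariate and prior structure of the study are not given in the abstract, so this is only the generic form.

```latex
y_{ij}\mid \lambda_{ij} \sim \mathrm{Poisson}(\lambda_{ij}), \qquad
\log \lambda_{ij} = x_{ij}^{\top}\beta + u_j, \qquad u_j \sim N(0,\sigma_u^{2})
\mathrm{MSE}(\hat{\beta}) = \frac{1}{r}\sum_{l=1}^{r}\bigl(\hat{\beta}^{(l)}-\beta\bigr)^{2}
```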