Using the Jackknife to Estimate a Logistic Regression Model for Breast Cancer Disease

Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as (affected and unaffected, married and unmarried). A large number of explanatory variables can give rise to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were therefore used to estimate a binary logistic regression model, adopting the Jackknife method, and the resulting estimators were compared according to the Akaike information criterion (AIC).

The Jackknife method and the aforementioned estimators were applied to study the relationship between the response variable (presence or absence of breast cancer), for a sample of 100 observations from the year 2020, and the explanatory variables (the percentage of haemoglobin in red blood cells, red blood cells, white blood cells, platelets, the percentage of haemoglobin in the blood, the percentage of lymphocytes, the percentage of monocytes, the percentage of eosinophils, and the percentage of basophils). The comparison showed that the ridge regression method is the best for estimating the parameters of the binary logistic regression model when multicollinearity is present.
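
As a rough illustration of the procedure the abstract describes, the sketch below fits a binary logistic model by (approximate) maximum likelihood and by ridge penalisation, computes leave-one-out Jackknife coefficient estimates, and compares the fits by AIC. The simulated data, penalty strength C, and AIC formula are placeholders, not the study's own values or implementation.

```python
# Sketch: Jackknife (leave-one-out) coefficients for a binary logistic model
# fitted by approximate maximum likelihood and by ridge (L2) penalisation,
# compared with AIC.  Data and settings are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 100, 9                       # 100 cases, 9 blood-test covariates (as in the study design)
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)   # simulated disease indicator

def fit(X, y, C):
    """Ridge (L2) logistic fit; a very large C approximates the unpenalised MLE."""
    return LogisticRegression(C=C, max_iter=5000).fit(X, y)

def aic(model, X, y):
    """AIC = 2k - 2 log-likelihood for a fitted binary logistic model."""
    p_hat = model.predict_proba(X)[:, 1]
    loglik = np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))
    return 2 * (X.shape[1] + 1) - 2 * loglik

def jackknife_coefs(X, y, C):
    """Leave-one-out refits; returns the Jackknife mean of the coefficient vector."""
    coefs = [fit(np.delete(X, i, axis=0), np.delete(y, i), C).coef_.ravel()
             for i in range(len(y))]
    return np.mean(coefs, axis=0)

for label, C in [("maximum likelihood (approx.)", 1e6), ("ridge", 1.0)]:
    model = fit(X, y, C)
    print(label, "| AIC:", round(aic(model, X, y), 2),
          "| Jackknife coefficients:", np.round(jackknife_coefs(X, y, C), 3))
```
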

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Detecting Outliers In Multiple Linear Regression

It is well known that the existence of outliers in the data adversely affects the efficiency of estimation and the results of the analysis. In this paper, four methods for detecting outliers in the multiple linear regression model are studied in two cases: first, on real data; and second, after adding outliers to the data and attempting to detect them. The study is conducted for samples of different sizes and uses three measures for comparing these methods: masking, swamping, and the standard error of the estimate.
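
The paper's four detection methods are not listed in this excerpt; as a hedged illustration of the general task, the sketch below flags outliers in a multiple linear regression using standard diagnostics (externally studentized residuals and Cook's distance) on simulated data with one planted outlier.

```python
# Sketch: common outlier diagnostics for multiple linear regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=60)
y[5] += 8.0                                  # plant one artificial outlier

results = sm.OLS(y, sm.add_constant(X)).fit()
infl = results.get_influence()

student = infl.resid_studentized_external    # externally studentized residuals
leverage = infl.hat_matrix_diag              # hat-matrix diagonal (leverage)
cooks_d, _ = infl.cooks_distance             # Cook's distance

flagged = np.where((np.abs(student) > 3) | (cooks_d > 4 / len(y)))[0]
print("flagged observations:", flagged)      # should include index 5
```
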

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Hybrid Framework To Exclude Similar and Faulty Test Cases In Regression Testing

Regression testing is a crucial phase in the software development lifecycle that ensures new changes or updates to a software system do not introduce defects or adversely affect existing functionality. However, as software systems grow in complexity, the number of test cases in the regression suite can become large, which results in more testing time and resource consumption. In addition, the presence of redundant and faulty test cases may reduce the efficiency of the regression testing process. Therefore, this paper presents a new Hybrid Framework to Exclude Similar & Faulty Test Cases in Regression Testing (ETCPM) that utilizes automated code analysis techniques and historical test execution data to …
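
The ETCPM framework itself is only partially described in this truncated abstract; the toy sketch below shows one plausible reading of the idea, excluding test cases whose coverage is nearly identical to an already selected test and test cases whose execution history is flaky. The thresholds, data layout, and field names are assumptions, not ETCPM's.

```python
# Toy illustration: drop near-duplicate-coverage tests (Jaccard similarity)
# and historically flaky tests.  All thresholds and fields are assumptions.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def select_tests(tests, similarity_threshold=0.7, max_flaky_rate=0.3):
    """tests: list of dicts with 'name', 'coverage' (covered lines) and 'history' (pass/fail)."""
    selected = []
    for t in sorted(tests, key=lambda t: -len(t["coverage"])):
        flaky_rate = t["history"].count("fail") / max(len(t["history"]), 1)
        if flaky_rate > max_flaky_rate:
            continue                               # exclude historically faulty test
        if any(jaccard(t["coverage"], s["coverage"]) >= similarity_threshold
               for s in selected):
            continue                               # exclude near-duplicate coverage
        selected.append(t)
    return [t["name"] for t in selected]

tests = [
    {"name": "t1", "coverage": {1, 2, 3, 4}, "history": ["pass"] * 5},
    {"name": "t2", "coverage": {1, 2, 3},    "history": ["pass"] * 5},              # similar to t1
    {"name": "t3", "coverage": {9, 10},      "history": ["fail", "pass", "fail"]},  # flaky
]
print(select_tests(tests))   # -> ['t1']
```
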
Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Using projection pursuit regression and neural networks to overcome the curse of dimensionality

Abstract

This research aims to overcome the problem of dimensionality by using non-linear regression methods that reduce the root mean square error (RMSE). The method used is projection pursuit regression (PPR), one of the dimension-reduction methods designed to overcome the curse of dimensionality. PPR is a statistical technique that finds the most important projections in multi-dimensional data; with each projection found, the data are reduced by linear combinations along that projection, and the process is repeated until the best projections are obtained. The main idea of PPR is to model …
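
As a hedged illustration of the PPR idea, the sketch below fits a single projection direction w and a cubic ridge function g so that y ≈ g(w·x), reducing a 10-dimensional regression to a one-dimensional fit and reporting the RMSE. Real PPR adds further ridge terms to the residuals; the data and smoother here are simplifications.

```python
# Sketch: one-term projection pursuit regression with a cubic smoother.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.normal(size=(n, p))
true_w = np.array([1.0, -1.0] + [0.0] * (p - 2))
y = np.sin(X @ true_w) + rng.normal(scale=0.1, size=n)

def ridge_rmse(w):
    """RMSE of a cubic ridge function fitted on the 1-D projection X @ w."""
    w = w / np.linalg.norm(w)
    z = X @ w
    coeffs = np.polyfit(z, y, deg=3)
    return np.sqrt(np.mean((y - np.polyval(coeffs, z)) ** 2))

result = minimize(ridge_rmse, rng.normal(size=p), method="Nelder-Mead",
                  options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
w_hat = result.x / np.linalg.norm(result.x)
print("RMSE of one-term PPR fit:", round(result.fun, 3))
print("estimated direction:", np.round(w_hat, 2))
```
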
Publication Date
Fri Dec 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Comparing the Sequential Nonlinear Least Squares Method and the Sequential Robust M Method to Estimate the Parameters of a Two-Dimensional Sinusoidal Signal Model

Estimation of the unknown parameters in a 2-D sinusoidal signal model is an important and difficult problem. Because it is difficult to estimate all the parameters of this type of model at the same time, we propose a sequential non-linear least squares method and a sequential robust M method, developed by applying the sequential approach of the estimator suggested by Prasad et al. to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components. The methods rely on the downhill simplex algorithm to solve the non-linear equations and obtain the non-linear parameter estimates, which represent the frequencies, and then use the least squares formula to estimate …
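
The sketch below illustrates the sequential idea for a single 2-D sinusoidal component: the two frequencies (the non-linear parameters) are estimated with the downhill simplex (Nelder-Mead) algorithm, and the amplitudes then follow from ordinary linear least squares. The grid size, noise level, and starting values are illustrative, and the robust M step is omitted.

```python
# Sketch: sequential estimation for one 2-D sinusoidal component.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
M, N = 30, 30
m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
lam_true, mu_true, A_true, B_true = 1.1, 0.7, 2.0, -1.0
Y = (A_true * np.cos(lam_true * m + mu_true * n)
     + B_true * np.sin(lam_true * m + mu_true * n)
     + rng.normal(scale=0.3, size=(M, N)))

def amplitudes(freqs):
    """Given (lambda, mu), the amplitudes are a linear least-squares problem."""
    lam, mu = freqs
    design = np.column_stack([np.cos(lam * m + mu * n).ravel(),
                              np.sin(lam * m + mu * n).ravel()])
    coef, *_ = np.linalg.lstsq(design, Y.ravel(), rcond=None)
    return coef, design

def rss(freqs):
    coef, design = amplitudes(freqs)
    return np.sum((Y.ravel() - design @ coef) ** 2)

start = np.array([1.0, 0.8])                         # must be near the true frequencies
res = minimize(rss, start, method="Nelder-Mead")     # downhill simplex step
(A_hat, B_hat), _ = amplitudes(res.x)
print("frequencies:", np.round(res.x, 3), "amplitudes:", round(A_hat, 2), round(B_hat, 2))
```
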
Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Adoption of a Multi-Model Fuzzy Assignment to Optimize the Use of the Internet Line in the Ministry of Science and Technology

In this research we present a multi-assignment model with a fuzzy objective function. An integer programming model was built after removing the fuzziness from the objective-function data and converting it to crisp data using the Pascal triangular graded mean, which defuzzifies each triangular fuzzy number to its centre.

The data were processed to remove the fuzziness using Excel 2007, and the multi-assignment model was then solved with the LINDO program to reach the optimal solution, which represents the least possible time for a given number of employees to accomplish a number of tasks over a specified amount of Internet capacity. The research also covers some of the …
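
As a hedged sketch of the two steps described, the code below defuzzifies triangular fuzzy completion times with the Pascal triangular graded mean (weights 1, 2, 1) and then solves the resulting crisp assignment problem; the fuzzy times are invented, and SciPy's assignment solver stands in for the LINDO run used in the paper.

```python
# Sketch: defuzzify triangular fuzzy times, then solve a crisp assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment

def pascal_graded_mean(tri):
    """Triangular fuzzy number (a, b, c) -> crisp value (a + 2b + c) / 4 (Pascal weights 1, 2, 1)."""
    a, b, c = tri
    return (a + 2 * b + c) / 4

# fuzzy_times[i][j]: fuzzy time for employee i to finish task j (invented values)
fuzzy_times = [
    [(2, 3, 5), (4, 6, 7), (1, 2, 4)],
    [(3, 5, 6), (2, 2, 3), (5, 7, 9)],
    [(1, 4, 5), (3, 4, 6), (2, 3, 3)],
]
crisp = np.array([[pascal_graded_mean(t) for t in row] for row in fuzzy_times])

rows, cols = linear_sum_assignment(crisp)   # minimise total crisp completion time
print("assignment:", list(zip(rows, cols)), "total time:", crisp[rows, cols].sum())
```
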
Publication Date
Wed Oct 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Spatial Regression Model Estimation for the Poverty Rates in the Districts of Iraq in 2012

The research applied the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on regression models that incorporate spatial dependence, whose presence can be tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, as these models link the usual regression models with time-series models. Spatial analysis had …
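
The sketch below computes Moran's I, the statistic used to test for spatial dependence before fitting SAR or SEM models; the toy weight matrix and poverty rates are invented.

```python
# Sketch: Moran's I for a small set of districts with a binary adjacency matrix.
import numpy as np

def morans_i(y, W):
    """Moran's I for values y and a spatial weight matrix W."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()
    n, s0 = len(y), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# 4 districts on a line: each district neighbours the adjacent ones
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
poverty_rate = [0.35, 0.30, 0.12, 0.10]     # clustered high/low values
print("Moran's I:", round(morans_i(poverty_rate, W), 3))   # positive -> spatial clustering
```
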
Publication Date
Thu Dec 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
Usage of non-linear programming in building a mathematical model for production planning according to discount constraints on the purchased amount

Abstract

This research deals with the production planning operation in the General Company for Vegetable Oils, which plays a great role in production operations management. A non-linear integer programming model was built to account for the discounts offered when purchasing raw or semi-finished materials. The company concentrates on six main products, but the discount covers only three of the raw materials, and the six months of the first half of 2014 were chosen as the planning period. The simulated annealing algorithm was applied to the non-linear model, which is more difficult to solve when the imposed restric…
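
As a hedged illustration of the optimisation step, the sketch below runs a plain simulated-annealing loop on a small planning problem whose cost is non-linear because the unit purchase price drops once the bought quantity crosses a discount threshold. Demand, prices, and the cooling schedule are invented, not the company's data.

```python
# Sketch: simulated annealing on a toy planning problem with a quantity discount.
import math
import random

random.seed(0)
demand, capacity = 120, 200

def cost(q):
    """Total cost of producing q units: discounted purchase cost + shortage penalty."""
    unit_price = 8.0 if q < 100 else 6.5            # discount above 100 units
    return unit_price * q + 15.0 * max(demand - q, 0)

def simulated_annealing(steps=5000, temp=50.0, cooling=0.999):
    q = random.uniform(0, capacity)
    best_q, best_c = q, cost(q)
    for _ in range(steps):
        candidate = min(max(q + random.gauss(0, 10), 0), capacity)
        delta = cost(candidate) - cost(q)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            q = candidate                            # accept better or occasionally worse moves
        if cost(q) < best_c:
            best_q, best_c = q, cost(q)
        temp *= cooling                              # geometric cooling schedule
    return best_q, best_c

q_opt, c_opt = simulated_annealing()
print("production quantity:", round(q_opt, 1), "cost:", round(c_opt, 1))
```
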
Publication Date
Tue Sep 01 2020
Journal Name
Al-khwarizmi Engineering Journal
Two-Stage Classification of Breast Tumor Biomarkers for Iraqi Women

Objective: Breast cancer is regarded as a deadly disease in women, causing many deaths. Early diagnosis of breast cancer with appropriate tumor biomarkers may facilitate early treatment of the disease, thus reducing the mortality rate. The purpose of the current study is to improve early diagnosis of breast cancer by proposing a two-stage classification of breast tumor biomarkers for a sample of Iraqi women.

Methods: In this study, a two-stage classification system is proposed and tested with four machine learning classifiers. In the first stage, breast features (demographic, blood- and saliva-based attributes) are classified into normal or abnormal cases, while in the second stage the abnormal breast cases are …
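
The sketch below shows one way a two-stage classifier of this kind can be wired together: a first model separates normal from abnormal cases, and a second model, trained only on abnormal cases, assigns a subtype. The features, labels, and choice of random forests are placeholders; the abstract is truncated before naming the study's own four classifiers and second-stage classes.

```python
# Sketch: two-stage classification (normal vs abnormal, then subtype of abnormal).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 300
X = rng.normal(size=(n, 6))                        # stand-in demographic / blood / salivary features
abnormal = (X[:, 0] + X[:, 1] > 0).astype(int)     # stage-1 label: 0 normal, 1 abnormal
subtype = (X[:, 2] > 0).astype(int)                # stage-2 label, meaningful only when abnormal

stage1 = RandomForestClassifier(random_state=0).fit(X, abnormal)
mask = abnormal == 1
stage2 = RandomForestClassifier(random_state=0).fit(X[mask], subtype[mask])

def predict_two_stage(x):
    """Run stage 1; only abnormal cases are passed on to stage 2."""
    x = x.reshape(1, -1)
    if stage1.predict(x)[0] == 0:
        return "normal"
    return "abnormal, subtype " + str(stage2.predict(x)[0])

print(predict_two_stage(X[0]))
```
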
Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Using Generalized Pareto Survival Models to Estimate the Optimal Survival Time for Myocardial Infarction Patients

Survival analysis is one of the modern methods of analysis, based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and widely used survival models. It consists of two parts: a parametric function that does not depend on survival time and a non-parametric function that does depend on survival times, which is why the Cox model is described as a semi-parametric model. The set of fully parametric models depends on the parameters of the time-to-event distribution, such as …
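
As a minimal illustration of the semi-parametric Cox model mentioned above, the sketch below fits a Cox proportional-hazards regression with the lifelines package on an invented data frame of follow-up times, event indicators, and two covariates.

```python
# Sketch: Cox proportional-hazards fit on invented myocardial-infarction data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":  [5, 8, 12, 3, 9, 15, 7, 11, 6, 14],       # follow-up time until event or censoring
    "event": [1, 0, 1, 1, 0, 1, 1, 0, 1, 0],            # 1 = event observed, 0 = censored
    "age":   [54, 61, 47, 70, 58, 49, 66, 52, 63, 45],
    "bp":    [140, 130, 120, 160, 150, 118, 145, 135, 155, 125],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                          # hazard ratios for age and blood pressure
```
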
Publication Date
Mon Mar 23 2020
Journal Name
International Journal Of Nanoscience
Gold Nanoparticle Synthesis Using an Environmentally Friendly Approach for Inhibition of Human Breast Cancer

In this study, gold nanoparticles were synthesized in a single-step biosynthetic method using an aqueous leaf extract of Thymus vulgaris L., which acts as both reducing and capping agent. The nanoparticles were characterized using UV-Visible spectroscopy, X-ray diffraction (XRD) and FTIR. The surface plasmon resonance of the as-prepared gold nanoparticles (GNPs) was centred at 550 nm. The XRD pattern showed four strong, intense peaks indicating the crystalline nature and face-centred cubic structure of the gold nanoparticles. The average crystallite size of the AuNPs was 14.93 nm. Field-emission scanning electron microscopy (FESEM) was used to s…
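
The average crystallite size quoted from the XRD data is conventionally obtained with the Scherrer equation D = Kλ / (β cos θ); the paper's exact peak width and angle are not given here, so the inputs below are purely illustrative.

```python
# Sketch: Scherrer crystallite-size calculation with illustrative inputs.
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, shape_factor=0.9):
    """Crystallite size in nm from an XRD peak's FWHM and position."""
    beta = math.radians(fwhm_deg)                 # peak FWHM in radians
    theta = math.radians(two_theta_deg / 2)       # Bragg angle
    return shape_factor * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation (0.15406 nm) and a hypothetical gold (111) peak near 38.2 degrees
print(round(scherrer_size(0.15406, 0.55, 38.2), 1), "nm")
```
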