Comparison of Bennett's Inequality and the Regression Method in Determining the Optimum Sample Size for Estimating the Net Reclassification Index (NRI) Using Simulation

In recent years, researchers have shown increased interest in determining the optimum sample size needed to obtain sufficiently accurate, high-precision parameter estimates when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model was estimated, at the sample size given by each method, in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate appropriate to the data type and the type of medical study. The probability values obtained from the artificial neural network were used to calculate the net reclassification index (NRI). A program was written for this purpose in the statistical programming language R, and the mean maximum absolute error (MME) criterion of the NRI was used to compare the sample-size determination methods under different numbers of default parameters and a specified error margin (ε). On the basis of this comparison criterion, the most important conclusion was that the Bennett inequality method is the best at determining the optimum sample size according to the number of default parameters and the error margin value.
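
As a rough illustration of the NRI calculation described above, the following is a minimal R sketch, not the authors' program: it assumes a single risk threshold of 0.5 and uses simulated probabilities from a hypothetical "old" and "new" model.

    # Minimal sketch: NRI from the predicted probabilities of two models,
    # assuming a single risk threshold (0.5 here); not the authors' code.
    nri <- function(p_old, p_new, event, threshold = 0.5) {
      up   <- (p_new >  threshold) & (p_old <= threshold)  # reclassified upward
      down <- (p_new <= threshold) & (p_old >  threshold)  # reclassified downward
      ev   <- event == 1
      # net gain among events plus net gain among non-events
      (mean(up[ev]) - mean(down[ev])) + (mean(down[!ev]) - mean(up[!ev]))
    }

    # Hypothetical usage with simulated data:
    set.seed(1)
    event <- rbinom(200, 1, 0.3)
    p_old <- plogis(rnorm(200, ifelse(event == 1, 0.5, -0.5)))
    p_new <- plogis(rnorm(200, ifelse(event == 1, 1.0, -1.0)))
    nri(p_old, p_new, event)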

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Some Methods for Estimating the Scheffé Model of the Mixture

Because the mixture problem suffers from high correlation and linear multicollinearity among the explanatory variables, owing to the unit-sum constraint and the interactions between components in the model, the links between the explanatory variables increase; this is revealed by the variance inflation factor (VIF). The L-pseudo-component transformation was used to reduce the correlation between the components of the mixture.

To estimate the parameters of the mixture model, we used methods that increase bias and reduce variance, such as the ridge regression method and the Least Absolute Shrinkage and Selection Operator (LASSO) method …
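
As an illustration of the two shrinkage methods named above, here is a minimal R sketch under assumed data; the three-component mixture design, the Scheffé-type interaction terms, and the glmnet package are illustrative choices, not the paper's data or program.

    # Minimal sketch: ridge and LASSO fits on a unit-sum mixture design.
    library(glmnet)
    set.seed(1)
    n  <- 50
    x1 <- runif(n); x2 <- runif(n) * (1 - x1); x3 <- 1 - x1 - x2   # unit-sum constraint
    X  <- cbind(x1, x2, x3, x1 * x2, x1 * x3, x2 * x3)             # Scheffe-type terms
    y  <- 2 * x1 + 3 * x2 + x3 + 4 * x1 * x2 + rnorm(n, sd = 0.2)

    ridge <- cv.glmnet(X, y, alpha = 0)   # alpha = 0 gives the ridge penalty
    lasso <- cv.glmnet(X, y, alpha = 1)   # alpha = 1 gives the LASSO penalty
    coef(ridge, s = "lambda.min"); coef(lasso, s = "lambda.min")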

Publication Date
Sun Aug 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of the Performance of Some r-(k,d) Class Estimators with the (PCTP) Estimator in Estimating the General Linear Regression Model in the Presence of Autocorrelation and Multicollinearity Problems at the Same Time

In multiple linear regression analysis, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers; since these two problems can appear together, with a bad effect on estimation, some researchers have developed new methods to address them at the same time. In this research, the performance of the principal components two-parameter (PCTP) estimator, the (r-k) class estimator, and the r-(k,d) class estimator was compared through a simulation study, using the mean square error (MSE) criterion to find the best way to address the two problems together. The results showed that the r-(k,d) class estimator is the best estimator …
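
To make the class of estimators concrete, the following R sketch implements one common two-parameter form on the first r principal components; the exact definition used in the paper may differ, so this is an assumed illustration only.

    # Minimal sketch of an r-(k,d)-type estimator: principal components plus
    # the two-parameter form b(k,d) = (Z'Z + kI)^(-1) (Z'y + k d b_ols).
    rkd_estimator <- function(X, y, r, k, d) {
      eig <- eigen(crossprod(X))
      Tr  <- eig$vectors[, 1:r, drop = FALSE]   # first r eigenvectors
      Z   <- X %*% Tr                           # first r principal components
      b_ols <- solve(crossprod(Z), crossprod(Z, y))
      b_kd  <- solve(crossprod(Z) + k * diag(r),
                     crossprod(Z, y) + k * d * b_ols)
      Tr %*% b_kd                               # map back to original coordinates
    }

    # Hypothetical usage with a nearly collinear design:
    set.seed(1)
    X <- matrix(rnorm(100 * 4), 100, 4); X[, 4] <- X[, 1] + rnorm(100, sd = 0.01)
    y <- X %*% c(1, 2, 0, 1) + rnorm(100)
    rkd_estimator(X, y, r = 3, k = 0.5, d = 0.5)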

Publication Date
Wed Nov 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Robust Penalized Estimators, Using Simulation

The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, meaning a model with few variables, so that it can be interpreted easily. Penalized least squares is not robust, however, meaning it is very sensitive to the presence of outlying observations; to deal with this problem, a robust loss function can be used to obtain the robust penalized least squares method and a robust penalized estimator …
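
As a small illustration of replacing the squared-error loss with a robust one, the sketch below minimizes a Huber loss plus an L1 penalty numerically; the Huber loss, the toy low-dimensional data, and the use of optim() are assumptions made for clarity, not the paper's method.

    # Minimal sketch: a robust (Huber-loss) penalized estimator vs outliers.
    huber <- function(r, delta = 1.345)
      ifelse(abs(r) <= delta, 0.5 * r^2, delta * (abs(r) - 0.5 * delta))

    robust_penalized <- function(X, y, lambda, delta = 1.345) {
      obj <- function(b) sum(huber(y - X %*% b, delta)) + lambda * sum(abs(b))
      optim(rep(0, ncol(X)), obj)$par   # Nelder-Mead on a small toy problem
    }

    set.seed(1)
    n <- 60; p <- 5
    X <- matrix(rnorm(n * p), n, p)
    y <- X %*% c(3, -2, 0, 0, 0) + rnorm(n)
    y[1:3] <- y[1:3] + 15               # inject outlying observations
    robust_penalized(X, y, lambda = 5)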

Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between the Maximum Likelihood Method and the Bayesian Method for Estimating Some Non-Homogeneous Poisson Process Models

Abstract

The non-homogeneous Poisson process is considered one of the statistical subjects that is important in other sciences, with wide application in different areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).

This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It considers two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the …
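
For the power law model mentioned above, the time-truncated maximum likelihood estimates have a well-known closed form, sketched below in R; the simulated event times are an assumed illustration, not the paper's data.

    # Minimal sketch: MLE for the power-law NHPP with intensity
    # lambda(t) = (beta/theta) * (t/theta)^(beta - 1), observed on (0, T].
    powerlaw_mle <- function(times, T_end) {
      n     <- length(times)
      beta  <- n / sum(log(T_end / times))   # shape estimate
      theta <- T_end / n^(1 / beta)          # scale estimate
      c(beta = beta, theta = theta)
    }

    # Given n events on (0, T], the times are iid with CDF (t/T)^beta:
    set.seed(1)
    beta_true <- 1.5
    times <- sort(100 * runif(40)^(1 / beta_true))
    powerlaw_mle(times, T_end = 100)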

Publication Date
Sat Feb 26 2022
Journal Name
Iraqi Journal Of Science
Estimating the Reliability Function for Transmuted Pareto Distribution Using Simulation

In this work, the methods of moments, modified moments, L-moments, percentiles, ranked set sampling, and maximum likelihood were used to estimate the reliability function and the two parameters of the transmuted Pareto (TP) distribution. Simulation was used to generate the required data for three cases of sample size, with replications for the real parameter values and selected reliability time values.

Results were compared using the mean square error (MSE), and they appear as follows:

The best methods are modified moments, maximum likelihood, and L-moments in the first, second, and third cases, respectively.
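
For reference, the reliability function itself is simple to write down; the R sketch below assumes the common parameterisation with base Pareto CDF F(t) = 1 - (theta/t)^alpha for t >= theta and transmuted CDF G(t) = (1 + lambda)F(t) - lambda F(t)^2.

    # Minimal sketch: reliability R(t) = 1 - G(t) for the transmuted Pareto.
    tp_reliability <- function(t, alpha, theta, lambda) {
      Ft <- 1 - (theta / t)^alpha                  # base Pareto CDF
      1 - ((1 + lambda) * Ft - lambda * Ft^2)      # transmuted survival function
    }
    tp_reliability(t = c(2, 4, 8), alpha = 2, theta = 1, lambda = 0.5)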

Publication Date
Fri Dec 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between the Bayesian Method and Full Maximum Likelihood for Estimating the Hierarchical Poisson Regression Model, with an Application to Maternal Deaths in Baghdad

Abstract

This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.

The comparison was done by simulation using different sample sizes (n = 30, 60, 120) and different replication counts (r = 1000, 5000) for the experiments, with the mean square error adopted to compare the estimation methods and choose the best way to fit the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best for representing the maternal mortality data …
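
The simulation logic can be sketched briefly in R; the sketch below uses an ordinary (non-hierarchical) Poisson regression fitted by maximum likelihood via glm() and scores it by the MSE of the coefficients, so it illustrates the comparison criterion only, with assumed true parameters. A full hierarchical or Bayesian fit would need additional machinery.

    # Minimal sketch: MSE of ML Poisson-regression estimates over replications.
    set.seed(1)
    beta_true <- c(0.5, 1.0)
    mse <- replicate(1000, {                 # r = 1000 replications
      x <- rnorm(30)                         # n = 30 as in the study
      y <- rpois(30, exp(beta_true[1] + beta_true[2] * x))
      b <- coef(glm(y ~ x, family = poisson))
      mean((b - beta_true)^2)
    })
    mean(mse)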

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Some Estimation Methods for the Linear Regression Model with Autocorrelated Errors, with an Application to Wheat Data in Iraq

This research studies the linear regression model when the problem of autocorrelation of the normally distributed random errors is present, as arises in linear regression analysis of the relationship between variables; through this relationship, the value of one variable can be predicted from the values of the others. Four methods were compared (the least squares method, the unweighted average method, the Theil method, and the Laplace method) using the mean square error (MSE) criterion and simulation, with four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to wheat production data and the cultivated area of the provinces of Iraq for the years 2010, 2011, and 2012 …
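
To show the setting, the R sketch below simulates a regression with AR(1) errors and contrasts ordinary least squares with generalized least squares via nlme::gls; GLS stands in here as a standard treatment of autocorrelation and is not one of the four methods compared in the paper.

    # Minimal sketch: OLS vs GLS under AR(1) autocorrelated errors.
    library(nlme)
    set.seed(1)
    n <- 60
    x <- rnorm(n)
    e <- as.numeric(arima.sim(list(ar = 0.7), n))   # autocorrelated errors
    y <- 1 + 2 * x + e
    d <- data.frame(y, x, t = 1:n)

    fit_ols <- lm(y ~ x, data = d)
    fit_gls <- gls(y ~ x, data = d, correlation = corAR1(form = ~ t))
    coef(fit_ols); coef(fit_gls)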

Publication Date
Sat Dec 01 2018
Journal Name
Political Sciences Journal
The Patterns of the Strategic Environment and its Role in Determining Strategies for Dealing with Conflict and Peace Situations

Abstract

The decision maker needs to understand the strategic environment to be addressed, through different means and methods. It is obvious that there is a difference between the three strategic environments (the conflict environment, the peace environment, and the post-peace environment) in terms of their inputs and the strategies for dealing with each of them. There is an urgent need to understand each pattern separately, analyze its inputs, and identify the factors and variables that affect the continuity of each situation (conflict, peace, post-peace). It is not appropriate to prescribe a treatment without diagnosing the condition, so it is very important to understand the type of strategic environment to be dealt with.

Publication Date
Tue Apr 04 2023
Journal Name
Journal Of Techniques
Comparison Between the Kernel Functions Used in Estimating the Fuzzy Regression Discontinuity Model

Some experiments require knowing the extent of their usefulness in order to decide whether to continue providing them. This is done through the fuzzy regression discontinuity model. The Epanechnikov kernel and the triangular kernel were used to estimate the model, with data generated from a Monte Carlo experiment, and the results obtained were compared. It was found that the Epanechnikov kernel has the smaller mean squared error.
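
The two kernels are simple weight functions, and the comparison can be sketched in R with a kernel-weighted local linear fit on either side of an assumed cutoff; the data, cutoff, and bandwidth below are illustrative, not the paper's Monte Carlo design.

    # Minimal sketch: Epanechnikov vs triangular kernel in a local linear
    # regression-discontinuity fit (sharp design for simplicity).
    epanechnikov <- function(u) ifelse(abs(u) <= 1, 0.75 * (1 - u^2), 0)
    triangular   <- function(u) ifelse(abs(u) <= 1, 1 - abs(u), 0)

    rd_jump <- function(x, y, cutoff, h, kernel) {
      w <- kernel((x - cutoff) / h)                # weights within bandwidth h
      r <- lm(y ~ I(x - cutoff), weights = w, subset = w > 0 & x >= cutoff)
      l <- lm(y ~ I(x - cutoff), weights = w, subset = w > 0 & x <  cutoff)
      unname(coef(r)[1] - coef(l)[1])              # jump at the cutoff
    }

    set.seed(1)
    x <- runif(200); y <- 1 + 2 * (x > 0.5) + x + rnorm(200, sd = 0.3)
    rd_jump(x, y, cutoff = 0.5, h = 0.2, kernel = epanechnikov)
    rd_jump(x, y, cutoff = 0.5, h = 0.2, kernel = triangular)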

Publication Date
Sat Dec 31 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of the Causal Effect of Treatment via Fuzzy Regression Discontinuity Designs

In some cases, researchers need to know the causal effect of a treatment, that is, the extent of the treatment's effect on the sample, in order to decide whether to continue the treatment or to stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was also conducted at the Innovation Institute for remedial lessons in 2021, with 72 students participating in the institute, and the data were collected. Those who received the treatment showed an increase in their scores after …
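
In a fuzzy design the causal effect is the jump in the outcome divided by the jump in the treatment probability at the cutoff; the R sketch below estimates both jumps with kernel-weighted local linear fits on simulated data, as an assumed illustration rather than the paper's estimator or data.

    # Minimal sketch: fuzzy RD effect as a Wald ratio of two local jumps.
    fuzzy_rd <- function(x, y, d, cutoff, h) {
      k <- function(u) ifelse(abs(u) <= 1, 0.75 * (1 - u^2), 0)  # Epanechnikov
      jump <- function(v) {
        w <- k((x - cutoff) / h)
        r <- lm(v ~ I(x - cutoff), weights = w, subset = w > 0 & x >= cutoff)
        l <- lm(v ~ I(x - cutoff), weights = w, subset = w > 0 & x <  cutoff)
        unname(coef(r)[1] - coef(l)[1])
      }
      jump(y) / jump(d)            # outcome jump over treatment-uptake jump
    }

    set.seed(1)
    x <- runif(150)                                   # running variable
    d <- rbinom(150, 1, plogis(-1 + 3 * (x > 0.5)))   # fuzzy treatment uptake
    y <- 1 + 1.5 * d + x + rnorm(150, sd = 0.3)
    fuzzy_rd(x, y, d, cutoff = 0.5, h = 0.2)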
