The use of the Bayes method and classical methods in estimating the parameters of the binary logistic regression model

Abstract

The binary logistic regression model is used for data classification and is a stronger, more flexible tool than linear regression when the response variable is binary. In this research, several classical methods were used to estimate the parameters of the binary logistic regression model, namely the maximum likelihood method, the minimum chi-square method, and weighted least squares, together with Bayes estimation. The best estimation method was chosen using default parameter values under two different general linear regression models and several sample sizes. A simulation experiment was built, and the results were presented and analysed using the mean squared error (MSE) criterion to select the best method for estimating the binary logistic regression model.

In general, this method was found to be the best among the standard estimation methods for estimating the parameters of the binary logistic regression model, since it had the lowest MSE for the estimators compared to the other methods, which indicates its accuracy in estimating the model parameters.
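A minimal simulation sketch of the kind of comparison described above, assuming hypothetical true coefficients, sample size, and replication count; only the maximum likelihood estimator (via statsmodels) is shown, and the minimum chi-square, weighted least squares, and Bayes estimators from the study are not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
beta_true = np.array([0.5, 1.0, -1.5])   # hypothetical intercept and slopes
n, reps = 200, 500                        # assumed sample size and replications

estimates = []
for _ in range(reps):
    X = sm.add_constant(rng.normal(size=(n, 2)))      # design matrix with intercept
    p = 1.0 / (1.0 + np.exp(-X @ beta_true))          # logistic success probabilities
    y = rng.binomial(1, p)                            # binary responses
    fit = sm.Logit(y, X).fit(disp=0)                  # maximum likelihood estimation
    estimates.append(fit.params)

estimates = np.array(estimates)
mse = ((estimates - beta_true) ** 2).mean(axis=0)     # MSE of each coefficient
print("MSE per coefficient:", mse)
```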

Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Using projection pursuit regression and neural networks to overcome the curse of dimensionality

Abstract

This research aims to overcome the problem of dimensionality by using a non-linear regression method that reduces the root mean square error (RMSE), namely projection pursuit regression (PPR), one of the dimension-reduction methods designed to deal with the curse of dimensionality. PPR is a statistical technique that finds the most informative projections in multi-dimensional data; with each projection found, the data are reduced to linear combinations along that projection, and the process is repeated until the best projections are obtained. The main idea of PPR is to model …
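A rough sketch of a single projection pursuit step under assumed synthetic data: one projection direction is sought that minimizes the RMSE of a simple one-dimensional smoother (a polynomial here) applied to the projected data. A full PPR fit would add further terms on the residuals; the direction, smoother, and optimizer below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, p = 300, 10                                   # assumed sample size and dimension
X = rng.normal(size=(n, p))
alpha_true = np.ones(p) / np.sqrt(p)
y = np.sin(X @ alpha_true) + 0.1 * rng.normal(size=n)   # one hidden ridge function

def ridge_rmse(alpha):
    """RMSE after smoothing y against the 1-D projection X @ alpha."""
    a = alpha / np.linalg.norm(alpha)            # keep the direction unit-length
    z = X @ a
    coefs = np.polyfit(z, y, deg=5)              # simple polynomial smoother on the projection
    return np.sqrt(np.mean((y - np.polyval(coefs, z)) ** 2))

res = minimize(ridge_rmse, x0=rng.normal(size=p), method="Nelder-Mead",
               options={"maxiter": 5000})
best_dir = res.x / np.linalg.norm(res.x)         # estimated projection direction
print("RMSE of best single projection:", ridge_rmse(res.x))
```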

Publication Date
Fri Dec 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Multicollinearity in Multiple Nonparametric Regression: Detection and Treatment Using Simulation

Regression analysis is a foundation stone of statistics and mostly relies on the ordinary least squares method. As is well known, this method requires several conditions in order to work accurately, and its results can be unreliable when those conditions fail; the absence of certain conditions can even make it impossible to complete the analysis. Among these conditions is the absence of multicollinearity, and we detect that problem among the independent variables using the Farrar–Glauber test. In addition, the data are required to be linear, and the failure of this last condition led to resorting to …
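A small sketch of the Farrar–Glauber chi-square test for overall multicollinearity mentioned above, applied to synthetic, deliberately collinear regressors; the sample size and variables are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square test for overall multicollinearity
    among the columns of the regressor matrix X (n observations x k variables)."""
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)                      # correlation matrix of the regressors
    chi2_stat = -(n - 1 - (2 * k + 5) / 6.0) * np.log(np.linalg.det(R))
    df = k * (k - 1) // 2
    p_value = stats.chi2.sf(chi2_stat, df)
    return chi2_stat, df, p_value

# illustration on synthetic, deliberately collinear regressors
rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
X = np.column_stack([x1, 0.95 * x1 + 0.05 * rng.normal(size=100), rng.normal(size=100)])
print(farrar_glauber_chi2(X))
```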

Publication Date
Mon Jun 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
The use of least squares and restricted least squares in estimating the first-order autoregressive AR(1) parameter (a simulation study)

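A brief sketch of AR(1) estimation by ordinary least squares on simulated data, with a simple restriction to the stationary region added for illustration; the paper's specific restricted estimator is not detailed here, so the clipping step below is only a hypothetical stand-in.

```python
import numpy as np

rng = np.random.default_rng(3)
phi_true, n = 0.6, 500                       # assumed AR(1) coefficient and series length

# simulate y_t = phi * y_{t-1} + e_t
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

# ordinary least squares estimate: regress y_t on y_{t-1}
y_lag, y_cur = y[:-1], y[1:]
phi_ols = (y_lag @ y_cur) / (y_lag @ y_lag)

# an illustrative "restricted" estimate: clip the coefficient into the stationary region
phi_restricted = np.clip(phi_ols, -0.999, 0.999)
print(phi_ols, phi_restricted)
```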

Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Comparing Some Robust Non-Parametric Methods for Semi-Parametric Regression Model Estimation

In this research, some robust non-parametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion across different sample sizes, variance levels, contamination rates, and three different models. The methods are S-LLS (S-estimation with local linear smoothing), M-LLS (M-estimation with local linear smoothing), S-NW (S-estimation with Nadaraya–Watson smoothing), and M-NW (M-estimation with Nadaraya–Watson smoothing).

The results for the first model showed that the S-LLS method was the best for large sample sizes, while for small sample sizes the …
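A minimal sketch of the Nadaraya–Watson smoother that underlies the S-NW and M-NW estimators, with a single Huber-type reweighting pass standing in for the M-estimation step; the kernel, bandwidth, contamination pattern, and tuning constant are assumptions for illustration, not the paper's exact estimators.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h, weights=None):
    """Kernel regression estimate of E[y|x] on x_grid with a Gaussian kernel."""
    w = np.ones_like(y) if weights is None else weights
    K = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)   # kernel matrix
    return (K * w * y).sum(axis=1) / (K * w).sum(axis=1)

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=200)
y[::25] += 3.0                                 # a few artificial outliers ("contamination")

grid, h = np.linspace(0, 1, 100), 0.05
fit0 = nadaraya_watson(x, x, y, h)             # initial fit at the data points

# one Huber-type reweighting step: downweight points with large residuals
resid = y - fit0
scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))   # robust scale (MAD)
c = 1.345 * scale
w = np.where(np.abs(resid) <= c, 1.0, c / np.abs(resid))
fit_robust = nadaraya_watson(grid, x, y, h, weights=w)
```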

Publication Date
Fri Sep 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Choosing the best method for estimating the survival function of the inverse Gompertz distribution using the integral mean squared error (IMSE)

In this research we study the inverse Gompertz distribution (IG) and estimate its survival function. The survival function was estimated using three methods (the maximum likelihood, least squares, and percentile estimators), and the best estimation method was then chosen. The least squares method was found to be the best for estimating the survival function because it has the lowest IMSE for all sample sizes.
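A small sketch of the IMSE criterion used for the comparison, approximating the integrated squared difference between an estimated and a true survival curve over a time grid; simple exponential curves are used here as placeholders for the inverse Gompertz survival functions, and the grid and parameter values are assumptions.

```python
import numpy as np

def imse(surv_est, surv_true, t_grid):
    """Integral mean squared error between an estimated and a true survival curve,
    approximated by integrating the squared difference over a grid of time points."""
    diff = surv_est(t_grid) - surv_true(t_grid)
    return np.trapz(diff ** 2, t_grid) / (t_grid[-1] - t_grid[0])

# placeholder survival curves (exponential), standing in for the inverse Gompertz forms
true_rate = 0.5
s_true = lambda t: np.exp(-true_rate * t)
s_hat = lambda t: np.exp(-0.45 * t)            # pretend this came from least squares fitting

t_grid = np.linspace(0.01, 10, 500)
print("IMSE:", imse(s_hat, s_true, t_grid))
```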

Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
The Use of the Regression Tree and the Support Vector Machine in the Classification of the Iraqi Stock Exchange for the Period 2019-2020

The financial markets are one of the sectors whose data are characterized by continuous movement most of the time and are constantly changing, which makes their trends difficult to predict. This creates a need for methods, tools, and techniques to support decision making, and pushes investors and analysts in the financial markets to use a variety of methods to predict the direction of market movement. To reach the goal of making decisions about different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine …
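A short sketch of the two classifiers named above, fitted with scikit-learn on synthetic stand-in features and up/down labels rather than the actual Iraqi Stock Exchange data; a decision tree classifier is used here in place of the CART regression tree, and all data and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# synthetic stand-in features (e.g. lagged returns); the study itself uses ISX price data
rng = np.random.default_rng(5)
X = rng.normal(size=(400, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=400) > 0).astype(int)  # up/down label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)
cart = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

print("SVM accuracy :", accuracy_score(y_te, svm.predict(X_te)))
print("CART accuracy:", accuracy_score(y_te, cart.predict(X_te)))
```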

Publication Date
Fri Aug 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Comparison between the Local Polynomial Kernel and the Penalized Spline in Estimating the Varying Coefficient Model

Analysing economic and financial phenomena, among others, requires building an appropriate model that represents the causal relations between factors. Building the model depends on capturing the surrounding conditions and factors in a mathematical formula, and researchers aim to construct that formula appropriately. Classical linear regression models are an important statistical tool, but they are used in a limited way, since the relationship between the explanatory variables and the response variable is assumed to be known. To broaden the representation of relationships between the variables of the phenomenon under study, we used varying coefficient models …
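A compact sketch of local (kernel-weighted) linear estimation of a varying coefficient beta(u) in a model y = beta(u)*x + error, one of the two approaches compared above; the data-generating function, bandwidth, and grid are assumptions, and the penalized spline alternative is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400
u = rng.uniform(0, 1, n)                        # index variable of the coefficient
x = rng.normal(size=n)                          # regressor
beta = np.sin(2 * np.pi * u)                    # true varying coefficient beta(u)
y = beta * x + 0.2 * rng.normal(size=n)

def beta_hat(u0, h=0.1):
    """Local linear kernel estimate of beta(u0) in y = beta(u) * x + error."""
    w = np.exp(-0.5 * ((u - u0) / h) ** 2)      # Gaussian kernel weights
    Z = np.column_stack([x, x * (u - u0)])      # local linear expansion of beta(.)
    W = np.diag(w)
    coef = np.linalg.solve(Z.T @ W @ Z, Z.T @ W @ y)
    return coef[0]                               # value of beta at u0

grid = np.linspace(0.05, 0.95, 19)
est = np.array([beta_hat(u0) for u0 in grid])
print(np.round(est - np.sin(2 * np.pi * grid), 3))   # errors of the kernel estimate
```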

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increasing interest in recent years in determining the optimum sample size that yields sufficient accuracy of estimation and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. A nonlinear logistic regression model is then estimated with the sample size given by each method in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data …
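A minimal sketch of a Bennett-inequality-based sample size calculation: the two-sided bound 2*exp(-(n*sigma^2/b^2)*h(b*eps/sigma^2)) <= alpha, with h(u) = (1+u)*ln(1+u) - u, is solved for n. The bound, variance, and precision values below are illustrative assumptions; the paper's regression-based method and the ANN estimation step are not reproduced.

```python
import numpy as np

def bennett_sample_size(eps, alpha, sigma2, b):
    """Smallest n such that Bennett's inequality guarantees
    P(|sample mean - true mean| >= eps) <= alpha for variables bounded by b
    (|X - E[X]| <= b) with variance sigma2."""
    u = b * eps / sigma2
    h = (1 + u) * np.log(1 + u) - u                 # the h(u) function in Bennett's bound
    n = b ** 2 * np.log(2 / alpha) / (sigma2 * h)
    return int(np.ceil(n))

# example: a proportion-like statistic bounded by 1 with variance 0.25,
# targeting precision 0.05 at 95% confidence
print(bennett_sample_size(eps=0.05, alpha=0.05, sigma2=0.25, b=1.0))
```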

Publication Date
Sun Jun 01 2008
Journal Name
Journal Of Economics And Administrative Sciences
Estimating the nonparametric regression function using some monotone nonparametric methods

Abstract

In this research, monotone nonparametric methods for estimating the nonparametric regression function were studied, and the outliers present in the nonparametric regression function were treated in order to make the function monotone (increasing or decreasing).

We therefore first estimate the nonparametric regression function using a kernel smoother, and then apply the monotone methods to make the function increasing. Three estimation methods are considered:

1- The ste… method
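The list of the three monotonization methods is cut off above; as a generic sketch of the two-step idea (kernel smoothing followed by monotonization), the snippet below uses isotonic regression from scikit-learn as a stand-in for the monotonization step, on assumed synthetic data with a few artificial outliers.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 1, 150))
y = np.log1p(5 * x) + 0.15 * rng.normal(size=150)        # increasing trend plus noise
y[::20] -= 1.0                                           # artificial outliers

# step 1: ordinary Nadaraya-Watson kernel smoother (not yet monotone)
h = 0.05
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
kernel_fit = (K * y).sum(axis=1) / K.sum(axis=1)

# step 2: project the kernel fit onto the set of increasing functions (isotonic regression)
monotone_fit = IsotonicRegression(increasing=True).fit_transform(x, kernel_fit)
```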

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Estimating the reliability function of the Kumaraswamy distribution

The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering additional advantages.

The shape of the density function of this distribution and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood method (MLE) and Bayes methods. Simulation experiments are conducted to explain the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion. The results show that the Bayes method is better …
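A small sketch of maximum likelihood estimation for the Kumaraswamy(a, b) density f(x) = a*b*x^(a-1)*(1 - x^a)^(b-1) on (0, 1) and of the reliability function R(t) = (1 - t^a)^b; the true parameters, sample size, and evaluation point are assumptions, and the Bayes estimators compared in the study are not shown.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    """Negative log-likelihood of the Kumaraswamy(a, b) density on 0 < x < 1."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x ** a))

# simulate data from hypothetical true parameters via the inverse CDF
rng = np.random.default_rng(8)
a_true, b_true, n = 2.0, 3.0, 300
u = rng.uniform(size=n)
x = (1 - (1 - u) ** (1 / b_true)) ** (1 / a_true)

res = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x

t = 0.5
reliability_hat = (1 - t ** a_hat) ** b_hat        # R(t) = 1 - F(t) = (1 - t**a)**b
print(a_hat, b_hat, reliability_hat)
```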
