Using the Bayesian method to estimate the parameters of the exponential growth model with an autocorrelation problem and different values of the correlation parameter, using simulation

In this paper we study the Bayesian method using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (an informative prior, the natural conjugate prior, and a prior based on previous experiments) for use in the Bayesian method. Because observations of growth phenomena usually depend on one another, a correlation arises between them; this problem, known as autocorrelation, must be treated, and the Bayesian method is used to address it.

The goal of this study is to determine the effect of autocorrelation on estimation with the Bayesian method. To verify this, a simulation technique was used in which random samples were generated with known parameters and different values of the correlation coefficient. The computational results show that all results are affected by the correlation coefficients used to generate the data, and that the sensitivity of the Bayesian estimators to autocorrelation becomes clearly and regularly more pronounced as the sample size increases.
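As a rough illustration of the kind of simulation study described above (not the paper's own code), the sketch below generates data from an assumed modified exponential growth curve y_t = alpha + beta*exp(gamma*t) with AR(1) errors for several correlation values rho and sample sizes, and computes a simple grid posterior for gamma under a flat prior while treating the other parameters as known; the functional form, the parameter values, and the single-parameter posterior are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_growth(n, rho, alpha=10.0, beta=2.0, gamma=0.05, sigma=1.0):
    """Assumed modified exponential growth y_t = alpha + beta*exp(gamma*t)
    with AR(1) errors e_t = rho*e_{t-1} + u_t (e_0 taken as u_0 for simplicity)."""
    t = np.arange(1, n + 1)
    u = rng.normal(0.0, sigma, n)
    e = np.zeros(n)
    e[0] = u[0]
    for i in range(1, n):
        e[i] = rho * e[i - 1] + u[i]
    return t, alpha + beta * np.exp(gamma * t) + e

def grid_posterior_gamma(t, y, alpha=10.0, beta=2.0, sigma=1.0):
    """Grid posterior for gamma under a flat prior and an independence likelihood
    (deliberately ignoring the autocorrelation, to expose its effect)."""
    grid = np.linspace(0.01, 0.10, 181)
    loglik = np.array([
        -0.5 * np.sum((y - alpha - beta * np.exp(g * t)) ** 2) / sigma ** 2
        for g in grid
    ])
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    return np.sum(grid * w)          # posterior mean of gamma

for n in (25, 50, 100):
    for rho in (0.0, 0.4, 0.8):
        t, y = simulate_growth(n, rho)
        print(f"n={n:3d} rho={rho:.1f}  posterior mean of gamma = {grid_posterior_gamma(t, y):.4f}")
```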

 

Publication Date: Sat Dec 01 2018
Journal Name: Journal Of Economics And Administrative Sciences
Comparison Between Ordinary Methods (LS,IV) and Robust Methods (2SWLS,LTS,RA) to estimate the Parameters of ARX(1,1,1) Model for Electric Loads

Abstract:

Time series models often suffer from outliers that accompany the data collection process for many reasons, and their presence can have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, so it is important to choose appropriate methods that yield good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators for estimating the parameters of …
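The abstract is truncated, but the kind of comparison it describes can be sketched roughly as follows: simulate an ARX(1,1,1)-style series with injected outliers and estimate its coefficients by ordinary least squares and by a robust M-estimator. Huber M-estimation (statsmodels RLM) merely stands in here for the robust methods named in the title (2SWLS, LTS and RA are not implemented), and all coefficient values are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
phi, b0, b1 = 0.6, 1.5, -0.8                   # assumed ARX(1,1,1) coefficients

x = rng.normal(size=n)
e = rng.normal(scale=0.5, size=n)
e[rng.choice(n, size=10, replace=False)] += rng.normal(scale=8.0, size=10)   # inject outliers
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + b0 * x[t] + b1 * x[t - 1] + e[t]

# Regression form: y_t on (y_{t-1}, x_t, x_{t-1})
Y = y[1:]
X = np.column_stack([y[:-1], x[1:], x[:-1]])

ols = sm.OLS(Y, X).fit()
rob = sm.RLM(Y, X, M=sm.robust.norms.HuberT()).fit()

print("true      :", [phi, b0, b1])
print("OLS       :", np.round(ols.params, 3))
print("Huber RLM :", np.round(rob.params, 3))
```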

Publication Date: Mon Jun 05 2023
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Poisson Regression and Conway Maxwell Poisson Models Using Simulation

Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used, the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using the simulation method at different sample sizes (n = 25, 50, 100) with r = 1000 repetitions. The Matlab program was adopted to conduct the simulation experiment, and the results showed the superiority of the Poisson model through the mean squared error criterion (MSE) and also through the Akaike information criterion (AIC) for the same distribution.
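A minimal sketch of the Poisson side of such a simulation experiment is given below. The paper used Matlab; Python with statsmodels is used here purely for illustration, the Conway-Maxwell-Poisson fit is not reproduced, and the regression coefficients, covariate design, and the use of coefficient MSE are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
beta_true = np.array([0.5, 0.3])          # assumed intercept and slope on the log scale

def one_replicate(n):
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(X @ beta_true))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    return np.mean((fit.params - beta_true) ** 2), fit.aic

for n in (25, 50, 100):
    res = np.array([one_replicate(n) for _ in range(1000)])   # r = 1000 repetitions
    print(f"n={n:3d}  mean MSE={res[:, 0].mean():.4f}  mean AIC={res[:, 1].mean():.1f}")
```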


Publication Date: Thu Aug 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Analyzing current and future direction of non-oil primary balance: Case Study of Iraq Using Exponential Smoothing model

In recent years, the non-oil primary balance indicator has been given considerable fiscal importance in rentier states. These countries rely heavily on this indicator to give a clear and proper picture of the public finance situation in terms of adequacy and sustainability, because it excludes the effect of oil rents from the financial accounts, providing economic policy makers with sufficient information on how the economy can create potential added value and how it changes once the one-sided effect of oil is removed. In Iraq, the deficit in this indicator has grown since 2004, owing to the almost complete dependence on oil revenues to finance the budget and the obvious decline of the non-oil s…
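For readers unfamiliar with the forecasting tool named in the title, the sketch below applies Holt's additive-trend exponential smoothing to a synthetic placeholder series; the data, the trend specification, and the five-period forecast horizon are assumptions, not the paper's series or model settings.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic placeholder series standing in for an annual non-oil primary balance
rng = np.random.default_rng(5)
balance = -5.0 - 0.8 * np.arange(16) + rng.normal(scale=1.5, size=16)

# Holt's linear-trend (additive) exponential smoothing; the paper's exact variant is assumed
fit = ExponentialSmoothing(balance, trend="add", seasonal=None).fit()
print("next five periods:", np.round(fit.forecast(5), 2))
```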

Publication Date: Thu Apr 20 2023
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Bayesian Estimation for Two Parameters of Exponential Distribution under Different Loss Functions

In this paper, the two parameters of the Exponential distribution were estimated using the Bayesian estimation method under three different loss functions: the squared error loss function, the precautionary loss function, and the entropy loss function. An Exponential distribution and a Gamma distribution were assumed as the priors of the scale parameter γ and the location parameter δ, respectively. In the Bayesian estimation, maximum likelihood estimators were used as the initial estimators, and the Tierney-Kadane approximation was used effectively. Based on the Monte Carlo simulation method, these estimators were compared in terms of their mean squared errors (MSEs). The results showed that the Bayesian esti…
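For reference, under the standard textbook forms of these three loss functions the Bayes estimators take the shapes summarised below; the paper's exact loss definitions may differ, so this is only an assumed summary.

```latex
% Assumed loss functions: squared error; precautionary L(\hat\theta,\theta)=(\hat\theta-\theta)^2/\hat\theta;
% entropy L(\hat\theta,\theta) \propto \hat\theta/\theta - \ln(\hat\theta/\theta) - 1.
\hat\theta_{SE}  = E(\theta \mid x), \qquad
\hat\theta_{PRE} = \sqrt{E(\theta^{2} \mid x)}, \qquad
\hat\theta_{ENT} = \bigl[\,E(\theta^{-1} \mid x)\bigr]^{-1}.
```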

Publication Date: Wed Jun 30 2021
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increasing interest in recent years in determining the optimum sample size needed to obtain sufficiently accurate estimation and high-precision parameters when evaluating a large number of diagnostic tests at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated, at the sample size produced by each method, in high-dimensional data using an artificial intelligence technique, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data…
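For context, the standard form of Bennett's inequality is reproduced below; the notation (bound b, variance σ², threshold t) is an assumption, and the way the paper inverts the bound to obtain the optimum sample size for the NRI is not shown here.

```latex
% Bennett's inequality for independent, zero-mean X_i with |X_i| \le b and Var(X_i) = \sigma^2:
P\!\left(\sum_{i=1}^{n} X_i \ge t\right)
  \le \exp\!\left(-\frac{n\sigma^{2}}{b^{2}}\,
        h\!\left(\frac{bt}{n\sigma^{2}}\right)\right),
\qquad h(u) = (1+u)\ln(1+u) - u .
```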

Publication Date: Fri Sep 30 2022
Journal Name: Journal Of Economics And Administrative Sciences
A Comparative Study for Estimating the Fractional Parameter of the ARFIMA Model

Long memory analysis is one of the most active areas in econometrics and time series, where various methods have been introduced to identify and estimate the long memory parameter in fractionally integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, whose differencing order is a fractional number called the fractional parameter. To analyze and determine the ARFIMA model, this fractional parameter must be estimated, and there are many methods for doing so. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first…
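A rough sketch of that indirect route: estimate the Hurst exponent H (here by the classical rescaled-range method, one of several possibilities) and take d = H - 0.5 as the fractional parameter. The test series, block sizes, and R/S variant below are assumptions, not the paper's procedure.

```python
import numpy as np

def hurst_rs(x, min_block=8):
    """Rescaled-range (R/S) estimate of the Hurst exponent H;
    the ARFIMA fractional parameter then follows as d = H - 0.5."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    size = min_block
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            block = x[start:start + size]
            dev = np.cumsum(block - block.mean())
            s = block.std(ddof=1)
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)   # log-log slope = H
    return slope

rng = np.random.default_rng(11)
x = rng.normal(size=2048)            # white noise: H should be near 0.5, so d near 0
H = hurst_rs(x)
print(f"H = {H:.3f}, implied d = H - 0.5 = {H - 0.5:.3f}")
```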

Publication Date: Thu May 11 2017
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Estimation of the Parameter of an Exponential Distribution When Applying Maximum Likelihood and Probability Plot Methods Using Simulation

The exponential distribution is probably the most important distribution in reliability work. In this paper, estimation of the scale parameter of an exponential distribution is carried out by employing the maximum likelihood estimator and the probability plot method for different sample sizes. The mean squared error was used as an indicator of performance for several assumed values of the parameter, and a computer simulation was carried out to analyse the obtained results.
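A small sketch of that comparison, assuming the scale is parameterised as the mean of the exponential distribution: the MLE is then the sample mean, and the probability plot estimate is taken as the through-origin slope of the ordered data against standard exponential plotting positions based on p_i = (i - 0.5)/n. The true scale, sample sizes, and number of replications are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.0                            # assumed true scale (mean) of the exponential

def mle(x):
    return x.mean()                    # MLE of the scale is the sample mean

def prob_plot(x):
    n = len(x)
    p = (np.arange(1, n + 1) - 0.5) / n
    q = -np.log(1.0 - p)               # standard exponential plotting positions
    xs = np.sort(x)
    return np.sum(q * xs) / np.sum(q * q)    # slope of a through-origin fit

for n in (10, 30, 100):
    est = np.array([(mle(s), prob_plot(s))
                    for s in rng.exponential(theta, size=(5000, n))])
    mse = ((est - theta) ** 2).mean(axis=0)
    print(f"n={n:3d}  MSE(MLE)={mse[0]:.4f}  MSE(prob. plot)={mse[1]:.4f}")
```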

Publication Date: Tue Jun 01 2021
Journal Name: Baghdad Science Journal
Comparing Weibull Stress-Strength Reliability Bayesian Estimators for Singly Type II Censored Data under Different Loss Functions

Bayesian estimation of the reliability of the stress (Y), strength (X) model, which describes the life of a component with strength X subject to stress Y (the component fails if and only if at any time the applied stress exceeds its strength), has been studied; the reliability, R = P(Y < X), can then be considered a measure of the component's performance. In this paper, a Bayesian analysis is carried out for R when X and Y are independent Weibull random variables with a common parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three different loss functions [weighted, quadratic and entropy] under two different prior functions [Gamma and extension of Jeffery…
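To make the reliability concrete, the sketch below checks R = P(Y < X) by Monte Carlo for Weibull stress and strength sharing a shape α, under one assumed "rate-style" parameterisation (with β attached to the strength X and λ to the stress Y) in which R has the closed form λ/(β + λ); the parameter values and the parameterisation are assumptions, and the paper's Bayesian estimators and censoring scheme are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(9)
alpha, beta, lam = 2.0, 1.5, 3.0   # assumed common shape and the two scale ("rate") parameters

# Parameterisation assumed here: f(x) = alpha*beta*x**(alpha-1)*exp(-beta*x**alpha),
# i.e. X**alpha ~ Exp(rate=beta); under it, R = P(Y < X) = lam / (beta + lam).
n = 1_000_000
X = rng.exponential(1.0 / beta, n) ** (1.0 / alpha)    # strength
Y = rng.exponential(1.0 / lam, n) ** (1.0 / alpha)     # stress
print("Monte Carlo R :", (Y < X).mean())
print("Closed form R :", lam / (beta + lam))
```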

Publication Date: Thu Jun 01 2017
Journal Name: Journal Of Economics And Administrative Sciences
A Comparison Between Maximum Likelihood Method And Bayesian Method For Estimating Some Non-Homogeneous Poisson Processes Models

Abstract

The non-homogeneous Poisson process is considered one of the statistical topics that is important to other sciences and has wide application in different areas, such as queueing systems, repairable systems, computer and communication systems, reliability theory, and many others; it is also used to model phenomena that occur in a non-constant way over time (events whose behaviour changes with time).

This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It considers two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the …
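A minimal sketch of the power law side only (the Musa-Okumoto model and the Bayesian estimators are not reproduced): simulate a time-truncated power-law NHPP with mean function m(t) = alpha*t^beta and compute the standard maximum likelihood estimates; the parameter values and the observation horizon T are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha_true, beta_true, T = 0.8, 1.4, 100.0   # assumed power-law NHPP m(t) = alpha*t**beta on [0, T]

# Time-truncated simulation: N ~ Poisson(m(T)); given N, event times are iid with CDF (t/T)**beta
n = rng.poisson(alpha_true * T ** beta_true)
t = np.sort(T * rng.uniform(size=n) ** (1.0 / beta_true))

# Standard maximum likelihood estimates for the power law (Crow/AMSAA-type) model
beta_hat = n / np.sum(np.log(T / t))
alpha_hat = n / T ** beta_hat
print(f"n={n}  beta_hat={beta_hat:.3f} (true {beta_true})  alpha_hat={alpha_hat:.3f} (true {alpha_true})")
```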

Publication Date: Wed May 10 2017
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
On Double Stage Shrinkage-Bayesian Estimator for the Scale Parameter of Exponential Distribution

This paper is concerned with a Double Stage Shrinkage Bayesian (DSSB) estimator for lowering the mean squared error of the classical estimator θ̂ of the scale parameter θ of an exponential distribution in a region R around available prior knowledge θ0 about the actual value θ, used as an initial estimate, and for reducing the cost of experimentation. In situations where the experiments are time consuming or very costly, a double stage procedure can be used to reduce the expected sample size needed to obtain the estimator. This estimator is shown to have smaller mean squared error for certain choices of the shrinkage weight factor ψ(·) and of the acceptance region R. Expressions for …
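For orientation, a generic double stage shrinkage form is written out below; the paper's specific weight function ψ(·), acceptance region R, and Bayesian second-stage estimator are not reproduced, so this is only the assumed skeleton.

```latex
% Generic two-stage shrinkage structure (weights, region and second-stage rule are the paper's):
\tilde{\theta} =
\begin{cases}
\psi(\hat{\theta}_{1})\,\hat{\theta}_{1} + \bigl(1-\psi(\hat{\theta}_{1})\bigr)\theta_{0},
  & \hat{\theta}_{1} \in R \ \ \text{(stop after the first sample)},\\[4pt]
\hat{\theta}_{2}, & \hat{\theta}_{1} \notin R \ \ \text{(take the second-stage sample)}.
\end{cases}
```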
