Using Bayesian method to estimate the parameters of Exponential Growth Model with Autocorrelation problem and different values of parameter of correlation-using simulation

In this paper we study the Bayesian method using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior based on previous experiments) for use in the Bayesian method. Because observations of growth phenomena usually depend on one another, correlation arises between them; this problem, called autocorrelation, needs to be treated, and the Bayesian method is used to address it.

The goal of this study is to assess the effect of autocorrelation on Bayesian estimation. To that end, a simulation technique was used in which random samples were generated with known parameters and different correlation values. The computational results show that all estimates are affected by the correlation coefficients used to generate the data, and that the sensitivity of the Bayesian estimators to autocorrelation becomes clearer and more regular as the sample size increases.
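A minimal simulation sketch of the effect described above, assuming a log-scale exponential growth curve, a conjugate normal prior on the growth rate, and AR(1) errors; all of these are illustrative choices, not the paper's exact models or priors:

```python
# Illustrative sketch: how autocorrelated errors distort a simple Bayesian
# estimate of an exponential growth rate. Model, prior, and settings are
# assumptions for demonstration, not the paper's specification.
import numpy as np

rng = np.random.default_rng(1)
a, b, sigma = 5.0, 0.08, 0.2          # assumed growth model: y_t = a * exp(b t)
n, reps = 50, 1000
t = np.arange(1, n + 1)

def ar1_noise(rho, size):
    """Stationary AR(1) errors with innovation s.d. sigma and coefficient rho."""
    e = np.empty(size)
    e[0] = rng.normal(scale=sigma / np.sqrt(1 - rho**2))
    for i in range(1, size):
        e[i] = rho * e[i - 1] + rng.normal(scale=sigma)
    return e

# Conjugate normal prior on the log-scale slope b (an assumed informative prior).
prior_mean, prior_var = 0.0, 1.0

for rho in (0.0, 0.3, 0.6, 0.9):
    est = np.empty(reps)
    for r in range(reps):
        y = np.log(a) + b * t + ar1_noise(rho, n)   # log of the growth curve
        # Posterior mean for the slope under a likelihood that (wrongly)
        # assumes independent errors -- this is what autocorrelation breaks.
        tc, yc = t - t.mean(), y - y.mean()
        like_var = sigma**2 / np.sum(tc**2)
        post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
        b_ls = (tc @ yc) / np.sum(tc**2)
        est[r] = post_var * (prior_mean / prior_var + b_ls / like_var)
    print(f"rho={rho:.1f}  mean(b_hat)={est.mean():.4f}  MSE={np.mean((est - b)**2):.6f}")
```

The MSE of the posterior-mean estimator grows with the correlation coefficient, mirroring the sensitivity the abstract reports.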

 

Publication Date: Sun Apr 01 2018
Journal Name: Journal Of Economics And Administrative Sciences
Solving a three-dimensional transportation problem using linear programming

The transportation problem is one of the most important mathematical methods for making the right decision about moving goods from supply sources to demand centers at the lowest possible cost. In this research, a mathematical model of the three-dimensional transportation problem, in which the transported goods are not homogeneous, was constructed. The simplex method was used to solve the problem of transporting three food products (rice, oil, paste) from warehouses to student areas in Baghdad, and the model proved its efficiency in reducing the total transportation costs of the three products. After the model was solved in the WinQSB program, the results showed that the total cost of transportation is (269, …
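A toy sketch of such a model: three products shipped from two warehouses to two destinations, written as a linear program and solved with scipy's HiGHS solver in place of WinQSB. All costs, supplies, and demands are invented for illustration; the paper's data are not reproduced.

```python
# Three-dimensional (product x warehouse x destination) transportation LP.
import numpy as np
from scipy.optimize import linprog

P, W, D = 3, 2, 2                                    # products, warehouses, destinations
rng = np.random.default_rng(0)
cost = rng.integers(2, 9, size=(P, W, D)).astype(float)   # invented unit costs
supply = np.array([[30, 20], [25, 25], [40, 10]])         # supply[p, w]
demand = np.array([[20, 30], [30, 20], [25, 25]])         # demand[p, d]

# Decision variables x[p, w, d], flattened; minimize total cost subject to
# shipping no more than each warehouse's supply and meeting each demand exactly.
c = cost.ravel()
A_ub, b_ub, A_eq, b_eq = [], [], [], []
for p in range(P):
    for w in range(W):
        row = np.zeros((P, W, D)); row[p, w, :] = 1
        A_ub.append(row.ravel()); b_ub.append(supply[p, w])
    for d in range(D):
        row = np.zeros((P, W, D)); row[p, :, d] = 1
        A_eq.append(row.ravel()); b_eq.append(demand[p, d])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
print("total cost:", res.fun)
print(res.x.reshape(P, W, D).round(1))               # optimal shipping plan
```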

Publication Date: Mon Jun 05 2023
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Poisson Regression and Conway-Maxwell-Poisson Models Using Simulation

Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson model. This study aimed to compare the two models and choose the better one using simulation at different sample sizes (n = 25, 50, 100) with r = 1000 replications. The Matlab program was used to conduct the simulation experiment, and the results showed the superiority of the Poisson model according to both the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
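A sketch of the Poisson side of such a simulation, with assumed coefficients: generate Poisson counts, fit a Poisson regression, and accumulate MSE and AIC over replications. (Fitting the Conway-Maxwell-Poisson model would require a custom likelihood and is omitted here.)

```python
# Simulation skeleton for comparing count-regression fits by MSE and AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
beta = np.array([0.5, 0.8])                     # assumed true intercept and slope

for n in (25, 50, 100):
    mse, aic = [], []
    for _ in range(1000):                       # r = 1000 replications
        x = rng.normal(size=n)
        X = sm.add_constant(x)
        y = rng.poisson(np.exp(X @ beta))       # Poisson counts with log link
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        mse.append(np.mean((fit.params - beta) ** 2))
        aic.append(fit.aic)
    print(f"n={n}: mean MSE={np.mean(mse):.4f}, mean AIC={np.mean(aic):.1f}")
```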

Publication Date: Thu Aug 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Analyzing current and future direction of non-oil primary balance: Case Study of Iraq Using Exponential Smoothing model

In recent years, the non-oil primary balance indicator has been given considerable importance in the public finance of rentier states. These countries rely heavily on this indicator to give a clear and proper picture of the public finance situation in terms of appropriateness and sustainability, because it excludes the effect of oil rents from the financial accounts, providing economic policy makers with sufficient information about how well the economy can create potential added value once the one-sided influence of oil is eliminated. In Iraq, since 2004, the deficit in this indicator has increased, owing to almost complete dependence on oil revenues to finance the budget and the obvious decline of the non-oil sector …
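A minimal sketch of the forecasting step, on synthetic data standing in for the non-oil primary balance series (Holt's additive-trend smoothing is an assumed variant; the paper may use a simpler or damped form):

```python
# Fit an exponential smoothing model to a fiscal series and extrapolate.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(4)
years = pd.period_range("2004", periods=15, freq="Y")
# Synthetic deteriorating balance, not the paper's data.
balance = pd.Series(-10 - 1.5 * np.arange(15) + rng.normal(0, 2, 15), index=years)

fit = ExponentialSmoothing(balance, trend="add").fit()
print(fit.forecast(5))                # projected direction for the next 5 years
```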

Publication Date: Fri Sep 30 2022
Journal Name: Journal Of Economics And Administrative Sciences
A Comparative Study for Estimating the Fractional Parameter of the ARFIMA Model

Long memory analysis is one of the most active areas in econometrics and time series, where various methods have been introduced to identify and estimate the long memory parameter in fractionally integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, whose differencing order is a fractional number called the fractional parameter. To analyze and fit an ARFIMA model, the fractional parameter must be estimated, and there are many methods for estimating it. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first …
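A sketch of that indirect route: estimate the Hurst exponent H by rescaled-range (R/S) analysis and take d = H - 0.5 as the fractional parameter, here on white noise where H ≈ 0.5 and d ≈ 0 are expected. This is one common indirect estimator, not necessarily the paper's:

```python
# R/S estimate of the Hurst exponent, then d = H - 0.5.
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate H from the slope of log(R/S) against log(window size)."""
    n = len(x)
    sizes = np.unique((n / 2 ** np.arange(int(np.log2(n / min_chunk)))).astype(int))
    log_size, log_rs = [], []
    for s in sizes:
        rs = []
        for start in range(0, n - s + 1, s):
            chunk = x[start:start + s]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r, sd = dev.max() - dev.min(), chunk.std()
            if sd > 0:
                rs.append(r / sd)                   # rescaled range of the window
        log_size.append(np.log(s)); log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_size, log_rs, 1)[0]      # slope = Hurst exponent

rng = np.random.default_rng(3)
x = rng.normal(size=2048)             # white noise: expect H near 0.5, d near 0
H = hurst_rs(x)
print(f"H = {H:.3f}, implied fractional parameter d = {H - 0.5:.3f}")
```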

Publication Date: Wed Jun 30 2021
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy and high-precision parameter estimates when evaluating a large number of diagnostic tests at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. A nonlinear logistic regression model is then estimated at the sample size given by each method, using an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data …
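A sketch of the Bennett-inequality route to a sample size: find the smallest n for which the two-sided Bennett tail bound on the sample mean falls below a chosen error level α. The variance, range bound, tolerance, and α below are assumed inputs, not the paper's settings:

```python
# Smallest n satisfying the two-sided Bennett bound
#   2 * exp(-(n * sigma2 / b**2) * h(b*t / sigma2)) <= alpha,
# where h(u) = (1 + u) * log(1 + u) - u.
import math

def bennett_sample_size(sigma2, b, t, alpha):
    """sigma2: variance bound; b: bound on |X - mu|; t: tolerance; alpha: error level."""
    u = b * t / sigma2
    h = (1 + u) * math.log(1 + u) - u
    return math.ceil(b**2 / (sigma2 * h) * math.log(2 / alpha))

# Example: variance 0.25, bound |X - mu| <= 1, tolerance 0.1, 5% error level.
print(bennett_sample_size(sigma2=0.25, b=1.0, t=0.1, alpha=0.05))  # about 208
```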

Publication Date: Tue Jun 01 2021
Journal Name: Baghdad Science Journal
Comparing Weibull Stress-Strength Reliability Bayesian Estimators for Singly Type II Censored Data under Different Loss Functions

Bayesian estimation of reliability in the stress (Y) - strength (X) model, which describes the life of a component with strength X subjected to stress Y (the component fails if and only if, at any time, the applied stress exceeds its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis of R is considered when X and Y are independent Weibull random variables with common shape parameter α, in order to study the effect of each of the two different scale parameters β and λ, using three different loss functions [weighted, quadratic, and entropy] under two different prior functions [Gamma and extension of Jeffreys' prior] …
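A quick numerical sketch of the stress-strength quantity itself, under one common parameterization (survival functions exp(-βx^α) for strength and exp(-λy^α) for stress, an assumption; the paper's parameterization may differ), where R = P(Y < X) has the closed form λ/(β + λ):

```python
# Monte Carlo check of R = P(Y < X) for Weibulls with common shape alpha.
import numpy as np

rng = np.random.default_rng(11)
alpha, beta, lam = 2.0, 1.0, 3.0      # assumed shape and scale-type parameters
n = 200_000

# Inverse-transform sampling: if E ~ Exp(1), then (E / rate)**(1/alpha) has
# survival function exp(-rate * x**alpha).
X = (rng.exponential(size=n) / beta) ** (1 / alpha)   # strength
Y = (rng.exponential(size=n) / lam) ** (1 / alpha)    # stress

print("Monte Carlo R :", np.mean(Y < X))
print("Closed form R :", lam / (beta + lam))          # 0.75 here
```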

Publication Date: Thu Jun 01 2017
Journal Name: Journal Of Economics And Administrative Sciences
A Comparison Between the Maximum Likelihood Method and the Bayesian Method for Estimating Some Non-Homogeneous Poisson Process Models

Abstract

The non-homogeneous Poisson process is one of the statistical subjects of importance in other sciences, with wide application in different areas such as queueing, repairable systems, computer and communication systems, reliability theory, and many others. It is also used to model phenomena that occur in a non-constant way over time (events whose behavior changes with time).

This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It considers two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the …
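For the power-law model, the maximum likelihood side has a well-known closed form, sketched below on simulated event times (the Bayesian counterpart and the Musa-Okumoto model are not shown):

```python
# MLE for the power-law NHPP with mean function m(t) = a * t**b on (0, T]:
#   b_hat = n / sum(log(T / t_i)),   a_hat = n / T**b_hat.
import numpy as np

rng = np.random.default_rng(5)
a, b, T = 2.0, 0.6, 100.0             # assumed true parameters and horizon

# Simulate: given N(T) = n, the event times are iid with CDF (t/T)**b.
n = rng.poisson(a * T**b)
times = np.sort(T * rng.uniform(size=n) ** (1 / b))

b_hat = n / np.sum(np.log(T / times))
a_hat = n / T**b_hat
print(f"b_hat={b_hat:.3f} (true {b}), a_hat={a_hat:.3f} (true {a})")
```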

Publication Date: Thu Jul 31 2025
Journal Name: Journal Of Administration And Economics
Using the Maximum Likelihood Method with a Suggested Weight to Estimate the Effect of Some Pollutants on the Tigris River - City of Kut

The aim of this research is to use a robust trimming technique, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers: the MLE loses its advantages because of the bad influence of the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers; these methods possess robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for obtaining more acceptable results, and weights can be used to increase the efficiency and strength of the resulting estimates via the maximum weighted trimmed likelihood (MWTL). In order to perform the …
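A rough sketch of the trimmed-likelihood idea for a simple normal location model: keep the h observations closest to the current fit (equivalently, those with the largest likelihood contributions) and re-estimate until stable; adding weights to the retained points gives an MWTL-style variant. The inverse-distance weight below is an invented choice, not the paper's suggested weight:

```python
# Concentration-style maximum trimmed likelihood for a location parameter.
import numpy as np

def trimmed_mean_estimate(x, h, weights=None, iters=50):
    """Iterate 'fit on the h closest points', optionally weighting them."""
    mu = np.median(x)                            # robust starting value
    for _ in range(iters):
        keep = np.argsort(np.abs(x - mu))[:h]    # h smallest residuals
        w = np.ones(h) if weights is None else weights(np.abs(x[keep] - mu))
        new_mu = np.average(x[keep], weights=w)
        if np.isclose(new_mu, mu):
            break
        mu = new_mu
    return mu

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(10, 1, 90), rng.normal(50, 1, 10)])  # 10% outliers

print("plain mean (ML) :", x.mean())                       # dragged by outliers
print("MTL, h=90       :", trimmed_mean_estimate(x, 90))
print("MWTL (weighted) :", trimmed_mean_estimate(x, 90, weights=lambda d: 1 / (1 + d)))
```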

Publication Date: Mon Oct 22 2018
Journal Name: Journal Of Economics And Administrative Sciences
Using the Mehar method to change the fuzzy cost of a fuzzy linear model with practical application

Many production companies suffer large losses because of high production costs and low profits, for several reasons: high raw material prices, the absence of taxes on imported goods, and the deactivation of the consumer protection, national product, and customs laws. As a result, most consumers buy imported goods, which are characterized by modern specifications and low prices.

The production company also suffers from uncertainty in costs, production volume, sales, availability of raw materials, and the number of workers, because these vary with the seasons of the year.

In this research, a fuzzy linear programming model with fuzzy figures was adopted …
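The Mehar method itself is not reproduced here; as a flavor of how fuzzy costs can be handled, the sketch below reduces triangular fuzzy costs with the common ranking function R(a, b, c) = (a + 2b + c)/4 and solves the resulting crisp linear program. Costs and constraints are invented:

```python
# Defuzzify triangular fuzzy costs with a ranking function, then solve a crisp LP.
import numpy as np
from scipy.optimize import linprog

# Fuzzy unit costs for two products, each a triangular number (low, mode, high).
fuzzy_cost = np.array([[3.0, 4.0, 6.0],
                       [5.0, 7.0, 8.0]])
crisp_cost = (fuzzy_cost[:, 0] + 2 * fuzzy_cost[:, 1] + fuzzy_cost[:, 2]) / 4

# Minimize cost subject to producing at least 10 units in total and at least
# 3 units of each product (invented constraints).
res = linprog(c=crisp_cost,
              A_ub=[[-1.0, -1.0]], b_ub=[-10.0],
              bounds=[(3, None), (3, None)],
              method="highs")
print("crisp costs:", crisp_cost, "-> plan:", res.x, "total cost:", res.fun)
```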

Publication Date: Thu Sep 01 2011
Journal Name: Journal Of Economics And Administrative Sciences
The use of the least squares and weighted least squares methods in estimating the parameters and designing acceptance sampling plans for the generalized exponential distribution

Acceptance sampling plans for the generalized exponential distribution, when the life-test experiment is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ) (the shape and scale parameters) are estimated by LSE and WLSE, and the best estimators for various sample sizes are used to find the ratio of the true mean life to a pre-determined one, as well as the smallest possible sample size required to ensure the producer's risk, with a pre-fixed probability (1 - P*). The results of the estimations and of the sampling plans are provided in tables.

Keywords: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
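A sketch of the LSE/WLSE step for the generalized exponential CDF F(x) = (1 - e^(-λx))^α: fit the CDF at the order statistics to the plotting positions i/(n+1), using the classical weights (n+1)²(n+2)/(i(n-i+1)) for WLSE. Data are synthetic; the article's sampling-plan tables are not reproduced:

```python
# LSE and WLSE for the generalized exponential distribution.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
alpha_true, lam_true = 2.0, 1.5       # assumed true shape and scale
n = 100
# Inverse-CDF sampling: x = -log(1 - u**(1/alpha)) / lam for u ~ Uniform(0,1).
x = np.sort(-np.log(1 - rng.uniform(size=n) ** (1 / alpha_true)) / lam_true)

i = np.arange(1, n + 1)
p = i / (n + 1)                                   # expected CDF values
w = (n + 1) ** 2 * (n + 2) / (i * (n - i + 1))    # classical WLSE weights

def objective(theta, weights):
    alpha, lam = theta
    F = (1 - np.exp(-lam * x)) ** alpha           # model CDF at order statistics
    return np.sum(weights * (F - p) ** 2)

lse = minimize(objective, x0=[1.0, 1.0], args=(np.ones(n),),
               bounds=[(1e-6, None)] * 2)
wlse = minimize(objective, x0=[1.0, 1.0], args=(w,),
                bounds=[(1e-6, None)] * 2)
print("LSE  (alpha, lam):", lse.x.round(3))
print("WLSE (alpha, lam):", wlse.x.round(3))
```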
