Comparison of some Bayesian estimation methods for type-I generalized extreme value distribution with simulation

The Weibull distribution is a member of the Type-I Generalized Extreme Value (GEV) family and plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a decisive role in estimating the parameters of the GEV distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions, using Monte Carlo simulation. The performance of these methods is assessed by their accuracy and computational efficiency in estimating the scale parameter of the Weibull distribution. To evaluate performance, we generate simulated datasets with different sample sizes and varying parameter values. A pre-estimation shrinkage technique is suggested to enhance the precision of estimation. The simulation experiments showed that the Bayesian shrinkage estimator and the shrinkage pre-estimation method under the squared error loss function outperform the other methods because they give the smallest mean square error. Overall, our findings highlight the advantages of shrinkage Bayesian estimation methods for the proposed distribution. Researchers and practitioners in fields reliant on extreme value analysis can benefit from these findings when selecting Bayesian estimation techniques for modeling extreme events accurately and efficiently.
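The comparison workflow the abstract describes can be sketched in miniature. The snippet below is a minimal illustration, not the paper's actual study: it uses the exponential special case (Weibull with shape fixed at 1), a hypothetical conjugate Gamma(a0, b0) prior on the rate deliberately centered near the true value, and the posterior mean as the Bayes estimator under squared error loss, then ranks the estimators by Monte Carlo mean square error.

```python
import random

random.seed(42)

def mc_compare(theta=2.0, n=20, reps=2000, a0=1.0, b0=2.0):
    """Monte Carlo MSE of the ML and conjugate-Bayes estimators of an
    exponential rate (Weibull with shape fixed at 1 for simplicity).
    The Gamma(a0, b0) prior parameters are illustrative assumptions."""
    lam = 1.0 / theta                       # true rate
    mse_mle = mse_bayes = 0.0
    for _ in range(reps):
        x = [random.expovariate(lam) for _ in range(n)]
        s = sum(x)
        lam_mle = n / s                     # maximum likelihood estimator
        lam_bayes = (a0 + n) / (b0 + s)     # posterior mean under squared error loss
        mse_mle += (lam_mle - lam) ** 2
        mse_bayes += (lam_bayes - lam) ** 2
    return mse_mle / reps, mse_bayes / reps

mse_mle, mse_bayes = mc_compare()
```

With the prior centered near the true rate, the shrinkage of the posterior mean reduces variance, so the Bayes estimator attains the lower MSE, mirroring the ranking criterion the paper uses.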

Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Some Robust Methods for Estimating the Power Spectrum in ARMA Models: A Simulation Study

Abstract:

Robust statistics is known for its resistance to errors caused by deviations from the distributional assumptions underlying statistical procedures (reasonable, approximately met, asymptotically unbiased, reasonably small bias, efficient). It applies to data drawn from a wide range of probability distributions, whether they follow a normal distribution or a mixture of distributions with different departures from the standard model.

The power spectrum function plays a principal role in the analysis of stationary random processes: random variables organized according to time, which may be discrete or continuous. Such a process can be described by measuring its total power as a function of frequency.
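As a baseline for the spectral ideas above, here is a plain (non-robust) periodogram sketch under assumed data: a naive O(n²) DFT, a sinusoid at a known frequency bin buried in Gaussian noise, and the peak bin recovered from the squared DFT magnitudes. The robust variants the paper studies would downweight or replace outlying observations before this step.

```python
import cmath
import math
import random

def periodogram(x):
    """Naive periodogram: |DFT|^2 / n at each nonnegative frequency bin."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

random.seed(0)
n = 128
# assumed test signal: sinusoid at frequency bin 16, plus Gaussian noise
x = [math.sin(2 * math.pi * 16 * t / n) + 0.3 * random.gauss(0, 1)
     for t in range(n)]
spec = periodogram(x)
peak_bin = max(range(len(spec)), key=spec.__getitem__)
```

The sinusoid concentrates its power in a single bin, so the largest periodogram ordinate recovers the signal frequency despite the noise.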

Publication Date
Sun Sep 22 2019
Journal Name
Baghdad Science Journal
Estimation of Survival Function for Rayleigh Distribution by Ranking Function

In this article, the probability density function of the Rayleigh distribution is fitted and derived using the ordinary least squares estimator method and the ranked set estimator method. A confidence interval for the scale parameter of the Rayleigh distribution is then constructed. A new method using   is applied for the fuzzy scale parameter. Finally, the survival and hazard functions are constructed for the two ranking functions to show which one is best.
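For orientation, the Rayleigh survival function the abstract works with can be sketched as follows. This is a maximum likelihood sketch, not the paper's OLS or ranked-set estimators, and the parameter values are assumptions for illustration.

```python
import math
import random

def rayleigh_mle_sigma(data):
    """ML estimate of the Rayleigh scale: sigma_hat^2 = sum(t^2) / (2n)."""
    return math.sqrt(sum(t * t for t in data) / (2 * len(data)))

def survival(t, sigma):
    """Rayleigh survival function S(t) = exp(-t^2 / (2 sigma^2))."""
    return math.exp(-t * t / (2 * sigma * sigma))

random.seed(1)
sigma_true = 2.0
# inverse-transform Rayleigh variates: t = sigma * sqrt(-2 ln(1 - U))
sample = [sigma_true * math.sqrt(-2 * math.log(1 - random.random()))
          for _ in range(5000)]
sigma_hat = rayleigh_mle_sigma(sample)
s_hat = survival(2.0, sigma_hat)   # estimated survival probability at t = 2
```

Plugging the estimated scale into S(t) gives a plug-in estimate of the survival function at any time point.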

Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Maximum Likelihood Method And Bayesian Method For Estimating Some Non-Homogeneous Poisson Processes Models

Abstract

The non-homogeneous Poisson process is one of the statistical subjects of importance to other sciences, with wide application in areas such as queueing (waiting lines), repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).

This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It carries out two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th…
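The power law model named above can be sketched briefly. The snippet below is an illustration under assumed parameter values: it simulates a non-homogeneous Poisson process with mean function m(t) = (t/θ)^β by inverting a unit-rate homogeneous process, then recovers β and θ with the standard time-truncated maximum likelihood (Crow/AMSAA) formulas; the Musa-Okumoto variant is not shown.

```python
import math
import random

def simulate_power_law_nhpp(beta, theta, T):
    """Event times of an NHPP with mean function m(t) = (t/theta)**beta,
    obtained by inverting a unit-rate homogeneous process."""
    times, s = [], 0.0
    while True:
        s += random.expovariate(1.0)      # unit-rate arrival
        t = theta * s ** (1.0 / beta)     # invert m(t) = s
        if t > T:
            return times
        times.append(t)

def power_law_mle(times, T):
    """Time-truncated MLEs for the power law model (Crow/AMSAA form)."""
    n = len(times)
    beta_hat = n / sum(math.log(T / t) for t in times)
    theta_hat = T / n ** (1.0 / beta_hat)
    return beta_hat, theta_hat

random.seed(7)
times = simulate_power_law_nhpp(beta=1.5, theta=1.0, T=100.0)
beta_hat, theta_hat = power_law_mle(times, T=100.0)
```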

Publication Date
Thu Aug 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Some Estimation methods for the two models SPSEM and SPSAR for spatially dependent data

Abstract

In this paper, some semi-parametric spatial models were estimated: the semi-parametric spatial error model (SPSEM), which suffers from spatial error dependence, and the semi-parametric spatial autoregressive model (SPSAR). The method of maximum likelihood was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, while non-parametric methods were used to estimate the smoothing function m(x) for the two models. These non-parametric methods include the local linear estimator (LLE), which requires finding the smoo…
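The local linear estimator (LLE) named in the abstract can be sketched on its own, without the spatial SPSEM/SPSAR machinery. The snippet below is a minimal illustration under assumptions: a Gaussian kernel, an arbitrary bandwidth, and a simulated non-spatial regression function.

```python
import math
import random

def local_linear(x0, xs, ys, h):
    """Local linear estimate of m(x0) with a Gaussian kernel and bandwidth h.
    Uses the closed-form weights w_i = K_i * (S2 - (x_i - x0) * S1)."""
    k = [math.exp(-0.5 * ((xi - x0) / h) ** 2) for xi in xs]
    s1 = sum(ki * (xi - x0) for ki, xi in zip(k, xs))
    s2 = sum(ki * (xi - x0) ** 2 for ki, xi in zip(k, xs))
    w = [ki * (s2 - s1 * (xi - x0)) for ki, xi in zip(k, xs)]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

random.seed(3)
# assumed smooth function m(x) = sin(2*pi*x), observed with noise
xs = [random.uniform(0, 1) for _ in range(400)]
ys = [math.sin(2 * math.pi * xi) + 0.1 * random.gauss(0, 1) for xi in xs]
m_hat = local_linear(0.5, xs, ys, h=0.05)   # true m(0.5) = 0
```

Unlike a simple kernel average, the local linear fit corrects the boundary and slope bias by fitting a weighted straight line around each evaluation point.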

Publication Date
Sun Jun 02 2013
Journal Name
Baghdad Science Journal
Comparison of Maximum Likelihood and some Bayes Estimators for Maxwell Distribution based on Non-informative Priors

In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been developed in order to gain insight into performance on small, moderate, and large samples. The performance of these estimators has been explored numerically under different conditions, and their efficiency compared according to the mean square error (MSE). The results of the comparison by MSE show that the efficiency of the Bayes est…
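A miniature version of this comparison can be sketched as follows. Parametrizing the Maxwell distribution by λ = 1/θ², the total T = Σx² is sufficient, the MLE is 3n/T, and under an extension-of-Jeffreys prior π(λ) ∝ λ^(-c) the posterior mean under squared error loss is (3n - 2c + 2)/T. The value of c and the sample settings below are illustrative assumptions, not the paper's.

```python
import math
import random

random.seed(5)

def rmaxwell(theta):
    """Maxwell variate: theta times the norm of three standard normals."""
    return theta * math.sqrt(sum(random.gauss(0, 1) ** 2 for _ in range(3)))

theta_true = 1.5
lam_true = 1.0 / theta_true ** 2          # lambda = 1/theta^2
n = 50
x = [rmaxwell(theta_true) for _ in range(n)]
T = sum(xi * xi for xi in x)              # sufficient statistic

lam_mle = 3 * n / T                       # maximum likelihood estimator
c = 1.5                                   # extension-of-Jeffreys exponent (assumed)
lam_bayes = (3 * n - 2 * c + 2) / T       # posterior mean under squared error loss
```

Note that c = 0.5 recovers the plain Jeffreys prior, for which the posterior mean coincides with the MLE up to the constant in the numerator.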

Publication Date
Tue Jun 01 2021
Journal Name
Baghdad Science Journal
Comparing Weibull Stress – Strength Reliability Bayesian Estimators for Singly Type II Censored Data under Different loss Functions

The Bayesian estimation of reliability for the stress (Y) – strength (X) model, which defines the life of a component with strength X under stress Y (the component fails if and only if, at any time, the applied stress is greater than its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis has been considered for R when the two variables X and Y are independent Weibull random variables with common parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three different loss functions [weighted, quadratic, and entropy] under two different prior functions [Gamma and extension of Jeffreys]…
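The quantity R = P(Y < X) can be checked by simulation. For Weibull variables with a common shape a, raising to the a-th power gives exponentials, which yields the closed form R = β^a / (β^a + λ^a); the sketch below verifies a Monte Carlo estimate against it. The parameter values are assumptions for illustration, and no censoring or Bayesian machinery is included.

```python
import math
import random

random.seed(11)

def rweibull(shape, scale):
    """Weibull variate via inverse transform."""
    return scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)

a, beta, lam = 2.0, 2.0, 1.0   # common shape, strength scale, stress scale (assumed)
N = 20000
# count trials where stress Y (scale lam) stays below strength X (scale beta)
hits = sum(rweibull(a, lam) < rweibull(a, beta) for _ in range(N))
R_mc = hits / N

# closed form for common shape: R = beta^a / (beta^a + lam^a)
R_exact = beta ** a / (beta ** a + lam ** a)
```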

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Bayesian methods to estimate sub-population

The aim of the research is to estimate a hidden population. Here, the number of drug users in Baghdad was estimated for males in the age group 15-60 years, based on Bayesian models. These models are used to treat some of the bias in the Killworth method adopted in many countries of the world.

Four models were used: random degree, barrier effects, and transmission bias. The first model, random degree, is an extension of the Killworth model that adds random effects such as variance and uncertainty in the size of the personal network; when it is expanded by adding the fact that respondents have different propensities, mixing non-random with random variables to produce…

Publication Date
Sun Jan 01 2017
Journal Name
Australian Journal Of Basic And Applied Sciences
Proposed Algorithm for Gumbel Distribution Estimation

The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods to estimate the two parameters of the Gumbel distribution, known as maximum likelihood, the method of moments, and, more recently, the re-sampling method called the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, such as the principle of nearest neighbors used in computer science, and in particular artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this principle of nearest neighbors has useful statistical featu…
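Of the traditional methods listed, the method of moments is the simplest to sketch: for Gumbel(μ, β), the mean is μ + γβ and the variance is π²β²/6 (γ is the Euler-Mascheroni constant), so matching sample moments gives β̂ = s·√6/π and μ̂ = x̄ − γβ̂. The parameter values below are assumptions for illustration.

```python
import math
import random
import statistics

random.seed(13)
EULER_GAMMA = 0.5772156649

def rgumbel(mu, beta):
    """Gumbel variate via inverse transform: mu - beta * ln(-ln U)."""
    return mu - beta * math.log(-math.log(random.random()))

mu_true, beta_true = 3.0, 2.0
x = [rgumbel(mu_true, beta_true) for _ in range(5000)]

# method-of-moments estimates from the sample mean and standard deviation
beta_hat = statistics.stdev(x) * math.sqrt(6) / math.pi
mu_hat = statistics.mean(x) - EULER_GAMMA * beta_hat
```

The meta-heuristic alternatives the abstract mentions would instead search the parameter space numerically, avoiding closed-form moment or likelihood equations.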

Publication Date
Mon Jul 20 2020
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Estimation of the Reliability Function of Basic Gompertz Distribution under Different Priors

In this paper, some estimators for the reliability function R(t) of the Basic Gompertz (BG) distribution have been obtained, namely the maximum likelihood estimator and Bayesian estimators under the general entropy loss function, assuming a non-informative prior (the Jeffreys prior) and informative priors represented by the Gamma and inverted Levy priors. A Monte Carlo simulation is conducted to compare the performance of all estimates of R(t), based on the integrated mean squared error.
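The maximum likelihood part can be sketched under an assumed one-parameter form of the distribution: taking R(t) = exp(−θ(e^t − 1)) as the Basic Gompertz reliability function (an assumption; the paper's exact parametrization may differ), the MLE works out to θ̂ = n / Σ(e^t − 1), and a plug-in estimate of R(t) follows.

```python
import math
import random

random.seed(17)

def rbasic_gompertz(theta):
    """Inverse-transform variate for R(t) = exp(-theta*(e^t - 1)) (assumed form)."""
    u = random.random()
    return math.log(1.0 - math.log(1.0 - u) / theta)

def theta_mle(data):
    """MLE under the assumed form: theta_hat = n / sum(e^t - 1)."""
    return len(data) / sum(math.exp(t) - 1.0 for t in data)

def reliability(t, theta):
    return math.exp(-theta * (math.exp(t) - 1.0))

theta_true = 1.2                 # illustrative value
data = [rbasic_gompertz(theta_true) for _ in range(4000)]
theta_hat = theta_mle(data)
r_hat = reliability(0.5, theta_hat)   # plug-in reliability at t = 0.5
```

The Bayesian estimators in the paper would replace θ̂ with a posterior summary under the general entropy loss and the stated priors.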

Publication Date
Sun Aug 13 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Estimation of the Optimal Threshold Value for Image Edge Detection

A new approach is presented in this study to determine the optimal edge detection threshold value. This approach is based on extracting small homogeneous blocks from targets with unequal means. From these blocks, a small image with known edges is generated (the edges are the lines between adjacent blocks), so these simulated edges can be taken as true edges. The true simulated edges are compared with the edges detected in the small generated image using different threshold values. The comparison is based on computing the mean square error between the simulated edge image and the edge image produced by the edge detector methods. The mean square error is computed for the total edge image (Er), for the edge regio…
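The procedure above can be sketched on a toy example. The snippet below is an illustration under assumptions, not the paper's method: two homogeneous blocks with different means form a synthetic image with one known edge column, a simple horizontal-difference detector produces candidate edge maps, and the threshold with the least mean square error against the true edge map is selected.

```python
import random

random.seed(19)
H, W = 16, 16

# synthetic image: two homogeneous blocks with unequal means, plus noise
img = [[(10.0 if x < W // 2 else 40.0) + random.gauss(0, 2) for x in range(W)]
       for y in range(H)]

# true edge map: the single column where the two blocks meet
true_edge = [[1 if x == W // 2 else 0 for x in range(W)] for y in range(H)]

def edge_map(img, thr):
    """Binary edge map: horizontal differences above the threshold."""
    return [[1 if x > 0 and abs(img[y][x] - img[y][x - 1]) > thr else 0
             for x in range(W)] for y in range(H)]

def mse(a, b):
    """Mean square error between two equally sized binary maps."""
    return sum((a[y][x] - b[y][x]) ** 2
               for y in range(H) for x in range(W)) / (H * W)

# choose the candidate threshold with the least error against the true map
candidates = [2, 5, 10, 15, 20, 25]
best = min(candidates, key=lambda thr: mse(edge_map(img, thr), true_edge))
```

Too low a threshold floods the map with noise-driven false edges, too high a threshold misses true edge pixels; the MSE criterion balances the two exactly as the abstract describes.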
