In this research, the parameters of the Type 1 Gumbel distribution for maximum values are estimated using two methods: the Method of Moments (MoM) and the Modified Moments (MM) method. Simulation is used to compare the estimation methods and identify the best one: random data following the Gumbel distribution were generated under three models of real parameter values, for different sample sizes, with R = 500 replicates. The results were placed in tables prepared for the purpose of comparison, which was made depending on the mean squared error (MSE).
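As a minimal sketch of the moments side of this comparison (the modified-moments variant is omitted, and the values μ = 10, β = 2 are illustrative, not taken from the paper's three models), the MoM estimates follow from mean = μ + γβ and variance = π²β²/6, where γ is the Euler-Mascheroni constant:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_mom(x):
    """Method-of-moments estimates for the Type 1 Gumbel (maxima) distribution.
    mean = mu + gamma*beta, variance = pi^2 * beta^2 / 6."""
    beta = np.std(x, ddof=1) * np.sqrt(6) / np.pi
    mu = np.mean(x) - EULER_GAMMA * beta
    return mu, beta

def mse_study(mu=10.0, beta=2.0, n=50, R=500, seed=0):
    """Generate R Gumbel samples of size n and report the MSE of the MoM estimates."""
    rng = np.random.default_rng(seed)
    est = np.array([gumbel_mom(rng.gumbel(loc=mu, scale=beta, size=n))
                    for _ in range(R)])
    return ((est - np.array([mu, beta])) ** 2).mean(axis=0)  # MSE for (mu, beta)

mse_mu, mse_beta = mse_study()
```

Repeating `mse_study` for each sample size and parameter model reproduces the kind of MSE table the abstract describes.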
In this paper, we illustrate a gamma regression model assuming that the dependent variable (Y) follows a gamma distribution and that its mean (μ) is related to a linear predictor through the identity link function g(μ) = μ. The model also contains a shape parameter which is not constant but depends on its own linear predictor through a log link function. We estimate the parameters of the gamma regression using two estimation methods, maximum likelihood and Bayesian, and compare these methods using the criterion of mean squared error (MSE); the two methods were applied to real data.
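A sketch of the maximum likelihood side of such a model, with an identity link for the mean and a log link for the shape (all covariates and true parameter values below are illustrative assumptions, not the paper's real data):

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(1.0, 3.0, n)
X = np.column_stack([np.ones(n), x])   # design matrix for the mean (identity link)
Z = np.column_stack([np.ones(n), x])   # design matrix for the shape (log link)

beta_true = np.array([2.0, 1.5])
gamma_true = np.array([0.5, 0.3])
mu = X @ beta_true                     # mu = X beta        (identity link)
alpha = np.exp(Z @ gamma_true)         # alpha = exp(Z g)   (log link)
y = rng.gamma(shape=alpha, scale=mu / alpha)  # E[Y] = mu, shape = alpha

def negloglik(theta):
    b, g = theta[:2], theta[2:]
    m = X @ b
    if np.any(m <= 0):                 # identity link requires a positive mean
        return np.inf
    a = np.exp(Z @ g)
    return -np.sum(stats.gamma.logpdf(y, a=a, scale=m / a))

fit = optimize.minimize(negloglik, x0=np.array([1.0, 1.0, 0.0, 0.0]),
                        method="Nelder-Mead",
                        options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
beta_hat, gamma_hat = fit.x[:2], fit.x[2:]
```

The Bayesian fit would place priors on (β, γ) and sample the posterior; only the likelihood machinery is shown here.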
In this paper, the process of compounding two distributions is discussed using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new lifetime distribution with three parameters. Its advantage is that the failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as the expectation, variance, cumulative distribution function, reliability function, and failure rate function.
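The compounding scheme can be checked by simulation: if T = min(X₁, …, X_N) with N zero-truncated Poisson and the X_i Weibull, the probability generating function of N gives the survival function S(t) = (e^{λ S_X(t)} − 1)/(e^λ − 1), where S_X is the Weibull survival function. A sketch under illustrative parameter values:

```python
import numpy as np

def rztp(lam, size, rng):
    """Sample from the zero-truncated Poisson by redrawing zeros."""
    n = rng.poisson(lam, size)
    while np.any(n == 0):
        zero = n == 0
        n[zero] = rng.poisson(lam, zero.sum())
    return n

def rcompound(lam, k, theta, size, rng):
    """T = min(X_1, ..., X_N), N ~ ZTP(lam), X_i ~ Weibull(shape k, scale theta)."""
    N = rztp(lam, size, rng)
    return np.array([theta * rng.weibull(k, m).min() for m in N])

def survival(t, lam, k, theta):
    """Closed-form survival of the compound via the ZTP generating function."""
    sx = np.exp(-(t / theta) ** k)          # Weibull survival S_X(t)
    return np.expm1(lam * sx) / np.expm1(lam)

rng = np.random.default_rng(2)
T = rcompound(lam=1.5, k=2.0, theta=1.0, size=100_000, rng=rng)
empirical = (T > 0.5).mean()
theoretical = survival(0.5, 1.5, 2.0, 1.0)
```

The empirical survival at t = 0.5 should agree with the closed form up to Monte Carlo error.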
The main problem when dealing with fuzzy data variables is that a model representing the data cannot be formed by the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates, invalidating the method when the problem of multicollinearity exists. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, the problem of multicollinearity in the fuzzy data can be detected using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared using
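For the crisp-input case, the Variance Inflation Factor mentioned above is the standard diagnostic VIF_j = 1/(1 − R_j²), where R_j² comes from regressing column j on the remaining columns. A minimal sketch (the fuzzy machinery is not modeled here; the data are illustrative):

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of a crisp design matrix X."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        # Regress column j on the other columns (with an intercept).
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        r2 = 1 - resid.var() / y.var()
        out[j] = 1 / (1 - r2)
    return out

rng = np.random.default_rng(3)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)
vifs = vif(np.column_stack([x1, x2, x3]))   # vifs[0], vifs[1] large; vifs[2] near 1
```

Values well above the usual threshold of 10 flag the collinear pair, which is the situation the FBRE method is designed to handle.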
This paper deals with constructing a mixed probability distribution from the exponential distribution with scale parameter (β) and the gamma distribution with parameters (2, β), with mixing proportions ( ). First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution ( , β) are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Squares Method (DLSM). The comparison is done using a simulation procedure, and all the results are explained in tables.
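The mixture's basic functions can be sketched directly from the two component densities (the mixing proportion p = 0.4 and β = 2 used below are illustrative, since the paper's proportions are not shown here):

```python
import numpy as np
from scipy import integrate

def mix_pdf(x, p, beta):
    """Mixture of Exponential(scale beta) and Gamma(2, beta) densities,
    with weight p on the exponential component."""
    expo = np.exp(-x / beta) / beta
    gam2 = x * np.exp(-x / beta) / beta ** 2
    return p * expo + (1 - p) * gam2

def mix_cdf(x, p, beta):
    expo = 1 - np.exp(-x / beta)
    gam2 = 1 - (1 + x / beta) * np.exp(-x / beta)   # Gamma(2, beta) c.d.f.
    return p * expo + (1 - p) * gam2

def reliability(x, p, beta):
    """R(x) = 1 - F(x)."""
    return 1 - mix_cdf(x, p, beta)

# Sanity check: the mixture density integrates to 1.
total, _ = integrate.quad(mix_pdf, 0, np.inf, args=(0.4, 2.0))
```

These closed forms are what the three estimation methods in the abstract are fitted against.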
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability and engineering and in analyzing survival functions; therefore, researchers have carried out extended studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the first and second Bayes methods, the least squares method, and the Jackknife method, which depends primarily on the maximum likelihood method and then on the first Bayes method. The methods are then compared using simulation; to accomplish this task, different sample sizes have been adopted by the researcher.
Abstract
The Non-Homogeneous Poisson process is considered one of the statistical subjects that has importance in other sciences and wide application in different areas, such as queueing and repairable systems, computer and communication systems, and reliability theory, among many others. It is also used in modeling phenomena that occur in a non-constant way over time (all events whose rate changes with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It carries out two models of the Non-Homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the
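The power law model above has mean function m(t) = (t/θ)^β, which makes both simulation (by time-transforming a homogeneous process) and maximum likelihood estimation straightforward. A sketch under assumed parameter values, using the standard time-truncated (Crow/AMSAA) estimators:

```python
import numpy as np

def simulate_power_law_nhpp(beta, theta, T, rng):
    """Simulate event times of an NHPP with mean function m(t) = (t/theta)**beta
    on (0, T] by time-transforming a unit-rate homogeneous Poisson process."""
    m_T = (T / theta) ** beta
    n = rng.poisson(m_T)
    u = np.sort(rng.uniform(0, m_T, n))   # HPP(1) points on (0, m(T)]
    return theta * u ** (1 / beta)        # invert m(t) to recover event times

def power_law_mle(times, T):
    """Crow/AMSAA maximum-likelihood estimates for a time-truncated sample:
    beta_hat = n / sum(log(T / t_i)),  theta_hat = T / n**(1/beta_hat)."""
    n = len(times)
    beta_hat = n / np.sum(np.log(T / times))
    theta_hat = T / n ** (1 / beta_hat)
    return beta_hat, theta_hat

rng = np.random.default_rng(4)
times = simulate_power_law_nhpp(beta=2.0, theta=1.0, T=30.0, rng=rng)
beta_hat, theta_hat = power_law_mle(times, T=30.0)
```

The Musa-Okumoto (logarithmic) model would follow the same pattern with mean function m(t) = (1/φ) ln(1 + λφt); it is not sketched here.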
In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived, along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of the Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been developed in order to gain insight into the performance on small, moderate, and large samples. The performance of these estimators has been explored numerically under different conditions. The efficiency of the estimators was compared according to the mean squared error (MSE). The results of the comparison by MSE show that the efficiency of the Bayes estimators
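One standard version of this comparison: writing θ = σ² for the Maxwell parameter, the MLE is S/(3n) with S = Σx², and under the Jeffreys prior p(θ) ∝ 1/θ the posterior is inverse-gamma, giving posterior mean S/(3n − 2) under squared error loss. A sketch of the MSE study (the paper's exact priors and loss functions may differ):

```python
import numpy as np
from scipy import stats

def maxwell_estimates(x):
    """MLE and Jeffreys-prior Bayes (squared error loss) estimators of theta = sigma^2
    for the Maxwell distribution, with S = sum(x_i^2)."""
    n, S = len(x), np.sum(x ** 2)
    return S / (3 * n), S / (3 * n - 2)

def mse_compare(sigma=1.5, n=20, R=500, seed=5):
    """Compare the two estimators of theta = sigma^2 by simulated MSE."""
    rng = np.random.default_rng(seed)
    theta = sigma ** 2
    mle, bayes = np.empty(R), np.empty(R)
    for r in range(R):
        x = stats.maxwell.rvs(scale=sigma, size=n, random_state=rng)
        mle[r], bayes[r] = maxwell_estimates(x)
    return np.mean((mle - theta) ** 2), np.mean((bayes - theta) ** 2)

mse_mle, mse_bayes = mse_compare()
```

Varying `n` across small, moderate, and large values reproduces the structure of the simulation study described above.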
Researchers have shown increased interest in recent years in determining the optimum sample size, in order to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated with the sample size obtained by each method in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution, with extra advantages.
The shape of the density function of this distribution and its most important characteristics are explained, and the two parameters (a, b) and the reliability function of this distribution are estimated using the maximum likelihood method (MLE) and Bayes methods. Simulation experiments are conducted to explain the behaviour of the estimation methods for different sample sizes, depending on the mean squared error criterion; the results show that the Bayes method is better.
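A sketch of the maximum likelihood part (the Bayes estimation is omitted, and the true values a = 2, b = 3 are illustrative), using the closed-form quantile function for sampling and the reliability function R(t) = (1 − t^a)^b:

```python
import numpy as np
from scipy import optimize

def kumaraswamy_sample(a, b, size, rng):
    """Inverse-CDF sampling from F(x) = 1 - (1 - x**a)**b on (0, 1)."""
    u = rng.uniform(size=size)
    return (1 - (1 - u) ** (1 / b)) ** (1 / a)

def kumaraswamy_mle(x):
    """Maximize the log-likelihood of f(x) = a*b*x**(a-1)*(1-x**a)**(b-1)."""
    def nll(log_ab):
        a, b = np.exp(log_ab)                 # enforce positivity via log-parameters
        return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                       + (b - 1) * np.log1p(-x ** a))
    fit = optimize.minimize(nll, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(fit.x)

def reliability(t, a, b):
    """R(t) = 1 - F(t) = (1 - t**a)**b."""
    return (1 - t ** a) ** b

rng = np.random.default_rng(6)
x = kumaraswamy_sample(a=2.0, b=3.0, size=2000, rng=rng)
a_hat, b_hat = kumaraswamy_mle(x)
```

Plugging (a_hat, b_hat) into `reliability` gives the MLE of the reliability function by invariance.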
This research deals with estimating the reliability function of the two-parameter exponential distribution using different estimation methods: maximum likelihood, median-first order statistics, ridge regression, modified Thompson-type shrinkage, and single-stage shrinkage. Comparisons among the estimators were made using Monte Carlo simulation based on the statistical indicator mean squared error (MSE), and conclude that the shrinkage methods perform better than the other methods.
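For the two-parameter exponential f(x) = (1/θ)e^{−(x−μ)/θ}, x ≥ μ, the maximum likelihood baseline mentioned above has a closed form: μ̂ = min(x), θ̂ = x̄ − min(x), with plug-in reliability estimate R̂(t) = e^{−(t−μ̂)/θ̂}. A sketch under assumed parameter values (the shrinkage variants are not shown):

```python
import numpy as np

def two_param_exp_mle(x):
    """MLE for the two-parameter exponential: location mu and scale theta."""
    mu_hat = x.min()
    return mu_hat, x.mean() - mu_hat

def reliability(t, mu, theta):
    """R(t) = exp(-(t - mu)/theta) for t >= mu, else 1."""
    return np.where(t < mu, 1.0, np.exp(-(t - mu) / theta))

rng = np.random.default_rng(7)
x = 2.0 + rng.exponential(scale=1.5, size=1000)   # mu = 2, theta = 1.5 (illustrative)
mu_hat, theta_hat = two_param_exp_mle(x)
r_hat = reliability(4.0, mu_hat, theta_hat)       # plug-in estimate of R(4)
```

Each shrinkage method in the abstract then pulls these MLE-based estimates toward a prior guess; the Monte Carlo comparison repeats this over many replicates and tabulates the MSE of R̂(t).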