This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators, together with Lindley's approximation, are used in the Bayesian estimation. Based on the Monte Carlo simulation method, these estimators are compared in terms of their mean squared errors (MSEs).
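A minimal sketch of the kind of Monte Carlo MSE comparison described above (not the paper's own code): it contrasts the moment and maximum likelihood estimators of the Gamma shape parameter, where the true parameter values, sample size, and replication count are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha_true, beta_true = 2.0, 1.5   # assumed true shape and scale
n, reps = 50, 2000                 # assumed sample size and replications

mom_err, mle_err = [], []
for _ in range(reps):
    x = rng.gamma(alpha_true, beta_true, size=n)
    # Method of moments: alpha = mean^2 / var
    m, v = x.mean(), x.var(ddof=1)
    mom_err.append((m * m / v - alpha_true) ** 2)
    # Maximum likelihood via scipy (location fixed at 0)
    a_hat, _, _ = stats.gamma.fit(x, floc=0)
    mle_err.append((a_hat - alpha_true) ** 2)

print(f"MSE(moment) = {np.mean(mom_err):.4f}")
print(f"MSE(MLE)    = {np.mean(mle_err):.4f}")
```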
In this research, we estimate the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), for five-year age groups that follow the Generalized Gamma (GG) distribution. Two methods are used for estimation and fitting: the Principle of Maximizing Entropy (POME) and a bootstrap method with nonparametric kernel smoothing, which overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function. These are compared with the traditional Maximum Likelihood (ML) method, where the comparison is made on the basis of the method of the Cen…
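A minimal sketch of a bootstrap, kernel-smoothed survival estimate in the spirit of the second method; the stand-in lifetimes, Gaussian kernel, Silverman bandwidth, and resample count are all assumptions, not the paper's choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.gamma(2.0, 2.0, size=200)  # stand-in lifetime data (assumed)
h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)  # Silverman bandwidth

def kernel_survival(t, data, bw):
    """Smoothed survival estimate S(t) = 1 - mean(Phi((t - x_i)/bw))."""
    return 1.0 - norm.cdf((t - data[:, None]) / bw).mean(axis=0)

# Bootstrap: average the kernel estimate over resamples of the data
grid = np.linspace(0, x.max(), 100)
boot = np.mean(
    [kernel_survival(grid, rng.choice(x, size=len(x)), h) for _ in range(500)],
    axis=0,
)
print(boot[:5])
```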
Advances in gamma imaging technology mean that it is now technologically feasible to conduct stereoscopic gamma imaging in a hand-held unit. This paper derives an analytical model for stereoscopic pinhole imaging which can be used to predict performance for a wide range of camera configurations. Investigation of this concept through Monte Carlo and benchtop studies, for an example configuration, shows camera-source distance measurements with a mean deviation between calculated and actual distances of <5 mm for imaging distances of 50–250 mm. By combining this technique with stereoscopic optical imaging, we are then able to calculate the depth of a radioisotope source beneath a surfa…
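The paper's own analytical model is not reproduced in this excerpt; the sketch below shows only the textbook pinhole-stereo triangulation relation that such depth estimates build on, with purely illustrative numbers.

```python
# Generic stereo triangulation (textbook relation, not the paper's model):
# depth Z = f * b / d, where f is the pinhole-detector distance, b the
# baseline between the two pinholes, and d the measured disparity.
def stereo_depth(focal_mm: float, baseline_mm: float, disparity_mm: float) -> float:
    """Camera-to-source distance from pinhole stereo disparity."""
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive")
    return focal_mm * baseline_mm / disparity_mm

# Illustrative numbers only (assumed): f = 10 mm, b = 30 mm, d = 3 mm
# gives a source distance of 100 mm.
print(stereo_depth(10.0, 30.0, 3.0))
```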
In this paper, Azzalini's method is used to derive a weighted distribution from the standard Pareto distribution of type I (SPDTI) by inserting a shape parameter (θ), so as to cover the interval (0, 1] which is neglected by the standard distribution. The proposed distribution is thus a modification of the Pareto distribution of the first type, in which the random variable can take values in this interval. The properties of the modified weighted Pareto distribution of type I (MWPDTI), such as the probability density function, cumulative distribution function, reliability function, moments, and hazard function, are derived. The behaviour of the probability density function for the MWPDTI distrib…
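The exact MWPDTI density is not given in this excerpt, so the sketch below shows only the generic Azzalini weighting construction f_w(x) ∝ f(x)·F(θx) applied to a standard Pareto type I base, with a numerically computed normalizing constant; the parameter values are assumptions and the result is not the paper's modified distribution.

```python
import numpy as np
from scipy.integrate import quad

def pareto_pdf(x, alpha=2.0, xm=1.0):
    """Standard Pareto type I density (support x >= xm)."""
    return np.where(x >= xm, alpha * xm**alpha / x**(alpha + 1), 0.0)

def pareto_cdf(x, alpha=2.0, xm=1.0):
    return np.where(x >= xm, 1.0 - (xm / x)**alpha, 0.0)

def weighted_pdf(x, theta=1.5, alpha=2.0, xm=1.0):
    """Azzalini-type weighting f_w(x) = f(x) * F(theta*x) / C."""
    C, _ = quad(lambda t: pareto_pdf(t, alpha, xm) * pareto_cdf(theta * t, alpha, xm),
                xm, np.inf)
    return pareto_pdf(x, alpha, xm) * pareto_cdf(theta * x, alpha, xm) / C

print(weighted_pdf(np.array([1.0, 2.0, 5.0])))
```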
In this research, the focus is on estimating the parameters of the min-Gumbel distribution using the maximum likelihood method and the Bayes method. A genetic algorithm is employed to compute the estimates for both the maximum likelihood and the Bayes methods. The comparison is made using mean squared error (MSE), where the best estimator is the one with the smallest mean squared error. The best estimator was found to be (BLG_GE).
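A minimal sketch of using a genetic algorithm to maximize a min-Gumbel likelihood; the population size, truncation selection, blend crossover, and Gaussian mutation are illustrative choices, not the paper's algorithm.

```python
import numpy as np
from scipy.stats import gumbel_l  # Gumbel distribution for minima

rng = np.random.default_rng(2)
data = gumbel_l.rvs(loc=5.0, scale=2.0, size=100, random_state=rng)  # assumed truth

def neg_loglik(params):
    mu, beta = params
    if beta <= 0:
        return np.inf
    return -gumbel_l.logpdf(data, loc=mu, scale=beta).sum()

# Minimal genetic algorithm: truncation selection, blend crossover,
# Gaussian mutation (all illustrative).
pop = rng.uniform([0.0, 0.1], [10.0, 5.0], size=(50, 2))
for _ in range(100):
    fitness = np.array([neg_loglik(p) for p in pop])
    elite = pop[np.argsort(fitness)[:10]]               # keep the 10 best
    parents = elite[rng.integers(0, 10, size=(40, 2))]  # pick parent pairs
    children = parents.mean(axis=1)                     # blend crossover
    children += rng.normal(0, 0.1, size=children.shape) # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmin([neg_loglik(p) for p in pop])]
print("GA estimate (mu, beta):", best)
```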
This research addresses reliability, defined as the probability that any part of a system accomplishes its task within a specified time and under the same conditions. On the theoretical side, the reliability, the reliability function, and the cumulative failure function are studied within the one-parameter Rayleigh distribution. The research aims to uncover the many factors that are missed in reliability evaluation and cause constant interruptions of the machines, in addition to problems with the data. The problem of the research is that, although there are many methods for estimating the reliability function, most of them lack suitable qualifications for data such…
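For the one-parameter Rayleigh distribution the quantities named above have closed forms: R(t) = exp(−t²/(2σ²)) and cumulative failure F(t) = 1 − R(t). A short sketch (the σ value is an assumption):

```python
import numpy as np

def rayleigh_reliability(t, sigma):
    """R(t) = exp(-t^2 / (2 sigma^2)) for the one-parameter Rayleigh."""
    return np.exp(-t**2 / (2 * sigma**2))

def rayleigh_hazard(t, sigma):
    """h(t) = f(t)/R(t) = t / sigma^2 (increasing failure rate)."""
    return t / sigma**2

t = np.linspace(0, 5, 6)
R = rayleigh_reliability(t, sigma=2.0)
print("reliability:", R)
print("cumulative failure:", 1 - R)
```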
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods to estimate the two parameters of the Gumbel distribution, namely maximum likelihood, the method of moments, and, more recently, the resampling method called the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the principle of nearest neighbours used in computer science, and in particular artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this principle of nearest neighbours has useful statistical featu…
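Of the traditional methods listed, the method of moments has a simple closed form for the (maximum) Gumbel: var = (π²/6)β² and mean = μ + γβ, with Euler's constant γ ≈ 0.5772. A minimal sketch (true parameters and sample size are assumptions):

```python
import numpy as np

EULER_GAMMA = 0.5772156649

def gumbel_moments(x):
    """Method-of-moments estimates for the (maximum) Gumbel:
    beta from the variance, then mu from the mean."""
    beta = np.std(x, ddof=1) * np.sqrt(6) / np.pi
    mu = np.mean(x) - EULER_GAMMA * beta
    return mu, beta

rng = np.random.default_rng(3)
x = rng.gumbel(loc=3.0, scale=1.5, size=500)  # assumed truth
print(gumbel_moments(x))
```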
This paper is concerned with estimating the reliability R of a K-component parallel system in the stress–strength model with non-identical components subjected to a common stress, when the stress and strength follow the Generalized Exponential Distribution (GED) with unknown shape parameter α and a known scale parameter θ (θ = 1) taken to be common. Different shrinkage estimation methods are considered to estimate R, depending on the maximum likelihood estimator and prior estimates, based on simulation using the mean squared error (MSE) criterion. The study showed that shrinkage estimation using a shrinkage weight function was the best.
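A minimal Monte Carlo sketch of stress–strength reliability for a parallel system under GED with θ = 1; the shape parameters, K = 3, and the survival rule (at least one strength exceeds the common stress) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def rgen_exp(alpha, size):
    """GED draws via the inverse cdf of F(x) = (1 - e^{-x})^alpha."""
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha))

alphas = [1.5, 2.0, 2.5]   # assumed component-strength shapes (K = 3)
alpha_y = 1.0              # assumed common-stress shape
reps = 100_000

strengths = np.column_stack([rgen_exp(a, reps) for a in alphas])
stress = rgen_exp(alpha_y, reps)
# Parallel system survives if at least one strength exceeds the stress
R_hat = np.mean(strengths.max(axis=1) > stress)
print(f"Monte Carlo estimate of R: {R_hat:.4f}")
```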
In this paper, we prove some coincidence and common fixed point theorems for a pair of discontinuous weakly compatible self-mappings satisfying a generalized contractive condition in the setting of a cone b-metric space, under the assumption that the cone used is non-normal. Our results generalize some recent results.
In the lifetime process of some systems, most data cannot belong to one single population; in fact, the data can represent several subpopulations. In such a case, a single known distribution cannot be used to model the data. Instead, a mixture of distributions is used to model the data and classify the observations into several subgroups. A mixture of Rayleigh distributions is well suited to the lifetime process. This paper aims to infer the model parameters by the expectation-maximization (EM) algorithm through the maximum likelihood function. The technique is applied to simulated data following several scenarios. The accuracy of estimation is examined by the average mean squared error (AMSE) and the average classification success rate (ACSR). T…
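A minimal EM sketch for a two-component Rayleigh mixture; the initial values, component scales, and mixing weight are assumptions, and the M-step update σ_k² = Σᵢ rᵢₖxᵢ² / (2Σᵢ rᵢₖ) follows from the Rayleigh likelihood.

```python
import numpy as np

rng = np.random.default_rng(5)
# Simulated two-subpopulation lifetimes (assumed scales and mixing)
x = np.concatenate([rng.rayleigh(1.0, 300), rng.rayleigh(4.0, 700)])

def rayleigh_pdf(x, sigma):
    return (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))

w = np.array([0.5, 0.5])       # initial mixing weights
sigma = np.array([0.5, 3.0])   # initial scale guesses
for _ in range(200):
    # E-step: responsibilities r[i, k] = P(component k | x_i)
    dens = w * np.column_stack([rayleigh_pdf(x, s) for s in sigma])
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form updates for weights and scales
    w = r.mean(axis=0)
    sigma = np.sqrt((r * x[:, None] ** 2).sum(axis=0) / (2 * r.sum(axis=0)))

print("weights:", w, "sigmas:", sigma)
```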
Survival analysis is a type of data analysis that describes the time until the occurrence of an event of interest, such as death or other events important in determining what will happen to the phenomenon under study. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete…
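In discrete time, cause-specific hazards can be estimated by expanding the data to person-period format and fitting a multinomial logit over the outcomes {no event, cause 1, cause 2}. A minimal sketch on synthetic data (the covariate, time grid, and independence of cause from the covariate are assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 500
age = rng.uniform(20, 80, n)       # assumed covariate
time = rng.integers(1, 6, n)       # discrete event/censoring time (1..5)
cause = rng.integers(0, 3, n)      # 0 = censored, 1/2 = competing causes

# Expand to person-period format: one row per subject per period at risk
rows, labels = [], []
for i in range(n):
    for t in range(1, time[i] + 1):
        rows.append([t, age[i]])
        # outcome is 0 (no event) until the final period, then the cause
        labels.append(cause[i] if t == time[i] else 0)

X, y = np.array(rows, dtype=float), np.array(labels)
# Multinomial logit: P(event of cause k in period t | at risk) = hazard
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba([[3, 50.0]]))  # hazards at period 3, age 50
```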