The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods for estimating its two parameters, namely Maximum Likelihood, the Method of Moments, and, more recently, the resampling method known as the Jackknife. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are non-traditional methods, such as the nearest-neighbors principle used in computer science, and especially artificial-intelligence algorithms, including the genetic algorithm and the artificial neural network algorithm, which may be classified as meta-heuristic methods. Moreover, the nearest-neighbors principle has useful statistical features. The objective of this paper is therefore to propose a new algorithm that estimates the parameters of the Gumbel probability distribution directly, overcoming the mathematical difficulties of this problem without requiring the derivative of the likelihood function. A simulation approach is taken as the empirical experiment, in which a hybrid method optimizes over the three traditional methods. Comparisons were made between the proposed method and each of the traditional methods above using the Root Mean Squared Error (RMSE) efficiency criterion. In total, 36 experiments were run over different combinations of initial values of the two parameters (λ: shift parameter; θ: scale parameter), with three values per parameter and four different sample sizes per experiment. The proposed algorithm showed its superiority in all simulation combinations and all sample sizes for both parameters (λ and θ). Among the traditional methods, the Method of Moments was best at estimating the shift parameter (λ), and Maximum Likelihood was best at estimating the scale parameter (θ).
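As a point of reference for the traditional estimators the abstract mentions, the Method of Moments for the Gumbel distribution has a closed form: the mean is λ + γθ (γ is the Euler–Mascheroni constant) and the variance is π²θ²/6. A minimal sketch, not the paper's proposed hybrid algorithm, with illustrative parameter values:

```python
import math
import random

EULER_GAMMA = 0.57721566490153286  # Euler–Mascheroni constant

def gumbel_moments(sample):
    """Method-of-moments estimates (lambda_hat, theta_hat) for the
    Gumbel (max) distribution, using mean = lambda + gamma*theta
    and variance = (pi*theta)^2 / 6."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    theta_hat = math.sqrt(6.0 * var) / math.pi
    lam_hat = mean - EULER_GAMMA * theta_hat
    return lam_hat, theta_hat

# Quick check with simulated data (inverse-CDF sampling of a Gumbel):
# illustrative true values lambda = 2.0, theta = 1.5
random.seed(1)
lam, theta = 2.0, 1.5
sample = [lam - theta * math.log(-math.log(1.0 - random.random()))
          for _ in range(20_000)]
lam_hat, theta_hat = gumbel_moments(sample)
print(lam_hat, theta_hat)  # both should land near 2.0 and 1.5
```

With 20,000 draws the moment estimates typically fall within a few hundredths of the true values, which is the kind of RMSE comparison the simulation study above performs.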
The 3-parameter Weibull distribution is used as a model for failure, since it is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum-likelihood estimators for the parameters and the reliability function using simulation. We conclude that the shrinkage estimators are better for the parameters, while the maximum-likelihood estimator is better for the reliability function, according to the statistical measures MAPE and MSE across different sample sizes.
Note: ns = small sample; nm = median sample.
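The failure behaviour described above maps directly onto the 3-parameter Weibull reliability and hazard functions, R(t) = exp(−((t − loc)/scale)^shape) and h(t) = (shape/scale)·((t − loc)/scale)^(shape−1). A small sketch with illustrative parameter values (not the paper's estimates); note that shape < 1 gives the decreasing failure rate the abstract describes:

```python
import math

def weibull3_reliability(t, shape, scale, loc):
    """Reliability R(t) = exp(-((t - loc)/scale)^shape) of the
    3-parameter Weibull; parameter names are illustrative."""
    if t <= loc:
        return 1.0
    return math.exp(-(((t - loc) / scale) ** shape))

def weibull3_hazard(t, shape, scale, loc):
    """Hazard (failure rate) h(t) = (shape/scale)*((t - loc)/scale)**(shape-1)."""
    z = (t - loc) / scale
    return (shape / scale) * z ** (shape - 1)

print(weibull3_reliability(5.0, 1.5, 4.0, 1.0))  # exp(-1), about 0.3679
# shape < 1: hazard is high early and falls with time
print(weibull3_hazard(2.0, 0.8, 4.0, 1.0) > weibull3_hazard(5.0, 0.8, 4.0, 1.0))
```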
The objective of this study is to examine the properties of Bayes estimators of the shape parameter of the Power Function Distribution (PFD-I), using two different prior distributions for the parameter θ and different loss functions, compared with the maximum likelihood estimators. In many practical applications, we may have two different pieces of prior information about the prior distribution of the shape parameter of the Power Function Distribution, which influences the parameter estimation. We therefore used two different kinds of conjugate priors for the shape parameter θ of the Power Function Distribution.
The reliability of the stress-strength model has attracted many statisticians for several years owing to its applicability in diverse fields such as engineering, quality control, and economics. In this paper, system reliability estimation in the stress-strength model containing K parallel components is offered via four types of shrinkage methods: the constant shrinkage estimator, the shrinkage function estimator, the modified Thompson-type shrinkage estimator, and the squared shrinkage estimator. A Monte Carlo simulation study compares the proposed estimators using the mean squared error. The analysis of the shrinkage estimation methods showed that the shrinkage function estimator was the best.
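To make the quantity being estimated concrete: for a parallel system of K components, the stress-strength reliability is R = P(stress < max of the K strengths), since the system survives while at least one component's strength exceeds the stress. A Monte Carlo sketch with illustrative exponential stress and strength laws (the paper's distributions and shrinkage estimators are not reproduced here):

```python
import random

def stress_strength_parallel(k, n_sim=100_000, seed=7):
    """Monte Carlo estimate of R = P(stress < max strength) for a
    parallel system of k components. Exponential stress (rate 1) and
    strengths (rate 0.5) are purely illustrative choices."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        stress = rng.expovariate(1.0)
        strengths = [rng.expovariate(0.5) for _ in range(k)]
        if max(strengths) > stress:
            hits += 1
    return hits / n_sim

# For these rates and k = 3 the exact value works out to 0.9,
# so the estimate should land very close to that.
print(stress_strength_parallel(3))
```

Shrinkage estimators then pull such a crude estimate toward a prior guess of R; the simulation study above compares four ways of doing exactly that.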
Estimating the unknown parameters of a two-dimensional sinusoidal signal model is an important and difficult problem, and the model itself is important for modeling symmetric gray-scale texture images. In this paper, we propose employing a Differential Evolution algorithm together with a sequential approach to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components when the signal is affected by noise. Numerical simulations are performed for different sample sizes and various levels of standard deviation to observe the performance of this method in estimating the parameters of the 2-D sinusoidal signal model. The model was then used for modeling symmetric gray-scale texture images.
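For context, a common form of the 2-D sinusoidal signal model is y(m, n) = Σₖ [Aₖ cos(λₖm + μₖn) + Bₖ sin(λₖm + μₖn)] + noise, where the amplitudes (A, B) and frequencies (λ, μ) per component are the unknowns to estimate. A minimal generator sketch under that assumed form (the paper's exact model and Differential Evolution estimator are not shown):

```python
import math

def sinusoid2d(m, n, components, noise=0.0):
    """Evaluate one pixel of a 2-D sinusoidal signal model:
    y(m, n) = sum_k [A*cos(l*m + u*n) + B*sin(l*m + u*n)] + noise.
    Each component is a tuple (A, B, l, u); all values illustrative."""
    return noise + sum(A * math.cos(l * m + u * n) + B * math.sin(l * m + u * n)
                       for (A, B, l, u) in components)

# At the origin only the cosine terms contribute: y(0,0) = sum of A_k.
print(sinusoid2d(0, 0, [(2.0, 1.0, 0.5, 0.3)]))  # 2.0
```

An estimator such as Differential Evolution then searches the (A, B, λ, μ) space to minimize the squared error between this model and the noisy observed texture.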
... Show MoreEstimation of the tail index parameter of a one - parameter Pareto model has wide important by the researchers because it has awide application in the econometrics science and reliability theorem.
Here we introduce anew estimator of "generalized median" type and compare it with the methods of Moments and Maximum likelihood by using the criteria, mean square error.
The estimator of generalized median type performing best over all.
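As a baseline for such comparisons, the maximum-likelihood estimator of the Pareto tail index α with known scale x_m has the closed form α̂ = n / Σᵢ log(xᵢ/x_m). A minimal sketch with illustrative true values (the generalized-median estimator itself is not reproduced here):

```python
import math
import random

def pareto_mle_alpha(sample, x_m):
    """MLE of the tail index alpha for a one-parameter Pareto
    with known scale x_m: alpha_hat = n / sum(log(x_i / x_m))."""
    n = len(sample)
    return n / sum(math.log(x / x_m) for x in sample)

# Inverse-CDF sampling: X = x_m * U^(-1/alpha) with U in (0, 1].
# Illustrative true values alpha = 2.5, x_m = 1.0.
random.seed(3)
alpha, x_m = 2.5, 1.0
sample = [x_m * (1.0 - random.random()) ** (-1.0 / alpha) for _ in range(50_000)]
print(pareto_mle_alpha(sample, x_m))  # should land near 2.5
```

A mean-square-error study like the one above repeats this over many samples and sample sizes, replacing the MLE with each competing estimator.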
Improving the Jackknife Instrumental Variable Estimation Method Using a Class of Immune Algorithms, with a Practical Application
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consume considerable network throughput and cause bottlenecks on the manager side. These problems are addressed using agent technology as a solution for distributing management functionality over the network elements. The proposed system consists of a server agent working together with client agents to monitor the logging on and off of the client computers and which user is working on each. A file-system-watcher mechanism is used to indicate any change in files. The results were presented
... Show MoreTI1e Web service securi ty challenge is to understand and assess the risk involved in securing a web-based service today, based on our existing security technology, and at the same time tmck emerging standards and understand how they will be used to offset the risk in
new web services. Any security model must i llustrate how data can
now through an application and network topology to meet the
requirements defined by the busi ness wi thout exposing the data to undue risk. In this paper we propose &n