The Gumbel distribution has been treated with great care by researchers and statisticians. The traditional methods for estimating its two parameters are Maximum Likelihood, the Method of Moments, and, more recently, the resampling method known as the Jackknife. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are non-traditional methods, such as the nearest-neighbors principle used in computer science, and in particular artificial intelligence algorithms, including the genetic algorithm and the artificial neural network algorithm, which may be classified as meta-heuristic methods. Moreover, the nearest-neighbors principle has useful statistical features. The objective of this paper is therefore to propose a new algorithm that estimates the parameters of the Gumbel probability distribution directly and overcomes the mathematical difficulties without requiring the derivative of the likelihood function. A simulation approach is taken as an empirical experiment, in which a hybrid method optimizes over the three traditional methods. Comparisons were made between the proposed method and each of the traditional methods mentioned above using the Root Mean Squared Error (RMSE) efficiency criterion. In total, 36 experiments were run over combinations of initial values of the two parameters (λ: shift parameter; θ: scale parameter), each taking three values, with four different sample sizes per experiment. The proposed algorithm proved superior in all simulation combinations and all sample sizes for both parameters (λ and θ). Among the traditional methods, the Method of Moments was best at estimating the shift parameter (λ), and Maximum Likelihood was best at estimating the scale parameter (θ).
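As a point of reference for the traditional methods compared above, the Method of Moments for the Gumbel distribution has a closed form: the mean equals λ + γθ (with γ the Euler-Mascheroni constant) and the variance equals π²θ²/6. A minimal sketch, with an illustrative function name and parameters not taken from the paper:

```python
import numpy as np

def gumbel_mom(x):
    """Method-of-Moments estimates (lambda, theta) for the Gumbel distribution.

    Uses E[X] = lam + gamma * theta and Var[X] = pi^2 * theta^2 / 6,
    where gamma is the Euler-Mascheroni constant.
    """
    euler_gamma = 0.5772156649015329
    theta = np.std(x, ddof=1) * np.sqrt(6.0) / np.pi  # from the variance equation
    lam = np.mean(x) - euler_gamma * theta            # from the mean equation
    return lam, theta

# Recover known parameters from a large simulated sample.
rng = np.random.default_rng(0)
sample = rng.gumbel(loc=2.0, scale=1.5, size=5000)
lam_hat, theta_hat = gumbel_mom(sample)
```

For Maximum Likelihood, by contrast, the score equations have no closed-form solution, which is the analytical difficulty the proposed algorithm is designed to avoid.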
The penalized regression model has received considerable attention for variable selection and plays an essential role in dealing with high-dimensional data. The arctangent penalty, denoted the Atan penalty, has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good method for obtaining robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real data applications show that the p
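The combination described above can be sketched as an objective function: an LAD loss term plus an Atan penalty on the coefficients. The penalty form p(t) = λ(γ + 2/π)·arctan(t/γ) follows the Atan penalty in the literature; the objective and parameter names here are illustrative, not the paper's exact formulation or fitting algorithm:

```python
import numpy as np

def atan_penalty(beta, lam, gamma):
    # Atan penalty: lam * (gamma + 2/pi) * arctan(|beta| / gamma).
    return lam * (gamma + 2.0 / np.pi) * np.arctan(np.abs(beta) / gamma)

def lad_atan_objective(beta, X, y, lam, gamma=0.01):
    # Robust objective: least-absolute-deviation loss + Atan penalty.
    residuals = y - X @ beta
    return np.sum(np.abs(residuals)) + np.sum(atan_penalty(beta, lam, gamma))
```

The design point is that a single outlier in y contributes linearly to the LAD loss rather than quadratically as under squared-error loss, which is the source of the robustness claimed above.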
This paper compares the performance of traditional methods for estimating the parameter of the exponential distribution (the Maximum Likelihood Estimator and the Uniformly Minimum Variance Unbiased Estimator) with the Bayes estimator, both when the data meet the requirements of the exponential distribution and when they depart from it due to the presence of outliers (contaminated values). Simulation (the Monte Carlo method) is employed, with the mean square error (MSE) adopted as the criterion for statistical comparison of the performance of the three estimators across sample sizes ranging from small to medium and large (n = 5, 10, 25, 50, 100) and different cases (wit
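The Monte Carlo design described above can be sketched as follows, here shown only for the MLE of the exponential rate under clean and contaminated data; the contamination scheme, settings, and names are illustrative stand-ins, not the paper's:

```python
import numpy as np

def mc_mse(n, theta=1.0, reps=2000, contaminate=False, seed=1):
    """Monte Carlo estimate of the MSE of the exponential-rate MLE."""
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(reps):
        x = rng.exponential(scale=1.0 / theta, size=n)
        if contaminate:
            x[0] *= 10.0  # inject a single outlier (illustrative scheme)
        theta_hat = 1.0 / np.mean(x)  # MLE of the rate parameter
        errs.append((theta_hat - theta) ** 2)
    return float(np.mean(errs))

mse_clean = mc_mse(n=25)
mse_contam = mc_mse(n=25, contaminate=True)
```

Running the same loop for each estimator and each n, and tabulating the resulting MSE values, reproduces the structure of the comparison described in the abstract.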
Entropy, defined as a measure of uncertainty, has been transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability model is built on each failure in a sample once the conditions of a probability distribution function are satisfied. A formula for the probability distribution of the new entropy transformation was derived for the continuous Burr Type-XII distribution; the new function was tested and found to satisfy the conditions of a probability function. The mean and the cumulative distribution function were then derived so that they could be adopted in generating data for the purpose of implementing the simulation
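The two ingredients of the transformation above, the Burr Type-XII cumulative distribution function and its reliability (survival) function, have simple closed forms. A minimal sketch with illustrative shape-parameter values, including a numerical check that the density integrates to one (the kind of "conditions of a probability function" test mentioned above):

```python
import numpy as np

def burr12_pdf(x, c, k):
    # Burr Type-XII density: f(x) = c k x^(c-1) (1 + x^c)^-(k+1), x > 0.
    return c * k * x ** (c - 1.0) * (1.0 + x ** c) ** (-(k + 1.0))

def burr12_cdf(x, c, k):
    # F(x) = 1 - (1 + x^c)^-k.
    return 1.0 - (1.0 + x ** c) ** (-k)

def burr12_reliability(x, c, k):
    # R(x) = 1 - F(x), the survival function used in the entropy transform.
    return (1.0 + x ** c) ** (-k)

c, k = 2.0, 3.0
edges = np.linspace(0.0, 50.0, 200_001)
mids = 0.5 * (edges[:-1] + edges[1:])
total = np.sum(burr12_pdf(mids, c, k)) * (edges[1] - edges[0])  # midpoint rule
```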
The analysis of survival and reliability is among the vital statistical topics and methods at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the Generalized Gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Because the distribution function involves the incomplete Gamma integral, classical estimation is made more difficult, so a numerical approximation method is illustrated and then used to estimate the survival function. The survival function was estimated by simulation using the Monte Carlo method. The Entropy method was used for the
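The inverse-transformation step above can be sketched numerically. Assuming the Stacy parameterisation GG(a, d, p) with density proportional to x^(d-1) exp(-(x/a)^p), the substitution Y = (X/a)^p gives Y ~ Gamma(d/p), so a uniform draw U maps to X = a · [P⁻¹(d/p, U)]^(1/p), where P⁻¹ is the inverse regularised lower incomplete gamma function; parameter names and values are illustrative:

```python
import numpy as np
from scipy.special import gammaincinv

def gg_inverse_transform(u, a, d, p):
    # Inverse Transformation Method for GG(a, d, p), Stacy parameterisation:
    # invert the regularised incomplete gamma integral numerically.
    return a * gammaincinv(d / p, u) ** (1.0 / p)

rng = np.random.default_rng(42)
u = rng.uniform(size=100_000)
x = gg_inverse_transform(u, a=1.0, d=3.0, p=1.5)
```

The corresponding survival function is S(x) = 1 - P(d/p, (x/a)^p), which is the quantity estimated by Monte Carlo in the study described above.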
Diabetic nephropathy (DN) is the foremost cause of end-stage renal disease. Early detection of DN can spare diabetic patients severe complications. This study aimed to evaluate the diagnostic value of red cell distribution width (RDW) and the neutrophil-lymphocyte ratio (NLR) in the detection of DN in patients with type 2 diabetes mellitus (T2DM). This cross-sectional study included a total of 130 patients already diagnosed with T2DM. The albumin-creatinine ratio (ACR) in urine samples was calculated for each patient, according to which patients were divided into two groups: those with evidence of DN when ACR ≥ 30 mg/g, and those with no evidence of DN when ACR < 30 mg/g. According to multivariate analysis, each of disease duration (OR
This investigation proposes an offline signature identification system utilizing rotation compensation based on features saved in a database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determining the center-point coordinates and the angle for rotation compensation (θ), applying the rotation compensation, and determining the discriminating features and a statistical condition. In this work, seven essential collections of features are utilized to acquire the characteristics: (i) density (D), (ii) average (A), (iii) s
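The center-point and rotation-compensation step described above can be sketched as follows. This is a hypothetical implementation using the principal axis of the signature's pixel coordinates to find the angle θ; the paper's exact method for determining the angle may differ:

```python
import numpy as np

def rotation_compensate(points):
    """points: (N, 2) array of (x, y) signature pixel coordinates.

    Returns the points rotated to a canonical orientation, plus the
    center point and the compensation angle theta.
    """
    center = points.mean(axis=0)          # center-point coordinates
    centered = points - center
    # Orientation of the principal axis from the 2x2 covariance matrix.
    cov = np.cov(centered.T)
    theta = 0.5 * np.arctan2(2.0 * cov[0, 1], cov[0, 0] - cov[1, 1])
    # Rotate by -theta so the principal axis becomes horizontal.
    c, s = np.cos(-theta), np.sin(-theta)
    rot = np.array([[c, -s], [s, c]])
    return centered @ rot.T, center, theta

# A line of points at 30 degrees should come back roughly horizontal.
t = np.linspace(0.0, 10.0, 200)
pts = np.column_stack([t * np.cos(np.pi / 6), t * np.sin(np.pi / 6)])
aligned, center, theta = rotation_compensate(pts)
```

Compensating for rotation before feature extraction is what makes density- and average-based features comparable across differently oriented signature scans.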
This paper is concerned with a Double Stage Shrinkage Bayesian (DSSB) estimator for lowering the mean squared error of the classical estimator q̂ of the scale parameter (q) of an exponential distribution in a region (R) around available prior knowledge (q0) about the actual value (q), used as an initial estimate, as well as for reducing the cost of experimentation. In situations where experiments are time-consuming or very costly, a double-stage procedure can be used to reduce the expected sample size needed to obtain the estimator. This estimator is shown to have smaller mean squared error for certain choices of the shrinkage weight factor ψ(·) and acceptance region R. Expressions for
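The core shrinkage idea can be illustrated in a single stage: pull the classical estimate q̂ toward the prior guess q0 whenever q̂ falls inside the acceptance region R. The weight ψ, the region half-width, and the simulation settings below are hypothetical choices, not the paper's derived quantities:

```python
import numpy as np

def shrinkage_estimate(q_hat, q0, psi=0.5, half_width=0.5):
    # Shrink toward q0 only when q_hat lies in the acceptance region R.
    if abs(q_hat - q0) <= half_width:
        return psi * q_hat + (1.0 - psi) * q0
    return q_hat  # outside R: keep the classical estimate

# Monte Carlo check: when the prior guess q0 equals the true value,
# shrinkage reduces the MSE of the exponential-scale MLE (the sample mean).
rng = np.random.default_rng(3)
true_q, n, reps = 1.0, 10, 4000
mse_mle, mse_shr = 0.0, 0.0
for _ in range(reps):
    q_hat = float(np.mean(rng.exponential(scale=true_q, size=n)))
    mse_mle += (q_hat - true_q) ** 2
    mse_shr += (shrinkage_estimate(q_hat, q0=true_q) - true_q) ** 2
mse_mle /= reps
mse_shr /= reps
```

The double-stage version adds a second sample only when the first-stage estimate falls outside R, which is how the expected sample size, and hence the experimentation cost, is reduced.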
A comparison of double informative and non-informative priors assumed for the parameter of the Rayleigh distribution is considered. Three different sets of double priors are included for the single unknown parameter of the Rayleigh distribution: the square-root inverted gamma (SRIG) and the natural conjugate family of priors; the square-root inverted gamma and the non-informative distribution; and the natural conjugate family of priors and the non-informative distribution. The data are generated from the Rayleigh distribution for three cases of sample size (small, medium, and large), and Bayes estimators for the parameter are derived under a squared error
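A minimal single-prior sketch of the Bayes machinery above, assuming the Rayleigh density f(x) = (x/t)·exp(-x²/(2t)) with t = σ² and an inverse-gamma IG(a, b) prior on t (a stand-in for the square-root inverted gamma formulation, not the paper's double-prior construction): the posterior is IG(a + n, b + Σx²/2), and the Bayes estimator of t under squared error loss is the posterior mean.

```python
import numpy as np

def rayleigh_bayes_t(x, a, b):
    """Posterior mean of t = sigma^2 under an IG(a, b) prior.

    Posterior: IG(a + n, b + sum(x^2)/2); mean = (b + sum(x^2)/2) / (a + n - 1).
    """
    n = len(x)
    return (b + np.sum(x ** 2) / 2.0) / (a + n - 1.0)

# With enough data the Bayes estimate approaches the true sigma^2.
rng = np.random.default_rng(7)
sigma = 2.0
x = rng.rayleigh(scale=sigma, size=1000)
t_hat = rayleigh_bayes_t(x, a=2.0, b=1.0)
```

The double-prior comparison in the study then amounts to repeating this derivation for each pair of priors and comparing the resulting estimators by simulation across the three sample-size cases.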