The Gumbel distribution has been treated with great care by researchers and statisticians. Traditional methods for estimating its two parameters include Maximum Likelihood, the Method of Moments, and, more recently, the re-sampling method known as the Jackknife. However, these methods involve mathematical difficulties when solved analytically. Accordingly, there are non-traditional methods, such as the nearest-neighbors principle used in computer science and especially in artificial-intelligence algorithms (the genetic algorithm, the artificial neural network algorithm, and others), which may be classified as meta-heuristic methods. Moreover, the nearest-neighbors principle has useful statistical features. The objective of this paper is therefore to propose a new algorithm that estimates the parameters of the Gumbel probability distribution directly, overcoming the mathematical difficulties without requiring the derivative of the likelihood function. A simulation approach was adopted as an empirical experiment, in which a hybrid method optimizes the three traditional methods. Comparisons were made between the proposed method and each of the traditional methods above using the Root Mean Squared Error (RMSE) efficiency criterion. In total, 36 experiments covered different combinations of initial values of the two parameters (λ: shift parameter; θ: scale parameter), each taking three values, with four different sample sizes per experiment. The proposed algorithm was superior in all simulation combinations and all sample sizes for both parameters (λ and θ). Among the traditional methods, the Method of Moments was best at estimating the shift parameter (λ), and Maximum Likelihood was best at estimating the scale parameter (θ).
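One of the traditional estimators the abstract mentions, the Method of Moments, can be sketched in a few lines (a minimal illustration only, not the paper's hybrid algorithm; function and variable names are mine). It uses the Gumbel facts E[X] = λ + γθ and Var[X] = π²θ²/6, where γ is the Euler-Mascheroni constant:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant γ

def gumbel_moments(x):
    """Method-of-Moments estimates (lam_hat, theta_hat) for Gumbel(λ, θ).

    Solves E[X] = λ + γθ and Var[X] = π²θ²/6 for the parameters."""
    x = np.asarray(x, dtype=float)
    theta_hat = np.std(x, ddof=1) * np.sqrt(6.0) / np.pi
    lam_hat = x.mean() - EULER_GAMMA * theta_hat
    return lam_hat, theta_hat

rng = np.random.default_rng(0)
sample = rng.gumbel(loc=2.0, scale=1.5, size=5000)  # true λ = 2, θ = 1.5
lam_hat, theta_hat = gumbel_moments(sample)
```

Unlike Maximum Likelihood, this estimator is closed-form, which is why the abstract's point about avoiding the derivative of the likelihood function matters mainly for the ML route.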
This paper suggests two recognition methods that rely on feature extraction by principal component analysis applied in the wavelet (multi-wavelet) domain. In the first method, the recognition space is enlarged by computing the eigenstructure of the diagonal sub-image details at five depths of the wavelet transform; the effective eigen range selected here forms the basis for image recognition. In the second method, an invariant wavelet space at all projections is obtained: a new recursive form that represents an invariant space for any image resolution obtained from the wavelet transform is adopted. In this way, all the major problems that affect the image and
The research aims to identify the reasons that lead to information asymmetry between the management of an economic unit and the parties that use accounting information, such as shareholders, and thus to reach solutions that would reduce this problem. These factors are divided into two types. The first is internal factors, which represent management's desire to expand its self-interest by obtaining profits and increasing the entity's value and competitiveness, and investors' desire to obtain greater returns on their shares. The second is external factors, which represent the failures that occur in the laws and regula
In this research, some robust non-parametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion across different sample sizes, variance levels, contamination (pollution) rates, and three different models. The methods are (S-LLS) S-estimation with local linear smoothing, (M-LLS) M-estimation with local linear smoothing, (S-NW) S-estimation with Nadaraya-Watson smoothing, and (M-NW) M-estimation with Nadaraya-Watson smoothing.
The results for the first model showed that the (S-LLS) method was the best for large sample sizes, while for small sample sizes the
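The Nadaraya-Watson smoother underlying two of the compared methods can be sketched as follows (a plain, non-robust version for illustration; the paper's S- and M-variants replace the least-squares weighting with robust alternatives, and the names here are mine):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    m̂(x) = Σ K((x-xᵢ)/h)·yᵢ / Σ K((x-xᵢ)/h)."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d**2)                      # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 200)  # noisy sine
grid = np.linspace(0.1, 0.9, 9)
fit = nadaraya_watson(x, y, grid, h=0.05)
```

The bandwidth h controls the bias-variance trade-off, which is one reason sample size matters so much in the comparisons the abstract reports.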
Survival analysis is one of the modern methods of analysis, based on the fact that the dependent variable represents the time until the event of interest. Many survival models deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time, and a nonparametric function that does depend on survival time, which is why the Cox model is described as semi-parametric. The set of parametric models that depend on the time-to-event distribution parameters such as
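The semi-parametric character of the Cox model described above shows up directly in its partial likelihood: the nonparametric baseline hazard cancels out, leaving only the parametric covariate part to estimate. A minimal sketch (assuming no tied event times; names are mine):

```python
import numpy as np

def cox_neg_log_partial_likelihood(beta, times, events, X):
    """Negative log partial likelihood of the Cox model
    h(t|x) = h0(t)·exp(x'β). The baseline hazard h0 cancels,
    which is what makes the model semi-parametric. No tied times."""
    order = np.argsort(times)
    times, events, X = times[order], events[order], X[order]
    eta = X @ beta
    # log of the risk-set sums Σ_{j: t_j >= t_i} exp(η_j)
    log_risk = np.log(np.cumsum(np.exp(eta)[::-1])[::-1])
    return -np.sum((eta - log_risk)[events == 1])

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
true_beta = np.array([0.5, -0.5])
times = rng.exponential(scale=np.exp(-X @ true_beta))  # hazard exp(x'β)
events = np.ones(100, dtype=int)                       # no censoring here
nll_zero = cox_neg_log_partial_likelihood(np.zeros(2), times, events, X)
nll_true = cox_neg_log_partial_likelihood(true_beta, times, events, X)
```

Minimizing this function over β (e.g. by Newton's method) gives the usual Cox estimates without ever specifying h0(t).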
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel func
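A kernel hazard estimator of the kind discussed above can be sketched by smoothing the Nelson-Aalen increments with one of the named kernels, here the Epanechnikov kernel and a single global bandwidth (the local-bandwidth and boundary-corrected variants the paper studies are omitted; names are mine):

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, one of the four kernels named in the paper."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, h=0.5):
    """Kernel-smoothed Nelson-Aalen hazard for right-censored data:
    ĥ(t) = (1/h) Σ K((t - tᵢ)/h) · dᵢ / Y(tᵢ), global bandwidth h."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)        # Y(t_i) for the sorted times
    jumps = events / at_risk          # Nelson-Aalen increments
    u = (t_grid[:, None] - times[None, :]) / h
    return (epanechnikov(u) * jumps).sum(axis=1) / h

rng = np.random.default_rng(4)
t = rng.exponential(1.0, 500)         # true hazard ≡ 1
c = rng.exponential(2.0, 500)         # censoring times
obs = np.minimum(t, c)
ev = (t <= c).astype(int)
grid = np.linspace(0.5, 1.5, 5)
haz = kernel_hazard(grid, obs, ev, h=0.4)
```

Near t = 0 the kernel window sticks out of the support, which is exactly the boundary problem the paper's boundary kernels are designed to correct.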
The aim of this paper is to design an artificial neural network (ANN) as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils at any depth and time. First, fifty soil samples were collected from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad, Iraq. Second, a series of measurements was performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of Cd in the soil at depth x and time t. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with traditional laboratory inspecting
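The regression task described (depth and time in, concentration out) maps onto a small feed-forward network. The sketch below is a toy stand-in on synthetic data, not the paper's trained model or its real measurements; architecture, data, and names are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in: concentration decaying with depth and time
# (the paper's real inputs also include measured soil parameters).
X = rng.uniform(0, 1, size=(200, 2))            # scaled (depth, time)
y = np.exp(-2 * X[:, 0] - X[:, 1])[:, None]     # toy target surface

# One hidden layer, trained with plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)        # hidden layer activations
    pred = H @ W2 + b2              # linear output
    err = pred - y
    # backpropagation of the mean-squared-error gradient
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Held-out evaluation, as the paper does with its test set, would use the same forward pass on unseen (depth, time) pairs.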
The objective of this research was to estimate the dose distribution delivered by radioactive gold nanoparticles (198AuNPs or 199AuNPs) to a tumor inside the human prostate, as well as to the normal tissues surrounding the tumor, using the Monte Carlo N-Particle code (MCNP-6.1.1). Background: Radioactive gold nanoparticles are emerging as promising agents for cancer therapy and are being investigated for treating prostate cancer in animals. To use them as a new therapeutic modality for human prostate cancer, accurate radiation dosimetry simulations are required to estimate the energy deposition in the tumor and surrounding tissue and to establish the course of therapy for the patient. Materials and methods: A simple geometrical
It is very difficult to obtain the value of rock strength along the wellbore. Rock strength values are used in several analyses, for example preventing wellbore failure, deciding a completion design, and controlling sand production. In this study, sonic log data from wells BU-50 and BU-47 in the Buzurgan oil field were used. Five formations were studied (Mishrif, Sadia, Middle Lower Kirkuk, Upper Kirkuk, and Jaddala). First, the unconfined compressive strength (UCS) of each formation was calculated using a sonic log method. Then, the confined compressive rock strength was derived from the UCS by including the effect of pore and hydrostatic pressure for each formation. Evaluations th
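Sonic-log UCS methods of the kind mentioned above are empirical correlations from compressional slowness to strength. The sketch below uses one published sandstone correlation (McNally's UCS = 1200·e^(−0.036·Δt), Δt in µs/ft) purely for illustration; the study's own correlation and coefficients may well differ, and the function name is mine:

```python
import math

def ucs_from_sonic(dt_us_per_ft):
    """Unconfined compressive strength (MPa) from compressional slowness
    (µs/ft) via McNally's sandstone correlation UCS = 1200·e^(-0.036·Δt).
    Illustrative only -- not the correlation used in this study."""
    return 1200.0 * math.exp(-0.036 * dt_us_per_ft)

# Faster rock (lower slowness) should come out stronger.
ucs_fast = ucs_from_sonic(60.0)    # tight, fast formation
ucs_slow = ucs_from_sonic(100.0)   # slower, weaker formation
```

The subsequent step the abstract describes, deriving confined strength, then adjusts UCS for the confining (pore and hydrostatic) pressure at each depth.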
In this article we study the variance estimator for the normal distribution when the mean is unknown, based on a function combining the unbiased estimator and the Bayes estimator of the variance of the normal distribution. A double-stage shrunken estimator is used to obtain higher efficiency for the variance estimator of the normal distribution when the mean is unknown, using two small samples of equal size.
In this study, we used a Bayesian method to estimate the scale parameter of the normal distribution, considering three different prior distributions: the square-root inverted gamma (SRIG) distribution, a non-informative prior, and the natural conjugate family of priors. The Bayesian estimation is based on the squared-error loss function, and it is compared with classical methods for estimating the scale parameter of the normal distribution, such as maximum likelihood estimation and th
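Under squared-error loss, the Bayes estimator is the posterior mean. A minimal conjugate example for the normal variance with known mean (using an Inverse-Gamma prior rather than the paper's SRIG prior, and illustrative hyperparameters; names are mine) runs as follows:

```python
import numpy as np

def bayes_variance_estimate(x, mu, alpha0=2.0, beta0=1.0):
    """Posterior-mean (squared-error-loss Bayes) estimate of σ² for a
    normal sample with known mean μ, under a conjugate
    Inverse-Gamma(α0, β0) prior. Posterior: IG(α0 + n/2,
    β0 + ½Σ(xᵢ-μ)²); its mean is β/(α-1). Hyperparameters are
    illustrative, not the paper's."""
    x = np.asarray(x, dtype=float)
    alpha = alpha0 + len(x) / 2.0
    beta = beta0 + 0.5 * np.sum((x - mu) ** 2)
    return beta / (alpha - 1.0)

rng = np.random.default_rng(6)
x = rng.normal(0.0, 2.0, 1000)          # true σ² = 4
sigma2_bayes = bayes_variance_estimate(x, mu=0.0)
sigma2_mle = float(np.mean(x ** 2))     # classical MLE with known mean
```

For large n the two estimates agree closely; the prior matters mainly for the small samples on which such comparisons are usually run.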