The Gumbel distribution has been treated with great care by researchers and statisticians. The traditional methods for estimating its two parameters are maximum likelihood, the method of moments, and, more recently, the resampling method known as the jackknife. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are non-traditional methods, such as the nearest-neighbors principle used in computer science and, in particular, in artificial intelligence algorithms, including the genetic algorithm and the artificial neural network, which may be classified as meta-heuristic methods. Moreover, the nearest-neighbors principle has useful statistical features. The objective of this paper is therefore to propose a new algorithm that estimates the parameters of the Gumbel probability distribution directly and overcomes the mathematical difficulties involved without needing the derivative of the likelihood function. A simulation approach is taken as the empirical experiment, in which a hybrid method optimizes the three traditional methods. Comparisons were made between the proposed method and each of the traditional methods mentioned above using the root mean squared error (RMSE) efficiency criterion. In total, 36 experiments were formed from combinations of initial values of the two parameters (λ: shift parameter and θ: scale parameter), each taking three values, with four different sample sizes for each experiment. The proposed algorithm was superior in all simulation combinations and for all sample sizes for both parameters (λ and θ). Among the traditional methods, the method of moments was the best at estimating the shift parameter (λ) and maximum likelihood was the best at estimating the scale parameter (θ).
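As a point of reference for the traditional estimators discussed above, the sketch below compares maximum-likelihood and method-of-moments estimates of the Gumbel parameters on simulated data using RMSE. The use of SciPy, the true parameter values, and the sample size are illustrative assumptions, not the paper's implementation or its proposed algorithm.

```python
# Illustrative sketch (not the paper's algorithm): compare ML and
# method-of-moments estimates of Gumbel parameters by RMSE.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
lam_true, theta_true = 2.0, 1.5        # shift (loc) and scale, assumed values
euler_gamma = 0.57721566490153286      # Euler-Mascheroni constant

def moments_estimate(x):
    """Method of moments: scale from the variance, shift from the mean."""
    theta_hat = np.std(x, ddof=1) * np.sqrt(6) / np.pi
    lam_hat = np.mean(x) - euler_gamma * theta_hat
    return lam_hat, theta_hat

def rmse(estimates, truth):
    return np.sqrt(np.mean((np.asarray(estimates) - truth) ** 2))

n, reps = 50, 500                      # sample size and replications (assumed)
ml_lam, ml_theta, mom_lam, mom_theta = [], [], [], []
for _ in range(reps):
    x = gumbel_r.rvs(loc=lam_true, scale=theta_true, size=n, random_state=rng)
    loc_ml, scale_ml = gumbel_r.fit(x)          # numerical maximum likelihood
    lam_m, theta_m = moments_estimate(x)
    ml_lam.append(loc_ml); ml_theta.append(scale_ml)
    mom_lam.append(lam_m); mom_theta.append(theta_m)

print("RMSE(lambda): ML =", rmse(ml_lam, lam_true), " MoM =", rmse(mom_lam, lam_true))
print("RMSE(theta):  ML =", rmse(ml_theta, theta_true), " MoM =", rmse(mom_theta, theta_true))
```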
Time series analysis is the statistical approach used to analyze a series of data. The time series is the most popular statistical method for forecasting and is widely used in several statistical and economic applications. The wavelet transform is a powerful mathematical technique that converts an analyzed signal into a time-frequency representation, providing signal information in both the time domain and the frequency domain. The aims of this study are to propose a wavelet function derived as the quotient of two different Fibonacci-coefficient polynomials and to compare ARIMA with wavelet-ARIMA models. Daily wind speed time series data are used in this study. From the obtained results, the …
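For readers unfamiliar with the wavelet-ARIMA combination, the sketch below denoises a series with a discrete wavelet transform and then fits ARIMA to the reconstruction. The Daubechies wavelet, the ARIMA order, the synthetic wind-speed series, and the PyWavelets/statsmodels tooling are assumptions for illustration; the study's Fibonacci-polynomial wavelet is not reproduced here.

```python
# Illustrative wavelet-ARIMA sketch (assumed db4 wavelet and ARIMA(1,1,1)).
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
wind_speed = 5 + np.cumsum(rng.normal(0, 0.3, 365))   # placeholder daily series

# 1. Decompose, shrink the detail coefficients, and reconstruct.
coeffs = pywt.wavedec(wind_speed, "db4", level=3)
threshold = np.std(coeffs[-1]) * np.sqrt(2 * np.log(len(wind_speed)))
coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: len(wind_speed)]

# 2. Fit ARIMA to the raw series and to the wavelet-denoised series.
raw_fit = ARIMA(wind_speed, order=(1, 1, 1)).fit()
wav_fit = ARIMA(denoised, order=(1, 1, 1)).fit()
print("AIC raw ARIMA:    ", raw_fit.aic)
print("AIC wavelet-ARIMA:", wav_fit.aic)
```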
The volume of sensitive and important data has increased rapidly in recent decades with the tremendous growth of networking infrastructure and communications. Securing this data becomes necessary as its volume grows; to achieve this, different cipher techniques and methods are used to ensure the goals of security, namely integrity, confidentiality, and availability. This paper presents a proposed hybrid text cryptography method that encrypts sensitive data using several encryption algorithms: Caesar, Vigenère, Affine, and multiplicative ciphers. This hybrid text cryptography method aims to make the encryption process more secure and effective, and it depends on a circular queue. Using circ…
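A minimal sketch of the circular-queue idea follows: a deque of classical ciphers (Caesar, multiplicative, and Affine here; keys are placeholders) is rotated so that successive characters are encrypted by successive ciphers. This illustrates the rotation principle only and is not the paper's exact scheme.

```python
# Minimal sketch of cycling classical ciphers through a circular queue.
# Keys and the per-character rotation policy are illustrative assumptions.
from collections import deque

A2I = {c: i for i, c in enumerate("ABCDEFGHIJKLMNOPQRSTUVWXYZ")}
I2A = {i: c for c, i in A2I.items()}

def caesar(ch, k=3):
    return I2A[(A2I[ch] + k) % 26]

def multiplicative(ch, k=7):            # k must be coprime with 26
    return I2A[(A2I[ch] * k) % 26]

def affine(ch, a=5, b=8):               # a must be coprime with 26
    return I2A[(a * A2I[ch] + b) % 26]

def hybrid_encrypt(plaintext):
    queue = deque([caesar, multiplicative, affine])   # circular queue of ciphers
    out = []
    for ch in plaintext.upper():
        if ch not in A2I:               # leave non-letters unchanged
            out.append(ch)
            continue
        out.append(queue[0](ch))
        queue.rotate(-1)                # next character uses the next cipher
    return "".join(out)

print(hybrid_encrypt("SECRET MESSAGE"))
```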
This paper suggests two recognition methods that depend on extracting features by principal component analysis applied in the wavelet (multi-wavelet) domain. In the first method, the recognition space is enlarged by calculating the eigenstructure of the diagonal sub-image details at five depths of the wavelet transform; the effective eigen range selected here forms the basis for image recognition. In the second method, an invariant wavelet space over all projections is obtained: a new recursive form that represents an invariant space for any image resolution obtained from the wavelet transform is adopted. In this way, all the major problems that affect the image and …
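To make the first method concrete, the sketch below extracts diagonal detail sub-bands over five wavelet levels and computes the eigenstructure of their covariance, keeping the leading eigenvectors as recognition features. The library choice (PyWavelets, NumPy), the wavelet, the image sizes, and the number of retained eigenvectors are assumptions.

```python
# Sketch: eigenstructure (PCA) of diagonal wavelet detail sub-images.
# Five-level decomposition as in the text; data and wavelet are placeholders.
import numpy as np
import pywt

def diagonal_detail_features(image, wavelet="haar", levels=5):
    """Concatenate flattened diagonal detail coefficients across levels."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
    diagonals = [detail[2].ravel() for detail in coeffs[1:]]
    return np.concatenate(diagonals)

rng = np.random.default_rng(2)
images = rng.random((40, 64, 64))                      # placeholder image set
X = np.vstack([diagonal_detail_features(img) for img in images])
X -= X.mean(axis=0)

# Eigenstructure of the feature covariance matrix (PCA).
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
basis = eigvecs[:, order[:10]]                         # "effective eigen range"
projected = X @ basis                                  # features for recognition
print(projected.shape)
```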
The research aims to identify the reasons that lead to information asymmetry between the management of an economic unit and the parties that use accounting information, such as shareholders, and thus to reach solutions that would reduce this problem. These factors are divided into two types. The first is internal factors, which represent management's desire to expand its self-interest by obtaining profits and increasing the value and competitiveness of the entity, and investors' desire to obtain greater returns on their shares. The second is external factors, which represent failures that occur in the laws and regulations …
This paper describes a number of new interleaving strategies based on the golden section. The new interleavers are called golden relative prime interleavers, golden interleavers, and dithered golden interleavers. The latter two approaches involve sorting a real-valued vector derived from the golden section. Random and so-called “spread” interleavers are also considered. Turbo-code performance results are presented and compared for the various interleaving strategies. Of the interleavers considered, the dithered golden interleaver typically provides the best performance, especially for low code rates and large block sizes. The golden relative prime interleaver is shown to work surprisingly well for high puncture rates. These interleavers …
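The golden relative prime interleaver mentioned above can be sketched as follows: the read increment is chosen near the block length divided by the golden ratio and then adjusted to be relatively prime to the block length N, giving the permutation π(i) = (s + i·p) mod N. The starting index and the nearest-coprime adjustment rule below are illustrative choices rather than the paper's exact construction.

```python
# Sketch of a golden relative prime interleaver: index increment chosen
# close to N / golden ratio and forced to be coprime with N.
import math

GOLDEN = (1 + math.sqrt(5)) / 2          # ~1.618

def golden_relative_prime_interleaver(n, start=0):
    p = round(n / GOLDEN)                # ideal "golden" increment
    # Search outward for the nearest increment relatively prime to n.
    offset = 0
    while math.gcd(p + offset, n) != 1 and math.gcd(p - offset, n) != 1:
        offset += 1
    p = p + offset if math.gcd(p + offset, n) == 1 else p - offset
    return [(start + i * p) % n for i in range(n)]

pi = golden_relative_prime_interleaver(16)
print(pi)                                 # a permutation of 0..15
assert sorted(pi) == list(range(16))
```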
The aim of this paper is to design an artificial neural network (ANN) as an alternative, accurate tool for estimating the concentration of cadmium in contaminated soils at any depth and time. First, fifty soil samples were harvested from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad city, Iraq. Second, a series of measurements was performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of Cd in the soil at depth x and time t. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with traditional laboratory inspection …
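As a rough illustration of the ANN setup (not the authors' trained network), the sketch below fits a small multilayer perceptron mapping depth, time, and one soil parameter to a concentration value using scikit-learn; the feature names, network size, and synthetic data are assumptions.

```python
# Illustrative ANN regression sketch with synthetic data standing in for
# the soil measurements (depth, time, soil parameter -> Cd concentration).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
n = 200
depth = rng.uniform(0, 60, n)              # cm, placeholder range
time = rng.uniform(0, 90, n)               # days, placeholder range
ph = rng.uniform(6.5, 8.5, n)              # an assumed soil parameter
cd = 10 * np.exp(-0.02 * time) + 0.05 * depth + rng.normal(0, 0.3, n)

X = np.column_stack([depth, time, ph])
X_tr, X_te, y_tr, y_te = train_test_split(X, cd, test_size=0.25, random_state=0)

scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)

pred = model.predict(scaler.transform(X_te))
print("Test MSE:", mean_squared_error(y_te, pred))
```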
In this research, some robust nonparametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion under different sample sizes, levels of variance, and contamination (pollution) rates, with three different models. The methods are S-LLS (S-estimation with local linear smoothing), M-LLS (M-estimation with local linear smoothing), S-NW (S-estimation with Nadaraya-Watson smoothing), and M-NW (M-estimation with Nadaraya-Watson smoothing).
The results for the first model proved that the S-LLS method was the best for large sample sizes, while for small sample sizes the …
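For context, the sketch below implements a plain Nadaraya-Watson smoother and an iteratively reweighted, Huber-weighted variant in the spirit of M-NW; the Gaussian kernel, bandwidth, tuning constant, and contaminated test data are illustrative choices, not those used in the study.

```python
# Sketch: Nadaraya-Watson smoothing and an iteratively reweighted
# M-type (Huber) variant, in the spirit of the M-NW estimator.
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def nw_estimate(x0, x, y, h, robust_weights=None):
    w = gaussian_kernel((x0 - x) / h)
    if robust_weights is not None:
        w = w * robust_weights
    return np.sum(w * y) / np.sum(w)

def m_nw_estimate(x0, x, y, h, c=1.345, iters=10):
    """Huber-weighted Nadaraya-Watson: downweight large residuals."""
    m = nw_estimate(x0, x, y, h)
    for _ in range(iters):
        r = y - m
        s = np.median(np.abs(r)) / 0.6745 + 1e-12       # robust scale (MAD)
        u = np.abs(r) / s
        huber_w = np.where(u <= c, 1.0, c / u)          # Huber weights
        m = nw_estimate(x0, x, y, h, robust_weights=huber_w)
    return m

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 100)
y[::10] += 3.0                                          # inject outliers
grid = np.linspace(0, 1, 5)
print([round(m_nw_estimate(g, x, y, h=0.1), 3) for g in grid])
```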
Survival analysis is one of the modern methods of analysis; it is based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and common survival models. It consists of two parts: a parametric function that does not depend on the survival time and a nonparametric function that depends on the survival times, which is why the Cox model is described as a semi-parametric model. In contrast, the set of fully parametric models depends on the parameters of the time-to-event distribution, such as …
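A minimal example of fitting the Cox semi-parametric model follows, using the lifelines package (an assumed tooling choice, not named in the text) on a synthetic data frame with a duration column, an event indicator, and one covariate.

```python
# Minimal Cox proportional hazards sketch with synthetic data; the
# lifelines package and column names are assumptions for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 300
age = rng.normal(50, 10, n)
# Exponential event times whose rate increases with age (illustrative).
rate = 0.01 * np.exp(0.03 * (age - 50))
event_time = rng.exponential(1 / rate)
censor_time = rng.exponential(100, n)

df = pd.DataFrame({
    "T": np.minimum(event_time, censor_time),       # observed time
    "E": (event_time <= censor_time).astype(int),   # 1 = event, 0 = censored
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()                                  # hazard ratio for age
```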
In this research, several estimators are introduced. These estimators are closely related to the hazard function and use one of the nonparametric methods, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results proved that the local bandwidth is the best for all types of boundary kernel functions …
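The basic construction behind such estimators can be sketched as a kernel-smoothed hazard for right-censored data: each observed event contributes a kernel bump scaled by the inverse of the risk-set size (a smoothed Nelson-Aalen increment). The Epanechnikov kernel and the single global bandwidth below are illustrative; the paper's boundary corrections and local bandwidths are not reproduced.

```python
# Sketch: kernel-smoothed hazard estimate for right-censored data
# (Epanechnikov kernel, global bandwidth).
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                 # risk-set size at each ordered time
    hazard = np.zeros_like(t_grid, dtype=float)
    for tj, dj, yj in zip(times, events, at_risk):
        if dj == 1:                            # only events contribute
            hazard += epanechnikov((t_grid - tj) / bandwidth) / (bandwidth * yj)
    return hazard

rng = np.random.default_rng(6)
true_times = rng.exponential(10, 200)
censor = rng.exponential(15, 200)
obs = np.minimum(true_times, censor)
delta = (true_times <= censor).astype(int)

grid = np.linspace(1, 20, 5)
print(np.round(kernel_hazard(grid, obs, delta, bandwidth=2.0), 4))
```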
It is very difficult to obtain the value of rock strength along the wellbore. Rock strength values are used to perform different analyses, for example, preventing wellbore failure, deciding a completion design, and controlling sand production. In this study, sonic log data from wells BU-50 and BU-47 at the Buzurgan oil field were used. Five formations were studied (Mishrif, Sadia, Middle Lower Kirkuk, Upper Kirkuk, and Jaddala). First, the unconfined compressive strength (UCS) was calculated for each formation using a sonic log method. Then, the confined compressive rock strength was derived from the UCS by including the effect of bore and hydrostatic pressure for each formation. Evaluations th…
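The workflow can be outlined as in the sketch below: an empirical correlation converts compressional sonic transit time to UCS, and a confinement term then raises it to a confined strength. The correlation coefficients and the linear confinement adjustment are generic placeholders, not the calibrated relations used for the Buzurgan formations.

```python
# Sketch of the sonic-log workflow: UCS from compressional transit time via
# a generic exponential correlation, then a confined strength estimate.
# All coefficients are placeholders, not calibrated to the Buzurgan field.
import numpy as np

def ucs_from_sonic(dtc_us_per_ft, a=1200.0, b=-0.036):
    """Generic UCS correlation of the form UCS = a * exp(b * DTC), in MPa."""
    return a * np.exp(b * dtc_us_per_ft)

def confined_strength(ucs_mpa, confining_pressure_mpa, k=3.0):
    """Simple linear confinement adjustment (placeholder strengthening factor k)."""
    return ucs_mpa + k * confining_pressure_mpa

dtc = np.array([70.0, 85.0, 100.0])          # example transit times, us/ft
pc = np.array([10.0, 12.0, 15.0])            # example confining pressures, MPa

ucs = ucs_from_sonic(dtc)
ccs = confined_strength(ucs, pc)
for d, u, c in zip(dtc, ucs, ccs):
    print(f"DTC={d:5.1f} us/ft  UCS={u:6.1f} MPa  confined={c:6.1f} MPa")
```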