The Gumbel distribution has received considerable attention from researchers and statisticians. Traditional methods for estimating its two parameters include Maximum Likelihood, the Method of Moments and, more recently, the Jackknife resampling method. These methods, however, suffer from mathematical difficulties when solved analytically. Accordingly, there are non-traditional methods, such as the nearest-neighbors principle used in computer science and particularly in artificial intelligence, alongside the genetic algorithm, artificial neural networks, and other techniques that may be classified as meta-heuristics. The nearest-neighbors principle also has useful statistical properties. The objective of this paper is therefore to propose a new algorithm that estimates the parameters of the Gumbel probability distribution directly and overcomes the mathematical difficulties without requiring the derivative of the likelihood function. A simulation study serves as the empirical experiment, in which a hybrid method optimizes the three traditional methods. Comparisons were made between the proposed method and each of the traditional methods mentioned above using the Root Mean Squared Error (RMSE) efficiency criterion. In total, 36 experiments were run, combining three initial values of each of the two parameters (λ: shift parameter and θ: scale parameter) with four different sample sizes per experiment. The proposed algorithm was superior in all simulation combinations and for all sample sizes for both parameters (λ and θ). Among the traditional methods, the Method of Moments was the best at estimating the shift parameter (λ) and Maximum Likelihood was the best at estimating the scale parameter (θ).
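As an illustration of the kind of comparison described above (not the paper's proposed algorithm), the following is a minimal Python sketch that estimates the Gumbel shift and scale parameters by the Method of Moments on simulated samples and measures accuracy with RMSE; the true parameter values, sample size, and replication count are placeholders.

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def moments_estimate(sample):
    """Method-of-Moments estimates for the Gumbel distribution (lambda: shift, theta: scale).
    Uses E[X] = lambda + gamma*theta and Var[X] = (pi^2 / 6) * theta^2."""
    theta_hat = np.std(sample, ddof=1) * np.sqrt(6) / np.pi
    lambda_hat = np.mean(sample) - EULER_GAMMA * theta_hat
    return lambda_hat, theta_hat

def rmse(estimates, true_value):
    """Root Mean Squared Error of a set of estimates around the true value."""
    estimates = np.asarray(estimates)
    return np.sqrt(np.mean((estimates - true_value) ** 2))

# Placeholder simulation settings (the paper uses 36 combinations and four sample sizes).
rng = np.random.default_rng(0)
true_lambda, true_theta, n, replications = 2.0, 1.5, 50, 1000

lam_hats, theta_hats = [], []
for _ in range(replications):
    sample = rng.gumbel(loc=true_lambda, scale=true_theta, size=n)
    lam_hat, theta_hat = moments_estimate(sample)
    lam_hats.append(lam_hat)
    theta_hats.append(theta_hat)

print("RMSE(lambda):", rmse(lam_hats, true_lambda))
print("RMSE(theta):", rmse(theta_hats, true_theta))
```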
Improving" Jackknife Instrumental Variable Estimation method" using A class of immun algorithm with practical application
This research deals with reliability, defined as the probability that a part of a system accomplishes its function within a specified time and under the same circumstances. On the theoretical side, the reliability, the reliability function, and the cumulative failure function are studied under the one-parameter Rayleigh distribution. The research aims to identify factors that are missed in reliability evaluation and that cause constant interruptions of the machines, in addition to problems with the data. The research problem is that, although there are many methods for estimating the reliability function, most of them lack suitable qualifications for the data such
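For reference, the sketch below gives the reliability and cumulative-failure functions of the one-parameter Rayleigh distribution with scale θ, under the standard parameterization R(t) = exp(-t²/(2θ²)); this is an assumed parameterization, since the abstract does not state the formulas, and the numeric values are placeholders.

```python
import math

def rayleigh_reliability(t, theta):
    """Reliability R(t) = exp(-t^2 / (2*theta^2)) for the one-parameter Rayleigh
    distribution with scale parameter theta (standard parameterization, assumed)."""
    return math.exp(-t**2 / (2 * theta**2))

def rayleigh_cumulative_failure(t, theta):
    """Cumulative failure function (CDF): F(t) = 1 - R(t)."""
    return 1.0 - rayleigh_reliability(t, theta)

# Example: survival probability past t = 100 hours when theta = 80 (placeholder values).
print(rayleigh_reliability(100.0, 80.0))
print(rayleigh_cumulative_failure(100.0, 80.0))
```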
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. These problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (on, off) of the client computers and which user is working on each of them. A file system watcher mechanism is used to indicate any change in files. The results were presente
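The abstract does not describe the implementation of the watcher; as a rough illustration of a file-system-watching mechanism (not the paper's agent code), the following standard-library Python sketch polls a directory and reports created, deleted, or modified files. The path and polling interval are placeholders.

```python
import os
import time

def snapshot(directory):
    """Map each regular file in the directory to its last-modification time."""
    return {
        name: os.path.getmtime(os.path.join(directory, name))
        for name in os.listdir(directory)
        if os.path.isfile(os.path.join(directory, name))
    }

def watch(directory, interval=2.0):
    """Poll the directory and print any created, deleted, or modified files."""
    previous = snapshot(directory)
    while True:
        time.sleep(interval)
        current = snapshot(directory)
        for name in current.keys() - previous.keys():
            print("created:", name)
        for name in previous.keys() - current.keys():
            print("deleted:", name)
        for name in current.keys() & previous.keys():
            if current[name] != previous[name]:
                print("modified:", name)
        previous = current

# Example usage (placeholder path): watch("/var/log/monitored")
```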
This research deals with a very important subject, as it tries to change the theoretical and scientific heritage and some professional rules adopted in the newsroom. Most media students have difficulties in writing press news correctly. The researcher tries to identify the compatibility of what is published by local news agencies with professional and academic standards.
The research establishes detailed editorial rules for a number of news formats, which will play an important role in making press news easier to write, especially for beginners and newcomers. It also reveals a new fact that contradicts the belief of some researchers and writers that news edited according to the inverted-pyramid pattern has no conclusion.
The re
In this paper, the researcher suggests using the genetic algorithm method to estimate the parameters of the Wiener degradation process, which is based on the Wiener process, in order to estimate the reliability of high-efficiency products, since it is difficult to estimate their reliability using traditional techniques that depend only on the products' failure times. Monte Carlo simulation was applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with the maximum likelihood estimation method. The results show that the genetic algorithm method is the best according to the AMSE comparison criterion, then the reliab
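As a rough illustration of the underlying model (not the paper's genetic-algorithm estimator), the sketch below simulates a linear Wiener degradation path X(t) = μt + σB(t), estimates the drift and diffusion parameters from the observed increments by maximum likelihood, and evaluates reliability as the probability that the path has not yet crossed a failure threshold D, using the standard inverse-Gaussian first-passage result; all numeric values are placeholders.

```python
import math
import numpy as np

def simulate_degradation(mu, sigma, dt, steps, rng):
    """Simulate increments of X(t) = mu*t + sigma*B(t) on a regular time grid."""
    return rng.normal(mu * dt, sigma * math.sqrt(dt), size=steps)

def mle_wiener(increments, dt):
    """Maximum-likelihood estimates of drift and diffusion from the increments."""
    mu_hat = increments.sum() / (len(increments) * dt)
    sigma2_hat = np.mean((increments - mu_hat * dt) ** 2 / dt)
    return mu_hat, math.sqrt(sigma2_hat)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability(t, mu, sigma, threshold):
    """R(t) = P(first passage time > t); the first-passage time of a Wiener process
    with drift to a fixed threshold follows an inverse Gaussian distribution."""
    a = (threshold - mu * t) / (sigma * math.sqrt(t))
    b = -(threshold + mu * t) / (sigma * math.sqrt(t))
    return normal_cdf(a) - math.exp(2 * mu * threshold / sigma**2) * normal_cdf(b)

# Placeholder settings: drift 0.05, diffusion 0.1, 200 unit-time observations.
rng = np.random.default_rng(1)
increments = simulate_degradation(mu=0.05, sigma=0.1, dt=1.0, steps=200, rng=rng)
mu_hat, sigma_hat = mle_wiener(increments, dt=1.0)
print("estimates:", mu_hat, sigma_hat)
print("R(t=50):", reliability(50.0, mu_hat, sigma_hat, threshold=10.0))
```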
In this paper, the parameter and the system reliability of a stress-strength model are estimated when the system contains several parallel components whose strengths are subject to a common stress, in the case where the stress and the strengths follow the Generalized Inverse Rayleigh distribution, using different Bayesian estimation methods. A Monte Carlo simulation is introduced to compare the proposed methods based on the Mean Squared Error criterion.
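As a generic illustration of the quantity being estimated (not the paper's Bayesian procedure), the sketch below approximates the reliability of a parallel system under common stress, R = P(at least one component strength exceeds the stress), by Monte Carlo; the samplers are placeholders standing in for the Generalized Inverse Rayleigh distributions used in the paper.

```python
import numpy as np

def system_reliability_mc(draw_stress, draw_strengths, k, n_sims=100_000, seed=0):
    """Monte Carlo estimate of R = P(max of k component strengths > common stress)
    for a parallel system; draw_stress and draw_strengths are placeholder samplers."""
    rng = np.random.default_rng(seed)
    stress = draw_stress(rng, n_sims)                 # shape (n_sims,)
    strengths = draw_strengths(rng, (n_sims, k))      # shape (n_sims, k)
    system_works = strengths.max(axis=1) > stress     # parallel: one survivor suffices
    return system_works.mean()

# Placeholder samplers (Weibull stand-ins, NOT the Generalized Inverse Rayleigh):
stress_sampler = lambda rng, size: rng.weibull(2.0, size)
strength_sampler = lambda rng, size: 1.5 * rng.weibull(2.0, size)

print(system_reliability_mc(stress_sampler, strength_sampler, k=3))
```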
Absence or hypoplasia of the internal carotid artery (ICA) is a rare congenital anomaly that is mostly unilateral and highly associated with other intracranial vascular anomalies, of which saccular aneurysm is the most common. Blood flow to the circulation of the affected side is maintained by collateral pathways, some of which include the anterior communicating artery (Acom) as part of their anatomy. Therefore, temporary clipping during microsurgery on Acom aneurysms in patients with unilateral ICA anomalies could jeopardize these collaterals and place the patient at risk of ischemic damage. In this paper, we review the literature on cases with a unilaterally absent ICA associa