The Weibull distribution is one of the Generalized Extreme Value (GEV) family of distributions (its Type-III case), and it plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong role in estimating the parameters of the GEV distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared-error and the linear-exponential (LINEX) loss functions; they were adopted and compared by Monte Carlo simulation. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimation.
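Given posterior draws of a parameter, the two loss functions mentioned above lead to different Bayes estimates: the posterior mean under squared-error loss, and -(1/a) ln E[exp(-a*theta)] under the LINEX loss with shape a. A minimal sketch, using an illustrative gamma posterior standing in for whatever posterior the paper's shrinkage priors would actually produce:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: posterior samples for a scale parameter theta,
# drawn here from a gamma posterior purely for illustration.
theta_post = rng.gamma(shape=5.0, scale=0.4, size=100_000)

# Squared-error loss (SEL): the Bayes estimator is the posterior mean.
theta_sel = theta_post.mean()

# LINEX loss L(d) = exp(a*d) - a*d - 1 with d = estimate - theta:
# the Bayes estimator is -(1/a) * log E[exp(-a*theta)].
a = 1.0
theta_linex = -np.log(np.mean(np.exp(-a * theta_post))) / a

print(f"SEL estimate:   {theta_sel:.3f}")
print(f"LINEX estimate: {theta_linex:.3f}")
```

For a > 0 the LINEX loss penalizes overestimation more heavily, so by Jensen's inequality the LINEX estimate is always at most the posterior mean.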
Abstract
The problem of missing data represents a major obstacle for researchers in the process of data analysis, since it recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of this problem in the data under study may affect the analysis negatively and lead to misleading conclusions, since such conclusions suffer from the large bias that it introduces. Despite the efficiency of wavelet methods, they too are affected by missing data, which also reduces the accuracy of estimation.
Abstract
The radial density distribution function of one particle, D(r1), was calculated for the principal orbitals of the carbon atom and of carbon-like ions (N+ and B−) using the partitioning technique. Results are presented for the K and L shells of the carbon atom, the negative boron ion, and the positive nitrogen ion. We observe that as the atomic number increases, the probability of finding electrons near the nucleus increases and the location r1 of the maximum decreases. In this research, the Hartree–Fock wavefunctions were computed using the Mathcad software.
Abstract
The Non-Homogeneous Poisson Process (NHPP) is a statistical subject of importance to other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others; it is also used to model phenomena that do not occur at a fixed rate over time (events whose intensity changes with time).
This research covers some of the basic concepts related to the NHPP and applies two of its models, the power-law model and the Musa–Okumoto model, to estimate the…
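For the power-law model specifically (intensity λ(t) = αβt^(β−1), the Crow–AMSAA form), the time-truncated maximum-likelihood estimates have a closed form: β̂ = n / Σᵢ ln(T/tᵢ) and α̂ = n / T^β̂. A hedged sketch, with illustrative parameter values that are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def power_law_nhpp_mle(times, T):
    """Time-truncated MLE for the power-law (Crow-AMSAA) NHPP with
    mean function m(t) = alpha * t**beta and intensity
    lambda(t) = alpha * beta * t**(beta - 1)."""
    t = np.asarray(times, dtype=float)
    n = len(t)
    beta = n / np.log(T / t).sum()
    alpha = n / T**beta
    return alpha, beta

# Simulate one realisation: N ~ Poisson(m(T)); given N, the event times
# are iid with CDF (t/T)**beta on (0, T].
alpha_true, beta_true, T = 2.0, 1.5, 100.0
n = rng.poisson(alpha_true * T**beta_true)
times = np.sort(T * rng.uniform(size=n) ** (1.0 / beta_true))

alpha_hat, beta_hat = power_law_nhpp_mle(times, T)
print(alpha_hat, beta_hat)
```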
Abstract
((O Mohammed, you are the Prophet of God and I am Gabriel)). With this heavenly call, which Mohammed, the Messenger of God (peace be upon him), received when he left the cave of Hiraa, and after the revelation of the verse ((Read in the name of your Lord)), a new period of mankind's history began. From that time, the greatest state was established. There was no public treasury and no public financial resources at that time, and Abu Bakr (may God be pleased with him) spent a great deal of money to support the costs of the new mission. After the Hijra, the foundations for establishing the Islamic state were in place, but it lacked administrative and financial organization. The Prophet was therefore very keen to find an Islamic system that ensures justice and the availability of…
Abstract
In this paper, the maximum likelihood estimates for the parameters of the two-parameter Weibull distribution are studied, along with White's estimators, the Bain and Antle estimators, and the Bayes estimator for the scale parameter. Simulation procedures are used to obtain the estimators and to compare them using the MSE. The methods are also applied to data for 20 patients suffering from headache disease.
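In Python, a maximum-likelihood fit of the two-parameter Weibull can be sketched with SciPy; the sample below is simulated purely for illustration and is not the patients' data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical lifetimes drawn from a Weibull with shape 1.5 and scale 10.
data = stats.weibull_min.rvs(c=1.5, scale=10.0, size=200, random_state=rng)

# Maximum likelihood fit of the two-parameter Weibull (location fixed at 0,
# so only shape and scale are estimated).
shape_hat, loc, scale_hat = stats.weibull_min.fit(data, floc=0)
print(shape_hat, scale_hat)
```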
Abstract
Encryption of data means translating data into another shape or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 cipher, with a proposed adaptive key, against RSA encryption of video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying CAST-128 and…
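The entropy measure referred to here is the Shannon entropy of the image's gray-level histogram, H = −Σ pₖ log₂ pₖ, which reaches its maximum of 8 bits/pixel when all 256 levels of an 8-bit image occur uniformly. A minimal sketch:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image:
    H = -sum_k p_k * log2(p_k) over the 256 gray levels."""
    hist = np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                     # 0 * log2(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

flat = np.full((64, 64), 128, dtype=np.uint8)   # a single gray level
noisy = np.random.default_rng(3).integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(flat), image_entropy(noisy))
```

A constant image has zero entropy, while a uniformly random 8-bit image approaches the 8-bit maximum, which is why higher entropy of a ciphertext frame is read as better diffusion of the pixel values.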
Abstract
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study.
A probability distribution is an appropriate choice when the data lie on, or close to, the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
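Outside Minitab, the same fit-then-test workflow can be sketched with SciPy: fit each candidate lifetime distribution by maximum likelihood, then apply a goodness-of-fit test to the fitted model. The data and the candidate list below are illustrative, and note that Kolmogorov–Smirnov p-values are optimistic when the parameters were estimated from the same data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical operating times standing in for the case-study data
# (true model: Weibull with shape 2 and scale 50).
times = stats.weibull_min.rvs(c=2.0, scale=50.0, size=100, random_state=rng)

# Fit each candidate by MLE (location fixed at 0) and record the
# Kolmogorov-Smirnov distance between data and fitted model: smaller is better.
results = {}
for dist in (stats.weibull_min, stats.lognorm, stats.expon):
    params = dist.fit(times, floc=0)
    results[dist.name] = stats.kstest(times, dist.cdf, args=params).statistic
    print(f"{dist.name:12s} KS statistic = {results[dist.name]:.3f}")
```

The candidate whose fitted model lies closest to the empirical distribution (smallest KS distance, points nearest the probability-plot line) is the one chosen.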
Abstract
This research applies non-parametric methods to estimate the conditional survival function: the Turnbull method and a generalization of Turnbull's method, using interval-censored data on breast cancer with two types of treatment (chemotherapy and radiotherapy) and age as a continuous variable. The estimation algorithms were implemented in MATLAB, and the mean squared error (MSE) was then used to assess the estimates. The results favoured the generalization of Turnbull's method for estimating the conditional survival function for both treatments; the estimated survival of the patients does not show very large differences.
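Turnbull's estimator can be sketched as a self-consistency (EM) iteration: each observation is known only to lie in an interval [Lᵢ, Rᵢ], and probability mass on a candidate grid is redistributed until it stabilises. A small illustrative implementation on toy values, not the breast-cancer data:

```python
import numpy as np

def turnbull(left, right, grid, n_iter=500):
    """Self-consistency (Turnbull, 1976) EM sketch for interval-censored data.
    Each observation is known only to lie in [left_i, right_i]; probability
    mass is restricted to the candidate support points in `grid`."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    # A[i, j] = 1 if grid point j lies inside observation i's interval.
    A = (grid >= left[:, None]) & (grid <= right[:, None])
    p = np.full(len(grid), 1.0 / len(grid))
    for _ in range(n_iter):
        denom = A @ p                                 # P(interval i) under p
        p = (A * p).T @ (1.0 / denom) / len(left)     # E-step + M-step
    return p

# Toy interval-censored data; exact observations have left == right.
left  = [1, 1, 2, 3, 3, 5]
right = [2, 1, 4, 3, 6, 5]
grid = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
p = turnbull(left, right, grid)
surv = 1.0 - np.cumsum(p)      # estimated survival function on the grid
print(np.round(p, 3))
```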