The analysis of survival and reliability is among the important topics and methods of vital statistics at the present time because of its importance in the demographic, medical, industrial, and engineering fields. This research generates random sample data from the generalized gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Because the cumulative distribution function of the GG distribution involves the incomplete gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and then used to estimate the survival function. The survival function was estimated by Monte Carlo simulation. The entropy method was used for assessment, estimation, and fitting, alongside the classical method. The best estimation method was identified using two comparison criteria: the root mean square error (RMSE) and the mean absolute percentage error (MAPE). Sample sizes were selected as (n = 18, 30, 50, 81), where n = 18 represents the five-year age groups of the phenomenon under study and n = 81 represents the single-year age groups, and the experiment was replicated 500 times.
The simulation results showed that the maximum likelihood method is best for small and medium-sized samples; it was applied to the five-year age-group data, which suffer from disturbances and distortion, of the Iraq Household Socio-Economic Survey (IHSES II 2012). The entropy method outperformed for large samples; it was applied to single-year age groups obtained by a mathematical method based on Sprague's interpolation formula. Its coefficients, the so-called Sprague multipliers, are used to derive the numbers of deaths and of population by single years of age within a given five-year age group, using the deaths and population counts of that group and its neighbouring five-year groups; the calculations were carried out in Excel. Single-year age-group data were used for accuracy, so that no age exposed to the risk of mortality is overlooked.
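The sampling step can be sketched in Python: a minimal illustration of the inverse transformation method for the GG distribution, assuming the Stacy parameterization F(t) = P(k, (t/θ)^p) (the paper's exact parameterization is not shown here), with the regularized incomplete gamma P(a, x) approximated numerically by its power series and the CDF inverted by bisection:

```python
import math
import random

def reg_lower_gamma(a, x, terms=400):
    """Regularized lower incomplete gamma P(a, x) via its power series
    (the numerical approximation the ITM needs, since the GG CDF has
    no closed form)."""
    if x <= 0.0:
        return 0.0
    total, term = 0.0, 1.0 / a
    for n in range(terms):
        total += term
        term *= x / (a + n + 1)
    return total * math.exp(a * math.log(x) - x - math.lgamma(a))

def gg_cdf(t, k, p, theta):
    """CDF of the generalized gamma (Stacy form): F(t) = P(k, (t/theta)^p)."""
    return reg_lower_gamma(k, (t / theta) ** p)

def gg_survival(t, k, p, theta):
    return 1.0 - gg_cdf(t, k, p, theta)

def gg_itm(u, k, p, theta, iters=80):
    """Inverse Transformation Method: solve F(t) = u for t by bisection."""
    hi = 1.0
    while gg_cdf(hi, k, p, theta) < u:
        hi *= 2.0
    lo = 0.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if gg_cdf(mid, k, p, theta) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Monte Carlo estimate of the survival function at t0
random.seed(1)
k, p, theta = 2.0, 1.0, 1.0   # p = 1 reduces the GG to a plain gamma
sample = [gg_itm(random.random(), k, p, theta) for _ in range(5000)]
t0 = 2.0
s_hat = sum(t > t0 for t in sample) / len(sample)   # empirical S(t0)
```

The empirical estimate `s_hat` can then be compared against the analytic `gg_survival` over replicated experiments, which is the kind of Monte Carlo comparison the study performs.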
This paper presents a hybrid software copy-protection scheme that prevents illegal copying of software by producing a license key that is unique and easy to generate. The work exploits the uniqueness of the hard-disk identifier of a personal computer, which can be obtained by software, to create a license key after processing the identifier with the SHA-1 one-way hash function. Two main measures are used to evaluate the proposed method: complexity and processing time. SHA-1 ensures high complexity, denying attackers the ability to produce unauthorized copies, and many experiments were executed on software of different sizes to measure the time consumed. The measures show high complexity and short execution time for the proposed method.
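The key-generation step can be sketched as follows; retrieving the hard-disk identifier is platform-specific (e.g. via system APIs) and is replaced here by a hypothetical serial string, and the grouping of the digest into blocks is an illustrative formatting choice rather than the paper's exact layout:

```python
import hashlib

def make_license_key(disk_id: str) -> str:
    """Hash the hard-disk identifier with SHA-1 and format the
    160-bit digest as a license key (8 blocks of 5 hex digits)."""
    digest = hashlib.sha1(disk_id.encode("utf-8")).hexdigest().upper()
    return "-".join(digest[i:i + 5] for i in range(0, len(digest), 5))

# The same machine (same disk identifier) always yields the same key;
# any other identifier yields a completely different key.
key = make_license_key("WD-WCC4N0123456")  # hypothetical serial number
```

Because SHA-1 is one-way, recovering a valid identifier from an observed key, or forging a key for another disk, requires attacking the hash function itself, which is the source of the scheme's complexity measure.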
After Zadeh introduced the concept of the z-number, scientists in various fields have shown keen interest in applying it in a variety of applications. In applications of z-numbers, a ranking procedure is essential for comparing two z-numbers. While a few ranking functions have already been proposed in the literature, there is a need to develop further good ranking functions. In this paper, a novel ranking function for z-numbers is proposed: the Momentum Ranking Function (MRF). Game-theoretic problems whose payoff matrix elements are z-numbers are also considered, and the application of the momentum ranking function to such problems is demonstrated.
This paper presents a statistical study of a suitable distribution of rainfall in the provinces of Iraq using two types of distributions for the period 2005-2015. The researcher fitted the suggested log-normal distribution and mixed exponential distribution to the data of each province in order to determine the optimal distribution of rainfall in Iraq. The distribution is selected on the basis of the minimum values of several goodness-of-fit criteria, namely the consistent Akaike information criterion (CAIC), the Bayesian information criterion (BIC), and the Akaike information criterion (AIC). The distributions were applied to find the appropriate distribution for the rainfall data of the provinces of Iraq, using maximum likelihood estimation.
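The model-selection step can be sketched in Python; for brevity a plain exponential stands in for the mixed exponential, the rainfall data are synthetic, and the criterion formulas (AIC, BIC, CAIC) are the standard ones:

```python
import math
import random

def lognormal_fit(xs):
    """MLE of the log-normal: mu, sigma are the mean and sd of log(x)."""
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    ll = sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
             - (math.log(x) - mu) ** 2 / (2 * sigma ** 2) for x in xs)
    return ll, 2  # log-likelihood, number of parameters

def exponential_fit(xs):
    """MLE of the exponential: rate = 1 / sample mean."""
    lam = len(xs) / sum(xs)
    ll = sum(math.log(lam) - lam * x for x in xs)
    return ll, 1

def criteria(ll, k, n):
    """AIC, BIC and CAIC from a maximized log-likelihood."""
    aic = 2 * k - 2 * ll
    bic = k * math.log(n) - 2 * ll
    caic = k * (math.log(n) + 1) - 2 * ll
    return aic, bic, caic

random.seed(7)
rain = [random.lognormvariate(3.0, 0.8) for _ in range(120)]  # synthetic data
aic_ln, bic_ln, caic_ln = criteria(*lognormal_fit(rain), len(rain))
aic_ex, bic_ex, caic_ex = criteria(*exponential_fit(rain), len(rain))
# the distribution attaining the smaller criterion values is preferred
```

Whichever distribution attains the minimum criterion values is retained as the rainfall model for that province.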
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, applying logistic regression and exploiting its predictive capabilities in environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly take into account the within-subject interdependence of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
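As an illustration of the conditional-likelihood idea (not the paper's exact longitudinal method): in the 1:1 matched-pairs case, conditioning on which member of each pair is the case reduces the model to a logistic regression on within-pair covariate differences with no intercept, which eliminates the pair-specific effects that would otherwise induce within-subject dependence. A minimal sketch with one covariate and synthetic data:

```python
import math
import random

def fit_conditional_logit(case_x, control_x, steps=2000, lr=0.05):
    """1:1 matched-pairs conditional logistic regression: each pair's
    conditional likelihood is sigmoid(beta * (x_case - x_control)), so we
    run gradient ascent on the covariate differences (no intercept)."""
    diffs = [c - k for c, k in zip(case_x, control_x)]
    beta = 0.0
    for _ in range(steps):
        # gradient of sum(log sigmoid(beta * d)) is sum(d * (1 - sigmoid(beta * d)))
        grad = sum(d * (1.0 - 1.0 / (1.0 + math.exp(-beta * d))) for d in diffs)
        beta += lr * grad / len(diffs)
    return beta

# synthetic pairs: within each pair, the subject with the higher exposure
# is more likely to be the case (true beta = 1.5)
random.seed(3)
true_beta = 1.5
cases, controls = [], []
for _ in range(400):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    p1 = math.exp(true_beta * x1) / (math.exp(true_beta * x1) + math.exp(true_beta * x2))
    if random.random() < p1:
        cases.append(x1)
        controls.append(x2)
    else:
        cases.append(x2)
        controls.append(x1)
beta_hat = fit_conditional_logit(cases, controls)
```

Conditioning away the pair effects is what keeps the test statistics valid despite the dependence within each matched set.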
In this paper, some estimators for the reliability function R(t) of the Basic Gompertz (BG) distribution are obtained, namely the maximum likelihood estimator and Bayesian estimators under the general entropy loss function, assuming a non-informative prior (the Jeffreys prior) and informative priors represented by the gamma and inverted Levy priors. A Monte Carlo simulation is conducted to compare the performance of all estimates of R(t) on the basis of the integrated mean squared error.
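The ML side of such a comparison can be sketched as follows, assuming the one-parameter Basic Gompertz form with CDF F(t) = 1 − exp(−θ(e^t − 1)), so that R(t) = exp(−θ(e^t − 1)); the Bayesian estimators under the general entropy loss are omitted:

```python
import math
import random

# Assumed Basic Gompertz form: F(t) = 1 - exp(-theta*(exp(t) - 1)).

def bg_draw(theta, u):
    """Inverse-transform sample: solve F(t) = u for t."""
    return math.log(1.0 - math.log(1.0 - u) / theta)

def bg_mle(ts):
    """MLE of theta from the log-likelihood: theta_hat = n / sum(exp(t) - 1)."""
    return len(ts) / sum(math.exp(t) - 1.0 for t in ts)

def bg_reliability(t, theta):
    return math.exp(-theta * (math.exp(t) - 1.0))

# Monte Carlo assessment of the ML reliability estimate at one point t0
random.seed(5)
theta_true, n, reps, t0 = 1.5, 50, 500, 0.5
sq_errs = []
for _ in range(reps):
    data = [bg_draw(theta_true, random.random()) for _ in range(n)]
    r_hat = bg_reliability(t0, bg_mle(data))
    sq_errs.append((r_hat - bg_reliability(t0, theta_true)) ** 2)
mse = sum(sq_errs) / reps
```

Averaging such squared errors over a grid of t values gives the integrated mean squared error used to rank the estimators.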
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The properties of penalized least squares are high prediction accuracy and the ability to perform estimation and variable selection at once. Penalized least squares yields a sparse model, that is, a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations; to deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
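The sparsity property can be illustrated with a minimal coordinate-descent sketch of ℓ1-penalized least squares (the lasso); the robust variants discussed above would replace the squared loss with a robust loss such as Huber's, which is not shown here:

```python
import random

def soft_threshold(z, lam):
    """The shrinkage operator induced by the l1 penalty."""
    return max(z - lam, 0.0) if z >= 0 else min(z + lam, 0.0)

def lasso(X, y, lam, sweeps=100):
    """Coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # correlation of feature j with the partial residual excluding j
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                      for k in range(p) if k != j)) for i in range(n))
            beta[j] = soft_threshold(rho, lam) / sum(X[i][j] ** 2 for i in range(n))
    return beta

random.seed(11)
n, beta_true = 200, [3.0, 0.0, 0.0, -2.0, 0.0]
X = [[random.gauss(0, 1) for _ in beta_true] for _ in range(n)]
y = [sum(xj * bj for xj, bj in zip(row, beta_true)) + random.gauss(0, 0.1)
     for row in X]
beta_hat = lasso(X, y, lam=8.0)
# irrelevant coefficients are shrunk to (near) zero: a sparse model
```

Because the soft-thresholding step sets small correlations exactly to zero, the fitted model keeps only the few truly active variables, which is what makes it easy to interpret.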
This paper shows how to estimate the three parameters of the generalized exponential Rayleigh distribution using three estimation methods, namely the moment estimation method (MEM), the ordinary least squares estimation method (OLSEM), and the maximum entropy estimation method (MEEM). A simulation technique is used with all of these estimation methods to find the parameters of the generalized exponential Rayleigh distribution. In order to find the best method, the mean squared error criterion is used. Finally, the experimental results were produced with the object-oriented programming language Visual Basic .NET.
In recent years, researchers' attention to semi-parametric regression models has increased, because the parametric and non-parametric regression models can be integrated into a single model. Such a model has the potential to deal with the curse of dimensionality in non-parametric models, which occurs as the number of explanatory variables involved in the analysis increases and the accuracy of the estimation consequently decreases. This type of model also has the advantage of flexibility in application compared with parametric models, which must comply with certain conditions such as knowledge of the distribution of the errors, or the parametric models may
This paper is devoted to comparing the performance of the non-Bayesian estimator, represented by the maximum likelihood estimator of the scale parameter and the reliability function of the inverse Rayleigh distribution, with Bayesian estimators obtained under two types of loss function, specifically the linear exponential (LINEX) loss function and the entropy loss function, taking into consideration informative and non-informative priors. The performance of these estimators is assessed on the basis of the mean squared error (MSE) criterion. Monte Carlo simulation experiments are conducted in order to obtain the required results.
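The non-Bayesian side of the comparison can be sketched as follows (the Bayesian estimators under the LINEX and entropy loss functions are omitted), assuming the common inverse Rayleigh form F(t) = exp(−θ/t²) with scale parameter θ:

```python
import math
import random

# Inverse Rayleigh with scale theta: F(t) = exp(-theta / t^2),
# reliability R(t) = 1 - F(t).

def invray_draw(theta, u):
    """Inverse-transform sample, u in (0, 1): t = sqrt(-theta / log(u))."""
    return math.sqrt(-theta / math.log(u))

def invray_mle(ts):
    """MLE of theta from the log-likelihood: theta_hat = n / sum(1/t^2)."""
    return len(ts) / sum(1.0 / t ** 2 for t in ts)

def invray_reliability(t, theta):
    return 1.0 - math.exp(-theta / t ** 2)

# Monte Carlo MSE of the ML estimator of the scale parameter
random.seed(9)
theta_true, n, reps = 2.0, 40, 1000
errs = [(invray_mle([invray_draw(theta_true, random.random())
                     for _ in range(n)]) - theta_true) ** 2 for _ in range(reps)]
mse = sum(errs) / reps
```

The same replication loop, with the ML estimate replaced by each Bayes estimate, yields the MSE table on which the comparison is based.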
Simulation is the oldest theory in art, having appeared in the Greek aesthetic thought of the philosopher Plato, and we find it in many thinkers and philosophers over a wide span of time down to our world today. Our fascination with art in general, and the art of design in particular, is due to the creativity and innovations of the artist through simulation, as well as the peculiarities of this simulation, which give objects signs and signals that may have a resonance that sometimes does not exist in their physical reality. The real representation of life and design construction, and the descriptions of the expression of each of them in the form of intellectual construction and the ideas of producti