The analysis of survival and reliability is currently among the central topics and methods of vital statistics because of its importance in demography, medicine, industry, and engineering. This research generates random samples from the generalized gamma (GG) distribution using the inverse transformation method (ITM). Because the GG cumulative distribution function involves the incomplete gamma integral, classical inversion is intractable, so a numerical approximation method is needed before the survival function can be estimated. The survival function was then estimated by Monte Carlo simulation. The entropy method was used for estimation and fitting alongside the classical method, and the best estimation method was identified using two comparison criteria: the root mean square error (RMSE) and the mean absolute percentage error (MAPE). Sample sizes of n = 18, 30, 50, and 81 were selected, where n = 18 represents the five-year age groups of the phenomenon under study and n = 81 represents the single-year age groups; each experiment was replicated 500 times.
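The abstract does not state which GG parameterization is used, so the sketch below assumes Stacy's three-parameter form with scale a, shape d, and power p, whose CDF is F(x) = P(d/p, (x/a)^p), where P is the regularized lower incomplete gamma function. Since this CDF has no closed-form inverse, each ITM draw u ~ U(0,1) is inverted numerically by bisection, and the survival function is then estimated by Monte Carlo as the abstract describes. Function names and tolerances are illustrative, not the authors' implementation.

```python
import math
import random

def reg_lower_gamma(k, y, max_terms=300):
    """Regularized lower incomplete gamma P(k, y) via its series expansion
    (adequate for the moderate arguments reached by the search below)."""
    if y <= 0.0:
        return 0.0
    total = term = 1.0 / k
    for n in range(1, max_terms):
        term *= y / (k + n)
        total += term
        if term < 1e-15 * total:
            break
    return math.exp(-y + k * math.log(y) - math.lgamma(k)) * total

def gg_inverse_cdf(u, a, d, p):
    """Invert F(x) = P(d/p, (x/a)**p) = u numerically (no closed form)."""
    k = d / p
    hi = 1.0
    while reg_lower_gamma(k, hi) < u:   # bracket the root first
        hi *= 2.0
    lo = 0.0
    for _ in range(100):                # bisection on y = (x/a)**p
        mid = 0.5 * (lo + hi)
        if reg_lower_gamma(k, mid) < u:
            lo = mid
        else:
            hi = mid
    return a * (0.5 * (lo + hi)) ** (1.0 / p)

def gg_sample(n, a, d, p, seed=1):
    """ITM: pass uniform draws through the numerical inverse CDF."""
    rng = random.Random(seed)
    return [gg_inverse_cdf(rng.random(), a, d, p) for _ in range(n)]

def survival_mc(t, a, d, p, n=500, seed=1):
    """Monte Carlo estimate of the survival function S(t) = 1 - F(t)."""
    xs = gg_sample(n, a, d, p, seed)
    return sum(1 for x in xs if x > t) / n
```

With p = 1 the GG reduces to an ordinary gamma distribution, which provides a convenient check of the sampler.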
The simulation results showed that the maximum likelihood method is best for small and medium samples; it was applied to the five-year age-group data, which suffer from disturbances and distortion, of the Iraq Household Socio-Economic Survey (IHSES II 2012). The entropy method outperformed for large samples and was applied to single-year age groups obtained mathematically from the five-year data through Sprague's interpolation formula. The associated coefficients, known as the Sprague multipliers, are used to derive single-year numbers of deaths and population within a five-year age group from the counts of that group and its neighbouring five-year groups; the computation was carried out in Excel. Single-year age-group data were used for accuracy, so that no individual age at elevated risk of mortality is concealed within a broader group.
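The actual Sprague multipliers form a published table of coefficients that is not reproduced in the abstract, so the sketch below illustrates only the underlying idea — deriving single-year counts from a five-year group and its neighbours — using cubic Lagrange interpolation of the cumulative counts as a simpler stand-in, not the true Sprague coefficients. Group totals are preserved by construction; it assumes at least three five-year groups.

```python
def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                w *= (x - xj) / (xi - xj)
        total += yi * w
    return total

def split_five_year(counts):
    """Split five-year group counts into single-year counts by
    interpolating the cumulative counts at integer ages
    (requires at least three five-year groups)."""
    bounds = [5 * i for i in range(len(counts) + 1)]   # ages 0, 5, 10, ...
    cum = [0.0]
    for c in counts:
        cum.append(cum[-1] + c)
    single = []
    for g in range(len(counts)):
        # use the four group boundaries nearest this group (neighbour info)
        lo = max(0, min(g - 1, len(bounds) - 4))
        xs, ys = bounds[lo:lo + 4], cum[lo:lo + 4]
        vals = [lagrange(xs, ys, age) for age in range(5 * g, 5 * g + 6)]
        # pin the group endpoints so each group's total is preserved exactly
        vals[0], vals[-1] = cum[g], cum[g + 1]
        single.extend(vals[i + 1] - vals[i] for i in range(5))
    return single
```

Each five-year total telescopes back exactly, which is the same constraint the Sprague multipliers satisfy.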
This paper compares eight methods for generating the initial value and the impact of these methods on estimating the parameter of an autoregressive model. Three of the most popular estimation methods among researchers were used: the maximum likelihood method, the Burg method, and the least squares method. A first-order autoregressive model was studied by simulation through the design of a number of simulation experiments with different sample sizes.
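As a hedged illustration of the setting (not the paper's eight initial-value schemes), the sketch below simulates a first-order autoregressive model and estimates its parameter by least squares; a burn-in period shows one common way the influence of the chosen initial value is damped.

```python
import random

def simulate_ar1(n, phi, sigma=1.0, y0=0.0, burn=100, seed=42):
    """Simulate y_t = phi * y_{t-1} + e_t; the burn-in reduces the
    influence of the initial value y0 on the retained series."""
    rng = random.Random(seed)
    y, ys = y0, []
    for t in range(n + burn):
        y = phi * y + rng.gauss(0.0, sigma)
        if t >= burn:
            ys.append(y)
    return ys

def ls_ar1(ys):
    """Least squares estimate: phi_hat = sum y_t y_{t-1} / sum y_{t-1}^2."""
    num = sum(ys[t] * ys[t - 1] for t in range(1, len(ys)))
    den = sum(ys[t - 1] ** 2 for t in range(1, len(ys)))
    return num / den

ys = simulate_ar1(500, phi=0.6)
phi_hat = ls_ar1(ys)
```

The Burg and maximum likelihood estimators compared in the paper operate on the same simulated series.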
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and to bring logistic regression and its predictive capabilities into environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data therefore requires methods that properly account for the within-subject interdependence of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
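The last point — that ignoring within-subject correlation invalidates inference — can be made concrete with a small simulation. The sketch below (an illustration, not the paper's analysis) generates repeated measurements per subject with a shared subject effect and compares the naive standard error of the overall mean, which treats all observations as independent, with a cluster-based (sandwich-style) standard error that aggregates within subjects first.

```python
import math
import random

def make_longitudinal(n_subjects=60, n_obs=5, subj_sd=1.0, noise_sd=1.0, seed=3):
    """Repeated measurements per subject: a shared subject-level effect
    induces within-subject correlation."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_subjects):
        b = rng.gauss(0.0, subj_sd)          # subject random effect
        data.append([b + rng.gauss(0.0, noise_sd) for _ in range(n_obs)])
    return data

def naive_and_cluster_se(data):
    """Naive SE assumes independence; the cluster SE sums centered
    residuals within each subject before squaring."""
    flat = [y for subj in data for y in subj]
    n = len(flat)
    mean = sum(flat) / n
    naive_var = sum((y - mean) ** 2 for y in flat) / (n - 1) / n
    m = len(data)
    cluster_var = sum(sum(y - mean for y in subj) ** 2
                      for subj in data) / n ** 2 * m / (m - 1)
    return math.sqrt(naive_var), math.sqrt(cluster_var)

naive_se, cluster_se = naive_and_cluster_se(make_longitudinal())
```

With positive within-subject correlation the cluster SE exceeds the naive SE, so naive confidence intervals would be too narrow — the invalidity the abstract warns about.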
The paper shows how to estimate the three parameters of the generalized exponential Rayleigh distribution using three estimation methods, namely the moment employing estimation method (MEM), the ordinary least squares estimation method (OLSEM), and the maximum entropy estimation method (MEEM). Simulation is used with all of these estimation methods to find the parameters of the generalized exponential Rayleigh distribution, and the mean squared error criterion is used to identify the best method. Finally, the experimental results were obtained using Visual Basic .NET, an object-oriented programming language.
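The generalized exponential Rayleigh density is not reproduced in the abstract, so the sketch below demonstrates the OLSEM idea on the ordinary one-parameter Rayleigh distribution as a stand-in: sort the sample, then choose the parameter that minimizes the squared distance between the model CDF at the order statistics and the plotting positions i/(n+1). A golden-section search does the one-dimensional minimization; all names here are illustrative.

```python
import math
import random

def rayleigh_cdf(x, sigma):
    return 1.0 - math.exp(-x * x / (2.0 * sigma * sigma))

def olsem_objective(sorted_sample, sigma):
    """Sum of squared gaps between model CDF and plotting positions."""
    n = len(sorted_sample)
    return sum((rayleigh_cdf(x, sigma) - i / (n + 1.0)) ** 2
               for i, x in enumerate(sorted_sample, start=1))

def golden_min(f, lo, hi, iters=80):
    """Golden-section search for the minimum of a unimodal f on [lo, hi]."""
    g = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    for _ in range(iters):
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return 0.5 * (a + b)

rng = random.Random(5)
true_sigma = 2.0
# Rayleigh draws by inverse transform: x = sigma * sqrt(-2 ln(1 - u))
xs = sorted(true_sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
            for _ in range(300))
sigma_hat = golden_min(lambda s: olsem_objective(xs, s), 0.1, 10.0)
```

For the three-parameter GER distribution the same objective is minimized jointly over all parameters.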
In recent years, researchers have paid increasing attention to semi-parametric regression models, because the parametric and non-parametric regression models can be integrated into a single model. The resulting regression model can cope with the curse of dimensionality that affects non-parametric models, which arises as the number of explanatory variables involved in the analysis increases and the accuracy of estimation consequently decreases. This type of model is also more flexible in applied work than parametric models, which must satisfy certain conditions such as knowledge of the distribution of the errors.
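One standard way to integrate the two components is the partially linear model y = βx + g(z) + ε, estimated with Robinson's double-residual approach: smooth both y and x on z non-parametrically, then regress the residuals on each other to recover the parametric coefficient β. The sketch below (an illustration under assumed simulated data, not the paper's model) uses a Gaussian-kernel Nadaraya-Watson smoother.

```python
import math
import random

def nw_smooth(zs, vs, z0, h):
    """Nadaraya-Watson estimate of E[v | z = z0] with a Gaussian kernel."""
    ws = [math.exp(-0.5 * ((z - z0) / h) ** 2) for z in zs]
    return sum(w * v for w, v in zip(ws, vs)) / sum(ws)

def robinson_beta(xs, zs, ys, h=0.15):
    """Double-residual estimator for y = beta * x + g(z) + e:
    regress (y - E[y|z]) on (x - E[x|z])."""
    ex = [x - nw_smooth(zs, xs, z, h) for x, z in zip(xs, zs)]
    ey = [y - nw_smooth(zs, ys, z, h) for y, z in zip(ys, zs)]
    return sum(a * b for a, b in zip(ex, ey)) / sum(a * a for a in ex)

rng = random.Random(9)
n = 400
zs = [rng.random() for _ in range(n)]
xs = [rng.gauss(0.0, 1.0) + z for z in zs]        # x correlated with z
ys = [1.5 * xs[i] + math.sin(2 * math.pi * zs[i]) + rng.gauss(0.0, 0.2)
      for i in range(n)]
beta_hat = robinson_beta(xs, zs, ys)
```

The parametric part βx keeps interpretation simple, while the non-parametric g(z) absorbs the nonlinear effect without specifying its form.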
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
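The downhill simplex (Nelder-Mead) algorithm needs only function values, which is why it copes well when the likelihood surface is awkward. The sketch below is a minimal pure-Python version demonstrated on a simple quadratic with a known minimum; in the paper it would be applied to the negative log-likelihood of the four-parameter compound distribution, which the abstract does not reproduce.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal downhill simplex (Nelder-Mead) minimizer."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # initial simplex around x0
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, second, worst = simplex[0], simplex[-2], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * centroid[i] - worst[i] for i in range(n)]    # reflection
        if f(best) <= f(refl) < f(second):
            simplex[-1] = refl
        elif f(refl) < f(best):                                  # expansion
            expd = [3 * centroid[i] - 2 * worst[i] for i in range(n)]
            simplex[-1] = expd if f(expd) < f(refl) else refl
        else:                                                    # contraction
            contr = [0.5 * (centroid[i] + worst[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                                # shrink
                simplex = [best] + [[0.5 * (best[i] + p[i]) for i in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# demo on a quadratic with known minimum at (2, -1)
xmin = nelder_mead(lambda v: (v[0] - 2.0) ** 2 + (v[1] + 1.0) ** 2, [0.0, 0.0])
```

Because no derivatives are required, the same routine applies unchanged to a likelihood contaminated by outliers.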
The penalized least squares method is a popular approach to high-dimensional data, where the number of explanatory variables is larger than the sample size. Its attractive properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
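Two small pieces make the abstract's points concrete (a sketch, not the paper's estimator): the soft-thresholding operator, which is the closed-form lasso solution for a single standardized coefficient and shows exactly how the L1 penalty zeroes small coefficients to produce a sparse model, and the Huber loss, a typical robust replacement for the squared loss that stops outlying observations from dominating the fit.

```python
def soft_threshold(z, lam):
    """Lasso solution for one standardized coefficient: shrink the least
    squares estimate z toward zero, zeroing it when |z| <= lam."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def huber_loss(r, delta=1.345):
    """Huber loss: quadratic for small residuals, linear for large ones,
    so outliers contribute linearly rather than quadratically."""
    a = abs(r)
    return 0.5 * r * r if a <= delta else delta * (a - 0.5 * delta)
```

Replacing the squared loss with the Huber loss inside the penalized criterion gives a robust penalized estimator of the kind described above.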
The process of evaluating data (the age and gender structure) is one of the important factors that help any country draw up plans and programs for the future. This work discusses the errors in the population data of the 1997 Iraqi population census, aiming to correct and revise them to serve planning purposes. The population data are smoothed using a nonparametric regression estimator (the Nadaraya-Watson estimator). This estimator depends on a bandwidth (h), which can be calculated in two ways using the Bayesian method: the first when the observation distribution is a lognormal kernel, and the second when it is a normal kernel.
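The Nadaraya-Watson estimator is a locally weighted average: each fitted value is a kernel-weighted mean of the observations, with the bandwidth h controlling how much smoothing is applied. The sketch below uses a Gaussian kernel and illustrative age-group counts (not the census data); the Bayesian bandwidth selection described in the abstract would supply h.

```python
import math

def nadaraya_watson(xs, ys, x0, h):
    """Nadaraya-Watson kernel regression estimate of E[y | x = x0]
    using a Gaussian kernel with bandwidth h."""
    ws = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

# smoothing noisy age-group counts (illustrative values only)
ages = list(range(0, 81, 5))
counts = [500, 480, 510, 470, 450, 430, 445, 400, 380,
          360, 340, 300, 280, 230, 180, 130, 80]
smoothed = [nadaraya_watson(ages, counts, a, h=5.0) for a in ages]
```

Because each fitted value is a convex combination of the observed counts, the smoothed series always stays within the range of the data, which is a useful sanity check when grading census counts.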
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (θ) of a normal distribution when a prior estimate (θ0) of the actual value (θ) is available, using specified shrinkage weight factors ψ(θ) as well as a pre-test region (R). Expressions for the bias B(θ), mean squared error MSE(θ), efficiency EFF(θ), and expected sample size E(n/θ) of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the classical estimators, in terms of bias and relative efficiency, are given. Furthermore, comparisons with earlier existing work are drawn.
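The core idea can be sketched in a few lines (illustrative, not the paper's pre-test procedure): a shrunken estimator pulls the sample mean toward the prior estimate θ0 with weight ψ, and when the prior is close to the truth this trades a little bias for a large variance reduction, lowering the mean squared error relative to the classical estimator.

```python
import random

def shrunken_mean(sample, theta0, psi):
    """Single-stage shrunken estimator: psi * xbar + (1 - psi) * theta0."""
    xbar = sum(sample) / len(sample)
    return psi * xbar + (1.0 - psi) * theta0

def mse(estimates, theta):
    """Empirical mean squared error of a list of estimates."""
    return sum((e - theta) ** 2 for e in estimates) / len(estimates)

rng = random.Random(11)
theta, sigma, n, reps = 5.0, 2.0, 10, 2000
classical, shrunken = [], []
for _ in range(reps):
    s = [rng.gauss(theta, sigma) for _ in range(n)]
    classical.append(sum(s) / n)
    shrunken.append(shrunken_mean(s, theta0=5.0, psi=0.5))
```

Here the prior estimate equals the true mean, so the shrunken estimator is unbiased with variance ψ² times the classical variance; the pre-test region R in the paper guards against the opposite case, where θ0 is far from θ.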
Simulation is the oldest theory in art, having appeared in the Greek aesthetic thought of the philosopher Plato; we find it in many thinkers and philosophers across a wide span of time, reaching our world today. Our fascination with art in general, and design art in particular, is due to the creativity and innovations of the artist through simulation, as well as the peculiarities of this simulation, which give objects signs and signals that may have an echo that sometimes does not exist in their physical reality.
The real representation of life and design construction, and descriptions of the expression of each of them in the form of intellectual construction and the ideas of production
This paper shows how to estimate the parameters of the generalized exponential Rayleigh (GER) distribution by three estimation methods. The first is the maximum likelihood estimation method, the second is the moment employing estimation method (MEM), and the third is the ranked set sampling estimation method (RSSEM). Simulation is used with all of these estimation methods to find the parameters of the generalized exponential Rayleigh distribution. Finally, the mean squared error criterion is used to compare the estimation methods and determine which is best.
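Since the GER density is not reproduced in the abstract, the sketch below illustrates only the ranked set sampling scheme underlying RSSEM, using a normal draw as a stand-in population: in each cycle, draw set_size small sets, rank each set, and retain the i-th order statistic from the i-th set. The resulting sample estimates the mean with smaller variance than simple random sampling of the same size.

```python
import random

def rss_sample(pop_draw, set_size, cycles, rng):
    """Ranked set sampling: per cycle, draw set_size sets of set_size
    units, rank each set, keep the i-th order statistic of the i-th set."""
    out = []
    for _ in range(cycles):
        for i in range(set_size):
            s = sorted(pop_draw(rng) for _ in range(set_size))
            out.append(s[i])
    return out

rng = random.Random(2)
draw = lambda r: r.gauss(10.0, 3.0)          # stand-in population
rss = rss_sample(draw, set_size=3, cycles=100, rng=rng)
rss_mean = sum(rss) / len(rss)
```

In RSSEM the estimation criterion is then evaluated on the ranked set sample instead of a simple random sample.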