In this paper, we suggest a new method to estimate missing data in the bivariate normal distribution and compare it with single-imputation methods (unconditional mean and conditional mean) using simulation.
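As a point of reference for the comparison above, here is a minimal sketch (illustrative NumPy code, not the paper's implementation) of the two single-imputation baselines on simulated bivariate normal data: unconditional mean imputation fills a missing y with the observed mean of y, while conditional mean imputation fills it with the regression prediction E[y | x] fitted on the complete cases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate bivariate normal data with correlation rho (illustrative values).
n, rho = 500, 0.8
x, y = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

# Make ~20% of y missing completely at random (MCAR).
miss = rng.random(n) < 0.2
obs = ~miss
y_obs = np.where(miss, np.nan, y)

# Unconditional mean imputation: fill with the observed mean of y.
y_uncond = np.where(miss, np.nanmean(y_obs), y_obs)

# Conditional mean imputation: fill with the OLS prediction E[y | x]
# fitted on the complete cases only.
beta1 = np.cov(x[obs], y[obs])[0, 1] / np.var(x[obs], ddof=1)
beta0 = y[obs].mean() - beta1 * x[obs].mean()
y_cond = np.where(miss, beta0 + beta1 * x, y_obs)

# Compare imputation error on the truly missing entries.
for name, yi in [("unconditional", y_uncond), ("conditional", y_cond)]:
    rmse = np.sqrt(np.mean((yi[miss] - y[miss]) ** 2))
    print(f"{name} mean imputation RMSE: {rmse:.3f}")
```

With correlated variables, the conditional mean typically yields a smaller imputation error, which is the contrast the simulation study above is designed to quantify.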
The parameters of the two-parameter gamma distribution have been estimated in the case of missing data using two important methods: the maximum likelihood method and the shrinkage method. The former involves three methods for solving the non-linear maximum likelihood equation by which the maximum likelihood estimators are obtained: the Newton-Raphson, Thom, and Sinha methods. The Thom and Sinha methods were developed by the researcher to suit the case of missing data. Furthermore, the Bowman, Shenton, and Lam method, which depends on the three-parameter gamma distribution to obtain the maximum likelihood estimators, has been developed. A comparison was made between the methods on the experimental side to find the best method.
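For the maximum likelihood part, a hedged sketch of the Newton-Raphson solution for the complete-data case follows (the paper's modified Thom and Sinha procedures for missing data are not reproduced here). The shape a solves ln(a) - digamma(a) = ln(mean(x)) - mean(ln x), and Thom's closed-form approximation serves as the starting value.

```python
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle_newton(x, tol=1e-10, max_iter=100):
    """Newton-Raphson for the two-parameter gamma MLE (complete data).

    Solves ln(a) - digamma(a) = ln(mean(x)) - mean(ln(x)) for the
    shape a; the scale then follows as mean(x) / a.
    """
    x = np.asarray(x, dtype=float)
    s = np.log(x.mean()) - np.log(x).mean()
    # Thom's approximation as the starting value for the iteration.
    a = (3 - s + np.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)
    for _ in range(max_iter):
        f = np.log(a) - digamma(a) - s
        fprime = 1.0 / a - polygamma(1, a)  # trigamma(a)
        step = f / fprime
        a -= step
        if abs(step) < tol:
            break
    return a, x.mean() / a  # (shape, scale)

# Example: recover the parameters from simulated gamma data.
rng = np.random.default_rng(1)
sample = rng.gamma(shape=2.5, scale=1.5, size=2000)
print(gamma_mle_newton(sample))
```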
This research compares several order-selection criteria (FPE, AIC, SBC, H-Q) for the first-order autoregressive model when the white noise follows the normal distribution and some non-Gaussian distributions (log-normal, exponential, and Poisson), using simulation.
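A minimal sketch of how the four criteria could be computed for candidate AR orders; the forms of FPE, AIC, SBC, and H-Q below are common textbook versions and may differ from the exact ones used in the research.

```python
import numpy as np

def ar_order_criteria(x, max_p=5):
    """Compute FPE, AIC, SBC and H-Q for AR(p) fits, p = 1..max_p.

    Uses one common textbook form of each criterion, based on the
    residual variance of an OLS fit of x_t on its p lags.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    rows = {}
    for p in range(1, max_p + 1):
        Y = x[p:]
        # Design matrix of lagged values (no intercept, for simplicity).
        X = np.column_stack([x[p - k: n - k] for k in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        m = len(Y)
        sigma2 = resid @ resid / m
        rows[p] = {
            "FPE": sigma2 * (m + p) / (m - p),
            "AIC": m * np.log(sigma2) + 2 * p,
            "SBC": m * np.log(sigma2) + p * np.log(m),
            "H-Q": m * np.log(sigma2) + 2 * p * np.log(np.log(m)),
        }
    return rows

# Example: the criteria should favour p = 1 for a simulated AR(1) series.
rng = np.random.default_rng(2)
e = rng.normal(size=1000)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.6 * x[t - 1] + e[t]
print(ar_order_criteria(x))
```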
In this paper, we study a non-parametric model whose response variable has missing data (non-response) in its observations under the MCAR missing-data mechanism. We suggest kernel-based non-parametric single imputation for the missing values and compare it with nearest-neighbour imputation by simulation, over several different models and different cases of sample size, variance, and rate of missing data.
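A brief sketch of the two imputation rules being compared, under illustrative choices (Gaussian kernel, fixed bandwidth h, a single covariate) that are assumptions rather than the paper's settings.

```python
import numpy as np

def kernel_impute(x, y, miss, h=0.3):
    """Kernel-based single imputation (Nadaraya-Watson form): a missing
    y_i is replaced by a kernel-weighted average of observed responses
    whose covariates are near x_i."""
    obs = ~miss
    y_imp = y.copy()
    for i in np.where(miss)[0]:
        w = np.exp(-0.5 * ((x[obs] - x[i]) / h) ** 2)  # Gaussian kernel
        y_imp[i] = np.sum(w * y[obs]) / np.sum(w)
    return y_imp

def nn_impute(x, y, miss):
    """Nearest-neighbour imputation: copy the response of the observed
    case whose covariate is closest to x_i."""
    obs_idx = np.where(~miss)[0]
    y_imp = y.copy()
    for i in np.where(miss)[0]:
        j = obs_idx[np.argmin(np.abs(x[obs_idx] - x[i]))]
        y_imp[i] = y[j]
    return y_imp

# Example: non-parametric model y = sin(2x) + noise with 15% MCAR responses.
rng = np.random.default_rng(3)
x = rng.uniform(0, 3, 300)
y_true = np.sin(2 * x) + rng.normal(scale=0.2, size=300)
miss = rng.random(300) < 0.15
y = y_true.copy()
y[miss] = np.nan
for name, fn in [("kernel", kernel_impute), ("1-NN", nn_impute)]:
    y_hat = fn(x, y, miss)
    rmse = np.sqrt(np.mean((y_hat[miss] - y_true[miss]) ** 2))
    print(f"{name} imputation RMSE: {rmse:.3f}")
```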
In this research, we derive the distribution of demand during the lead time for an inventory control system in which demand follows the gamma distribution while the lead time follows the log-normal distribution. We also derive the basic moments of this variable, which are in turn necessary for deriving some indicators of the aforementioned system.
Keywords: contour integration, the complex plane, Hankel integral, reorder level, protection.
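The paper derives the lead-time demand distribution analytically (via contour integration); the sketch below is only a Monte Carlo cross-check of the first moment, under the illustrative assumption that demand accumulated over a lead time L is Gamma(alpha * L, theta). All parameter values are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative parameters (assumptions, not the paper's values):
alpha, theta = 2.0, 1.5   # gamma demand: shape per unit time, scale
mu, sigma = 0.2, 0.4      # log-normal lead-time parameters

# Monte Carlo: draw a lead time L, then the demand accumulated over it,
# assuming demand over a lead time L is Gamma(alpha * L, theta).
n = 200_000
L = rng.lognormal(mean=mu, sigma=sigma, size=n)
D = rng.gamma(shape=alpha * L, scale=theta)

# First moment of lead-time demand: by conditioning on L,
# E[D] = alpha * theta * E[L], with E[L] = exp(mu + sigma^2 / 2).
print("E[D] simulated:", D.mean(),
      " analytic:", alpha * theta * np.exp(mu + sigma**2 / 2))
print("Var[D] simulated:", D.var())
```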
The development of computers, on both the hardware and software sides, has made complicated robust estimators computable and has given us a new way of dealing with data when classical discriminant methods fail to achieve their optimal properties, especially when the data contain a percentage of outliers and therefore cannot attain the minimum probability of misclassification. The research aims to compare robust estimators that resist the influence of outliers, such as the robust H estimator, the robust S estimator, and the robust MCD estimator, and to robustify the misclassification probability, showing the influence of outliers on the percentage of misclassification when classical methods are used.
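As one concrete instance, here is a sketch of a robust discriminant rule built on the MCD estimator, using scikit-learn's MinCovDet (the H and S estimators are not shown): each class's location and scatter are estimated robustly, and a point is assigned to the class with the smaller Mahalanobis distance.

```python
import numpy as np
from sklearn.covariance import MinCovDet

def mcd_discriminant(X_train, y_train, X_test):
    """Robust discriminant sketch: estimate each class's location and
    scatter with the Minimum Covariance Determinant (MCD) estimator and
    assign a point to the class with the smaller Mahalanobis distance."""
    fits = {c: MinCovDet(random_state=0).fit(X_train[y_train == c])
            for c in np.unique(y_train)}
    classes = sorted(fits)
    # .mahalanobis() on a fitted MinCovDet returns squared distances.
    d = np.column_stack([fits[c].mahalanobis(X_test) for c in classes])
    return np.array(classes)[np.argmin(d, axis=1)]

# Example: two bivariate normal classes; 10% of class 0 is contaminated.
rng = np.random.default_rng(5)
X0 = rng.normal([0, 0], 1, (100, 2))
X1 = rng.normal([3, 3], 1, (100, 2))
X0[:10] += 15  # outliers in the training data
X = np.vstack([X0, X1])
y = np.repeat([0, 1], 100)
X_test = np.vstack([rng.normal([0, 0], 1, (200, 2)),
                    rng.normal([3, 3], 1, (200, 2))])
y_test = np.repeat([0, 1], 200)
print("misclassification rate:",
      np.mean(mcd_discriminant(X, y, X_test) != y_test))
```

Because the MCD fit downweights the contaminated points, the misclassification rate stays close to the clean-data optimum, which is the kind of robustness the comparison above is meant to demonstrate.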
In this research, the weights used in general least squares estimation of the simple linear regression parameters are estimated when the dependent variable consists of a two-class attribute variable (to handle the heteroscedasticity problem), depending on a sequential Bayesian approach instead of the classical approach used before. The Bayesian approach provides a mechanism for tackling observations one by one in a sequential way, i.e. each new observation adds a new piece of information for estimating the success probability of the Bernoulli trials that govern the dependent variable in the simple linear regression equation, in addition to the information deduced from past experience.
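A toy sketch of the idea, with a conjugate Beta-Bernoulli update standing in for the paper's derivation (an assumption on my part): the posterior for the Bernoulli probability is updated one observation at a time, and the resulting variance estimates p(1 - p) supply weights for a weighted least squares fit.

```python
import numpy as np

def sequential_beta_update(y, a=1.0, b=1.0):
    """Sequential Bayesian updating for a Bernoulli probability: start
    from a Beta(a, b) prior and fold in observations one at a time, so
    each new trial adds its piece of information to the posterior."""
    p_hat = []
    for yi in y:
        a, b = a + yi, b + (1 - yi)   # conjugate Beta-Bernoulli step
        p_hat.append(a / (a + b))     # posterior mean after this trial
    return np.array(p_hat)

# Weighted least squares sketch: Var(y_i) = p_i(1 - p_i) for a Bernoulli
# response, so 1 / (p_i(1 - p_i)) are natural GLS weights (illustrative).
rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 200)
p = 0.2 + 0.6 * x                     # true success probability
y = rng.binomial(1, p)
p_hat = sequential_beta_update(y)
w = 1.0 / np.clip(p_hat * (1 - p_hat), 1e-6, None)
X = np.column_stack([np.ones_like(x), x])
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted LS estimate
print("GLS estimates (intercept, slope):", beta)
```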
The repeated measurement design is called a complete randomized block design for repeated measurements when each subject receives all the different treatments; in this case the subject is considered a block. Several nonparametric methods are considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner and Robinson test (1988), when the assumption that the data follow the normal distribution is not satisfied, as well as the F test when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods, as well as to present the suggested method.
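For concreteness, a short sketch of the Friedman test on a simulated repeated-measures layout (illustrative data; the Koch and Kepner-Robinson tests have no standard SciPy implementation and are omitted):

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Simulated repeated-measures layout: 12 subjects (blocks), each receiving
# all 3 treatments; treatment 3 has a shifted mean (illustrative values).
rng = np.random.default_rng(7)
subject_effect = rng.normal(size=(12, 1))   # induces within-block correlation
data = subject_effect + rng.normal(size=(12, 3))
data[:, 2] += 1.0                           # treatment effect

# Friedman's rank test: each argument is one treatment across all blocks.
stat, p_value = friedmanchisquare(data[:, 0], data[:, 1], data[:, 2])
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")
```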
Reliability plays an important role in both industrial and engineering applications, hence the need for reliability tests: series of tests that discover the factors appearing during testing, determine the fitness limits of a specific product, and assess the goodness of production.
Therefore, this research tests Type II censored data from the one-parameter exponential distribution. Its reliability-growth test includes three curves: the idealized growth curve, whose parameters and reliability are estimated by the maximum likelihood method; the Duane growth curve, whose parameters and reliability are estimated by the least squares method; and the exponential reliability growth curve.
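A sketch of two of the estimation steps named above, using standard textbook forms: the maximum likelihood estimator of the exponential mean under Type II censoring, theta_hat = T / r where T is the total time on test, and a least squares fit of the Duane growth model N(t) = a * t^b on the log-log scale.

```python
import numpy as np

def exp_type2_mle(x_ordered, n):
    """MLE of the exponential mean under Type II censoring: testing stops
    at the r-th ordered failure and the remaining n - r units are censored.
    Total time on test: T = sum of the r failures + (n - r) * x_(r)."""
    r = len(x_ordered)
    T = np.sum(x_ordered) + (n - r) * x_ordered[-1]
    return T / r

def duane_fit(failure_times):
    """Duane growth curve by least squares: under the Duane postulate the
    cumulative failure count follows N(t) = a * t^b, so log N(t) is linear
    in log t and (log a, b) come from a straight-line fit."""
    t = np.asarray(failure_times, dtype=float)
    N = np.arange(1, len(t) + 1)
    b, log_a = np.polyfit(np.log(t), np.log(N), 1)
    return np.exp(log_a), b

# Example: n = 30 units on test, stop at the r = 12th ordered failure.
rng = np.random.default_rng(8)
theta_true = 100.0
failures = np.sort(rng.exponential(theta_true, size=30))[:12]
theta_hat = exp_type2_mle(failures, n=30)
print("theta_hat:", theta_hat, " R(50):", np.exp(-50 / theta_hat))
print("Duane (a, b):", duane_fit(np.cumsum(rng.exponential(theta_true, 20))))
```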
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability and engineering and in analysing the survival function; therefore, researchers have carried out extended studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the first and second Bayes methods, the least squares method, and the jackknife method, which depends primarily on the maximum likelihood method and then on the first Bayes method; the estimators are then compared using simulation. To accomplish this task, samples of different sizes have been adopted by the researcher.
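A sketch of the maximum likelihood route only (the Bayes, least squares, and jackknife estimators are omitted), under the assumption of left truncation at a known point tau, so that lam_hat = 1 / (mean(x) - tau) and S(t) = exp(-lam_hat * (t - tau)).

```python
import numpy as np

def truncated_exp_mle_survival(x, tau):
    """MLE-based survival function for the exponential distribution
    left-truncated at tau: f(x) = lam * exp(-lam * (x - tau)) for x >= tau,
    so lam_hat = 1 / (mean(x) - tau) and S(t) = exp(-lam_hat * (t - tau))."""
    x = np.asarray(x, dtype=float)
    lam_hat = 1.0 / (x.mean() - tau)
    return lambda t: np.exp(-lam_hat * (np.asarray(t) - tau))

# Example: simulate truncated-exponential lifetimes and estimate S(t).
rng = np.random.default_rng(9)
tau, lam = 2.0, 0.5
sample = tau + rng.exponential(1.0 / lam, size=500)
S = truncated_exp_mle_survival(sample, tau)
print("S(4) estimated:", S(4.0), " true:", np.exp(-lam * (4.0 - tau)))
```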
Abstract
The problem of missing data represents a major obstacle for researchers in the process of data analysis in different fields, since this problem recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of such a problem in the data under study may negatively influence the analysis and lead to misleading conclusions, since these conclusions result from the great bias caused by the problem. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy.