In this research, kernel (nonparametric density) estimators were used to estimate the two-response (binary) logistic regression model, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the true curve. The aim of using the kernel estimator is to adjust the observations so that estimators with properties close to those of the true parameters can be obtained. Based on medical data for patients with chronic lymphocytic leukemia, using the Gaussian kernel function and the mean squared error (MSE) as the comparison criterion, the Nadaraya-Watson method was found to be the best because it attained the lowest value of this criterion.
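A minimal sketch of the idea (not the study's code or data): a Nadaraya-Watson estimator with a Gaussian kernel applied to a simulated binary response, with the bandwidth λ chosen by leave-one-out cross-validation. The covariate, sample size, and bandwidth grid are all illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(u):
    # Standard normal density used as the kernel weight.
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    # Kernel-weighted average of the responses at each evaluation point.
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = gaussian_kernel(u)
    return (w @ y_train) / w.sum(axis=1)

def loo_cv_score(x, y, bandwidth):
    # Leave-one-out cross-validation score for a candidate bandwidth.
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        pred = nadaraya_watson(x[mask], y[mask], x[i:i + 1], bandwidth)
        errors.append((y[i] - pred[0]) ** 2)
    return np.mean(errors)

rng = np.random.default_rng(0)
x = rng.uniform(0, 4, 120)                 # hypothetical covariate
p_true = 1 / (1 + np.exp(-(x - 2)))        # true success probability
y = rng.binomial(1, p_true)                # binary (two-response) outcome

grid = np.linspace(0.1, 1.5, 25)           # candidate bandwidths
best_bw = min(grid, key=lambda b: loo_cv_score(x, y, b))
smoothed = nadaraya_watson(x, y, np.sort(x), best_bw)
print("CV-selected bandwidth:", round(best_bw, 3))
print("first smoothed probabilities:", np.round(smoothed[:5], 3))
```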
This research studies the linear regression model when the random errors are autocorrelated and normally distributed. Linear regression analysis describes the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the other variables. Four methods were compared (the least squares method, the unweighted average method, the Theil method, and the Laplace method) using the mean squared error (MSE) criterion and simulation; the study included four sample sizes (15, 30, 60, 100). The results showed that the least squares method is the best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years 2010, 2011, and 2012.
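As an illustrative sketch only (the unweighted average method and the study's data are not reproduced here), the following simulation compares ordinary least squares, the Theil median-of-slopes estimator, and the Laplace (least absolute deviations) fit by the MSE of the estimated slope, using AR(1) autocorrelated errors and the same sample sizes as the study; the true coefficients, autocorrelation level, and replication count are assumptions.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import minimize

rng = np.random.default_rng(1)
TRUE_B0, TRUE_B1, RHO = 1.0, 2.0, 0.6      # assumed true model and AR(1) coefficient

def simulate(n):
    x = np.linspace(1, 10, n)
    e = np.zeros(n)
    eps = rng.normal(0, 1, n)
    for t in range(1, n):                  # AR(1) error process
        e[t] = RHO * e[t - 1] + eps[t]
    return x, TRUE_B0 + TRUE_B1 * x + e

def ols_slope(x, y):
    return np.polyfit(x, y, 1)[0]

def theil_slope(x, y):
    # Median of all pairwise slopes (Theil estimator).
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)]
    return np.median(slopes)

def lad_slope(x, y):
    # Laplace / least-absolute-deviations fit.
    obj = lambda b: np.abs(y - b[0] - b[1] * x).sum()
    return minimize(obj, x0=[0.0, 1.0], method="Nelder-Mead").x[1]

for n in (15, 30, 60, 100):
    results = {"OLS": [], "Theil": [], "LAD": []}
    for _ in range(200):                   # Monte Carlo replications
        x, y = simulate(n)
        results["OLS"].append(ols_slope(x, y))
        results["Theil"].append(theil_slope(x, y))
        results["LAD"].append(lad_slope(x, y))
    mse = {k: np.mean((np.array(v) - TRUE_B1) ** 2) for k, v in results.items()}
    print(n, {k: round(v, 4) for k, v in mse.items()})
```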
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it gives efficient model estimation. The problem arises when the response variable takes one of two values, either 0 (no response) or 1 (response), which leads to the logistic regression model.
We compare two methods, one of which is Bayesian, and the results were compared using the MSE criterion.
A simulation was used to study the empirical behavior of the logistic model with different sample sizes and variances. The results show that the Bayesian method is better than the other method at small sample sizes.
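A hedged sketch of such a comparison (not the paper's exact procedure): data are simulated from a logistic regression model, the coefficients are estimated by maximum likelihood and by a simple Bayesian MAP rule (a normal prior, i.e. an L2 penalty), and the two are compared by the MSE of the estimated coefficients. The true coefficients, prior precision, sample sizes, and replication count are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
TRUE_BETA = np.array([0.5, 1.5, -1.0])          # assumed intercept and two slopes

def simulate(n, sigma_x=1.0):
    X = np.column_stack([np.ones(n), rng.normal(0, sigma_x, (n, 2))])
    p = 1 / (1 + np.exp(-X @ TRUE_BETA))
    return X, rng.binomial(1, p)

def neg_log_lik(beta, X, y, prior_prec=0.0):
    # Negative log-likelihood of the logistic model plus an optional normal-prior penalty.
    eta = np.clip(X @ beta, -30, 30)
    ll = np.sum(y * eta - np.log1p(np.exp(eta)))
    return -ll + 0.5 * prior_prec * beta @ beta

def fit(X, y, prior_prec=0.0):
    res = minimize(neg_log_lik, np.zeros(X.shape[1]),
                   args=(X, y, prior_prec), method="BFGS")
    return res.x

for n in (25, 50, 100):
    mle_err, map_err = [], []
    for _ in range(200):
        X, y = simulate(n)
        mle_err.append(np.sum((fit(X, y) - TRUE_BETA) ** 2))
        map_err.append(np.sum((fit(X, y, prior_prec=1.0) - TRUE_BETA) ** 2))
    print(n, "MLE MSE:", round(np.mean(mle_err), 3),
          "MAP (Bayesian) MSE:", round(np.mean(map_err), 3))
```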
Researchers have shown increasing interest in recent years in determining the optimal sample size so as to obtain sufficient accuracy in estimation, to obtain high-precision parameter estimates, and to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated, at the sample size given by each method, in high-dimensional data using an artificial intelligence technique, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
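A sketch under stated assumptions (the Bennett-inequality and regression sample-size calculations are not reproduced): a small artificial neural network (an MLP classifier) is fitted to simulated high-dimensional binary data at several candidate sample sizes to show how accuracy responds to the sample size. The dimension, signal form, and network size are illustrative choices.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
P = 50                                           # number of covariates (toy high-dimensional setting)

def make_data(n):
    X = rng.normal(size=(n, P))
    # Nonlinear signal in the log-odds; only a few covariates are informative.
    eta = 1.5 * X[:, 0] - np.sin(X[:, 1]) + 0.5 * X[:, 2] ** 2 - 1.0
    y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
    return X, y

for n in (100, 300, 1000):
    X, y = make_data(n)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    ann.fit(X_tr, y_tr)
    print("n =", n, " test accuracy:", round(ann.score(X_te, y_te), 3))
```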
Abstract
The non-homogeneous Poisson process is one of the statistical subjects that is important in other sciences and widely applied in different areas, such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that do not occur at a fixed rate over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It considers two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the
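A hedged sketch of the first of these models (the power-law non-homogeneous Poisson process): its intensity is lambda(t) = alpha*beta*t^(beta-1) with mean function m(t) = alpha*t^beta, and for failure times observed on (0, T] the standard closed-form maximum-likelihood estimates are beta_hat = n / sum(ln(T/t_i)) and alpha_hat = n / T^beta_hat. The failure times and observation window below are hypothetical.

```python
import numpy as np

def power_law_mle(times, T):
    # Closed-form MLEs of the power-law NHPP for times observed on (0, T].
    times = np.asarray(times, dtype=float)
    n = len(times)
    beta_hat = n / np.sum(np.log(T / times))
    alpha_hat = n / T ** beta_hat
    return alpha_hat, beta_hat

def intensity(t, alpha, beta):
    # Instantaneous rate of occurrence of failures at time t.
    return alpha * beta * t ** (beta - 1)

# Hypothetical failure times (hours) on an observation window of T = 500.
failure_times = [40, 95, 160, 210, 300, 370, 410, 480]
alpha_hat, beta_hat = power_law_mle(failure_times, T=500)
print("alpha:", round(alpha_hat, 4), " beta:", round(beta_hat, 3))
print("estimated intensity at t = 500:", round(intensity(500, alpha_hat, beta_hat), 4))
```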
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability, engineering, and the analysis of survival functions; therefore researchers have carried out extensive studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the first and second Bayes methods, the least squares method, and the jackknife method, which depends in the first place on the maximum likelihood method; the methods are then compared using simulation. To accomplish this task, samples of different sizes have been adopted by the researcher
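A minimal sketch, assuming an exponential distribution right-truncated at a known point tau (the paper's exact truncation scheme is not reproduced): the density is f(x) = lam*exp(-lam*x)/(1 - exp(-lam*tau)) on (0, tau), the survival function is S(x) = (exp(-lam*x) - exp(-lam*tau))/(1 - exp(-lam*tau)), and lam is estimated by numerical maximum likelihood on simulated data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

TAU = 5.0                                            # assumed known truncation point

def neg_log_lik(lam, x):
    if lam <= 0:
        return np.inf
    # -[ n*log(lam) - lam*sum(x) - n*log(1 - exp(-lam*tau)) ]
    return -(len(x) * np.log(lam) - lam * x.sum()
             - len(x) * np.log1p(-np.exp(-lam * TAU)))

def survival(t, lam):
    # Survival function of the right-truncated exponential on (0, tau).
    return (np.exp(-lam * t) - np.exp(-lam * TAU)) / (1 - np.exp(-lam * TAU))

# Simulated truncated-exponential data via rejection of values above tau.
rng = np.random.default_rng(4)
raw = rng.exponential(scale=1 / 0.8, size=2000)      # true lam = 0.8 (assumed)
x = raw[raw < TAU][:200]

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10), args=(x,), method="bounded")
lam_hat = res.x
print("MLE of lam:", round(lam_hat, 3))
print("estimated S(2):", round(survival(2.0, lam_hat), 3))
```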
The last few years have witnessed a great and increasing use of medical image analysis. These tools help radiologists and doctors consult while making a particular diagnosis. In this study, we used the relationship between statistical measurements, computer vision, and medical images, along with a logistic regression model, to extract breast cancer imaging features. These features were used to distinguish the shape of a mass (fibroid vs. fatty) from the regions of interest (ROI) of the mass. The final fit of the logistic regression model showed that the most important variables that clearly affect breast cancer shape images are skewness, kurtosis, center of mass, and angle, with an AUCROC of
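An illustrative sketch only (the features and labels below are simulated, not the study's data): a logistic regression is fitted on mass-shape features of the kind named above (skewness, kurtosis, center of mass, and angle), and the area under the ROC curve is reported.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 300
# Hypothetical ROI features: skewness, kurtosis, center of mass, angle.
X = np.column_stack([
    rng.normal(0.5, 1.0, n),     # skewness
    rng.normal(3.0, 1.5, n),     # kurtosis
    rng.normal(50, 10, n),       # center of mass
    rng.uniform(0, 180, n),      # angle (degrees)
])
# Simulated label: 1 = fibroid-like mass, 0 = fatty-like mass.
eta = 1.2 * X[:, 0] + 0.4 * (X[:, 1] - 3) - 0.02 * (X[:, 2] - 50) + 0.005 * (X[:, 3] - 90)
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("fitted coefficients:", np.round(model.coef_[0], 3))
print("AUC (ROC):", round(auc, 3))
```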
Abstract
This research presents the theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods (MLE, LSE, GWPM) was used and compared with the (RRE) estimation method. In order to find the best estimation method, a set of (36) simulation experiments with many replications was carried out to obtain the mean squared error and use it for comparison; the simulation experiments varied the estimation method, the sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation-experiment factors, and that other estimation methods, such as shrinkage and jackknife, could also be used
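A small sketch of one of the listed methods (maximum likelihood) for the Lomax distribution on simulated data; the study's simulation design (36 experiments over several sample sizes and parameter values) is not reproduced, and the true parameter values and sample size below are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
TRUE_SHAPE, TRUE_SCALE = 2.5, 1.0

# Draw a Lomax (Pareto Type II) sample with location fixed at zero.
x = stats.lomax.rvs(TRUE_SHAPE, scale=TRUE_SCALE, size=200, random_state=rng)

# Maximum-likelihood fit with the location held at zero.
shape_hat, loc_hat, scale_hat = stats.lomax.fit(x, floc=0)
mse_shape = (shape_hat - TRUE_SHAPE) ** 2
print("ML estimates  shape:", round(shape_hat, 3), " scale:", round(scale_hat, 3))
print("squared error of the shape estimate:", round(mse_shape, 4))
```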
Linear regression is one of the most important statistical tools, through which the relationship between the response variable and one or more independent variables can be studied; it is widely used in various fields of science. Heteroscedasticity is one of the problems of linear regression, and its effect leads to inaccurate conclusions. The problem of heteroscedasticity may be accompanied by the presence of extreme outliers in the independent variables, known as high leverage points (HLPs); the presence of HLPs in the data set produces unrealistic estimates and misleading inferences. In this paper, we review some of the robust methods
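A generic illustration (the excerpt above does not name the specific robust estimators reviewed): ordinary least squares is compared with two common robust alternatives, the Huber M-estimator and the Theil-Sen estimator, on simulated data that combine heteroscedastic errors with a few injected high leverage points.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor, TheilSenRegressor

rng = np.random.default_rng(7)
n = 100
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 1 + 0.3 * x, n)   # heteroscedastic errors

# Inject a few high leverage points: extreme x values with responses off the line.
x[:5] = rng.uniform(30, 40, 5)
y[:5] = rng.normal(0, 5, 5)
X = x.reshape(-1, 1)

for name, model in [("OLS", LinearRegression()),
                    ("Huber", HuberRegressor()),
                    ("Theil-Sen", TheilSenRegressor(random_state=0))]:
    model.fit(X, y)
    print(f"{name:9s} slope = {model.coef_[0]:.3f}  (true slope = 2)")
```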
... Show MoreABSTRICT:
This study is concerned with the estimation of constant and time-varying parameters in nonlinear ordinary differential equations that do not have analytical solutions. The estimation is done by a multi-stage method, in which the constant and time-varying parameters are estimated sequentially over several stages. In the first stage, the model of the differential equations is converted into a regression model that includes the state variables and their derivatives; the state variables and their derivatives are then estimated by a penalized-splines method and the estimates are substituted into the regression model. In the second stage, the pseudo-least squares method is used to estimate
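A minimal two-stage sketch on a toy model dx/dt = -theta*x (not the study's equations): stage 1 smooths the noisy state with a smoothing (penalized) spline and differentiates the fit; stage 2 estimates theta by least squares from the smoothed state and its derivative. The true parameter, noise level, and smoothing factor are assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(8)
THETA = 0.7
t = np.linspace(0, 5, 60)
x_true = np.exp(-THETA * t)                      # solution of dx/dt = -theta*x
x_obs = x_true + rng.normal(0, 0.02, t.size)     # noisy observations of the state

# Stage 1: smoothing spline for the state and its derivative.
spline = UnivariateSpline(t, x_obs, k=4, s=0.02)  # s controls the roughness penalty
x_hat = spline(t)
dx_hat = spline.derivative()(t)

# Stage 2: least-squares regression of dx/dt on -x gives the ODE parameter.
theta_hat = -np.sum(dx_hat * x_hat) / np.sum(x_hat ** 2)
print("estimated theta:", round(theta_hat, 3), " (true value 0.7)")
```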
We present the exponentiated expanded power function (EEPF) distribution with four parameters. This distribution was created by the exponentiation method introduced by Gupta, which expands the exponential distribution by adding a new shape parameter to the cumulative distribution function, resulting in a new distribution; this method has the advantage of producing a distribution that belongs to the exponential family. We also obtain the survival function and the failure-rate function of this distribution, derive some of its mathematical properties, and then use the maximum likelihood (ML) method and the developed least squares (LSD) method
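As a hedged illustration of the exponentiation device described above (the exact baseline CDF of the expanded power function is not reproduced here, so a generic baseline F with density f is used), the construction, survival function, and failure rate take the following general form:

```latex
% Exponentiated construction with a generic baseline CDF F(x) and density f(x):
% a new shape parameter \theta > 0 raises the baseline CDF to a power.
\[
G(x) = [F(x)]^{\theta}, \qquad g(x) = \theta\,[F(x)]^{\theta-1} f(x),
\]
\[
S(x) = 1 - [F(x)]^{\theta}, \qquad
h(x) = \frac{g(x)}{S(x)} = \frac{\theta\,[F(x)]^{\theta-1} f(x)}{1 - [F(x)]^{\theta}}.
\]
```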