In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Soft Drinks Company for the period 1/1/2016-31/12/2016. A sample of 148 observations was obtained in order to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the goodness-of-fit (G.O.F.) criterion across the three techniques, the lowest value of this criterion was achieved by the K-nearest neighbour smoother with the Gaussian function.
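The abstract does not spell out the fuzzy formulation, so the following is only a minimal classical sketch of the winning technique it names: a K-nearest-neighbour smoother with Gaussian kernel weights, scored by a mean-squared-residual G.O.F. value. The function names, the bandwidth choice (distance to the k-th neighbour), and the simulated data are all illustrative assumptions.

```python
import numpy as np

def knn_gaussian_smoother(x_train, y_train, x_eval, k=10):
    """k-nearest-neighbour regression with Gaussian kernel weights.

    For each evaluation point, the k nearest observations are weighted by a
    Gaussian kernel whose bandwidth is the distance to the k-th neighbour,
    then averaged."""
    y_hat = np.empty(len(x_eval))
    for i, x0 in enumerate(x_eval):
        d = np.abs(x_train - x0)
        idx = np.argsort(d)[:k]              # indices of the k nearest points
        h = max(d[idx].max(), 1e-12)         # local bandwidth = k-th distance
        w = np.exp(-0.5 * (d[idx] / h) ** 2) # Gaussian weights
        y_hat[i] = np.sum(w * y_train[idx]) / np.sum(w)
    return y_hat

# Goodness-of-fit on the training sample (smaller is better)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 148)                  # 148 observations, as in the study
y = np.sin(x) + rng.normal(0, 0.2, 148)      # synthetic stand-in for price data
gof = np.mean((y - knn_gaussian_smoother(x, y, x, k=10)) ** 2)
print(f"G.O.F. (mean squared residual): {gof:.4f}")
```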
A stochastic process {X_k, k = 1, 2, ...} is a doubly geometric stochastic process if there exist a ratio a > 0 and a positive function h(k) > 0 such that {a^(k-1) h(k) X_k, k = 1, 2, ...} is a renewal process; it is a generalization of the geometric stochastic process. This process is stochastically monotone and can be used to model a point process with multiple trends. In this paper, we use nonparametric methods to investigate statistical inference for doubly geometric stochastic processes. A graphical technique for determining whether a process agrees with a doubly geometric stochastic process is proposed. Further, the parameters a, b, μ and σ² of the doubly geometric stochastic process can be estimated by applying the least squares estimate to X_k.
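Since the abstract lists a parameter b alongside a, a common choice is h(k) = k^b; under that assumption, taking logarithms turns the model into a linear regression that ordinary least squares can fit. The sketch below rests on that assumed form of h(k), and the intercept-based estimate of μ is only a crude stand-in for whatever estimator the paper derives.

```python
import numpy as np

def fit_dgp_least_squares(x):
    """Least-squares fit of a doubly geometric process.

    Assumes h(k) = k**b, so that a**(k-1) * k**b * X_k behaves like a
    renewal sequence.  Taking logs gives the linear model
        log X_k = const - (k-1)*log(a) - b*log(k) + error,
    which is fitted by ordinary least squares."""
    k = np.arange(1, len(x) + 1)
    A = np.column_stack([np.ones(len(x)), k - 1.0, np.log(k)])
    beta, *_ = np.linalg.lstsq(A, np.log(x), rcond=None)
    a_hat = np.exp(-beta[1])
    b_hat = -beta[2]
    mu_hat = np.exp(beta[0])                 # crude: exp of the intercept
    resid = np.log(x) - A @ beta
    sigma2_hat = resid.var(ddof=3)           # residual variance on the log scale
    return a_hat, b_hat, mu_hat, sigma2_hat

# Simulated example with a = 1.05, b = 0.3 and a lognormal renewal part
rng = np.random.default_rng(1)
k = np.arange(1, 201)
x = rng.lognormal(0.0, 0.1, 200) / (1.05 ** (k - 1) * k ** 0.3)
print(fit_dgp_least_squares(x))              # approx. (1.05, 0.3, ...)
```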
Each phenomenon contains several variables. By studying these variables, we find a mathematical formula for the joint distribution; the copula is a useful and good tool for finding the amount of correlation. The survival function was used to measure the relationship of age with the creatinine level in a person's blood. The SPSS program was used to extract the influential variables from a group of variables using factor analysis, and then the Clayton copula function, which builds shared bivariate distributions from multivariate distributions, was applied: the bivariate distribution was calculated, and then the survival function value was computed for a sample of size 50 drawn from Yarmouk Hospital.
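A common way to couple two survival margins is to plug them straight into the Clayton generator, giving a joint survival function with dependence parameter theta (Kendall's tau = theta/(theta+2)). The sketch below uses that construction; the exponential margins for age and creatinine are purely hypothetical placeholders, not the marginals fitted in the study.

```python
import numpy as np

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return np.maximum(u ** -theta + v ** -theta - 1.0, 0.0) ** (-1.0 / theta)

def joint_survival(sx, sy, theta):
    """Bivariate survival obtained by feeding the marginal survival
    probabilities into the Clayton generator (the usual Clayton survival
    model)."""
    return clayton_copula(sx, sy, theta)

# Hypothetical exponential margins for age and creatinine level
age, creat = 60.0, 1.4
sx = np.exp(-age / 70.0)        # assumed marginal survival of age
sy = np.exp(-creat / 1.2)       # assumed marginal survival of creatinine
print(joint_survival(sx, sy, theta=2.0))   # theta=2 -> Kendall's tau = 0.5
```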
This research deals with an unusual approach to analyzing simple linear regression via linear programming with the two-phase method, which is known in operations research (O.R.). The estimate is found by solving an optimization problem after adding artificial variables R_i. Another method for analyzing simple linear regression is also introduced, in which the conditional median of y is considered by minimizing the sum of absolute residuals, instead of the conditional mean of y, which depends on minimizing the sum of squared residuals; this is called median regression. In addition, an iteratively reweighted least squares procedure based on the absolute residuals as weights is performed here as another method.
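Median (least-absolute-deviations) regression has a standard linear-programming form: split the residuals into nonnegative parts e+ and e- and minimise their sum. The sketch below uses SciPy's general LP solver rather than a hand-coded two-phase simplex, so it illustrates the same estimator but not the paper's specific algorithm; the variable layout is an assumption.

```python
import numpy as np
from scipy.optimize import linprog

def median_regression(x, y):
    """Least-absolute-deviations (median) regression via linear programming.

    Variables: [beta+, beta-, e+, e-], all >= 0, with beta = beta+ - beta-.
    Minimising sum(e+ + e-) subject to y = X beta + e+ - e- gives the fit
    minimising the sum of absolute residuals."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    p = X.shape[1]
    c = np.concatenate([np.zeros(2 * p), np.ones(2 * n)])
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:2 * p]        # [intercept, slope]

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.5 * x + rng.laplace(0, 1, 50)    # heavy-tailed noise
print(median_regression(x, y))               # approx. [2.0, 0.5]
```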
This research studies the linear regression model when the random errors are autocorrelated and normally distributed. Linear regression analysis models the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four methods (the least squares method, the unweighted average method, the Theil method, and the Laplace method) were compared by simulation using the mean squared error (MSE), with four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years 2010, 2011, and 2012.
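The abstract does not give the simulation design, so the following is a hedged Monte Carlo sketch of this kind of comparison for two of the four methods (OLS and the Theil slope estimator, via scipy.stats.theilslopes) under AR(1)-autocorrelated normal errors, scored by MSE of the slope; the coefficients, rho, and replication count are assumptions.

```python
import numpy as np
from scipy.stats import theilslopes

def mse_comparison(n, n_rep=1000, beta0=1.0, beta1=2.0, rho=0.5):
    """Monte Carlo MSE of the slope for OLS vs. the Theil estimator
    under AR(1)-autocorrelated normal errors."""
    rng = np.random.default_rng(3)
    err_ols, err_theil = [], []
    for _ in range(n_rep):
        x = rng.uniform(0, 10, n)
        e = np.empty(n)                       # AR(1) error process
        e[0] = rng.normal()
        for t in range(1, n):
            e[t] = rho * e[t - 1] + rng.normal()
        y = beta0 + beta1 * x + e
        err_ols.append((np.polyfit(x, y, 1)[0] - beta1) ** 2)
        err_theil.append((theilslopes(y, x)[0] - beta1) ** 2)
    return np.mean(err_ols), np.mean(err_theil)

for n in (15, 30, 60, 100):                   # the four sample sizes studied
    print(n, mse_comparison(n))
```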
In this paper, we study a nonparametric model when the response variable has missing observations (nonresponse) under the MCAR missing mechanism. We suggest kernel-based nonparametric single imputation for the missing values and compare it with nearest neighbour imputation via simulation over several models and several settings of sample size, variance, and rate of missing data.
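A minimal sketch of the two competing imputation rules, assuming a Nadaraya-Watson form for the kernel imputer with a Gaussian kernel and a fixed bandwidth h (the paper's exact kernel and bandwidth rule are not stated in the abstract):

```python
import numpy as np

def kernel_impute(x, y, h=1.0):
    """Kernel-based single imputation under MCAR: each missing y is
    replaced by a Nadaraya-Watson estimate built from the observed pairs."""
    obs = ~np.isnan(y)
    y_filled = y.copy()
    for i in np.where(~obs)[0]:
        w = np.exp(-0.5 * ((x[obs] - x[i]) / h) ** 2)   # Gaussian weights
        y_filled[i] = np.sum(w * y[obs]) / np.sum(w)
    return y_filled

def nn_impute(x, y):
    """Nearest-neighbour imputation: copy the y of the closest observed x."""
    obs = ~np.isnan(y)
    y_filled = y.copy()
    for i in np.where(~obs)[0]:
        y_filled[i] = y[obs][np.argmin(np.abs(x[obs] - x[i]))]
    return y_filled

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 100)
y = np.sin(x) + rng.normal(0, 0.2, 100)
y[rng.random(100) < 0.2] = np.nan            # roughly 20% MCAR missingness
print(kernel_impute(x, y, h=0.5)[:5], nn_impute(x, y)[:5])
```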
The aim of this research is to estimate the parameters of the linear regression model with errors following an ARFIMA model, using the wavelet method based on maximum likelihood, approximate generalized least squares, and ordinary least squares. We apply the estimators to real data: the monthly inflation and dollar exchange rate series obtained from the Central Statistical Organization (CSO) for the period from 1/2005 to 12/2015. The results showed that the wavelet maximum likelihood (WML) estimator was the most reliable and efficient of the estimators, and that changing the fractional difference parameter d does not affect the results.
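The wavelet-based likelihood itself is beyond what the abstract specifies, but the role of the parameter d can be made concrete: ARFIMA errors are defined through the fractional difference operator (1 - B)^d, whose binomial weights follow a simple recursion. The sketch below implements only that operator; the truncation length is an assumption.

```python
import numpy as np

def frac_diff(x, d, n_weights=100):
    """Apply the fractional difference operator (1 - B)^d by truncated
    binomial expansion; d is the ARFIMA long-memory parameter."""
    w = np.empty(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k    # recursive binomial weights
    out = np.empty(len(x))
    for t in range(len(x)):
        m = min(t + 1, n_weights)
        out[t] = np.dot(w[:m], x[t::-1][:m]) # weights times lagged values
    return out

rng = np.random.default_rng(6)
e = rng.normal(size=132)                     # 132 months, as in 1/2005-12/2015
print(frac_diff(e, d=0.3)[:5])
```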
Researchers have a special interest in studying Markov chains as probabilistic models with many applications in different fields. This study deals with the changes that occur in budget expenditures by using statistical methods, and Markov chains express this best, as they are reliable models for prediction. A transition matrix is built for three expenditure states (increase, decrease, stability) for one budget expenditure item (base salary) across three directorates (Baghdad, Nineveh, Diyala) of one of the ministries. The results are analyzed by applying the maximum likelihood estimation and ordinary least squares methods.
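The maximum likelihood estimate of a Markov transition matrix is simply the matrix of observed transition counts, row-normalised. A minimal sketch with the three expenditure states named in the abstract (the example chain is invented for illustration):

```python
import numpy as np

STATES = ("increase", "decrease", "stability")

def transition_matrix_mle(chain):
    """MLE of a Markov transition matrix: count observed transitions
    between consecutive states, then normalise each row."""
    counts = np.zeros((len(STATES), len(STATES)))
    for s, t in zip(chain[:-1], chain[1:]):
        counts[STATES.index(s), STATES.index(t)] += 1
    return counts / counts.sum(axis=1, keepdims=True)

chain = ["increase", "increase", "decrease", "stability", "increase",
         "decrease", "decrease", "stability", "increase", "increase"]
print(transition_matrix_mle(chain))
```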
The purpose of this research is to find an estimator of the average proportion of defectives based on attribute samples that have been curtailed, either with rejection of a lot on finding the kth defective or with acceptance on finding the kth non-defective.
The maximum likelihood estimator (MLE) is derived, the average sample number (ASN) in single curtailed sampling is derived in a simplified formula, and all the notation needed is explained.
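The paper's closed-form MLE and ASN formulas are not reproduced in the abstract, so the sketch below only simulates the sampling scheme itself: inspect items one by one, stop with rejection at the k_def-th defective or acceptance at the k_good-th non-defective, and report the empirical ASN together with the naive estimate d/n for each lot. The stopping limits and p are illustrative assumptions.

```python
import numpy as np

def simulate_curtailed(p, k_def, k_good, n_rep=100_000, seed=4):
    """Monte Carlo for single curtailed sampling.  Returns the empirical
    average sample number (ASN) and the mean of the naive estimator d/n."""
    rng = np.random.default_rng(seed)
    lengths, estimates = [], []
    for _ in range(n_rep):
        d = g = n = 0
        while d < k_def and g < k_good:      # stop at k-th defective / good
            n += 1
            if rng.random() < p:
                d += 1
            else:
                g += 1
        lengths.append(n)
        estimates.append(d / n)
    return np.mean(lengths), np.mean(estimates)

print(simulate_curtailed(p=0.1, k_def=3, k_good=25))
```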
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible, level-dependent threshold values in the case of correlated errors, since they treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing rates of theft crimes.
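A hedged sketch of the level-dependent idea using PyWavelets: estimate the noise scale separately at each detail level (via the usual median-absolute-deviation rule) and soft-threshold that level with its own universal threshold. The wavelet, decomposition depth, and threshold rule here are assumptions, not the paper's exact choices.

```python
import numpy as np
import pywt

def level_dependent_shrink(y, wavelet="sym8", level=4):
    """Wavelet shrinkage with a separate threshold per detail level,
    rather than one global threshold for all levels.  Useful when the
    errors are correlated, since the noise level then varies by scale."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    out = [coeffs[0]]                              # keep the approximation
    for d in coeffs[1:]:
        sigma = np.median(np.abs(d)) / 0.6745      # scale-wise noise estimate
        thr = sigma * np.sqrt(2 * np.log(len(d)))  # universal rule, per level
        out.append(pywt.threshold(d, thr, mode="soft"))
    return pywt.waverec(out, wavelet)[:len(y)]

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 256)
y = np.sin(4 * np.pi * t) + rng.normal(0, 0.3, 256)
print(level_dependent_shrink(y)[:5])
```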
This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on rescaling the smoothing parameter; this parameter plays a large and important role in kernel estimation and determines the sound amount of smoothing.
We show the importance of this method by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the preference of the nonparametric estimator with the Gaussian kernel over the other nonparametric and parametric regression estimators.
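The canonical-kernel idea is that multiplying the bandwidth by a kernel-specific constant delta_K = (R(K)/mu_2(K)^2)^(1/5) makes the same h produce a comparable amount of smoothing across kernels. A minimal Nadaraya-Watson sketch using the standard canonical factors for the Gaussian and Epanechnikov kernels (the function names and the two-kernel menu are illustrative choices, not the paper's implementation):

```python
import numpy as np

# Canonical bandwidth factors delta_K = (R(K) / mu_2(K)^2)^(1/5)
DELTA = {"gaussian": 0.7764, "epanechnikov": 1.7188}

def nadaraya_watson(x, y, x0, h, kernel="gaussian"):
    """Nadaraya-Watson estimate at x0 with the bandwidth put on the
    canonical scale, so the same h smooths comparably for either kernel."""
    u = (x - x0) / (h * DELTA[kernel])
    if kernel == "gaussian":
        w = np.exp(-0.5 * u ** 2)
    else:                                    # Epanechnikov kernel
        w = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(8)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(0, 0.2, 200)
for kern in DELTA:                           # comparable fits at the same h
    print(kern, nadaraya_watson(x, y, 5.0, h=0.8, kernel=kern))
```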