In this paper, the transfer function model in time series is estimated by several methods: a parametric approach, represented by the conditional likelihood function method, and two nonparametric approaches, local linear regression and the cubic smoothing spline. The research compares these estimators with the nonlinear transfer function model through a simulation study of two models for the output variable and one model for the input variable, with random errors generated from an ARMA model under two functions and a variance of (0.5), at sample sizes (n = 100, 150, 200). The results showed the superiority of the nonparametric transfer function model with the cubic smoothing spline estimator (C.S.S) over the nonlinear and the other nonparametric transfer function models.
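As a minimal illustration of the cubic smoothing spline estimator used above (not the paper's simulation design; the regression function, sample size, and smoothing level below are assumptions chosen only to match the stated error variance of 0.5):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 10, n))
y_true = np.sin(x) + 0.5 * x                      # assumed smooth signal
y = y_true + rng.normal(0, np.sqrt(0.5), n)       # error variance 0.5, as in the study

# cubic smoothing spline (k=3); s targets the residual sum of squares,
# set here to n * (noise variance), a standard heuristic
css = UnivariateSpline(x, y, k=3, s=n * 0.5)
y_hat = css(x)
mse = np.mean((y_hat - y_true) ** 2)              # MSE against the true curve
```

In a simulation comparison like the paper's, this MSE would be averaged over replications and compared against the parametric and local linear fits.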
After Zadeh introduced the concept of the z-number, scientists in various fields have shown keen interest in applying it. In applications of z-numbers, a ranking procedure is essential for comparing two z-numbers. While a few ranking functions have already been proposed in the literature, there is a need to develop more effective ones. In this paper, a novel ranking function for z-numbers, the "Momentum Ranking Function" (MRF), is proposed. Game-theoretic problems in which the payoff matrix elements are z-numbers are also considered, and the application of the momentum ranking function to such problems is demonstrated.
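The abstract does not reproduce the MRF definition, so the sketch below uses a deliberately simple placeholder ranking (centroid of the restriction weighted by the centroid of the reliability) purely to show the pipeline: rank each z-number payoff to a crisp value, then check the zero-sum game for a saddle point. The z-number values are hypothetical.

```python
# z-number = (A, B): A is the fuzzy restriction, B the fuzzy reliability,
# each given here as a triangular fuzzy number (l, m, u)
def centroid(t):
    l, m, u = t
    return (l + m + u) / 3.0

def rank(z):
    # placeholder ranking, NOT the paper's MRF: centroid(A) weighted by centroid(B)
    A, B = z
    return centroid(A) * centroid(B)

# hypothetical 2x2 payoff matrix of z-numbers
payoff = [
    [((1, 2, 3), (0.6, 0.7, 0.8)), ((4, 5, 6), (0.8, 0.9, 1.0))],
    [((2, 3, 4), (0.7, 0.8, 0.9)), ((0, 1, 2), (0.5, 0.6, 0.7))],
]
crisp = [[rank(z) for z in row] for row in payoff]

maximin = max(min(row) for row in crisp)        # row player's guaranteed value
minimax = min(max(col) for col in zip(*crisp))  # column player's cap
saddle = (maximin == minimax)                   # pure-strategy equilibrium exists?
```

With any ranking function in place of `rank`, the same maximin/minimax step solves the crisp game; when no saddle point exists, mixed strategies are needed.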
This paper studies two stratified quantile regression models of the marginal and conditional varieties. We estimate the quantile functions of these models using two nonparametric methods: smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective function, using the varying coefficient model approach. The main goal is to compare the estimators of the two nonparametric methods and to adopt the better of the two.
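A minimal sketch of the kernel (Nadaraya-Watson style) side of this comparison: a local-constant conditional quantile at a point, obtained by minimizing the kernel-weighted check loss. The data-generating process and bandwidth are assumptions for illustration, not the paper's design.

```python
import numpy as np

def check_loss(u, tau):
    # quantile "check" loss rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0))

def nw_quantile(x0, x, y, tau, h):
    # local-constant conditional tau-quantile at x0:
    # minimize the Gaussian-kernel-weighted check loss over candidate values
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    candidates = np.sort(y)
    obj = [(w * check_loss(y - q, tau)).sum() for q in candidates]
    return candidates[int(np.argmin(obj))]

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, 500)
y = 2 * x + rng.normal(0, 0.1, 500)
q50 = nw_quantile(1.0, x, y, tau=0.5, h=0.2)  # conditional median near 2 * 1.0 = 2
```

The B-spline estimator would replace the local weighting with a penalized spline basis; the comparison criterion is then the error of each fitted quantile curve.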
In this research, we study the non-homogeneous Poisson process, one of the most important statistical topics for scientific development, as it relates to events occurring in reality that are modeled as Poisson processes; the occurrence of such an event is related to time, whether time changes or remains stable. The research clarifies the non-homogeneous Poisson process and uses one of its models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function to estimate the time-varying rate of earthquake occurrence in Erbil Governorate, a governorate adjacent to two countries.
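A hedged sketch of one common way such a model is built (the paper's exact parameterization may differ): the exponentiated-Weibull distribution function supplies the shape of the NHPP mean-value function Λ(t), and the time-varying rate λ(t) is its derivative. The scale factor `theta` and the parameter values are assumptions.

```python
import numpy as np

def ew_cdf(t, alpha, beta, sigma):
    # exponentiated-Weibull CDF: shapes alpha, beta; scale sigma
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def mean_events(t, theta, alpha, beta, sigma):
    # one common NHPP construction: Lambda(t) = theta * F_EW(t)
    return theta * ew_cdf(t, alpha, beta, sigma)

def intensity(t, theta, alpha, beta, sigma, eps=1e-6):
    # time-varying rate lambda(t) = d Lambda / dt, via central difference
    return (mean_events(t + eps, theta, alpha, beta, sigma)
            - mean_events(t - eps, theta, alpha, beta, sigma)) / (2 * eps)
```

Fitting (α, β, σ, θ) to event times would then proceed by maximizing the NHPP likelihood built from λ(t) and Λ(t).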
Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. The study aimed to compare the two models and choose the better one using simulation, at different sample sizes (n = 25, 50, 100) and with (r = 1000) replications. The Matlab program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion as well as the Akaike information criterion (AIC) for the same distribution.
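A sketch of the Poisson half of such a comparison, in Python rather than Matlab: fit a Poisson regression by maximum likelihood on simulated counts, then compute the two criteria used above (MSE of the coefficient estimates and AIC). The covariate design and true coefficients are assumptions; a full replication would repeat this r = 1000 times and add the Conway-Maxwell-Poisson fit.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 1, n)
beta_true = np.array([0.5, 1.0])                 # assumed true coefficients
y = rng.poisson(np.exp(beta_true[0] + beta_true[1] * x))

def neg_loglik(beta):
    eta = beta[0] + beta[1] * x
    # Poisson negative log-likelihood (log y! included so AIC is on the full likelihood)
    return -np.sum(y * eta - np.exp(eta) - gammaln(y + 1))

fit = minimize(neg_loglik, np.zeros(2))
aic = 2 * 2 + 2 * neg_loglik(fit.x)              # AIC = 2k - 2 log L, k = 2 here
mse = np.mean((fit.x - beta_true) ** 2)          # squared-error criterion for the coefficients
```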
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more treatments, where the endpoint is the death of the patient or the individual's disappearance from the study. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data in which the variable of interest is the time to an event.
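The standard first step in such an analysis is the Kaplan-Meier estimator of the survival function, which handles exactly the censoring described above (individuals who disappear before the event). A minimal sketch with toy data:

```python
import numpy as np

def kaplan_meier(times, events):
    # times: follow-up times; events: 1 = event (e.g. death) observed, 0 = censored
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(events)[order]
    surv, s = [], 1.0
    for u in np.unique(t[d == 1]):               # distinct event times
        at_risk = (t >= u).sum()                 # still under observation at u
        deaths = ((t == u) & (d == 1)).sum()
        s *= 1.0 - deaths / at_risk              # product-limit update
        surv.append((u, s))
    return surv

# toy sample: 5 patients, two of them censored (events = 0)
km = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Each pair in `km` is an event time and the estimated probability of surviving past it.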
The Bayesian approach promises attractive features for the classification and regression tree model: it takes advantage of prior information, ensembles trees over all explanatory variables together and at every stage, and yields posterior information at each node in the construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses through its flexibility and mathematical representation. Three methods of data processing are therefore used in this research: the logistic model, the classification and regression tree model, and the Bayesian regression tree model.
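To make the logistic-versus-tree comparison concrete, here is a hedged sketch on simulated binary data: a logistic fit by maximum likelihood, against the simplest possible "tree" (a depth-1 stump, one split on x). The data-generating coefficients are assumptions; the paper's Bayesian tree model is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 300
x = rng.normal(0, 1, n)
p = 1 / (1 + np.exp(-(0.3 + 1.5 * x)))           # assumed true logistic model
y = rng.binomial(1, p)

def nll(beta):
    eta = beta[0] + beta[1] * x
    # numerically stable Bernoulli negative log-likelihood
    return np.sum(np.logaddexp(0, eta) - y * eta)

fit = minimize(nll, np.zeros(2))
acc_logit = np.mean(((fit.x[0] + fit.x[1] * x) > 0) == y)

# depth-1 "classification tree": pick the split threshold with the lowest error
errs = [(np.mean((x > t) != y), t) for t in np.unique(x)]
best_err, best_t = min(errs)
```

A real tree would recurse on each side of `best_t`; the Bayesian variant would place a prior over tree structures and average over the posterior.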
The research includes three axes. The first is estimating the average completion time (in days) of oversight work for five supervisory departments in the Federal Office of Financial Supervision, then choosing three control outputs at the level of each of the five departments. Statistical analysis of the data showed that the distribution of completion times is either the exponential distribution with one parameter (q) or the normal distribution with two parameters (μ, σ²). Four methods were introduced for estimating the parameter (q), as well as four methods for estimating the normal distribution's parameters.
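One of the estimation methods that would appear in any such set is maximum likelihood, whose estimators for these two distributions have closed forms. A small sketch on simulated completion times (the true mean of 4 days is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
times = rng.exponential(scale=4.0, size=500)  # simulated completion times (days)

# maximum-likelihood estimates
theta_hat = times.mean()     # exponential: MLE of the mean (scale) parameter
mu_hat = times.mean()        # normal: MLE of mu
sigma2_hat = times.var()     # normal: MLE of sigma^2 (divisor n, not n-1)
```

The other methods mentioned (e.g. moments or Bayesian estimators) would be compared against these on criteria such as MSE.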
In this paper we use the Markov switching model to investigate the link between the level of Iraqi inflation and its uncertainty for the period 1980-2010. We measure inflation uncertainty as the variance of unanticipated inflation. The results confirm a negative effect of the inflation level on inflation uncertainty, as well as a positive effect of inflation uncertainty on the inflation level.
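The workhorse behind Markov switching estimation is the Hamilton filter, which evaluates the model's likelihood by averaging regime densities with filtered regime probabilities. A minimal two-regime Gaussian sketch (the series, regime parameters, and transition matrix below are assumptions, not the paper's inflation data):

```python
import numpy as np

def hamilton_filter_loglik(y, mu, sigma, P):
    # 2-regime Gaussian Markov switching log-likelihood via the Hamilton filter
    # mu, sigma: per-regime mean and std; P[i, j] = Pr(s_t = j | s_{t-1} = i)
    # start from the stationary regime distribution (solve pi = P' pi, sum = 1)
    A = np.vstack([np.eye(2) - P.T, np.ones(2)])
    pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]
    pred, ll = pi, 0.0
    for yt in y:
        dens = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        joint = pred * dens
        lik = joint.sum()
        ll += np.log(lik)                 # one-step predictive likelihood
        pred = (joint / lik) @ P          # filter, then propagate one step
    return ll

rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0, 1, 50), rng.normal(3, 1, 50)])  # two regimes
P = np.array([[0.95, 0.05], [0.05, 0.95]])
ll_true = hamilton_filter_loglik(y, np.array([0.0, 3.0]), np.array([1.0, 1.0]), P)
ll_flat = hamilton_filter_loglik(y, np.array([0.0, 0.0]), np.array([1.0, 1.0]), P)
```

Maximizing this log-likelihood over the regime parameters yields the switching estimates; the regime variances then give a model-based measure of inflation uncertainty.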
Abstract: The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, which treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously. The methods considered were VisuShrink, the False Discovery Rate method, Improvement Thresholding, and SureShrink. The study was conducted on real monthly data representing rates of theft crimes.
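The core mechanics shared by all four threshold rules are a wavelet transform plus coefficient shrinkage. A minimal sketch using the Haar wavelet and soft thresholding (the specific wavelet and one-level transform are simplifying assumptions; VisuShrink's universal threshold would set lam = sigma * sqrt(2 log n)):

```python
import numpy as np

def soft_threshold(w, lam):
    # shrink wavelet coefficients toward zero, killing those below lam
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def haar_dwt(x):
    # one level of the Haar transform: approximation and detail coefficients
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    # exact inverse of haar_dwt
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x
```

Level-dependent thresholding, as used in the research, applies a separate `lam` to each level's detail coefficients instead of one global value.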
This work focused on the principle of higher-order mode excitation using an in-line Double Clad Multi-Mode Mach-Zehnder Interferometer (DC-MM-MZI). The DC-MM-MZI was designed with a 50 cm etched MMF; the etching length is 5 cm. The tunability of this interferometer was studied using the OptiGrating ver. 4.2.2 and OptiWave ver. 7 simulators. After removing (25, 35, 45, 55) μm from the MMF cladding, this segment of the MMF was immersed in a water bath containing distilled water and ethanol, in addition to air. A pulsed laser source centered at 1546.7 nm, with a 10 ns pulse width and 1.33 mW peak power, was propagated through this interferometer. The maximum number of modes, 9800, was obtained for an air surrounding medium with 25 μm of the cladding layer removed, with a peak power of 49.800 mW.
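The mode counts reported above scale with the fiber's normalized frequency. A hedged back-of-envelope sketch of that relation (the refractive indices and core radius below are hypothetical values, not the fiber parameters of this work):

```python
import math

def v_number(a_um, wavelength_um, n_core, n_clad):
    # normalized frequency (V number) of a step-index fiber
    na = math.sqrt(n_core**2 - n_clad**2)        # numerical aperture
    return 2 * math.pi * a_um / wavelength_um * na

def mode_count(v):
    # approximate guided-mode count for a highly multimode step-index fiber
    return v**2 / 2

# hypothetical example: 25 um core radius at the 1546.7 nm source wavelength
v = v_number(25.0, 1.5467, n_core=1.46, n_clad=1.45)
m = mode_count(v)
```

Etching the cladding raises the index contrast seen by the outer modes, which is why the surrounding medium (air versus water or ethanol) changes the supported mode set.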