The aim of this research is to estimate the parameters of a linear regression model whose errors follow an ARFIMA process, using a wavelet-based maximum likelihood method (WML) alongside generalized least squares (GLS) and ordinary least squares (OLS). The estimators are applied to real data, namely monthly observations of inflation and the dollar exchange rate obtained from the Central Statistical Organization (CSO) for the period 1/2005 to 12/2015. The results show that WML is the most reliable and efficient of the estimators, and that changing the fractional differencing parameter (d) does not affect the results.
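As a minimal illustration of the fractional differencing that defines an ARFIMA error process (not the authors' wavelet-based estimator), the sketch below applies the operator (1 - B)^d through its truncated binomial expansion; the function name, truncation length, and simulated series are assumptions.

```python
import numpy as np

def frac_diff(x, d, n_terms=100):
    """Apply the fractional differencing operator (1 - B)^d to a series x,
    truncating the binomial expansion after n_terms weights."""
    w = np.empty(n_terms)
    w[0] = 1.0
    for k in range(1, n_terms):
        w[k] = w[k - 1] * (k - 1 - d) / k          # w_k = (-1)^k * C(d, k)
    out = np.zeros(len(x))
    for t in range(len(x)):
        k_max = min(t + 1, n_terms)
        out[t] = np.dot(w[:k_max], x[t::-1][:k_max])   # sum_k w_k * x_{t-k}
    return out

# Toy example: fractionally difference simulated noise with d = 0.3
rng = np.random.default_rng(0)
y = rng.normal(size=132)                # e.g. 11 years of monthly observations
y_d = frac_diff(y, d=0.3)
```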
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X); standard linear-model assumptions such as homogeneity of variance do not hold when the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife …
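Because the abstract combines maximum likelihood estimation with ridge penalization for a binary response, the hedged sketch below contrasts an unpenalized and a ridge-penalized logistic fit on simulated, deliberately collinear data; the data, the penalty strength C, and the use of scikit-learn are assumptions, and this is not the paper's Jackknife procedure (penalty=None requires scikit-learn 1.2 or later).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated data with strongly correlated predictors to mimic multicollinearity
rng = np.random.default_rng(1)
n, p = 200, 10
base = rng.normal(size=(n, 1))
X = base + 0.1 * rng.normal(size=(n, p))          # nearly identical columns
beta = np.zeros(p)
beta[:3] = 1.0
y = (X @ beta + rng.logistic(size=n) > 0).astype(int)

# Maximum likelihood fit (unpenalized) versus ridge-penalized fit
ml_fit = LogisticRegression(penalty=None, max_iter=5000).fit(X, y)
ridge_fit = LogisticRegression(penalty="l2", C=0.5, max_iter=5000).fit(X, y)

print("ML coefficients:   ", np.round(ml_fit.coef_.ravel(), 2))
print("ridge coefficients:", np.round(ridge_fit.coef_.ravel(), 2))
```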
The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with such extremes: the peaks-over-threshold (POT) technique and the annual maximum (AM) technique, with the (extreme value, Gumbel) distributions for the AM sample and the (generalized Pareto, exponential) distributions for the POT sample. The cross-entropy algorithm was applied in two of its variants: the first estimate uses order statistics and the second uses order statistics together with the likelihood ratio; a third method is proposed by the researcher. The mean squared error (MSE) of the estimated parameters and of the probability density function for each of the distributions were …
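As a hedged companion to the two sampling schemes described above, the sketch below fits a Gumbel distribution to an annual-maximum (AM) sample and a generalized Pareto distribution to a peaks-over-threshold (POT) sample using standard SciPy maximum likelihood fits; the simulated data, threshold choice, and block length are assumptions, and this is not the cross-entropy algorithm studied in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
daily = rng.gumbel(loc=10.0, scale=2.0, size=30 * 365)   # 30 "years" of daily values

# Annual maximum (AM) sample: one block maximum per year, fitted with a Gumbel law
annual_max = daily.reshape(30, 365).max(axis=1)
gumbel_loc, gumbel_scale = stats.gumbel_r.fit(annual_max)

# Peaks-over-threshold (POT) sample: exceedances over a high threshold,
# fitted with a generalized Pareto distribution (location fixed at 0)
threshold = np.quantile(daily, 0.98)
excesses = daily[daily > threshold] - threshold
gpd_shape, _, gpd_scale = stats.genpareto.fit(excesses, floc=0)

print(f"Gumbel fit (AM): loc={gumbel_loc:.2f}, scale={gumbel_scale:.2f}")
print(f"GPD fit (POT):   shape={gpd_shape:.3f}, scale={gpd_scale:.2f}")
```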
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a given quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. The proposed methodology is illustrated with a real data example.
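For readers unfamiliar with QReg itself, the sketch below fits an ordinary (frequentist) quantile regression at a few quantile levels with statsmodels; the simulated variables are placeholders, and the power prior calibration developed in the paper is not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated heteroscedastic data; the variable names are placeholders
rng = np.random.default_rng(3)
data = pd.DataFrame({"x": rng.uniform(0, 10, 300)})
data["y"] = 1.0 + 0.5 * data["x"] + rng.standard_t(df=3, size=300) * (1 + 0.2 * data["x"])

# Fit the conditional quantile at several levels tau by minimizing the check loss
model = smf.quantreg("y ~ x", data)
for tau in (0.25, 0.5, 0.75):
    fit = model.fit(q=tau)
    print(f"tau={tau}: intercept={fit.params['Intercept']:.2f}, slope={fit.params['x']:.2f}")
```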
This research attempts to explain the essential aspects of one important tool in bank risk management, namely stress testing, which has received increased attention following the negative effects of the 2008 global financial crisis, and to study the possibilities of applying it in Iraqi banks to enhance their safety and financial soundness. Because classical risk management tools do not give a clear picture of a bank's ability to face risks, the Basel Committee on Banking Supervision focused in the Basel II and Basel III accords on stress testing as part of the internal capital adequacy assessment process (ICAAP).
To achieve the research objectives, …
In this paper, the homotopy perturbation method (HPM) is presented for treating a linear system of mixed Volterra-Fredholm integral equations of the second kind. The method is based on constructing a series whose sum is the solution of the considered system. Convergence of the constructed series is discussed and proved, and an error estimate is obtained. An algorithm is suggested and applied to several examples, with the results computed using MATLAB (R2015a). To show the accuracy of the results and the effectiveness of the method, the approximate solutions of some examples are compared with the exact solutions by computing the absolute errors.
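For a linear second-kind Volterra equation, the HPM series terms reduce to repeated integration of the previous term; the sketch below sums those terms numerically for the toy scalar equation u(x) = 1 + ∫_0^x u(t) dt (exact solution e^x). The grid, number of terms, and use of the trapezoidal rule are assumptions, and this is not the paper's mixed Volterra-Fredholm system or its MATLAB algorithm.

```python
import numpy as np

# Toy equation u(x) = 1 + \int_0^x u(t) dt; the series terms are
# u_0 = 1 and u_{k+1}(x) = \int_0^x u_k(t) dt, summed on a grid.
x = np.linspace(0.0, 1.0, 201)
term = np.ones_like(x)               # u_0 = f(x) = 1
solution = term.copy()

for _ in range(15):                  # number of series terms (an assumption)
    # cumulative trapezoidal integral of the current term from 0 to each grid point
    term = np.concatenate(([0.0], np.cumsum(0.5 * (term[1:] + term[:-1]) * np.diff(x))))
    solution += term

print("max abs error vs exp(x):", np.max(np.abs(solution - np.exp(x))))
```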
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by the wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and they are very small compared to the original signals. The compression ratio is calculated from the size of the …
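As a hedged illustration of the Levinson-Durbin step alone (not the full wavelet-plus-LPC pipeline), the sketch below computes LP coefficients, reflection coefficients, and the final prediction error from a frame's autocorrelation; the frame, the prediction order, and the function name are assumptions.

```python
import numpy as np

def levinson_durbin(frame, order):
    """Levinson-Durbin recursion: error-filter coefficients a (a[0] = 1),
    reflection coefficients, and final prediction error for one frame."""
    # Biased autocorrelation r[0..order]
    r = np.array([np.dot(frame[:len(frame) - k], frame[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    refl = np.zeros(order)
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        refl[i - 1] = k
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]   # order-update of the filter
        err *= (1.0 - k * k)
    return a, refl, err

# Toy usage on a synthetic voiced-like frame; order 10 is a common LPC choice
rng = np.random.default_rng(4)
frame = np.sin(2 * np.pi * 0.01 * np.arange(240)) + 0.05 * rng.normal(size=240)
a, refl, err = levinson_durbin(frame, order=10)
```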
The study aimed to evaluate the marketing efficiency of the dry onion crop in Salah al-Deen and to estimate the impact of some qualitative and quantitative factors on the efficiency of the crop's marketing process using a Tobit regression model. The average marketing efficiency of the research sample was 71.3686%. The marketing margins differed according to the marketing channel followed in marketing the crop. The qualitative and quantitative variables in the model are productivity, family size, distance from the market, and educational level. The estimated model revealed that productivity is the most important and influential variable for marketing efficiency, followed by the distance between the farm and the market, then the variable …
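Since the abstract relies on a Tobit regression, the sketch below fits a generic left-censored Tobit model by maximizing its log-likelihood with SciPy; the simulated data, censoring point, and covariates are assumptions and do not reflect the paper's specification of marketing efficiency.

```python
import numpy as np
from scipy import stats, optimize

# Simulated left-censored data (generic Tobit, not the paper's model)
rng = np.random.default_rng(5)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 covariates
beta_true = np.array([0.5, 1.0, -0.8])
y_star = X @ beta_true + rng.normal(scale=1.0, size=n)
y = np.maximum(y_star, 0.0)                                  # censoring at zero

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y <= 0.0
    ll_unc = stats.norm.logpdf(y[~censored], loc=xb[~censored], scale=sigma)
    ll_cen = stats.norm.logcdf(-xb[censored] / sigma)
    return -(ll_unc.sum() + ll_cen.sum())

start = np.append(np.zeros(X.shape[1]), 0.0)
res = optimize.minimize(neg_loglik, start, method="BFGS")
print("estimated beta:", np.round(res.x[:-1], 2), " sigma:", np.exp(res.x[-1]).round(2))
```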
In this paper, previous studies on fuzzy regression are reviewed. Fuzzy regression is a generalization of the traditional regression model that formulates the relationship between independent and dependent variables in a fuzzy environment; it can be introduced through a non-parametric model as well as a semi-parametric model. The results obtained in the previous studies and their conclusions are put forward in this context. We then suggest a novel estimation method using new weights instead of the old weights, and introduce another suggestion based on artificial neural networks.
Paper Type: Review article.
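As a very small baseline illustration of fuzzy linear regression with crisp inputs and symmetric triangular fuzzy outputs (center, spread), the sketch below fits one least-squares line to the centers and another to the spreads; this generic toy formulation is an assumption for illustration only and is not the new weighting scheme or the neural-network suggestion made in the paper.

```python
import numpy as np

# Toy fuzzy data: crisp input x, symmetric triangular fuzzy output (center, spread)
rng = np.random.default_rng(6)
x = np.linspace(0, 10, 50)
centers = 2.0 + 0.7 * x + rng.normal(scale=0.5, size=x.size)
spreads = 0.5 + 0.1 * x + rng.uniform(0, 0.2, size=x.size)

# Fit one least-squares line to the centers and another to the spreads
A = np.column_stack([np.ones_like(x), x])
center_coef, *_ = np.linalg.lstsq(A, centers, rcond=None)
spread_coef, *_ = np.linalg.lstsq(A, spreads, rcond=None)

print("center line:", np.round(center_coef, 2))
print("spread line:", np.round(spread_coef, 2))
```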
Penalized regression models have received considerable attention for variable selection and play an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan-penalized estimator is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator by combining these two ideas. Simulation experiments and real data applications show that the proposed …
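The sketch below combines a least absolute deviation (LAD) loss with an Atan-type penalty of the commonly cited form λ(γ + 2/π)·arctan(|β|/γ) and minimizes the resulting objective numerically; the penalty form, tuning constants, simulated data, and the derivative-free optimizer are all assumptions, not the estimator or algorithm proposed in the paper.

```python
import numpy as np
from scipy import optimize

# Simulated data with heavy-tailed (t3) errors and a sparse true coefficient vector
rng = np.random.default_rng(7)
n, p = 100, 8
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)

lam, gamma = 0.5, 0.05   # tuning constants (assumed values)

def atan_penalty(beta):
    return lam * (gamma + 2.0 / np.pi) * np.sum(np.arctan(np.abs(beta) / gamma))

def objective(beta):
    # LAD loss for robustness + Atan penalty for sparse estimation
    return np.sum(np.abs(y - X @ beta)) / n + atan_penalty(beta)

res = optimize.minimize(objective, x0=np.zeros(p), method="Nelder-Mead",
                        options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-8})
print("estimated coefficients:", np.round(res.x, 2))
```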