In this research, we studied the Non-Homogeneous Poisson Process (NHPP), one of the important statistical topics with a role in scientific development, since it models events that occur in reality according to Poisson processes, where the occurrence of the event is related to time, whether the rate changes with time or remains constant. The research clarifies the Non-Homogeneous Poisson process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate. The governorate is adjacent to two countries that lie within the seismic belt, Turkey and Iran, which means the arrival of seismic activity cannot be excluded. The three parameters of the above model were estimated by two methods, the maximum likelihood method and the Bayesian method, to find the time rate of occurrence of this phenomenon, and the mean squared error (MSE) was used to determine which method is best for estimating the model parameters; the results show that the Bayesian method is the best estimation method.
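As an illustration only (the abstract does not give the exact parameterization), the sketch below assumes the NHPP intensity is taken as the hazard of an exponentiated Weibull distribution with shape parameters α, β and scale σ, and integrates it to get the expected number of events over an interval; all numerical values are hypothetical.

```python
import numpy as np
from scipy.integrate import quad

def ew_cdf(t, alpha, beta, sigma):
    """Exponentiated Weibull CDF: F(t) = [1 - exp(-(t/sigma)**beta)]**alpha."""
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def ew_pdf(t, alpha, beta, sigma):
    """Exponentiated Weibull density obtained by differentiating the CDF."""
    g = 1.0 - np.exp(-(t / sigma) ** beta)            # inner Weibull CDF
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) \
        * np.exp(-(t / sigma) ** beta) * g ** (alpha - 1)

def intensity(t, alpha, beta, sigma):
    """NHPP intensity taken here as the exponentiated Weibull hazard f(t)/(1 - F(t))."""
    return ew_pdf(t, alpha, beta, sigma) / (1.0 - ew_cdf(t, alpha, beta, sigma))

# Expected number of events in (0, T] is the cumulative intensity Lambda(T).
alpha, beta, sigma = 1.5, 0.8, 10.0   # illustrative values only
T = 20.0
expected_events, _ = quad(intensity, 1e-9, T, args=(alpha, beta, sigma))
print(f"Expected events in (0, {T}]: {expected_events:.3f}")
```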
The tax base is one of the foundations of the technical organization of taxes, and a good choice of the tax base affects both the tax yield and its fairness. With the expansion of the tax range, a dangerous phenomenon called tax evasion arises, which has come to threaten the economies of countries; this phenomenon prevents the state from achieving its economic, political and social objectives, so the state seeks to resolve it, to mobilize all human and material potential, and to identify the real reasons behind it. The researcher found that the tax authorities are weak in their technical, material and financial capabilities, and the analysis of the data shows that there is a significant reve…
The problem of bi-level programming is to minimize or maximize an objective function while another objective function appears within the constraints. This problem has received a great deal of attention in the programming community owing to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation with the Monte Carlo method, using different small and large sample sizes. The research concluded that the Branch and Bound algorithm is preferable for solving the non-linear bi-level programming problem, because its results were better.
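A minimal sketch of the nested structure of a bi-level problem (not the paper's methods or test problems): for each Monte Carlo draw of the leader's decision, the follower's problem is solved, and the best leader decision found by sampling is reported. The objective functions and bounds below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def lower_level(x):
    """Follower's problem: y*(x) = argmin_y (y - x)**2 + 0.1*y on [0, 5]."""
    res = minimize_scalar(lambda y: (y - x) ** 2 + 0.1 * y,
                          bounds=(0.0, 5.0), method="bounded")
    return res.x

def upper_objective(x):
    """Leader's objective evaluated at the follower's best response."""
    y = lower_level(x)
    return (x - 2.0) ** 2 + (y - 1.0) ** 2

# Monte Carlo search over the leader's feasible interval [0, 4].
samples = rng.uniform(0.0, 4.0, size=500)
values = np.array([upper_objective(x) for x in samples])
best = samples[values.argmin()]
print(f"approximate leader decision x = {best:.3f}, objective = {values.min():.4f}")
```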
In the current study, the researchers obtained Bayes estimators for the shape and scale parameters of the Gamma distribution under the precautionary loss function, assuming Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators and Lindley's approximation were used effectively in the Bayesian estimation.
Based on the Monte Carlo simulation method, these estimators are compared in terms of the mean squared error (MSE). The results show that the performance of the Bayes estimator under the precautionary loss function with Gamma and Exponential priors is better than the other estimators in all cases.
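The paper estimates both Gamma parameters with Lindley's approximation; the sketch below is a simplified conjugate stand-in (rate parameter with known shape and a Gamma prior) meant only to show the Monte Carlo MSE comparison and the precautionary-loss Bayes rule, which takes the estimator as the square root of the posterior second moment. The priors and sample settings here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_mse(n=30, reps=5000, k=2.0, lam=1.5, a=1.0, b=1.0):
    """Monte Carlo MSE comparison for the Gamma rate lam (shape k known).
    MLE: k / xbar.  Bayes with a conjugate Gamma(a, b) prior under precautionary
    loss: sqrt(E[lam^2 | data]) for the posterior Gamma(a + n*k, b + sum(x))."""
    mle_err, bayes_err = [], []
    for _ in range(reps):
        x = rng.gamma(shape=k, scale=1.0 / lam, size=n)
        mle = k / x.mean()
        a_post, b_post = a + n * k, b + x.sum()          # posterior shape / rate
        bayes = np.sqrt(a_post * (a_post + 1)) / b_post  # sqrt of posterior 2nd moment
        mle_err.append((mle - lam) ** 2)
        bayes_err.append((bayes - lam) ** 2)
    return np.mean(mle_err), np.mean(bayes_err)

print("MSE (MLE), MSE (Bayes, precautionary loss):", mc_mse())
```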
Survival analysis is one of the modern methods of analysis that is based on the dependent variable representing the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the probability of survival, including the model proposed by the statistician David Cox, one of the most important and widely used survival models. It consists of two parts: a parametric function that does not depend on survival time and a nonparametric function that does depend on survival time, which is why the Cox model is described as a semi-parametric model. The set of parametric models depends on the parameters of the time-to-event distribution, such as …
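Since the abstract describes the Cox model's split into a parametric part exp(x'β), which does not involve time, and a nonparametric baseline hazard, which does, a small sketch of the ties-free Cox partial likelihood maximized over β is given below on an invented toy data set; it is not the paper's data or software.

```python
import numpy as np
from scipy.optimize import minimize

# Toy right-censored data: times, event indicators (1 = event), one covariate.
t = np.array([2.0, 3.0, 5.0, 7.0, 8.0, 11.0])
d = np.array([1,   1,   0,   1,   0,   1])
x = np.array([[0.5], [1.2], [0.3], [2.0], [0.7], [1.5]])

def neg_log_partial_likelihood(beta):
    """Cox partial likelihood (no ties): product over events of
    exp(x_i'beta) / sum over the risk set of exp(x_j'beta)."""
    eta = x @ beta
    ll = 0.0
    for i in range(len(t)):
        if d[i] == 1:
            at_risk = t >= t[i]                  # risk set at time t_i
            ll += eta[i] - np.log(np.sum(np.exp(eta[at_risk])))
    return -ll

fit = minimize(neg_log_partial_likelihood, x0=np.zeros(1), method="BFGS")
print("estimated beta:", fit.x)
```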
In this research, some robust non-parametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion with different sample sizes, levels of variance and contamination rates, for three different models. These methods are (S-LLS) S-estimation with local linear smoothing, (M-LLS) M-estimation with local linear smoothing, (S-NW) S-estimation with Nadaraya-Watson smoothing, and (M-NW) M-estimation with Nadaraya-Watson smoothing.
The results for the first model showed that the (S-LLS) method was the best for large sample sizes, while for small sample sizes the results showed that the …
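As a rough illustration of one of the four combinations (M-type estimation with Nadaraya-Watson smoothing), the sketch below computes an iteratively reweighted local-constant fit with a Gaussian kernel and Huber weights; the kernel, tuning constant, bandwidth and simulated data are assumptions, not the paper's settings.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights psi(r)/r: 1 inside [-c, c], c/|r| outside."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def m_nadaraya_watson(x0, x, y, h, iters=20):
    """Robust (M-type) Nadaraya-Watson estimate at x0: an iteratively
    reweighted local-constant fit with Gaussian kernel and Huber weights."""
    k = np.exp(-0.5 * ((x - x0) / h) ** 2)       # kernel weights
    mu = np.sum(k * y) / np.sum(k)               # ordinary NW start
    for _ in range(iters):
        r = y - mu
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # MAD scale
        w = k * huber_weights(r / scale)
        mu = np.sum(w * y) / np.sum(w)
    return mu

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 100))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 100)
y[::17] += 5.0                                   # a few gross outliers
grid = np.linspace(0.05, 0.95, 5)
print([round(m_nadaraya_watson(g, x, y, h=0.08), 3) for g in grid])
```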
This research aims to solve a nonlinear model formulated as a system of differential equations with an initial value problem (IVP), represented by a COVID-19 mathematical epidemiology model, using a new approach: approximate shrunken estimators are proposed to solve the model under investigation, combining a classic numerical method and numerical simulation techniques in an effective statistical form, namely the shrunken estimation formula. Two numerical simulation methods are used first to solve this model: the Mean Monte Carlo Runge-Kutta and Mean Latin Hypercube Runge-Kutta methods. Then two approximate simulation methods are proposed for the current study. The results of the proposed approximate shrunken methods and the numerical …
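The paper's COVID-19 model is not specified in the abstract, so the sketch below uses a plain SIR system as a stand-in to show the "mean Monte Carlo Runge-Kutta" idea: draw the uncertain rates at random, integrate each draw with classical RK4, and average the trajectories. Parameter ranges and initial conditions are hypothetical.

```python
import numpy as np

def rk4(f, y0, t):
    """Classic fourth-order Runge-Kutta integration of y' = f(t, y)."""
    y = np.zeros((len(t), len(y0)))
    y[0] = y0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        k1 = f(t[i], y[i])
        k2 = f(t[i] + h / 2, y[i] + h / 2 * k1)
        k3 = f(t[i] + h / 2, y[i] + h / 2 * k2)
        k4 = f(t[i] + h, y[i] + h * k3)
        y[i + 1] = y[i] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

def sir(beta, gamma):
    """Simple SIR right-hand side, used here only as a stand-in epidemic model."""
    return lambda t, y: np.array([-beta * y[0] * y[1],
                                  beta * y[0] * y[1] - gamma * y[1],
                                  gamma * y[1]])

rng = np.random.default_rng(7)
t = np.linspace(0, 60, 241)
runs = []
for _ in range(200):                       # Monte Carlo draws of uncertain rates
    beta = rng.uniform(0.25, 0.35)
    gamma = rng.uniform(0.08, 0.12)
    runs.append(rk4(sir(beta, gamma), np.array([0.99, 0.01, 0.0]), t))
mean_traj = np.mean(runs, axis=0)          # averaged ("mean Monte Carlo RK") trajectory
print("mean infected peak:", mean_traj[:, 1].max().round(4))
```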
The analysis of time series is one of the mathematical and statistical methods for explaining natural phenomena and their behaviour over a specific time period.
Because the study of time series proceeds by building and analysing models and then forecasting, it is given priority in practice in different fields; therefore, the identification and selection of the model is of great importance in spite of its difficulties.
The selection of a standard method makes it possible to estimate the errors in the estimated parameters of the model, so that there is a balance between the suitability and the simplicity of the model.
In the analysis of d…
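The point about balancing model suitability against simplicity is often operationalized with an information criterion; as a generic illustration (not the paper's procedure), the sketch below fits autoregressive models of increasing order by least squares and picks the order with the smallest AIC on a simulated AR(2) series.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an AR(2) series so the "true" order is known.
n, phi = 300, (0.6, -0.3)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi[0] * y[t - 1] + phi[1] * y[t - 2] + rng.normal()

def ar_aic(y, p):
    """Fit AR(p) by least squares and return AIC = m*log(RSS/m) + 2*(p + 1)."""
    Y = y[p:]
    X = np.column_stack([y[p - i: len(y) - i] for i in range(1, p + 1)])
    coef, rss, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = rss[0] if len(rss) else np.sum((Y - X @ coef) ** 2)
    m = len(Y)
    return m * np.log(rss / m) + 2 * (p + 1)

aics = {p: ar_aic(y, p) for p in range(1, 6)}
print("AIC by order:", {p: round(a, 1) for p, a in aics.items()})
print("selected order:", min(aics, key=aics.get))
```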
In this research, several estimators are introduced. These estimators are closely related to the hazard function, using one of the nonparametric methods, namely the kernel function, for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results have shown that the local bandwidth is the best for all types of boundary kernel functions.
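A minimal sketch of a kernel hazard estimator for right-censored data: smooth the Nelson-Aalen increments with an Epanechnikov kernel and a single global bandwidth. The paper's proposed function, local bandwidths and boundary corrections are not reproduced here; the data and bandwidth are simulated assumptions.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75*(1 - u^2) on [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(t_grid, times, events, h):
    """Kernel hazard estimate: smooth the Nelson-Aalen increments d_i / Y(t_i)
    with a kernel of (global) bandwidth h."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                       # Y(t_(i)) assuming no ties
    increments = events / at_risk                    # Nelson-Aalen jump sizes
    out = []
    for t in t_grid:
        out.append(np.sum(epanechnikov((t - times) / h) * increments) / h)
    return np.array(out)

rng = np.random.default_rng(5)
true_t = rng.exponential(2.0, 200)                   # event times
cens_t = rng.exponential(3.0, 200)                   # censoring times
obs = np.minimum(true_t, cens_t)
delta = (true_t <= cens_t).astype(float)
grid = np.linspace(0.5, 4.0, 8)
print(np.round(kernel_hazard(grid, obs, delta, h=0.7), 3))   # true hazard is 0.5
```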
This article aims to estimate the partially linear model using two methods, the wavelet and kernel smoothers. Simulation experiments are used to study the small-sample behaviour for different functions, sample sizes and variances. The results showed that the wavelet smoother is the best according to the mean squared error criterion in all the cases considered.
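To make the kernel-smoother route concrete (the wavelet route and the paper's exact estimator are not reproduced), the sketch below estimates a partially linear model y = xβ + g(t) + ε with a Robinson-type double-residual step built on Nadaraya-Watson smoothing; the data-generating process and bandwidth are invented.

```python
import numpy as np

def nw_smooth(t0, t, v, h):
    """Nadaraya-Watson smoother of v against t, evaluated at the points t0."""
    k = np.exp(-0.5 * ((t0[:, None] - t[None, :]) / h) ** 2)
    return (k @ v) / k.sum(axis=1)

rng = np.random.default_rng(11)
n = 200
t = np.sort(rng.uniform(0, 1, n))
x = rng.normal(size=n)
beta_true = 1.5
g = np.sin(2 * np.pi * t)                          # unknown smooth component
y = beta_true * x + g + rng.normal(0, 0.3, n)

h = 0.05
# Double-residual idea: remove the smooth-in-t part of y and x, then estimate
# beta from the residual-on-residual regression; finally back out g.
y_res = y - nw_smooth(t, t, y, h)
x_res = x - nw_smooth(t, t, x, h)
beta_hat = np.sum(x_res * y_res) / np.sum(x_res ** 2)
g_hat = nw_smooth(t, t, y - beta_hat * x, h)       # nonparametric part
print("beta_hat:", round(beta_hat, 3))
```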