Using the Maximum Likelihood and Bayesian Methods to Estimate the Time-Rate Function of the Earthquake Phenomenon

In this research we study the Non-Homogeneous Poisson Process (NHPP), one of the most important statistical models with a role in scientific development, since it describes events that occur in reality and are modelled as Poisson processes because their occurrence depends on time, whether the rate changes with time or remains stable. We use one of these process models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function for estimating the time rate of earthquake occurrence in Erbil Governorate. The governorate is adjacent to two countries that lie within the seismic belt, Turkey and Iran, so the arrival of seismic activity cannot be ruled out. The three parameters of the model were estimated in two ways, the maximum likelihood method and the Bayesian method, to find the mean time rate of occurrence of this phenomenon, and the mean squared error (MSE) was used to determine which method best estimates the model parameters. The results show that the Bayesian method is the better estimation method.
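As a hedged illustration only: a minimal sketch of fitting an NHPP rate function by maximum likelihood, assuming the intensity is the hazard of the exponentiated-Weibull distribution. The exact rate parameterization used in the paper is not reproduced here, and the event times below are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def ew_cdf(t, a, b, s):
    # exponentiated-Weibull CDF with shapes a, b and scale s
    return (1.0 - np.exp(-(t / s) ** b)) ** a

def ew_pdf(t, a, b, s):
    z = (t / s) ** b
    return a * (b / s) * (t / s) ** (b - 1) * np.exp(-z) * (1.0 - np.exp(-z)) ** (a - 1)

def neg_loglik(params, times, horizon):
    a, b, s = params
    if min(a, b, s) <= 0:
        return np.inf
    lam = ew_pdf(times, a, b, s) / (1.0 - ew_cdf(times, a, b, s))  # intensity = hazard
    Lam = -np.log(1.0 - ew_cdf(horizon, a, b, s))                  # cumulative intensity on (0, horizon]
    ll = np.sum(np.log(lam)) - Lam
    return -ll if np.isfinite(ll) else np.inf

times = np.sort(np.random.uniform(0.0, 10.0, 25))  # placeholder event times, not real data
fit = minimize(neg_loglik, x0=[1.0, 1.0, 5.0], args=(times, 10.0), method="Nelder-Mead")
print(fit.x)  # MLE of (alpha, beta, sigma)
```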

Publication Date: Sat Sep 01 2018
Journal Name: Journal of Accounting and Financial Studies (JAFS)
The Planning for Reducing the Phenomenon of Tax Evasion by Developing the Role of the Equitable Distribution of Tax Burdens on Taxpayers: An Applied Research in the General Commission of Taxes

The tax base is one of the foundations of the technical organization of taxation, and a good choice of tax base affects both the tax yield and its fairness. With the expansion of the tax range, a dangerous phenomenon called tax evasion has emerged; it threatens the economies of countries and prevents the state from achieving the economic, political, and social objectives it seeks, so the state must resolve this phenomenon by mobilizing all human and material potential and identifying the real reasons that lie behind it. The researcher found that the tax authorities are weak in their technical, material, and financial capacities, and the analysis of the data shows that there is a significant reve…
Publication Date: Thu Apr 30 2020
Journal Name: Journal of Economics and Administrative Sciences
Comparison of the Branch and Bound Algorithm with the Penalty Function Method for Solving Non-linear Bi-level Programming, with an Application

The bi-level programming problem is to minimize or maximize an objective function while another objective function sits within its constraints. The problem has received a great deal of attention in the programming community because of the proliferation of applications and the use of evolutionary algorithms in addressing it. Two methods for non-linear bi-level programming are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the Branch and Bound algorithm is preferable for solving the non-linear bi-level programming problem because it gave better results.
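For orientation, here is a minimal sketch of the penalty-function idea on a hypothetical toy problem (not the paper's formulation): the follower's stationarity condition is folded into the leader's objective and the penalty weight is tightened gradually.

```python
import numpy as np
from scipy.optimize import minimize

# Toy nonlinear bi-level problem (hypothetical example):
#   leader:   min_x  F(x, y) = (x - 1)^2 + (y - 1)^2
#   follower: min_y  f(x, y) = (y - x)^2
# Penalty reformulation: penalize violation of the follower's
# stationarity condition df/dy = 0 with weight mu.

def penalized(z, mu):
    x, y = z
    F = (x - 1.0) ** 2 + (y - 1.0) ** 2
    grad_y = 2.0 * (y - x)  # df/dy of the follower's problem
    return F + mu * grad_y ** 2

z = np.array([0.0, 0.0])
for mu in [1.0, 10.0, 100.0, 1000.0]:  # gradually tighten the penalty
    z = minimize(penalized, z, args=(mu,), method="BFGS").x
print(z)  # approaches the bi-level solution x = y = 1
```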

Publication Date: Sun Jan 20 2019
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Bayesian Estimation for Two Parameters of Gamma Distribution Under Precautionary Loss Function

In the current study, the researchers obtained Bayes estimators for the shape and scale parameters of the Gamma distribution under the precautionary loss function, assuming Gamma and Exponential priors for the shape and scale parameters, respectively. The moment and maximum likelihood estimators and Lindley's approximation were used in the Bayesian estimation.

Based on the Monte Carlo simulation method, these estimators are compared by their mean squared errors (MSEs). The results show that the performance of the Bayes estimator under the precautionary loss function with Gamma and Exponential priors is better than that of the other estimators in all cases.
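For reference, the precautionary loss function and the Bayes estimator it induces take the standard form (a well-known result, stated here for context rather than taken from the paper):

```latex
L(\hat{\theta}, \theta) = \frac{(\hat{\theta} - \theta)^{2}}{\hat{\theta}},
\qquad
\hat{\theta}_{\mathrm{Bayes}} = \sqrt{\mathbb{E}\!\left[\theta^{2} \mid \mathbf{x}\right]}
```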

Publication Date: Sun Dec 01 2019
Journal Name: Journal of Economics and Administrative Sciences
Using Generalized Pareto Survival Models to Estimate the Optimal Survival Time for Myocardial Infarction Patients

Survival analysis is one of the modern methods of analysis; it rests on the fact that the dependent variable represents the time until the event of interest occurs. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time and a nonparametric function that does depend on survival times, which is why the Cox model is classed as semi-parametric. The set of parametric models that depend on the parameters of the time-to-event distribution, such as…
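The Cox model referred to above specifies the hazard as h(t | x) = h₀(t)·exp(xᵀβ). A minimal fitting sketch with the lifelines Python package follows; the data frame and its column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical patient data: follow-up time, event indicator (1 = event
# observed, 0 = censored), and one explanatory covariate.
df = pd.DataFrame({
    "time":  [5, 8, 12, 20, 33, 40],
    "event": [1, 0, 1, 1, 0, 1],
    "age":   [70, 55, 64, 58, 49, 72],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # partial-likelihood fit
cph.print_summary()  # estimated log-hazard ratio for age
```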
Publication Date: Thu Jun 30 2022
Journal Name: Journal of Economics and Administrative Sciences
Comparing Some Robust Non-Parametric Methods for Semi-Parametric Regression Model Estimation

In this research, some robust non-parametric methods were used to estimate the semi-parametric regression model. The methods were then compared using the MSE criterion across different sample sizes, levels of variance, contamination rates, and three different models. The methods are S-LLS (S-estimation with local linear smoothing), M-LLS (M-estimation with local linear smoothing), S-NW (S-estimation with Nadaraya-Watson smoothing), and M-NW (M-estimation with Nadaraya-Watson smoothing).

The results for the first model showed that the S-LLS method was the best for large sample sizes, while for small sample sizes the…
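As a sketch of one of the listed ideas, M-estimation combined with Nadaraya-Watson smoothing can be illustrated by iteratively down-weighting large residuals. The Huber weights and toy data below are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with bandwidth h."""
    w = epanechnikov((x0 - x) / h)
    return np.sum(w * y) / np.sum(w)

def m_nadaraya_watson(x0, x, y, h, c=1.345, iters=10):
    """M-type robustification: reweight by Huber weights on residuals."""
    yhat = nadaraya_watson(x0, x, y, h)
    for _ in range(iters):
        r = y - yhat
        s = np.median(np.abs(r)) / 0.6745 + 1e-12          # robust scale
        huber = np.minimum(1.0, c / (np.abs(r / s) + 1e-12))
        w = epanechnikov((x0 - x) / h) * huber
        yhat = np.sum(w * y) / np.sum(w)
    return yhat

x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + np.random.normal(0, 0.2, 100)
y[::17] += 3.0  # inject outliers (contamination)
print(m_nadaraya_watson(0.5, x, y, h=0.1))
```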
Publication Date: Sun Jan 01 2023
Journal Name: International Journal of Physiology, Nutrition and Physical Education
The effect of effort perception training according to race speed rhythm control for developing speed endurance, adapting maximum heart rate, and achieving 3000 m running/hurdles for men
Publication Date: Mon Aug 21 2023
Journal Name: Communications in Mathematical Biology and Neuroscience
New Techniques to Estimate the Solution of an Autonomous System

This research aims to solve a nonlinear model formulated as a system of differential equations with an initial value problem (IVP), represented by a COVID-19 mathematical epidemiology model, using a new approach: proposed Approximate Shrunken estimators, which combine a classic numerical method with numerical simulation techniques through an effective statistical shrunken-estimation formula. Two numerical simulation methods are used first to solve this model: the Mean Monte Carlo Runge-Kutta and the Mean Latin Hypercube Runge-Kutta methods. Then two approximate simulation methods are proposed for the current study. The results of the proposed approximate shrunken methods and the numerical…
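A minimal sketch of the classical numerical ingredient, a fourth-order Runge-Kutta step applied to an IVP. The SIR-type system and its rate constants below merely stand in for the paper's COVID-19 model, which is not reproduced here.

```python
import numpy as np

def rk4_step(f, t, y, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def sir(t, y, beta=0.3, gamma=0.1):
    # Hypothetical SIR-type epidemic system; beta, gamma are illustrative.
    S, I, R = y
    return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

y = np.array([0.99, 0.01, 0.0])  # initial value problem (IVP)
t, dt = 0.0, 0.1
while t < 100.0:
    y = rk4_step(sir, t, y, dt)
    t += dt
print(y)  # final (S, I, R) proportions
```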
Publication Date: Sat Dec 01 2007
Journal Name: Journal of Economics and Administrative Sciences
The analysis of time series is one of the mathematical and statistical methods for explaining natural phenomena and their behavior over a specific time period.

The analysis of time series is one of the mathematical and statistical methods for explaining natural phenomena and their behavior over a specific time period.

Because the study of time series proceeds by building and analyzing models and then forecasting, which gives it priority for practical work in different fields, the identification and selection of the model is of great importance despite its difficulty.

The selection of a standard method provides the ability to estimate the errors in the estimated parameters of the model and to strike a balance between the suitability and the simplicity of the model.

In the analysis of d…
Publication Date: Tue Jan 01 2019
Journal Name: Baghdad Science Journal
Hazard Rate Estimation Using a Varying Kernel Function for Type I Censored Data

In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for Type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function is employed with all of them. Two different simulation techniques are used in two experiments to compare the estimators. In most cases, the results prove that the local bandwidth is the best for all types of boundary kernel func…
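A minimal sketch of a kernel-smoothed hazard estimate for censored data, built from Nelson-Aalen increments with a global bandwidth and an Epanechnikov kernel. The censoring mechanism below is a crude placeholder rather than the paper's Type I design.

```python
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(t, times, events, h):
    """Kernel-smoothed Nelson-Aalen hazard estimate at time t.

    times  : observed times (event or censoring), sorted ascending
    events : 1 if an event was observed, 0 if censored
    h      : global bandwidth
    """
    n = len(times)
    at_risk = n - np.arange(n)      # risk-set size at each ordered time
    increments = events / at_risk   # Nelson-Aalen jumps
    return np.sum(epanechnikov((t - times) / h) / h * increments)

times = np.sort(np.random.exponential(1.0, 200))   # placeholder survival times
events = np.random.binomial(1, 0.8, 200)           # placeholder censoring indicator
print(kernel_hazard(1.0, times, events, h=0.3))
```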
Publication Date: Thu Apr 30 2020
Journal Name: Journal of Economics and Administrative Sciences
Estimate the Partial Linear Model Using Wavelet and Kernel Smoothers

This article aims to estimate the partially linear model using two methods, the wavelet and kernel smoothers. Simulation experiments are used to study the small-sample behavior under different functions, sample sizes, and variances. The results show that the wavelet smoother is the best according to the mean average squared error criterion in all cases considered.
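A minimal sketch of the wavelet smoother applied to the nonparametric part of a partially linear model, using the PyWavelets package with a universal soft threshold. The model, wavelet choice, and data are illustrative assumptions, not the paper's setup.

```python
import numpy as np
import pywt  # PyWavelets

# Hypothetical data from a partially linear model y = x*beta + g(t) + noise;
# here we only sketch the wavelet smoothing of the nonparametric part g.
t = np.linspace(0, 1, 256)
y = np.sin(4 * np.pi * t) + np.random.normal(0, 0.3, 256)

coeffs = pywt.wavedec(y, "db4", level=4)                 # decompose
sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise scale from finest level
thresh = sigma * np.sqrt(2 * np.log(len(y)))             # universal threshold
coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
g_hat = pywt.waverec(coeffs, "db4")                      # reconstructed smooth estimate of g
```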

 

 
