We derive Bayes estimators of the unknown scale parameter of the Erlang distribution when the shape parameter is known, assuming different informative priors for the scale parameter. The posterior density, posterior mean, and posterior variance are obtained under four informative priors: the inverse exponential, the inverse chi-square, the inverse gamma, and the standard Lévy distributions. Bayes estimators are then derived under the general entropy loss function (GELF), and a simulation study is used to obtain the results. Different cases of the Erlang model parameters were generated for various sample sizes, and the estimates were compared in terms of their mean squared error (MSE). We conclude that, according to the smallest MSE values, the best estimator of the scale parameter of the Erlang distribution under GELF, for shape parameter c = 1, 2, 3 and for all sample sizes n, is the one based on the inverse gamma prior.
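The inverse-gamma prior is conjugate for the Erlang scale (with shape c known), so the GELF estimator has a closed form. The following sketch, with an illustrative function name and example hyperparameters not taken from the paper, shows one way to compute it:

```python
import numpy as np
from math import lgamma, exp

def erlang_gelf_bayes(x, c, alpha, beta, k=1.0):
    """Bayes estimate of the Erlang scale theta under the general entropy
    loss (GELF) with an inverse-gamma(alpha, beta) prior.

    With shape c known, the posterior is inverse-gamma(alpha + n*c,
    beta + sum x), and the GELF estimator is [E(theta^-k)]^(-1/k)
    = (beta + sum x) * (Gamma(a) / Gamma(a + k))^(1/k), a = alpha + n*c.
    """
    a = alpha + c * len(x)
    # compute the gamma-function ratio on the log scale for stability
    return (beta + x.sum()) * exp((lgamma(a) - lgamma(a + k)) / k)
```

For k = 1 the gamma ratio is 1/a, so the estimator reduces to (beta + Σx)/(alpha + nc), i.e. a shrunken version of the maximum likelihood estimate Σx/(nc).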
In this paper, wavelets are used to study multivariate fractional Brownian motion through the deviations of the random process in order to obtain an efficient estimator of the Hurst exponent. Simulation experiments show that the proposed estimator performs efficiently. The estimation exploits the stationarity of the detail coefficients of the wavelet transform, whose variance exhibits power-law behavior. Two wavelet filters (Haar and db5) are used to minimize the mean squared error of the model.
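The power-law step can be illustrated with a plain-NumPy Haar pyramid (no wavelet library): for fBm the variance of the level-j detail coefficients behaves like 2^{j(2H+1)}, so regressing the log2-variance on the level recovers H. Function names here are illustrative, and this Haar-only sketch does not cover the longer db5 filter used in the paper:

```python
import numpy as np

def haar_detail_variances(x, levels):
    """Orthonormal Haar pyramid: return the detail-coefficient variance
    at each decomposition level 1..levels."""
    a = np.asarray(x, dtype=float)
    out = []
    for _ in range(levels):
        n = len(a) // 2 * 2
        d = (a[0:n:2] - a[1:n:2]) / np.sqrt(2.0)   # detail coefficients
        a = (a[0:n:2] + a[1:n:2]) / np.sqrt(2.0)   # approximation, carried down
        out.append(d.var())
    return np.array(out)

def estimate_hurst(x, j_min=3, j_max=7):
    """Slope of log2 Var(d_j) vs j equals 2H + 1 for fBm, so
    H = (slope - 1) / 2.  The finest levels are skipped because
    discretization biases them."""
    v = haar_detail_variances(x, j_max)
    j = np.arange(j_min, j_max + 1)
    slope = np.polyfit(j, np.log2(v[j_min - 1:j_max]), 1)[0]
    return (slope - 1.0) / 2.0
```

A quick sanity check is standard Brownian motion (H = 0.5), built as the cumulative sum of white noise.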
This study addresses the impact of the marketing knowledge dimensions (product, price, promotion, distribution) on organizational performance in relation to a number of variables (efficiency, effectiveness, market share, customer satisfaction), and seeks to reveal the role of marketing knowledge in organizational performance.
To achieve the objective of the study, the researcher adopted a hypothetical model that reflects the logical relationships between the study variables. To reveal the nature of these relationships, several hypotheses were presented as tentative solutions, and the study seeks to verify their validity.
In this research we estimate the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), where the data for five-year age groups follow the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME), and a bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function, alongside the traditional maximum likelihood (ML) method. The comparison was made on the basis of the method of the Cen
Abstract
The research aims to identify tax exemptions, their objectives and types, to shed light on the concept of sustainable development with its objectives, dimensions, and indicators (economic, social, and environmental), to analyze the relationship between tax exemptions and economic development, and to measure and analyze the impact of tax exemptions on economic development in Iraq for the period 2015-2021 using the NARDL model. The research problem centers on the fact that the failure to employ fiscal policy tools correctly led to weakness in achieving economic justice, which in turn leads to a failure to improve social welfare.
Abstract
The Rayleigh distribution is one of the important distributions used for analyzing lifetime data, with applications in reliability studies and physical interpretations. This paper presents four different methods for estimating the scale parameter and the reliability function of the generalized Rayleigh distribution: maximum likelihood, Bayes, modified Bayes, and the minimax estimator under a squared error loss function. The comparison is carried out through a simulation procedure.
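For the ordinary Rayleigh case the maximum likelihood step has a closed form, which makes a convenient baseline. This is a minimal illustrative sketch (function names are ours, and it covers only the ML estimator, not the Bayes, modified Bayes, or minimax variants):

```python
import numpy as np

def rayleigh_mle_scale(x):
    # closed-form MLE for Rayleigh scale: sigma_hat^2 = sum(x^2) / (2n)
    return np.sqrt(np.sum(x**2) / (2 * len(x)))

def rayleigh_reliability(t, sigma):
    # survival function R(t) = exp(-t^2 / (2 sigma^2))
    return np.exp(-t**2 / (2 * sigma**2))
```

Plugging the MLE of sigma into R(t) gives the plug-in reliability estimate against which the Bayes-type estimators would be compared in a simulation.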
In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function of the negative exponential distribution are derived; a Monte Carlo simulation technique is then employed to compare their performance, using the integral mean square error (IMSE) as the comparison criterion. The simulation results show that the Bayes estimator outperforms the maximum likelihood estimator for the different sample sizes considered.
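A comparison of this kind can be sketched as follows. The gamma prior on the rate and the hyperparameter values are illustrative assumptions (the abstract does not state the prior), and a pointwise MSE at one t stands in for the full IMSE:

```python
import numpy as np

def mle_reliability(x, t):
    lam = 1.0 / x.mean()                       # MLE of the exponential rate
    return np.exp(-lam * t)

def bayes_reliability(x, t, a=1.0, b=1.0):
    # assumed gamma(a, b) prior on the rate; posterior is gamma(a + n, b + sum x),
    # and E[exp(-lam*t) | data] has this closed form (gamma Laplace transform)
    return (1.0 + t / (b + x.sum())) ** -(a + len(x))

def mc_mse(est, lam=0.5, n=20, t=1.0, reps=3000, seed=0):
    """Monte Carlo MSE of a reliability estimator at a single time t."""
    rng = np.random.default_rng(seed)
    true_R = np.exp(-lam * t)
    errs = [est(rng.exponential(1.0 / lam, n), t) - true_R for _ in range(reps)]
    return float(np.mean(np.square(errs)))
```

Averaging such pointwise MSEs over a grid of t values approximates the IMSE criterion the paper uses.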
In this research, we study the non-homogeneous Poisson process, one of the most important statistical topics contributing to scientific development, as it concerns events occurring in reality that are modeled as Poisson processes, since the occurrence of such events is related to time, whether time changes or remains stable. The research clarifies the non-homogeneous Poisson process and uses one of its models, an exponentiated Weibull model with three parameters (α, β, σ), as a function for estimating the time-varying rate of occurrence of earthquakes in Erbil Governorate, since the governorate is adjacent to two countries
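A non-homogeneous Poisson process with a given intensity can be simulated by Lewis-Shedler thinning. The sketch below uses a simple Weibull-type power-law intensity as an illustrative stand-in, since the paper's exponentiated Weibull rate function is not reproduced in the abstract:

```python
import numpy as np

def simulate_nhpp(rate, T, rate_max, rng):
    """Lewis-Shedler thinning on [0, T]: generate homogeneous candidate
    events at rate_max, accept each with probability rate(t) / rate_max.
    rate_max must dominate rate(t) on [0, T]."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > T:
            return np.array(events)
        if rng.random() < rate(t) / rate_max:
            events.append(t)
```

For an increasing intensity the bound rate_max = rate(T) suffices, and the expected number of events on [0, T] is the integrated intensity Λ(T).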
The present paper concerns the problem of estimating system reliability in the stress-strength model, under the assumption that stress and strength are non-identical, independent, and follow the Lomax distribution. Various shrinkage estimation methods were employed in this context, based on maximum likelihood, the method of moments, and shrinkage weight factors, using Monte Carlo simulation. Comparisons among the suggested estimation methods were made using the mean absolute percentage error criterion, implemented in MATLAB.
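The maximum likelihood component the shrinkage methods build on can be sketched as follows. This assumes both samples are Lomax with a common known (unit) scale, in which case R = P(strength > stress) = a_stress / (a_stress + a_strength); function names are illustrative, and the shrinkage weighting itself is not reproduced:

```python
import numpy as np

def lomax_shape_mle(x):
    # MLE of the Lomax shape with known unit scale: a_hat = n / sum(log(1 + x))
    return len(x) / np.log1p(x).sum()

def stress_strength_R(stress, strength):
    """Plug-in ML estimate of R = P(strength > stress) when stress and
    strength are independent Lomax variables with common unit scale."""
    ax, ay = lomax_shape_mle(stress), lomax_shape_mle(strength)
    return ax / (ax + ay)
```

NumPy's `Generator.pareto(a)` draws exactly this Lomax(a, scale=1) distribution, which makes the Monte Carlo check straightforward.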
In this paper, a method is proposed to increase the compression ratio for color images by dividing the image into non-overlapping blocks and applying a different compression ratio to each block depending on the importance of the information it contains. In regions containing important information, the compression ratio is reduced to prevent loss of information, while in smooth regions with little important information a high compression ratio is used. The proposed method shows better results when compared with classical methods (wavelet and DCT).
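The idea of spending fewer transform coefficients on smooth blocks can be sketched with an orthonormal DCT built in plain NumPy. The function names and the keep-k rule are illustrative; the paper's actual block-importance measure and bit allocation are not reproduced here:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix: row j, column k = cos(pi*(2k+1)*j / 2n),
    scaled so that D @ D.T is the identity."""
    k = np.arange(n)
    D = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    D[0] *= 1.0 / np.sqrt(2.0)
    return D * np.sqrt(2.0 / n)

def compress_block(block, keep):
    """Transform a square block, zero all but the `keep` largest-magnitude
    DCT coefficients, and invert.  Smooth blocks survive a small `keep`."""
    D = dct_matrix(block.shape[0])
    coef = D @ block @ D.T
    thresh = np.sort(np.abs(coef).ravel())[-keep]
    coef[np.abs(coef) < thresh] = 0.0
    return D.T @ coef @ D
```

An adaptive scheme in the spirit of the paper would pick `keep` per block from an importance measure such as the block variance: large for busy blocks, small for smooth ones.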