The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modeling extreme events in fields such as hydrology, finance, and environmental sciences. Bayesian methods play a decisive role in estimating the parameters of the GEV distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions, using Monte Carlo simulation. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimating the scale parameter of the Weibull distribution. To evaluate their performance, we generate simulated datasets with different sample sizes and varying parameter values. A pre-estimation shrinkage technique is also proposed to enhance the precision of estimation. The simulation experiments show that the Bayesian shrinkage estimator and the shrinkage pre-estimation estimator under the squared error loss function outperform the other methods, as they yield the smallest mean squared error. Overall, our findings highlight the advantages of shrinkage Bayesian estimation methods for the proposed distribution. Researchers and practitioners in fields that rely on extreme value analysis can benefit from these findings when selecting Bayesian estimation techniques for modeling extreme events accurately and efficiently.
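The abstract does not give the estimators in closed form; the following is a minimal Python sketch of the kind of Monte Carlo comparison described, assuming a known shape parameter, a prior guess theta0 for the scale, and a simple linear shrinkage form k*theta0 + (1-k)*theta_hat. The shrinkage weight, prior guess, and sample sizes are illustrative assumptions, not the authors' choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_mse(n, theta=2.0, beta=1.5, theta0=1.8, k=0.4, reps=5000):
    """Monte Carlo MSE of a plain scale estimator for the Weibull distribution
    versus a simple linear shrinkage estimator toward a prior guess theta0.
    theta: true scale, beta: known shape, k: shrinkage weight (assumed)."""
    mse_plain, mse_shr = 0.0, 0.0
    for _ in range(reps):
        x = theta * rng.weibull(beta, size=n)            # Weibull(shape=beta, scale=theta) sample
        theta_hat = np.mean(x ** beta) ** (1.0 / beta)   # MLE of the scale when the shape is known
        theta_shr = k * theta0 + (1.0 - k) * theta_hat   # shrinkage toward the prior guess
        mse_plain += (theta_hat - theta) ** 2
        mse_shr += (theta_shr - theta) ** 2
    return mse_plain / reps, mse_shr / reps

for n in (10, 30, 100):
    print(n, mc_mse(n))   # the shrinkage estimator typically wins for small n when theta0 is close to theta
```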
This research reviews the least absolute deviations method, based on linear programming, for estimating the parameters of the simple linear regression model, and gives an overview of this model. We propose a modified absolute-deviations method that uses a measure of dispersion and build a simple linear regression model based on the proposed measure. The aim of the work is to obtain estimators that are not affected by outlying values, using a numerical method with as few iterations as possible.
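The abstract does not spell out the linear programming formulation; a standard one splits each residual into nonnegative parts and minimizes their sum. The sketch below uses scipy.optimize.linprog for a simple linear regression y = a + b*x on synthetic data with one outlier; it illustrates the generic LAD linear program, not necessarily the authors' exact setup.

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(x, y):
    """Least absolute deviations fit of y = a + b*x via linear programming.
    Each residual is split as r_i = u_i - v_i with u_i, v_i >= 0,
    and sum(u_i + v_i) is minimized subject to a + b*x_i + u_i - v_i = y_i."""
    n = len(x)
    # variable order: [a, b, u_1..u_n, v_1..v_n]
    c = np.concatenate([[0.0, 0.0], np.ones(n), np.ones(n)])
    A_eq = np.hstack([np.ones((n, 1)), x.reshape(-1, 1), np.eye(n), -np.eye(n)])
    bounds = [(None, None), (None, None)] + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    a, b = res.x[:2]
    return a, b

# illustrative data: true line y = 1 + 2x with one outlier that would pull least squares off
x = np.arange(10, dtype=float)
y = 1.0 + 2.0 * x + np.random.default_rng(1).normal(0, 0.3, 10)
y[7] += 20.0
print(lad_fit(x, y))
```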
The article aims to study the liquidity that the bank is required to provide optimally and the profitability that it is required to achieve, and the effect of both liquidity and profitability on the value of the bank. Hence the research problem emerged, namely the extent to which liquidity and profitability affect the value of the bank. The importance of the research stems from the main role that commercial banks play in a country's economy. This requires identifying liquidity in a broad way along with its most important components, and how to…
The Normalized Difference Vegetation Index (NDVI) is commonly used as a measure of land surface greenness, based on the assumption that the NDVI value is positively proportional to the amount of green vegetation in an image pixel area. An NDVI data set derived from Landsat remote sensing imagery is used to estimate the area of plant cover in the region west of Baghdad during 1990-2001. The results show that between 1990 and 2001 the vegetated area in this region increased from 44760.25 hectares to 75410.67 hectares, while the exposed (bare) area decreased.
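NDVI itself is the normalized difference between the near-infrared and red reflectances. A minimal Python sketch of the per-pixel computation and a simple vegetated-area estimate follows, assuming the red and NIR bands are already loaded as arrays; the NDVI threshold and the 30 m Landsat pixel size are illustrative assumptions not stated in the abstract.

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), with division by zero guarded."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    return np.where(denom != 0, (nir - red) / denom, 0.0)

def vegetated_area_ha(nir, red, threshold=0.2, pixel_size_m=30.0):
    """Count pixels whose NDVI exceeds an (assumed) threshold and convert to hectares."""
    mask = ndvi(nir, red) > threshold
    return mask.sum() * (pixel_size_m ** 2) / 10_000.0

# illustrative synthetic reflectance bands in place of real Landsat data
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.3, size=(100, 100))
nir = rng.uniform(0.1, 0.6, size=(100, 100))
print(vegetated_area_ha(nir, red))
```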
The purpose of this paper is to model and forecast white oil prices over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated for the return series, and the mean and volatility are forecast by quasi maximum likelihood (QML) as the traditional method, while the competing approach is a machine learning method, Support Vector Regression (SVR). The most appropriate model for forecasting volatility is selected from several candidates based on the lowest values of the Akaike and Schwarz information criteria, together with the significance of the parameters. In addition, the residuals…
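The abstract does not describe the SVR setup. A minimal sketch, assuming the volatility proxy is the squared return and the features are a few lagged squared returns, is shown below using scikit-learn's SVR; the lag order, kernel, and hyperparameters are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR

def svr_vol_forecast(returns, lags=5):
    """Forecast the next-period volatility proxy (squared return) with SVR on lagged squared returns."""
    r2 = np.asarray(returns) ** 2
    # row t of X holds r2[t..t+lags-1]; the target is the following squared return r2[t+lags]
    X = np.column_stack([r2[i:len(r2) - lags + i] for i in range(lags)])
    y = r2[lags:]
    model = SVR(kernel="rbf", C=1.0, epsilon=0.01).fit(X, y)
    return model.predict(r2[-lags:].reshape(1, -1))[0]

# illustrative return series with volatility clustering
rng = np.random.default_rng(0)
vol = np.abs(np.sin(np.linspace(0, 6, 500))) * 0.02 + 0.005
returns = rng.normal(0, vol)
print(svr_vol_forecast(returns))
```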
The research considers the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on spatial regression models that account for spatial dependence, whose presence or absence can be tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, since these models are the link between the usual regression models and time-series models. The spatial analysis is applied to the Iraq Household Socio-Economic Survey (IHS…
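Moran's statistic for spatial autocorrelation can be computed directly from the spatial weights matrix. A minimal numpy sketch of Moran's I is given below, using a small illustrative contiguity weights matrix rather than the survey data; the weights and values are assumptions for demonstration only.

```python
import numpy as np

def morans_i(y, W):
    """Moran's I = (n / S0) * (z' W z) / (z' z), where z are deviations from the mean
    and S0 is the sum of all spatial weights."""
    y = np.asarray(y, dtype=float)
    W = np.asarray(W, dtype=float)
    z = y - y.mean()
    S0 = W.sum()
    n = len(y)
    return (n / S0) * (z @ W @ z) / (z @ z)

# illustrative: 4 regions on a line with binary contiguity weights
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = np.array([2.0, 2.5, 6.0, 6.5])   # clustered values -> positive spatial autocorrelation
print(morans_i(y, W))
```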
A simple analytical method was used in the present work for the simultaneous quantification of Ciprofloxacin and Isoniazid in pharmaceutical preparations. UV-Visible spectrophotometry was applied to quantify these compounds in pure and mixture solutions using the first-order derivative method. The method depends on first-derivative spectrophotometry using zero-crossing, peak-to-baseline, peak-to-peak, and peak-area measurements. Good linearity was shown in the concentration ranges of 2 to 24 μg·mL⁻¹ for Ciprofloxacin and 2 to 22 μg·mL⁻¹ for Isoniazid in the mixture, with correlation coefficients of 0.9990 and 0.9989, respectively, using the peak-area mode. The limits of detection (LOD) and limits of quantification (LOQ) were…
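First-derivative spectrophotometry differentiates the absorbance spectrum with respect to wavelength and reads each analyte at a zero-crossing or peak of the derivative curve. The numpy sketch below illustrates the derivative and a linear calibration on a synthetic Gaussian absorbance band; the band position, measurement wavelength, and concentrations are assumptions, not the paper's experimental values.

```python
import numpy as np

wavelengths = np.linspace(200.0, 400.0, 401)            # nm, 0.5 nm step

def absorbance(conc, center=280.0, width=15.0):
    """Synthetic Gaussian absorbance band proportional to concentration (Beer-Lambert-like)."""
    return conc * np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

def first_derivative(spectrum):
    """First derivative dA/dlambda of the spectrum with respect to wavelength."""
    return np.gradient(spectrum, wavelengths)

# calibration: derivative amplitude at an assumed measurement wavelength vs. concentration
concs = np.array([2.0, 6.0, 10.0, 14.0, 18.0])           # ug/mL, illustrative
read_at = np.argmin(np.abs(wavelengths - 290.0))         # assumed read-out wavelength
signals = np.array([first_derivative(absorbance(c))[read_at] for c in concs])
slope, intercept = np.polyfit(concs, signals, 1)
print(slope, intercept)                                   # parameters of the linear calibration
```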
The purpose of this paper is to introduce and study the concepts of fuzzy generalized open sets, fuzzy generalized closed sets, and generalized continuous fuzzy proper functions, and to prove results about these concepts.
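The abstract does not state the definitions; a commonly used formulation of fuzzy generalized closed (and, dually, open) sets in a fuzzy topological space $(X,\tau)$, which the paper presumably builds on, reads as follows.

```latex
\begin{itemize}
  \item A fuzzy set $\lambda$ in $(X,\tau)$ is \emph{fuzzy generalized closed} (fuzzy $g$-closed)
        if $\operatorname{cl}(\lambda) \leq \mu$ whenever $\lambda \leq \mu$ and $\mu$ is fuzzy open in $X$.
  \item A fuzzy set $\lambda$ is \emph{fuzzy generalized open} (fuzzy $g$-open)
        if its complement $1-\lambda$ is fuzzy $g$-closed.
\end{itemize}
```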