The Weibull distribution is a member of the Generalized Extreme Value (GEV) family of distributions (the Type III case), and it plays a crucial role in modeling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play an important role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions, using Monte Carlo simulation. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimating the scale parameter of the Weibull distribution; to evaluate them, we generate simulated datasets with different sample sizes and varying parameter values. A pre-estimation shrinkage technique is suggested to enhance the precision of estimation. The simulation experiments showed that the Bayesian shrinkage estimator and the pre-estimation shrinkage estimator under the squared error loss function outperform the other methods because they give the smallest mean squared error. Overall, our findings highlight the advantages of shrinkage Bayesian estimation methods for the proposed distribution. Researchers and practitioners in fields that rely on extreme value analysis can benefit from these findings when selecting appropriate Bayesian estimation techniques for modeling extreme events accurately and efficiently.
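As a rough illustration of the kind of Monte Carlo comparison described above, the following sketch simulates Weibull samples and compares a Bayes estimator with a shrinkage Bayes estimator of the scale-related parameter under squared error loss. It assumes the shape parameter is known, an inverse-gamma prior on theta = lambda^k, and an arbitrary shrinkage weight and prior guess; these settings are illustrative placeholders, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (not the paper's): known shape k, true scale lam,
# inverse-gamma prior IG(a, b) on theta = lam**k, prior guess theta0, weight w.
k, lam, a, b, theta0, w = 2.0, 1.5, 3.0, 2.0, 2.0, 0.3
theta_true = lam ** k

def estimators(x):
    """Bayes and shrinkage-Bayes estimators of theta under squared error loss."""
    n, s = len(x), np.sum(x ** k)              # sufficient statistic: sum of x_i^k
    bayes = (b + s) / (a + n - 1)              # posterior mean of IG(a + n, b + s)
    shrink = w * theta0 + (1.0 - w) * bayes    # shrink toward the prior guess theta0
    return bayes, shrink

def mse(n, reps=5000):
    """Empirical mean squared error of both estimators for sample size n."""
    est = np.array([estimators(lam * rng.weibull(k, size=n)) for _ in range(reps)])
    return np.mean((est - theta_true) ** 2, axis=0)

for n in (10, 30, 100):
    mse_bayes, mse_shrink = mse(n)
    print(f"n={n:4d}  MSE(Bayes)={mse_bayes:.4f}  MSE(shrinkage)={mse_shrink:.4f}")
```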
The concept of separation axioms plays a key role in general topology and in all generalized forms of topologies. Continuing the study of gpα-closed sets, this concept is used to introduce new separation axioms, namely gpα-regular and gpα-normal spaces, and to establish their characterizations. In addition, new spaces, namely gpα-Tk spaces for k = 0, 1, 2, are studied.
This paper deals with testing a numerical solution for the discrete classical optimal control problem governed by a linear hyperbolic boundary value problem with variable coefficients. When the discrete classical control is fixed, the existence and uniqueness theorem for the discrete solution of the discrete weak form is proved. The existence theorem for the discrete classical optimal control and the necessary conditions for optimality of the problem are proved under suitable assumptions. The discrete classical optimal control problem (DCOCP) is solved by using the mixed Galerkin finite element method to find the solution of the discrete weak form (the discrete state). It is also used to find the solution of the discrete adjoint …
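For orientation only, a generic formulation of this class of problems (a quadratic cost constrained by a hyperbolic equation with variable coefficients, together with the adjoint-based optimality condition) might look as follows; the coefficients, cost functional, and boundary conditions are placeholders rather than the paper's exact statement.

$$
\begin{aligned}
& y_{tt} - \nabla\cdot\big(a(x)\,\nabla y\big) + c(x)\,y = f + u \quad \text{in } \Omega\times(0,T), \qquad y = 0 \ \text{on } \partial\Omega\times(0,T), \\
& y(\cdot,0)=y^{0}, \quad y_t(\cdot,0)=y^{1} \qquad \text{(state equation)}, \\
& \min_{u}\; G(u) = \tfrac12\,\|y(u)-y_d\|_{L^2(Q)}^{2} + \tfrac{\beta}{2}\,\|u\|_{L^2(Q)}^{2} \qquad \text{(cost functional)}, \\
& z_{tt} - \nabla\cdot\big(a(x)\,\nabla z\big) + c(x)\,z = y(u)-y_d, \qquad G'(u) = z + \beta\,u \qquad \text{(adjoint and gradient)},
\end{aligned}
$$

where the adjoint state $z$ satisfies homogeneous terminal conditions at $t = T$. In this generic setting, the mixed Galerkin finite element discretization is applied to both the state and the adjoint weak forms, and the resulting discrete gradient drives the computation of the discrete optimal control.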
The purpose of this article is to improve the signal and minimize its noise by studying wavelet transforms and showing how to use the most effective ones for processing and analysis. We outline several transformation techniques, the Lifting Transformation, the Discrete Wavelet Transformation, and the Packet Discrete Wavelet Transformation, along with the methodology for applying them to remove noise from the signal based on the threshold value and the threshold functions. A comparison was made between them using AMSE, and the best was selected. When the aforementioned techniques were applied to actual data represented by prices, it became evident that the Lifting Transformation …
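As a concrete sketch of threshold-based wavelet denoising of a one-dimensional (e.g., price) series, the following uses the PyWavelets library with soft thresholding and the universal threshold; the wavelet ('db4'), the decomposition level, and the toy data are illustrative choices, not the settings used in the article.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold denoising with the discrete wavelet transform."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (MAD / 0.6745),
    # then the universal threshold sigma * sqrt(2 log n).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, value=thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# Toy usage: a noisy price-like series.
t = np.linspace(0, 1, 1024)
clean = np.cumsum(np.sin(8 * np.pi * t)) / 50 + 100
noisy = clean + 0.2 * np.random.default_rng(1).standard_normal(t.size)
smoothed = wavelet_denoise(noisy)
print("AMSE:", np.mean((smoothed - clean) ** 2))
```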
... Show MoreIn this research, we will discuss how to improve the work by dealing with the factors that
participates in enhancing small IT organization to produce the software using the suitable
development process supported by experimental theories to achieve the goals. Starting from
the selecting of the methodology to implement the software. The steps used are and should be
compatible with the type of the products the organization will produce and here it is the Web-Based Project Development.
The researcher suggest Extreme Programming (XP) as a methodology for the Web-Based
Project Development and justifying this suggestion and that will guide to know how the
methodology is very important and effective in the software dev
In recent years, researchers' attention to semi-parametric regression models has increased, because it is possible to integrate the parametric and non-parametric regression models into a single model, forming a regression model with the potential to deal with the curse of dimensionality in non-parametric models, which occurs as the number of explanatory variables involved in the analysis increases and then decreases the accuracy of the estimation. This type of model also has the advantage of flexibility in application compared with parametric models, which must comply with certain conditions such as knowledge of the distribution of the errors, or the parametric models may …
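A common example of such a model, given here only to illustrate the model class (the notation is generic, not necessarily the paper's), is the partially linear model

$$ y_i = x_i^{\top}\beta + g(z_i) + \varepsilon_i, \qquad i = 1,\dots,n, $$

where $x_i^{\top}\beta$ is the parametric (linear) component, $g(\cdot)$ is an unknown smooth function estimated non-parametrically, and $\varepsilon_i$ is the random error. Keeping only $z_i$ in the non-parametric part keeps its dimension low, which is what mitigates the curse of dimensionality mentioned above.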
Health service institutions suffer from challenges resulting from the great changes that our world is witnessing today, and this has affected the value that these institutions add to the patient. This research aims to identify the effect of integrating the techniques of quality function deployment (QFD) and value engineering in the health services provided to the patient, in order to improve value for the patient and thus obtain their satisfaction, which is reflected in the reputation of the surveyed hospitals. To achieve this, the descriptive analytical method was used, and a questionnaire, which represents the measure of this research, was designed to collect the necessary data. The questionnaire was distributed …
In this paper, a generalized spline method is used to solve a linear system of fractional integro-differential equations approximately. The suggested method reduces the system to a system of linear algebraic equations. A test example with different orders of the fractional derivative is given to show the accuracy and applicability of the presented method.
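A generic form of the kind of system referred to above, written with the Caputo fractional derivative and placeholder kernels and right-hand sides (not the paper's test example), is

$$ D^{\alpha_i} y_i(x) = f_i(x) + \sum_{j=1}^{m}\int_{0}^{x} K_{ij}(x,t)\, y_j(t)\,dt, \qquad 0 < \alpha_i \le 1, \quad i = 1,\dots,m, $$

with initial conditions $y_i(0) = y_{i,0}$. Approximating each $y_i$ by a spline with unknown coefficients and collocating at selected points is one way such a system reduces to linear algebraic equations in those coefficients.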
Encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are distributed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, in order to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 and …
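As a small illustration of the entropy measure mentioned above, the following sketch computes the Shannon entropy (in bits per pixel) of an 8-bit grayscale frame from its gray-level histogram; the frames here are synthetic toy examples, not the video data used in the research.

```python
import numpy as np

def shannon_entropy(frame):
    """Shannon entropy (bits per pixel) of an 8-bit grayscale frame's histogram."""
    hist = np.bincount(frame.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()        # gray-level probabilities
    p = p[p > 0]                 # skip empty bins (0 * log 0 is treated as 0)
    return -np.sum(p * np.log2(p))

# Toy usage: pixel values spread over many gray levels give high entropy,
# a nearly constant frame gives low entropy.
rng = np.random.default_rng(0)
uniform_frame = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
flat_frame = np.full((64, 64), 128, dtype=np.uint8)
print(shannon_entropy(uniform_frame))   # close to 8 bits
print(shannon_entropy(flat_frame))      # 0 bits
```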