Comparison of some Bayesian estimation methods for type-I generalized extreme value distribution with simulation

The Weibull distribution is considered a Type-I Generalized Extreme Value (GEV) distribution and plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods are well suited to estimating the parameters of the GEV distribution because they incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential loss functions, using Monte Carlo simulation. The performance of these methods is assessed by their accuracy and computational efficiency in estimating the scale parameter of the Weibull distribution. To evaluate them, we generate simulated datasets with different sample sizes and varying parameter values. A pre-estimation shrinkage technique is also proposed to improve the precision of estimation. The simulation experiments show that the Bayesian shrinkage estimator and the pre-estimation shrinkage estimator under the squared error loss function outperform the other methods, as they yield the smallest mean squared error. Overall, these findings highlight the advantages of shrinkage Bayesian estimation for the proposed distribution, and researchers and practitioners in fields that rely on extreme value analysis can draw on them when selecting Bayesian estimation techniques for modeling extreme events accurately and efficiently.
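As a rough illustration of the kind of comparison the abstract describes, the sketch below contrasts the maximum-likelihood, Bayes (squared-error loss), and shrinkage-Bayes estimators of the Weibull scale-related parameter θ = λ^c by Monte Carlo, assuming a known shape parameter and an inverse-gamma prior. The prior, the prior guess θ0, the shrinkage weight, and the sample sizes are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, c=2.0, lam=1.5, a=3.0, b=4.0, theta0=2.0, w=0.5, reps=5000):
    """Monte Carlo comparison of MLE, Bayes (squared-error loss), and
    shrinkage-Bayes estimators of theta = lam**c for a Weibull(c, lam) sample.
    Prior: theta ~ Inverse-Gamma(a, b); theta0 is a prior guess and w the
    shrinkage weight (all values here are illustrative)."""
    theta = lam ** c
    mse = {"mle": 0.0, "bayes": 0.0, "shrink": 0.0}
    for _ in range(reps):
        x = lam * rng.weibull(c, size=n)        # Weibull(c, lam) sample
        t = np.sum(x ** c)                      # sufficient statistic: X^c ~ Exp(theta)
        est = {
            "mle": t / n,                       # maximum-likelihood estimator of theta
            "bayes": (b + t) / (a + n - 1),     # posterior mean under squared-error loss
        }
        est["shrink"] = w * est["bayes"] + (1 - w) * theta0  # shrink toward theta0
        for k, v in est.items():
            mse[k] += (v - theta) ** 2 / reps
    return mse

for n in (10, 30, 100):
    print(n, simulate(n))
```

In the paper's setting such a comparison would be repeated across both loss functions and a grid of parameter values; only the squared-error case is sketched here.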

Publication Date
Wed Dec 25 2019
Journal Name
Journal Of Engineering
Comparison of Different DEM Generation Methods based on Open Source Datasets

The Digital Elevation Model (DEM) is an established technique for relief representation; DEM construction is the modeling of the earth's surface from existing data. DEMs are a fundamental information requirement widely used in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data are extracted from open-source sources, e.g. Google Earth, and the tested data are compared with data produced by formal institutions such as the General Directorate of Surveying. The study area is in southern Iraq (Al-Gharraf, Dhi Qar governorate). The methods of DEM creation are kri…
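The accuracy assessment described above is commonly done by comparing DEM elevations against reference checkpoints. The minimal sketch below computes the mean error and RMSE of a DEM grid against surveyed points; the grid layout, the nearest-neighbour sampling helper, and the synthetic numbers are hypothetical illustrations, not the paper's data.

```python
import numpy as np

def sample_dem(dem, x, y, x0, y0, cell):
    """Nearest-neighbour sample of a DEM grid at map coordinates (x, y).
    dem: 2-D elevation array; (x0, y0): upper-left corner; cell: pixel size.
    (Hypothetical helper for illustration only.)"""
    col = np.round((x - x0) / cell).astype(int)
    row = np.round((y0 - y) / cell).astype(int)
    return dem[row, col]

def accuracy(dem, checkpoints, x0, y0, cell):
    """Mean error and RMSE of a DEM against reference checkpoints
    given as an (N, 3) array of (x, y, elevation)."""
    z_dem = sample_dem(dem, checkpoints[:, 0], checkpoints[:, 1], x0, y0, cell)
    diff = z_dem - checkpoints[:, 2]
    return diff.mean(), np.sqrt(np.mean(diff ** 2))

# Illustrative use with a synthetic 5x5 grid and three made-up checkpoints.
dem = np.arange(25, dtype=float).reshape(5, 5)
pts = np.array([[10.0, 90.0, 1.2], [30.0, 70.0, 11.8], [40.0, 60.0, 18.5]])
print(accuracy(dem, pts, x0=0.0, y0=100.0, cell=10.0))
```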

Publication Date
Sat Apr 01 2023
Journal Name
Full Text Book Of Minar Congress8
REGULARITY VIA PRE-GENERALIZED OPEN SETS

Using the notions of pre-g-closedness and pre-g-openness, we generalize a class of separation axioms in topological spaces. In particular, we present new types of regularity, which we name ρg-regularity and Sρg-regularity. Many results and properties of both types are investigated and illustrated with examples.
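For context, two standard notions on which pre-g-closedness and pre-g-openness are built can be stated as follows; this is a background sketch only, and the paper's own ρg-regularity and Sρg-regularity definitions are not reproduced here.

```latex
% Standard background notions (stated for context only).
\begin{itemize}
  \item A subset $A$ of a topological space $(X,\tau)$ is \emph{preopen} if
        $A \subseteq \operatorname{int}(\operatorname{cl}(A))$, and \emph{preclosed}
        if $X \setminus A$ is preopen.
  \item $A$ is \emph{g-closed} (generalized closed) if
        $\operatorname{cl}(A) \subseteq U$ whenever $A \subseteq U$ and $U$ is open;
        $A$ is \emph{g-open} if $X \setminus A$ is g-closed.
\end{itemize}
```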

Publication Date
Wed Jul 20 2022
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
A Subject Review on Some Analytical Methods for Determination of Fosfomycin Drugs

Medicines containing fosfomycin are prescribed for urinary tract infections. These drugs are available for oral use as fosfomycin tromethamine and fosfomycin calcium, while fosfomycin sodium and fosfomycin disodium are given intravenously (IV) and intramuscularly (IM). Many quantitative analytical methods have been reported for estimating fosfomycin in blood, urine, plasma, serum, and pharmaceutical dosage forms, including spectrophotometric, mass spectrometric, gas chromatographic, high-performance liquid chromatographic, and electrochemical methods. Here we present a rapid narrative review that discusses and compares the various analytical methods reported for the determination of fosfomycin-containing drugs.

Publication Date
Sun Mar 01 2020
Journal Name
Periodicals Of Engineering And Natural Sciences
Employment of the genetic algorithm in some methods of estimating survival function with application

To obtain good estimates with more accurate results, an appropriate estimation method must be chosen. Most of the equations in the classical methods are linear equations, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution, which in turn leads to optimal estimates of the survival function. The genetic algorithm is employed in the method of moments, the least squares method, and the weighted least squares method…
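A minimal sketch of the idea follows, assuming a least-squares criterion minimised by a toy genetic algorithm for an uncensored sample; censoring, the method of moments, and the weighted variant are omitted, and the population size, mutation rate, and plotting positions are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def weibull_sf(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)**shape)."""
    return np.exp(-(t / scale) ** shape)

def ls_objective(params, t_sorted, f_emp):
    """Least-squares distance between the empirical CDF and the Weibull CDF."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return np.sum((1 - weibull_sf(t_sorted, shape, scale) - f_emp) ** 2)

def genetic_ls_fit(t, pop=60, gens=200, mut=0.1):
    """Fit Weibull (shape, scale) by a toy genetic algorithm minimising
    the least-squares objective.  GA settings are illustrative."""
    t_sorted = np.sort(t)
    n = len(t_sorted)
    f_emp = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # median-rank plotting positions
    population = rng.uniform(0.1, 5.0, size=(pop, 2))     # random initial (shape, scale)
    for _ in range(gens):
        fitness = np.array([ls_objective(p, t_sorted, f_emp) for p in population])
        parents = population[np.argsort(fitness)[: pop // 2]]   # selection: keep best half
        idx = rng.integers(0, len(parents), size=(pop - len(parents), 2))
        children = parents[idx, [0, 1]]                          # crossover: mix genes
        children += rng.normal(0, mut, size=children.shape)      # mutation
        children = np.clip(children, 1e-3, None)
        population = np.vstack([parents, children])
    scores = [ls_objective(p, t_sorted, f_emp) for p in population]
    return population[np.argmin(scores)]

t = 1.5 * rng.weibull(2.0, size=50)        # synthetic (uncensored) lifetimes
shape_hat, scale_hat = genetic_ls_fit(t)
print(shape_hat, scale_hat, weibull_sf(np.median(t), shape_hat, scale_hat))
```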

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
An Approximate Solution for a Two-Point Boundary Value Problem Corresponding to Some Optimal Control

This paper presents a novel method for solving nonlinear optimal control problems of regular type via their equivalent two-point boundary value problems using the non-classical…
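Although the abstract is truncated, the kind of two-point boundary value problem it refers to can be illustrated with a simple linear-quadratic example solved by a standard collocation routine; the problem data below are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Two-point BVP from the necessary conditions of the LQ problem
#   minimise J = integral_0^1 (x^2 + u^2) dt  subject to  x' = u, x(0) = 1, x(1) free,
# which give u* = -p/2,  x' = -p/2,  p' = -2x,  x(0) = 1,  p(1) = 0.

def rhs(t, y):
    x, p = y
    return np.vstack([-p / 2.0, -2.0 * x])

def bc(ya, yb):
    return np.array([ya[0] - 1.0,   # x(0) = 1
                     yb[1]])        # p(1) = 0 (free terminal state)

t = np.linspace(0.0, 1.0, 11)
y0 = np.zeros((2, t.size))          # initial guess for (x, p)
sol = solve_bvp(rhs, bc, t, y0)

u_opt = -sol.sol(t)[1] / 2.0        # recover the optimal control from the costate
print(sol.status, u_opt[:3])
```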

Publication Date
Mon Jun 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of the Shapiro-Wilk test and the Jureckova test using simulation and multiple distributions

Goodness-of-fit tests are intended to verify the null hypothesis that the observations of a sample under study follow a particular probability distribution. Such situations arise very frequently in practical applications across all fields, particularly in genetics, medical, and life-science research. When Shapiro and Wilk proposed their goodness-of-fit test with the scale parameters in 1965…
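As a small illustration of this kind of simulation comparison, the sketch below estimates the empirical size and power of the Shapiro-Wilk test by Monte Carlo; the Jureckova test is not available in SciPy and is omitted, and the sample sizes, significance level, and alternative distribution are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def sw_power(sampler, n, alpha=0.05, reps=2000):
    """Monte Carlo rejection rate of the Shapiro-Wilk normality test
    for samples of size n drawn by `sampler` (illustrative settings)."""
    rejections = 0
    for _ in range(reps):
        x = sampler(n)
        _, p_value = stats.shapiro(x)
        if p_value < alpha:
            rejections += 1
    return rejections / reps

for n in (20, 50, 100):
    size = sw_power(lambda m: rng.normal(size=m), n)        # null: should be near alpha
    power = sw_power(lambda m: rng.exponential(size=m), n)  # alternative: skewed data
    print(n, round(size, 3), round(power, 3))
```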

Publication Date
Sat Jan 01 2022
Journal Name
International Journal Of Agricultural And Statistical Sciences
ON ERROR DISTRIBUTION WITH SINGLE INDEX MODEL

In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and by the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. Simulation experiments are used to compare the two estimators of the error distribution function for different sample sizes; the results show that the kernel distribution function performs better than the empirical distribution function.
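The two estimators of the error distribution function can be sketched as follows for a set of residuals; the RMAVE step that would produce single-index-model residuals is not implemented here, and the bandwidth rule and synthetic data are illustrative.

```python
import numpy as np
from scipy.stats import norm

def edf(residuals, u):
    """Empirical distribution function of the residuals evaluated at points u."""
    r = np.asarray(residuals)
    return np.mean(r[:, None] <= u[None, :], axis=0)

def kernel_cdf(residuals, u, h=None):
    """Kernel (smoothed) distribution function with a Gaussian kernel.
    Bandwidth h defaults to a rough rule of thumb; purely illustrative."""
    r = np.asarray(residuals)
    if h is None:
        h = 1.06 * r.std(ddof=1) * len(r) ** (-1 / 5)
    return np.mean(norm.cdf((u[None, :] - r[:, None]) / h), axis=0)

# Toy residuals standing in for single-index-model errors.
rng = np.random.default_rng(3)
res = rng.normal(0, 1, size=200)
u = np.linspace(-3, 3, 7)
print(np.round(edf(res, u), 3))
print(np.round(kernel_cdf(res, u), 3))
```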

Publication Date
Thu Jan 01 2009
Journal Name
مجلة العلوم الاحصائية
Robust Estimator for Semiparametric Generalized Additive Model

The Generalized Additive Model (GAM) is a multivariate smoother that has appeared recently in nonparametric regression analysis. This research is devoted to studying the mixed situation, i.e. phenomena whose behaviour changes from linear (with a known functional form), represented in the parametric part of the model, to nonlinear (with an unknown functional form, here a smoothing spline), represented in the nonparametric part. Furthermore, we propose a robust semiparametric GAM estimator and compare it with two other existing techniques.
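A minimal backfitting sketch of a semiparametric additive fit follows, combining a parametric linear term with a smoothing-spline term; the robustification the abstract proposes is not reproduced, and the simulated data and smoothing settings are illustrative.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)

# Synthetic data: linear effect of x1 (parametric part) plus a smooth
# nonlinear effect of x2 (nonparametric part), as in a semiparametric GAM.
n = 300
x1 = rng.uniform(-1, 1, n)
x2 = np.sort(rng.uniform(0, 2 * np.pi, n))
y = 2.0 * x1 + np.sin(x2) + rng.normal(0, 0.3, n)

beta, f_hat = 0.0, np.zeros(n)
for _ in range(20):                          # simple backfitting loop
    # Parametric step: OLS slope of the partial residual on x1.
    r1 = y - f_hat
    beta = np.sum(x1 * (r1 - r1.mean())) / np.sum(x1 * (x1 - x1.mean()))
    # Nonparametric step: smoothing spline on the other partial residual.
    r2 = y - beta * x1
    spline = UnivariateSpline(x2, r2, s=n * 0.3 ** 2)
    f_hat = spline(x2) - spline(x2).mean()   # centre for identifiability

print(round(beta, 3))                        # should be close to 2.0
```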

Publication Date
Thu Sep 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
The use of the least squares and weighted least squares methods in estimating the parameters and designing acceptance sampling plans for the generalized exponential distribution

Acceptance sampling plans for the generalized exponential distribution, when the life test is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ), the scale and shape parameters, are estimated by LSE and WLSE, and the best estimators for various sample sizes are used to find the ratio of the true mean time to the pre-determined time and the smallest possible sample size required to ensure the producer's risk with a pre-fixed probability (1 − P*). The results of the estimation and of the sampling plans are provided in tables.

Keywords: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
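A minimal sketch of the classical truncated-life-test calculation behind such plans, assuming the generalized exponential CDF F(x) = (1 − e^(−λx))^α and a given acceptance number; the numerical settings are illustrative and the LSE/WLSE estimation step is not shown.

```python
import numpy as np
from scipy.stats import binom
from scipy.special import digamma

def ge_cdf(x, alpha, lam):
    """CDF of the generalized exponential distribution: (1 - exp(-lam*x))**alpha."""
    return (1.0 - np.exp(-lam * x)) ** alpha

def ge_mean(alpha, lam):
    """Mean of the generalized exponential distribution (Gupta-Kundu form)."""
    return (digamma(alpha + 1.0) + np.euler_gamma) / lam

def min_sample_size(p_star, c, t_ratio, alpha, lam=1.0, n_max=500):
    """Smallest n for which a lot whose true mean equals the specified minimum
    mean is accepted with probability at most 1 - p_star, when the life test is
    truncated at t = t_ratio * (specified mean) and at most c failures are allowed.
    A sketch of the classical truncated-life-test calculation; settings illustrative."""
    p = ge_cdf(t_ratio * ge_mean(alpha, lam), alpha, lam)   # P(item fails before t)
    for n in range(c + 1, n_max + 1):
        if binom.cdf(c, n, p) <= 1.0 - p_star:              # accept if at most c failures
            return n
    return None

# Example: acceptance number c = 2, truncation at half the specified mean life,
# shape alpha = 2, confidence level P* = 0.95 (all values illustrative).
print(min_sample_size(p_star=0.95, c=2, t_ratio=0.5, alpha=2.0))
```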

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

In recent years, researchers have shown increased interest in determining the optimum sample size that yields sufficient accuracy of estimation and high-precision parameters, so that a large number of tests in the field of diagnosis can be evaluated at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated with the sample size given by each method, using an artificial intelligence technique, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data…
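As a small illustration of the Bennett-inequality approach to sample-size determination, the sketch below returns the smallest n for which the inequality guarantees a given precision for the mean of bounded observations; the tolerance, confidence level, and variance bound are illustrative, and the NRI-specific and ANN steps are not shown.

```python
import math

def bennett_sample_size(eps, delta, sigma2, b):
    """Smallest n such that Bennett's inequality guarantees
    P(|sample mean - true mean| >= eps) <= delta for independent
    observations with |X - mu| <= b and variance sigma2.
    (Illustrative use of the inequality; not the paper's exact procedure.)"""
    u = b * eps / sigma2
    h = (1.0 + u) * math.log(1.0 + u) - u            # Bennett's h function
    n = (b * b) / (sigma2 * h) * math.log(2.0 / delta)
    return math.ceil(n)

# Example: tolerance 0.05, confidence 95%, variance 0.25, range bound 1.
print(bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.25, b=1.0))
```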
