New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability density function, and the reliability function of the compound distribution for both natural and contaminated data.
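To make the comparison concrete, the sketch below shows maximum-likelihood estimation driven by the Downhill Simplex (Nelder-Mead) algorithm on clean and artificially contaminated samples. Since the four-parameter compound exponential Weibull-Poisson density is not reproduced in this abstract, a two-parameter Weibull log-likelihood is used as a stand-in; only the `neg_log_lik` function would need to change for the compound model.

```python
# Minimal sketch: maximum-likelihood estimation via the Downhill Simplex
# (Nelder-Mead) algorithm. A two-parameter Weibull log-likelihood stands in
# for the four-parameter compound exponential Weibull-Poisson density, which
# is not reproduced here; only neg_log_lik would need to change.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# "Clean" sample plus a contaminated copy (5% of points inflated tenfold),
# mimicking the polluted-data scenario of the study.
clean = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)
contaminated = clean.copy()
idx = rng.choice(contaminated.size, size=10, replace=False)
contaminated[idx] *= 10.0

def neg_log_lik(params, data):
    shape, scale = params
    if shape <= 0 or scale <= 0:          # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

for label, data in [("clean", clean), ("contaminated", contaminated)]:
    res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
    shape_hat, scale_hat = res.x
    # Reliability function R(t) = 1 - F(t) at a reference time, for comparison.
    r_hat = weibull_min.sf(1.0, c=shape_hat, scale=scale_hat)
    print(f"{label}: shape={shape_hat:.3f}, scale={scale_hat:.3f}, R(1)={r_hat:.3f}")
```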

 

Publication Date
Thu Jun 01 2023
Journal Name
Baghdad Science Journal
Estimation of Parameters for the Gumbel Type-I Distribution under Type-II Censoring Scheme

This paper aims to determine the best parameter estimation methods for the parameters of the Gumbel type-I distribution under the type-II censoring scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. Maximum likelihood estimators are used for the classical procedure, and their asymptotic distributions are derived. Explicit solutions for the Bayesian estimators cannot be obtained, so Markov Chain Monte Carlo and Lindley techniques are used to estimate the unknown parameters. In Bayesian analysis it is very important to determine an appropriate combination of a prior distribution and a loss function; therefore, two different …
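As an illustration of the classical part, the following sketch maximizes the type-II censored Gumbel log-likelihood numerically; the sample size, censoring level, and starting values are illustrative assumptions, not those of the paper.

```python
# Sketch of classical (maximum likelihood) estimation for the Gumbel type-I
# distribution under a type-II censoring scheme: only the r smallest of n
# observations are observed, and the remaining n - r are censored at x_(r).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
n, r = 100, 70
full_sample = np.sort(gumbel_r.rvs(loc=2.0, scale=1.5, size=n, random_state=rng))
observed = full_sample[:r]            # r smallest order statistics
censor_point = observed[-1]           # x_(r)

def neg_log_lik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # Type-II censored log-likelihood:
    #   sum of log f(x_(i)) for the observed failures
    # + (n - r) * log S(x_(r)) for the censored units.
    ll = gumbel_r.logpdf(observed, loc=mu, scale=sigma).sum()
    ll += (n - r) * gumbel_r.logsf(censor_point, loc=mu, scale=sigma)
    return -ll

res = minimize(neg_log_lik, x0=[np.mean(observed), np.std(observed)],
               method="Nelder-Mead")
print("MLE (mu, sigma):", res.x)
```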

Publication Date
Thu May 11 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Estimation of the Parameter of an Exponential Distribution When Applying Maximum Likelihood and Probability Plot Methods Using Simulation

The exponential distribution is probably the most important distribution in reliability work. In this paper, estimation of the scale parameter of an exponential distribution is carried out by employing the maximum likelihood estimator and probability plot methods for different sample sizes. Mean square error is used as an indicator of performance for several assumed values of the parameter, and a computer simulation has been carried out to analyse the obtained results.
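A minimal simulation of this comparison might look as follows; the plotting positions p_i = (i - 0.5)/n and the slope-through-the-origin fit are assumed conventions for the probability plot method, not necessarily the exact ones used in the paper.

```python
# Sketch comparing two estimators of the exponential scale parameter by
# simulated mean squared error: the MLE (sample mean) and a probability-plot
# estimator (least-squares slope of ordered data against exponential quantiles).
import numpy as np

rng = np.random.default_rng(2)
true_scale, n, reps = 2.0, 30, 2000
mle_est, plot_est = [], []

for _ in range(reps):
    x = rng.exponential(scale=true_scale, size=n)
    mle_est.append(x.mean())                      # MLE of the scale parameter

    # Probability plot: ordered data vs theoretical quantiles -log(1 - p_i);
    # the slope of the fit through the origin estimates the scale.
    x_sorted = np.sort(x)
    p = (np.arange(1, n + 1) - 0.5) / n
    q = -np.log1p(-p)
    plot_est.append((q @ x_sorted) / (q @ q))

mse_mle = np.mean((np.array(mle_est) - true_scale) ** 2)
mse_plot = np.mean((np.array(plot_est) - true_scale) ** 2)
print(f"MSE  MLE: {mse_mle:.4f}   probability plot: {mse_plot:.4f}")
```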

Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of Time of Survival Rate by Using Clayton Function for the Exponential Distribution with Practical Application

Each phenomenon involves several variables. By studying these variables we find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of correlation; here the survival function was used to measure the relationship between age and the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influencing variables from a group of variables using factor analysis, and then the Clayton copula function, which builds joint bivariate distributions from the marginal distributions, was applied: the bivariate distribution was calculated, and then the survival function value was calculated for a sample of size 50 drawn from Yarmouk Ho…
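The Clayton construction itself can be sketched as below; the exponential margins, their rate parameters, and θ = 2 are illustrative stand-ins, since the fitted margins for age and creatinine are not given in this abstract.

```python
# Sketch of the Clayton copula used to join two marginal survival functions
# into a bivariate survival function. Exponential margins and theta = 2 are
# illustrative stand-ins for the age / blood-creatinine margins of the study.
import numpy as np

def clayton(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(x, y, theta, rate_x=0.05, rate_y=0.8):
    # Apply the copula to the two marginal survival functions S_X and S_Y
    # (survival-copula construction): S(x, y) = C(S_X(x), S_Y(y)).
    sx = np.exp(-rate_x * x)
    sy = np.exp(-rate_y * y)
    return clayton(sx, sy, theta)

theta = 2.0
# Kendall's tau implied by a Clayton copula: tau = theta / (theta + 2).
print("Kendall's tau:", theta / (theta + 2.0))
print("S(40, 1.2)  =", joint_survival(40.0, 1.2, theta))
```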

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Compare to the conditional logistic regression models with fixed and mixed effects for longitudinal data

Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness-of-fit, and parsimony, and establish equilibrium between bias and variance …
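As a rough illustration of the fixed-effects part, the sketch below maximizes the conditional logistic likelihood for 1:1 matched strata, where the likelihood depends only on within-stratum covariate differences; the simulated covariates and the omission of the mixed (random) effects are simplifying assumptions.

```python
# Sketch of the fixed-effects conditional logistic likelihood for 1:1 matched
# strata; the mixed-effects extension of the study is not reproduced here, and
# the covariates (e.g. oil production, an environmental factor) are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n_strata = 200
true_beta = np.array([0.8, -0.5])

# Two observations per stratum with hypothetical covariates.
x1 = rng.normal(size=(n_strata, 2))
x2 = rng.normal(size=(n_strata, 2))

# Given that exactly one member of each stratum is the "case", the probability
# that it is observation 1 is exp(b'x1) / (exp(b'x1) + exp(b'x2)).
p1 = 1.0 / (1.0 + np.exp((x2 - x1) @ true_beta))
is1 = rng.random(n_strata) < p1
d = np.where(is1[:, None], x1 - x2, x2 - x1)    # case-minus-control differences

def neg_cond_log_lik(beta):
    eta = d @ beta
    # -log of exp(eta) / (1 + exp(eta)) summed over strata, written stably.
    return np.sum(np.logaddexp(0.0, -eta))

res = minimize(neg_cond_log_lik, x0=np.zeros(2), method="BFGS")
print("estimated coefficients:", res.x, " true:", true_beta)
```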

Publication Date
Mon Sep 03 2012
Journal Name
The International Archives Of The Photogrammetry, Remote Sensing And Spatial Information Sciences
CALIBRATION OF FULL-WAVEFORM ALS DATA BASED ON ROBUST INCIDENCE ANGLE ESTIMATION

Abstract. Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy during its travel between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without adopting a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r…
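A deliberately simplified illustration of why range and incidence angle matter is given below; the inverse-square range term, the Lambertian cosine term, and the reference range are assumptions for the sketch and do not represent the paper's full calibration procedure.

```python
# Deliberately simplified sketch of radiometric normalisation of full-waveform
# amplitudes for range and incidence angle, so overlapping flightlines become
# comparable. This is NOT the paper's calibration strategy; the reference range
# and the cosine incidence-angle model are simplifying assumptions.
import numpy as np

def normalise_amplitude(amplitude, sensor_range, incidence_angle_rad,
                        reference_range=1000.0):
    """Scale echo amplitude to a reference range and to normal incidence."""
    range_factor = (sensor_range / reference_range) ** 2   # inverse-square spreading
    angle_factor = 1.0 / np.cos(incidence_angle_rad)        # Lambertian-style correction
    return amplitude * range_factor * angle_factor

# Example: the same target seen from two flightlines with different geometry.
a1 = normalise_amplitude(120.0, sensor_range=850.0, incidence_angle_rad=np.radians(12))
a2 = normalise_amplitude(95.0, sensor_range=1100.0, incidence_angle_rad=np.radians(35))
print(a1, a2)
```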

Publication Date
Fri Oct 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of some of reliability and Hazard estimation methods for Rayleigh logarithmic distribution using simulation with application

The question of estimation has attracted great interest in engineering, statistical, and various applied human-science applications; the methods it provides help to identify many random processes accurately.

In this paper, methods were used to estimate the reliability function, the hazard function, and the distribution parameters; the methods are the moment method and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which of them is competent in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with sample sizes …
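A small simulation in this spirit is sketched below, with a plain Rayleigh distribution standing in for the Rayleigh logarithmic (RL) distribution, whose density is not reproduced in this abstract; the reliability and hazard functions follow directly from the estimated scale.

```python
# Sketch comparing the moment and maximum-likelihood estimators, using a plain
# Rayleigh distribution as a stand-in for the Rayleigh logarithmic (RL)
# distribution. Reliability and hazard follow from the estimated scale.
import numpy as np

rng = np.random.default_rng(4)
true_sigma, n = 2.0, 100
x = rng.rayleigh(scale=true_sigma, size=n)

sigma_mle = np.sqrt(np.sum(x ** 2) / (2 * n))   # MLE of the Rayleigh scale
sigma_mom = x.mean() * np.sqrt(2.0 / np.pi)     # moment estimator, E[X] = sigma*sqrt(pi/2)

def reliability(t, sigma):
    return np.exp(-t ** 2 / (2 * sigma ** 2))   # R(t) = exp(-t^2 / (2 sigma^2))

def hazard(t, sigma):
    return t / sigma ** 2                        # h(t) = f(t) / R(t) = t / sigma^2

t0 = 1.5
for name, s in [("MLE", sigma_mle), ("moments", sigma_mom)]:
    print(f"{name}: sigma={s:.3f}  R({t0})={reliability(t0, s):.3f}  "
          f"h({t0})={hazard(t0, s):.3f}")
```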

Publication Date
Sat Feb 01 2020
Journal Name
Journal Of Economics And Administrative Sciences
Fuzzy Robust Estimation For Location Parameter

In this paper, we introduce three robust fuzzy estimators of a location parameter based on Buckley's approach, in the presence of outliers. These estimators were compared using the variance of fuzzy numbers criterion; all of them outperformed Buckley's estimator. Of these, the fuzzy median was best for small and medium sample sizes, while for large sample sizes the fuzzy trimmed mean was best.
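The crisp counterparts of these estimators behave as sketched below on contaminated data; Buckley's fuzzy-number machinery, which turns each estimate into a fuzzy number, is not reproduced here, and the contamination level is an illustrative assumption.

```python
# Sketch of the crisp counterparts of the robust location estimators on
# outlier-contaminated data; the fuzzy (Buckley-style) versions of the study
# are not reproduced here.
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(5)
clean = rng.normal(loc=10.0, scale=1.0, size=95)
outliers = rng.normal(loc=30.0, scale=1.0, size=5)       # 5% contamination
x = np.concatenate([clean, outliers])

print("sample mean :", x.mean())                          # pulled up by the outliers
print("median      :", np.median(x))                      # robust
print("trimmed mean:", trim_mean(x, proportiontocut=0.1)) # robust, 10% each tail
```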

Publication Date
Tue Mar 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Using Iterative Reweighting Algorithm and Genetic Algorithm to Calculate The Estimation of The Parameters Of The Maximum Likelihood of The Skew Normal Distribution

Excessive skewness, which sometimes occurs in data, is an obstacle to using the normal distribution. Recent studies have therefore been active in studying the skew-normal distribution (SND), which matches skewed data and generalizes the normal distribution by adding a skewness parameter (α), giving it more flexibility. When estimating the parameters of the SND we face non-linear likelihood equations, and solving them directly by the maximum likelihood (ML) method can give inaccurate and unreliable solutions. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M…
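In the spirit of the genetic-algorithm approach, the sketch below maximizes the skew-normal log-likelihood with SciPy's differential evolution, a population-based global optimizer standing in for the GA; the search bounds and simulated sample are illustrative, and the iterative reweighting (IR) variant is not reproduced.

```python
# Sketch of maximum-likelihood estimation of the skew-normal parameters
# (location, scale, skewness alpha) with a population-based global optimizer.
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import skewnorm

rng = np.random.default_rng(6)
x = skewnorm.rvs(a=4.0, loc=1.0, scale=2.0, size=300, random_state=rng)

def neg_log_lik(params):
    loc, scale, alpha = params
    if scale <= 0:
        return np.inf
    return -np.sum(skewnorm.logpdf(x, a=alpha, loc=loc, scale=scale))

# Search box for (loc, scale, alpha); an illustrative choice, not the paper's.
bounds = [(-5.0, 5.0), (0.1, 10.0), (-20.0, 20.0)]
res = differential_evolution(neg_log_lik, bounds, seed=7, tol=1e-8)
print("ML estimates (loc, scale, alpha):", res.x)
```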

Publication Date
Mon Jul 20 2020
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Applying Shrinkage Estimation Technique of P(Y<Max X1, X2,…, Xk) in Case of Generalized Exponential Distribution

This paper is concerned with estimating the reliability R = P(Y < max(X1, X2, …, Xk)) of a K-component parallel stress-strength system with non-identical components subjected to a common stress, when the stress and strength follow the Generalized Exponential Distribution (GED) with unknown shape parameter α and a known, common scale parameter θ (θ = 1). Different shrinkage estimation methods are considered to estimate the reliability, depending on the maximum likelihood estimator and prior estimates, and are compared by simulation using the mean squared error (MSE) criterion. The study showed that shrinkage estimation using a shrinkage weight function was the best.
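A minimal sketch of this reliability and of a simple shrinkage estimator under these assumptions (θ = 1) is given below; the shrinkage weight, prior guesses, and sample sizes are illustrative choices rather than the paper's shrinkage weight functions.

```python
# Sketch of the stress-strength reliability R = P(Y < max(X1, ..., Xk)) when
# stress and strengths follow the Generalized Exponential Distribution with
# theta = 1, i.e. F(x; a) = (1 - exp(-x))^a, and of a simple shrinkage
# estimator that pulls the MLE of each shape parameter towards a prior guess.
import numpy as np

rng = np.random.default_rng(8)

def rvs_ged(alpha, size):
    """Inverse-CDF sampling from GED(alpha, theta=1): x = -log(1 - u^(1/alpha))."""
    u = rng.random(size)
    return -np.log1p(-u ** (1.0 / alpha))

def mle_alpha(x):
    """Closed-form MLE of the shape: alpha_hat = -n / sum(log(1 - exp(-x)))."""
    return -x.size / np.sum(np.log1p(-np.exp(-x)))

def reliability(alpha_strengths, beta_stress, n_mc=200_000):
    """Monte Carlo estimate of P(Y < max_i X_i)."""
    y = rvs_ged(beta_stress, n_mc)
    x_max = np.max([rvs_ged(a, n_mc) for a in alpha_strengths], axis=0)
    return np.mean(y < x_max)

# Strength samples for k = 3 non-identical components and one stress sample.
true_alphas, true_beta, n = [1.5, 2.0, 2.5], 1.0, 50
alpha_hats = [mle_alpha(rvs_ged(a, n)) for a in true_alphas]
beta_hat = mle_alpha(rvs_ged(true_beta, n))

# Shrinkage: pull each MLE towards a prior guess alpha_0 with weight w.
w, prior = 0.6, [1.0, 1.0, 1.0]
alpha_shrunk = [w * a_hat + (1 - w) * a0 for a_hat, a0 in zip(alpha_hats, prior)]

print("R (MLE)      :", reliability(alpha_hats, beta_hat))
print("R (shrinkage):", reliability(alpha_shrunk, beta_hat))
```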

 

Publication Date
Fri Mar 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Robust Two-Step Estimation and Approximation Local Polynomial Kernel For Time-Varying Coefficient Model With Balance Longitudinal Data

In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, characterized by observations obtained from (n) independent subjects, each measured repeatedly at a group of (m) specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject, and the applied technique is the local linear kernel (LLPK) technique. To avoid problems of dimensionality and heavy computation, the two-step method has been used to estimate the coefficient functions using the former technique. Since the two-…
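The single-step local linear kernel estimator that the two-step method builds on can be sketched as follows; the Gaussian kernel, bandwidth, and simulated coefficient functions are illustrative assumptions.

```python
# Sketch of the (single-step) local linear kernel estimator of time-varying
# coefficients beta0(t), beta1(t) in y_ij = beta0(t_j) + beta1(t_j) * x_ij + e
# for balanced longitudinal data (n subjects, m common time points). The
# two-step refinement of the study is not reproduced here.
import numpy as np

rng = np.random.default_rng(9)
n_subjects, times = 40, np.linspace(0.0, 1.0, 12)        # balanced design: m = 12
beta0 = lambda t: np.sin(2 * np.pi * t)                   # true coefficient functions
beta1 = lambda t: 1.0 + t ** 2

x = rng.normal(size=(n_subjects, times.size))
y = beta0(times) + beta1(times) * x + rng.normal(scale=0.3, size=x.shape)

def local_linear_beta(t0, h=0.15):
    """Weighted least squares at t0 with design [1, (t-t0), x, x*(t-t0)]."""
    t_rep = np.broadcast_to(times, x.shape).ravel()
    xr, yr = x.ravel(), y.ravel()
    w = np.exp(-0.5 * ((t_rep - t0) / h) ** 2)             # Gaussian kernel weights
    design = np.column_stack([np.ones_like(xr), t_rep - t0, xr, xr * (t_rep - t0)])
    wd = design * w[:, None]
    coef, *_ = np.linalg.lstsq(wd.T @ design, wd.T @ yr, rcond=None)
    return coef[0], coef[2]                                 # beta0(t0), beta1(t0)

for t0 in (0.25, 0.5, 0.75):
    b0, b1 = local_linear_beta(t0)
    print(f"t={t0}: beta0_hat={b0:.2f} (true {beta0(t0):.2f}), "
          f"beta1_hat={b1:.2f} (true {beta1(t0):.2f})")
```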
