Comparison of some wavelet estimation methods for the nonparametric regression function with a response variable missing at random

Abstract

The problem of missing data represents a major obstacle for researchers in the process of data analysis, since this problem recurs in all fields of study, including social, medical, astronomical and clinical experiments.

The presence of such a problem within the data to be studied may negatively influence the analysis and may lead to misleading conclusions carrying a great bias caused by that problem. In spite of the efficiency of wavelet methods, they too are affected by missing data; beyond the loss of estimation accuracy, it may become impossible to apply these methods at all, because missingness breaks one of their conditions, namely a dyadic sample size.

Given the great impact of this problem, many researchers have devoted their studies to treating it with traditional missing-data methods, whereas the current research uses more efficient and effective imputation methods to treat missing data as a preliminary stage, so that the data become ready and available for wavelet application. The simulation experiment showed that the suggested methods (Nearest Neighbor Polynomial Wavelet) are more efficient than, and superior to, the other methods. This paper also addresses the correction of the boundary problem by using local polynomial models, and uses different threshold values in the wavelet estimations.
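The nearest-neighbor imputation stage described above can be sketched as follows. This is a minimal illustration, not the paper's exact "Nearest Neighbor Polynomial Wavelet" procedure: each missing response is simply filled with the response of the closest observed design point, so the completed series is ready for a wavelet estimator.

```python
# Minimal sketch: impute a missing response with the response of the
# nearest observed design point (one building block of a nearest-neighbor
# imputation stage run before any wavelet estimation).
def nearest_neighbor_impute(x, y):
    """x: design points; y: responses with None marking missing values.
    Returns a fully observed copy of y."""
    observed = [(xi, yi) for xi, yi in zip(x, y) if yi is not None]
    filled = []
    for xi, yi in zip(x, y):
        if yi is not None:
            filled.append(yi)
        else:
            # copy the response of the closest observed x
            _, y_near = min(observed, key=lambda p: abs(p[0] - xi))
            filled.append(y_near)
    return filled

x = [0.0, 0.1, 0.5, 0.8, 1.0]
y = [1.0, None, 3.0, None, 5.0]
print(nearest_neighbor_impute(x, y))  # → [1.0, 1.0, 3.0, 5.0, 5.0]
```

A polynomial variant would fit a local polynomial through the nearest observed points instead of copying a single neighbor.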

Publication Date
Tue Oct 22 2024
Journal Name
Iraqi Statisticians Journal
Inferential Methods for the Dagum Regression Model

The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana
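A simulation study of the kind described needs Dagum variates. A minimal sketch, using the common three-parameter form with shapes a, p and scale b (assumed here, since the abstract does not spell out the parameterization): the CDF F(x) = (1 + (x/b)^(-a))^(-p) inverts in closed form, so samples can be drawn by the inverse-CDF method.

```python
import math, random

# Sketch (not from the paper): draw Dagum(a, b, p) samples by inverting
# the CDF F(x) = (1 + (x/b)**-a)**-p. The quantile function is
# Q(u) = b * (u**(-1/p) - 1)**(-1/a).
def dagum_cdf(x, a, b, p):
    return (1.0 + (x / b) ** (-a)) ** (-p)

def dagum_quantile(u, a, b, p):
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

random.seed(0)
a, b, p = 3.0, 2.0, 0.7
sample = [dagum_quantile(random.random(), a, b, p) for _ in range(5)]
# round trip: F(Q(u)) should recover u
assert abs(dagum_cdf(dagum_quantile(0.3, a, b, p), a, b, p) - 0.3) < 1e-12
```

MLE and MoM estimates of (a, b, p) from such samples could then be compared across sample sizes, as in the paper's simulation.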

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Determining the optimal policy for the reliability function of the Pareto distribution estimated using dynamic programming

The purpose of this work is to use a developed technique that requires a mathematical procedure of high quality and sufficiency for solving complex problems, called dynamic programming, with its recursive methods (forward and backward), to find a series of associated decisions for the reliability function of the Pareto distribution, estimated by two approaches, maximum likelihood and moments, in order to conclude the optimal policy.
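The maximum-likelihood ingredient above can be sketched for a Pareto(alpha, xm) model with known scale xm (an assumption for illustration; the dynamic-programming policy search itself is not reproduced here): the MLE of the shape is alpha_hat = n / sum(log(x_i / xm)), and the reliability function is R(t) = (xm / t)^alpha for t >= xm.

```python
import math

# Sketch: MLE of the Pareto shape with known scale xm, and the implied
# reliability (survival) function R(t) = (xm / t)**alpha for t >= xm.
def pareto_alpha_mle(data, xm):
    return len(data) / sum(math.log(x / xm) for x in data)

def reliability(t, alpha, xm):
    return (xm / t) ** alpha

data = [1.2, 1.5, 2.1, 3.0, 1.1]
alpha_hat = pareto_alpha_mle(data, xm=1.0)
print(round(reliability(2.0, alpha_hat, 1.0), 4))  # ≈ 0.2533
```

The moments-based estimator would replace alpha_hat with the value matching the sample mean to the theoretical mean xm*alpha/(alpha-1).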

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
A Comparative Study of Some Methods of Estimating Robust Variance Covariance Matrix of the Parameters Estimated by (OLS) in Cross-Sectional Data

 

Abstract

The classical normal linear regression model is based on several hypotheses, one of them being homoscedasticity. It is known that, when heteroscedasticity is present, the ordinary least squares (OLS) estimators lose their desirable properties, and in addition the statistical inference becomes unacceptable. Accordingly, we put forward two alternatives: the first is generalized least squares (GLS), and the second is robust estimation of the covariance matrix of the parameters estimated by OLS. The GLS method is neat and certified if the estimators are efficient and the statistical inference conducted on its basis is acceptable
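The second alternative can be sketched in its simplest case. This is White's heteroscedasticity-consistent (HC0) variance for regression through the origin y = beta*x + e, an assumed toy setting; the multivariate version is (X'X)^-1 X' diag(e^2) X (X'X)^-1.

```python
# Sketch of White's HC0 robust variance for the one-parameter model
# y = beta*x + e:
#   beta_hat = sum(x*y) / sum(x**2)                 (OLS)
#   var_HC0  = sum(x**2 * e**2) / (sum(x**2))**2    (robust variance)
def ols_origin(x, y):
    sxx = sum(xi * xi for xi in x)
    beta = sum(xi * yi for xi, yi in zip(x, y)) / sxx
    resid = [yi - beta * xi for xi, yi in zip(x, y)]
    var_hc0 = sum((xi * ei) ** 2 for xi, ei in zip(x, resid)) / sxx ** 2
    return beta, var_hc0

x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.3, 2.7, 4.4]
beta, v = ols_origin(x, y)
print(beta, v)
```

Unlike the classical variance s^2 / sum(x^2), the HC0 form stays consistent when the error variance changes with x.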

Publication Date
Wed Jan 01 2020
Journal Name
Periodicals Of Engineering And Natural Sciences
Comparison between the estimators of nonparametric methods by using the methodology of quantile regression models

This paper studies two stratified quantile regression models of the marginal and the conditional varieties. We estimate the quantile functions of these models by using two nonparametric methods: smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates can be obtained by solving the nonparametric quantile regression problem, which means minimizing the quantile regression objective function, using the approach of varying-coefficient models. The main goal is to compare the estimators of the two nonparametric methods and to adopt the better of the two.
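The kernel smoother named above can be sketched for the conditional mean (a simplification; the quantile variant replaces the weighted mean with a weighted quantile): the Nadaraya-Watson estimate at x0 is a kernel-weighted average of the responses, m_hat(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h).

```python
import math

# Minimal Nadaraya-Watson kernel regression sketch with a Gaussian kernel.
# h is the bandwidth; smaller h tracks the data more closely.
def nadaraya_watson(x0, x, y, h):
    w = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in x]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

x = [0.0, 1.0, 2.0, 3.0]
y = [0.0, 1.0, 4.0, 9.0]
print(nadaraya_watson(1.5, x, y, h=0.5))
```

With a very small bandwidth the estimate at an observed design point collapses to that point's own response.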

Publication Date
Thu Nov 01 2018
Journal Name
International Journal Of Biomathematics
A non-conventional hybrid numerical approach with multi-dimensional random sampling for cocaine abuse in Spain

This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters, which are dependent on time [Formula: see text]. The LHS technique enables the MLHFD method to produce fast variation of the parameters' values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integ
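The sampling building block can be sketched in one dimension (multi-dimensional LHS repeats this per dimension and pairs the columns randomly): divide [0, 1) into n equal strata, draw one point uniformly inside each, then shuffle.

```python
import random

# Sketch of one-dimensional Latin hypercube sampling (LHS): exactly one
# draw per stratum [i/n, (i+1)/n), shuffled into random order.
def latin_hypercube(n, rng=random):
    points = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(points)
    return points

random.seed(1)
sample = latin_hypercube(10)
# exactly one point falls in each of the 10 strata
assert sorted(int(p * 10) for p in sample) == list(range(10))
```

Compared with plain Monte Carlo, this stratification covers the parameter range evenly even for the smaller simulation counts mentioned above.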

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison between robust methods in canonical correlation by using empirical influence function

Canonical correlation analysis is one of the common methods for analyzing data and determining the relationship between two sets of variables under study, as it depends on analyzing the variance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are biased by outliers, and others are resistant to those values; in addition, there are criteria that check the efficiency of the estimation methods.

In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
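The correlation-matrix route described above can be sketched for two variables per set (an assumed small case so every block is 2x2): with the correlation matrix partitioned as [[Rxx, Rxy], [Ryx, Ryy]], the squared canonical correlations are the eigenvalues of M = Rxx^-1 Rxy Ryy^-1 Ryx. A robust CC is obtained by feeding a robust correlation matrix into the same formula.

```python
import math

# 2x2 helper algebra, then canonical correlations from the partitioned
# correlation matrix via the eigenvalues of Rxx^-1 Rxy Ryy^-1 Ryx.
def inv2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mul2(p, q):
    return [[sum(p[i][k] * q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def canonical_correlations(rxx, rxy, ryy):
    ryx = [[rxy[0][0], rxy[1][0]], [rxy[0][1], rxy[1][1]]]  # transpose
    m = mul2(mul2(inv2(rxx), rxy), mul2(inv2(ryy), ryx))
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return [math.sqrt((tr + disc) / 2), math.sqrt(max((tr - disc) / 2, 0.0))]

rxx = [[1.0, 0.3], [0.3, 1.0]]
ryy = [[1.0, 0.2], [0.2, 1.0]]
rxy = [[0.5, 0.1], [0.1, 0.4]]
print(canonical_correlations(rxx, rxy, ryy))
```

When the two sets are identical (Rxy = Rxx = Ryy), both canonical correlations equal 1, which is a useful sanity check.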

Publication Date
Thu Mar 01 2007
Journal Name
Al-khwarizmi Engineering Journal
Image restoration using regularized inverse filtering and adaptive threshold wavelet denoising

Although Wiener filtering is the optimal tradeoff between inverse filtering and noise smoothing, when the blurring filter is singular the Wiener filter actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise. Wavelet-based denoising provides a natural technique for this purpose.

In this paper a new image restoration scheme is proposed. The scheme contains two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is input to adaptive threshold wavelet
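The denoising stage can be sketched in one dimension (an illustration, not the paper's exact adaptive rule): a one-level Haar transform splits the signal into approximation and detail coefficients, and soft thresholding shrinks the noisy details toward zero.

```python
import math

# One-level Haar analysis plus soft thresholding of the detail
# coefficients; the threshold value t is an assumed illustration.
def haar_level(signal):
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def soft(w, t):
    # shrink toward zero: sign(w) * max(|w| - t, 0)
    return math.copysign(max(abs(w) - t, 0.0), w)

signal = [4.0, 4.1, 4.0, 3.9, 8.0, 8.1, 8.0, 7.9]
approx, detail = haar_level(signal)
denoised_detail = [soft(w, t=0.2) for w in detail]
print(denoised_detail)
```

An adaptive scheme would choose t from the data, for example the universal threshold sigma * sqrt(2 log n).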

Publication Date
Fri Apr 12 2019
Journal Name
Journal Of Economics And Administrative Sciences
The robust estimators of the reliability function using the AM & POT sampling techniques

Abstract 

The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with it: the peaks-over-threshold (POT) sampling technique and the annual maximum (AM) sampling technique, with the (extreme value, Gumbel) distributions for the AM sample and the (generalized Pareto, exponential) distributions for the POT sample. The cross-entropy algorithm was applied in two of its methods: the first estimates using order statistics, and the second using order statistics and the likelihood ratio; a third method is proposed by the researcher. The MSE comparison criterion of the estimated parameters and the probability density function for each of the distributions were
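The POT step can be sketched as follows. Only the excesses over a chosen threshold u are kept, and the simpler of the two excess models mentioned above, the exponential, is fitted by maximum likelihood (the generalized Pareto fit would add a shape parameter; the cross-entropy estimators themselves are not reproduced here).

```python
# Sketch of the peaks-over-threshold (POT) sample and an exponential
# MLE for the excesses: rate_hat = 1 / mean(excess).
def pot_excesses(data, u):
    return [x - u for x in data if x > u]

def exponential_mle_rate(excesses):
    return len(excesses) / sum(excesses)

data = [1.0, 5.0, 2.0, 7.0, 3.0, 9.0, 2.5]
exc = pot_excesses(data, u=4.0)       # [1.0, 3.0, 5.0]
rate = exponential_mle_rate(exc)      # 3 / 9
print(exc, rate)
```

The choice of threshold u trades bias (too low) against variance (too few exceedances).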

Publication Date
Sun Jun 06 2010
Journal Name
Baghdad Science Journal
Stochastic Non-Linear Pseudo-Random Sequence Generator

Many of the key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream, produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences from the linear feedback shift registers with the sequences from a nonlinear key generator to obtain a final, very strong key sequence.
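The y = C(L(x)) structure can be sketched as follows. The tap positions and the combiner below are illustrative, not the paper's design: two small Fibonacci LFSRs produce linear streams and a simple nonlinear Boolean function stands in for the compression C.

```python
# Sketch: Fibonacci LFSR producing n output bits. state is a bit list;
# the feedback bit is the XOR of the tapped positions.
def lfsr(state, taps, n):
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

s1 = lfsr([1, 0, 0, 1], taps=[0, 3], n=15)  # 4-bit, maximal period 15
s2 = lfsr([1, 1, 0], taps=[0, 2], n=15)     # 3-bit, maximal period 7
key = [a & b for a, b in zip(s1, s2)]       # toy nonlinear compression C
print(key)
```

Real designs use balanced combiners (e.g. the Geffe generator's (a AND b) XOR (NOT a AND c)) rather than a bare AND, which is biased toward 0.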

Publication Date
Thu Nov 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Multistage and Numerical Discretization Methods for Estimating Parameters in Nonlinear Ordinary Differential Equation Models.

Many of the dynamic processes in different sciences are described by differential equation models. These models explain the change in the behavior of the studied process over time by linking the behavior of the process to its derivatives. They often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work, we estimate the constant and time-varying parameters sequentially in several stages. In the first stage, the state variables and their derivatives are estimated by penalized splines (P-splines). In the second stage, we use pseudo least squares to estimate the constant parameters. For the third stage, the rem
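The two-stage idea can be sketched on a toy model (assumed here: dy/dt = theta*y): plain central differences stand in for the P-spline smoothing/differentiation stage, and the constant parameter is then obtained by regressing the derivative estimates on the model's right-hand side.

```python
import math

# Stage 1: estimate derivatives by central differences (a stand-in for
# the P-spline stage). Stage 2: least-squares estimate of theta in
# dy/dt = theta*y, i.e. theta_hat = sum(dy*y) / sum(y*y).
def central_diff(t, y):
    return [(y[i + 1] - y[i - 1]) / (t[i + 1] - t[i - 1])
            for i in range(1, len(y) - 1)]

def estimate_theta(t, y):
    dy = central_diff(t, y)
    yin = y[1:-1]  # interior points matching the derivative estimates
    return sum(d * v for d, v in zip(dy, yin)) / sum(v * v for v in yin)

theta_true = 0.5
t = [i * 0.1 for i in range(21)]
y = [math.exp(theta_true * ti) for ti in t]
print(round(estimate_theta(t, y), 3))  # → 0.5
```

A time-varying parameter would be estimated analogously, but pointwise or with basis functions instead of a single ratio.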
