A comparison between Bayesian Method and Full Maximum Likelihood to estimate Poisson regression model hierarchy and its application to the maternal deaths in Baghdad

Abstract:

This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.

The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of experiment replications (r = 1000, 5000), with the mean square error adopted as the criterion for comparing the estimation methods and choosing the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best at representing the maternal mortality data. The parameter value of the fitted distribution was obtained through the EasyFit program (μ = 3.9167); hypothetical values were then taken around this parameter, one smaller (μ = 2.50) and one larger (μ = 4.50), so as to obtain more accurate results. The model was then applied to real data obtained from the Ministry of Health, which records the number of maternal deaths over five years on a quarterly basis. Three health directorates in Baghdad were selected, so that each directorate contributes (20) observations to its group and the total number of observations is (60).
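The simulation code is not given in the abstract; the following is a minimal sketch of the kind of repeated-simulation MSE comparison it describes, assuming a simple Poisson regression with one covariate fitted by maximum likelihood via statsmodels. The hierarchical structure, the Bayesian estimator, and the paper's actual parameter values are omitted here; the coefficient values are hypothetical.

```python
# Illustrative sketch: repeated-simulation MSE for a Poisson regression
# coefficient, loosely following the design in the abstract (sample sizes n,
# replication count r).  The hierarchical/Bayesian parts are not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
beta_true = np.array([0.5, 0.3])          # hypothetical intercept and slope

def one_replication(n):
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(X @ beta_true))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    return fit.params

for n in (30, 60, 120):
    r = 1000
    est = np.array([one_replication(n) for _ in range(r)])
    mse = ((est - beta_true) ** 2).mean(axis=0)
    print(f"n={n:4d}  MSE(intercept, slope) = {mse}")
```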

Publication Date
Mon Oct 22 2018
Journal Name
Journal Of Economics And Administrative Sciences
Using Mehar method to change fuzzy cost of fuzzy linear model with practical application

Many production companies suffer large losses because of high production costs and low profits, for several reasons, including high raw-material prices, the absence of taxes on imported goods, and the deactivation of the consumer protection, national product, and customs laws; as a result, most consumers buy imported goods, which are characterized by modern specifications and low prices.

The production company also suffers from uncertainty in cost, production volume, sales, availability of raw materials, and number of workers, because these vary with the seasons of the year.

In this research, a fuzzy linear programming model with fuzzy numbers was adopted…
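The Mehar method itself is not reproduced here; as a rough illustration of a linear programme with fuzzy cost coefficients, the sketch below assumes triangular fuzzy costs defuzzified by a simple centroid ranking function before the crisp problem is solved with scipy.optimize.linprog. The costs, constraints, and products are hypothetical, not the data of the practical application.

```python
# Rough sketch: a production LP with triangular fuzzy unit costs (l, m, u),
# defuzzified by a centroid ranking function.  This is only an illustration,
# not the Mehar method described in the paper.
import numpy as np
from scipy.optimize import linprog

def centroid(tri):
    """Centroid ranking of a triangular fuzzy number (l, m, u)."""
    l, m, u = tri
    return (l + m + u) / 3.0

# hypothetical fuzzy unit costs for two products
fuzzy_costs = [(3.0, 4.0, 5.0), (6.0, 7.0, 9.0)]
c = np.array([centroid(t) for t in fuzzy_costs])

# hypothetical constraints: raw material, labour, and a minimum total output
A_ub = np.array([[ 1.0,  2.0],    # raw-material usage per unit
                 [ 3.0,  1.0],    # labour usage per unit
                 [-1.0, -1.0]])   # x1 + x2 >= 40  (written as -x1 - x2 <= -40)
b_ub = np.array([100.0, 120.0, -40.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal plan:", res.x, "defuzzified cost:", res.fun)
```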

Publication Date
Sat Jun 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
"Using Markov Switching Model to Investigate the Link between the Inflation and Uncertain Inflation in Iraq for the periods 1980-2010"

In this paper, we use the Markov switching model to investigate the link between the level of Iraqi inflation and its uncertainty for the period 1980-2010. We measure inflation uncertainty as the variance of unanticipated inflation. The results show a negative effect of the inflation level on inflation uncertainty, and also a positive effect of inflation uncertainty on the inflation level.
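A minimal sketch of fitting a two-regime Markov switching model to an inflation series with statsmodels is shown below. The series is simulated, and the exact specification used in the paper (which terms switch, how uncertainty is extracted) is not reproduced.

```python
# Minimal sketch: two-regime Markov switching model with switching mean and
# switching variance, fitted to a simulated "inflation" series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
calm     = rng.normal(loc=5.0,  scale=1.0, size=60)   # low-inflation regime
volatile = rng.normal(loc=30.0, scale=8.0, size=60)   # high-inflation regime
inflation = np.concatenate([calm, volatile, calm])

mod = sm.tsa.MarkovRegression(inflation, k_regimes=2,
                              trend="c", switching_variance=True)
res = mod.fit()
print(res.summary())
print("expected regime durations:", res.expected_durations)
```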

Publication Date
Thu Mar 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Some Methods for Estimating Nonparametric Binary Logistic Regression

In this research, kernel (nonparametric density) estimators were relied upon in estimating the binary-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the true curve. The goal of using the kernel estimator is to adjust the observations so that estimators with characteristics close to the properties of the true parameters can be obtained. Based on medical data for patients with chro…
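A small sketch of the Nadaraya-Watson estimator for a binary response, with the bandwidth λ chosen by leave-one-out cross-validation, is given below. The data are simulated rather than the medical data of the study, and the local scoring and generalized cross-validation alternatives are not reproduced.

```python
# Minimal sketch: Nadaraya-Watson estimate of P(Y = 1 | x) for a binary
# response, with the bandwidth chosen by leave-one-out cross-validation.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=200)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(x - 5.0))))

def nw_estimate(x0, x, y, lam):
    """Nadaraya-Watson estimator with a Gaussian kernel and bandwidth lam."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / lam) ** 2)
    return (w @ y) / w.sum(axis=1)

def loo_cv_score(lam):
    """Leave-one-out squared-error score for a candidate bandwidth."""
    err = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        pi = nw_estimate(x[i:i + 1], x[mask], y[mask], lam)[0]
        err += (y[i] - pi) ** 2
    return err / len(x)

grid = np.linspace(0.2, 3.0, 15)
best_lam = min(grid, key=loo_cv_score)
print("cross-validated bandwidth:", best_lam)
print("P(Y=1 | x=5) approx:", nw_estimate(np.array([5.0]), x, y, best_lam)[0])
```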

Publication Date
Sun Oct 23 2022
Journal Name
Baghdad Science Journal
Comparison Between Deterministic and Stochastic Model for Interaction (COVID-19) With Host Cells in Humans

In this paper, deterministic and stochastic models are proposed to study the interaction of the Coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell; as a result, the person carrying the Coronavirus will get rid of the disease. If R0 > 1, the infected cells will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, ultimate disease extinction may occur even though R0 > 1, and these facts are also confirmed by computer simulation.
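The abstract does not state the equations; the sketch below integrates a generic target-cell-limited within-host model (target cells, infected cells, free virus) with scipy and reports the basic reproduction number of that simplified system, only to illustrate the R0 < 1 versus R0 > 1 dichotomy. All parameter values are hypothetical and the stochastic version is not shown.

```python
# Illustrative deterministic sketch of a within-host model:
# T (target cells), I (infected cells), V (free virus).
import numpy as np
from scipy.integrate import odeint

lam, d = 1e4, 0.1      # production and natural death rate of target cells
beta = 5e-7            # infection rate
delta = 0.5            # death rate of infected cells
p, c = 100.0, 5.0      # virus production and clearance rates

def rhs(state, t):
    T, I, V = state
    dT = lam - d * T - beta * T * V
    dI = beta * T * V - delta * I
    dV = p * I - c * V
    return [dT, dI, dV]

# basic reproduction number of this simplified system
R0 = beta * p * lam / (d * delta * c)
print("R0 =", R0)

t = np.linspace(0, 60, 601)
sol = odeint(rhs, [lam / d, 0.0, 10.0], t)
print("final (T, I, V):", sol[-1])
```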

Publication Date
Mon May 14 2018
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
The Comparison Between Different Approaches to Overcome the Multicollinearity Problem in Linear Regression Models

In the presence of the multicollinearity problem, parameter estimation based on the ordinary least squares procedure is unsatisfactory. In 1970, Hoerl and Kennard introduced an alternative method known as the ridge regression estimator.

In this estimator, the ridge parameter plays an important role in estimation, and various methods have been proposed by statisticians to select this biasing constant (ridge parameter). Another popular method used to deal with the multicollinearity problem is the principal component method. In this paper, we employ the simulation technique to compare the performance of the principal component estimator with some types of ordinary ridge regression estimators based on the value of t…
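As a rough sketch of the kind of comparison described, the code below simulates collinear regressors and contrasts the coefficient mean square error of a ridge estimator with that of principal components regression. The ridge parameter, number of components, and data-generating values are illustrative choices, not those of the paper.

```python
# Rough sketch: coefficient MSE of ridge regression vs. principal component
# regression (PCR) on simulated collinear data.
import numpy as np
from sklearn.linear_model import Ridge, LinearRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n, reps = 60, 500
beta_true = np.array([1.0, 2.0, -1.5])

def simulate():
    z = rng.normal(size=n)
    X = np.column_stack([z + 0.05 * rng.normal(size=n) for _ in range(3)])
    y = X @ beta_true + rng.normal(size=n)
    return X, y

def pcr_coefs(X, y, k=2):
    pca = PCA(n_components=k).fit(X)
    gamma = LinearRegression().fit(pca.transform(X), y).coef_
    return pca.components_.T @ gamma          # back to original variable space

mse_ridge, mse_pcr = np.zeros(3), np.zeros(3)
for _ in range(reps):
    X, y = simulate()
    b_r = Ridge(alpha=1.0, fit_intercept=False).fit(X, y).coef_
    mse_ridge += (b_r - beta_true) ** 2
    mse_pcr   += (pcr_coefs(X, y) - beta_true) ** 2
print("ridge MSE:", mse_ridge / reps)
print("PCR   MSE:", mse_pcr / reps)
```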

Publication Date
Sat Oct 02 2021
Journal Name
International Journal Of Nonlinear Analysis And Applications
Using the wavelet analysis to estimate the nonparametric regression model in the presence of associated errors

The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible (level-dependent) threshold values in the case of correlated errors, which treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improved Thresholding, and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes f…
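A small sketch of level-by-level soft thresholding with PyWavelets is given below. The boundary-correcting polynomial step and the specific VisuShrink/SureShrink/FDR variants compared in the paper are not reproduced, and the signal is simulated with AR(1) errors rather than being the theft-crime series.

```python
# Minimal sketch: wavelet denoising with a separate (level-dependent) soft
# threshold at each detail level, using that level's own MAD noise estimate,
# as is appropriate when the errors are correlated.
import numpy as np
import pywt

rng = np.random.default_rng(4)
n = 256
t = np.linspace(0, 1, n)
signal = np.sin(4 * np.pi * t) + 0.5 * np.sin(12 * np.pi * t)
eps = np.zeros(n)                          # AR(1) -> correlated errors
for i in range(1, n):
    eps[i] = 0.6 * eps[i - 1] + rng.normal(scale=0.3)
y = signal + eps

coeffs = pywt.wavedec(y, "db4", level=4)
denoised = [coeffs[0]]                     # keep approximation coefficients
for d in coeffs[1:]:
    sigma_j = np.median(np.abs(d)) / 0.6745          # level-wise noise scale
    thresh_j = sigma_j * np.sqrt(2 * np.log(len(d)))  # level-wise threshold
    denoised.append(pywt.threshold(d, thresh_j, mode="soft"))
estimate = pywt.waverec(denoised, "db4")[:n]
print("residual std:", np.std(estimate - signal))
```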

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
comparison Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy and high-precision parameter estimates when evaluating a large number of diagnostic tests at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated with the sample size given by each method, using an artificial intelligence technique, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat…
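One way Bennett's inequality can be turned into a sample-size rule is sketched below: choose the smallest n for which the Bennett tail bound on the deviation of a sample mean falls below a target level. The bound b on the observations, the variance σ², the precision ε, and the confidence level are illustrative inputs, not the values used in the paper.

```python
# Sketch: smallest n such that Bennett's inequality guarantees
# P(|sample mean - true mean| >= eps) <= alpha for observations bounded by b
# with variance sigma2.  All numerical inputs are hypothetical.
import math

def bennett_sample_size(b, sigma2, eps, alpha):
    h = lambda u: (1 + u) * math.log(1 + u) - u     # Bennett's h function
    u = b * eps / sigma2
    # from 2 * exp(-(n * sigma2 / b**2) * h(u)) <= alpha
    n = b ** 2 * math.log(2 / alpha) / (sigma2 * h(u))
    return math.ceil(n)

print(bennett_sample_size(b=1.0, sigma2=0.25, eps=0.1, alpha=0.05))
```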

Publication Date
Mon May 08 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Using Restricted Least Squares Method to Estimate and Analyze the Cobb-Douglas Production Function with Application

In this paper, the restricted least squares method is employed to estimate the parameters of the Cobb-Douglas production function, and the results obtained are then analyzed and interpreted. A practical application is performed on the State Company for Leather Industries in Iraq for the period (1990-2010). The statistical program SPSS is used to perform the required calculations.
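A small sketch of restricted least squares for the log-linear Cobb-Douglas form is given below, assuming, purely for illustration, the constant-returns-to-scale restriction α + β = 1 imposed by substitution. The data are simulated, not the leather-industry series, and the paper's actual restriction may differ.

```python
# Sketch: restricted least squares for ln Y = ln A + a*ln L + b*ln K + e
# under the illustrative restriction a + b = 1, imposed by substitution:
#   ln(Y/K) = ln A + a*ln(L/K) + e,   b = 1 - a.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 21                                    # e.g. annual data, 1990-2010
lnL = rng.normal(4.0, 0.3, n)
lnK = rng.normal(5.0, 0.4, n)
lnY = 0.7 + 0.6 * lnL + 0.4 * lnK + rng.normal(0, 0.05, n)

X = sm.add_constant(lnL - lnK)
res = sm.OLS(lnY - lnK, X).fit()
lnA, a = res.params
b = 1.0 - a
print(f"ln A = {lnA:.3f}, alpha = {a:.3f}, beta = {b:.3f}")
```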

Publication Date
Mon Sep 03 2018
Journal Name
Al-academy
The Interchange of Sign Transformation between Locality and Universality in the Iraqi Theatre, "Romeo and Juliet in Baghdad Show" - A Model

The research deals with the transformation of the sign from the universal to the local in the theatrical show through directorial processing in the production of a communicative artistic discourse and message, so that the process of reading and recognizing the discourse takes into account the cultural differences, customs, and local rituals of each country, region, or area. The problem of the research focused on answering the following question: what are the requirements for the sign in terms of its transformation between universality and locality in the reading?

The importance of the research lies in determining the requiremen…

Publication Date
Tue Jun 10 2025
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Modified LASS Method Suggestion as an additional Penalty on Principal Components Estimation – with Application-

This research deals with a shrinkage method for the principal components, similar to the one used in multiple regression, "Least Absolute Shrinkage and Selection: LASS". The goal here is to form uncorrelated linear combinations from only a subset of the explanatory variables that may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. This shrinkage forces some coefficients to equal zero after placing a restriction on them through a "tuning parameter", say (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This is shown by the MSE criterion in the regression case and the percent explained v…
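The paper's LASS-type penalty on the principal components is not reproduced here; as a rough sketch of the general idea, an L1 tuning parameter forcing some component coefficients to zero, the code below runs a Lasso on principal component scores and reports which components survive, using simulated data and an arbitrary penalty value.

```python
# Rough sketch of an L1 ("LASS"-style) penalty applied to principal component
# scores: components whose coefficients are shrunk to zero are dropped.
# The tuning parameter (alpha here, playing the role of t) is illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(6)
n, K = 100, 6
Z = rng.normal(size=(n, 2))
X = np.column_stack([Z[:, 0], Z[:, 0] + 0.05 * rng.normal(size=n),
                     Z[:, 1], Z[:, 1] + 0.05 * rng.normal(size=n),
                     rng.normal(size=n), rng.normal(size=n)])
y = 2.0 * Z[:, 0] - 1.0 * Z[:, 1] + rng.normal(scale=0.5, size=n)

scores = PCA(n_components=K).fit_transform(X)
fit = Lasso(alpha=0.1).fit(scores, y)
print("component coefficients:", np.round(fit.coef_, 3))
print("components kept:", np.flatnonzero(fit.coef_))
```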
