Robust Two-Step Estimation and Approximation Local Polynomial Kernel For Time-Varying Coefficient Model With Balanced Longitudinal Data

In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across subjects, they are usually correlated within each subject; the technique applied is the local linear polynomial kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions with this technique. Since the two-step method relies on ordinary least squares (OLS) estimation, which is sensitive to non-normality in the data and contamination of the errors, robust methods such as LAD and M estimation are proposed to strengthen the two-step method against non-normality and error contamination. Simulation experiments are performed to verify the performance of the classical and robust methods for the LLPK technique using two criteria, for different sample sizes and levels of dispersion.
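As a rough illustration of the estimator described above, here is a minimal Python sketch of a local linear kernel fit at a point t0, with an optional robust M-type step that swaps the OLS criterion for Huber-weighted iteratively reweighted least squares (IRLS). The kernel choice, bandwidth, tuning constant, and simulated data are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def epanechnikov(u):
    # Epanechnikov kernel, a common choice for local polynomial fitting
    return 0.75 * np.clip(1.0 - u**2, 0.0, None)

def wls(X, y, w):
    # weighted least squares: solve (X'WX) b = X'Wy
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)

def local_linear(t, y, t0, h, robust=False, c=1.345, iters=20):
    """Fit y ~ b0 + b1*(t - t0) with kernel weights; return b0 = beta(t0).
    If robust=True, downweight large residuals with Huber weights (IRLS)."""
    X = np.column_stack([np.ones_like(t), t - t0])
    k = epanechnikov((t - t0) / h)              # kernel weights around t0
    beta = wls(X, y, k)                         # plain (OLS-type) local fit
    if robust:
        for _ in range(iters):
            r = y - X @ beta
            s = np.median(np.abs(r)) / 0.6745 + 1e-12     # MAD scale estimate
            hub = np.minimum(1.0, c * s / (np.abs(r) + 1e-12))  # Huber psi(r)/r
            beta = wls(X, y, k * hub)           # re-fit with downweighted outliers
    return beta[0]

# toy data: smooth signal plus heavy-tailed (contaminated) noise
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * t) + rng.standard_t(df=2, size=200) * 0.2
grid = np.linspace(0.1, 0.9, 9)
fit_ols = [local_linear(t, y, t0, h=0.1) for t0 in grid]
fit_rob = [local_linear(t, y, t0, h=0.1, robust=True) for t0 in grid]
```

An LAD-type variant can be obtained in the same loop by using weights proportional to 1/|r| alongside the kernel factor, which makes IRLS converge to the weighted least-absolute-deviations fit.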

Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is an organized collection of data, arranged and distributed so that users can access the stored data in a simple and convenient way. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large volumes of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
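The abstract's pipeline runs on Hadoop; as a language-level illustration of the same map/reduce pattern, here is a small Python sketch that splits an EEG-like signal into chunks, maps a per-chunk feature extraction in parallel, and reduces the partial results with an associative merge. All names and the synthetic signal are hypothetical, not from the paper.

```python
from functools import reduce
from multiprocessing import Pool

import numpy as np

def map_chunk(chunk):
    # "mapper": per-chunk summary statistics of an EEG-like signal
    return {"n": chunk.size, "sum": chunk.sum(), "sumsq": (chunk ** 2).sum()}

def reduce_stats(a, b):
    # "reducer": merge partial results; must be associative so chunks
    # can be combined in any order, as in a real MapReduce job
    return {k: a[k] + b[k] for k in a}

if __name__ == "__main__":
    signal = np.random.randn(1_000_000)         # stand-in for EEG samples
    chunks = np.array_split(signal, 16)
    with Pool(4) as pool:
        partials = pool.map(map_chunk, chunks)  # map phase, in parallel
    total = reduce(reduce_stats, partials)      # reduce phase
    mean = total["sum"] / total["n"]
    print("mean:", mean, "var:", total["sumsq"] / total["n"] - mean ** 2)
```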

Publication Date
Thu Dec 01 2022
Journal Name
Iraqi Journal Of Statistical Sciences
Use of the robust RFCH method with a polychoric correlation matrix in structural equation modeling when the data are ordinal

Publication Date
Sat Feb 02 2019
Journal Name
Journal Of The College Of Education For Women
The Balance and Unbalance in Alliance Relations: The Saudi Arabia and United States Relation as a Model

Abstract:
The long relationship between Saudi Arabia and the United States presents an important case for understanding the kinds of alliance found in international relations. In this study we try to diagnose and analyze the Saudi-American model in order to identify states of balance and unbalance and their influence on the directions of Saudi foreign policy.
The study is divided into two parts, each with several sections. The first part deals with the historical emergence of the Saudi state and its development through three stages, including its foreign relations with regional and international powers. The second part is devoted to analyzing and understanding the mechanisms and active factors that shape the Sa…

Publication Date
Sat Feb 01 2020
Journal Name
Journal Of Economics And Administrative Sciences
Applying some hybrid models for modeling bivariate time series assuming different distributions for random error with a practical application

Abstract

Bivariate time series modeling and forecasting have become a promising field of applied study in recent times. For this purpose, the linear autoregressive moving average with exogenous variable (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homoscedasticity of the random error variance of the fitted model. In practice, these two assumptions are often violated, so the generalized autoregressive conditional heteroscedasticity models (ARCH and GARCH) with exogenous varia…
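A hedged sketch of the two-stage idea the abstract describes: fit an ARMAX mean equation, then model the conditional heteroscedasticity of its residuals with GARCH(1,1). It assumes the statsmodels and arch packages and uses synthetic data; the paper's actual specifications may differ.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from arch import arch_model

rng = np.random.default_rng(0)
x = rng.normal(size=500)                       # exogenous input series
y = 0.5 * x + rng.normal(size=500)             # toy response series

# Mean equation: ARMAX(1, 1), i.e. ARMA(1, 1) errors plus the exogenous term
armax = SARIMAX(y, exog=x, order=(1, 0, 1)).fit(disp=False)

# Variance equation: GARCH(1, 1) fitted to the ARMAX residuals
garch = arch_model(armax.resid, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.params)                            # mu, omega, alpha[1], beta[1]
```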

Publication Date
Mon Oct 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Fuzzy Multi-Objective Capacitated Transportation Problem with Mixed Constraints using different forms of membership functions

In this research, the multi-objective capacitated transportation problem with mixed constraints is formulated to find the optimal solution, and the fuzzy approach to the multi-objective transportation problem (MOTP) is applied. There are three objectives to be driven to their minimum: the transportation cost, the administrative cost, and the cost of the goods. Three forms of membership function are used: the linear, the exponential, and the hyperbolic membership functions (their standard forms are sketched below). The proposed model was applied in the General Company for Grain Manufacturing to reduce the transportation cost to a minimum and to find the best plan for transporting the product according to the restrictions imposed on the model.
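For reference, these are the standard textbook forms of the three membership functions named above, in our own notation (an assumption, since the abstract does not reproduce them): Z_k is the k-th objective to be minimized, with best value L_k and worst value U_k.

```latex
\begin{align*}
\mu_k^{\text{linear}}(Z_k) &= \frac{U_k - Z_k}{U_k - L_k},
  && L_k \le Z_k \le U_k, \\
\mu_k^{\text{exp}}(Z_k)    &= \frac{e^{-s\psi_k} - e^{-s}}{1 - e^{-s}},
  && \psi_k = \frac{Z_k - L_k}{U_k - L_k},\; s > 0, \\
\mu_k^{\text{hyp}}(Z_k)    &= \tfrac{1}{2}
  + \tfrac{1}{2}\tanh\!\Big(\alpha_k\Big(\tfrac{U_k + L_k}{2} - Z_k\Big)\Big),
  && \alpha_k > 0 \text{ a shape parameter.}
\end{align*}
```

Each function equals 1 at the aspiration level L_k and falls to 0 (or near 0) at U_k; the exponential and hyperbolic forms differ from the linear one only in how quickly satisfaction decays between the two bounds.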

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
A novel SVR estimation of FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil prices during the period (2012-2019) using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, the return series is estimated and forecasted, in both mean and volatility, using fractional GARCH models fitted by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is machine learning using support vector regression (SVR). Results showed that the most appropriate model among many others for forecasting the volatility is chosen by the lowest values of the Akaike and Schwarz information criteria, provided the parameters are significant. In addition, the residuals…
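As an illustration of the SVR side of the comparison (not the paper's exact design), the sketch below forecasts next-day squared returns from lagged squared returns with scikit-learn's SVR, a common machine-learning stand-in for a GARCH-type volatility model; the data, lag depth, and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
r = rng.standard_t(df=5, size=1000) * 0.01     # synthetic daily returns
sq = r ** 2                                    # squared returns as volatility proxy

lags = 5                                       # predict from 5 lagged squared returns
X = np.column_stack([sq[i:len(sq) - lags + i] for i in range(lags)])
y = sq[lags:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=1e-5))
model.fit(X[:-100], y[:-100])                  # train on all but the last 100 days
pred = model.predict(X[-100:])                 # out-of-sample volatility forecasts
print("out-of-sample MSE:", np.mean((pred - y[-100:]) ** 2))
```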

Publication Date
Fri Mar 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Stability testing of time series data for CT Large industrial establishments in Iraq

Abstract:
Cointegration is one of the important concepts in applied macroeconomics. The idea of cointegration is due to Granger (1981), and it was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-eighties of the last century is one of the most important developments in the empirical approach to modeling; its advantage is computational simplicity, since using it requires familiarity only with ordinary least squares.

Cointegration concerns equilibrium relations among time series in the long run, even if all the series contain t…
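A minimal sketch of the Engle-Granger two-step procedure the abstract alludes to, using statsmodels on simulated data: regress one series on the other by OLS, then test the residuals for stationarity. (Strictly, the residual-based test needs Engle-Granger critical values rather than plain ADF ones; statsmodels.tsa.stattools.coint wraps both steps with the correct values.)

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(2)
trend = rng.normal(size=500).cumsum()          # shared stochastic trend
x = trend + rng.normal(size=500)
y = 2.0 * trend + rng.normal(size=500)         # cointegrated with x

# Step 1: long-run equilibrium regression by ordinary least squares
ols = sm.OLS(y, sm.add_constant(x)).fit()

# Step 2: unit-root test on the residuals; stationarity => cointegration
stat, pvalue, *_ = adfuller(ols.resid)
print("residual ADF statistic:", round(stat, 3))

# One-call version with proper Engle-Granger critical values
t_stat, p_val, _ = coint(y, x)
print("Engle-Granger p-value:", round(p_val, 4))
```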

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
A Comparison Between the Theoretical Cross Section Based on the Partial Level Density Formulae Calculated by the Exciton Model with the Experimental Data for the $^{197}_{79}$Au Nucleus

In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied for the reaction … at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the $^{197}_{79}$Au nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula when … does not agree with the experimental value, and when … there is little agreement only at high values of the energy range with the experimental cross section. The theoretical cross section that depends on the one-component Williams formula and the one-component formula corrected for spi…
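For reference, the one-component Ericson partial level density the abstract starts from has the standard closed form below (our notation): p particles and h holes, n = p + h excitons, excitation energy E, and single-particle level density g.

```latex
\rho(p, h, E) = \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!},
\qquad n = p + h .
```

Broadly, Williams' correction replaces E inside the power by E minus a Pauli-blocking term, which is one of the corrections the paper compares against the uncorrected formula.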

Publication Date
Sun Jan 20 2019
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Bayesian Estimation for Two Parameters of Gamma Distribution Under Precautionary Loss Function

In the current study, the researchers have obtained Bayes estimators for the shape and scale parameters of the Gamma distribution under the precautionary loss function, assuming Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators and Lindley's approximation have been used effectively in the Bayesian estimation.

Based on the Monte Carlo simulation method, these estimators are compared in terms of mean squared error (MSE). The results show that the performance of the Bayes estimator under the precautionary loss function with Gamma and Exponential priors is better than that of the other estimators in all cases.
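For context, the precautionary loss function named above and its Bayes estimator have standard closed forms (a known result, stated in our notation): the loss penalizes underestimation more heavily, and the Bayes rule is the square root of the posterior second moment.

```latex
L(\hat\theta, \theta) = \frac{(\hat\theta - \theta)^2}{\hat\theta},
\qquad
\hat\theta_{\text{Bayes}} = \sqrt{\mathbb{E}\left[\theta^2 \mid \mathbf{x}\right]} .
```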

Publication Date
Thu Apr 20 2023
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Bayesian Estimation for Two Parameters of Exponential Distribution under Different Loss Functions

In this paper, the two parameters of the Exponential distribution were estimated using the Bayesian estimation method under three different loss functions: the squared error loss function, the precautionary loss function, and the entropy loss function. The Exponential and Gamma distributions have been assumed as the priors of the scale γ and location δ parameters, respectively. In the Bayesian estimation, maximum likelihood estimators have been used as the initial estimators, and the Tierney-Kadane approximation has been used effectively. Based on the Monte Carlo simulation method, these estimators were compared in terms of mean squared errors (MSEs). The results showed that the Bayesian esti…
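The Tierney-Kadane approximation used above has the standard form below (our notation, for a generic parameter θ and function of interest g): with ℓ(θ) = [log L(θ) + log π(θ)]/n and ℓ*(θ) = ℓ(θ) + [log g(θ)]/n, maximized at θ̂ and θ̂* respectively,

```latex
\mathbb{E}\left[g(\theta) \mid \mathbf{x}\right]
\approx
\sqrt{\frac{\det \Sigma^{*}}{\det \Sigma}}\;
\exp\!\left\{ n\left[\ell^{*}(\hat\theta^{*}) - \ell(\hat\theta)\right] \right\},
```

where Σ and Σ* are the inverses of the negative Hessians of nℓ and nℓ* at their maxima. The approximation avoids posterior integration entirely, which is why it pairs naturally with the maximum likelihood starting values mentioned in the abstract.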
