Comparison of Hurst exponent estimation methods

Through recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of the deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while others depend on a filtration technique, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was carried out through a simulation study to find the most efficient method in terms of MASE. The simulation results showed that the performance of the methods is relatively close, except that the SODDW method was the most efficient in terms of MASE.
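The abstract does not reproduce the estimators themselves. As a rough illustration of the deviation-slope idea, here is a minimal rescaled-range (R/S) sketch; the function name, block-doubling scheme, and white-noise check are our own, not the paper's:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent with the rescaled-range (R/S) method:
    regress log(R/S) on log(n) over several block sizes; the slope is H."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    sizes, rs_vals = [], []
    n = min_chunk
    while n <= N // 2:
        rs_per_block = []
        for start in range(0, N - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            R = dev.max() - dev.min()               # range of the deviations
            S = block.std(ddof=0)                   # block standard deviation
            if S > 0:
                rs_per_block.append(R / S)
        if rs_per_block:
            sizes.append(n)
            rs_vals.append(np.mean(rs_per_block))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096))   # white noise: H should be near 0.5
```

For an uncorrelated series the estimate should sit near 0.5; long-memory series push it toward 1.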

Paper type: General review

Publication Date
Sat Dec 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Estimation and Analysis of the Cobb-Douglas Production Function for the Rail Transport Sector in Iraq for the Period 1990-2016 using the ARDL Model

Abstract:

Since the railway transport sector is very important in many countries of the world, this research studies the production function of this sector and indicates the level of productivity at which it operates.

Estimation and analysis of the Cobb-Douglas production function showed that the railway transport sector in Iraq suffers from a decline in productivity, which was reflected in the deterioration of the services provided for the transport of passengers and goods. This led to the sector losing its importance in supporting the national economy and to the reluctance of most passengers an…
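The paper fits the function with an ARDL model, which is not reproduced here. As a simpler, hypothetical illustration of the Cobb-Douglas form itself, the log-linearized model ln Q = ln A + α ln K + β ln L can be estimated by ordinary least squares on synthetic data:

```python
import numpy as np

# Cobb-Douglas: Q = A * K**alpha * L**beta.  Taking logs gives a linear model
# ln Q = ln A + alpha*ln K + beta*ln L, estimable by ordinary least squares.
# All numbers below are synthetic, not the paper's rail-sector data.
rng = np.random.default_rng(1)
n = 200
K = rng.uniform(50, 500, n)                   # hypothetical capital series
L = rng.uniform(20, 200, n)                   # hypothetical labour series
Q = 2.0 * K**0.4 * L**0.6 * np.exp(rng.normal(0, 0.05, n))  # true alpha=0.4, beta=0.6

X = np.column_stack([np.ones(n), np.log(K), np.log(L)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
lnA, alpha, beta = coef
returns_to_scale = alpha + beta   # near 1 indicates constant returns to scale
```

The elasticity estimates recover the values used to generate the data; a sum α + β below 1 would indicate decreasing returns to scale.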
Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Maximum Likelihood Method And Bayesian Method For Estimating Some Non-Homogeneous Poisson Processes Models

Abstract

The Non-Homogeneous Poisson process is considered one of the statistical subjects of importance in other sciences, with wide application in different areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).

This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It carries out two models of the Non-Homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th…
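The abstract's estimation details are truncated, so nothing below is the paper's method. As a sketch of what the power-law NHPP model looks like, here is an exact simulator that uses the time-transformation of unit-rate Poisson arrivals through the mean-value function m(t) = α·t^β (function and parameter names are our own):

```python
import numpy as np

def simulate_power_law_nhpp(alpha, beta, T, rng):
    """Simulate event times of an NHPP with mean-value function m(t) = alpha * t**beta
    (power-law model) on (0, T].  If E_1 < E_2 < ... are arrival times of a
    unit-rate homogeneous Poisson process, then t_i = (E_i / alpha)**(1/beta)
    are event times of the NHPP (inverse of the mean-value function)."""
    times = []
    e = 0.0
    while True:
        e += rng.exponential(1.0)          # next unit-rate arrival
        t = (e / alpha) ** (1.0 / beta)    # map through m^{-1}
        if t > T:
            break
        times.append(t)
    return np.array(times)

rng = np.random.default_rng(2)
events = simulate_power_law_nhpp(alpha=2.0, beta=0.8, T=100.0, rng=rng)
# Expected count on (0, T] is m(T) = 2.0 * 100**0.8, roughly 80 events;
# beta < 1 means the event rate decreases over time (reliability growth).
```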
Publication Date
Wed Feb 01 2023
Journal Name
Petroleum Science And Technology
Lithofacies and electrofacies models for Mishrif Formation in West Qurna oilfield, Southern Iraq by deterministic and stochastic methods (comparison and analyzing)
...Show More Authors

Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
Comparison between RSA and CAST-128 with Adaptive Key for Video Frames Encryption with Highest Average Entropy

Encryption of data is the translation of data into another shape or symbol, so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as cipher text, while unencrypted data are known as plain text. Entropy can be used as a measure that gives the number of bits needed to code the data of an image. As the pixel values within an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against RSA encryption for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying "CAST-128" and…
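The ciphers themselves are not sketched here; only the entropy measure the abstract describes. A minimal Shannon-entropy calculation over an 8-bit gray-level histogram (the helper name and test images are ours, purely illustrative):

```python
import numpy as np

def image_entropy(pixels, levels=256):
    """Shannon entropy (bits/pixel) of a gray-level image: H = -sum p_i*log2(p_i),
    where p_i is the relative frequency of gray level i.  A well-encrypted
    frame should approach the maximum log2(levels) = 8 bits for 256 levels."""
    hist = np.bincount(pixels.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                       # skip empty bins: 0*log(0) is taken as 0
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(3)
uniform_noise = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # cipher-like frame
flat = np.full((64, 64), 128, dtype=np.uint8)                        # constant frame

h_noise = image_entropy(uniform_noise)   # close to the 8-bit maximum
h_flat = image_entropy(flat)             # a single gray level: 0 bits
```

Comparing the entropy of encrypted frames against this 8-bit ceiling is exactly the kind of criterion the abstract uses to rank the two ciphers.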
Publication Date
Wed Dec 01 2010
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Methods of Estimating Regression Parameters in the Presence of the Heteroscedasticity Problem, with a Practical Application

In this research, the weights used are estimated with generalized least squares to estimate the simple linear regression parameters when the dependent variable consists of a two-class attribute variable (for the heteroscedasticity problem), depending on a sequential Bayesian approach instead of the classical approach used before. The Bayesian approach provides a mechanism for tackling observations one by one in a sequential way, i.e., each new observation adds a new piece of information for estimating the probability parameter of a certain phenomenon of Bernoulli trials, which represents the dependent variable in the simple linear regression equation, in addition to the information deduced from the past exper…
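The sequential Bayesian weighting scheme is truncated in the abstract and is not reproduced here. As a sketch of only the weighted least squares step that underlies it, solving the WLS normal equations with weights proportional to 1/σ²ᵢ (synthetic data, hypothetical names):

```python
import numpy as np

def weighted_least_squares(X, y, w):
    """Solve the WLS normal equations (X'WX) b = X'W y, the standard remedy for
    heteroscedastic errors: observations with larger error variance receive
    weight w_i proportional to 1/sigma_i**2 and hence less influence."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(4)
n = 300
x = rng.uniform(1, 10, n)
sigma = 0.2 * x                                # error spread grows with x
y = 1.0 + 2.0 * x + rng.normal(0, sigma)       # true intercept 1, slope 2
X = np.column_stack([np.ones(n), x])
b = weighted_least_squares(X, y, w=1.0 / sigma**2)
```

With correctly specified weights the estimates recover the true intercept and slope efficiently, which ordinary least squares would do less precisely under this error structure.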
Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Robust M Estimate With Cubic Smoothing Splines For Time-Varying Coefficient Model For Balance Longitudinal Data

In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality in the data or contamination of the errors, and the traditional estimation method for cubic smoothing splines, using two differentiation criteria (MADE, WASE) for different sample sizes and disparity levels. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of specific time points (m), since the repeated measurements within subjects are almost correlated an…
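The paper's estimators are not available from the truncated abstract. As a rough sketch of the robust M-type idea applied to a smoothing spline (fit, downweight large residuals with Huber weights, refit), assuming SciPy's `UnivariateSpline` and our own helper names:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def robust_smoothing_spline(x, y, s, c=1.345, iters=5):
    """Cubic smoothing spline with Huber M-type weights: fit, compute residuals,
    downweight points with |r| > c*scale, refit; repeated a few times so gross
    outliers lose influence.  A sketch of the robust idea, not the paper's code."""
    w = np.ones_like(y)
    spl = UnivariateSpline(x, y, w=w, k=3, s=s)
    for _ in range(iters):
        r = y - spl(x)
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust MAD scale
        u = np.abs(r) / (c * scale + 1e-12)
        w = np.where(u <= 1.0, 1.0, 1.0 / u)                  # Huber weights
        spl = UnivariateSpline(x, y, w=w, k=3, s=s)
    return spl

rng = np.random.default_rng(5)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(0, 0.1, x.size)
y[::25] += 3.0                                 # inject gross outliers
spl = robust_smoothing_spline(x, y, s=x.size * 0.05)
```

A non-robust spline would be pulled toward the injected outliers; the reweighting keeps the fitted curve near the underlying sine.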
Publication Date
Fri Nov 01 2019
Journal Name
Iop Conference Series: Earth And Environmental Science
The comparison of several methods for calculating the degree of heritability and calculating the number of genes in maize (Zea mays L.). I. Agronomic traits
Abstract<p>The objective of the present study was to compare several methods for estimating the degree of heritability and calculating the number of genes using generation mean analysis of maize (<italic>Zea mays</italic> L.). The experiment was conducted at the field of the Field Crop Dept., College of Agric., Univ. of Baghdad, over several seasons: spring and fall 2009, 2010, spring 2011, and fall 2013. Six diverse inbred lines were crossed to produce F1, F2, BC1, and BC2 for four superior crosses. Broad-sense and narrow-sense heritability estimates were based on the variances of the different generations. The results showed that the four formulas used to estimate heritability differed in estimating the values o</p>
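The paper's four heritability formulas are truncated in the abstract, so the sketch below shows only one common textbook version of the calculation, on invented generation variances (none of these numbers come from the paper):

```python
import numpy as np

# Hypothetical generation variances and parental means for one cross.
# One common broad-sense heritability formula from generation variances:
#   H2 = (V_F2 - V_E) / V_F2,
# with the environmental variance estimated from non-segregating generations:
#   V_E = (V_P1 + V_P2 + 2*V_F1) / 4.
V_P1, V_P2, V_F1, V_F2 = 4.1, 3.8, 4.5, 11.2
mean_P1, mean_P2 = 142.0, 118.0

V_E = (V_P1 + V_P2 + 2 * V_F1) / 4
V_G = V_F2 - V_E                    # genotypic variance in F2
H2 = V_G / V_F2                     # broad-sense heritability

# Castle-Wright estimator for the effective number of genes controlling
# the trait: n = (mean_P1 - mean_P2)**2 / (8 * V_G).
n_genes = (mean_P1 - mean_P2) ** 2 / (8 * V_G)
```

With these illustrative inputs H² is about 0.62 and the effective gene number about 10; the Castle-Wright estimator assumes equal, additive gene effects, which is why papers typically compare several formulas.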
Publication Date
Thu Apr 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Branch and Bound Algorithm with Penalty Function Method for solving Non-linear Bi-level programming with application

The problem of bi-level programming is to minimize or maximize a target function that contains another target function within its constraints. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing it. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the branch and bound algorithm was preferable for solving the non-linear bi-level programming problem, because its results were better.
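Neither of the paper's two methods is specified in the abstract. To make the bi-level structure concrete, here is a toy problem of our own whose follower has a closed-form response, solved by crude Monte Carlo sampling of the leader's decision, in the spirit of the simulation the abstract mentions:

```python
import numpy as np

# A toy non-linear bi-level problem (hypothetical, for illustration only):
#   leader:   min_x  F(x, y*(x)) = (x - 1)**2 + y*(x)**2
#   follower: y*(x) = argmin_y (y - x**2)**2   =>   y*(x) = x**2
# Substituting the follower's best response reduces the leader's problem to
# a one-dimensional search, here done by Monte Carlo sampling.

def follower_best_response(x):
    return x ** 2                      # closed-form inner optimum

def leader_objective(x):
    y = follower_best_response(x)
    return (x - 1) ** 2 + y ** 2

rng = np.random.default_rng(6)
xs = rng.uniform(-2, 2, 10_000)        # Monte Carlo sample of leader decisions
vals = leader_objective(xs)
x_best = xs[np.argmin(vals)]           # near the true optimum x ~ 0.59
```

Real non-linear bi-level problems rarely admit a closed-form follower, which is exactly why branch-and-bound and penalty-function methods are compared in the paper.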

Publication Date
Wed Feb 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

     The objective of the study is to demonstrate which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables. The data come from the socio-economic survey of the family for the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of them explanatory, the dependent variable being the number of workers and the unemployed.

     A comparison of the two methods above made clear that the logistic regression model is better than the linear discriminant function written…
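The survey data are not available here, so the comparison below runs on synthetic two-class Gaussian data (all names and numbers hypothetical). It contrasts Fisher's linear discriminant with a plain gradient-descent logistic regression, the two model families the study compares:

```python
import numpy as np

rng = np.random.default_rng(7)
# Two Gaussian classes with a shared covariance (the setting where LDA is optimal).
n = 500
cov = [[1.0, 0.3], [0.3, 1.0]]
X0 = rng.multivariate_normal([0.0, 0.0], cov, n)
X1 = rng.multivariate_normal([2.0, 1.0], cov, n)
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# --- Fisher's linear discriminant: w = S_w^{-1} (mu1 - mu0) ---
mu0, mu1 = X0.mean(0), X1.mean(0)
Sw = np.cov(X0.T) + np.cov(X1.T)               # pooled within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)
thr = w @ (mu0 + mu1) / 2                      # threshold at the midpoint
lda_pred = (X @ w > thr).astype(float)

# --- Logistic regression by plain batch gradient descent ---
Xb = np.column_stack([np.ones(len(X)), X])
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ beta))           # predicted probabilities
    beta += 0.01 * Xb.T @ (y - p) / len(y)     # gradient ascent on log-likelihood
log_pred = (Xb @ beta > 0).astype(float)

lda_acc = (lda_pred == y).mean()
log_acc = (log_pred == y).mean()
```

On Gaussian data with equal covariances the two classifiers perform almost identically; logistic regression tends to pull ahead when those distributional assumptions fail, which is one plausible reason it won in the study.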
Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparing Between Shrinkage &Maximum likelihood Method For Estimation Parameters &Reliability Function With 3- Parameter Weibull Distribution By Using Simulation

The 3-parameter Weibull distribution is used as a model for failure, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.

On the practical side, a comparison was made between the shrinkage and maximum likelihood estimators for the parameters and the reliability function using simulation. We conclude that the shrinkage estimators for the parameters are better than the maximum likelihood estimators, but the maximum likelihood estimator is better for the reliability function, using the statistical measures MAPE and MSE for different sample sizes.

Note: ns = small sample; nm = medium sample.
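The estimators compared in the paper are not given in the abstract. The model itself is standard, so as a minimal sketch here is the 3-parameter Weibull reliability function with illustrative parameter values (shape < 1 gives the early-high, then decreasing, failure rate the abstract describes):

```python
import numpy as np

def weibull3_reliability(t, shape, scale, location):
    """Reliability function of the 3-parameter Weibull distribution:
    R(t) = exp(-((t - location)/scale)**shape) for t >= location, else 1."""
    t = np.asarray(t, dtype=float)
    z = np.clip((t - location) / scale, 0.0, None)   # no failures before location
    return np.exp(-z ** shape)

# Illustrative values only: shape < 1 means a decreasing hazard over time.
t = np.array([0.0, 1.0, 5.0, 10.0])
R = weibull3_reliability(t, shape=0.8, scale=4.0, location=0.5)
```

R(t) starts at 1 before the location (threshold) parameter and decreases monotonically; comparing estimates of this curve under MAPE and MSE is the comparison the abstract reports.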
