A Comparison Between Maximum Likelihood Method And Bayesian Method For Estimating Some Non-Homogeneous Poisson Processes Models

Abstract

The non-homogeneous Poisson process is a statistical subject of importance to other sciences and has wide application in different areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that do not occur at a fixed rate over time (events whose behaviour changes with time).

This research covers some of the basic concepts related to the non-homogeneous Poisson process and considers two of its models, the power law model and the Musa-Okumoto model. The maximum likelihood method and the Bayesian method were used to estimate the parameters of these models. To determine the better estimation method, a simulation study was carried out with four sample sizes (25, 50, 75, 100) to illustrate the effect of changing the sample size on the estimates, and four initial values were assumed for every parameter of the research models. The comparison between the estimation methods was based on the mean square error (MSE). The results indicated that the maximum likelihood method is the better and more efficient estimation method, as it gives the minimum mean square error (MSE).
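
As a minimal sketch of the kind of simulation comparison described above, the Python code below generates event times from a power law intensity λ(t) = αβt^(β−1), computes the maximum likelihood estimates, and reports the mean square error over repeated samples. The true parameter values, the number of replications, and the failure-truncated form of the likelihood are assumptions made for this illustration, not the paper's exact settings.

```python
import numpy as np

def simulate_power_law_nhpp(alpha, beta, n, rng):
    """Generate n event times of an NHPP with mean function m(t) = alpha * t**beta.

    If S_1 < S_2 < ... are arrival times of a unit-rate homogeneous Poisson
    process, then t_i = (S_i / alpha)**(1 / beta) follow the power law NHPP.
    """
    s = np.cumsum(rng.exponential(1.0, size=n))      # unit-rate HPP arrivals
    return (s / alpha) ** (1.0 / beta)               # invert the mean function

def mle_power_law(times):
    """Failure-truncated MLEs for the power law (Crow/AMSAA) model."""
    n = len(times)
    T = times[-1]                                    # observation ends at the last event
    beta_hat = n / np.sum(np.log(T / times[:-1]))    # shape estimate
    alpha_hat = n / T ** beta_hat                    # scale estimate
    return alpha_hat, beta_hat

rng = np.random.default_rng(0)
alpha_true, beta_true = 1.5, 0.8                     # assumed true values
for n in (25, 50, 75, 100):                          # sample sizes from the study
    est = np.array([mle_power_law(simulate_power_law_nhpp(alpha_true, beta_true, n, rng))
                    for _ in range(2000)])
    mse = ((est - [alpha_true, beta_true]) ** 2).mean(axis=0)
    print(f"n={n:3d}  MSE(alpha)={mse[0]:.4f}  MSE(beta)={mse[1]:.4f}")
```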

Publication Date
Sun Jul 01 2012
Journal Name
Journal Of Educational And Psychological Researches
A comparative study between the emotional responses in patients with blood pressure on a scale of music types characteristics compared with responses to some Healthy patients

The problem of the study and its significance:

Due to the continually increasing pressures of life, the constant pursuit of material necessities, and the frustrations that confront us daily, a growing number of cases of organic disease with psychological roots have emerged, made more severe by a lack of response to conventional treatments (drugs). This creates in patients a number of emotional disorders resulting from concern about the risk of the disease

 

This has led psychologists and doctors to take an interest in searchin

... Show More
Publication Date
Thu May 12 2022
Journal Name
Journal Of Economics And Administrative Sciences
Nonparametric Estimator (Histogram) For Estimating Probability Density Function

In this paper we introduce several estimators for the binwidth of the histogram estimator. We use a simulation technique to compare these estimators. In most cases, the results showed that the rule-of-thumb estimator is better than the other estimators.
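
For reference, the sketch below evaluates a few standard binwidth choices on simulated data: the normal-reference rule of thumb h = 3.49·s·n^(−1/3), the Freedman-Diaconis rule, and Scott's rule as implemented in NumPy. The particular rules and sample sizes shown are illustrative assumptions and not necessarily the exact set compared in the paper.

```python
import numpy as np

def rule_of_thumb_binwidth(x):
    """Normal-reference rule of thumb: h = 3.49 * s * n**(-1/3)."""
    return 3.49 * np.std(x, ddof=1) * len(x) ** (-1 / 3)

def freedman_diaconis_binwidth(x):
    """Freedman-Diaconis rule: h = 2 * IQR * n**(-1/3)."""
    q75, q25 = np.percentile(x, [75, 25])
    return 2 * (q75 - q25) * len(x) ** (-1 / 3)

rng = np.random.default_rng(1)
for n in (50, 200, 1000):
    x = rng.normal(size=n)                           # simulated sample
    edges_scott = np.histogram_bin_edges(x, bins="scott")
    print(f"n={n:4d}  rule-of-thumb h={rule_of_thumb_binwidth(x):.3f}  "
          f"FD h={freedman_diaconis_binwidth(x):.3f}  "
          f"Scott h={edges_scott[1] - edges_scott[0]:.3f}")
```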

Publication Date
Tue Sep 01 2020
Journal Name
Baghdad Science Journal
Bayesian and Non - Bayesian Inference for Shape Parameter and Reliability Function of Basic Gompertz Distribution

In this paper, some estimators of the unknown shape parameter and reliability function of the Basic Gompertz distribution (BGD) have been obtained, such as the MLE, UMVUE, and MINMSE, in addition to Bayesian estimators under the scale-invariant squared error loss function, assuming an informative prior represented by the Gamma distribution and a non-informative prior using Jeffreys prior. Using the Monte Carlo simulation method, these estimators of the shape parameter and R(t) have been compared based on mean squared errors and integrated mean squared errors, respectively.
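
As an illustration of such a Monte Carlo comparison, the sketch below samples from the Basic Gompertz distribution with density f(t; θ) = θ e^t exp(−θ(e^t − 1)) and compares the MLE of the shape parameter with a Bayes estimator under a Gamma prior. For simplicity the Bayes estimator shown is the posterior mean (ordinary squared error loss) rather than the paper's scale-invariant squared error loss, and the prior hyperparameters, sample size, and true θ are assumptions for the example.

```python
import numpy as np

def sample_basic_gompertz(theta, n, rng):
    """Inverse-CDF sampling; CDF is F(t) = 1 - exp(-theta * (exp(t) - 1)), t > 0."""
    u = rng.uniform(size=n)
    return np.log(1.0 - np.log(1.0 - u) / theta)

def mle_shape(t):
    """MLE of the shape parameter: n / sum(exp(t_i) - 1)."""
    return len(t) / np.sum(np.expm1(t))

def bayes_shape(t, a, b):
    """Posterior mean under a Gamma(a, b) prior (plain squared error loss)."""
    return (a + len(t)) / (b + np.sum(np.expm1(t)))

rng = np.random.default_rng(2)
theta_true, n, reps = 1.5, 50, 2000                  # assumed settings for the example
mle = np.array([mle_shape(sample_basic_gompertz(theta_true, n, rng)) for _ in range(reps)])
bay = np.array([bayes_shape(sample_basic_gompertz(theta_true, n, rng), a=2.0, b=1.0)
                for _ in range(reps)])
print("MSE(MLE)  =", np.mean((mle - theta_true) ** 2))
print("MSE(Bayes)=", np.mean((bay - theta_true) ** 2))
```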

Publication Date
Thu Dec 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
solving linear fractional programming problems (LFP) by Using denominator function restriction method and compare it with linear transformations method

 

Abstract

The use of modern scientific methods and techniques is considered an important topic for solving many of the problems that face various sectors, including industry, services, and health. The researcher always intends to use modern methods characterized by accuracy, clarity, and speed in reaching the optimal solution, while remaining easy to understand and apply.

The research presents a comparison between two solution methods for linear fractional programming models, namely the Charnes & Cooper linear transformation and the denominator function restriction method, applied to the oil heaters and gas cookers plant, where it was shown after reac

... Show More
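
To make the Charnes & Cooper transformation concrete, the sketch below converts a small linear fractional program into an equivalent linear program and solves it with scipy.optimize.linprog. The coefficient values are hypothetical illustration data, not figures from the oil heaters and gas cookers plant studied in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LFP:  maximize (c @ x + c0) / (d @ x + d0)
#                    subject to A @ x <= b, x >= 0
c, c0 = np.array([3.0, 2.0]), 1.0
d, d0 = np.array([1.0, 1.0]), 2.0
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([8.0, 9.0])

# Charnes & Cooper transformation: y = t*x with t = 1/(d @ x + d0), giving the LP
#   maximize c @ y + c0*t  s.t.  A @ y - b*t <= 0,  d @ y + d0*t = 1,  y, t >= 0
obj = -np.append(c, c0)                          # linprog minimizes, so negate
A_ub = np.hstack([A, -b.reshape(-1, 1)])         # A y - b t <= 0
b_ub = np.zeros(A.shape[0])
A_eq = np.append(d, d0).reshape(1, -1)           # d y + d0 t = 1
b_eq = np.array([1.0])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
y, t = res.x[:-1], res.x[-1]
x = y / t                                        # recover the original variables
print("x* =", x, " objective =", (c @ x + c0) / (d @ x + d0))
```
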
Publication Date
Sun Sep 01 2019
Journal Name
Baghdad Science Journal
A New Method for the Isolation and Purification of Trigonelline as Hydrochloride from Trigonella foenum-graecum L.

Separation of Trigonelline, the major alkaloid in fenugreek seeds, is difficult because the extract of these seeds usually contains Trigonelline, choline, mucilage, and steroidal saponins, in addition to some other substances. This study aims to isolate the quaternary ammonium alkaloid (Trigonelline) and choline, which have similar physicochemical properties, from fenugreek seeds (Trigonella foenum-graecum L.) by modifying the classical method. Seeds were defatted and then extracted with methanol. The presence of alkaloids was detected by using Mayer's and Dragendorff's reagents. In this work, Trigonelline was isolated with traces of choline by subsequent purification processes using analytical and preparative TLC techniques.

... Show More
Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
Study About The Robustness Of The Bayesian Criterion

In this research, an attempt has been made to investigate the robustness of the Bayesian information criterion for estimating the order of an autoregressive process when the error of this model follows a specific distribution, for different cases of the time series and various sample sizes, using simulation. The criterion has been studied for ten distributions: Normal, log-Normal, continuous uniform, Gamma, Exponential, Gumbel, Cauchy, Poisson, Binomial, and discrete uniform. Several conclusions and recommendations related to this subject were then reached, when the series residual variable follows each of (Poisson, Binomial, Exponential, Dis

... Show More
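
As a small illustration of this kind of experiment, the sketch below simulates an AR(2) series with non-normal (centred exponential) innovations and selects the order that minimises BIC from ordinary least squares fits. The true order, coefficients, candidate orders, and the simplified BIC formula are assumptions made for the example, not the paper's settings.

```python
import numpy as np

def simulate_ar(coefs, n, rng):
    """Simulate an AR(p) series with centred exponential innovations."""
    p = len(coefs)
    e = rng.exponential(1.0, size=n + 100) - 1.0     # non-normal, zero-mean errors
    y = np.zeros(n + 100)
    for t in range(p, n + 100):
        y[t] = np.dot(coefs, y[t - p:t][::-1]) + e[t]
    return y[100:]                                   # drop burn-in

def ar_bic(y, p):
    """Fit AR(p) by OLS and return BIC = n*log(RSS/n) + (p+1)*log(n)."""
    Y = y[p:]
    X = np.column_stack([y[p - k:-k] for k in range(1, p + 1)])  # lagged regressors
    X = np.column_stack([np.ones(len(Y)), X])                    # intercept
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = np.sum((Y - X @ beta) ** 2)
    n = len(Y)
    return n * np.log(rss / n) + (p + 1) * np.log(n)

rng = np.random.default_rng(3)
y = simulate_ar([0.6, -0.3], n=200, rng=rng)         # true order is 2
bics = {p: ar_bic(y, p) for p in range(1, 6)}
print("selected order:", min(bics, key=bics.get), bics)
```
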
Publication Date
Wed Feb 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then principal components to reduce the dimensionality of the variables. The data come from the socio-economic survey of the family for the province of Baghdad in 2012 and include a sample of 615 observations with 13 variables, 12 of which are explanatory variables, while the dependent variable is the number of workers and the unemployed.

A comparison of the two methods above was conducted, and it became clear from the comparison that the logistic regression model is better than the linear discriminant function written

... Show More
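
A minimal sketch of this type of comparison is shown below using scikit-learn on synthetic data: logistic regression and linear discriminant analysis are fitted first on the original features and then on principal-component scores, and their classification accuracies are compared. The synthetic data set, the number of retained components, and the accuracy criterion are assumptions for illustration; the paper itself uses the 2012 Baghdad household survey data and its own comparison criteria.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the survey data: 615 observations, 12 explanatory variables
X, y = make_classification(n_samples=615, n_features=12, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=5).fit(X_tr)                  # dimension reduction
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

for name, (A_tr, A_te) in {"original": (X_tr, X_te), "PCA scores": (Z_tr, Z_te)}.items():
    logit = LogisticRegression(max_iter=1000).fit(A_tr, y_tr)
    lda = LinearDiscriminantAnalysis().fit(A_tr, y_tr)
    print(f"{name:10s}  logistic acc={accuracy_score(y_te, logit.predict(A_te)):.3f}  "
          f"LDA acc={accuracy_score(y_te, lda.predict(A_te)):.3f}")
```
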
Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
Comparison between RSA and CAST-128 with Adaptive Key for Video Frames Encryption with Highest Average Entropy

Encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Data that have been encrypted are generally referred to as cipher text, while unencrypted data are known as plain text. Entropy can be used as a measure that gives the number of bits needed to code the data of an image; as the pixel values of an image are spread over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against RSA encryption of video frames, to determine which method is more accurate and gives the highest entropy. The first method is achieved by applying the "CAST-128" and

... Show More
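
The entropy measure used for the comparison can be computed directly from the byte (gray-level) histogram of a frame, as in the sketch below. Only the metric is illustrated; the CAST-128 and RSA encryption steps themselves are not reproduced, and the structured and random byte arrays are stand-ins for a plain frame and an encrypted frame.

```python
import numpy as np

def shannon_entropy(data: np.ndarray) -> float:
    """Shannon entropy (bits per byte) of a byte array: H = -sum(p * log2(p))."""
    counts = np.bincount(data.ravel().astype(np.uint8), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]                                     # ignore empty bins
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(4)
plain_frame = np.tile(np.arange(64, 192, dtype=np.uint8), 1024)            # structured "plain" frame
cipher_like = rng.integers(0, 256, size=plain_frame.size, dtype=np.uint8)  # stand-in ciphertext

print("entropy(plain)      =", round(shannon_entropy(plain_frame), 3))     # well below 8 bits
print("entropy(cipher-like)=", round(shannon_entropy(cipher_like), 3))     # close to 8 bits
```
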
Publication Date
Thu Apr 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Between Tree regression (TR), and Negative binomial regression (NBR) by Using Simulation.

In this paper, a comparison is made between the tree regression model and negative binomial regression. These models represent two types of statistical methods: the first is the nonparametric tree regression, which aims to divide the data set into subgroups, and the second is the parametric negative binomial regression, which is usually used when dealing with medical data, especially with large sample sizes. These methods are compared according to the average mean squared error (MSE), using a simulation experiment and taking different sample

... Show More
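
The sketch below illustrates this kind of comparison on simulated count data: a decision-tree regressor (scikit-learn) and a negative binomial regression fitted as a GLM (statsmodels) are trained on the same data and compared by MSE. The data-generating process, sample size, dispersion parameter, and tree depth are illustrative assumptions, not the settings of the paper's simulation.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)
n = 500
x = rng.uniform(0, 2, size=(n, 1))
mu = np.exp(0.5 + 1.2 * x[:, 0])                     # true mean of the counts

# Negative binomial counts via a Gamma-Poisson mixture (dispersion parameter r)
r = 2.0
y = rng.poisson(rng.gamma(shape=r, scale=mu / r))

X = sm.add_constant(x)
nb_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0 / r)).fit()
tree_fit = DecisionTreeRegressor(max_depth=4, random_state=0).fit(x, y)

mse_nb = np.mean((y - nb_fit.predict(X)) ** 2)
mse_tree = np.mean((y - tree_fit.predict(x)) ** 2)
print(f"MSE(negative binomial) = {mse_nb:.3f}   MSE(tree) = {mse_tree:.3f}")
```
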
Publication Date
Wed Feb 20 2019
Journal Name
Iraqi Journal Of Physics
A comparison between PCA and some enhancement filters for denoising astronomical images

This paper includes a comparison between denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG), where the procedure is iterated a second time to further improve the denoising performance, and other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the input noisy image, where each output pixel contains the median value in the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.

Experimental results show LPG-

... Show More
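
A minimal sketch of comparing such filters is given below: a synthetic grayscale image is corrupted with additive Gaussian noise and passed through Wiener, median, and Gaussian filters from SciPy, with the mean squared error against the clean image as the comparison measure. The synthetic image, noise level, and window sizes are assumptions for illustration, and the PCA-LPG stage itself is not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter
from scipy.signal import wiener

rng = np.random.default_rng(6)

# Synthetic grayscale image: a smooth gradient with a bright square
clean = np.tile(np.linspace(0.2, 0.8, 128), (128, 1))
clean[40:80, 40:80] = 1.0
noisy = clean + rng.normal(0.0, 0.1, clean.shape)    # constant-power additive noise

filters = {
    "Wiener (5x5)": wiener(noisy, mysize=5),
    "Median (5x5)": median_filter(noisy, size=5),
    "Gaussian (sigma=1)": gaussian_filter(noisy, sigma=1.0),
}
print("MSE(noisy)  =", round(float(np.mean((noisy - clean) ** 2)), 5))
for name, out in filters.items():
    print(f"MSE({name}) =", round(float(np.mean((out - clean) ** 2)), 5))
```
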