Use Generalized Pareto Survival Models to Estimate Optimal Survival Time for Myocardial Infarction Patients

Survival analysis is a modern family of methods built on the fact that the dependent variable represents the time until the event of interest. Many survival models describe the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and widely used. The Cox model consists of two components: a parametric part that does not depend on survival time and a nonparametric part that does, which is why it is classed as a semi-parametric model. Fully parametric models, by contrast, rest on the parameters of the time-to-event distribution; examples are the Exponential, Weibull, and Log-logistic models. This research adopts Bayesian optimality criteria to achieve an optimal design for estimating the optimal survival time of myocardial infarction patients, by constructing a parametric survival model on the probability distribution of their survival times. Myocardial infarction is among the most serious diseases threatening human life and a leading cause of death worldwide, and a patient's survival time varies with the factor or factors causing the injury; risk factors include diabetes, high blood pressure, high cholesterol, psychological stress, and obesity. We therefore estimated the optimal survival time by modelling the relationship between these risk factors and patient survival time, and found the optimal survival time to be 18 days.
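The parametric route the abstract describes can be sketched in code: fit a Weibull survival model by maximum likelihood and read off the fitted median survival time. A minimal sketch using SciPy's `weibull_min`; the data, shape, and scale are synthetic assumptions, not the study's patient data or its 18-day result.

```python
import numpy as np
from scipy import stats

# Hypothetical survival times (days); synthetic, not the study's patients.
rng = np.random.default_rng(0)
times = stats.weibull_min.rvs(c=1.8, scale=20.0, size=200, random_state=rng)

# Maximum-likelihood Weibull fit with location fixed at zero (floc=0),
# since survival times are non-negative.
shape, loc, scale = stats.weibull_min.fit(times, floc=0)

# Median survival under the fitted model: scale * ln(2)^(1/shape).
median_survival = scale * np.log(2.0) ** (1.0 / shape)
print(round(shape, 2), round(median_survival, 1))
```

A Weibull shape above 1 indicates a hazard that rises with time, which is one way such a model distinguishes patient groups.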

Publication Date
Sat Dec 01 2012
Journal Name
Journal Of Economics And Administrative Sciences
Comparison between the empirical Bayes method and the method of moments to estimate the affiliation parameter in clinical trials using simulation

In this research the empirical Bayes method is used to estimate the affiliation parameter in clinical trials, and the result is compared with the moment estimate of the same parameter using Monte Carlo simulation. We assume the observations follow a binomial distribution while the unknown random parameter follows a beta distribution. Using mean squared error (MSE) across different sample sizes, we conclude that the empirical Bayes estimator of the random affiliation parameter is the more efficient.
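The comparison in the abstract can be sketched as a beta-binomial simulation: draw binomial counts whose success probabilities come from a beta prior, estimate the prior hyperparameters by the method of moments, shrink each raw proportion toward the prior mean, and compare mean squared errors against the truth. All values below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
k_trials, n_groups = 30, 500
p_true = rng.beta(2.0, 5.0, size=n_groups)   # latent random parameters
x = rng.binomial(k_trials, p_true)           # observed successes per group
p_hat = x / k_trials                         # raw (moment) estimate

# Method-of-moments fit of a Beta(a, b) prior to the observed proportions.
m, v = p_hat.mean(), p_hat.var()
common = m * (1.0 - m) / v - 1.0
a, b = m * common, (1.0 - m) * common

# Empirical Bayes posterior mean: shrink each estimate toward the prior mean.
p_eb = (x + a) / (k_trials + a + b)

mse_raw = np.mean((p_hat - p_true) ** 2)
mse_eb = np.mean((p_eb - p_true) ** 2)
print(mse_eb < mse_raw)   # shrinkage reduces MSE on average
```

The gain comes from pooling: each group's estimate borrows strength from all the others through the fitted prior.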

Publication Date
Sun Jan 01 2006
Journal Name
Journal Of Educational And Psychological Researches
The Effect of the Heart-Ball Game in Reducing Kindergarten Children's Anger

The research problem and its importance:

Anger is one of the common emotional manifestations of early childhood, owing to the many situations in this stage that provoke a child's anger. The child also quickly learns, in many cases, that anger and crying are an easy way to get what he wants. Specialists in this field consider children's temper tantrums a normal phenomenon common to all children, but when the tantrums become violent […]

Publication Date
Sat Jan 10 2015
Journal Name
British Journal Of Applied Science & Technology
The Use of Cubic Bezier Interpolation, Biorthogonal Wavelet and Quadtree Coding to Compress Color Images

In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without extra storage. The proposed method applies a cubic Bezier interpolation (CBI) surface representation over wide areas of the image to prune the component showing large-scale variation. The fitted cubic Bezier surface is then subtracted from the image signal to obtain the residue component, and a bi-orthogonal wavelet transform is applied to decompose that residue. Scalar quantization and quadtree coding are applied to the resulting wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain e […]
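The cubic Bezier step can be illustrated with a bicubic patch evaluated from a 4×4 control grid via Bernstein polynomials. The control values here are toy numbers, not image data, and the paper's fitting, subtraction, and coding stages are omitted.

```python
import numpy as np

def bernstein3(t):
    """The four cubic Bernstein basis values at parameter t in [0, 1]."""
    s = 1.0 - t
    return np.array([s**3, 3*s*s*t, 3*s*t*t, t**3])

def bezier_patch(ctrl, u, v):
    """Evaluate a bicubic Bezier patch: B(u)^T @ ctrl @ B(v)."""
    return bernstein3(u) @ ctrl @ bernstein3(v)

ctrl = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 control grid
corner = bezier_patch(ctrl, 0.0, 0.0)            # interpolates ctrl[0, 0]
center = bezier_patch(ctrl, 0.5, 0.5)            # smooth interior blend
print(corner, center)
```

Because the patch reproduces the smooth, large-scale part of a signal from only 16 coefficients, subtracting it leaves a small-amplitude residue that compresses well.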

Publication Date
Tue Oct 30 2001
Journal Name
3rd Jordanian Civil Engineering Conference, 29-31 Oct. 2001
The Use of the F.E.M. to Study the Performance of Stone Columns in Soft Soil

In this paper, the penetration of the stone column was investigated in order to find the minimum column length above which further increase gives little advantage. The effect of using different column materials is also studied; the materials used are granular, with different angles of internal friction (φ). The results indicate that the effect of the stone column becomes constant when the ratio of the soft clay layer thickness to the stone column's diameter exceeds 15, and that a pronounced improvement is obtained when the angle of internal friction of the stone column material is increased.

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
USE OF MODIFIED MAXIMUM LIKELIHOOD METHOD TO ESTIMATE PARAMETERS OF THE MULTIPLE LINEAR REGRESSION MODEL

Publication Date
Tue Dec 30 2008
Journal Name
Al-kindy College Medical Journal
Use of Ascitic Fluid Cholesterol as a Marker to Differentiate between Types of Ascitic Fluid

Background: Cholesterol is high in ascitic fluid in malignancy and other causes of exudates.
Objective: To use cholesterol as a marker to differentiate between exudative and transudative ascitic fluid and to compare it with other routine parameters.
Methods: Twenty-eight patients were included in this study: 17 females with a mean age of 41.9 years and 11 males with a mean age of 48.2 years. The patients were divided into group I (suspected transudate) and group II (suspected exudate) according to history and clinical examination. Ascitic fluid samples were sent for total protein, albumin, and cholesterol measurement; blood samples were sent for serum protein and albumin measurement.
Results: In this […]
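The marker idea reduces to a threshold rule on fluid cholesterol. As a sketch only: the 45 mg/dL cutoff and the sample values below are assumptions for illustration, not figures reported by this study.

```python
# Assumed cutoff for illustration only; not a value reported by this study.
CHOLESTEROL_CUTOFF_MG_DL = 45.0

def classify_ascitic_fluid(cholesterol_mg_dl):
    """Label fluid 'exudate' when cholesterol exceeds the assumed cutoff."""
    if cholesterol_mg_dl > CHOLESTEROL_CUTOFF_MG_DL:
        return "exudate"
    return "transudate"

samples = {"patient_a": 62.0, "patient_b": 21.0}   # hypothetical values
labels = {name: classify_ascitic_fluid(c) for name, c in samples.items()}
print(labels)
```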

Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Robust M-Estimates with Cubic Smoothing Splines for the Time-Varying Coefficient Model for Balanced Longitudinal Data

In this research, a comparison is made between the robust M-estimators for the cubic smoothing splines technique, which avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for cubic smoothing splines. Two criteria of differentiation, MADE and WASE, are used for different sample sizes and contamination levels to estimate the time-varying coefficient functions for balanced longitudinal data. Such data consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points, since the repeated measurements within the subjects are almost connected […]
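The robust-M idea can be sketched with a Huber M-estimator fitted by iteratively reweighted least squares (IRLS). For brevity the paper's spline setting is replaced by a straight-line fit on contaminated synthetic data; the downweighting step is the same idea.

```python
import numpy as np

# Synthetic contaminated data; a straight line stands in for the splines.
rng = np.random.default_rng(2)
n = 200
x = np.linspace(0.0, 1.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, n)
y[::20] += 5.0                                     # inject 5% outliers

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]        # ordinary LS start
c = 1.345                                          # Huber tuning constant
for _ in range(50):
    r = y - X @ beta
    s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale via MAD
    w = np.minimum(1.0, c * s / np.maximum(np.abs(r), 1e-12))  # Huber weights
    sw = np.sqrt(w)                                # IRLS: weighted LS step
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
print(np.round(beta, 2))   # close to the true (2, 3) despite outliers
```

Observations with large residuals get weights well below 1, so the outliers barely move the fit, which is exactly what the ordinary least-squares start cannot guarantee.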

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increased interest in recent years in determining the optimum sample size, both to obtain sufficient accuracy of estimation and to obtain high-precision parameters when evaluating a large number of diagnostic tests at the same time. In this research, two methods are used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. A nonlinear logistic regression model is then estimated with the sample size given by each method, using an artificial neural network (ANN), which gives a high-precision estimate commensurate with the dat […]
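The Bennett-inequality route to a sample size can be sketched directly: find the smallest n at which the Bennett tail bound on the deviation of a sample mean drops below a chosen risk level. The tolerance, risk, variance bound, and range bound below are illustrative choices, not values from the paper.

```python
import math

def bennett_sample_size(eps, delta, sigma2, b):
    """Smallest n with 2*exp(-(n*sigma2/b**2) * h(b*eps/sigma2)) <= delta,
    where h(u) = (1 + u)*log(1 + u) - u (Bennett's inequality)."""
    u = b * eps / sigma2
    h = (1.0 + u) * math.log(1.0 + u) - u
    return math.ceil(math.log(2.0 / delta) * b * b / (sigma2 * h))

# Illustrative choices: tolerance 0.05, risk 0.05, variance bound 0.25,
# range bound 1 (e.g. observations confined to an interval of width 1).
n = bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.25, b=1.0)
print(n)
```

Because the bound uses the variance as well as the range, it typically demands fewer observations than the range-only Hoeffding bound when the variance is small.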

Publication Date
Tue Jun 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Using ARIMA models to forecast the volume of cargo handled in Iraqi ports: an applied study in the General Company of Iraqi Ports

Time series analysis is an important statistical method for analyzing phenomena, practices, and events across all areas over specific time periods and for predicting future values, contributing a rough estimate of the status under study. The study therefore adopted ARIMA models to forecast the volume of cargo handled in four ports (Umm Qasr Port, Khor Al Zubair Port, Abu Flus Port, and Maqal Port). Monthly data on the volume of cargo handled for the years 2006-2018 were collected (156 observations). The study found that the most efficient model is ARIMA(1,1,1).

The volume of go […]
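The ARIMA(1,1,1) idea can be sketched in simplified form: difference the series once (the "I(1)" step), fit an AR(1) to the differences by least squares, and forecast one step ahead. This is a stand-in for the full ARIMA likelihood fit the study used, on a synthetic series, not the ports' actual figures.

```python
import numpy as np

# Synthetic monthly series standing in for the cargo data (156 points,
# matching the 2006-2018 monthly span); not the ports' actual figures.
rng = np.random.default_rng(3)
n = 156
series = np.linspace(100.0, 250.0, n) + rng.normal(0.0, 5.0, n)

d = np.diff(series)                     # the "I(1)" differencing step
dc = d - d.mean()                       # center before fitting AR(1)
phi = (dc[:-1] @ dc[1:]) / (dc[:-1] @ dc[:-1])   # least-squares AR(1) slope
next_diff = d.mean() + phi * dc[-1]     # one-step-ahead forecast of the diff
forecast = series[-1] + next_diff       # undifference back to the level
print(round(phi, 2), round(forecast, 1))
```

Differencing removes the trend so the remaining dynamics can be modeled as a stationary process; the forecast is then rebuilt on the original level.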

Publication Date
Tue Nov 06 2018
Journal Name
Iraqi National Journal Of Nursing Specialties
Determination of Contributing Risk Factors to Adult Nephrolithiasis Patients

Objectives: To determine the contributing risk factors to adult nephrolithiasis patients.
Methodology: A descriptive study was conducted to determine the risk factors contributing to adult nephrolithiasis, from December 2007 to September 2008. A purposive (non-probability) sample of (100) patients with nephrolithiasis was selected from those admitted to the hospitals and attending the Urology Consultation Clinic and the Extracorporeal Shock Wave Lithotripsy Department. The study instrument consists of two parts: the first relates to the patients' demographic variables and the second is constructed to serve the purpose of the study. The total number of items in the questionnaire was (85).
