Comparison of Rasch Model Parameters for Complete and Missing Data under Different Methods of Processing Missing Data

The current study aims to compare the estimates of the Rasch model's parameters for missing and complete data under various methods of processing missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, for a group of (250) sixth-grade scientific-stage students in the Baghdad Al-Rusafa (1st, 2nd and 3rd) education directorates for the academic year (2018-2019). The researcher then relied on the one-parameter model to analyze the data and used the Bilog-MG3 program to check the hypotheses and the data and to assess their fit to the model, relying on the chi-squared value for each item at the (0.05) significance level. After that, the researcher estimated the parameters of the missing data, adopting a loss percentage of (10%), and used three methods to treat them (mean, regression, likelihood). The results showed that the comparison between the parameters of the complete and missing data under the three methods of processing missing data is in favor of the parameters of the complete data, and that the likelihood method is the most suitable method for treating the missing data.

     The conclusions, recommendations and suggestions have been drawn based on the findings.
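As a rough illustration of the missing-data treatments named in this abstract, the sketch below simulates a 250 x 20 binary response matrix, removes 10% of the responses, and compares crude Rasch item-difficulty estimates under mean imputation, regression imputation, and a likelihood-style treatment that uses only the observed responses. All data, the PROX-style difficulty formula, and the function names are illustrative assumptions, not the study's Bilog-MG3 procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- synthetic illustration (not the study's data): 250 persons x 20 binary items ---
n_persons, n_items = 250, 20
theta = rng.normal(0, 1, n_persons)              # person abilities
b = rng.normal(0, 1, n_items)                    # item difficulties
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
X_complete = rng.binomial(1, p).astype(float)    # complete response matrix

# --- impose 10% missingness at random, matching the loss percentage used in the study ---
mask = rng.random(X_complete.shape) < 0.10
X_missing = X_complete.copy()
X_missing[mask] = np.nan

def rasch_difficulty(X):
    """Crude Rasch difficulty estimate from item proportions (PROX-style logit),
    ignoring any remaining NaNs; a stand-in for Bilog-MG3 estimation."""
    p_i = np.nanmean(X, axis=0).clip(0.01, 0.99)
    d = -np.log(p_i / (1 - p_i))
    return d - d.mean()                          # centre difficulties at zero

# 1) mean imputation: replace a missing response with the item mean
X_mean = np.where(np.isnan(X_missing), np.nanmean(X_missing, axis=0), X_missing)

# 2) regression imputation: predict each item from the person's total score
totals = np.nansum(X_missing, axis=1)
X_reg = X_missing.copy()
for j in range(n_items):
    obs = ~np.isnan(X_missing[:, j])
    slope, intercept = np.polyfit(totals[obs], X_missing[obs, j], 1)
    X_reg[~obs, j] = np.clip(slope * totals[~obs] + intercept, 0, 1)

# 3) likelihood-style treatment: estimate from the observed responses only
for name, X in [("complete", X_complete), ("mean", X_mean),
                ("regression", X_reg), ("likelihood", X_missing)]:
    err = np.abs(rasch_difficulty(X) - rasch_difficulty(X_complete)).mean()
    print(f"{name:11s} mean |difficulty difference| = {err:.3f}")
```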

Publication Date
Fri Jan 01 2016
Journal Name
Statistics And Its Interface
Search for risk haplotype segments with GWAS data by use of finite mixture models

The region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled…
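A minimal sketch of the second-stage association test described above, assuming the haplotypes have already been inferred and grouped in stage one; the group labels and counts are invented for illustration, and the chi-square contingency test stands in for whatever test statistic the paper actually uses.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Stage-two only: haplotypes are assumed already inferred and grouped; test whether
# group counts differ between cases and controls. Counts below are made up.
haplotype_groups = ["H1", "H2", "H3", "rare pool"]
case_counts    = np.array([120, 60, 15, 5])
control_counts = np.array([ 90, 80, 25, 5])

table = np.vstack([case_counts, control_counts])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")

# A haplotype group over-represented in cases relative to its expected count
# would be flagged as a candidate risk haplotype.
risk_flags = (table[0] - expected[0]) > 0
for grp, flag in zip(haplotype_groups, risk_flags):
    print(grp, "over-represented in cases" if flag else "not over-represented")
```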
Publication Date
Fri Mar 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Robust Two-Step Estimation and Approximation by Local Polynomial Kernel for the Time-Varying Coefficient Model with Balanced Longitudinal Data

In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent among different subjects, they are mostly correlated within each subject; the applied technique is the local linear polynomial kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method has been used to estimate the coefficient functions by means of the former technique. Since the two-…
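The two-step idea sketched below is a generic stand-in, not the paper's exact estimator: step one computes raw cross-sectional estimates of the time-varying coefficient at each of the (m) time points, and step two smooths them with a local linear kernel fit. The sample sizes, the Gaussian kernel, and the bandwidth are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy balanced longitudinal data (made-up sizes): n subjects, m common time points.
n, m = 60, 25
t = np.linspace(0, 1, m)
beta_true = np.sin(2 * np.pi * t)                 # time-varying coefficient
x = rng.normal(size=(n, m))
y = beta_true * x + rng.normal(scale=0.5, size=(n, m))

# Step 1: raw estimate of beta at each time point by cross-sectional least squares.
beta_raw = np.sum(x * y, axis=0) / np.sum(x * x, axis=0)

# Step 2: refine the raw estimates with a local linear kernel smoother (LLPK).
def local_linear(t_grid, t_obs, y_obs, h):
    """Local linear fit at each point of t_grid with a Gaussian kernel, bandwidth h."""
    out = np.empty_like(t_grid)
    for k, t0 in enumerate(t_grid):
        w = np.exp(-0.5 * ((t_obs - t0) / h) ** 2)      # kernel weights
        X = np.column_stack([np.ones_like(t_obs), t_obs - t0])
        W = np.diag(w)
        coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_obs)
        out[k] = coef[0]                                 # intercept = fitted value at t0
    return out

beta_hat = local_linear(t, t, beta_raw, h=0.08)
print("mean squared error of smoothed coefficient:",
      np.mean((beta_hat - beta_true) ** 2).round(4))
```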
Publication Date
Thu Dec 01 2022
Journal Name
Iraqi Journal Of Statistical Sciences
Using the Coiflets and Daubechies Wavelet Transforms to Reduce Data Noise for a Simple Experiment

In this research, a simple experiment in the field of agriculture was studied in terms of the effect of out-of-control noise arising from several causes, including the effect of environmental conditions on the observations of agricultural experiments. Discrete wavelet transforms were used, specifically the Coiflets transform of order 1 to 2 and the Daubechies transform of order 2 to 3, based on two levels of decomposition, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules, and the wavelet transform methods were compared using real data from an experiment of 26 observations. The application was carried out through a program written in MATLAB. The researcher concluded that…
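A hedged sketch of the denoising pipeline described above using PyWavelets: decompose with a Coiflet or Daubechies wavelet, threshold the detail coefficients with the hard, soft, or (non-negative) garrote rule, and reconstruct. The toy signal is longer than the study's 26 observations to avoid boundary effects, and the universal threshold is an assumption rather than the rule used in the paper.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(2)

# Toy noisy series (the study's real series had only 26 observations).
signal = np.sin(np.linspace(0, 3 * np.pi, 128))
noisy = signal + rng.normal(scale=0.3, size=signal.size)

def wavelet_denoise(data, wavelet="coif1", level=3, rule="soft"):
    """Decompose, threshold the detail coefficients, and reconstruct."""
    coeffs = pywt.wavedec(data, wavelet, level=level)
    # universal threshold estimated from the finest detail level
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(data)))
    coeffs[1:] = [pywt.threshold(c, thr, mode=rule) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(data)]

for wavelet in ("coif1", "coif2", "db2", "db3"):     # Coiflets 1-2, Daubechies 2-3
    for rule in ("hard", "soft", "garrote"):         # garrote ~ non-negative garrote
        est = wavelet_denoise(noisy, wavelet=wavelet, level=3, rule=rule)
        mse = np.mean((est - signal) ** 2)
        print(f"{wavelet:5s} {rule:7s} MSE = {mse:.4f}")
```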
Publication Date
Thu Mar 30 2017
Journal Name
Iraqi Journal Of Pharmaceutical Sciences (P-ISSN 1683-3597, E-ISSN 2521-3512)
Study of the Prevalence of Helicobacter pylori Infection by Different Diagnostic Methods

A total of 41 patients with gastroduodenal symptoms (showing signs of inflammation with or without duodenal ulcer), 21 males (51.2%) and 20 females (48.8%) with ages ranging from (20-80) years, underwent gastrointestinal endoscopy at the Baghdad Teaching Hospital internal disease clinical laboratory between February and June 2009. Biopsy specimens of the antrum, gastric fundus, and duodenal bulb were examined by the following methods: rapid urease test, Giemsa-stained sections to detect the bacteria, and Haematoxylin and Eosin stained sections for pathological study, which are considered the gold standard methods. Sera or plasma from these patients were tested by immunochromatography (ICM), serological m…
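For readers unfamiliar with how such diagnostic methods are compared, the short sketch below computes sensitivity, specificity, and prevalence of a non-invasive test against the histological gold standard; the 2 x 2 counts are hypothetical and are not the study's results.

```python
# Hypothetical 2x2 counts for a non-invasive test (e.g. ICM serology) scored
# against the histology-based gold standard; illustrative only, summing to 41.
tp, fp, fn, tn = 20, 3, 4, 14

sensitivity = tp / (tp + fn)            # true positives / all gold-standard positives
specificity = tn / (tn + fp)            # true negatives / all gold-standard negatives
prevalence  = (tp + fn) / (tp + fp + fn + tn)

print(f"sensitivity = {sensitivity:.2%}")
print(f"specificity = {specificity:.2%}")
print(f"prevalence (by gold standard) = {prevalence:.2%}")
```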
Publication Date
Tue Mar 30 2021
Journal Name
Wasit Journal Of Computer And Mathematics Science
Dynamic Data Replication for Higher Availability and Security

Data security is a key concern in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are widespread, and there is a need to enforce higher levels of security, privacy, and integrity. The affected sectors include e-government, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and trustworthiness is prominent in both network-based and private environments. This research manuscript demonstrates the effective use of a security-based methodology implemented with blockchain…
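As a loose illustration of how blockchain-style integrity checks can support replicated data, the sketch below hash-chains a log of records so that a tampered replica fails verification; it is a generic example under assumed record contents, not the system proposed in the manuscript.

```python
import hashlib
import json

def chain(records):
    """Return a list of blocks, each linking to the hash of the previous block."""
    blocks, prev_hash = [], "0" * 64
    for rec in records:
        payload = json.dumps({"prev": prev_hash, "data": rec}, sort_keys=True)
        block_hash = hashlib.sha256(payload.encode()).hexdigest()
        blocks.append({"prev": prev_hash, "data": rec, "hash": block_hash})
        prev_hash = block_hash
    return blocks

def verify(blocks):
    """Recompute every hash; any modified or reordered record is reported."""
    prev_hash = "0" * 64
    for i, b in enumerate(blocks):
        payload = json.dumps({"prev": prev_hash, "data": b["data"]}, sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != b["hash"] or b["prev"] != prev_hash:
            return f"replica corrupted at block {i}"
        prev_hash = b["hash"]
    return "replica consistent"

records = [{"id": 1, "value": "alpha"}, {"id": 2, "value": "beta"}]  # invented records
replica = chain(records)
print(verify(replica))                      # replica consistent
replica[1]["data"]["value"] = "tampered"    # simulate tampering on one replica
print(verify(replica))                      # replica corrupted at block 1
```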
Publication Date
Mon Sep 30 2019
Journal Name
College Of Islamic Sciences
Terms that agree in wording but differ in meaning between the grammarians and the usul scholars

The sciences are linked to one another by bonds of shared origins and branches, and are even close to one another, forming a pattern of scientific integration. Perhaps the most prominent sciences to show such overlap and interdependence are grammar and jurisprudence with its principles (usul). This interdependence did not stop at the epistemological foundations of knowledge; it extended to the branches, the methods of derivation and ijtihad, and on to the terminology, so that similar terms are found in both sciences, terms alike in wording and meaning and even in purpose. Yet there is no lack of terms that are alike in wording but differ in meaning, a difference imposed by the specificity of each science; however, many terms approached…
Publication Date
Mon Oct 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of the statistical methods used to forecast the size of the Iraqi GDP for the two sectors (public and private) for the period (2016-2025)

Gross domestic product (GDP) is an important measure of the size of an economy's production. Economists use it to determine the extent of decline and growth in countries' economies, and it is also used to rank countries and compare them with each other. The research aims at describing and analyzing the GDP of the public and private sectors during the period from 1980 to 2015 and then forecasting GDP in subsequent years up to 2025. To achieve this goal, two methods were used: the first is linear and nonlinear regression, and the second is time series analysis using Box-Jenkins models, with the statistical packages (Minitab17) and (GRETLW32) used to extract the results; the two methods were then compared, T…
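The regression side of the comparison can be illustrated with a short sketch: fit linear and simple nonlinear (quadratic) trends to an annual GDP series and extrapolate to 2025. The GDP values are invented placeholders, not the study's Iraqi data, and the Box-Jenkins side would normally be fitted with an ARIMA routine in Minitab or a similar package.

```python
import numpy as np

# Invented annual GDP placeholders for 1980-2015 (not the study's data).
years = np.arange(1980, 2016)
noise = np.random.default_rng(3).normal(scale=5, size=years.size)
gdp = 50 + 2.5 * (years - 1980) + 0.05 * (years - 1980) ** 2 + noise

t = years - 1980
lin_coef = np.polyfit(t, gdp, 1)                  # linear trend
quad_coef = np.polyfit(t, gdp, 2)                 # simple nonlinear (quadratic) trend

future = np.arange(2016, 2026) - 1980
forecast_lin = np.polyval(lin_coef, future)
forecast_quad = np.polyval(quad_coef, future)

# In-sample fit comparison by mean squared error, mirroring how competing
# approaches would be compared before choosing a forecast.
mse_lin = np.mean((np.polyval(lin_coef, t) - gdp) ** 2)
mse_quad = np.mean((np.polyval(quad_coef, t) - gdp) ** 2)
print(f"linear MSE = {mse_lin:.2f}, quadratic MSE = {mse_quad:.2f}")
print("2025 forecasts:", forecast_lin[-1].round(1), forecast_quad[-1].round(1))
```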
Publication Date
Mon Jan 28 2019
Journal Name
Journal Of The College Of Education For Women
Applying the gravity model to trips between the Najaf center and its settlements

The study measured flow rates and the interaction between the settlements served by applying the gravity model, based on population size, between Najaf city and the rest of the served settlements, using three impedance functions of distance, time, and cost. An increase in the interaction index was recorded with some settlements, such as Kufa, Abbasiya, and Manathira, while the indicator varied in other settlements. When the gravity model was applied based on trips and socio-economic characteristics, the accuracy rate was more pronounced.
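A minimal worked example of the gravity model referred to above, with population sizes, distances, and calibration constants assumed purely for illustration.

```python
# Gravity-model interaction index T_ij = k * P_i * P_j / d_ij**beta, with distance as
# the impedance; time or cost can be substituted in the denominator. All figures below
# are illustrative assumptions, not the study's measured values.
settlements = {"Kufa": 180_000, "Abbasiya": 45_000, "Manathira": 35_000}
najaf_population = 750_000
distance_km = {"Kufa": 10, "Abbasiya": 25, "Manathira": 30}

k, beta = 0.0001, 2.0    # scaling constant and distance-decay exponent (assumed)

for name, pop in settlements.items():
    t_ij = k * najaf_population * pop / distance_km[name] ** beta
    print(f"Najaf - {name}: interaction index = {t_ij:,.0f}")
```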

Publication Date
Tue Jan 01 2019
Journal Name
Ieee Access
Implementation of Univariate Paradigm for Streamflow Simulation Using Hybrid Data-Driven Model: Case Study in Tropical Region

Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparing Between Shrinkage & Maximum Likelihood Methods for Estimating the Parameters & Reliability Function of the 3-Parameter Weibull Distribution Using Simulation

The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.

On the practical side, a comparison was made between the shrinkage and maximum likelihood estimators of the parameters and the reliability function using simulation. We conclude that the shrinkage estimators of the parameters are better than the maximum likelihood estimators, but the maximum likelihood estimator of the reliability function is the better one, based on the statistical measures (MAPE) and (MSE) and for different sample sizes.

Note: ns = small sample; nm = medium sample.

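The comparison described above can be mimicked with a small simulation: draw samples from a 3-parameter Weibull distribution, estimate the shape parameter by maximum likelihood (scipy's weibull_min.fit) and by a simple shrinkage estimator that pulls the MLE toward a prior guess, and compare their MSEs for a small and a medium sample size. The shrinkage weight, prior value, true parameters, and sample sizes are assumptions, not the paper's choices.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)

# Assumed true parameters and a toy shrinkage rule (not the paper's estimator).
shape_true, loc_true, scale_true = 1.5, 0.5, 2.0
prior_shape, weight = 1.0, 0.3            # shrink 30% of the way toward the prior guess

def one_replication(n):
    data = weibull_min.rvs(shape_true, loc=loc_true, scale=scale_true, size=n,
                           random_state=rng)
    c_mle, loc_mle, scale_mle = weibull_min.fit(data)        # 3-parameter ML fit
    c_shrink = (1 - weight) * c_mle + weight * prior_shape   # shrinkage estimator
    return c_mle, c_shrink

for n in (15, 50):                        # small and medium sample sizes (ns, nm)
    estimates = np.array([one_replication(n) for _ in range(200)])
    mse = np.mean((estimates - shape_true) ** 2, axis=0)
    print(f"n={n:3d}  MSE(MLE)={mse[0]:.4f}  MSE(shrinkage)={mse[1]:.4f}")
```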