Comparison between Rasch Model Parameters for Complete and Missing Data by Different Methods of Processing Missing Data

The current study aims to compare the estimates of the Rasch model's parameters for complete and missing data under various methods of handling missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, and administering it to a group of (250) sixth-grade scientific-stream students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on the one-parameter model to analyze the data and used the Bilog-MG3 software to check the assumptions and the fit of the data to the model, relying on the chi-squared value for each item at the (0.05) level. After that, the researcher estimated the parameters of the missing data, assuming a loss rate of (10%), and treated the missingness in three ways (mean imputation, regression imputation, and maximum likelihood). The results showed that the comparison between the parameters of the complete and missing data under the three treatment methods was in favor of the parameters of the complete data, and that the maximum-likelihood method was the most suitable for treating the missing data.

     The conclusions, recommendations and suggestions have been drawn based on the findings.

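As a minimal sketch of the missing-data treatments compared above (Python with numpy, pandas and scikit-learn; the simulated 250×20 response matrix, the completely-at-random 10% loss and all variable names are illustrative assumptions, not the study's actual data or its BILOG-MG procedure):

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

rng = np.random.default_rng(0)

# Simulate Rasch (one-parameter logistic) responses: 250 persons x 20 items.
theta = rng.normal(0, 1, size=250)            # person abilities
b = rng.normal(0, 1, size=20)                 # item difficulties
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
responses = (rng.random((250, 20)) < p).astype(float)

# Remove 10% of the responses completely at random.
mask = rng.random(responses.shape) < 0.10
incomplete = responses.copy()
incomplete[mask] = np.nan

# Treatment 1: mean imputation (item means).
mean_filled = SimpleImputer(strategy="mean").fit_transform(incomplete)

# Treatment 2: regression imputation (each item regressed on the others);
# the imputed values are continuous, which is acceptable for illustration.
reg_filled = IterativeImputer(max_iter=10, random_state=0).fit_transform(incomplete)

# The imputed matrices would then be passed to Rasch estimation software
# (the study used BILOG-MG 3) and the item parameters compared with those
# estimated from the complete matrix `responses`.
print(pd.DataFrame(mean_filled).head())
```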
Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Robust M-Estimation with Cubic Smoothing Splines for the Time-Varying Coefficient Model for Balanced Longitudinal Data

In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination of the errors in the data, and the traditional estimation method of the cubic smoothing splines technique, using two comparison criteria (MADE and WASE) for different sample sizes and levels of contamination, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data consist of observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specific time points, since the repeated measurements within subjects are almost always correlated…

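A rough sketch of the idea of robust M-type versus classical smoothing-spline fitting (Python with scipy; the Huber weighting scheme, the synthetic curve and the smoothing level are assumptions for illustration, not the paper's estimator or its MADE/WASE comparison):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)

# Synthetic longitudinal-style data: one time-varying coefficient curve
# observed at m = 30 time points with heavy-tailed (contaminated) noise.
t = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * t) + rng.standard_t(df=2, size=t.size) * 0.2

def huber_weights(resid, c=1.345):
    """Huber M-estimation weights: 1 inside the cutoff, c/|r| outside."""
    s = np.median(np.abs(resid)) / 0.6745 + 1e-12   # robust scale (MAD)
    u = np.abs(resid) / s
    return np.where(u <= c, 1.0, c / np.maximum(u, 1e-12))

# Classical smoothing spline (all weights equal).
classical = UnivariateSpline(t, y, s=len(t) * 0.05)

# Robust M-type smoothing spline: iteratively reweighted fit.
w = np.ones_like(t)
for _ in range(10):
    fit = UnivariateSpline(t, y, w=w, s=len(t) * 0.05)
    w = huber_weights(y - fit(t))

print("classical fit at t=0.5:", classical(0.5))
print("robust fit at t=0.5:   ", fit(0.5))
```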
Publication Date
Fri Aug 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Efficiency Measurement Model for Postgraduate Programs and Undergraduate Programs by Using Data Envelopment Analysis

Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements of the educational process. In this study, the colleges of Baghdad University and data for the academic year (2011-2012) were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A relevant method for analyzing these data is Data Envelopment Analysis (DEA). The effect of academic staff on the numbers of enrolled and graduated (alumni) students in the postgraduate and undergraduate programs is the main focus of the study.

 

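The sketch below shows one standard way such relative efficiencies can be computed: an input-oriented CCR envelopment model solved as a linear program (Python with scipy; the toy colleges, the single staff input and the two student outputs are hypothetical, not the study's data):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 colleges (DMUs), input = [academic staff], outputs = [graduates, enrolled].
X = np.array([[60.], [45.], [80.], [30.], [55.]])          # inputs, shape (n, m)
Y = np.array([[300., 900.], [260., 700.], [310., 1200.],
              [150., 500.], [280., 850.]])                 # outputs, shape (n, s)
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o (envelopment form)."""
    # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro  (i.e. outputs at least y_ro)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

Efficient programs score 1.0; scores below 1.0 indicate how far inputs could be proportionally reduced while keeping the observed outputs.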
Publication Date
Wed Dec 18 2019
Journal Name
Baghdad Science Journal
Detecting Keratoconus by Using SVM and Decision Tree Classifiers with the Aid of Image Processing

Researchers have used different methods, such as image processing and machine learning techniques, in addition to medical instruments such as the Placido disc, keratoscopy and the Pentacam, to help diagnose a variety of diseases that affect the eye. Our paper aims to detect one of the diseases that affect the cornea, namely keratoconus, using image processing techniques and pattern classification methods. The Pentacam is the device used to assess the cornea's health; it provides four maps that capture the changes on the surface of the cornea and can be used for keratoconus detection. In this study, sixteen features were extracted from the four refractive maps along with five readings from the Pentacam software…

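A minimal sketch of the classification stage (Python with scikit-learn; the random feature matrix standing in for the 16 map-derived features plus 5 Pentacam readings, and the labels, are placeholders rather than the study's dataset):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)

# Placeholder feature matrix: 200 eyes x 21 features
# (16 map-derived features + 5 Pentacam readings); label 1 = keratoconus, 0 = normal.
X = rng.normal(size=(200, 21))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)

# Two classifiers compared in the paper: an SVM and a decision tree.
svm = SVC(kernel="rbf", C=1.0).fit(scaler.transform(X_train), y_train)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train, y_train)

print("SVM accuracy: ", accuracy_score(y_test, svm.predict(scaler.transform(X_test))))
print("Tree accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```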
Publication Date
Wed Feb 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison between the Logistic Regression Model and Linear Discriminant Analysis Using Principal Components on Unemployment Data for the Province of Baghdad

     The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables. The data come from the socio-economic household survey of the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of them explanatory variables, while the dependent variable is employment status (workers versus the unemployed).

     A comparison of the two methods above was conducted, and the comparison made clear that the logistic regression model outperforms the linear discriminant function…

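A small sketch of the comparison described above (Python with scikit-learn; the simulated 615×12 survey-like data, the number of retained components and the cross-validation scoring are illustrative assumptions, not the study's survey or its exact criterion):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Placeholder survey-style data: 615 households x 12 explanatory variables,
# binary outcome (1 = employed, 0 = unemployed).
X = rng.normal(size=(615, 12))
y = (X @ rng.normal(size=12) + rng.normal(scale=2.0, size=615) > 0).astype(int)

# Reduce the 12 variables to a few principal components, then compare models
# on both the original data and the reduced data.
X_pc = PCA(n_components=4).fit_transform(X)

logit = LogisticRegression(max_iter=1000)
lda = LinearDiscriminantAnalysis()

for name, model in [("logistic regression", logit), ("LDA", lda)]:
    raw = cross_val_score(model, X, y, cv=5).mean()
    pc = cross_val_score(model, X_pc, y, cv=5).mean()
    print(f"{name:20s}  original data: {raw:.3f}   principal components: {pc:.3f}")
```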
Publication Date
Sat Nov 02 2013
Journal Name
International Journal Of Computer Applications
Mixed Transforms Generated by Tensor Product and Applied in Data Processing

Finding orthogonal matrices of different sizes is complex but important, because such matrices are used in applications like image processing and communications (e.g. CDMA and OFDM). In this paper we introduce a new method for constructing orthogonal matrices by taking tensor products of two or more orthogonal matrices with real or imaginary entries, and we apply it to image and communication-signal processing. The output matrices are orthogonal as well, and processing with the new method is very easy compared to classical constructions that rely on basic proofs. The results on communication signals and images are reasonable and acceptable, but further research is needed.

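A short illustration of the tensor-product construction (Python with numpy; the particular 2×2 factors and the length-4 test signal are arbitrary examples): the Kronecker product of orthogonal matrices is again orthogonal, so the larger transform is inverted simply by its transpose.

```python
import numpy as np

# Two small orthogonal matrices: a 2x2 rotation and a 2x2 Hadamard-type matrix.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# The tensor (Kronecker) product of orthogonal matrices is again orthogonal.
C = np.kron(A, B)                         # 4x4 matrix

print(np.allclose(C.T @ C, np.eye(4)))    # True: columns are orthonormal

# A signal of length 4 can be transformed and exactly reconstructed with C.
x = np.array([1.0, 2.0, 3.0, 4.0])
coeffs = C @ x
print(np.allclose(C.T @ coeffs, x))       # True: perfect reconstruction
```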
Publication Date
Mon Jan 01 2024
Journal Name
Baghdad Science Journal
Estimating the Parameters of Exponential-Rayleigh Distribution under Type-I Censored Data

     This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly Type-I censored data, which is one of the most important forms of right-censored data, using the maximum likelihood estimation method (MLE), one of the most popular and widely used classical methods, based on an iterative procedure such as Newton-Raphson to find the estimated values of these two scale parameters. Real COVID-19 data were taken from the Iraqi Ministry of Health and Environment, Al-Karkh General Hospital. The duration of the study was the interval from 4/5/2020 until 31/8/2020, equivalent to 120 days, during which the number of patients who entered the (study) hospital gave a sample size of (n = 785). The number of…

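A hedged sketch of Type-I censored maximum-likelihood estimation (Python with scipy). The survival function S(t) = exp(-λt - (θ/2)t²) used here is only an assumed illustrative parameterisation of a two-scale-parameter model and may differ from the paper's Exponential-Rayleigh form; the simulated lifetimes, the 120-day censoring point and the quasi-Newton optimiser (standing in for Newton-Raphson) are likewise assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Assumed illustrative form: S(t) = exp(-lam*t - (theta/2)*t**2), hazard h(t) = lam + theta*t.
true_lam, true_theta, T_censor, n = 0.005, 0.00005, 120.0, 785

# Simulate lifetimes by inverting S(t), then apply Type-I censoring at T = 120 days.
u = rng.random(n)
t_full = (-true_lam + np.sqrt(true_lam**2 - 2 * true_theta * np.log(u))) / true_theta
observed = np.minimum(t_full, T_censor)
failed = t_full <= T_censor

def neg_log_lik(params):
    lam, theta = params
    if lam <= 0 or theta <= 0:
        return np.inf
    t = observed[failed]
    # log f(t) for failures + log S(T) for the censored units
    ll = np.sum(np.log(lam + theta * t) - lam * t - 0.5 * theta * t**2)
    ll += np.sum(~failed) * (-lam * T_censor - 0.5 * theta * T_censor**2)
    return -ll

# The paper iterates with Newton-Raphson; a bounded quasi-Newton optimiser is used here.
res = minimize(neg_log_lik, x0=[0.01, 0.0001], method="L-BFGS-B",
               bounds=[(1e-8, None), (1e-8, None)])
print("MLE (lambda, theta):", res.x)
```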
Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Sliced Inverse Regression with Principal Components in Reducing High-Dimensional Data by Using Simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good estimates of the parameters, so this problem must be dealt with directly. Two methods were used to address high-dimensional data: the first is the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and the second is principal component analysis (PCA), which is the general method used for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problems of heterogeneity and linear…

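A compact sketch contrasting the two reduction methods (Python with numpy and scikit-learn; the simulated single-index data, the number of slices and the basic SIR implementation are illustrative assumptions, not the paper's proposed WSIR method):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# High-dimensional regression data: only one direction of X actually drives y.
n, p = 300, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[0], beta[1] = 1.0, 0.5
y = np.sin(X @ beta) + rng.normal(scale=0.1, size=n)

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Basic sliced inverse regression: eigen-decompose the covariance of slice means."""
    Xc = (X - X.mean(0)) / X.std(0)
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((X.shape[1], X.shape[1]))
    for s in slices:
        m = Xc[s].mean(0)
        M += len(s) / len(y) * np.outer(m, m)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :n_dirs]          # leading SIR directions

sir_dirs = sir_directions(X, y)
pca_dirs = PCA(n_components=2).fit(X).components_.T

# SIR uses y, so its first direction should align with beta; PCA ignores y entirely.
b = beta / np.linalg.norm(beta)
print("|cos| with true direction, SIR:", abs(sir_dirs[:, 0] @ b))
print("|cos| with true direction, PCA:", abs(pca_dirs[:, 0] @ b))
```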
Publication Date
Sat Sep 01 2007
Journal Name
Journal Of Economics And Administrative Sciences
Comparison between the Ordinary Method and the Robust Method for Estimating the Parameters of the Low-Order Univariate Mixed Model

A condensed study was carried out to compare the ordinary estimators, in particular the maximum likelihood estimator, with the robust estimator for estimating the parameters of the first-order mixed model, namely the ARMA(1,1) model.

A simulation study was conducted for several variants of the model, using small, moderate and large sample sizes, and some new results were obtained. MAPE was used as the statistical criterion for comparison.

 

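A minimal sketch of the classical side of the comparison (Python with statsmodels; the simulated ARMA(1,1) series, its parameter values and the in-sample MAPE computation are assumptions for illustration; the robust estimator studied in the paper is not shown):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(6)

# Simulate an ARMA(1,1) series (phi = 0.6, theta = 0.3) of moderate size,
# shifted away from zero so MAPE is well defined.
ar, ma = np.array([1, -0.6]), np.array([1, 0.3])
series = ArmaProcess(ar, ma).generate_sample(nsample=100, distrvs=rng.standard_normal) + 10

# Maximum likelihood fit of the ARMA(1,1) model.
fit = ARIMA(series, order=(1, 0, 1)).fit()

# MAPE of the one-step-ahead in-sample predictions, as a comparison criterion.
pred = fit.predict()
mape = np.mean(np.abs((series[1:] - pred[1:]) / series[1:])) * 100
print(fit.params)
print(f"MAPE = {mape:.2f}%")
```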
Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
A Comparison between Ericson's Formulae Results and Experimental Data Using New Formulae of Single Particle Level Density

The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Rohr formula, the Egidy formula, the Yukawa formula and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.

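A small numerical sketch of the quantities involved (Python; it uses the standard textbook form of Ericson's partial level density and the generic relation a = π²g/6 with a hypothetical level density parameter, rather than the paper's specific single-particle level density formulae):

```python
import numpy as np
from math import factorial

def single_particle_g(a):
    """Single-particle level density g from the level density parameter a,
    using the standard relation a = (pi**2 / 6) * g  =>  g = 6a / pi**2."""
    return 6.0 * a / np.pi**2

def ericson_pld(p, h, E, g):
    """Ericson's partial level density for p particles and h holes at excitation E:
    omega(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),  with n = p + h."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Example: a 2-particle 1-hole configuration at 10 MeV excitation,
# with a hypothetical level density parameter a = 8 MeV^-1.
g = single_particle_g(a=8.0)
print(f"g = {g:.3f} MeV^-1")
print(f"omega(2p,1h,10 MeV) = {ericson_pld(2, 1, 10.0, g):.3e} MeV^-1")
```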
Publication Date
Thu Nov 01 2012
Journal Name
2012 International Conference On Advanced Computer Science Applications And Technologies (acsat)
Data Missing Solution Using Rough Set theory and Swarm Intelligence

This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called the learning data, is used to find the decision rule sets that are then used to solve the incomplete-data problem. The intelligent swarm algorithm is used for feature selection; it applies the bees algorithm as a heuristic search combined with rough set theory as the evaluation function. Another feature-selection algorithm, ID3, is also presented; it works as a statistical algorithm instead of an intelligent one. A comparison is made between these two approaches in terms of their performance in null-value estimation…

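A toy sketch of the rough-set side of the approach (Python with pandas; the tiny decision table is invented, and an exhaustive subset search stands in for the bees algorithm): the rough-set dependency degree serves as the evaluation function for candidate feature subsets, and the selected subset would then drive the decision rules used to fill null values.

```python
import pandas as pd
from itertools import combinations

# Toy complete "learning" table: condition attributes A, B, C and a decision D.
data = pd.DataFrame({
    "A": [1, 1, 0, 0, 1, 0, 1, 0],
    "B": [0, 1, 0, 1, 1, 0, 0, 1],
    "C": [2, 2, 1, 1, 0, 0, 2, 1],
    "D": [1, 0, 0, 1, 0, 0, 1, 1],   # decision attribute
})

def dependency(df, features, decision="D"):
    """Rough-set dependency gamma: fraction of rows whose equivalence class
    (w.r.t. the chosen features) maps to a single decision value."""
    consistent = 0
    for _, group in df.groupby(list(features)):
        if group[decision].nunique() == 1:
            consistent += len(group)
    return consistent / len(df)

# Exhaustive search stands in for the bees algorithm here: score every subset
# and keep the smallest one with full dependency (a reduct candidate).
best = None
for r in range(1, 4):
    for subset in combinations(["A", "B", "C"], r):
        if dependency(data, subset) == 1.0:
            best = subset
            break
    if best:
        break

print("selected features:", best)
# Decision rules induced from `best` would then be matched against incomplete
# records to estimate their null values.
```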