Building discriminant function for repeated measurements data under compound symmetry (CS) covariance structure and applied in the health field

Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups on the basis of a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyse data from a repeated measurements design. We deal with the problem of discrimination and classification in the case of two groups by assuming a compound symmetry (CS) covariance structure under the assumption of normality for univariate repeated measures data.

 

The importance of this research lies in finding the best model to classify a group of patients who suffer from diabetes. To study the effects of the correlations, the variances, and the number of repeated measurements on the performance of classification rules for this type of data, monthly measurements of glycosylated hemoglobin (HbA1c) in the blood were taken in three stages: at the beginning of the experiment, after three months, and after six months, for two groups of patients. The first group consists of (38) patients who suffered from type I diabetes, and the second group includes (33) patients who suffered from type II diabetes.

 

Through this research we concluded that as the number of parameters increases, the apparent error rate also increases, which reduces the efficiency of the classification rules for this type of data. We therefore recommend using the linear discriminant function when the classification rule is to be built from the smallest number of parameters, and the quadratic discriminant procedure in the case of equal variances and different correlation parameters under the compound symmetry covariance structure.
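The two-group linear rule under a compound symmetry covariance can be sketched numerically. The means, variance, and correlation below are hypothetical placeholders, not the HbA1c estimates from the study; the sketch only shows how a CS covariance matrix feeds into Fisher's linear discriminant:

```python
import numpy as np

rng = np.random.default_rng(0)

# Compound symmetry covariance for p repeated measures:
# Sigma = sigma^2 * ((1 - rho) * I + rho * J)
p, sigma2, rho = 3, 1.0, 0.5
Sigma = sigma2 * ((1 - rho) * np.eye(p) + rho * np.ones((p, p)))

# Hypothetical group mean vectors (NOT the study's estimates)
mu1 = np.array([9.0, 8.5, 8.0])   # stand-in for a type I group
mu2 = np.array([7.5, 7.2, 7.0])   # stand-in for a type II group

Sigma_inv = np.linalg.inv(Sigma)

def linear_discriminant(x):
    """Fisher's linear rule under a common CS covariance:
    assign to group 1 if the score exceeds 0 (equal priors)."""
    w = Sigma_inv @ (mu1 - mu2)
    c = 0.5 * w @ (mu1 + mu2)
    return 1 if x @ w - c > 0 else 2

# Simulate one observation from each group and classify it
x1 = rng.multivariate_normal(mu1, Sigma)
x2 = rng.multivariate_normal(mu2, Sigma)
print(linear_discriminant(x1), linear_discriminant(x2))
```

Counting the apparent error rate over a training sample classified by this rule is what the abstract's comparison of classification rules refers to.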

Publication Date
Sun Dec 01 2024
Journal Name
Nordic Concrete Research
Evaluating Concrete Strength Under Various Curing Conditions Using Artificial Neural Networks
This study examines the impact of different curing methods on the compressive strength of concrete. It investigates techniques such as air curing, periodic water spraying, full water submersion, and polyethylene encasement. Artificial neural network models were employed to evaluate the compressive strength under each curing condition. A model for calculating compressive strength that considers surrounding conditions was created using an artificial neural network. The current study's figures were generated using this model. The research thoroughly examined the impact of curing environments and concrete mix components on strength properties, taking into account factors such as tempera…
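As an illustration of the modelling idea only (not the authors' network, whose architecture and data are not given here), a minimal one-hidden-layer network can be fitted to synthetic curing data; the inputs and the target function below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training data (hypothetical, NOT the study's measurements):
# inputs = [scaled curing age, scaled curing-condition code]
X = rng.random((200, 2))
# assumed synthetic target: strength rises with age and condition
y = 20 + 25 * X[:, 0] + 8 * X[:, 1] + rng.normal(0, 0.5, 200)
y_mu, y_sd = y.mean(), y.std()
t = (y - y_mu) / y_sd                 # standardised target

# one hidden tanh layer trained by batch gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    err = H @ W2 + b2 - t             # output-layer error
    dH = np.outer(err, W2) * (1 - H ** 2)   # backprop through tanh
    W2 -= lr * H.T @ err / len(t); b2 -= lr * err.mean()
    W1 -= lr * X.T @ dH / len(t);  b1 -= lr * dH.mean(axis=0)

pred = (np.tanh(X @ W1 + b1) @ W2 + b2) * y_sd + y_mu
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(f"training RMSE: {rmse:.2f}")
```

The real study would of course train on measured specimens per curing regime; this sketch only shows the network-fitting mechanics.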
Publication Date
Thu Nov 08 2018
Journal Name
Iraqi National Journal Of Nursing Specialties
Evaluation of National Standards for Exposure to Chemical Materials and Dusts in the State Company for Drugs Industry in Samarra

Objective: To evaluate the national standards for exposure to chemical materials and dusts in The State Company for Drugs Industry in Samarra.
Methodology: A descriptive evaluation design was employed in the present study from 25th May 2011 to 30th November 2011 in order to evaluate the national standards for exposure to chemical materials and dusts in The State Company for Drugs Industry in Samarra. A purposive (non-probability) sample was selected for the study, which includes (110) workers from the State Company for Drugs Industry in Samarra. Data were gathered by interviewing the workers according to the nature of the work that they perform. The evaluation questionnaire comprised three parts which include the w…
Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. The key plays an important role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key improves the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
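For readers unfamiliar with the Triple DES construction mentioned above, its encrypt-decrypt-encrypt (EDE) chaining, and the key-generation weakness, can be sketched with a toy cipher. This is not DES itself: `toy_encrypt` is a made-up round used only to show how the three keyings compose and why equal keys collapse the scheme:

```python
# Toy illustration of the Triple DES encrypt-decrypt-encrypt (EDE)
# composition. The "cipher" here is a toy XOR-and-rotate round, NOT
# DES; it only demonstrates the chaining of three keys.

MASK = 0xFFFFFFFFFFFFFFFF  # 64-bit block

def toy_encrypt(block: int, key: int) -> int:
    # one toy round: XOR with key, then rotate left 3 bits
    x = (block ^ key) & MASK
    return ((x << 3) | (x >> 61)) & MASK

def toy_decrypt(block: int, key: int) -> int:
    # exact inverse: rotate right 3 bits, then XOR with key
    x = ((block >> 3) | (block << 61)) & MASK
    return x ^ key

def ede_encrypt(block, k1, k2, k3):
    return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

def ede_decrypt(block, k1, k2, k3):
    return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)

m = 0x0123456789ABCDEF
c = ede_encrypt(m, 0x1111, 0x2222, 0x3333)
assert ede_decrypt(c, 0x1111, 0x2222, 0x3333) == m

# With all three keys equal, EDE degenerates to a single encryption --
# exactly the kind of weakness that careful key generation must avoid.
assert ede_encrypt(m, 0x1, 0x1, 0x1) == toy_encrypt(m, 0x1)
```

The degenerate-key assertion is why 3DES keying options matter: weakly generated (equal or related) keys forfeit the extra strength of the triple construction.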
Publication Date
Fri Mar 01 2019
Journal Name
Spatial Statistics
Efficient Bayesian modeling of large lattice data using spectral properties of Laplacian matrix

Spatial data observed on a group of areal units is common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimati…
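The general idea of a low-rank spectral basis can be illustrated on a small lattice. This is a sketch of the approach, not the authors' exact estimation scheme; the grid size, rank `q`, and coefficients are arbitrary:

```python
import numpy as np

# Small lattice graph: a 5 x 5 grid of areal units, rook adjacency
n_side = 5
n = n_side * n_side
A = np.zeros((n, n))
for i in range(n_side):
    for j in range(n_side):
        k = i * n_side + j
        if i + 1 < n_side: A[k, k + n_side] = A[k + n_side, k] = 1
        if j + 1 < n_side: A[k, k + 1] = A[k + 1, k] = 1

D = np.diag(A.sum(axis=1))
L = D - A                          # graph Laplacian

# Eigenvectors with the SMALLEST Laplacian eigenvalues are the
# smoothest functions on the graph; keeping q of them gives a
# low-rank basis for the spatial random effect.
eigvals, eigvecs = np.linalg.eigh(L)
q = 6
basis = eigvecs[:, :q]             # n x q low-rank spatial basis

# The spatial effect is modelled as basis @ delta: only q coefficients
# instead of n correlated effects, which is far cheaper to sample.
delta = np.random.default_rng(2).normal(size=q)
effect = basis @ delta
print(effect.shape)
```

Sampling the low-dimensional `delta` jointly avoids the one-by-one full-conditional updates that cause the poor mixing described in the abstract.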
Publication Date
Sun May 11 2025
Journal Name
Iraqi Statisticians Journal
Estimating General Linear Regression Model of Big Data by Using Multiple Test Technique
Publication Date
Sun Jul 01 2018
Journal Name
Agronomy Journal
Use of Rainfall Data to Improve Ground-Based Active Optical Sensors Yield Estimates

Ground-based active optical sensors (GBAOS) have been successfully used in agriculture to predict crop yield potential (YP) early in the season and to adjust N rates for optimal crop yield. However, the models were found to be weak or inconsistent due to environmental variation, especially rainfall. The objective of the study was to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall differences. This study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota, and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used for this study were GreenSeeker and Holl…
Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them the use of firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly unrecognized attack attempts and raise an early alarm to inform the system about these suspicious intrusion attempts. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse dete…
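As a generic illustration of the data-mining classification step (not the paper's actual classifier or feature set, which the abstract does not specify), a toy nearest-neighbour rule over made-up packet/host features:

```python
# Illustrative sketch only: a tiny nearest-neighbour classifier on
# hypothetical packet/host features, standing in for the data-mining
# classifiers an IDS might use. All features and labels are made up.
import math

# (packets/sec, distinct ports probed, host CPU %) -> label
train = [
    ((20.0,  2.0, 15.0), "normal"),
    ((35.0,  3.0, 22.0), "normal"),
    ((15.0,  1.0, 10.0), "normal"),
    ((400.0, 90.0, 85.0), "malware"),
    ((350.0, 70.0, 90.0), "malware"),
    ((500.0, 95.0, 80.0), "malware"),
]

def classify(x, k=3):
    """Vote among the k nearest training points (Euclidean distance)."""
    nearest = sorted(train, key=lambda t: math.dist(x, t[0]))
    votes = [label for _, label in nearest[:k]]
    return max(set(votes), key=votes.count)

print(classify((18.0, 2.0, 12.0)))    # a quiet, normal-looking host
print(classify((420.0, 80.0, 88.0)))  # a noisy, scan-like host
```

Combining network-level features (first two) with host-level ones (third) in a single feature vector mirrors the hybrid packet-plus-host design described above.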
Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Ten Years of OpenStreetMap Project: Have We Addressed Data Quality Appropriately? – Review Paper

It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading platforms for VGI datasets. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting the diff…
Publication Date
Tue Jan 01 2019
Journal Name
Journal Of Southwest Jiaotong University
Recognizing Job Apathy Patterns of Iraqi Higher Education Employees Using Data Mining Techniques

Psychological research centers indirectly help to connect professionals from the fields of human life, the job environment, family life, and the psychological infrastructure for psychiatric patients. This research aims to detect job apathy patterns in the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. This investigation presents an approach using data mining techniques to acquire new knowledge, and it differs from statistical studies in its support for the researchers' evolving needs. These techniques manipulate redundant or irrelevant attributes to discover interesting patterns. The principal issue is identifying several important and affective questions taken from…