The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing these data, since cluster analysis plays an important role in identifying and grouping profiles that are co-expressed over time. These profiles are modeled with the nonparametric smoothing cubic B-spline, which provides continuous first and second derivatives and therefore yields a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
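To make the smoothing step concrete, here is a minimal sketch (not the authors' code) that fits a least-squares cubic B-spline to one subject's profile with SciPy; the simulated profile, knot placement, and noise level are assumptions made only for illustration.

import numpy as np
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)

# Simulated profile for a single subject: 10 time points (illustrative only).
t_obs = np.linspace(0.0, 1.0, 10)
y_obs = np.sin(2 * np.pi * t_obs) + rng.normal(scale=0.2, size=t_obs.size)

# Boundary knots repeated k+1 times plus assumed interior knots, as make_lsq_spline expects.
k = 3                                   # cubic
interior = np.array([0.33, 0.66])       # assumed interior knots
knots = np.concatenate(([t_obs[0]] * (k + 1), interior, [t_obs[-1]] * (k + 1)))

spline = make_lsq_spline(t_obs, y_obs, knots, k=k)

# The fitted coefficients are what the clustering step compares across subjects.
print("B-spline coefficients:", spline.c)
# Continuous first and second derivatives follow directly from the cubic basis:
print("first derivative at t=0.5:", spline.derivative(1)(0.5))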

The balanced longitudinal data profiles were grouped into subgroups by penalizing the pairwise distances between the coefficients of the cubic B-spline model using one of the common penalty functions, the minimax concave penalty (MCP). This approach determines the number of clusters through a model selection criterion, the Bayesian information criterion (BIC), and optimization methods are used to solve the resulting equations. We therefore applied the alternating direction method of multipliers (ADMM) algorithm to obtain approximate solutions for the estimators of the nonparametric model, using the R statistical software.
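For reference, a common way of writing this kind of pairwise-fusion penalized criterion for the B-spline coefficient vectors is sketched below; the exact objective and notation used in the paper may differ.

\[
Q(\beta_1,\dots,\beta_n) \;=\; \frac{1}{2}\sum_{i=1}^{n}\lVert y_i - B\beta_i\rVert_2^2
\;+\; \sum_{i<j} p_{\gamma}\!\left(\lVert \beta_i - \beta_j \rVert_2 ;\, \lambda\right),
\]

where B is the cubic B-spline design matrix, y_i is the i-th subject's profile, and the MCP is

\[
p_{\gamma}(t;\lambda)=
\begin{cases}
\lambda t - t^{2}/(2\gamma), & 0 \le t \le \gamma\lambda,\\
\gamma\lambda^{2}/2, & t > \gamma\lambda.
\end{cases}
\]

Subjects whose coefficient vectors are fused to a common value fall in the same cluster; the tuning parameter (and hence the number of clusters) is chosen by BIC, and the minimization is carried out with ADMM.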
Balanced longitudinal data were generated in a simulation study with 60 subjects and 10 repeated measurements (time points) per subject. The simulation was iterated 100 times and showed that employing the MCP penalty method on the cubic B-spline model can group the profiles into clusters, which is the aim of this paper.
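A minimal sketch of such a simulation design is shown below; the 60 subjects, 10 time points, and 100 replications follow the abstract, while the true cluster mean curves, the number of clusters, and the noise level are assumptions chosen only for illustration.

import numpy as np

rng = np.random.default_rng(2021)
n_subjects, n_times, n_reps = 60, 10, 100   # dimensions from the abstract
t = np.linspace(0.0, 1.0, n_times)

# Assumed true cluster mean curves (the paper's curves are not given in this listing).
means = [lambda t: np.sin(2 * np.pi * t),
         lambda t: np.cos(2 * np.pi * t),
         lambda t: 2.0 * t - 1.0]

def one_replication():
    labels = rng.integers(0, len(means), size=n_subjects)      # random cluster membership
    noise = rng.normal(scale=0.3, size=(n_subjects, n_times))  # assumed noise level
    Y = np.vstack([means[g](t) for g in labels]) + noise
    return Y, labels

datasets = [one_replication() for _ in range(n_reps)]
print(datasets[0][0].shape)   # (60, 10): subjects x time points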

 

Paper type: Research paper.

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
Detecting Textual Propaganda Using Machine Learning Techniques

Social networking has dominated the whole world by providing a platform for information dissemination. People usually share information without knowing its truthfulness. Nowadays, social networks are used for gaining influence in many fields, such as elections and advertisements. It is not surprising that social media has become a weapon for manipulating sentiments by spreading disinformation. Propaganda is one of the systematic and deliberate attempts used to influence people for political or religious gains. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data were collected from news sources from July 2018 to August 2018. After annota…
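The excerpt does not list the exact features or classifiers used; as a hedged illustration of this kind of supervised text classification, a minimal scikit-learn pipeline (TF-IDF features plus logistic regression, with placeholder texts and labels) could look like this:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Placeholder data: in the study, texts came from news sources (July-August 2018).
texts = ["our glorious leader will crush the corrupt enemies of the nation",
         "the ministry published the quarterly budget figures on Tuesday",
         "only traitors question the one true path of the movement",
         "local farmers reported higher wheat yields this season"]
labels = [1, 0, 1, 0]   # 1 = propagandist, 0 = non-propagandist

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, random_state=0, stratify=labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))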

Publication Date
Tue Sep 08 2020
Journal Name
Baghdad Science Journal
Voice Identification Using MFCC and Vector Quantization

Speaker identification is one of the fundamental problems in speech processing and voice modeling. Its applications include authentication in critical security systems, where the accuracy of selection is essential. Large-scale voice recognition applications are a major challenge: quick search in the speaker database requires fast, modern techniques and relies on artificial intelligence to achieve the desired results from the system. Many efforts have been made to achieve this through the establishment of variable-based systems and the development of new methodologies for speaker identification. Speaker identification is the process of recognizing who is speaking using characteristics extracted from the speech waves, such as pi…
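As a rough sketch of the MFCC-plus-vector-quantization idea (librosa and k-means are stand-ins chosen here, not necessarily the tools used in the paper), each enrolled speaker gets a small codebook of MFCC vectors, and a test utterance is assigned to the speaker whose codebook gives the lowest quantization distortion:

import numpy as np
import librosa
from sklearn.cluster import KMeans

def mfcc_frames(path, sr=16000, n_mfcc=13):
    """Load an audio file and return its MFCC frames as (n_frames, n_mfcc)."""
    y, sr = librosa.load(path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def train_codebook(paths, codebook_size=32):
    """Vector-quantization codebook: k-means centroids of a speaker's MFCC frames."""
    frames = np.vstack([mfcc_frames(p) for p in paths])
    return KMeans(n_clusters=codebook_size, n_init=10, random_state=0).fit(frames).cluster_centers_

def distortion(frames, codebook):
    """Average distance from each frame to its nearest codeword."""
    d = np.linalg.norm(frames[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).mean()

def identify(test_path, codebooks):
    """Return the enrolled speaker whose codebook best matches the test utterance."""
    frames = mfcc_frames(test_path)
    return min(codebooks, key=lambda spk: distortion(frames, codebooks[spk]))

# Hypothetical usage (file paths are placeholders):
# codebooks = {"alice": train_codebook(["alice_01.wav", "alice_02.wav"]),
#              "bob":   train_codebook(["bob_01.wav", "bob_02.wav"])}
# print(identify("unknown.wav", codebooks))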

Publication Date
Tue Jan 03 2023
Journal Name
College Of Islamic Sciences
Ruling on selling big data (Authentical Fiqh Study)

Abstract:

Research Topic: Ruling on the sale of big data

Its objectives: to state what big data is, its importance, its sources, and its ruling.

Methodology: the approach is inductive, comparative, and critical.

One of the most important results: big data may not be infringed upon, as it constitutes valuable property, and it is permissible to sell big data as long as it does not contain data of users who have not consented to its sale.

Recommendation: follow-up studies dealing with the rulings on this issue.

Subject Terms

Judgment, Sale, Data, Mega, Sayings, Jurists

 

Publication Date
Tue Dec 31 2019
Journal Name
Opcion
Analysis of Computer Textbook for the Second Intermediate Grade According to Digital Citizenship

The research aims to build a list of digital citizenship axes, standards, and indicators emanating from them, which should be included in the content of the computer textbook prescribed for second-grade intermediate students in Iraq, and to analyze the above-mentioned book according to the same list using the descriptive analytical method (content analysis). The research community and its sample consisted of the content of the computer textbook prescribed for second-year intermediate students for the academic year 2018-2019. The research tool was built in its initial form after reference to a set of specialized literature and previous studies that dealt with topics related to digital citizenship, and the authenticit…

Publication Date
Fri Aug 30 2019
Journal Name
Environmental Engineering Research
Numerical modeling of two-dimensional simulation of groundwater protection from lead using different sorbents in permeable barriers

This study investigates the possibility of using activated carbon prepared from Iraqi date-pits (ADP), which are produced from palm trees (Phoenix dactylifera L.), as a low-cost reactive material in a permeable reactive barrier (PRB) for treating lead (Pb+2) in contaminated groundwater, and then compares the results experimentally with other common reactive materials such as commercial activated carbon (CAC) and zeolite pellets (ZP). Factors influencing sorption, such as contact time, initial pH of the solution, sorbent dosage, agitation speed, and initial lead concentration, have been studied. Two isotherm models were used for the description of the sorption data (Langmuir and Freundlich). The maximum lead sorp…
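For reference, the standard forms of the two isotherms named above are given below, with q_e the amount sorbed at equilibrium, C_e the equilibrium concentration, and q_m, K_L, K_F, n fitted constants:

\[
\text{Langmuir: } q_e = \frac{q_m K_L C_e}{1 + K_L C_e},
\qquad
\text{Freundlich: } q_e = K_F\, C_e^{1/n}.
\]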

Publication Date
Mon Mar 11 2019
Journal Name
Baghdad Science Journal
Analysing Iraqi Railways Network by Applying Specific Criteria Using the GIS Techniques

The railways network is one of the huge infrastructure projects. Therefore, analyzing and developing such projects should be done using appropriate tools, i.e., GIS tools, because traditional methods consume resources, time, and money, and their results may not be accurate. In this research, the train stations in all of Iraq's provinces were studied and analyzed using network analysis, which is one of the most powerful techniques within GIS. A free trial copy of ArcGIS® 10.2 software was used in this research in order to achieve the aim of the study. The analysis of the current train stations was done depending on the road network, because people use roads to reach those train stations. The data layers for this st…
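The study itself used ArcGIS network analysis; purely as an open-source sketch of the same shortest-path idea, a toy road graph and a nearest-station query (with made-up places and travel times) might look like this:

import networkx as nx

# Toy road network: nodes are places, edge weights are road travel times in minutes.
roads = nx.Graph()
roads.add_weighted_edges_from([
    ("district_A", "junction_1", 12),
    ("district_B", "junction_1", 8),
    ("junction_1", "station_X", 15),
    ("district_B", "station_Y", 25),
    ("junction_1", "station_Y", 30),
])

stations = ["station_X", "station_Y"]

for origin in ["district_A", "district_B"]:
    # Travel time from each district to its nearest station over the road network.
    times = {s: nx.shortest_path_length(roads, origin, s, weight="weight") for s in stations}
    nearest = min(times, key=times.get)
    print(f"{origin}: nearest station is {nearest} ({times[nearest]} min)")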

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
A Comparative Study of Some Methods of Estimating Robust Variance Covariance Matrix of the Parameters Estimated by (OLS) in Cross-Sectional Data

 

Abstract

The classical normal linear regression model is based on several hypotheses, one of which concerns homoscedasticity. It is known that, when ordinary least squares (OLS) is applied in the presence of heteroscedasticity, the estimators lose their desirable properties and the statistical inference based on them becomes unacceptable. Accordingly, two alternatives are considered: the first is generalized least squares (GLS), and the second is robust estimation of the covariance matrix of the parameters estimated by OLS. The GLS method is sound and reliable if the estimators are efficient and the statistical inference is conducted on the basis of an acceptable…
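As a hedged illustration of the two alternatives (not the paper's own code or data), statsmodels provides both a heteroscedasticity-robust covariance matrix for OLS and a GLS estimator; the simulated heteroscedastic data below are only for demonstration:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1, 10, size=n)
X = sm.add_constant(x)
# Heteroscedastic errors: the error variance grows with x (assumed, for illustration).
y = 2.0 + 0.5 * x + rng.normal(scale=0.2 * x, size=n)

ols = sm.OLS(y, X).fit()                        # classical OLS standard errors
ols_robust = sm.OLS(y, X).fit(cov_type="HC1")   # White/HC1 robust covariance matrix
gls = sm.GLS(y, X, sigma=(0.2 * x) ** 2).fit()  # GLS with (assumed known) error variances

print("OLS    SE:", ols.bse)
print("Robust SE:", ols_robust.bse)
print("GLS    SE:", gls.bse)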

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
A novel SVR estimation of FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil have significant long memory in the volatility, the return series is estimated with fractional GARCH models, and forecasts of the mean and the volatility are produced by quasi maximum likelihood (QML) as a traditional method, while the competing approach uses machine learning, namely Support Vector Regression (SVR). Results showed that the best model among many others for forecasting the volatility was selected based on the lowest values of the Akaike information criterion and the Schwarz information criterion, with the requirement that the parameters be significant. In addition, the residuals…
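The excerpt does not show the exact SVR setup; under the assumption that volatility is proxied by squared returns regressed on their own lags, a minimal scikit-learn SVR sketch (with placeholder return data) could look like this:

import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
returns = rng.normal(scale=0.01, size=500)   # placeholder for white-oil returns
sq = returns ** 2                            # squared returns as a volatility proxy

lags = 5
X = np.column_stack([sq[i:len(sq) - lags + i] for i in range(lags)])  # lagged squared returns
y = sq[lags:]                                                          # next-step squared return

split = int(0.8 * len(y))
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=1e-5))
model.fit(X[:split], y[:split])

forecast = model.predict(X[split:])
rmse = np.sqrt(np.mean((forecast - y[split:]) ** 2))
print("out-of-sample RMSE of the SVR volatility-proxy forecast:", rmse)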

Publication Date
Sat Jan 01 2022
Journal Name
Encyclopedia Of Smart Materials
Modeling Behavior of Magnetorheological Fluids

Publication Date
Mon Feb 18 2019
Journal Name
Iraqi Journal Of Physics
Data visualization and distinct features extraction of the comet Ison 2013

The intensity distribution of the comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, the tail, the coma, and the nucleus. One-dimensional temperature distribution fitting is achieved by using two mathematical equations related to the coordinates of the center of the comet. The quiver plot of the gradient of the comet shows very clearly that the arrows point towards the maximum intensity of the comet.
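As an illustrative reconstruction of the steps named here (the comet image itself is not available in this listing, so a synthetic intensity map stands in for it), the histogram, gradient, and quiver plot can be produced with NumPy and Matplotlib:

import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for the comet intensity image: a bright, slightly elongated blob.
yy, xx = np.mgrid[0:200, 0:200]
intensity = 50 + 200 * np.exp(-(((xx - 120) / 15.0) ** 2 + ((yy - 100) / 25.0) ** 2))
intensity += np.random.default_rng(0).normal(scale=5, size=intensity.shape)

# Histogram of the intensity; in the paper this distribution revealed four regions
# (background, tail, coma, nucleus).
plt.figure()
plt.hist(intensity.ravel(), bins=100)
plt.xlabel("intensity")
plt.ylabel("pixel count")

# Quiver plot of the intensity gradient: the arrows point towards the brightest region.
gy, gx = np.gradient(intensity)
step = 10                                   # thin the arrows for readability
plt.figure()
plt.imshow(intensity, origin="lower", cmap="gray")
plt.quiver(xx[::step, ::step], yy[::step, ::step],
           gx[::step, ::step], gy[::step, ::step], color="red")
plt.show()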
