Articles
Anomaly detection in text data represented as a graph using the DBSCAN algorithm

Anomaly detection remains a difficult task. To address this problem, we propose strengthening the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside any cluster are treated as noise or anomalies. In this way, DBSCAN can detect abnormal points that lie beyond a set distance threshold (extreme values). However, not all anomalies are of this kind: there is a type of record that does not occur repeatedly and is not far from any particular group, yet is still abnormal with respect to the known groups. The analysis showed that the improved DBSCAN algorithm can detect this type of anomaly as well. Thus, our approach is effective in finding abnormalities.
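The noise-labelling behaviour described above can be sketched with a minimal DBSCAN. This is an illustrative re-implementation, not the authors' strengthened version, and the graph/CFG construction for text is omitted entirely — the points here are plain 2-D vectors.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point; -1 marks noise/anomalies."""
    UNSEEN = None
    labels = [UNSEEN] * len(points)
    cluster = -1

    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    for i in range(len(points)):
        if labels[i] is not UNSEEN:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1           # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:      # noise reachable from a core point
                labels[j] = cluster  # becomes a border point
            if labels[j] is not UNSEEN:
                continue
            labels[j] = cluster
            j_seeds = neighbors(j)
            if len(j_seeds) >= min_pts:
                queue.extend(j_seeds)
    return labels

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
labels = dbscan(pts, eps=1.5, min_pts=3)
print(labels)  # the far point (10, 10) gets label -1: an anomaly
```

Points within `eps` of at least `min_pts` neighbours seed a cluster; everything unreachable from such core points keeps the -1 label, which is exactly the anomaly set the abstract refers to.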

Publication Date: January 1, 2019
Journal Name: IEEE Access
Implementation of Univariate Paradigm for Streamflow Simulation Using Hybrid Data-Driven Model: Case Study in Tropical Region

Publication Date: February 1, 2018
Journal Name: Journal of Economics and Administrative Sciences
Comparison of Sliced Inverse Regression with Principal Components in reducing high-dimensional data using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Two approaches were used for high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard method for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problems of heterogeneity and linear…
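The PCA side of the comparison can be sketched in a few lines of NumPy. This is a generic illustration on hypothetical data — the SIR and proposed WSIR estimators from the paper are not reproduced here.

```python
import numpy as np

def pca_reduce(X, k):
    """Project n x p data onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                          # center each variable
    cov = np.cov(Xc, rowvar=False)                   # p x p covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # k leading directions
    return Xc @ top                                  # n x k component scores

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # hypothetical 10-dimensional data
Z = pca_reduce(X, k=2)
print(Z.shape)  # (100, 2)
```

The component scores are exactly the kind of linear combinations of the original explanatory variables the abstract describes; SIR differs by using slices of the response to choose the directions.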
Publication Date: October 23, 2018
Journal Name: Journal of Economics and Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data came from the circumstances of our dear country and the horrors of war, which led to the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some lie beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. The missing data in this study were processed using principal component analysis and self-organizing map methods, using simulation. Child-health variables and variables affecting children's health were taken into account: breastfeeding…
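A common way to impute missing values with PCA, in the spirit of the method described above, is to alternate mean-filling with a low-rank reconstruction. This is a generic sketch on simulated data, not the study's exact procedure.

```python
import numpy as np

def pca_impute(X, k=2, iters=50):
    """Fill NaNs by alternating mean-fill and rank-k PCA reconstruction."""
    X = X.astype(float).copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])  # initial mean fill
    for _ in range(iters):
        mu = X.mean(axis=0)
        Xc = X - mu
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        recon = (U[:, :k] * s[:k]) @ Vt[:k] + mu     # rank-k approximation
        X[miss] = recon[miss]                        # update missing cells only
    return X

rng = np.random.default_rng(1)
full = rng.normal(size=(50, 5))
data = full.copy()
data[rng.random(data.shape) < 0.1] = np.nan          # knock out ~10% of cells
filled = pca_impute(data, k=2)
print(np.isnan(filled).any())  # False: no missing values remain
```

Observed cells are never modified; only the NaN positions are repeatedly re-estimated from the current low-rank fit until the values stabilize.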
Publication Date: March 12, 2023
Journal Name: المجلة الدولية للتنمية في العلوم الاجتماعية والإنسانية (International Journal of Development in Social Sciences and Humanities)
The Ottoman Empire in the Nineteenth Century (Iraq as a Model)

One of the most important features of American foreign activity during the second half of the nineteenth century was the encouragement of Protestant missionary missions to countries around the world. Their task was made easier by the weakened state of the Ottoman Empire, the growing influence of European countries, and European interference in its internal affairs. Despite Ottoman efforts to counter their activities, the American missionary missions adopted a well-planned strategy suited to the conditions of the region, aiming to achieve their goals through direct and indirect methods, including intervention in the field of education. These missions attached great importance to educational aspects…
Publication Date: October 1, 2023
Journal Name: Bulletin of Electrical Engineering and Informatics
A novel data offloading scheme for QoS optimization in 5G based internet of medical things

The internet of medical things (IoMT) is expected to become one of the most widely distributed technologies worldwide. With 5th-generation (5G) transmission, the market opportunities and hazards related to IoMT are both amplified and easier to detect. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve quality of service (QoS). Building on this, we propose the enriched energy-efficient fuzzy (EEEF) data offloading technique to enhance the delivery of data…
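The EEEF technique itself is not detailed in this excerpt, but a fuzzy offloading decision of the general kind described can be sketched as follows. The membership functions, input names, and breakpoints here are all hypothetical.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def offload_score(load, battery, link_quality):
    """Fuzzy-style score in [0, 1]: higher means offload to the edge server.
    All inputs are assumed normalized to [0, 1]."""
    high_load = tri(load, 0.4, 1.0, 1.6)         # degree to which load is high
    low_battery = tri(battery, -0.6, 0.0, 0.6)   # degree to which battery is low
    good_link = tri(link_quality, 0.4, 1.0, 1.6) # degree to which the link is good
    # offload when the device is stressed AND the link can carry the data
    stressed = max(high_load, low_battery)       # fuzzy OR of stress conditions
    return min(stressed, good_link)              # fuzzy AND with link quality
```

A loaded device on a good link, e.g. `offload_score(load=0.9, battery=0.2, link_quality=0.8)`, scores well above an idle device on any link, which scores 0.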
Publication Date: January 1, 2021
Journal Name: International Journal of Agricultural and Statistical Sciences
A novel SVR estimation of the FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models of the return series are estimated, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method. The competing approach uses machine learning, namely support vector regression (SVR). Results showed that the most appropriate model among many candidates for forecasting volatility was selected by the lowest Akaike and Schwarz information criteria, with the requirement that the parameters be significant. In addition, the residuals…
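FIGARCH estimation by QML is involved, but the core of every GARCH-class model is a conditional-variance recursion. The sketch below filters a plain GARCH(1,1) with hypothetical parameters; it is not the paper's FIGARCH or SVR estimator.

```python
import math

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    # start the recursion at the unconditional variance omega / (1 - alpha - beta)
    sigma2 = [omega / (1 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

rets = [0.01, -0.02, 0.015, -0.005, 0.03]  # hypothetical daily returns
var_path = garch11_variance(rets, omega=1e-5, alpha=0.05, beta=0.9)
print([round(math.sqrt(v), 4) for v in var_path])  # conditional volatility path
```

QML estimation then searches over (omega, alpha, beta) to maximize the Gaussian likelihood of the returns given this variance path; FIGARCH replaces the single beta lag with a fractional-difference filter to capture long memory.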
Publication Date: January 1, 2016
Journal Name: Journal of Sensors
WDARS: A Weighted Data Aggregation Routing Strategy with Minimum Link Cost in Event-Driven WSNs

Realizing the full potential of wireless sensor networks (WSNs) raises many design issues, particularly trade-offs among conflicting objectives such as maximizing route overlap for efficient data aggregation while minimizing total link cost. While data aggregation routing protocols and link cost functions in WSNs have each been comprehensively considered in the literature, a trade-off between the two has not yet been addressed. In this paper, a comprehensive weight for trading off the different objectives is employed: the weighted data aggregation routing strategy (WDARS), which aims to maximize overlapping routes for efficient data aggregation while minimizing link cost…
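The exact WDARS weight is defined in the paper; the sketch below only illustrates the general idea of routing on a composite edge cost that blends link cost with an aggregation-overlap reward. The blend weight, topology, and numbers are all hypothetical.

```python
import heapq

def dijkstra(adj, src, dst):
    """Shortest path under composite edge costs; adj maps node -> [(nbr, cost)]."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, c in adj.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]

W = 0.5  # hypothetical trade-off weight between link cost and overlap reward

def composite(link_cost, overlap_reward):
    """Blend link cost against route-overlap benefit (both normalized to [0, 1])."""
    return W * link_cost + (1 - W) * (1 - overlap_reward)

# toy topology: each edge carries (link_cost, overlap_reward)
edges = {
    "A": [("B", composite(0.2, 0.9)), ("C", composite(0.1, 0.1))],
    "B": [("D", composite(0.2, 0.8))],
    "C": [("D", composite(0.3, 0.1))],
}
path, cost = dijkstra(edges, "A", "D")
print(path, round(cost, 3))  # the high-overlap route A-B-D wins despite costlier links
```

Note how the cheap-link route through C loses once the overlap reward is folded into the edge cost — the trade-off the abstract describes.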
Publication Date: February 1, 2019
Journal Name: Journal of Economics and Administrative Sciences
The impact of enabling the effectiveness of the work of the audit committees in private commercial banks (A survey study of the views of a sample of the objective of the inspection bodies represented by the Central Bank and the Securities Commission and e…

The aim of this study is to identify the effect of empowerment on the effectiveness of the work of audit committees in private commercial banks, and to identify the extent of awareness of the importance of empowerment in the work of these committees. This applies especially to the inspection committees that visit private banks, which come from various sources: committees of the Central Bank of Iraq, committees of the Securities Commission, and finally committees of the external audit offices. The study proceeds through an analysis of the determinants of empowerment in the performance of the most important work of the audit committees, namely supervising the process of preparing reports and supervising the system of intern…
Publication Date: September 24, 2023
Journal Name: Journal of Al-Qadisiyah for Computer Science and Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin…
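The first step described — converting a grayscale image to bit planes and keeping the most significant ones — can be sketched as follows. This is generic bit-plane slicing, not the paper's full segmentation pipeline, and the tiny array stands in for a real eye image.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 binary bit planes.
    Plane 7 is the most significant and carries the coarse structure."""
    gray = np.asarray(gray, dtype=np.uint8)
    return [(gray >> b) & 1 for b in range(8)]

img = np.array([[200, 50], [130, 10]], dtype=np.uint8)  # toy stand-in image
planes = bit_planes(img)
print(planes[7])  # MSB plane: 1 wherever the pixel value is >= 128
```

Keeping only the top few planes preserves the dark pupil/iris boundary while discarding fine texture, which is why such a representation both aids segmentation and compresses well.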