Anomaly detection in text data that represented as a graph using DBSCAN algorithm

Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by first converting all of the data into a graph concept frame (CFG). As is well known, DBSCAN groups points that belong to the same dense region into clusters, while points that fall outside every cluster are treated as noise or anomalies; in this way it can detect abnormal points that lie beyond a set distance threshold (extreme outliers). However, not all anomalies are of this kind, i.e. unusual points lying far from a specific group: there is also data that does not occur repeatedly yet is considered abnormal with respect to the known group. The analysis showed that the improved DBSCAN algorithm can detect this type of anomaly as well. Thus, our approach is effective in finding abnormalities.
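A minimal sketch of the underlying idea, assuming a TF-IDF text representation and scikit-learn's DBSCAN (the authors' CFG construction is not reproduced here): texts are embedded, pairwise distances play the role of graph edge weights, and any point that no dense cluster absorbs (label -1) is flagged as an anomaly.

# Toy illustration only; the corpus, eps and min_samples below are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_distances
from sklearn.cluster import DBSCAN

docs = [
    "network traffic looks normal today",
    "network traffic looks normal again",
    "win a free prize now click here",
]
X = TfidfVectorizer().fit_transform(docs)
D = cosine_distances(X)                      # pairwise distances act as graph edge weights
labels = DBSCAN(eps=0.6, min_samples=2, metric="precomputed").fit_predict(D)
anomalies = [doc for doc, lab in zip(docs, labels) if lab == -1]
print(anomalies)                             # documents not absorbed by any dense cluster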

Publication Date
Tue Dec 01 2020
Journal Name
Baghdad Science Journal
Detection of Suicidal Ideation on Twitter using Machine Learning & Ensemble Approaches

Suicidal ideation is one of the most severe mental health issues faced by people all over the world. Various risk factors can lead to suicide; the most common and critical among them are depression, anxiety, social isolation and hopelessness. Early detection of these risk factors can help prevent or reduce the number of suicides. Online social networking platforms such as Twitter, Reddit and Facebook are becoming a new way for people to express themselves freely without worrying about social stigma. This paper presents a methodology and experimentation using social media as a tool to analyse suicidal ideation in a better way, thus helping to prevent the chances of being the victim of …
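An illustrative ensemble text-classification pipeline in the spirit of the paper, sketched with scikit-learn on a hypothetical toy corpus; the study's own Twitter dataset, feature set and exact ensemble members are not reproduced.

# Toy data and labels are invented for illustration (1 = ideation risk, 0 = no risk).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

texts = [
    "I feel hopeless and want to disappear",
    "I cannot take this pain anymore",
    "Great day at the park with friends",
    "Looking forward to the concert tonight",
]
labels = [1, 1, 0, 0]

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=100)),
                ("nb", MultinomialNB())],
    voting="soft")                           # average the predicted probabilities
model = make_pipeline(TfidfVectorizer(), ensemble)
model.fit(texts, labels)
print(model.predict(["nothing matters anymore"]))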

Publication Date
Wed Apr 02 2014
Journal Name
Journal Of Theoretical And Applied Information Technology
TUMOR BRAIN DETECTION THROUGH MR IMAGES: A REVIEW OF LITERATURE

Today's medical imaging research faces the challenge of detecting brain tumors through Magnetic Resonance Imaging (MRI). MRI is normally used by experts to produce images of the soft tissue of the human body and to analyse human organs without resorting to surgery. Brain tumor detection requires image segmentation, in which the brain is partitioned into two distinct regions. This is considered one of the most important, yet most difficult, parts of the tumor-detection process. Hence, the MRI images must be segmented accurately before the computer is asked to make the exact diagnosis. Earlier, a variety of algorithms were developed for segmentation of MRI images by using …
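A minimal sketch of the two-region partitioning step described above, using plain Otsu thresholding on a synthetic slice (NumPy and scikit-image assumed); a real pipeline would operate on preprocessed MRI data with far more elaborate segmentation algorithms.

# Synthetic data only; no real MRI slice is used here.
import numpy as np
from skimage.filters import threshold_otsu

mri_slice = np.zeros((128, 128))
mri_slice[40:70, 50:90] = 1.0                  # synthetic bright "lesion" region
mri_slice += 0.1 * np.random.rand(128, 128)    # background noise

t = threshold_otsu(mri_slice)                  # global intensity threshold
mask = mri_slice > t                           # partition into two distinct regions
print("candidate-region pixels:", int(mask.sum()))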

Publication Date
Thu Aug 31 2017
Journal Name
Journal Of Engineering
Optimum Dimensions of Hydraulic Structures and Foundation Using Genetic Algorithm coupled with Artificial Neural Network

A model using artificial neural networks and the genetic algorithm technique is developed for obtaining the optimum dimensions of the foundation length and protections of small hydraulic structures. The procedure involves optimizing an objective function comprising a weighted summation of the state variables. The decision variables considered in the optimization are the upstream and downstream cutoff lengths and their angles of inclination, the foundation length, and the length of the downstream soil protection. These were obtained for a given maximum difference in head, depth of the impervious layer and degree of anisotropy. The optimization is subject to constraints that ensure a safe structure against …
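A conceptual sketch of the GA-plus-ANN coupling, under the assumption of an ANN surrogate that approximates the objective and a simple genetic loop searching the design space. The four design variables and the toy objective below are hypothetical stand-ins, not the paper's hydraulic model.

# Illustrative surrogate-assisted genetic search; all numbers are assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def true_objective(x):                         # stand-in for the weighted state-variable objective
    return np.sum((x - 0.3) ** 2, axis=1)

X_train = rng.random((200, 4))                 # 4 normalised design variables (cutoff lengths, angles, ...)
surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
surrogate.fit(X_train, true_objective(X_train))

pop = rng.random((40, 4))                      # GA population of candidate designs
for _ in range(50):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[:20]]    # selection: keep the best half
    children = (parents[rng.integers(0, 20, 20)] + parents[rng.integers(0, 20, 20)]) / 2  # crossover
    children += 0.05 * rng.normal(size=children.shape)                                    # mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmin(surrogate.predict(pop))]
print("best design (normalised):", np.round(best, 3))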

Publication Date
Sat Mar 01 2008
Journal Name
Iraqi Journal Of Physics
Enforcing Wiener Filter in the Iterative Blind Restoration Algorithm

A new blind restoration algorithm is presented and shows high-quality restoration. This is done by enforcing a Wiener filtering approach in the Fourier domains of the image and of the point spread function (PSF).
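A minimal sketch of a single Wiener-filtering pass in the Fourier domain, assuming NumPy and a known blur kernel; the iterative blind part of the algorithm, which also re-estimates the PSF, is omitted.

# Classic Wiener deconvolution on a toy image; K is an assumed noise-to-signal constant.
import numpy as np

def wiener_deconvolve(blurred, psf, K=0.01):
    """Restore an image given an estimated PSF using the classic Wiener formula."""
    H = np.fft.fft2(psf, s=blurred.shape)      # PSF transfer function, zero-padded
    G = np.fft.fft2(blurred)                   # degraded image spectrum
    F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G
    return np.real(np.fft.ifft2(F_hat))

img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0            # toy scene
psf = np.ones((5, 5)) / 25.0                                 # uniform blur kernel
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
print("mean absolute restoration error:", float(np.abs(restored - img).mean()))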

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
A novel SVR estimation of FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil during the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated and their mean and volatility are forecast by quasi maximum likelihood (QML) as the traditional method, while the competing approach is a machine learning one based on Support Vector Regression (SVR). The most appropriate model for forecasting volatility is selected according to the lowest values of the Akaike and Schwarz information criteria, with the additional requirement that the parameters be significant. In addition, the residuals …
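A rough sketch of the SVR side of the comparison, assuming scikit-learn and simulated returns: squared returns serve as a volatility proxy and are regressed on their own lags for a one-step-ahead forecast. The paper's white oil series and the FIGARCH/QML benchmark are not reproduced.

# Simulated returns; lag order and SVR hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
returns = rng.normal(scale=0.02, size=500)     # simulated daily returns
vol_proxy = returns ** 2                       # squared returns as a volatility proxy

lags = 5
X = np.column_stack([vol_proxy[i:len(vol_proxy) - lags + i] for i in range(lags)])
y = vol_proxy[lags:]

model = SVR(kernel="rbf", C=1.0, epsilon=1e-5).fit(X, y)
next_vol = model.predict(vol_proxy[-lags:].reshape(1, -1))
print("one-step-ahead volatility forecast:", float(next_vol[0]))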

Publication Date
Sun Oct 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A novel data offloading scheme for QoS optimization in 5G based internet of medical things

The internet of medical things (IoMT) is expected to become one of the most widely distributed technologies worldwide. Using fifth-generation (5G) transmission, market possibilities and hazards related to the IoMT can be expanded and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, alter attitudes and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve the quality of service (QoS). Building on this, we propose the enriched energy efficient fuzzy (EEEF) data offloading technique to enhance the delivery of data …
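A toy stand-in for a fuzzy offloading decision (not the EEEF technique itself): each factor is mapped to a [0, 1] membership value and the aggregate score decides whether a packet of medical data is offloaded over 5G or processed locally. All thresholds and factor names below are assumptions made for illustration.

def membership_low(value, low, high):
    """1 when value <= low, 0 when value >= high, linear in between."""
    return max(0.0, min(1.0, (high - value) / (high - low)))

def offload_score(battery_pct, link_quality, data_mb):
    energy_pressure = membership_low(battery_pct, 20, 80)       # low battery favours offloading
    good_link = 1.0 - membership_low(link_quality, 0.3, 0.9)    # strong 5G link favours offloading
    heavy_payload = 1.0 - membership_low(data_mb, 1, 50)        # large payload favours offloading
    return (energy_pressure + good_link + heavy_payload) / 3.0

score = offload_score(battery_pct=35, link_quality=0.8, data_mb=20)
print("offload" if score > 0.5 else "process locally", round(score, 2))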

Publication Date
Fri Jan 01 2016
Journal Name
Journal Of Sensors
WDARS: A Weighted Data Aggregation Routing Strategy with Minimum Link Cost in Event-Driven WSNs

Realizing the full potential of wireless sensor networks (WSNs) highlights many design issues, particularly the trade-offs between multiple conflicting objectives such as maximizing route overlapping for efficient data aggregation and minimizing the total link cost. While data aggregation routing protocols and link cost functions in WSNs have each been considered comprehensively in the literature, a trade-off improvement between the two has not yet been addressed. In this paper, a comprehensive weight for the trade-off between the different objectives is employed, the so-called weighted data aggregation routing strategy (WDARS), which aims to maximize the overlapping of routes for efficient data aggregation and link cost …
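A minimal sketch of the weighted trade-off idea: each candidate route is scored by a weighted combination of total link cost and (negative) aggregation overlap, and the best-scoring route is kept. The routes and the weights ALPHA and BETA below are hypothetical, not WDARS parameters.

# Illustrative route selection under a composite weight; values are invented.
candidate_routes = {
    "route_A": {"link_cost": 12.0, "overlap": 3},   # shares 3 hops with existing event routes
    "route_B": {"link_cost": 9.0,  "overlap": 0},
    "route_C": {"link_cost": 14.0, "overlap": 5},
}

ALPHA, BETA = 1.0, 2.5                               # trade-off weights (assumed)

def score(route):
    m = candidate_routes[route]
    return ALPHA * m["link_cost"] - BETA * m["overlap"]   # lower is better

best = min(candidate_routes, key=score)
print(best, {r: round(score(r), 1) for r in candidate_routes})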

Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data came from the circumstances of our dear country and the horrors of war, which resulted in the loss of much important data in all aspects of economic, natural, health and scientific life. The reasons for missingness differ: some lie outside the will of those concerned, while others are planned, for example because of cost, risk, or the lack of means for inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods using simulation. Variables of child health and variables affecting children's health were taken into account: breastfeeding …
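A compact sketch of PCA-based imputation by iterative low-rank reconstruction, assuming NumPy/scikit-learn and simulated data (the study's child-health survey variables and its probabilistic variant are not reproduced): missing cells are initialised with column means, then repeatedly replaced by the reconstruction of a fitted PCA model.

# Simulated correlated data with 10% of cells removed at random.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 6)) @ rng.normal(size=(6, 6))    # correlated survey-like data
mask = rng.random(X.shape) < 0.1                           # missing completely at random
X_obs = X.copy()
X_obs[mask] = np.nan

filled = np.where(mask, np.nanmean(X_obs, axis=0), X_obs)  # mean initialisation
for _ in range(20):
    pca = PCA(n_components=2).fit(filled)
    recon = pca.inverse_transform(pca.transform(filled))
    filled[mask] = recon[mask]                             # update only the missing cells

print("imputation RMSE:", round(float(np.sqrt(np.mean((filled[mask] - X[mask]) ** 2))), 3))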

Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Sliced inverse regression with the principal components in reducing high-dimensional data by using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so that the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear …
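A small numerical sketch contrasting PCA with a basic sliced inverse regression on simulated data (NumPy/scikit-learn assumed; WSIR, the weighted variant proposed in the paper, is not reproduced). SIR uses the response y to recover the informative direction, while PCA ignores y.

# Simulated single-index data: y depends only on the first coordinate of X.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n, p = 500, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[0] = 1.0
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=n)

# Basic SIR: standardize X, slice by sorted y, average X within slices, eigen-decompose.
Z = (X - X.mean(0)) / X.std(0)
order = np.argsort(y)
slices = np.array_split(order, 10)
M = np.cov(np.vstack([Z[s].mean(0) for s in slices]).T, bias=True)
sir_dir = np.linalg.eigh(M)[1][:, -1]              # leading eigenvector

pca_dir = PCA(n_components=1).fit(X).components_[0]

print("|corr| with true direction - SIR:", round(abs(np.dot(sir_dir, beta)), 2),
      " PCA:", round(abs(np.dot(pca_dir, beta)), 2))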
