Data Mining Methods for Extracting Rumors Using Social Analysis Tools

Rumors are typically described as statements whose truth value is unknown. A rumor on social media can spread erroneous information to a large number of people, and such false claims can influence decision-making across societies. Detecting rumors is therefore critical in online social media, where enormous amounts of information are easily distributed over a large network of sources with unverified authority. This research proposes detecting rumors using Natural Language Processing (NLP) tools together with six distinct Machine Learning (ML) methods: Naïve Bayes (NB), Random Forest (RF), K-Nearest Neighbor (KNN), Logistic Regression (LR), Stochastic Gradient Descent (SGD), and Decision Tree (DT). The dataset for the experiment comprised 16,865 samples. For pre-processing, tokenization separated the text into individual tokens, normalization removed all non-word tokens, stop-word deletion removed unnecessary words, and stemming reduced each token to its stem. Before applying the six classification algorithms, Term Frequency-Inverse Document Frequency (TF-IDF) was used as the main feature extraction approach. According to the results, the RF classifier outperformed all other classifiers with an accuracy of 99%.
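As a minimal sketch of the TF-IDF weighting step the abstract names (the actual dataset and pipeline are not shown here; the toy tokens below are invented for illustration):

```python
import math
from collections import Counter

def tfidf(docs):
    """docs: list of token lists -> list of {term: tf-idf weight} per document."""
    n = len(docs)
    df = Counter()                       # document frequency of each term
    for d in docs:
        df.update(set(d))
    weights = []
    for d in docs:
        tf = Counter(d)                  # raw term frequency in this document
        weights.append({t: (tf[t] / len(d)) * math.log(n / df[t]) for t in tf})
    return weights
```

Terms that occur in every document get weight zero, while terms distinctive to one document get positive weight, which is what makes the representation useful as classifier input.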

Keywords: Machine learning, Text classification, Naïve Bayes, RF, KNN, DT, Natural language processing, SGD.

Publication Date: Mon Aug 17 2020
Journal Name: International Journal of Applied Mechanics and Engineering
Analysis of structural concrete bar members based on secant stiffness methods

In this paper, the behavior of structural concrete linear bar members is studied using a numerical model implemented in a MATLAB program. The numerical model is based on a modified version of the procedure developed by Oukaili, using real stress-strain diagrams of concrete and steel and their secant moduli of elasticity at different loading stages. The behavior, represented by normal force-axial strain and bending moment-curvature relationships, is studied by calculating the secant sectional stiffness of the member. Being based on secant methods, this methodology can be easily implemented as an iterative procedure for solving the non-linear equations. A comparison between numerical and experimental data illustrated […]
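The secant-stiffness iteration described above can be sketched for the simplest case, a bar under pure axial force; the parabolic concrete stress-strain curve and all numbers below are illustrative assumptions, not values from the paper:

```python
def stress(eps, fc=30.0, eps0=0.002):
    # parabolic (Hognestad-type) concrete stress-strain curve, MPa;
    # illustrative material law, not the one used in the paper
    return fc * (2.0 * eps / eps0 - (eps / eps0) ** 2)

def secant_axial_strain(N, A, fc=30.0, eps0=0.002, tol=1e-10, itmax=200):
    """Solve N = A * sigma(eps) by fixed-point iteration on the secant modulus."""
    eps = 1e-4                                  # starting guess
    for _ in range(itmax):
        E_sec = stress(eps, fc, eps0) / eps     # secant modulus at current strain
        new = N / (A * E_sec)                   # strain from secant stiffness
        if abs(new - eps) < tol:
            return new
        eps = new
    return eps
```

Because the stiffness is re-evaluated as a secant (total stress over total strain) rather than a tangent, each pass is a simple linear solve, which is the ease-of-implementation point the abstract makes.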

Publication Date: Tue Dec 05 2023
Journal Name: Baghdad Science Journal
A Novel System for Confidential Medical Data Storage Using Chaskey Encryption and Blockchain Technology

Secure storage of confidential medical information is critical for healthcare organizations seeking to protect patients' privacy and comply with regulatory requirements. This paper presents a new scheme for secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable, decentralized storage solution. It also uses Bflow segmentation and vertical segmentation techniques to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access control policies and other security measures. […]
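A rough sketch of the tamper-evident storage idea: a MAC over each record, chained into hash-linked blocks. HMAC-SHA256 stands in for Chaskey (which has no Python standard-library implementation), and the record fields are invented for illustration:

```python
import hashlib
import hmac
import json

def record_tag(key: bytes, record: dict) -> str:
    # integrity tag over the serialised record; HMAC-SHA256 is a stand-in
    # for the Chaskey MAC named in the paper
    msg = json.dumps(record, sort_keys=True).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def append_block(chain: list, tag: str) -> dict:
    # each block commits to the previous block's hash, giving tamper evidence
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "tag": tag,
             "hash": hashlib.sha256((prev + tag).encode()).hexdigest()}
    chain.append(block)
    return block
```

Changing any stored record changes its tag, which breaks the hash of its block and of every later block, so tampering is detectable by re-walking the chain.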

Publication Date: Sun Aug 01 2021
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Robust Tests for the Mean Difference in Paired Data by Using Bootstrap Resampling Technique

The paired-sample t-test for the difference between two means in paired data is not robust against violation of the normality assumption. In this paper, alternative robust tests are suggested using the bootstrap method, in addition to combining the bootstrap method with the W.M. test. Monte Carlo simulation experiments were employed to study the performance of the test statistics of each of these three tests in terms of Type I error rates and power. The three tests were applied to different sample sizes generated from three distributions: the bivariate normal distribution, the bivariate contaminated normal distribution, and the bivariate exponential distribution.
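A minimal sketch of a bootstrap test for the mean paired difference, using the usual trick of centering the differences to impose the null hypothesis; this is a generic percentile version, not necessarily the exact variant used in the paper:

```python
import random
import statistics

def bootstrap_paired_test(x, y, n_boot=2000, seed=1):
    """Two-sided bootstrap p-value for H0: mean paired difference = 0."""
    rng = random.Random(seed)
    d = [a - b for a, b in zip(x, y)]
    obs = statistics.fmean(d)
    centered = [v - obs for v in d]          # shift so H0 holds in the resampling world
    count = 0
    for _ in range(n_boot):
        samp = [rng.choice(centered) for _ in d]
        if abs(statistics.fmean(samp)) >= abs(obs):
            count += 1
    return (count + 1) / (n_boot + 1)        # add-one correction avoids p = 0
```

Because the procedure resamples the observed differences instead of assuming a sampling distribution, it stays valid when normality fails, which is the robustness argument of the paper.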

Publication Date: Tue Jan 01 2019
Journal Name: Baghdad Science Journal
Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for Type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results proved that the local bandwidth is the best for all types of boundary kernel functions […]
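One common form of kernel hazard estimator, a Ramlau-Hansen-style smoother built on the Epanechnikov kernel with a single global bandwidth, can be sketched as follows; this is an illustrative assumption, not the paper's exact estimator:

```python
def epanechnikov(u):
    # Epanechnikov kernel, one of the boundary kernels named in the abstract
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kernel_hazard(t, times, events, bandwidth):
    """Smoothed hazard estimate: kernel-weighted jumps of the Nelson-Aalen
    estimator. events[i] = 1 for an observed failure, 0 if censored."""
    total = 0.0
    for ti, di in zip(times, events):
        if di:
            at_risk = sum(1 for tj in times if tj >= ti)  # risk set size at ti
            total += epanechnikov((t - ti) / bandwidth) / at_risk
    return total / bandwidth
```

Replacing the fixed `bandwidth` with a per-point value is exactly the local-bandwidth variant the abstract reports as best-performing.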

Publication Date: Sat May 08 2021
Journal Name: Iraqi Journal of Science
EEG Signals Analysis for Epileptic Seizure Detection Using DWT Method with SVM and KNN Classifiers

Epilepsy is a critical neurological disorder with a profound influence on the lives of its victims, whose prominent features include persistent convulsion periods followed by unconsciousness. The electroencephalogram (EEG) is one of the devices commonly used for seizure recognition and epilepsy detection. Recognizing convulsions from EEG waves takes a relatively long time because it is conducted manually by epileptologists. After being captured, the EEG signals are analyzed and categorized into two types: normal or abnormal (indicating an epileptic seizure). This study relies on EEG signals provided by the Arrhythmia Database. Thus, this work is a step beyond the traditional database mission of delivering use […]
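The DWT feature-extraction step can be illustrated with one level of the Haar wavelet transform plus a sub-band energy feature; the actual wavelet, decomposition depth, and features used in the study are not specified in the excerpt:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns the
    approximation (low-pass) and detail (high-pass) coefficients."""
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energy(coeffs):
    # sub-band energy, a common scalar feature fed to SVM/KNN classifiers
    return sum(c * c for c in coeffs)
```

The orthogonality of the Haar transform means the sub-band energies partition the signal energy, so they summarize each frequency band without losing total power.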

Publication Date: Thu Apr 30 2020
Journal Name: Journal of Economics and Administrative Sciences
Rainwater Drainage Service Improvement Using a Number Of Quality Tools at the Directorate of Karbala Sewage

The basic objective of the research is to study the quality of the rainwater drainage service in the Directorate of Karbala Sewage and how to improve it, by identifying deviations in the processes and the final product and then proposing possible solutions to address the causes of the deviations and the associated quality gaps. A number of quality tools were used and applied to data from all stations, areas, and activities related to the drainage of rainwater. The research community comprises the rainwater-lifting stations of the Directorate of Karbala Sewage, and the Western station was chosen by purposive (non-random) sampling after meeting a number of […] It is one of the largest and m […]
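One of the classic quality tools such a study typically applies is Pareto analysis of deviation causes; the sketch below uses invented cause labels, not data from the Directorate:

```python
from collections import Counter

def pareto(causes):
    """Rank deviation causes by frequency and attach cumulative percentages,
    the tabular core of a Pareto chart."""
    counts = Counter(causes).most_common()
    total = sum(c for _, c in counts)
    rows, cum = [], 0
    for cause, c in counts:
        cum += c
        rows.append((cause, c, round(100.0 * cum / total, 1)))
    return rows
```

Reading the cumulative column identifies the "vital few" causes that account for most deviations, which is where improvement effort is then concentrated.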

Publication Date: Sun Oct 30 2022
Journal Name: Iraqi Journal of Science
The Best Efficient Solutions for Multi-Criteria Travelling Salesman Problem Using Local Search Methods

In this research, we propose using two local search methods (LSMs), Particle Swarm Optimization (PSO) and the Bees Algorithm (BA), to solve the Multi-Criteria Travelling Salesman Problem (MCTSP) and obtain the best efficient solutions. The initial population of the proposed LSMs may be generated randomly or by adding some initial solutions obtained from efficient heuristic methods. The solutions obtained by PSO and BA are compared with those of exact methods (complete enumeration and branch-and-bound) and some heuristic methods. The results proved the efficiency of the PSO and BA methods for a large number of nodes. The proposed LSMs give the best efficient solutions for the MCTSP for […]
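The tour-improvement moves that PSO- and bee-style TSP solvers rely on can be illustrated with plain 2-opt improvement on a single-criterion tour; this is a generic sketch, not the paper's multi-criteria algorithm:

```python
import itertools

def tour_length(tour, dist):
    # total length of a closed tour under the distance matrix dist
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Repeatedly reverse a segment whenever that shortens the tour:
    the elementary neighbourhood move behind many TSP local searches."""
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(1, len(tour)), 2):
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(cand, dist) < tour_length(tour, dist):
                tour, improved = cand, True
    return tour
```

In a multi-criteria setting the acceptance test `<` would be replaced by a dominance check over all objective values, but the segment-reversal neighbourhood stays the same.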

Publication Date: Tue Dec 05 2023
Journal Name: Baghdad Science Journal
Processing of Polymers Stress Relaxation Curves Using Machine Learning Methods

Currently, one of the topical areas of application of machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for solving some important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem in CatBoost Regressor has been carried out. The object of […]
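Before any tree-based regressor is applied, a stress relaxation curve is often summarized by a simple rheological model. As an illustrative baseline (not the paper's CatBoost pipeline), a single Maxwell element can be fitted by log-linear least squares:

```python
import math

def fit_maxwell(times, stresses):
    """Least-squares fit of sigma(t) = sigma0 * exp(-t / tau) via linear
    regression on log(sigma); a single Maxwell element is the simplest
    stress-relaxation model."""
    logs = [math.log(s) for s in stresses]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(logs) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, logs))
             / sum((t - tbar) ** 2 for t in times))
    sigma0 = math.exp(ybar - slope * tbar)   # initial stress
    tau = -1.0 / slope                       # relaxation time
    return sigma0, tau
```

The recovered `sigma0` and `tau` are exactly the kind of rheological targets a learned regressor would predict from measured curves.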

Publication Date: Fri Jul 01 2011
Journal Name: 25th International Cartographic Conference
User generated content and formal data sources for integrating geospatial data

Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary in quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on integrating the geometric and classification schemes of official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions o […]
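A toy version of the geometric matching step in such conflation work is greedy nearest-neighbour linking of point features by great-circle distance; the station names and coordinates below are invented for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def match_features(official, vgi, max_m=50.0):
    """Greedy conflation: link each official point to its nearest VGI point,
    provided the pair is closer than max_m metres."""
    matches = []
    for name, (la, lo) in official.items():
        best = min(vgi.items(), key=lambda kv: haversine_m(la, lo, *kv[1]))
        if haversine_m(la, lo, *best[1]) <= max_m:
            matches.append((name, best[0]))
    return matches
```

Real conflation also compares attributes and geometry type, but a distance gate like this is usually the first filter.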

Publication Date: Wed Dec 26 2018
Journal Name: Iraqi Journal of Science
Extraction of Vacant Lands for Baghdad City Using Two Classification Methods of Very High Resolution Satellite Images

Remote sensing technologies have gained more attention due to an increasing need to collect data on environmental changes. Satellite image classification is a relatively recent form of remote sensing that uses satellite imagery to indicate many key environmental characteristics. This study aims at classifying and extracting vacant lands from high-resolution satellite images of Baghdad city using the supervised classification tool in the ENVI 5.3 program. The classification accuracy was 15%, which can be regarded as fairly acceptable given the difficulty of differentiating vacant land surfaces from other surfaces such as the rooftops of buildings.
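As an illustration of what a supervised classifier does with training-region statistics (ENVI's own algorithms are not reproduced here), a minimum-distance-to-means rule can be sketched; the class names and band values are invented:

```python
def minimum_distance_classify(pixel, class_means):
    """Assign a pixel's band vector to the nearest class mean by squared
    Euclidean distance; the simplest supervised classifier offered by
    packages such as ENVI. Class means come from user-drawn training regions."""
    def sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(class_means, key=lambda c: sq(pixel, class_means[c]))
```

Spectral overlap between classes, such as bright rooftops resembling bare vacant land, is exactly what drives misclassification in rules like this one.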
