Copy Move Image Forgery Detection using Multi-Level Local Binary Pattern Algorithm

Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins with extracting the image's Local Binary Pattern (LBP) features. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different resolutions. By considering features from multiple levels, the detection algorithm can better capture both the global and local characteristics of the manipulated regions, enhancing the accuracy of forgery detection. To achieve a high accuracy rate, this paper presents a variety of scenarios based on a machine-learning approach. In copy-move detection, artifacts and their properties are used as image features, and a Support Vector Machine (SVM) determines whether an image has been tampered with. The dataset is manipulated to train and test each classifier; the target is to learn the discriminative patterns that reveal instances of copy-move forgery. The Media Integration and Communication Center dataset (MICC-F2000) was utilized in this paper. Experimental evaluations demonstrate the effectiveness of the proposed methodology in detecting copy-move forgery. The implementation phases in the proposed work have produced encouraging outcomes. In the best-implemented scenario, involving multiple trials, the detection stage achieved a copy-move accuracy of 97.8%.
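The multi-level pipeline described above can be illustrated in a minimal sketch. This is not the authors' implementation: the basic 3x3 LBP operator, the 2x down-sampling pyramid, and all function names are illustrative assumptions; only the overall idea (LBP codes summarised by STD and ASM at several levels) follows the abstract.

```python
# Hypothetical sketch of a multi-level LBP feature pipeline: compute a basic
# 3x3 LBP code map at several scales, then summarise each map with Standard
# Deviation (STD) and Angular Second Moment (ASM) of its histogram.
import numpy as np

def lbp_map(gray):
    """Basic 8-neighbour LBP: threshold each neighbour against the centre pixel."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(shifts):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code

def std_asm(code):
    """STD of the code map plus ASM (sum of squared bin probabilities)."""
    hist = np.bincount(code.ravel(), minlength=256) / code.size
    return float(code.std()), float((hist ** 2).sum())

def multilevel_features(gray, levels=3):
    """Concatenate (STD, ASM) over a simple 2x down-sampling pyramid."""
    feats = []
    for _ in range(levels):
        feats.extend(std_asm(lbp_map(gray)))
        gray = gray[::2, ::2]  # next, coarser level
    return np.array(feats)

img = np.random.default_rng(0).integers(0, 256, (64, 64))
print(multilevel_features(img).shape)  # (6,): 2 statistics x 3 levels
```

In a full system, such feature vectors (one per image block) would feed the SVM classifier mentioned in the abstract.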

Publication Date
Thu Jul 01 2021
Journal Name
Iraqi Journal Of Science
Implementation of Machine Learning Techniques for the Classification of Lung X-Ray Images Used to Detect COVID-19 in Humans

COVID-19 (Coronavirus disease-2019), commonly called Coronavirus or CoV, is a dangerous disease caused by the SARS-CoV-2 virus. It is one of the most widespread zoonotic diseases in the world, and it started in one of the wet markets of Wuhan city. Its symptoms are similar to those of the common flu, including cough, fever, muscle pain, shortness of breath, and fatigue. This article suggests implementing machine learning techniques (Random Forest, Logistic Regression, Naïve Bayes, Support Vector Machine) in Python to classify a series of chest X-ray images that include viral pneumonia, COVID-19, and healthy (not infected) cases in humans. The study includes more than 1400 images collected from the Kaggle platform. The expe…
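As a rough illustration of one of the four listed classifiers, here is a minimal Gaussian Naïve Bayes in plain NumPy. The three synthetic classes stand in for the healthy / viral pneumonia / COVID-19 groups; in the actual study the inputs would be flattened X-ray pixels or extracted features, so everything below is an assumption rather than the paper's setup.

```python
# Minimal Gaussian Naive Bayes on synthetic three-class data, as a stand-in
# sketch for the X-ray classification experiment described above.
import numpy as np

def fit_gnb(X, y):
    """Per-class mean, variance (with a small floor) and prior."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_gnb(stats, X):
    """Pick the class with the highest Gaussian log-likelihood + log prior."""
    scores = []
    for mu, var, prior in stats.values():
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(1)
        scores.append(ll + np.log(prior))
    return np.array(list(stats))[np.argmax(scores, axis=0)]

rng = np.random.default_rng(1)
# three well-separated synthetic "classes", 50 samples x 8 features each
X = np.vstack([rng.normal(m, 1.0, (50, 8)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 50)
model = fit_gnb(X, y)
print("training accuracy:", (predict_gnb(model, X) == y).mean())
```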

Publication Date
Fri Apr 30 2021
Journal Name
Iraqi Journal Of Science
Intelligent Agent Services in Electronic Libraries

Global services with an agent or a multi-agent system are a promising new research area. Several measures have been proposed to demonstrate the benefits of agent technology by supporting distributed services and applying smart agent technology in web dynamics. This paper is designed to build a Semantic Web on the World Wide Web (WWW) to enhance the productivity of managing electronic library applications. This addresses a problem faced by researchers and students, represented by the process of exchanging books among e-libraries, where the process is slow or the library requires a large data system.

Publication Date
Sun Apr 01 2018
Journal Name
Construction And Building Materials
Linear viscous approach to predict rut depth in asphalt mixtures

Rutting in asphalt mixtures is a very common type of distress. It occurs due to heavy applied loads and the slow movement of traffic. Rutting needs to be predicted to avoid major deformation of the pavement. A simple linear viscous method is used in this paper to predict rutting in asphalt mixtures using a multi-layer linear computer programme (BISAR). The material properties were derived from the Repeated Load Axial Test (RLAT) and represented by a strain-dependent axial viscosity. The axial viscosity was used in an incremental multi-layer linear viscous analysis to calculate the deformation rate during each increment, and therefore the overall development of rutting. The method has been applied to six mixtures and at different tem…
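The incremental linear viscous scheme described above can be sketched in a toy form: within each time increment, every layer deforms at rate stress/viscosity, and the permanent deformations accumulate into the rut depth. All numbers below (layer thicknesses, stresses, viscosities, increment size) are invented for illustration; in the paper they come from the BISAR analysis and the RLAT test.

```python
# Toy incremental linear viscous rutting calculation with invented inputs.
layers = [
    # (thickness m, axial stress Pa, strain-dependent axial viscosity Pa.s)
    (0.04, 500e3, 2.0e9),
    (0.06, 300e3, 5.0e9),
    (0.10, 150e3, 9.0e9),
]

def rut_depth(layers, dt, n_increments):
    """Accumulate permanent viscous deformation over all increments and layers."""
    rut = 0.0
    for _ in range(n_increments):
        for thickness, stress, viscosity in layers:
            strain_rate = stress / viscosity     # linear viscous law
            rut += strain_rate * dt * thickness  # permanent deformation this step
    return rut

print(f"rut depth: {1000 * rut_depth(layers, dt=0.02, n_increments=5000):.2f} mm")
```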

Publication Date
Wed Jul 01 2020
Journal Name
Journal Of Engineering
Examining sensitivity of financial performance at construction projects prequalification stage

Construction projects are complicated in nature and require many considerations in contractor selection. One of the complicated interactions is that among performance, project size, contractor financial status, and the size of the projects contracted. At the prequalification stage, the financial requirements restrict contractors to meeting minimum limits on financial criteria such as net worth, working capital, and annual turnover. In construction projects, however, there are cases where contractors meet these requirements but show low performance in practice. The model used in the study predicts performance by training a neural network. The data used in the study are 72 of the most recent roadw…

Publication Date
Mon May 14 2018
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
The Comparison Between Different Approaches to Overcome the Multicollinearity Problem in Linear Regression Models

In the presence of the multicollinearity problem, the parameter estimation method based on the ordinary least squares procedure is unsatisfactory. In 1970, Hoerl and Kennard introduced an alternative method known as the ridge regression estimator.

In such an estimator, the ridge parameter plays an important role in estimation. Various methods have been proposed by statisticians to select the biasing constant (ridge parameter). Another popular method used to deal with the multicollinearity problem is the principal component method. In this paper, we employ the simulation technique to compare the performance of the principal component estimator with some types of ordinary ridge regression estimators based on the value of t…
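The estimators being compared admit compact closed forms. A minimal NumPy sketch, contrasting the ridge estimator (X'X + kI)^(-1) X'y with plain OLS on deliberately collinear data, might look like the following; the ridge parameter k = 0.1 is an arbitrary choice here, since selecting k well is precisely the open question the paper addresses.

```python
# Ridge vs OLS on nearly collinear regressors (true coefficients are (1, 1)).
import numpy as np

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=1e-3, size=200)  # nearly collinear with x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=200)

b_ols = ols(X, y)
b_ridge = ridge(X, y, k=0.1)
# OLS coefficients are individually unstable (only their sum is pinned down);
# the ridge estimate stays near the true (1, 1)
print(np.round(b_ols, 2), np.round(b_ridge, 2))
```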

Publication Date
Thu Dec 30 2010
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Determination of the Optimum Operating Conditions in the Granulation of Gamma Alumina Catalyst Support

A granulation technique for gamma alumina catalyst support was employed in an inclined disk granulator (IDG), a rotary drum granulator (RD), and extrusion-spheronization equipment. Products with a wide size range can be produced with only a few parameters, such as the rpm of the equipment, the binder ratio, and the angle of inclination. The investigation was conducted to determine the optimum operating conditions in the three granulation equipment types above.
Results reveal that the optimum operating conditions for maximum granulation occurred at (speed: 31 rpm, inclination: 42°, binder ratio: 225, 300%) for the IDG; (speed: 68 rpm, inclination: 12.5°, binder ratio: 300%) for the RD; and (speed: 1200 rpm, rotation time: 1-2 min) for the Caleva…

Publication Date
Tue Nov 30 2021
Journal Name
Iraqi Journal Of Science
Inspecting Hybrid Data Mining Approaches in Decision Support Systems for Humanities Texts Criticism

The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically-supported decisions. However, these systems may face challenges and difficulties, or find it confusing to identify the required information (characterization) for eliciting a decision, when extracting or summarizing relevant information from large text documents or colossal content. When these documents are obtained online, for instance from social networking or social media sites, those sites undergo a remarkable increase in textual content. The main objective of the present study is to conduct a survey and show the latest developments in the implementation of text-mining techniqu…

Publication Date
Thu Nov 30 2023
Journal Name
Iraqi Journal Of Science
An Artificial Intelligence-based Proactive Network Forensic Framework

… is at an all-time high in the modern period, and the majority of the population uses the Internet for all types of communication. As a result of this trend, hackers have become increasingly focused on attacking systems and networks in numerous ways. When a hacker commits a digital crime, it is examined in a reactive manner, which aids in the identification of the perpetrators. In the modern period, however, one does not expect to wait for an attack to occur; the user anticipates being able to predict a cyberattack before it causes damage to the system. This can be accomplished with the assistance of the proactive forensic framework presented in this study. The proposed system combines…

Publication Date
Wed Aug 30 2023
Journal Name
Iraqi Journal Of Science
Network Traffic Prediction Based on Time Series Modeling

Predicting the network traffic of web pages is an area that has received increased focus in recent years. Modeling traffic helps find strategies for distributing network loads, identifying user behaviors and malicious traffic, and predicting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from time series of network traffic. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time series dataset is studied. Two time-series datasets were used for data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest), and comparing the performance of the models using SMAPE and…
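SMAPE, the comparison metric named above, has several variants; one common definition, the mean of 2|y − ŷ| / (|y| + |ŷ|) expressed as a percentage, can be sketched as follows (this is an assumption about which variant the paper uses):

```python
# Symmetric Mean Absolute Percentage Error (SMAPE), one common variant.
import numpy as np

def smape(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    num = 2.0 * np.abs(y_true - y_pred)
    denom = np.abs(y_true) + np.abs(y_pred)
    # guard against 0/0 when both actual and forecast are zero
    ratio = np.divide(num, denom, out=np.zeros_like(num), where=denom != 0)
    return 100.0 * ratio.mean()

print(smape([100, 200, 300], [110, 190, 300]))  # small errors -> low SMAPE
```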

Publication Date
Tue Aug 15 2023
Journal Name
Journal Of Economics And Administrative Sciences
Machine Learning Techniques for Analyzing Survival Data of Breast Cancer Patients in Baghdad

Machine learning methods, one of the most important branches of promising artificial intelligence, have great importance in all sciences, such as engineering and medicine, and have recently become widely involved in statistical sciences and their various branches, including survival analysis; they can be considered a new branch used to estimate survival, in parallel with the parametric, nonparametric, and semi-parametric methods widely used to estimate survival in statistical research. In this paper, the estimation of survival based on medical images of patients with breast cancer who receive their treatment in Iraqi hospitals is discussed. Three algorithms for feature extraction are explained: the first, principal compone…
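The standard nonparametric baseline alluded to above is the Kaplan-Meier estimator; a minimal NumPy version on made-up (time, event) pairs is sketched below, where event = 1 marks an observed death/relapse and event = 0 a censored observation. The data are invented for illustration, not taken from the paper.

```python
# Minimal Kaplan-Meier survival curve estimator.
import numpy as np

def kaplan_meier(times, events):
    """Return (distinct event times, survival probability just after each)."""
    times, events = np.asarray(times), np.asarray(events)
    t_points = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in t_points:
        at_risk = (times >= t).sum()                  # still under observation at t
        deaths = ((times == t) & (events == 1)).sum() # events occurring at t
        s *= 1 - deaths / at_risk
        surv.append(s)
    return t_points, np.array(surv)

t, s = kaplan_meier([3, 5, 5, 8, 12, 16], [1, 1, 0, 1, 0, 1])
print(t.tolist(), np.round(s, 3).tolist())
```

Note how the censored observations (events 0 at times 5 and 12) still count toward the at-risk set at earlier event times, which is what distinguishes this from a naive empirical distribution.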
