Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins with extracting features from the image using the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different resolutions. By considering features from multiple levels, the detection algorithm can better capture both the global and local characteristics of the manipulated regions, enhancing the accuracy of forgery detection. To achieve a high accuracy rate, this paper presents a variety of scenarios based on a machine-learning approach. In copy-move detection, artifacts and their properties are used as image features, and a Support Vector Machine (SVM) determines whether an image has been tampered with. The dataset is used to train and test each classifier; the goal is to learn the discriminative patterns that reveal instances of copy-move forgery. The Media Integration and Communication Center dataset (MICC-F2000) was utilized in this paper. Experimental evaluations demonstrate the effectiveness of the proposed methodology in detecting copy-move forgery. The implementation phases in the proposed work have produced encouraging outcomes. In the best-implemented scenario involving multiple trials, the detection stage achieved a copy-move detection accuracy of 97.8%.
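A minimal sketch of the multi-level LBP plus statistics plus SVM pipeline described in this abstract, assuming scikit-image's local_binary_pattern and scikit-learn's SVC; the radii, histogram binning, and kernel choice are illustrative assumptions, not the authors' exact configuration.

```python
# Illustrative multi-level LBP feature extraction with STD and ASM per level,
# followed by an SVM classifier (assumed parameters, not the paper's settings).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_statistics(gray, radii=(1, 2, 3), points=8):
    """Multi-level LBP features: normalized histogram plus STD and ASM per level."""
    features = []
    for r in radii:                                   # one "level" per radius
        lbp = local_binary_pattern(gray, points, r, method="uniform")
        hist, _ = np.histogram(lbp, bins=np.arange(points + 3))
        hist = hist / hist.sum()                      # normalize to probabilities
        std = hist.std()                              # Standard Deviation (STD)
        asm = np.sum(hist ** 2)                       # Angular Second Moment (ASM)
        features.extend(hist.tolist() + [std, asm])
    return np.array(features)

# X: feature vectors for original and tampered images, y: 0 = original, 1 = forged
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# y_pred = clf.predict(X_test)
```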
This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed to be used in a semester system. To achieve the goals of the book, it is divided into the following chapters (as in the first edition, 2019). Chapter One introduces matrix algebra. Chapter Two is devoted to the solution of linear equation systems, together with quadratic forms and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain the inverse, Jacobian, and Hessian matrices. Chapter Four deals with the Multivariate Normal Distribution (MVN). Chapter Five concerns joint, marginal, and conditional normal distributions, independence, and correlations, while revised new chapters have been added (as the curr
In this study, the clearance between the active knife and the fixed knife of a single-row disc silage machine was set to three different values C1, C2 and C3 (1, 3 and 5 mm), and the machine was tested at three different working speeds V1, V2 and V3 (1.8, 2.5 and 3.7 km/h) at a PTO speed of 540 min-1. The machine's fuel consumption (l/h), average power consumption (kW), field energy consumption (kW/da), product energy consumption (kW/t), field working capacity (da/h), product working capacity (t/h) and the chopping-size distribution characteristics of the fragmented material were determined. It was found that knife-counter-knife clearances smaller than 3 mm (1 mm) and larger than 3 mm (5 mm) generally have a negative effect on machine performance. In terms of fuel and power consumption, the m
Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes like customer relationship management and accounting operations. It is incredibly challenging to use and extract valuable and meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in the data analysis process of resolving business issues. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has bee
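As a hedged illustration of the association-rule mining this abstract refers to, the sketch below shows a minimal Apriori-style frequent-itemset pass; the transactions, the minimum-support threshold, and the helper names are illustrative assumptions, not the paper's implementation.

```python
# Minimal, illustrative Apriori-style frequent-itemset mining (assumed data and names).
from itertools import combinations

def frequent_itemsets(transactions, min_support=0.5, max_size=3):
    """Return itemsets whose support (fraction of transactions) >= min_support."""
    n = len(transactions)
    items = sorted({item for t in transactions for item in t})
    frequent, candidates = {}, [frozenset([i]) for i in items]
    for size in range(1, max_size + 1):
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(level)
        # Candidate generation: join frequent k-itemsets to form (k+1)-itemsets
        candidates = list({a | b for a, b in combinations(level, 2)
                           if len(a | b) == size + 1})
        if not candidates:
            break
    return frequent

# Hypothetical transactions (e.g., items co-occurring in administrative records):
baskets = [{"invoice", "customer"}, {"invoice", "payment"}, {"invoice", "customer", "payment"}]
print(frequent_itemsets(baskets, min_support=0.6))
```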
Problem: Cancer is regarded as one of the world's deadliest diseases. Machine learning and its newer branch, deep learning, can facilitate the way cancer is dealt with, especially in the field of cancer prevention and detection. Traditional ways of analyzing cancer data have their limits, and cancer data is growing quickly; this opens the door for deep learning, with its powerful abilities to analyze and process cancer data. Aims: In the current study, a deep-learning medical support system for the prediction of lung cancer is presented. Methods: The study uses three different deep learning models (EfficientNetB3, ResNet50 and ResNet101) with the transfer learning concept. The three models are trained using a
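A minimal transfer-learning sketch along the lines this abstract describes, using EfficientNetB3 from Keras Applications; the input size, classifier head, optimizer, and class count are assumptions, not the study's configuration.

```python
# Illustrative transfer-learning setup (assumed hyperparameters, not the study's exact ones).
import tensorflow as tf

def build_classifier(num_classes=2, input_shape=(300, 300, 3)):
    # Pre-trained ImageNet backbone; the same pattern applies to ResNet50 / ResNet101.
    base = tf.keras.applications.EfficientNetB3(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False                      # freeze the backbone for transfer learning
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# model = build_classifier()
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```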
Exploitation of mature oil fields around the world has forced researchers to develop new ways to optimize reservoir performance from such reservoirs. Drilling horizontal wells is an effective method of achieving that, since this kind of well increases oil withdrawal. The objective of this study is to optimize the location, design, and completion of a new horizontal producer well to improve oil recovery in a real field located in Iraq. "A" is an oil and gas condensate field located in the northeast of Iraq. From the field production history, it is clear that controlling gas and water production is difficult in this kind of complex carbonate reservoir with vertical producer wells. In this study, a horizont
Flying Ad hoc Networks (FANETs) have emerged as an innovative technology for accessing places without permanent infrastructure. This emerging form of networking is constructed of flying nodes, known as unmanned aerial vehicles (UAVs), that fly at high speed, causing frequent changes in the network topology and connection failures. As a result, there is no dedicated FANET routing protocol that enables effective communication between these devices. The purpose of this paper is to evaluate the performance of the category of topology-based routing protocols in FANETs. In a surveillance system involving video traffic, four routing protocols with different routing mechanisms were examined. Additionally, simulation experiments conduct
This paper proposes a new algorithm (F2SE) and an algorithm (Alg(n – 1)) for solving the two-machine flow shop problem with the objective of minimizing total earliness. This complexity result leads us to use an enumeration solution approach; the algorithms (F2SE) and (DM) are more effective than algorithm Alg(n – 1) for obtaining approximate solutions.
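A minimal sketch of an enumeration approach to the two-machine flow shop with total earliness, included only to illustrate the objective; the job data, the earliness definition (E_j = max(0, d_j − C_j)), and the brute-force search are assumptions, not the paper's F2SE or DM algorithms.

```python
# Brute-force enumeration for a two-machine flow shop, minimizing total earliness
# (illustrative only; not the proposed algorithms).
from itertools import permutations

def total_earliness(order, p1, p2, due):
    """Completion times follow the usual flow-shop recursion; earliness on machine 2."""
    c1 = c2 = 0
    total = 0
    for j in order:
        c1 += p1[j]                     # finish of job j on machine 1
        c2 = max(c1, c2) + p2[j]        # machine 2 starts when both are free
        total += max(0, due[j] - c2)    # earliness of job j
    return total

def best_schedule(p1, p2, due):
    jobs = range(len(p1))
    return min(permutations(jobs), key=lambda order: total_earliness(order, p1, p2, due))

# Hypothetical 4-job instance: processing times on machines 1 and 2, and due dates.
p1, p2, due = [3, 2, 4, 1], [2, 5, 1, 3], [9, 10, 12, 7]
best = best_schedule(p1, p2, due)
print(best, total_earliness(best, p1, p2, due))
```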
In this paper, we present a Branch and Bound (B&B) algorithm for scheduling (n) jobs on a single machine to minimize the sum of total completion time, total tardiness, total earliness, number of tardy jobs and total late work with unequal release dates. We propose six heuristic methods to compute an upper bound. To obtain a lower bound (LB) for this problem, we modified an LB selected from the literature, combined with Moore's algorithm and Lawler's algorithm. Some dominance rules were suggested, and two special cases were derived. Computational experience showed that the proposed B&B algorithm was effective in solving problems with up to (16) jobs; the upper bounds and the lower bound were also effective in restr
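A minimal sketch of how the composite objective in such a single-machine problem can be evaluated for one job sequence; the instance data and the unweighted sum of the five terms are assumptions, not the paper's exact formulation or its B&B procedure.

```python
# Sketch: evaluating the composite objective for a given sequence on a single machine
# with unequal release dates (illustrative instance and unweighted sum).
def objective(seq, r, p, d):
    t = 0
    total_c = tard = early = tardy_jobs = late_work = 0
    for j in seq:
        start = max(t, r[j])                 # respect release date r_j
        t = start + p[j]                     # completion time C_j
        total_c += t                         # total completion time
        tj = max(0, t - d[j])                # tardiness T_j
        tard += tj
        early += max(0, d[j] - t)            # earliness E_j
        tardy_jobs += 1 if t > d[j] else 0   # number of tardy jobs U_j
        late_work += min(tj, p[j])           # late work V_j = min(T_j, p_j)
    return total_c + tard + early + tardy_jobs + late_work

# Hypothetical 3-job instance (release dates, processing times, due dates):
r, p, d = [0, 2, 4], [3, 2, 2], [5, 6, 9]
print(objective([0, 1, 2], r, p, d))
```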
In the petroleum industry, multiphase flow dynamics within the tubing string have gained significant attention due to the associated challenges. Accurately predicting pressure drops and wellbore pressures is crucial for the effective modeling of vertical lift performance (VLP). This study focuses on predicting the multiphase flow behavior in four wells located in the Faihaa oil field in southern Iraq, utilizing PIPESIM software. The most appropriate multiphase correlation was selected by using production test data to construct a comprehensive survey data catalog. Subsequently, the results were compared with the correlations available within the PIPESIM software. The outcomes reveal that the Hagedorn and Brown (H
Recently, biometric technologies have been widely used due to their improved security, which decreases cases of deception and theft. Biometric technologies use physical features and characteristics to identify individuals. The most common biometric technologies are: iris, voice, fingerprint, handwriting and hand print. In this paper, two biometric recognition technologies are analyzed and compared: iris and voice recognition. The iris recognition technique recognizes persons by analyzing the main patterns in the iris structure, while the voice recognition technique identifies individuals based on their unique voice characteristics, also called a voice print. The comparison results show that the resul