Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins with extracting the image's Local Binary Pattern (LBP) features. Two statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, multi-level LBP feature selection is applied to select the most relevant features. This involves computing the LBP at multiple scales or levels, capturing textures at different resolutions. By considering features from multiple levels, the detection algorithm can better capture both the global and local characteristics of manipulated regions, enhancing the accuracy of forgery detection. To achieve a high accuracy rate, this paper presents a variety of scenarios based on a machine-learning approach. For copy-move detection, artifacts and their properties are used as image features, and a Support Vector Machine (SVM) determines whether an image has been tampered with. The dataset is manipulated to train and test each classifier; the goal is to learn discriminative patterns that reveal instances of copy-move forgery. The Media Integration and Communication Center dataset (MICC-F2000) was used in this paper. Experimental evaluations demonstrate the effectiveness of the proposed methodology in detecting copy-move forgery, and the implementation phases produced encouraging outcomes: in the best-performing scenario, involving multiple trials, the detection stage achieved a copy-move accuracy of 97.8%.
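The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the multi-level LBP is approximated by computing LBP histograms at several radii, the STD and ASM are taken over each normalized histogram, and the training data here are random placeholders. The helper name `extract_features` and all parameter values are assumptions.

```python
# Sketch: multi-level LBP features with STD/ASM statistics, fed to an SVM.
# Assumes scikit-image and scikit-learn are available.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def extract_features(gray, radii=(1, 2, 3)):
    """Multi-level LBP: compute LBP at several radii, then summarize each
    normalized histogram with its standard deviation (STD) and its
    angular second moment (ASM, the sum of squared probabilities)."""
    feats = []
    for r in radii:
        p = 8 * r                                  # sampling points per radius
        lbp = local_binary_pattern(gray, p, r, method="uniform")
        hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2))
        hist = hist / hist.sum()                   # histogram -> probabilities
        feats += [hist.std(), np.sum(hist ** 2)]   # STD and ASM per level
    return np.array(feats)

# Toy training set: random "authentic" vs "tampered" patches (placeholders).
rng = np.random.default_rng(0)
X = np.array([extract_features(rng.integers(0, 256, (64, 64), dtype=np.uint8))
              for _ in range(20)])
y = np.array([0] * 10 + [1] * 10)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:2]))
```

In practice the features would be computed per block so that matching block pairs reveal the copied region, and the SVM would be trained on labeled authentic/tampered images.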
An oil spill is the release of petroleum products into the marine environment or onto land from leaking pipelines, vessels, oil rigs, or tankers; it can occur naturally or through human action and results in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, however, these processes can now be automated to extract vital information from real-time satellite images. This paper applies three deep-learning algorithms to satellite image classification.
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which can become a major challenge in many cases. This study addresses and investigates the financial aspects of working on multiple projects at a time. It deals with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the financing of activities at any time w
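The genetic-algorithm idea can be illustrated with a deliberately small sketch: a chromosome encodes activity start times across projects, and fitness rewards profit while penalizing schedule length and any cash exposure beyond a credit limit. The project data, fitness weights, and operators below are illustrative assumptions, not the paper's model.

```python
# Minimal GA sketch of finance-based multi-project scheduling.
import random

# (duration, cost, revenue_on_finish) for activities across projects
ACTIVITIES = [(3, 40, 60), (2, 30, 45), (4, 50, 80), (3, 20, 35)]
CREDIT_LIMIT = 70          # maximum allowed negative cash exposure
HORIZON = 12               # latest allowed activity start time

def fitness(starts):
    """Higher is better: profit minus penalties for overdraft and makespan."""
    events = []
    for (dur, cost, rev), s in zip(ACTIVITIES, starts):
        events.append((s, -cost))                # pay cost at activity start
        events.append((s + dur, rev))            # collect revenue at finish
    events.sort()
    balance, worst = 0.0, 0.0
    for _, delta in events:
        balance += delta
        worst = min(worst, balance)              # deepest cash dip
    makespan = max(s + d for (d, _, _), s in zip(ACTIVITIES, starts))
    overdraft = max(0.0, -worst - CREDIT_LIMIT)  # shortfall beyond the limit
    return balance - 10 * overdraft - 1.0 * makespan

def evolve(pop_size=40, gens=60):
    pop = [[random.randint(0, HORIZON) for _ in ACTIVITIES]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]         # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(ACTIVITIES))
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < 0.2:            # mutation: redraw one start
                child[random.randrange(len(child))] = random.randint(0, HORIZON)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())
```

A full model would add precedence constraints, per-project cash flows, and payment delays; the skeleton above only shows how the GA searches over start times under a financing penalty.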
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of data transmission, which le
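The clustering and election steps can be sketched as follows: a standard fuzzy C-means over sensor positions, then one head per cluster chosen by residual energy and distance to the cluster center. The Voronoi partitioning and packet-loss term are omitted for brevity, and the equal weighting in the scoring rule is an assumption, not the paper's formula.

```python
# Sketch: fuzzy C-means over sensor positions + cluster-head election.
import numpy as np

def fcm(points, c=3, m=2.0, iters=50, seed=0):
    """Standard fuzzy C-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(points), c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ points) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :],
                           axis=2) + 1e-9        # avoid division by zero
        U = 1.0 / (d ** (2 / (m - 1)))           # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

def elect_heads(points, energy, centers, U):
    """One head per cluster: high residual energy, small distance to the
    cluster center (equal weights assumed for illustration)."""
    labels = U.argmax(axis=1)
    heads = []
    for k, ctr in enumerate(centers):
        idx = np.flatnonzero(labels == k)
        if idx.size == 0:
            continue                             # skip empty clusters
        dist = np.linalg.norm(points[idx] - ctr, axis=1)
        heads.append(idx[(energy[idx] - dist).argmax()])
    return heads

rng = np.random.default_rng(1)
pts = rng.random((30, 2)) * 100                  # 30 sensors in a 100x100 field
en = rng.random(30)                              # residual energy per sensor
centers, U = fcm(pts)
print(elect_heads(pts, en, centers, U))
```

In the full algorithm, clustering would operate on Voronoi cells rather than raw sensor positions, and the election score would also incorporate packet loss.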
Phenomena often suffer from disturbances in their data as well as difficulty of formulation, especially when the response is unclear or when many essential differences plague the experimental units from which the data are taken. Hence the need arose for an estimation method that implicitly rates these experimental units, either through discrimination or by creating blocks for each group of experimental units, in the hope of controlling their responses and making them more homogeneous. With developments in computing, and following the principle of integrating the sciences, it has been found that modern algorithms used in computer science, such as the genetic algorithm or ant colo
... Show MoreThe distribution of the expanded exponentiated power function EEPF with four parameters, was presented by the exponentiated expanded method using the expanded distribution of the power function, This method is characterized by obtaining a new distribution belonging to the exponential family, as we obtained the survival rate and failure rate function for this distribution, Some mathematical properties were found, then we used the developed least squares method to estimate the parameters using the genetic algorithm, and a Monte Carlo simulation study was conducted to evaluate the performance of estimations of possibility using the Genetic algorithm GA.
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
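Entropy discretization, one of the two algorithms named above, can be illustrated with a single-split sketch: choose the cut point on a numeric feature that minimizes the weighted class entropy of the two resulting bins. This is a simplified, one-cut version of the general idea; the paper's multi-resolution summarization structure is not modeled here.

```python
# Sketch: single-cut entropy-based discretization of one numeric feature.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a class-label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return the cut point that minimizes the weighted entropy
    of the two bins it creates."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (float("inf"), None)
    for i in range(1, n):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2   # midpoint between neighbors
        best = min(best, (w, cut))
    return best[1]

vals = [1.0, 1.2, 1.4, 5.0, 5.2, 5.5]
labs = ["a", "a", "a", "b", "b", "b"]
print(best_cut(vals, labs))   # midpoint between the two classes: 3.2
```

Recursive application of this split (with a stopping criterion such as the MDL test) yields a full discretization; operating on summarized counts rather than raw rows is what makes it scale to large data sets.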
... Show MoreBackgroundCarcinoma of the larynx represent 10% of head & neck malignancies. The treatment of advanced carcinoma of larynx may include partial or total laryngectomy, with or without laser , radiotherapy,and or chemotherapy . Carcinoma of the larynx usually affect old age , heavy smoker with possible risk of pulmonary diseases & ischemic heart disease , which add risk to the general anaesthetic complication operative & postoperative - Objectives this study was designed to assess the feasibility of total laryngectomy under local anaesthesia in medically unfit patients for general anaesthesia & to re-establish practice doing total laryngectomy under local anaesthesia in those patients.
Methods: A prospective study on 12 pa