DeepFakes concern celebrities and the general public alike because they are simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current automated approaches. Video manipulation detection, on the other hand, is better served by state-of-the-art systems than image detection, yet it ultimately depends on detecting the manipulation in individual images. Many researchers have worked on DeepFake detection in images, but their methods involved complex mathematical calculations in the preprocessing steps and suffered many limitations: the face must be frontal, the eyes must be open, the mouth should be open with the teeth visible, and so on. Moreover, the counterfeit-detection accuracy of all previous studies was lower than that achieved in this paper, especially on the benchmark Flickr faces high-quality dataset (FFHQ). This study proposes a new, simple, but powerful method called image re-representation by combining the local binary pattern of multiple channels (IR-CLBP-MC) in a color space, an image re-representation technique that improves DeepFake detection accuracy. IR-CLBP-MC builds on the fundamental concept of the multiple-channel local binary pattern (MCLBP), an extension of the original LBP. The primary distinction is that in our method the LBP decimal value is calculated in each channel of a local patch, and the channel codes are merged to re-represent the image, producing a new image with three color channels. A pretrained convolutional neural network (CNN) was used to extract deep textural features from twelve sets of IR-CLBP-MC images made from different color spaces: RGB, XYZ, HLS, HSV, YCbCr, and LAB. Experiments with the overlapping and non-overlapping patch techniques showed that the overlapping technique works better with IR-CLBP-MC, and that the YCbCr color space is the most accurate with the model on both datasets. Extensive experimentation yielded high accuracies of 99.4% on FFHQ and 99.8% on the CelebFaces Attributes dataset (Celeb-A).
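The per-channel re-representation step described above can be sketched as follows. This is a minimal illustration of the idea, assuming a basic 3x3 LBP neighborhood; the exact neighbor ordering and patch/overlap handling used in IR-CLBP-MC may differ, and the function names are illustrative.

```python
import numpy as np

def lbp_channel(ch):
    """Basic 3x3 LBP: threshold the 8 neighbors against the center pixel
    and pack the comparison bits into one decimal code per pixel."""
    h, w = ch.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint16)
    center = ch[1:-1, 1:-1]
    # offsets of the 8 neighbors, clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = ch[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (neighbor >= center).astype(np.uint16) << bit
    return out.astype(np.uint8)

def re_represent(img):
    """Apply LBP independently to each color channel and stack the
    resulting code maps into a new three-channel image -- the core
    re-representation idea behind IR-CLBP-MC."""
    return np.stack([lbp_channel(img[:, :, c]) for c in range(3)], axis=-1)
```

The re-represented image can then be fed to a pretrained CNN in place of the raw pixels.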
Background: Legionella pneumophila (L. pneumophila) is a gram-negative bacterium that causes Legionnaires' disease as well as Pontiac fever. Objective: To determine the frequency of L. pneumophila in pneumonic patients, to determine the clinical utility of diagnosing Legionella pneumonia by the urinary antigen test (LPUAT) in terms of sensitivity and specificity, to compare the results obtained from patients by the urinary antigen test with quantitative real-time PCR (qRT-PCR) using serum samples, and to determine the frequency of serogroup 1 and other serogroups of L. pneumophila. Methods: A total of 100 pneumonic patients (community-acquired pneumonia) were enrolled in this study between October 2016 and April 2017; 92 sam
Because of the threats and attacks a database is vulnerable to during transmission from sender to receiver, which is one of the most pressing security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. This cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and then decrypting it back to the original data. Hence, the ciphers act as an encapsulating system for database tables.
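For reference, the standard RC4 stream cipher the proposed cryptosystem builds on can be written compactly. This is the textbook algorithm (key-scheduling followed by the pseudo-random generation loop), not the paper's full database cryptosystem:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Standard RC4: key-scheduling (KSA), then the keystream generator
    (PRGA) XORed with the data. Encryption and decryption are the same
    operation, so applying rc4 twice with the same key restores the input."""
    # KSA: key-dependent shuffle of the 256-byte state
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # PRGA: generate one keystream byte per data byte and XOR
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

Because the same function both encrypts and decrypts, the sender and receiver only need to share the key.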
The implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm in database security had limitations in its character set and in the number of keys used. The proposed cryptosystem makes several enhancements to the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects how the algorithm's phases are applied. These changes showed high security for the database against different types of attacks by achieving both the confusion and diffusion goals.
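To illustrate why the determinant of a key matrix matters, a substitution key matrix is only usable for encryption if it is invertible modulo the alphabet size, which holds exactly when its determinant is coprime with the modulus. The check below is a generic Hill-cipher-style validity test, sketched here as an assumption; the paper's exact key format and how the determinant feeds into the TSFS phases are not specified in the abstract.

```python
import math
import numpy as np

def is_valid_key_matrix(K, modulus=26):
    """A key matrix is invertible mod `modulus` (and therefore
    decryptable) iff gcd(det(K) mod modulus, modulus) == 1.
    Illustrative check only; not the paper's exact key scheme."""
    det = int(round(np.linalg.det(np.asarray(K, dtype=float)))) % modulus
    return math.gcd(det, modulus) == 1
```

A matrix failing this test would make the substitution phase irreversible, so checking the determinant up front avoids undecryptable ciphertext.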
The goal of this research is to develop a sustainable rating system for roadway projects in Iraq covering all life-cycle stages of a project: planning, design, construction, and operation and maintenance. This paper investigates the criteria, and their weightings, of the suggested roadway rating system based on sustainable planning activities. The methodology began by suggesting a group of sustainability criteria for the planning stage and then suggesting a weight from 1 to 5 points for each of them. Data were then collected using a closed questionnaire directed to a group of roadway experts in order to verify the criteria weightings based on the relative importance of the roadway-related impacts.
In this research, the Williamson-Hall and size-strain plot methods were employed to analyze X-ray diffraction lines to evaluate the crystallite size and lattice strain of cadmium oxide nanoparticles. The crystallite size values are 15.2 nm and 93.1 nm and the lattice strains 4.2 × 10⁻⁴ and 21 × 10⁻⁴, respectively. Other methods were also employed to evaluate the crystallite size, namely the Scherrer and modified Scherrer methods, whose results are 14.8 nm and 13.9 nm, respectively. Each method of analysis gives a different result because of the alteration in the crystallite size and lattice strain; the calculations according to the Williamson-Hall and size-strain plot methods show that the non-uniform strain in nan
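The Scherrer estimate mentioned above follows the standard relation D = Kλ / (β cos θ). A minimal sketch, assuming Cu Kα radiation and a shape factor K of 0.9 (the abstract does not state the instrument parameters actually used):

```python
import math

def scherrer_size(beta_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer crystallite size D = K * lambda / (beta * cos(theta)).
    beta is the peak FWHM in degrees (converted to radians here),
    two_theta_deg the diffraction angle 2-theta; Cu K-alpha wavelength
    and K = 0.9 are assumed defaults."""
    beta = math.radians(beta_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))
```

The Williamson-Hall and size-strain plot methods extend this by separating the size and strain contributions to the peak broadening, which is why they yield different values than the plain Scherrer formula.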
In this study, a structural damage identification method based on changes in the dynamic characteristics (frequencies) of the structure is examined. The stiffness and mass matrices of the curved (in-plane and out-of-plane vibration) beam elements are formulated using Hamilton's principle. Each node possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element was derived based on Kang and Yoo's thin-walled curved beam theory of 1994. A computer program was developed to carry out free vibration analyses of curved as well as straight beams, and the resulting frequencies were compared with those of other researchers using the general-purpose program MATLAB. Fuzzy logic syste
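The free-vibration analysis at the core of the method reduces to the generalized eigenproblem K·φ = ω²·M·φ once the stiffness matrix K and mass matrix M are assembled. A minimal sketch of that final step (the curved-beam element assembly itself, with its seven degrees of freedom per node, is not shown):

```python
import numpy as np

def natural_frequencies(K, M):
    """Solve K*phi = omega^2 * M*phi and return the natural
    frequencies in Hz, ascending. Assumes K and M are the assembled
    (symmetric) stiffness and mass matrices, with M invertible."""
    lam = np.linalg.eigvals(np.linalg.solve(M, K))   # omega^2 values
    omegas = np.sqrt(np.sort(lam.real.clip(min=0)))  # rad/s, ascending
    return omegas / (2.0 * np.pi)
```

Damage identification then compares the measured frequency shifts against the frequencies predicted by this model.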
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a given work against all existing material. Plagiarism is a big problem in higher education, and it can happen in any subject. Plagiarism detection has been studied in many scientific articles, and recognition methods have been created utilizing plagiarism analysis, authorship identification, and
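A common baseline in automated plagiarism analysis is comparing documents by the overlap of their word n-grams. The sketch below shows that idea with Jaccard similarity; it is an illustrative baseline, not the specific methods surveyed in the text.

```python
def ngram_overlap(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity over word n-grams: the size of the shared
    n-gram set divided by the size of the combined set. Scores near
    1.0 suggest heavily copied text; near 0.0, independent text."""
    def grams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    ga, gb = grams(a), grams(b)
    if not ga and not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)
```

Real systems add normalization, stemming, and indexing so one document can be checked against large corpora efficiently.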
The purpose of this paper is to define fuzzy subspaces of a fuzzy space of orderings, and we prove some results about this definition that lead to many new results on fuzzy spaces of orderings. We also define the sum and product over such spaces, such that if f = <a1,…,an> and g = <b1,…,bm>, their sum and product are f + g = <a1,…,an, b1,…,bm> and f × g = <a1b1,…,a1bm,…,anb1,…,anbm>.