DeepFake imagery is a concern for celebrities and the general public alike because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current automated approaches. Detecting manipulation in video, on the other hand, is more tractable than in images, and many state-of-the-art systems address it; moreover, video manipulation detection ultimately depends on detecting manipulation in individual images. Much prior work on DeepFake detection in images involved complex mathematical calculations in the preprocessing steps and suffered from many limitations, including that the face must be frontal, the eyes must be open, the mouth should be open with teeth visible, and so on. In addition, the counterfeit-detection accuracy of all previous studies was lower than that achieved in this paper, especially on the benchmark Flickr-Faces-HQ dataset (FFHQ). This study proposes a new, simple, but powerful method called image re-representation by combining the local binary pattern of multiple channels (IR-CLBP-MC) of a color space, an image re-representation technique that improves DeepFake detection accuracy. IR-CLBP-MC builds on the fundamental concept of the multi-channel local binary pattern (MCLBP), an extension of the original LBP. The primary distinction is that in our method the LBP decimal value is computed in each channel of a local patch, and the channels are then merged to re-represent the image, producing a new image with three color channels. A pretrained convolutional neural network (CNN) was used to extract deep textural features from twelve sets of IR-CLBP-MC images generated from different color spaces: RGB, XYZ, HLS, HSV, YCbCr, and LAB. Experimental results comparing overlapping and non-overlapping patch techniques showed that the overlapping technique works better with IR-CLBP-MC, and that the YCbCr color space is the most accurate when used with the model, on both datasets. Extensive experimentation was carried out, and the highest accuracies obtained are 99.4% on FFHQ and 99.8% on the CelebFaces Attributes dataset (Celeb-A).
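The per-channel LBP computation at the core of the method can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: the 3×3 neighborhood, clockwise bit ordering, and whole-image (rather than patch-wise) application are assumptions.

```python
import numpy as np

def lbp_channel(ch):
    """Basic 3x3 LBP: threshold the 8 neighbors of each interior pixel
    against the center and pack the bits into a decimal code (0-255)."""
    c = ch[1:-1, 1:-1]  # center pixels
    # Window offsets of the 8 neighbors, clockwise from top-left.
    offs = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offs):
        nb = ch[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def ir_clbp_mc(img):
    """Re-represent a 3-channel image by replacing each channel with its
    LBP code map, then restacking -- the basic IR-CLBP-MC idea."""
    return np.stack([lbp_channel(img[..., k]) for k in range(3)], axis=-1)

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
out = ir_clbp_mc(img)
print(out.shape)  # (62, 62, 3)
```

The re-represented image keeps three channels, so it can be fed directly to a pretrained CNN expecting RGB-shaped input.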
Often, especially in practical applications, it is difficult to obtain data untainted by problems such as inconsistency of the error variance, or by any other problem that impedes the use of the usual method, ordinary least squares (OLS), for estimating the coefficients of multiple linear models. For this reason, many statisticians resort to robust estimation methods, especially in the presence of outliers as well as the problem of error-variance instability. Two robust methods were adopted, the robust weighted least squares (RWLS) and the two-step robust weighted least squares (TSRWLS), and their performance was verified.
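The weighted least-squares idea underlying such robust estimators can be sketched as below. This is illustrative only: the study's actual weight function is not stated in the abstract, so Huber-style weights with an iteratively reweighted scheme are assumptions.

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: solve (X'WX) b = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

def robust_wls(X, y, c=1.345, iters=20):
    """Iteratively reweighted LS with Huber weights: observations whose
    scaled residual exceeds c are downweighted (c = 1.345 is an assumed
    conventional choice, not taken from the study)."""
    b = wls(X, y, np.ones(len(y)))
    for _ in range(iters):
        r = y - X @ b
        s = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust scale via MAD
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)
        b = wls(X, y, w)
    return b

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 2 + 3 * X[:, 1] + rng.normal(scale=0.1, size=100)
y[:5] += 50  # gross outliers that would badly bias plain OLS
print(robust_wls(X, y))  # close to [2, 3]
```

Plain OLS on the same data is pulled toward the contaminated points; the reweighting step suppresses their influence.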
A remarkable correlation between chaotic systems and cryptography has been established through sensitivity to initial states, unpredictability, and complex behavior. In one line of development, stages of a chaotic stream cipher apply a discrete chaotic dynamical system to generate pseudorandom bits. Some of these generators are based on 1D chaotic maps and others on 2D ones. In the current study, a pseudorandom bit generator (PRBG) based on a new 2D chaotic logistic map is proposed that runs side-by-side and commences from random independent initial states. The structure of the proposed model consists of three components: a mouse input device, the proposed 2D chaotic system, and an initial permutation (IP) table. Statistical
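The side-by-side construction can be illustrated with chaotic trajectories run in parallel. The paper's new 2D map is not given in the abstract, so this sketch substitutes two classic 1D logistic maps (an assumption) purely to show the compare-and-emit bit generation; seeds stand in for the random independent initial states.

```python
def logistic(x, r=4.0):
    """Classic 1D logistic map; fully chaotic at r = 4."""
    return r * x * (1 - x)

def prbg(x0, y0, n):
    """Emit one bit per step by comparing two trajectories iterated
    side-by-side from independent initial states."""
    x, y = x0, y0
    bits = []
    for _ in range(n):
        x, y = logistic(x), logistic(y)
        bits.append(1 if x > y else 0)
    return bits

print(prbg(0.3567, 0.7243, 16))
```

A real PRBG of this kind is then subjected to statistical randomness testing (e.g., bit balance and run tests) before use.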
Today, large amounts of geospatial data are available on the web, such as Google Maps (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others; these services are collectively called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods; therefore, data accuracy may not meet the user requirements of different organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed
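Positional accuracy of this kind is commonly summarized by the root-mean-square error of distances between matched points; the study's exact metric is not stated in the abstract, so RMSE is an assumption, and the coordinates below are made up for illustration.

```python
import math

def positional_rmse(test_pts, ref_pts):
    """Horizontal positional accuracy: RMSE of distances between matched
    test points (e.g. Google Maps) and reference points (e.g. MB data)."""
    assert len(test_pts) == len(ref_pts)
    sq = [(tx - rx) ** 2 + (ty - ry) ** 2
          for (tx, ty), (rx, ry) in zip(test_pts, ref_pts)]
    return math.sqrt(sum(sq) / len(sq))

gm = [(100.0, 200.0), (150.0, 250.0)]   # hypothetical GM coordinates (m)
mb = [(101.0, 200.0), (150.0, 251.0)]   # hypothetical MB reference (m)
print(positional_rmse(gm, mb))  # 1.0
```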
The acidity of spent lubricant was treated with sodium hydroxide solution. The effect of three variables on the treatment was studied: mixing time ranging from 5 to 35 minutes, NaOH-to-lubricant weight ratio ranging from 0.25 to 1.25, and NaOH weight percentage ranging from 2 to 6%.
The Box-Wilson experimental design method was adopted to find a useful relationship between the three controllable variables and the lowering of the acidity of the spent lubricant. The effective variables and interactions were then identified using statistical analysis (F-test) of the three-variable fractional design. The mathematical model is well represented by a second-order polynomial.
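A second-order (quadratic) response-surface model of the kind produced by a Box-Wilson design can be fitted by ordinary least squares over the full set of linear, interaction, and squared terms. The data below are made up for illustration; they are not the study's measurements.

```python
import numpy as np

def quadratic_design(X):
    """Expand 3 factors into the full second-order model terms:
    intercept, linear, interaction, and squared terms (10 columns)."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 3))  # coded factor levels
# Hypothetical noiseless response built from a known quadratic model.
y = 5 + 2 * X[:, 0] - X[:, 2] + 0.5 * X[:, 0] * X[:, 1] + 3 * X[:, 1] ** 2
b, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(np.round(b, 2))  # coefficients recover the assumed true model
```

An F-test on each fitted term then identifies which variables and interactions are statistically significant.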
This work presents the simulation of a Low-Density Parity-Check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over Additive White Gaussian Noise (AWGN) and multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency at ten iterations. The modulation schemes used are Phase Shift Keying (BPSK, QPSK, and 16-PSK), along with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for channel effects. The channel model used is a Long Term Evolution (LTE) channel per Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the channel
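The innermost building block of such a simulation is a modulated link over AWGN. This sketch is only that core (uncoded BPSK, no LDPC, no MC-CDMA spreading, no OFDM), shown to illustrate how bit error rate is measured in the Monte Carlo setup.

```python
import numpy as np

def bpsk_awgn_ber(ebn0_db, nbits=200_000, seed=0):
    """Simulate uncoded BPSK over AWGN and return the bit error rate."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, nbits)
    sym = 1.0 - 2.0 * bits                          # 0 -> +1, 1 -> -1
    sigma = np.sqrt(1.0 / (2 * 10 ** (ebn0_db / 10)))  # noise std per dim
    rx = sym + sigma * rng.normal(size=nbits)
    return np.mean((rx < 0) != (bits == 1))         # hard-decision errors

print(bpsk_awgn_ber(6.0))  # close to the theoretical Q(sqrt(2*Eb/N0)) ~ 2.4e-3
```

Coding (LDPC), spreading (MC-CDMA), and OFDM with pilot-based channel estimation are layered around this kernel in the full system.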
The research studies the relation between the market share of the sample banks and the revenues achieved from investment. The prevailing belief is that the return on investment can be enhanced as banks increase their shares in their markets, after succeeding in achieving successive growth rates in sales, achieving suitable market coverage for their products, and having suitable distribution and promotional activity. Market share represents the competitive position of banks, and markets pay attention to market share as a strategic objective, maintaining and also increasing
Atmospheric transmission is disturbed by scintillation, which causes additional beam divergence. In this work, the target image spot radius was calculated in the presence of atmospheric scintillation. The calculation depends on a few relevant equations based on atmospheric parameters (for the Middle East), tracking range, the expansion ratio of the applied beam expander, the receiving-unit lens F-number, and the laser wavelength, besides photodetector parameters. At the maximum target range Rmax = 20 km, the target image radius is at its maximum, Rs = 0.4 mm. As the range decreases, the spot radius decreases too, until the range reaches a limit (4 km) at which the target image spot radius is at its minimum value (0.22 mm). Then, as the range decreases further, the spot radius increases due to geometrical
The accuracy of the Moment Method for imposing no-slip boundary conditions in the lattice Boltzmann algorithm is investigated numerically using lid-driven cavity flow. Boundary conditions are imposed directly upon the hydrodynamic moments of the lattice Boltzmann equations, rather than the distribution functions, to ensure the constraints are satisfied precisely at grid points. Both single and multiple relaxation time models are applied. The results are in excellent agreement with data obtained from state-of-the-art numerical methods and are shown to converge with second order accuracy in grid spacing.
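The hydrodynamic moments referred to are velocity-moments of the distribution functions. A minimal D2Q9 sketch of computing them (and of the standard second-order equilibrium they constrain) is below; this shows the quantities the Moment Method works with, not the boundary closure itself.

```python
import numpy as np

# D2Q9 lattice: discrete velocities c_i and weights w_i
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def moments(f):
    """Density and momentum: rho = sum_i f_i, rho*u = sum_i f_i c_i."""
    return f.sum(), f @ C

def f_eq(rho, u):
    """Standard second-order equilibrium distribution."""
    cu = C @ u
    return W * rho * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * (u @ u))

rho, mom = moments(f_eq(1.0, np.array([0.1, 0.0])))
print(rho, mom)  # the equilibrium reproduces rho and rho*u exactly
```

In a moment-based boundary scheme, constraints like no-slip are enforced on these moments at wall grid points rather than on the individual f_i.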
The present paper concerns the problem of estimating the reliability of a system in the stress-strength model, under the assumption that stress and strength are non-identical, independent, and follow the Lomax distribution. Various shrinkage estimation methods were employed in this context, based on maximum likelihood, the method of moments, and shrinkage weight factors, using Monte Carlo simulation. Comparisons among the suggested estimation methods were made using the mean absolute percentage error criterion, implemented in a MATLAB program.
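The quantity being estimated is the stress-strength reliability R = P(strength > stress). A minimal Monte Carlo sketch for independent, non-identical Lomax variables is below (in Python rather than the study's MATLAB; the shape/scale values are illustrative assumptions, and sampling uses the inverse-CDF transform).

```python
import numpy as np

def rlomax(alpha, lam, size, rng):
    """Lomax sampling by inverse transform: F(x) = 1 - (1 + x/lam)^(-alpha)."""
    u = rng.random(size)
    return lam * ((1 - u) ** (-1 / alpha) - 1)

def reliability_mc(a_strength, a_stress, lam=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of R = P(X > Y) for independent Lomax
    strength X and stress Y sharing a common scale."""
    rng = np.random.default_rng(seed)
    x = rlomax(a_strength, lam, n, rng)
    y = rlomax(a_stress, lam, n, rng)
    return np.mean(x > y)

# With a common scale, R has the closed form a_stress / (a_strength + a_stress).
print(reliability_mc(2.0, 3.0))  # close to 3 / 5 = 0.6
```

Shrinkage estimators then pull the ML or moment estimate of R toward a prior guess, with the weight factor controlling the pull.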