This paper presents a comparison of denoising techniques based on a statistical approach: principal component analysis with local pixel grouping (LPG-PCA), iterated a second time to further improve denoising performance, against other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method gives better performance, especially in preserving fine image structure, compared with the other general denoising algorithms.
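The median filtering step described above can be sketched in a few lines; this is a minimal pure-Python illustration with a fixed 3x3 neighborhood and an invented test image, not the paper's M-by-N configuration.

```python
def median_filter_3x3(img):
    """Return a median-filtered copy of a 2D grayscale image (list of lists).

    Border pixels are handled by replicating the nearest edge pixel.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = []
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)  # clamp to image edge
                    xx = min(max(x + dx, 0), w - 1)
                    window.append(img[yy][xx])
            window.sort()
            out[y][x] = window[4]  # middle of the 9 sorted values
    return out

# Example: a single impulse-noise pixel is removed.
noisy = [[10, 10, 10],
         [10, 255, 10],
         [10, 10, 10]]
clean = median_filter_3x3(noisy)
```

Because the median of each 3x3 window ignores the isolated outlier, the impulse is replaced by the surrounding gray level, which is why median filtering suppresses salt-and-pepper noise while preserving edges better than linear smoothing.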
The proliferation of editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that he never did or said, so developing an algorithm for deepfake detection is very important to discriminate real from fake media. Convolutional neural networks (CNNs) are among the most powerful classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture fine texture details of the input frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of the raw red-green-blue (RGB) frames.
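A 16-orientation Gabor filter bank of the kind mentioned above can be sketched as follows; the kernel size and the sigma, lambda, gamma, and psi values are illustrative assumptions, not the paper's settings.

```python
import math

def gabor_kernel(theta, ksize=9, sigma=2.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Return a ksize x ksize real-part Gabor kernel at orientation theta (radians)."""
    half = ksize // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)    # rotated coordinates
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + gamma * gamma * yr * yr)
                                / (2 * sigma * sigma))        # Gaussian envelope
            carrier = math.cos(2 * math.pi * xr / lambd + psi)  # sinusoidal carrier
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

# 16 equally spaced orientations over [0, pi), as in the abstract.
bank = [gabor_kernel(k * math.pi / 16) for k in range(16)]
```

Convolving each frame with every kernel in the bank yields 16 orientation-selective texture maps, which is the kind of input representation the abstract feeds to the binary CNN in place of raw RGB.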
This work presents a comparison between the Convolutional Encoding (CE), Parallel Turbo code, and Low-Density Parity-Check (LDPC) coding schemes for a Multi-User Single-Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency at higher iterations. The modulation scheme used is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the Least Square estimation method. The channel model used is the Long Term Evolution (LTE) channel with Technical Specification TS 25.101v2.10 and 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian channels.
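The pilot-based Least Square channel estimation step can be sketched as below; the 8-pilot layout over 64 subcarriers, the flat test channel, and the linear interpolation between pilots are illustrative assumptions, not the paper's exact configuration.

```python
def ls_channel_estimate(rx_pilots, tx_pilots, pilot_idx, n_sub):
    """LS estimate H[k] = Y[k] / X[k] at pilots, linearly interpolated elsewhere."""
    h_p = [y / x for y, x in zip(rx_pilots, tx_pilots)]  # LS at pilot carriers
    h = [0j] * n_sub
    for i in range(len(pilot_idx) - 1):
        k0, k1 = pilot_idx[i], pilot_idx[i + 1]
        for k in range(k0, k1 + 1):
            t = (k - k0) / (k1 - k0)                 # interpolation weight
            h[k] = (1 - t) * h_p[i] + t * h_p[i + 1]
    # Hold the edge estimates outside the first/last pilot.
    for k in range(pilot_idx[0]):
        h[k] = h_p[0]
    for k in range(pilot_idx[-1] + 1, n_sub):
        h[k] = h_p[-1]
    return h

# Example: a flat channel H = 0.8 - 0.3j is recovered at every subcarrier.
pilots = [0, 9, 18, 27, 36, 45, 54, 63]   # 8 pilot carriers among 64 subcarriers
H_true = 0.8 - 0.3j
tx = [1 + 0j] * 8
rx = [H_true * x for x in tx]
H_est = ls_channel_estimate(rx, tx, pilots, 64)
```

The LS division at the pilots is exact for any noise-free channel; in practice the interpolation quality between pilots limits how fast a frequency-selective channel can be tracked with only eight pilots.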
Multivariate non-parametric control charts were used to monitor the data generated by simulation and determine whether they fall within control limits, since non-parametric methods do not require any assumptions about the distribution of the data. This research aims to apply multivariate non-parametric quality control methods, namely the Multivariate Wilcoxon Signed-Rank chart, kernel principal component analysis (KPCA), and the k-nearest neighbor (kNN) chart.
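The k-nearest-neighbor monitoring idea can be sketched as a distance-based statistic; the reference sample, the choice k = 3, and the use of the average distance as the charted statistic are illustrative assumptions, not the paper's calibrated chart.

```python
import math

def knn_statistic(point, reference, k=3):
    """Average Euclidean distance from `point` to its k nearest reference points.

    Large values signal that the new observation lies far from the
    in-control reference cloud, i.e. a potential out-of-control point.
    """
    dists = sorted(math.dist(point, r) for r in reference)
    return sum(dists[:k]) / k

# In-control reference sample (two quality characteristics per observation).
reference = [(0.0, 0.0), (0.1, -0.1), (-0.1, 0.1), (0.2, 0.0),
             (0.0, 0.2), (-0.2, -0.1), (0.1, 0.1), (-0.1, -0.2)]

in_control = knn_statistic((0.05, 0.05), reference)
out_of_control = knn_statistic((3.0, 3.0), reference)
```

A control limit would typically be set as an empirical quantile of the statistic over the reference sample itself, which keeps the whole chart distribution-free, consistent with the non-parametric motivation above.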
We propose a new method for detecting abnormality in cerebral tissues present within Magnetic Resonance Images (MRI). The present classifier comprises cerebral tissue extraction, image division into angular and distance span vectors, acquisition of four features for each portion, and classification to ascertain the abnormality location. The threshold value and region of interest are discerned using operator input and the Otsu algorithm. A novel brain-slice image division is introduced via angular and distance span vectors of sizes 24˚ and 15 pixels. Rotation invariance of the angular span vector is determined. Automatic image categorization into normal and abnormal brain tissues is performed using a Support Vector Machine (SVM).
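The angular/distance-span division described above can be sketched as a per-pixel binning rule: each pixel is assigned an angular bin of 24 degrees and a radial bin of 15 pixels around the slice centre. The 24-degree span gives 360/24 = 15 angular bins; the centre coordinates and example pixel below are illustrative.

```python
import math

def span_bins(x, y, cx, cy, ang_span=24.0, dist_span=15):
    """Return (angular_bin, distance_bin) for pixel (x, y) about centre (cx, cy)."""
    dx, dy = x - cx, y - cy
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # angle in [0, 360)
    radius = math.hypot(dx, dy)                       # distance from centre
    return int(angle // ang_span), int(radius // dist_span)

# Example: a pixel 20 px to the right of centre and 5 px below it
# falls in angular bin 0 (first 24-degree sector) and distance bin 1.
a_bin, d_bin = span_bins(120, 105, 100, 100)
```

Pooling the four features over every (angular bin, distance bin) cell, rather than over raw pixel coordinates, is what makes the angular span representation amenable to the rotation-invariance analysis mentioned in the abstract.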
In order to select the optimal tracking of the fast time variation of a multipath Rayleigh fading channel, this paper focuses on the recursive least-squares (RLS) and extended recursive least-squares (E-RLS) algorithms. It reaches the conclusion that E-RLS is more feasible, based on a comparison of the tracking performance and mean square error produced by the simulation program over five fast time-varying Rayleigh fading channels, with up to 100 send/receive runs to verify the efficiency of these algorithms.
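The basic RLS recursion referred to above can be sketched in the scalar (single-tap) case; the forgetting factor, initialisation, and the noiseless constant-coefficient test signal are illustrative assumptions, not the paper's fading-channel setup.

```python
def rls_track(xs, ds, lam=0.98, delta=100.0):
    """Scalar RLS: track a coefficient w so that d[n] ~= w * x[n].

    lam is the forgetting factor (< 1 discounts old samples, which is
    what allows RLS to follow a time-varying channel); delta initialises
    the inverse correlation term p.
    """
    w = 0.0          # current coefficient estimate
    p = delta        # scalar inverse correlation
    estimates = []
    for x, d in zip(xs, ds):
        k = (p * x) / (lam + x * p * x)   # gain
        e = d - w * x                     # a-priori error
        w = w + k * e                     # coefficient update
        p = (p - k * x * p) / lam         # inverse-correlation update
        estimates.append(w)
    return estimates

# Example: track a constant coefficient h = 0.7 from noiseless data.
xs = [1.0, -0.5, 2.0, 1.5, -1.0, 0.8, 1.2, -0.6]
ds = [0.7 * x for x in xs]
est = rls_track(xs, ds)
```

The extended (E-RLS) variant compared in the paper augments this recursion with a state model of the channel variation; the plain update above is the common baseline both share.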
This study concerns the estimation of a simultaneous equations system for the Tobit model, where the dependent variables are limited; this affects the choice of a good estimator. Classical methods, if used in such a case, produce biased and inconsistent estimators, so we use estimation methods different from the classical ones: the Nelson-Olson method and the Two-Stage Limited Dependent Variables (2SLDV) method, to obtain estimators that hold the characteristics of a good estimator.
That is, parameters will be estim
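The full Nelson-Olson and 2SLDV estimators involve Tobit likelihoods in each stage and are not reproduced here; as a minimal, hypothetical illustration of the underlying two-stage idea, the sketch below computes a scalar two-stage least-squares (instrumental-variables) estimate, where an endogenous regressor is first replaced by its projection on an instrument.

```python
def two_stage_ls(y, x, z):
    """Scalar 2SLS: regress x on instrument z, then y on the fitted x.

    In the single-instrument case this reduces to
    beta_hat = cov(z, y) / cov(z, x).
    """
    n = len(y)
    mz, my, mx = sum(z) / n, sum(y) / n, sum(x) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y)) / n
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / n
    return cov_zy / cov_zx

# Example: x is driven by the instrument z, and y = 2*x exactly,
# so the two-stage estimate recovers beta = 2.
z = [1.0, 2.0, 3.0, 4.0, 5.0]
x = [0.5 * zi + 1.0 for zi in z]
y = [2.0 * xi for xi in x]
beta = two_stage_ls(y, x, z)
```

Replacing the second-stage regression with a Tobit maximum-likelihood fit on the censored dependent variable is, in essence, what distinguishes the limited-dependent-variable methods named above from plain 2SLS.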
Twilight is the light that appears on the horizon before sunrise and after sunset. Astronomically, it is known that sunrise and sunset times are affected by height above sea level, but the effect of height above sea level on the time of astronomical twilight is still undecided and controversial among astronomers. In this research we study the effect of height above sea level on the time of astronomical twilight by adding a height-above-sea-level correction to the twilight computation and then calculating the change in twilight time for different heights (0-10000) meters above sea level; the increase in time with height lies between (15.45-20.5) minutes. It was found that there was an increase in the time of the twilight along
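The shape of such a height correction can be sketched with the standard hour-angle formula for the Sun's altitude together with the common horizon-dip approximation (dip of about 1.76 arcminutes per square-root metre of height); the latitude, declination, and the dip formula itself are illustrative assumptions, and the paper's exact correction may differ.

```python
import math

def hour_angle_deg(alt_deg, lat_deg, dec_deg):
    """Hour angle (degrees) at which the Sun reaches altitude alt_deg."""
    a, phi, dec = map(math.radians, (alt_deg, lat_deg, dec_deg))
    cos_h = (math.sin(a) - math.sin(phi) * math.sin(dec)) / (
        math.cos(phi) * math.cos(dec))
    return math.degrees(math.acos(cos_h))

def twilight_shift_minutes(height_m, lat_deg=33.3, dec_deg=0.0):
    """Extra minutes of astronomical twilight due to observer height.

    Astronomical twilight ends when the Sun is 18 degrees below the
    horizon; height depresses the apparent horizon by the dip angle,
    so the -18 degree threshold is reached later.
    """
    dip = 1.76 * math.sqrt(height_m) / 60.0          # horizon dip, degrees
    h0 = hour_angle_deg(-18.0, lat_deg, dec_deg)     # sea level
    h1 = hour_angle_deg(-18.0 - dip, lat_deg, dec_deg)
    return (h1 - h0) * 4.0   # 1 degree of hour angle = 4 minutes of time

shift = twilight_shift_minutes(10000.0)   # observer 10 km above sea level
```

Because the dip grows with the square root of height, the twilight extension increases monotonically but ever more slowly over the 0-10000 m range studied in the paper.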
In this paper, we derive an estimator of the reliability function for the Laplace distribution with two parameters using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability of a random variable of observation. The main objective of this study is to find the efficiency of the derived Bayesian estimator compared to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator and the moment estimator for all sample sizes.
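The maximum-likelihood side of the comparison can be sketched directly: for a Laplace(mu, b) sample the ML location is the sample median and the ML scale is the mean absolute deviation about it, and the reliability R(t) = P(X > t) is then evaluated by plug-in. The Bayes estimator derived in the paper is not reproduced here; the sample values below are invented.

```python
import math
from statistics import median

def laplace_mle(sample):
    """ML estimates (mu, b) for a Laplace(mu, b) sample."""
    mu = median(sample)                                  # ML location
    b = sum(abs(x - mu) for x in sample) / len(sample)   # ML scale
    return mu, b

def laplace_reliability(t, mu, b):
    """R(t) = P(X > t) for the Laplace(mu, b) distribution."""
    if t >= mu:
        return 0.5 * math.exp(-(t - mu) / b)
    return 1.0 - 0.5 * math.exp((t - mu) / b)

mu_hat, b_hat = laplace_mle([-1.2, 0.3, 0.0, 2.1, -0.4, 0.8, -0.1])
r = laplace_reliability(1.0, mu_hat, b_hat)
```

In a Monte Carlo comparison of the kind described above, this plug-in R(t) is computed for each simulated sample and its mean squared error is compared against that of the Bayes and moment estimators.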
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. The parameter was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and other formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used for the derivation are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results based on the Thomas-Fermi formula show good agreement with the experimental data.
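Ericson's partial level density for p particles and h holes at excitation energy E, with single-particle level density g, can be written and evaluated directly; the numeric value of g below is only an illustrative input.

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """Ericson's formula: rho(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),
    with n = p + h the exciton number."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Example: the simplest 1-particle 1-hole configuration gives rho = g**2 * E.
rho_1p1h = ericson_pld(1, 1, 10.0, 0.5)
```

The different single-particle level density formulae compared in the paper (ESM, non-ESM, Roher, Egidy, Yukawa, Thomas-Fermi) all enter this expression only through g, which is why the choice of g formula directly controls the agreement with experiment.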
Due to a party's violation of its obligations or responsibilities indicated in the contract, many engineering projects confront extensive contractual disputes, which in turn require arbitration or other forms of dispute resolution and negatively impact the project's outcome. Each contract has its own terms for dispute resolution. Therefore, this paper aims to study the provisions for dispute resolution according to the Iraqi (SBDW) and the JCT (SBC/Q2016), and also to show the extent of the difference between the two contracts in the application of these provisions. The methodology includes a detailed study of the dispute settlement provisions of both contracts with a comparative analysis to identify the differences in the appli