This paper compares denoising techniques based on a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, is evaluated against several other enhancement filters. These include an adaptive Wiener low-pass filter, applied to a grayscale image degraded by constant-power additive noise and based on statistics estimated from a local neighborhood of each pixel; a median filter, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the noisy input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method gives better performance, especially in preserving fine image structure, compared with other general denoising algorithms.
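The comparison filters can be sketched with standard SciPy routines. This is a minimal illustration on a synthetic image; the test image, noise level, and window sizes are assumptions, not the paper's experimental setup:

```python
import numpy as np
from scipy import ndimage, signal

rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                          # synthetic grayscale image
noisy = clean + rng.normal(0.0, 0.2, clean.shape)  # constant-power additive noise

denoised = {
    "wiener":   signal.wiener(noisy, mysize=5),          # adaptive local Wiener
    "median":   signal.medfilt2d(noisy, kernel_size=5),  # M-by-N median filter
    "gaussian": ndimage.gaussian_filter(noisy, sigma=1.5),
    "order":    ndimage.percentile_filter(noisy, percentile=50, size=5),
}

def mse(a, b):
    return float(np.mean((a - b) ** 2))

for name, img in denoised.items():
    print(f"{name:8s} MSE = {mse(clean, img):.4f}")
```

Each filter should reduce the mean squared error relative to the noisy input on this toy image; the fine-structure preservation that distinguishes LPG-PCA is not captured by MSE alone.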
Multi-focus image fusion combines two or more differently focused images of the same scene into a single image with a more accurate description; the purpose is to generate one image from the information in all source images. In this paper, a multi-focus image fusion method is proposed that operates at a hybrid pixel level in both the spatial and transform domains. The proposed method is applied to multi-focus source images in YCbCr color space. First, a two-level stationary wavelet transform is applied to the Y channel of the two source images, and the fused Y channel is obtained using several fusion rules. The Cb and Cr channels of the source images are fused using principal component analysis (PCA).
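A common PCA fusion rule weights each source channel by the leading eigenvector of their joint covariance. The sketch below illustrates this on synthetic chroma channels and may differ in detail from the paper's variant:

```python
import numpy as np

def pca_fuse(a, b):
    """Fuse two same-size channels with weights taken from the leading
    principal component of their joint covariance (a common PCA fusion rule)."""
    data = np.stack([a.ravel(), b.ravel()])  # 2 x N observation matrix
    cov = np.cov(data)                       # 2 x 2 covariance of the channels
    vals, vecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    v = np.abs(vecs[:, -1])                  # leading eigenvector
    w = v / v.sum()                          # normalized fusion weights
    return w[0] * a + w[1] * b

rng = np.random.default_rng(1)
cb1 = rng.random((32, 32))                   # stand-ins for two Cb channels
cb2 = rng.random((32, 32))
fused = pca_fuse(cb1, cb2)
print(fused.shape)  # (32, 32)
```

Fusing a channel with itself returns the channel unchanged, since the weights collapse to 0.5 each, which is a quick sanity check on the rule.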
This research aims to compare the Bayesian method and full maximum likelihood for estimating a hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different replication counts (r = 1000, 5000), with the mean squared error (MSE) adopted to rank the estimation methods and select the best one. The study concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size n = 30 best represents the maternal mortality data …
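As a minimal illustration of the likelihood machinery involved, the sketch below fits a flat (non-hierarchical) Poisson regression by maximum likelihood on simulated data; the paper's hierarchical model and its Bayesian counterpart add group-level structure not shown here, and the design matrix and true coefficients are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 30                                                 # one of the paper's sample sizes
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
beta_true = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ beta_true))                 # simulated Poisson counts

def neg_loglik(beta):
    """Negative Poisson log-likelihood (the constant log(y!) term is dropped)."""
    eta = X @ beta
    return -np.sum(y * eta - np.exp(eta))

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(fit.x)  # ML estimates, close to beta_true for well-behaved simulations
```

Repeating this fit over r simulated data sets and averaging the squared estimation error gives the MSE criterion used in the comparison.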
DeepFake is a concern for celebrities and the public alike because such content is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by eye, by local descriptors, and by current approaches. Video manipulation detection, on the other hand, is more tractable than image detection, and many state-of-the-art systems address it; moreover, detecting video manipulation ultimately depends on detecting manipulation in individual frames. Much prior work on DeepFake detection in images involves complex mathematical preprocessing and many limitations, including that the face must be frontal, the eyes open, and the mouth open with teeth visible. Also, the accuracy of their counterfeit detection …
Carbon nanotubes (CNTs) were synthesized from liquefied petroleum gas (LPG) as a precursor using the flame fragments deposition (FFD) technique, and the biological activities of the synthesized CNTs were investigated in vitro. The physicochemical characteristics of the synthesized CNTs are similar to those of other synthesized CNTs and to the standard sample. A pharmaceutical application of the synthesized CNTs was studied via conjugation and adsorption with different types of medicines as promoter groups. Conjugation was performed by adsorbing drugs such as sulfamethoxazole (SMX) and trimethoprim (TMP) onto the CNTs, depending on the physical properties of both bonded parts. The synthesized CNTs have almost the same performance in …
The purpose of this research is to compare two types of multivariate GARCH models, BEKK and DVECH, for forecasting with financial time series: the daily Iraqi dinar/US dollar exchange rate, the daily global oil price in dollars, and the daily global gold price in dollars, for the period from 01/01/2014 to 01/01/2016. Estimation, testing, and forecasting were computed with the RATS package. The three price series were transformed into asset returns to obtain stationarity, and several tests were conducted, including Ljung-Box, multivariate Q, and multivariate ARCH tests on the return and residual series for both models, with a comparison between the estimates and …
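The return transformation and the Ljung-Box test mentioned above can be computed directly from the autocorrelations of a return series. This is a self-contained sketch on a synthetic price path, not the RATS output for the actual exchange-rate, oil, or gold data:

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(x, h):
    """Ljung-Box Q statistic and p-value for lags 1..h (white-noise test)."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    denom = np.dot(x, x)
    # sample autocorrelations at lags 1..h
    acf = np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, h + 1)])
    Q = n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, h + 1)))
    return Q, chi2.sf(Q, df=h)          # p-value from chi-square with h df

rng = np.random.default_rng(3)
prices = 1200 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))  # synthetic price path
returns = np.diff(np.log(prices))                            # log returns (stationary)

Q, p = ljung_box(returns, h=10)
print(f"Q = {Q:.2f}, p = {p:.3f}")
```

A large p-value is consistent with serially uncorrelated returns; in the GARCH setting the same test is also applied to squared or standardized residuals.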
A multidimensional systolic-array realization of the LMS algorithm is designed by mapping a regular algorithm onto a processor array. It is based on an appropriately selected 1-D systolic array filter that relies on an inner-product-sum systolic implementation. Various arrays may be derived that exhibit a regular arrangement of cells (processors) and local interconnection patterns, which are important for VLSI implementation. The design reduces latency and increases throughput compared with classical 1-D systolic arrays. The 3-D multilayered array consists of 2-D layers connected to each other only by edges. Such arrays for an LMS-based adaptive FIR filter may be set against the fundamental requirements of …
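The underlying LMS recursion that the systolic arrays implement is straightforward in scalar form. The sketch below identifies a short FIR channel; the filter length, step size, and signals are toy choices for illustration, not the paper's hardware mapping:

```python
import numpy as np

def lms_filter(x, d, taps=8, mu=0.05):
    """Standard LMS adaptive FIR filter: w <- w + mu * e[n] * x_n."""
    w = np.zeros(taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1 : n + 1][::-1]  # [x[n], x[n-1], ..., x[n-taps+1]]
        y[n] = w @ xn                       # filter output (inner-product sum)
        e[n] = d[n] - y[n]                  # error against the desired signal
        w = w + mu * e[n] * xn              # LMS weight update
    return y, e, w

rng = np.random.default_rng(4)
x = rng.normal(size=2000)                   # white input
h = np.array([0.8, -0.4, 0.2])              # unknown channel to identify
d = np.convolve(x, h)[: len(x)]             # desired signal
y, e, w = lms_filter(x, d, taps=3, mu=0.05)
print(w)  # converges toward h
```

The inner product `w @ xn` is exactly the operation that the 1-D systolic filter cells compute in a pipelined fashion; the multidimensional arrays replicate and interconnect such cells.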
A medical-service platform is a mobile application through which patients receive doctors' diagnoses based on information gleaned from medical images. The content of these diagnostic results must not be illegitimately altered during transmission and must be returned to the correct patient. In this paper, we present a solution to these problems using blind, reversible, fragile watermarking based on authentication of the host image. In the proposed algorithm, the binary version of the Bose-Chaudhuri-Hocquenghem (BCH) code of the patient medical report (PMR) and the binary patient medical image (PMI), combined by fuzzy exclusive-or (F-XOR), are used to produce the patient's unique mark through a secret sharing scheme (SSS). The patient's unique mark …
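As a simplified stand-in for the secret sharing step (the paper's exact SSS and F-XOR construction are not reproduced here), a (2,2) XOR split shows the basic idea: a binary mark is divided into two shares, and both are required to recover it:

```python
import numpy as np

def xor_split(secret_bits, rng):
    """(2,2) XOR secret sharing: either share alone is random noise,
    and XOR-ing both shares recovers the secret exactly."""
    share1 = rng.integers(0, 2, size=secret_bits.shape, dtype=np.uint8)
    share2 = secret_bits ^ share1          # share1 ^ share2 == secret
    return share1, share2

rng = np.random.default_rng(5)
mark = rng.integers(0, 2, size=64, dtype=np.uint8)  # toy binary patient mark
s1, s2 = xor_split(mark, rng)
recovered = s1 ^ s2
print(np.array_equal(recovered, mark))  # True
```

The reversibility of XOR is also what makes the embedded watermark removable without loss, which matters for diagnostic images.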
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method comprises five enhancement techniques: normalization, histogram equalization, binarization, skeletonization, and fusion. Normalization standardizes pixel intensity, which facilitates the subsequent enhancement stages; histogram equalization then increases the contrast of the images. Furthermore, binarization and skeletonization are implemented to differentiate between the ridge and valley structures and to obtain one …
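The first two stages, normalization and histogram equalization, can be sketched as follows. The target mean and variance and the toy image are assumptions; the paper's parameter choices may differ:

```python
import numpy as np

def normalize(img, target_mean=128.0, target_var=2000.0):
    """Mean/variance normalization commonly used for fingerprint images."""
    m, v = img.mean(), img.var()
    return target_mean + np.sqrt(target_var / max(v, 1e-12)) * (img - m)

def hist_equalize(img):
    """Histogram equalization via the CDF of 8-bit intensities."""
    img = np.clip(img, 0, 255).astype(np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1.0) * 255.0
    return cdf[img].astype(np.uint8)       # map each pixel through the CDF

rng = np.random.default_rng(6)
fp = rng.integers(40, 200, size=(64, 64)).astype(float)  # toy "fingerprint"
norm = normalize(fp)                                     # fixed mean/variance
eq = hist_equalize(norm)                                 # stretched contrast
print(eq.min(), eq.max())
```

Binarization and skeletonization would follow these stages, typically via adaptive thresholding and morphological thinning of the ridge structure.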