This paper compares denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, against other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel of the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method gives better performance, especially in preserving fine image structure, than the other general denoising algorithms.
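The reference filters named above are available in SciPy; a minimal sketch on a synthetic grayscale image (an illustration only, not the paper's experimental setup — the image, noise level, and window sizes are assumed) is:

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, gaussian_filter

rng = np.random.default_rng(0)
# Synthetic grayscale image (horizontal ramp) degraded by
# constant-power additive Gaussian noise
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = clean + rng.normal(0.0, 0.1, clean.shape)

denoised_wiener = wiener(noisy, mysize=5)         # adaptive Wiener low-pass filter
denoised_median = median_filter(noisy, size=5)    # 5-by-5 median neighborhood
denoised_gauss = gaussian_filter(noisy, sigma=1)  # Gaussian low-pass filter

def mse(a, b):
    # Compare on the interior to avoid boundary-padding artifacts
    return float(np.mean((a[4:-4, 4:-4] - b[4:-4, 4:-4]) ** 2))

for name, img in [("wiener", denoised_wiener),
                  ("median", denoised_median),
                  ("gaussian", denoised_gauss)]:
    print(name, round(mse(img, clean), 5))
```

Each filter should reduce the interior mean squared error well below that of the noisy input (about 0.01 for noise of standard deviation 0.1).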
Due to a party's violation of its obligations or responsibilities under the contract, many engineering projects face extensive contractual disputes, which in turn require arbitration or other forms of dispute resolution and negatively affect the project's outcome. Each contract has its own terms for dispute resolution. Therefore, this paper aims to study the provisions for dispute resolution under the Iraqi (SBDW) and the JCT (SBC/Q2016) contracts, and to show the extent of the difference between the two contracts in the application of these provisions. The methodology includes a detailed study of the dispute-settlement provisions of both contracts, with a comparative analysis to identify the differences in the application of these provisions.
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. The parameter was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used in the derivation are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results based on the Thomas-Fermi formula show good agreement with the experimental data.
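For reference, the standard equidistant-spacing-model form of Ericson's partial level density for a configuration of p particles and h holes at excitation energy E, with g the single-particle level density, is:

```latex
\rho(p, h, E) = \frac{g\,(gE)^{n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h
```

The different single-particle level density formulae named in the abstract enter this expression through g.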
We propose a new method for detecting abnormality in cerebral tissues within Magnetic Resonance Images (MRI). The classifier comprises cerebral tissue extraction, division of the image into angular and distance span vectors, computation of four features for each portion, and classification to locate the abnormality. The threshold value and region of interest are determined using operator input and the Otsu algorithm. A novel division of brain-slice images is introduced via angular and distance span vectors of 24˚ and 15 pixels, and the rotation invariance of the angular span vector is verified. Automatic categorization of images into normal and abnormal brain tissues is performed using a Support Vector Machine (SVM). St
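The final classification step can be sketched with scikit-learn; the synthetic four-dimensional feature vectors below are purely hypothetical stand-ins for the paper's per-portion features, and the kernel choice is an assumption:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Hypothetical 4-feature vectors per image portion;
# labels: 0 = normal tissue, 1 = abnormal tissue
normal = rng.normal(0.0, 1.0, (50, 4))
abnormal = rng.normal(2.0, 1.0, (50, 4))
X = np.vstack([normal, abnormal])
y = np.array([0] * 50 + [1] * 50)

# Train an SVM to separate normal from abnormal portions
clf = SVC(kernel="rbf").fit(X, y)
acc = clf.score(X, y)
print(round(acc, 2))
```

With well-separated synthetic classes the training accuracy is close to 1; on real MRI features a held-out test split would of course be required.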
The researcher seeks to shed light on the relationship and the impact between organizational values in all their dimensions (administration management, mission, relationship management, environmental management) and strategic performance (the financial perspective, the customer perspective, the internal-processes perspective, and learning and development) in the presidencies of the two universities of Baghdad and Al-Nahrain; three hypotheses were formulated for this purpose.
The main research problem is the following question: Is there a relationship and an impact of bet
In this paper, we derive an estimator of the reliability function for the Laplace distribution with two parameters using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability of a random variable of observation. The main objective of this study is to find the efficiency of the derived Bayesian estimator compared with the maximum likelihood and moment estimators of this function, using a Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators at all sample sizes.
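The Monte Carlo comparison framework can be sketched as follows; this shows only the maximum-likelihood plug-in estimator of the Laplace reliability function (the paper's Bayes estimator is not reproduced here), and the parameter values, time point, and sample size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, b, t = 0.0, 1.0, 1.5               # assumed Laplace parameters and time point
R_true = 0.5 * np.exp(-(t - mu) / b)   # true reliability R(t) = P(T > t) for t > mu

def r_mle(sample, t):
    """Plug-in maximum likelihood estimate of the Laplace reliability at t."""
    mu_hat = np.median(sample)                 # MLE of the location parameter
    b_hat = np.mean(np.abs(sample - mu_hat))   # MLE of the scale parameter
    z = (t - mu_hat) / b_hat
    return 0.5 * np.exp(-z) if t >= mu_hat else 1.0 - 0.5 * np.exp(z)

# Monte Carlo: estimate the MSE of the plug-in estimator at sample size n
n, reps = 30, 2000
est = np.array([r_mle(rng.laplace(mu, b, n), t) for _ in range(reps)])
mse = float(np.mean((est - R_true) ** 2))
print(round(mse, 5))
```

Repeating the same loop for the Bayes and moment estimators, over a grid of parameters and sample sizes, yields the efficiency comparison described in the abstract.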
Encryption translates data into another form or symbol so that only people with access to a secret key or password can read it. Encrypted data are generally referred to as cipher text, while unencrypted data are known as plain text. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against RSA encryption of video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying the "CAST-128" and
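The entropy measure described above is the Shannon entropy of the image's gray-level histogram; a minimal sketch for an 8-bit grayscale image (the helper name and the toy images are illustrative) is:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()        # empirical gray-level probabilities
    p = p[p > 0]                 # drop empty bins (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

flat = np.zeros((8, 8), dtype=np.uint8)                   # a single gray level
uniform = np.arange(256, dtype=np.uint8).reshape(16, 16)  # all 256 levels equally
print(image_entropy(flat), image_entropy(uniform))  # 0.0 and 8.0
```

A well-encrypted frame should have a near-uniform histogram, i.e. entropy close to the 8-bit maximum of 8 bits per pixel, which is the basis of the comparison in the abstract.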
Although Wiener filtering is the optimal tradeoff between inverse filtering and noise smoothing, in the case when the blurring filter is singular, Wiener filtering actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise, and wavelet-based denoising provides a natural technique for this purpose.
In this paper a new image restoration scheme is proposed. The scheme consists of two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is then passed to an adaptive-threshold wavelet
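The two-stage structure can be sketched as below, using SciPy's spatial adaptive Wiener filter as a stand-in for the Fourier-domain stage and a hand-rolled single-level Haar transform with soft thresholding for the wavelet stage; the image, noise level, and threshold are assumptions:

```python
import numpy as np
from scipy.signal import wiener

def haar2d(x):
    """Single-level orthonormal 2-D Haar transform."""
    lo = (x[:, 0::2] + x[:, 1::2]) / np.sqrt(2)
    hi = (x[:, 0::2] - x[:, 1::2]) / np.sqrt(2)
    ll = (lo[0::2] + lo[1::2]) / np.sqrt(2); lh = (lo[0::2] - lo[1::2]) / np.sqrt(2)
    hl = (hi[0::2] + hi[1::2]) / np.sqrt(2); hh = (hi[0::2] - hi[1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d."""
    lo = np.empty((ll.shape[0] * 2, ll.shape[1])); hi = np.empty_like(lo)
    lo[0::2], lo[1::2] = (ll + lh) / np.sqrt(2), (ll - lh) / np.sqrt(2)
    hi[0::2], hi[1::2] = (hl + hh) / np.sqrt(2), (hl - hh) / np.sqrt(2)
    x = np.empty((lo.shape[0], lo.shape[1] * 2))
    x[:, 0::2], x[:, 1::2] = (lo + hi) / np.sqrt(2), (lo - hi) / np.sqrt(2)
    return x

rng = np.random.default_rng(3)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = clean + rng.normal(0.0, 0.2, clean.shape)

stage1 = wiener(noisy, mysize=3)   # stage 1: Wiener filtering of the input image
ll, lh, hl, hh = haar2d(stage1)    # stage 2: wavelet-domain soft thresholding
soft = lambda c, t: np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
restored = ihaar2d(ll, soft(lh, 0.1), soft(hl, 0.1), soft(hh, 0.1))
print(round(float(np.mean((restored - clean) ** 2)), 5))
```

The second stage suppresses the residual (amplified) noise that survives the Wiener step, which is the motivation given in the abstract.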
This paper presents the matrix completion problem for image denoising. Three problems based on matrix norms are considered: the spectral norm minimization problem (SNP), the nuclear norm minimization problem (NNP), and the weighted nuclear norm minimization problem (WNNP). In general, an image is represented by a matrix that contains the image's information; some of this information is irrelevant or unfavorable, so to overcome this unwanted information in the image matrix, matrix completion is used to compress the matrix and remove it. The unwanted information is handled by defining a {0,1}-operator under some threshold. Applying this operator to a given ma
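Nuclear-norm minimization problems of this kind are commonly solved via singular-value soft-thresholding (the proximal operator of the nuclear norm); the sketch below illustrates that standard technique, not the paper's specific {0,1}-operator, and the matrix sizes and threshold are assumptions:

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: shrink each singular value by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return U @ np.diag(s) @ Vt

rng = np.random.default_rng(4)
low_rank = rng.normal(size=(20, 3)) @ rng.normal(size=(3, 20))  # rank-3 "image"
noisy = low_rank + rng.normal(0.0, 0.1, low_rank.shape)         # full-rank, noisy

denoised = svt(noisy, tau=1.2)
s_denoised = np.linalg.svd(denoised, compute_uv=False)
print(int(np.sum(s_denoised > 1e-8)))  # effective rank after thresholding
```

Thresholding discards the small singular values carrying the unwanted (noise) information while keeping the dominant structure, which is the intuition behind the NNP and WNNP formulations.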
The continuous pressure of work and daily life and the increasing financial and social stress that Iraqi women experience (both inside and outside Iraq) are among the main causes of anxiety, particularly among working-class women. This group of women carries the burden of multiple roles and responsibilities at the same time, all of which collectively makes them more prone to developing anxiety than men. In addition, the physiological and psychological nature of women, on top of their other roles in life, such as being a wife, mother, daughter, or sister, adds extra pressure, especially for those who are productive working members of society. In order to study the relatio
Routing is the process of delivering a packet from a source to a destination in a network using a routing algorithm that tries to create an efficient path; the path should be created with minimum overhead and bandwidth consumption. In the literature, routing protocols in VANETs have been categorized in many ways according to different aspects. In the present study, we prefer the classification based on the number of hops needed to reach the destination node: single-hop and multi-hop protocols. We first discuss the two types and then compare MDDV (a multi-hop protocol) with VADD (a single-hop protocol). The comparison is implemented theoretically and experimentally by providing a network environment consisting of SUMO, VIENS and
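The idea of choosing an efficient multi-hop path over a costly direct link can be illustrated with a standard shortest-path computation (a generic sketch, not the MDDV or VADD forwarding logic; the toy topology is assumed):

```python
import heapq

def dijkstra(graph, src):
    """Least-cost paths from src; graph maps node -> {neighbor: link cost}."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy vehicular topology: the multi-hop route A->B->C (cost 3)
# beats the direct single-hop link A->C (cost 10)
g = {"A": {"B": 1, "C": 10}, "B": {"C": 2}, "C": {}}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3}
```

In a real VANET the link costs change with vehicle mobility, which is why the simulated environment described in the abstract is needed for the experimental comparison.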