In this paper, Monte Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present. Two contamination scenarios were considered: contamination at high leverage points, which represents outliers in the circular independent variable, and vertical contamination, which represents outliers in the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). When the data contain no outliers, the least squares method outperforms the robust circular S method: it recorded the lowest Median MSE, the lowest Median SE, and the largest Median A(k) for all proposed sample sizes (n = 20, 50, 100). Under vertical contamination, the circular least squares method is not preferred at any contamination rate or sample size, and the higher the contamination rate in the vertical data, the greater the superiority of the robust estimation method, whose Median MSE and Median SE decrease while its Median A(k) increases for all proposed sample sizes. Under contamination at high leverage points, the circular least squares method is likewise not preferred, by a large margin, at all contamination levels and sample sizes; the higher the contamination rate at the leverage points, the greater the superiority of the robust estimation method, with Median MSE and Median SE decreasing and Median A(k) increasing for all sample sizes.
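To make the three criteria concrete, the sketch below computes them for a set of Monte Carlo replicates. It is a minimal illustration under assumptions, not the paper's code: the function names and the simple standard-error proxy are ours; only the mean squared error of the circular residuals and the mean cosine of the circular residuals A(k) follow the criteria named above.

```python
import numpy as np

def criteria_for_replicate(theta_obs, theta_fit):
    """Per-replicate criteria for one circular regression fit (angles in radians)."""
    # Circular residuals, wrapped to (-pi, pi].
    resid = np.angle(np.exp(1j * (theta_obs - theta_fit)))
    mse = np.mean(resid ** 2)        # mean squared error of the residuals
    se = np.sqrt(mse / len(resid))   # a simple standard-error proxy (assumption)
    a_k = np.mean(np.cos(resid))     # mean cosine of circular residuals, A(k)
    return se, mse, a_k

def median_criteria(replicates):
    """Median SE, Median MSE, and Median A(k) over Monte Carlo replicates."""
    stats = np.array([criteria_for_replicate(o, f) for o, f in replicates])
    return np.median(stats, axis=0)
```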
The properties of the 134−140Nd (neodymium) nuclei have been studied within the framework of the Interacting Boson Model (IBM) and a new method called the New Empirical Formula (NEF). The energies of the positive-parity bands of 134−140Nd have been calculated using both the IBM and the NEF, while the negative-parity bands of 134−140Nd have been calculated using the NEF only. The E-GOS curve as a function of the spin (I) has been drawn to determine the character of the positive-parity yrast band. The parameters of the best fit to the measured data are determined. The reduced transition probabilities of these nuclei were calculated. The critical point has been determined for the 140Nd isotope. The potential energy surfaces (PESs) of the IBM Hamiltonian have been obtained using the intrinsic …
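For background, the E-GOS (E-Gamma Over Spin) curve referred to above is conventionally the gamma-ray energy of the yrast I → I−2 transition divided by the spin I; its limiting behaviour separates vibrational from rotational character. The expressions below state this standard definition and its two textbook limits; they are general literature background, not results of the paper.

```latex
R(I) = \frac{E_{\gamma}(I \to I-2)}{I}, \qquad
\text{vibrator: } R(I) = \frac{\hbar\omega}{I} \to 0, \qquad
\text{rotor: } R(I) = \frac{\hbar^{2}}{2\mathcal{J}}\left(4 - \frac{2}{I}\right) \to \frac{2\hbar^{2}}{\mathcal{J}}.
```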
There are many methods for estimating permeability. In this paper, permeability has been estimated by two methods: the conventional and modified methods are used to calculate the flow zone indicator (FZI). Hydraulic flow units (HU) were identified by the FZI technique, which is effective in predicting permeability in un-cored intervals/wells. The HU is related to the FZI and the rock quality index (RQI). All available cores from 7 wells (Su-4, Su-5, Su-7, Su-8, Su-9, Su-12, and Su-14) were used as the database for HU classification, and the cumulative probability plot of FZI was used. The plot of core-derived probability FZI for both the modified and conventional methods indicates 4 HU (A, B, C and D) for the Nahr Umr formation.
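The FZI and RQI quantities mentioned above obey standard relations in the hydraulic-flow-unit literature, sketched below; the function names and the NumPy usage are illustrative, but the formulas themselves are the usual Amaefule-type definitions.

```python
import numpy as np

def flow_zone_indicator(k_md, phi):
    """Standard hydraulic flow unit relations.

    k_md : permeability in millidarcies
    phi  : effective porosity as a fraction
    """
    rqi = 0.0314 * np.sqrt(k_md / phi)   # rock quality index (micrometres)
    phi_z = phi / (1.0 - phi)            # normalized pore-to-grain porosity
    fzi = rqi / phi_z                    # flow zone indicator
    return rqi, phi_z, fzi

def permeability_from_fzi(fzi, phi):
    """Invert the relations to predict permeability (mD) in un-cored intervals."""
    return 1014.24 * fzi**2 * phi**3 / (1.0 - phi)**2
```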
This paper presents a hybrid approach for solving the null values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete data problem. The intelligent swarm algorithm is used for feature selection; it applies the bees algorithm as a heuristic search combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. A comparison is made between the two approaches in terms of their performance for null values estimation.
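A plausible form of the rough-set evaluation function used inside such a search is the dependency degree of the decision attribute on a candidate feature subset: the fraction of records whose equivalence class under those features is consistent on the decision. The sketch below illustrates that fitness function only, with names we chose; the bees-algorithm search loop around it is omitted.

```python
from collections import defaultdict

def dependency_degree(rows, feature_idx, target_idx):
    """Rough-set dependency of the decision on a feature subset.

    rows        : list of tuples of discrete attribute values
    feature_idx : indices of the candidate feature subset
    target_idx  : index of the decision attribute
    Returns |POS| / |U|, the fraction of objects whose equivalence class
    (under the chosen features) agrees on a single decision value.
    """
    classes = defaultdict(lambda: [0, set()])   # key -> [count, decision values]
    for row in rows:
        key = tuple(row[i] for i in feature_idx)
        classes[key][0] += 1
        classes[key][1].add(row[target_idx])
    pos = sum(count for count, decisions in classes.values() if len(decisions) == 1)
    return pos / len(rows)
```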
This research introduced multi-wavelet transform and neural network techniques for recognizing 3-D objects from 2-D images using patches. The proposed techniques were tested on a database of different patch features and of the high-energy subband of the discrete multi-wavelet transform (DMWT) (gp) of the patches. The test set has two groups: group (1) contains images, their (gp) patches, and patch features of the same images as a part of those in the data set, besides other images, (gp) patches, and features; group (2) contains (gp) patches and patch features that are the same as a part of those in the database but after modifications such as rotation, scaling, and translation. Recognition by a back propagation (BP) neural network as com…
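As a rough illustration of such a pipeline, the sketch below extracts detail-subband energies from a patch and feeds them to a back-propagation classifier. Two loud caveats: PyWavelets implements scalar wavelets, so dwt2 here is only a stand-in for the paper's multiwavelet (DMWT) and its (gp) subband, and scikit-learn's MLPClassifier stands in for the BP network; the feature choices are ours.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def patch_features(patch):
    """Detail-subband energies of one patch (scalar DWT as a DMWT stand-in)."""
    _, (cH, cV, cD) = pywt.dwt2(patch.astype(float), "db2")
    return np.array([np.sum(cH**2), np.sum(cV**2), np.sum(cD**2)])

def train_recognizer(patches, labels):
    """Fit a back-propagation (MLP) classifier over patch features."""
    X = np.stack([patch_features(p) for p in patches])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X, labels)
    return clf
```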
One of the main difficulties facing certified-document archiving systems is checking stamps, as stamps may contain complex backgrounds and be surrounded by unwanted data. Therefore, the main objective of this paper is to isolate the background and to remove noise that may surround the stamp. The proposed method comprises four phases. Firstly, we apply the k-means algorithm to cluster the stamp image into a number of clusters and merge them using the ISODATA algorithm. Secondly, we compute the mean and standard deviation of each remaining cluster to isolate the background cluster from the stamp cluster. Thirdly, a region growing algorithm is applied to segment the image and then choose the connected regions …
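The first two phases could look like the sketch below: k-means over pixel intensities, then per-cluster mean/std statistics to pick out the background. The "largest, most uniform cluster is the background" rule and all names are our assumptions, and the ISODATA merging step is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

def isolate_background(img, n_clusters=4):
    """Cluster pixel intensities and flag the likeliest background cluster."""
    pixels = img.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)
    stats = []
    for c in range(n_clusters):
        members = pixels[labels == c]
        stats.append((c, len(members), members.mean(), members.std()))
    # Heuristic: the background is the tightest (smallest-spread), largest cluster.
    background = min(stats, key=lambda s: (s[3], -s[1]))[0]
    return (labels != background).reshape(img.shape)   # True where the stamp is
```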
Image denoising is a technique for removing unwanted signals, called noise, that couple with the original signal during transmission; many denoising methods are used to remove the noise from the original signal. In this paper, the Multiwavelet Transform (MWT) is used to denoise the corrupted image by choosing the HH coefficients for processing, based on two different filters, the Tri-State Median filter and the Switching Median filter. With each filter, various shrinkage rules are used, such as Normal Shrink, Sure Shrink, Visu Shrink, and Bivariate Shrink. The proposed algorithm is applied to salt-and-pepper noise at different levels on grayscale test images. The quality of the denoised image is evaluated by usi…
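One combination from the family described above might look like the sketch below: a median prefilter followed by soft thresholding of the diagonal detail band with the VisuShrink universal threshold. A plain median filter stands in for the tri-state/switching variants, and pywt's scalar DWT stands in for the multiwavelet transform.

```python
import numpy as np
import pywt
from scipy.ndimage import median_filter

def denoise_hh_visushrink(img):
    """Median prefilter, then VisuShrink soft thresholding of the HH band."""
    pre = median_filter(img.astype(float), size=3)
    cA, (cH, cV, cD) = pywt.dwt2(pre, "db2")
    sigma = np.median(np.abs(cD)) / 0.6745       # robust noise estimate from HH
    t = sigma * np.sqrt(2.0 * np.log(cD.size))   # VisuShrink universal threshold
    cD = pywt.threshold(cD, t, mode="soft")
    return pywt.idwt2((cA, (cH, cV, cD)), "db2")
```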
Aspect-based sentiment analysis is an important research topic concerned with extracting and categorizing aspect terms from online reviews. Recent efforts have shown that topic modelling is widely used for this task. In this paper, we integrated word embeddings into collapsed Gibbs sampling for Latent Dirichlet Allocation (LDA). Specifically, the conditional distribution in the topic model is improved using a word embedding model trained on the (customer review) training dataset. Semantic similarity (the cosine measure) was leveraged to assign aspect terms to their related aspect categories. The experiment extracted and categorized aspect terms from the SemEval 2014 dataset.
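The categorization step reduces to a nearest-neighbour decision in embedding space. The sketch below shows that cosine-similarity assignment with plain NumPy; the embedding vectors, the category names, and the embed helper in the usage comment are hypothetical.

```python
import numpy as np

def categorize_aspect(term_vec, category_vecs):
    """Assign an aspect term to the category with the highest cosine similarity.

    term_vec      : embedding of the extracted aspect term
    category_vecs : dict mapping category name -> embedding vector
    """
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(category_vecs, key=lambda c: cos(term_vec, category_vecs[c]))

# Hypothetical usage with SemEval-style categories:
# vecs = {"food": v_food, "service": v_service, "price": v_price}
# categorize_aspect(embed("waiter"), vecs)   # -> "service"
```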
A CNC machine is used to machine complex or simple shapes at high speed with maximum accuracy and minimum error. In this paper, a previously designed CNC control system is used to machine ellipses and polylines. The sample to be machined is drawn using drawing software such as AUTOCAD® or 3D MAX and saved in a well-known file format (DXF); the CNC operator then feeds that file to the CNC machine controller, and the part is machined by the CNC machine. Using the developed algorithms, the CNC controller reads the DXF file fed to the machine, extracts the shapes from the file, and generates commands that move the CNC machine axes so that these shapes can be machined.
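Shape extraction from a DXF file can be sketched with the open-source ezdxf reader; the paper uses its own parsing algorithms, so this is only an illustration of the extraction step, and the returned dictionaries are a format invented for the example.

```python
import ezdxf  # stand-in reader; the paper implements its own DXF parsing

def extract_shapes(path):
    """Pull polyline and ellipse geometry out of a DXF file's modelspace."""
    msp = ezdxf.readfile(path).modelspace()
    shapes = []
    for e in msp.query("LWPOLYLINE"):
        # get_points() yields (x, y, start_width, end_width, bulge) per vertex
        shapes.append({"type": "polyline",
                       "points": [(p[0], p[1]) for p in e.get_points()]})
    for e in msp.query("ELLIPSE"):
        shapes.append({"type": "ellipse",
                       "center": tuple(e.dxf.center),
                       "major_axis": tuple(e.dxf.major_axis),
                       "ratio": e.dxf.ratio})
    return shapes
```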
This study aims to demonstrate the role of artificial intelligence and metaverse techniques, mainly logistic regression, in reducing earnings management in Iraqi private banks. Artificial intelligence approaches have shown the capability to detect irregularities in financial statements and mitigate the practice of earnings management. In contrast, many privately owned banks in Iraq historically relied on manual processes involving pen and paper for recording and posting financial information in their accounting records. However, the banking sector in Iraq has undergone technological advancements, leading to the automation of most banking operations. Conventional audit techniques have become outdated due to factors such as the accuracy of d…
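As a sketch of how logistic regression could flag earnings management, the snippet below fits a scikit-learn classifier to per-bank indicators. The feature set (accruals, loan-loss provisions, growth ratios) and all names are hypothetical; the paper's actual variables are not specified in this excerpt.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: hypothetical per-bank indicators (accruals, provisions, growth ratios);
# y: 1 where earnings management was flagged, 0 otherwise.
def fit_detector(X, y):
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X, y)
    return model

# model.predict_proba(new_banks)[:, 1] gives each bank's estimated
# probability of earnings-management behaviour in its statements.
```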