A nonlinear filter for smoothing color and gray images
corrupted by Gaussian noise is presented in this paper. The proposed
filter is designed to reduce the noise in the R, G, and B bands of
color images while preserving edges. The filter is applied in order to
prepare images for further processing such as edge detection and
image segmentation.
The results of computer simulations show that the proposed
filter gives satisfactory results when compared with
conventional filters such as the Gaussian low-pass filter and the median filter,
using the Cross-Correlation Coefficient (CCC) as the comparison criterion.
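One common definition of the cross-correlation coefficient used to compare a filtered image against a reference is the Pearson-style normalized correlation; a minimal sketch (the exact normalization used in the paper is not specified, so this is an assumption):

```python
import numpy as np

def cross_correlation_coefficient(ref, test):
    """Normalized (Pearson-style) cross-correlation coefficient
    between two equal-sized images; 1.0 means a perfect match."""
    ref = ref.astype(float).ravel()
    test = test.astype(float).ravel()
    ref -= ref.mean()    # remove mean so the measure ignores brightness offsets
    test -= test.mean()
    denom = np.sqrt((ref ** 2).sum() * (test ** 2).sum())
    return float((ref * test).sum() / denom)
```

An image compared against itself yields 1.0, and an inverted copy yields -1.0, which is why a value close to 1 indicates the filter preserved the original structure well.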
After baking, azodicarbonamide, an approved food additive, can be converted into the carcinogenic compounds semicarbazide hydrochloride (SEM) and biurea in flour products. Determining SEM in commercial bread products has therefore become mandatory and needs to be performed. Accordingly, two accurate, precise, simple, and economical colorimetric methods have been developed for the visual detection and quantitative determination of SEM in commercial flour products. The first method is based on the formation of a blue-coloured product with λmax at 690 nm, resulting from a reaction between SEM and potassium ferrocyanide in an acidic medium (pH 6.0). In the second method, a brownish-green coloured product is formed due to the reaction between SEM and phosph
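Quantitative colorimetric determination of this kind typically rests on a Beer-Lambert calibration line relating absorbance at λmax (690 nm here) to concentration. A minimal sketch, using hypothetical calibration standards (the concentrations and readings below are illustrative, not from the paper):

```python
import numpy as np

# hypothetical calibration standards: SEM concentration (ug/mL) vs absorbance at 690 nm
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
absorbance = np.array([0.052, 0.101, 0.198, 0.405, 0.803])

# least-squares fit of the linear (Beer-Lambert) region: A = slope * C + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

def concentration_from_absorbance(a):
    """Invert the calibration line to estimate SEM concentration
    from a sample's absorbance reading."""
    return (a - intercept) / slope
```

With such a fit, a sample reading of about 0.20 absorbance units would map back to roughly 2 ug/mL under these assumed standards.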
A system was used to detect injuries in plant leaves by combining machine learning with image-processing techniques. A small agricultural robot was implemented for fine spraying, identifying infected leaves using image-processing technology at four different forward speeds (35, 46, 63, and 80 cm/s). The results revealed that increasing the speed of the agricultural robot led to a decrease in both the amount of supplement sprayed and the detection percentage of infected plants. They also revealed reductions in supplement spraying of 46.89, 52.94, 63.07, and 76% at the different forward speeds compared with the traditional method.
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, and 1.5) and the initial value of the estimated parameter is 1, using different sample sizes (n = 10, 20, 30, 50). The results are based on an empirical study using simulation experiments to compare four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the Jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
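For reference, the Burr-XII survival (reliability) function with shape parameters c and k is R(t) = (1 + t^c)^(-k); a minimal sketch, including inverse-CDF sampling of the kind a simulation study like this would use (the paper's exact simulation design is not reproduced here):

```python
import numpy as np

def burr_xii_reliability(t, c, k):
    """Burr-XII reliability (survival) function: R(t) = (1 + t**c) ** (-k), t >= 0."""
    t = np.asarray(t, dtype=float)
    return (1.0 + t ** c) ** (-k)

def burr_xii_sample(n, c, k, rng=None):
    """Draw n Burr-XII variates by inverting F(t) = 1 - (1 + t**c) ** (-k)."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)
```

Note that R(0) = 1 and R decreases monotonically, and that with c = k = 1 the reliability at t = 1 is exactly 0.5, which gives a quick sanity check on the formula.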
Anomaly detection is still a difficult task. To address this problem, we propose strengthening the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a certain set threshold from any cluster (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal relative to its known group. The analysis showed DBSCAN using the
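The baseline behavior the abstract describes, DBSCAN labeling points far from any dense cluster as noise, can be sketched with scikit-learn (the synthetic data, eps, and min_samples values here are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# two dense clusters plus a few far-away outliers
cluster_a = rng.normal(0.0, 0.3, size=(50, 2))
cluster_b = rng.normal(5.0, 0.3, size=(50, 2))
outliers = np.array([[10.0, 10.0], [-8.0, 7.0]])
X = np.vstack([cluster_a, cluster_b, outliers])

# points not density-reachable from any cluster get the noise label -1
labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
```

This captures only the "far from any group" kind of anomaly; the abstract's point is precisely that contextual anomalies sitting near a known group are missed by this baseline.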
Encryption of data translates the data into another shape or symbol so that only people with access to the secret key or password can read it. Data which are encrypted are generally referred to as ciphertext, while data which are unencrypted are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image; as the pixel values within an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 and
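The entropy measure described here is the Shannon entropy of the gray-level histogram; a minimal sketch for an 8-bit frame (the paper's exact computation is not given, so this standard form is an assumption):

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image's histogram."""
    hist = np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty gray levels
    return float(-(p * np.log2(p)).sum())
```

A constant image has entropy 0, while a frame whose pixels spread uniformly over all 256 gray levels reaches the maximum of 8 bits per pixel, which is why higher entropy in the ciphertext frames indicates stronger scrambling.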
The research presents a comparative simulation study of semi-parametric estimation methods for the partially linear single-index model. Two approaches to model estimation are considered: the two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, with the mean average squared error used as the comparison criterion between the methods. The results showed a preference for the two-stage procedure across all the cases that were considered.