Settlement evaluation for jet grouted columns (JGC) in soft soils is a problematic matter because it is influenced by a number of factors, such as soil type, the mixing interaction between the soil and the grouting material, nozzle energy, grouting and water flow rates, and rotation and lifting speeds. Most design methods for jet-grouted columns are based on experience. In this study, prototype single and group jet grouting models (single, 1×2, and 2×2), with a total length of 2000 mm, a diameter of 150 mm, and a clear spacing of 3D, were constructed in soft clay and subjected to vertical axial loads. Furthermore, different theoretical methods have been used for the estimation
A nonlinear filter for smoothing color and gray images corrupted by Gaussian noise is presented in this paper. The proposed filter is designed to reduce the noise in the R, G, and B bands of color images while preserving the edges. The filter is applied in order to prepare images for further processing such as edge detection and image segmentation. The results of computer simulations show that the proposed filter gives satisfactory results when compared with conventional filters such as the Gaussian low-pass filter and the median filter, using the cross-correlation coefficient (CCC) as the comparison criterion.
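A minimal sketch of the evaluation described above: a noisy image is smoothed with the conventional filters mentioned (Gaussian low-pass and median), and each result is scored against the clean reference with a cross-correlation coefficient. The Gaussian sigma and median window size are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def ccc(reference, restored):
    """Normalized cross-correlation coefficient between two images."""
    a = reference.astype(float).ravel() - reference.mean()
    b = restored.astype(float).ravel() - restored.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def compare(clean, noisy):
    # Apply each conventional filter to the noisy image and score it
    # against the clean reference (per band, for color images).
    return {
        "gaussian": ccc(clean, gaussian_filter(noisy, sigma=1.5)),  # assumed sigma
        "median":   ccc(clean, median_filter(noisy, size=3)),       # assumed window
    }
```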
After baking, azodicarbonamide, an approved food additive, can be converted into carcinogenic semicarbazide hydrochloride (SEM) and biurea in flour products. Thus, determining SEM in commercial bread products has become mandatory and needs to be performed. Therefore, two accurate, precise, simple, and economical colorimetric methods have been developed for the visual detection and quantitative determination of SEM in commercial flour products. The 1st method is based on the formation of a blue-coloured product with λmax at 690 nm as a result of a reaction between SEM and potassium ferrocyanide in an acidic medium (pH 6.0). In the 2nd method, a brownish-green coloured product is formed due to the reaction between SEM and phosph
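A minimal sketch of the kind of quantification step such colorimetric methods rely on: a linear (Beer-Lambert) calibration of absorbance at 690 nm against known SEM standards, inverted to estimate an unknown sample. The concentrations and absorbances below are hypothetical placeholders, not measured data from the study.

```python
import numpy as np

# Hypothetical calibration standards (concentration in ug/mL) and their
# measured absorbances at 690 nm.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
absorbance = np.array([0.06, 0.11, 0.22, 0.45, 0.90])

# Fit the calibration line A = slope * C + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)

def concentration_from_absorbance(a_sample):
    """Invert the calibration line to estimate SEM concentration."""
    return (a_sample - intercept) / slope
```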
A medical-service platform is a mobile application through which patients are provided with doctors' diagnoses based on information gleaned from medical images. The content of these diagnostic results must not be illegitimately altered during transmission and must be returned to the correct patient. In this paper, we present a solution to these problems using blind, reversible, and fragile watermarking based on authentication of the host image. In our proposed algorithm, the binary version of the Bose-Chaudhuri-Hocquenghem (BCH) code of the patient medical report (PMR) and the binary patient medical image (PMI), combined by fuzzy exclusive-or (F-XoR), are used to produce the patient's unique mark using a secret sharing scheme (SSS). The patient's un
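As a generic illustration of the XOR-and-embed idea only: a binary report is combined with a binary image-derived pattern to form a mark, which is then written into the least significant bits of the host image so that any tampering disturbs it. This sketch does not reproduce the paper's BCH coding, fuzzy exclusive-or (F-XoR), secret sharing, or reversible embedding; all function names here are hypothetical.

```python
import numpy as np

def make_mark(report_bits, image_bits):
    """Combine a binary report with a binary image pattern of equal length (plain XOR)."""
    return np.bitwise_xor(report_bits, image_bits)

def embed_lsb(host, mark_bits):
    """Write mark bits into the least significant bits of the host pixels."""
    flat = host.ravel().copy()
    flat[:mark_bits.size] = (flat[:mark_bits.size] & 0xFE) | mark_bits.astype(flat.dtype)
    return flat.reshape(host.shape)

def extract_lsb(watermarked, n_bits):
    """Read the embedded bits back for authentication."""
    return watermarked.ravel()[:n_bits] & 1
```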
This paper presents a proposed method for content-based image retrieval (CBIR) using the Discrete Cosine Transform combined with the Kekre Wavelet Transform (DCT/KWT) and the Daubechies Wavelet Transform combined with the Kekre Wavelet Transform (D4/KWT) to extract features for a distributed database system with a client/server star topology: the client sends the query image, and the server (which holds the database) performs all the work and then sends the retrieved images back to the client. A comparison between these two approaches is made: first DCT against DCT/KWT, and second D4 against D4/KWT. The work is experimented over an image database of 200 images in 4 categories, and the performance of image retrieval is evaluated with respect to two similarity measures, namely Euclidean distance (ED) and sum of absolute diff
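A minimal sketch of the retrieval step implied above: the server ranks its stored feature vectors by Euclidean distance to the query's feature vector and returns the closest images. The feature extraction itself (DCT/KWT or D4/KWT) is assumed to have been done elsewhere and is not shown.

```python
import numpy as np

def retrieve(query_features, db_features, top_k=10):
    """Rank database images by Euclidean distance (ED) to the query.

    db_features: array of shape (n_images, n_features).
    Returns the indices of the top_k closest images.
    """
    distances = np.linalg.norm(db_features - query_features, axis=1)
    return np.argsort(distances)[:top_k]
```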
Although Wiener filtering is the optimal tradeoff between inverse filtering and noise smoothing, when the blurring filter is singular the Wiener filter actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise, and wavelet-based denoising provides a natural technique for this purpose.
In this paper a new image restoration scheme is proposed. The scheme contains two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is then passed to an adaptive-threshold wavelet
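A minimal sketch of the two-stage idea described above: Wiener filtering followed by wavelet-domain soft thresholding of the result. The wavelet choice ('db4'), the universal threshold, and the window size are illustrative assumptions, not the paper's exact adaptive scheme.

```python
import numpy as np
from scipy.signal import wiener
import pywt

def restore(image, wavelet="db4", level=2):
    # Stage 1: Wiener filtering of the degraded input image.
    filtered = wiener(image.astype(float), mysize=5)  # assumed window size

    # Stage 2: wavelet denoising of the Wiener output.
    coeffs = pywt.wavedec2(filtered, wavelet, level=level)
    # Estimate the noise level from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(filtered.size))  # universal threshold
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)
```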
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all the data to a graph concept frame (CFG). As is well known, DBSCAN groups data points that belong to the same cluster, while points that fall outside the clusters are treated as noise or anomalies. In this way the DBSCAN algorithm can detect abnormal points that lie beyond a certain threshold (extreme values). However, not all abnormalities are cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly yet is considered abnormal with respect to the known group. The analysis showed DBSCAN using the
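A minimal sketch of the baseline behaviour described above: points that DBSCAN cannot assign to any cluster receive the label -1 (noise) and are treated as anomalies. The eps and min_samples values are illustrative assumptions, and the paper's graph concept frame (CFG) preprocessing is not reproduced here.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_anomalies(X, eps=0.5, min_samples=5):
    """Return indices of points DBSCAN leaves unclustered (label -1)."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
    return np.where(labels == -1)[0]
```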
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, and 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rely on an empirical study in which simulation experiments are applied to compare four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
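A minimal sketch, assuming the standard two-parameter Burr-XII form with shape parameters c and k, whose reliability (survival) function is R(t) = (1 + t^c)^(-k). The jackknife wrapper below is a generic leave-one-out bias-corrected estimator, not the paper's specific estimation procedure.

```python
import numpy as np

def burr_xii_reliability(t, c, k):
    """Reliability function R(t) = (1 + t^c)^(-k) of the Burr-XII distribution."""
    return (1.0 + np.power(t, c)) ** (-k)

def jackknife(data, estimator):
    """Leave-one-out jackknife bias-corrected version of a point estimator."""
    data = np.asarray(data)
    n = len(data)
    theta_full = estimator(data)
    theta_loo = np.array([estimator(np.delete(data, i)) for i in range(n)])
    return n * theta_full - (n - 1) * theta_loo.mean()
```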
In this paper, two new simple, fast, and efficient block matching algorithms are introduced. Both methods begin the block matching process from the image centre block and move across the blocks toward the image boundaries. For each block, the motion vector is initialized using a linear prediction that depends on the motion vectors of its neighbouring blocks that have already been scanned and assessed. Also, a hybrid mechanism is introduced that mixes the two proposed predictive mechanisms with the Exhaustive Search (ES) mechanism in order to gain matching accuracy near or similar to that of ES but with a search time (ST) less than 80% of that of ES. It also offers more control capability to reduce search errors. The experimental tests
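A minimal sketch of predictive block matching, assuming a sum-of-absolute-differences cost and a small refinement window around a motion vector predicted from already-processed neighbours. This illustrates the general idea only; the paper's centre-outward scan order and hybrid ES switching are not reproduced here.

```python
import numpy as np

def sad(block, candidate):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(block.astype(int) - candidate.astype(int)).sum()

def match_block(cur, ref, y, x, bsize, pred_mv, radius=4):
    """Refine the predicted motion vector pred_mv = (dy, dx) within +/- radius."""
    block = cur[y:y + bsize, x:x + bsize]
    best_mv, best_cost = pred_mv, np.inf
    for dy in range(pred_mv[0] - radius, pred_mv[0] + radius + 1):
        for dx in range(pred_mv[1] - radius, pred_mv[1] + radius + 1):
            ry, rx = y + dy, x + dx
            if 0 <= ry <= ref.shape[0] - bsize and 0 <= rx <= ref.shape[1] - bsize:
                cost = sad(block, ref[ry:ry + bsize, rx:rx + bsize])
                if cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
    return best_mv
```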
The research deals with a comparative simulation study of some semi-parametric estimation methods for the partially linear single-index model. Two approaches are used to estimate this model: a two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, and the mean average squared errors were used as the comparison criterion between the methods. The results showed a preference for the two-stage procedure in all the cases that were considered.
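A minimal sketch of the comparison criterion named above: the mean average squared error of an estimated regression function over Monte Carlo replications. The array names and shapes are illustrative assumptions, not the paper's exact simulation design.

```python
import numpy as np

def mean_average_squared_error(estimates, truth):
    """estimates: array of shape (n_replications, n_points); truth: shape (n_points,)."""
    per_replication = ((estimates - truth) ** 2).mean(axis=1)  # ASE for each run
    return per_replication.mean()                              # average over all runs
```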