The removal of Direct Blue 71 dye from prepared wastewater was studied in a batch electrocoagulation (EC) cell with aluminum electrodes. The influence of the process variables, initial pH (2.0-12.0), wastewater conductivity (0.8-12.57 mS/cm), initial dye concentration (30-210 mg/L), electrolysis time (3-12 min), and current density (10-50 mA/cm²), was studied in order to maximize color removal from the wastewater. Experimental results showed that the color removal yield increases with increasing pH up to pH 6.0 and decreases thereafter. The color removal increased with increasing current density, wastewater conductivity, and electrolysis time, and decreased with increasing initial dye concentration. The maximum color removal yield of 96.5% was obtained at pH 6.0, wastewater conductivity of 9.28 mS/cm, electrolysis time of 6 min, initial dye concentration of 60 mg/L, and current density of 30 mA/cm².
Background: Glass ionomers have good biocompatibility and the ability to adhere to both enamel and dentin. However, they have certain demerits, mainly low tensile and compressive strengths. Therefore, this study was conducted to assess the consistency and compressive strength of glass ionomer reinforced with different amounts of hydroxyapatite. Materials and Methods: In this study, hydroxyapatite was added to glass ionomer cement at different ratios: 10%, 15%, 20%, 25%, and 30% (by weight). The standard consistency test described in American Dental Association (ADA) specification No. 8 was used, so that all new base materials could be conveniently mixed and the results would be of comparable value, and the compressive strength test described by
Different solvents (light naphtha, n-heptane, and n-hexane) were used to treat Iraqi atmospheric oil residue by the deasphalting process. Oil residue from the Al-Dura refinery with a specific gravity of 0.9705, API gravity of 14.9, and 0.5 wt.% sulfur content was used. Deasphalted oil (DAO) was examined on a laboratory scale using these solvents under different operating conditions (temperature, solvent concentration, solvent-to-oil ratio, and duration time). This study investigates the effects of these parameters on asphaltene yield. The results show that an increase in temperature for all solvents increases the extracted asphaltene yield. The highest reduction in asphaltene content is obtained with the hexane solvent at operating conditions of (90 °C
Statistics plays an important role in studying the characteristics of diverse populations. By using statistical methods, the researcher can make appropriate decisions to reject or accept statistical hypotheses. In this paper, a statistical analysis of data on variables related to patients infected with the coronavirus was conducted using multivariate analysis of variance (MANOVA), and the effect of these variables was assessed.
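As an illustration of the kind of analysis described above, the following is a minimal MANOVA sketch in Python using statsmodels; the data file, column names, and grouping factor are hypothetical stand-ins, not the authors' dataset.

```python
# Minimal one-way MANOVA sketch (illustrative; file and column names are
# hypothetical, not the dataset analysed in the paper).
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical patient table: two dependent variables and one grouping factor.
df = pd.read_csv("covid_patients.csv")  # columns: spo2, lymphocytes, severity

# Do the dependent variables differ jointly across severity groups?
fit = MANOVA.from_formula("spo2 + lymphocytes ~ severity", data=df)
print(fit.mv_test())  # Wilks' lambda, Pillai's trace, etc.
```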
This study aimed to conduct a phytotoxicity experiment with kerosene as a model total petroleum hydrocarbon (TPH) pollutant at different concentrations (1% and 6%), with aeration rates of 0 and 1 L/min and retention times of 7, 14, 21, 28, and 42 days, carried out in a subsurface flow (SSF) system in the barley wetland. It was noted that the greatest elimination, 95.7%, was recorded at the 1% kerosene level and an aeration rate of 1 L/min after 42 days of exposure, whereas it was 47% in the control test without plants. Furthermore, the elimination efficiencies of hydrocarbons from the soil ranged between 34.155% and 95.7% for all TPH (kerosene) concentrations at aeration rates of 0 and 1 L/min. The Barley c
... Show MoreLowpass spatial filters are adopted to match the noise statistics of the degradation seeking
good quality smoothed images. This study imply different size and shape of smoothing
windows. The study shows that using a window square frame shape gives good quality
smoothing and at the same time preserving a certain level of high frequency components in
comparsion with standard smoothing filters.
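As a rough illustration of the square-frame idea (not the authors' exact implementation), the sketch below builds a kernel whose non-zero weights lie only on the border of a k×k window and applies it with SciPy; the kernel size and normalization are assumptions.

```python
# Sketch of a square-frame (hollow square) smoothing window: only the border
# pixels of a k x k neighbourhood are averaged, which smooths noise while
# retaining more high-frequency detail than a full k x k mean filter.
import numpy as np
from scipy.ndimage import convolve

def square_frame_kernel(k=5):
    kernel = np.ones((k, k), dtype=float)
    kernel[1:-1, 1:-1] = 0.0           # hollow out the interior
    return kernel / kernel.sum()        # normalize to unit gain

def frame_smooth(image, k=5):
    return convolve(image.astype(float), square_frame_kernel(k), mode="reflect")

# Example: smooth a noisy synthetic image.
rng = np.random.default_rng(0)
noisy = 128 + 20 * rng.standard_normal((64, 64))
smoothed = frame_smooth(noisy, k=5)
```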
Electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern electrocardiogram monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs, or to make ECG signals suitable and ready for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals, in which different transforms are used to compress the ECG signals. First, the 1-D ECG data were segmented and aligned into a 2-D data array; then a 2-D mixed transform was implemented to compress the
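The specific mixed transform is not detailed in this excerpt; as a hedged illustration of the general 1-D-to-2-D idea, the sketch below stacks fixed-length ECG segments into a 2-D array and applies a 2-D DCT, where the segment length and the choice of DCT are assumptions.

```python
# Illustrative sketch: align a 1-D ECG record into a 2-D array and apply a
# 2-D transform. The segment length and the use of a 2-D DCT are assumptions;
# the paper's actual "mixed transform" is not specified in this excerpt.
import numpy as np
from scipy.fft import dctn, idctn

def ecg_to_2d(signal, seg_len=256):
    n_segs = len(signal) // seg_len
    return np.reshape(signal[:n_segs * seg_len], (n_segs, seg_len))

def compress_2d(block, keep=0.10):
    coeffs = dctn(block, norm="ortho")
    thresh = np.quantile(np.abs(coeffs), 1.0 - keep)  # keep largest 10 %
    coeffs[np.abs(coeffs) < thresh] = 0.0             # discard small coefficients
    return coeffs

def reconstruct(coeffs):
    return idctn(coeffs, norm="ortho")

# Example with a synthetic signal standing in for an ECG record.
ecg = np.sin(np.linspace(0, 200 * np.pi, 256 * 32))
recon = reconstruct(compress_2d(ecg_to_2d(ecg)))
```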
The wavelet transform has become a useful computational tool for a variety of signal and image processing applications.
The aim of this paper is to present a comparative study of various wavelet filters. Eleven different wavelet filters (Haar, Mallat, Symlets, Integer, Coiflet, Daubechies 1, Daubechies 2, Daubechies 4, Daubechies 7, Daubechies 12, and Daubechies 20) are used to compress seven true-color images of size 256×256 as samples. Image quality parameters such as peak signal-to-noise ratio (PSNR) and normalized mean square error have been used to evaluate the performance of the wavelet filters.
In our work, PSNR is used as a measure of accuracy performance
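For reference, PSNR is computed from the mean square error as PSNR = 10·log10(peak² / MSE); the minimal sketch below assumes 8-bit channels (peak value 255).

```python
# PSNR between an original and a compressed image, assuming 8-bit channels
# (peak value 255); normalized MSE could be reported in the same way.
import numpy as np

def psnr(original, compressed, peak=255.0):
    mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")            # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```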
In this paper, two main stages for image classification have been presented. The training stage consists of collecting images of interest and applying BOVW to these images (feature extraction and description using SIFT, and vocabulary generation), while the testing stage classifies a new unlabeled image using nearest-neighbor classification on the feature descriptors. The supervised bag of visual words gives good results, presented clearly in the experimental part, where unlabeled images are classified correctly although only a small number of images is used in the training process.
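A minimal sketch of such a BOVW pipeline is given below, assuming OpenCV's SIFT, a k-means vocabulary, and a nearest-neighbor classifier from scikit-learn; the vocabulary size and the specific libraries are assumptions rather than the paper's exact setup.

```python
# Bag-of-visual-words (BOVW) sketch: SIFT descriptors, k-means vocabulary,
# per-image word histograms, nearest-neighbour labelling.
# Vocabulary size and libraries are assumptions, not the paper's exact setup.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def descriptors(image_gray):
    _, desc = sift.detectAndCompute(image_gray, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bovw_histogram(desc, vocabulary):
    words = vocabulary.predict(desc.astype(np.float32)) if len(desc) else []
    hist, _ = np.histogram(words, bins=np.arange(vocabulary.n_clusters + 1))
    return hist / max(hist.sum(), 1)   # normalized word histogram

def train(train_images_gray, labels, vocab_size=100):
    all_desc = np.vstack([descriptors(im) for im in train_images_gray])
    vocabulary = KMeans(n_clusters=vocab_size, n_init=10).fit(all_desc)
    X = [bovw_histogram(descriptors(im), vocabulary) for im in train_images_gray]
    clf = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
    return vocabulary, clf

def classify(image_gray, vocabulary, clf):
    return clf.predict([bovw_histogram(descriptors(image_gray), vocabulary)])[0]
```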
Several problems need to be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on the wavelet transform and the discrete cosine transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used to
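A hedged sketch of this kind of pipeline is shown below using PyWavelets and SciPy; 'bior4.4' is used as the usual CDF 9/7 approximation, the quantization step size is an assumption, and zlib stands in for the LZW stage since a stock LZW coder is not in the Python standard library.

```python
# Sketch of the wavelet + DCT + quantization + lossless-coding pipeline.
# Assumptions: 'bior4.4' as the CDF 9/7 wavelet, a fixed quantization step,
# and zlib (DEFLATE) standing in for the LZW entropy-coding stage.
import numpy as np
import pywt
import zlib
from scipy.fft import dctn

def compress(image, q_step=8.0):
    # 1) bi-orthogonal wavelet transform -> approximation + detail sub-bands
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "bior4.4")
    # 2) DCT to further de-correlate each sub-band
    bands = [dctn(b, norm="ortho") for b in (cA, cH, cV, cD)]
    # 3) scalar quantization, then shift to non-negative integers
    q = [np.round(b / q_step).astype(np.int32) for b in bands]
    offset = min(b.min() for b in q)
    mapped = [b - offset for b in q]
    # 4) lossless coding (zlib here as a stand-in for LZW)
    payload = np.concatenate([b.ravel() for b in mapped]).astype(np.int32)
    return zlib.compress(payload.tobytes()), offset

# Example usage with a random 8-bit test image.
img = np.random.randint(0, 256, (256, 256))
data, offset = compress(img)
print("compressed bytes:", len(data))
```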