Two types of adsorbents, activated carbon and zeolite, were used to treat oily wastewater, and their removal efficiencies were compared. The results showed that activated carbon removed oil more effectively. The experimental methods employed in this investigation included batch and column studies: the former was used to evaluate the rate and equilibrium of adsorption onto carbon and zeolite, while the latter was used to determine treatment efficiencies and performance characteristics. An expanded-bed adsorber was constructed for the column studies. The adsorption behavior of vegetable oil (corn oil) onto activated carbon and zeolite was examined as a function of adsorbate concentration, contact time, adsorbent dosage, and the amount of coagulant salt (calcium sulphate) added. The adsorption data were modeled with the Freundlich and Langmuir adsorption isotherms, and the adsorption process on both activated carbon and zeolite was found to fit the Freundlich isotherm model. The amount of oil adsorbed increased with contact time, but longer mixing did not further increase residual oil removal because the adsorbent surface became covered with oil molecules. As the adsorbent dosage increased, the percentage of residual oil removed also increased. The effects of adsorbent type and of the amount of coagulant salt (calcium sulphate) added on the breakthrough curve were studied in detail in the column studies. Expanded-bed behavior was modeled using the Richardson-Zaki correlation between the superficial velocity of the feed stream and the void fraction of the bed at moderate Reynolds numbers.
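For reference, the models cited above have the following standard forms (a brief reminder using the usual symbols; the fitted parameter values reported in the study are not reproduced here):

$$q_e = K_F\, C_e^{1/n} \quad \text{(Freundlich)}, \qquad q_e = \frac{q_m K_L C_e}{1 + K_L C_e} \quad \text{(Langmuir)}, \qquad \frac{U}{U_t} = \varepsilon^{\,n_{RZ}} \quad \text{(Richardson-Zaki)},$$

where $q_e$ is the amount of oil adsorbed at equilibrium, $C_e$ the equilibrium oil concentration, $K_F$, $n$, $q_m$ and $K_L$ fitted constants, $U$ the superficial velocity of the feed, $U_t$ the terminal settling velocity of a single adsorbent particle, $\varepsilon$ the bed void fraction, and $n_{RZ}$ the Richardson-Zaki index, whose value depends on the particle Reynolds number.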
This paper proposes a Teaching-Learning-Based Optimization (TLBO) algorithm to solve the three-dimensional container packing problem. The objective, expressed as a mathematical model, is to optimize the space usage in a container. Besides the interaction between students and the teacher, the algorithm also models the learning process among students in the classroom and requires no algorithm-specific control parameters. TLBO therefore uses a teacher phase and a student phase as its main updating processes to find the best solution. To validate the effectiveness of the algorithm, it was implemented on three sample cases: small data with 5 size-types of items totalling 12 units, medium data with 10 size-types of items with …
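As a concrete illustration of the teacher and student phases mentioned above, the sketch below applies the standard TLBO updates to a generic continuous minimization problem. The objective, bounds, population size and iteration count are placeholder assumptions; the container-packing encoding and space-usage objective used in the paper are not reproduced.

```python
# Minimal sketch of the teacher and learner phases of TLBO on a generic
# continuous minimization problem (illustrative only; not the paper's
# container-packing formulation).
import numpy as np

def tlbo(objective, lb, ub, pop_size=20, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = rng.uniform(lb, ub, size=(pop_size, dim))   # learners (candidate solutions)
    fit = np.array([objective(x) for x in X])

    for _ in range(iters):
        # Teacher phase: move each learner toward the current best solution
        teacher = X[np.argmin(fit)]
        mean = X.mean(axis=0)
        Tf = rng.integers(1, 3)                     # teaching factor, 1 or 2
        for i in range(pop_size):
            new = np.clip(X[i] + rng.random(dim) * (teacher - Tf * mean), lb, ub)
            f_new = objective(new)
            if f_new < fit[i]:
                X[i], fit[i] = new, f_new

        # Learner phase: learn from a randomly chosen classmate
        for i in range(pop_size):
            j = rng.integers(pop_size)
            while j == i:
                j = rng.integers(pop_size)
            step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
            new = np.clip(X[i] + rng.random(dim) * step, lb, ub)
            f_new = objective(new)
            if f_new < fit[i]:
                X[i], fit[i] = new, f_new

    best = np.argmin(fit)
    return X[best], fit[best]

# Example usage on the sphere function
best_x, best_f = tlbo(lambda x: float(np.sum(x**2)),
                      lb=np.full(5, -10.0), ub=np.full(5, 10.0))
print(best_x, best_f)
```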
Upscaling grayscale images plays a central role in fields such as medicine, satellite imagery, and photography. This paper presents a technique for improving the upscaling of gray images using a new mixed wavelet generated by a tensor product. The proposed technique employs the multi-resolution analysis provided by a new mixed wavelet transform algorithm to decompose the input image into different frequency components. After processing, the low-resolution input image is effectively transformed into a higher-resolution representation by adding a zero matrix. A discrete wavelet transform (Daubechies, Haar) represented as a 2-D matrix is used, but it is mixed via a tensor product with another wavelet matrix of a different size. MATLAB R2021…
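The tensor-product mixing of wavelet matrices can be illustrated with a small example. Both factors below are Haar matrices for simplicity; the paper mixes a Daubechies-type matrix with a second wavelet matrix of a different size, and those exact matrices are not reproduced here.

```python
# Small illustration of mixing two wavelet matrices of different sizes with a
# tensor (Kronecker) product; both factors are Haar matrices for simplicity.
import numpy as np

H2 = np.array([[1,  1],
               [1, -1]]) / np.sqrt(2)      # 2x2 Haar analysis matrix

H4 = np.kron(H2, H2)                       # 4x4 Haar built from the 2x2 one
mixed = np.kron(H2, H4)                    # 8x8 mixed transform via tensor product

# Orthogonality is preserved, so the mixed matrix is still a valid transform
print(np.allclose(mixed @ mixed.T, np.eye(8)))   # True
```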
The need for an efficient method to find the most appropriate document for a particular search query has become crucial due to the exponential growth in the number of papers readily available on the web. The vector space model (VSM), a standard model in information retrieval, represents these words as vectors in space and weights them with a popular weighting method known as term frequency-inverse document frequency (TF-IDF). In this work, a method is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of representing them as vectors …
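A minimal sketch of vector-space ranking with TF-ISF-style weights is given below: the inverse sentence frequency is computed over all sentences in the collection, multiplied by each term's frequency in a document or query, and documents are ranked by cosine similarity. The tokenization, smoothing and averaging choices are illustrative assumptions rather than the paper's exact formulation.

```python
# Toy vector-space retrieval with TF-ISF weights and cosine ranking.
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def split_sentences(text):
    return [s for s in re.split(r"[.!?]+", text) if s.strip()]

def build_isf(docs):
    """Inverse sentence frequency over all sentences in the collection."""
    sents = [set(tokenize(s)) for d in docs for s in split_sentences(d)]
    n = len(sents)
    vocab = set().union(*sents)
    return {t: math.log((1 + n) / (1 + sum(1 for s in sents if t in s))) + 1.0
            for t in vocab}

def weight(text, isf):
    """Term frequency times ISF for every term in the text."""
    tf = Counter(tokenize(text))
    return {t: f * isf.get(t, 0.0) for t, f in tf.items()}

def cosine(a, b):
    dot = sum(v * b.get(t, 0.0) for t, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["Information retrieval ranks documents. Ranking relies on term weights.",
        "Wavelet transforms decompose images. Images contain frequency bands."]
isf = build_isf(docs)
query_vec = weight("term weights for ranking documents", isf)
ranked = sorted(docs, key=lambda d: cosine(query_vec, weight(d, isf)), reverse=True)
print(ranked[0])   # the retrieval-related document scores highest
```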
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to support segmentation and image feature improvement. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with a smoothing kernel. In addition, orthogonal moments (OMs) are a cruc…
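The smoothing step described above can be sketched in a few lines: a noisy one-dimensional signal is convolved with a normalized Gaussian kernel, which suppresses the high-frequency noise. The kernel width and the test signal are illustrative choices, not values from the paper.

```python
# Smoothing a noisy signal by convolution with a normalized Gaussian kernel.
import numpy as np

def gaussian_kernel(size=9, sigma=1.5):
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()                 # normalize so the signal's mean level is preserved

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
signal = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(t.size)   # noisy test signal
smoothed = np.convolve(signal, gaussian_kernel(), mode="same")           # smoothing by convolution
print(signal.std(), smoothed.std())    # the smoothed signal has lower variance
```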
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication approach to maximize…
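To make the deduplication challenge concrete, the toy sketch below shows why per-user random encryption defeats duplicate detection and how a content-derived (convergent) key restores it. This is a standard textbook construction, not the compressive-sensing scheme proposed in the paper, and the hash-based XOR keystream is only a stand-in for a real cipher such as AES.

```python
# Toy illustration: random keys break deduplication, convergent keys restore it.
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(data: bytes, key: bytes) -> bytes:
    # XOR with a hash-derived keystream; a stand-in for a real cipher
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

chunk = b"identical video chunk uploaded by two different users"

# Random per-user keys: the ciphertext hashes differ, so the cloud
# cannot detect that the chunks are duplicates.
c1 = encrypt(chunk, os.urandom(32))
c2 = encrypt(chunk, os.urandom(32))
print(hashlib.sha256(c1).hexdigest() == hashlib.sha256(c2).hexdigest())   # False

# Convergent encryption: each user derives the key from the chunk itself,
# so equal plaintexts give equal ciphertexts and hash-based dedup works.
k_user1 = hashlib.sha256(chunk).digest()
k_user2 = hashlib.sha256(chunk).digest()
print(hashlib.sha256(encrypt(chunk, k_user1)).hexdigest() ==
      hashlib.sha256(encrypt(chunk, k_user2)).hexdigest())                # True
```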
A simulation of direct current (DC) discharge plasma in COMSOL Multiphysics was used to study the uniformity of deposition on the anode from DC discharge sputtering with ring and disc cathodes; the setup was then realized experimentally to compare the measured film thickness distribution with the simulation results. Both the simulation and the experimental results show that deposition using a copper ring cathode is more uniform than with a disc cathode.
Researchers have a special interest in Markov chains as probability models with many applications in different fields. This study addresses the changes that occur in budget expenditures using statistical methods, and Markov chains are well suited to this purpose because they are reliable models for prediction. A transition matrix is built for three expenditure states (increase, decrease, stability) for one budget expenditure item (base salary) across three directorates (Baghdad, Nineveh, Diyala) of one of the ministries. The results are analyzed by applying maximum likelihood estimation and ordinary least squares methods, resulting…
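The maximum likelihood step for such a chain amounts to counting observed transitions and normalizing each row of the count matrix, as sketched below. The three states follow the study (increase, decrease, stability), while the observed sequence is a hypothetical illustration, not the actual budget data.

```python
# Maximum likelihood estimate of a three-state Markov transition matrix.
import numpy as np

states = ["increase", "decrease", "stability"]
index = {s: i for i, s in enumerate(states)}

# Hypothetical observed yearly changes in one expenditure item (illustrative)
observed = ["increase", "increase", "stability", "decrease",
            "increase", "stability", "stability", "increase"]

counts = np.zeros((3, 3))
for a, b in zip(observed[:-1], observed[1:]):
    counts[index[a], index[b]] += 1          # count each observed transition

row_sums = counts.sum(axis=1, keepdims=True)
P_mle = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(P_mle)   # each row estimates P(next state | current state) and sums to 1
```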