In this study, iron was coupled with copper to form a bimetallic compound through a biosynthetic method, which was then used as a catalyst in Fenton-like processes for removing Direct Blue 15 dye (DB15) from aqueous solution. The resultant nanoparticles were characterized by SEM, BET, EDAX, FT-IR, XRD, and zeta potential analysis. The green-synthesized iron/copper nanoparticles (G-Fe/Cu NPs) were found to be rounded and spherical in shape, with sizes ranging from 32 to 59 nm and a surface area of 4.452 m²/g. The effect of different experimental factors was studied in both batch and continuous experiments. In the batch system these factors were H2O2 concentration, G-Fe/Cu NPs dose, pH, initial DB15 concentration, and temperature. The batch results showed that 98% of 100 mg/L DB15 was degraded at the optimum H2O2 concentration, G-Fe/Cu NPs dose, pH, and temperature of 3.52 mmol/L, 0.7 g/L, 3, and 50 °C, respectively. For the continuous mode, the influences of initial DB15 concentration, feed flow rate, and G-Fe/Cu NPs bed depth were investigated using a Box-Behnken experimental design, while the pH and H2O2 concentration were fixed at the best values found in the batch experiments. The model optimization set the parameters at a flow rate of 2.134 mL/min, an initial dye concentration of 26.16 mg/L, and a catalyst bed depth of 1.42 cm. The breakthrough-curve parameters, including breakthrough time, saturation time, mass transfer zone length, bed volume, and effluent volume, were also studied.
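For reference, the 98% figure quoted above is normally computed as the dye removal (decolorization) efficiency. The expression below is the standard definition and is given only as a reading aid; the paper's own notation is not reproduced here:

$$ R\,(\%) = \frac{C_0 - C_t}{C_0} \times 100 $$

where $C_0$ is the initial DB15 concentration (here 100 mg/L) and $C_t$ is the concentration remaining after reaction time $t$.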
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with multiresolution base and thresholding techniques, and the second stage incorporates the near-lossless com...
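To make the "mathematical model plus residual" idea concrete, the following is a toy Python sketch that fits a first-degree (plane) polynomial to each image block and keeps the residual. The block size, the plane model, and the least-squares fit are illustrative assumptions only; this does not reproduce the authors' predictor, nor the multiresolution and thresholding stages.

```python
import numpy as np

def polynomial_model_residual(img, block=8):
    """Toy polynomial coding step: fit z = a0 + a1*x + a2*y to each block
    and return the model coefficients plus the residual image."""
    h, w = img.shape
    coeffs = []
    residual = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[0:block, 0:block]
    A = np.column_stack([np.ones(block * block), xs.ravel(), ys.ravel()])
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            z = img[r:r + block, c:c + block].astype(float).ravel()
            a, *_ = np.linalg.lstsq(A, z, rcond=None)      # least-squares plane fit
            model = (A @ a).reshape(block, block)
            coeffs.append(a)                                # part 1: mathematical model
            residual[r:r + block, c:c + block] = z.reshape(block, block) - model  # part 2: residual
    return np.array(coeffs), residual
```

In a real coder the coefficients and a quantized (thresholded) residual would then be entropy coded; only the modelling/residual split is shown here.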
Currently, the prominence of the automatic multi-document summarization task stems from the rapid increase of information on the Internet. Automatic document summarization technology is progressing and may offer a solution to the problem of information overload.
An automatic text summarization system faces the challenge of producing a high-quality summary. In this study, the design of a generic, sentence-extraction-based text summarization model has been redirected toward a more semantic measure that individually reflects two significant objectives, content coverage and diversity, when generating summaries from multiple documents, formulated as an explicit optimization model. The two proposed models have then been coupled and def...
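The coverage/diversity trade-off mentioned above can be illustrated with a generic greedy, MMR-style selection sketch. This is not the authors' coupled optimization model; the vector inputs (sent_vecs, doc_vec), the weight lam, and the summary length k are illustrative assumptions.

```python
import numpy as np

def greedy_summary(sent_vecs, doc_vec, k=5, lam=0.7):
    """Greedily pick k sentences, rewarding similarity to the document
    centroid (coverage) and penalizing similarity to already selected
    sentences (redundancy, i.e. the diversity objective)."""
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    selected, candidates = [], list(range(len(sent_vecs)))
    while candidates and len(selected) < k:
        def score(i):
            coverage = cos(sent_vecs[i], doc_vec)
            redundancy = max((cos(sent_vecs[i], sent_vecs[j]) for j in selected),
                             default=0.0)
            return lam * coverage - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected   # indices of the extracted sentences
```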
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low Signal-to-Noise Ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMFs) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of QMFs. The proposed system can estimate the number of sources under low SNR.
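The classical Wax-Kailath form of AIC for source enumeration works on the eigenvalues of the sample covariance matrix; a minimal sketch is given below. In the proposed system this criterion would be applied to each QMF sub-band output; the function name and the input layout (sensors x snapshots) are assumptions for illustration.

```python
import numpy as np

def estimate_num_sources_aic(snapshots):
    """Wax-Kailath AIC estimate of the number of sources from an array
    snapshot matrix of shape (p sensors, N snapshots)."""
    p, N = snapshots.shape
    R = snapshots @ snapshots.conj().T / N               # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]           # eigenvalues, descending
    lam = np.clip(lam, 1e-12, None)                      # guard against numerical negatives
    aic = np.empty(p)
    for k in range(p):                                   # hypothesised number of sources
        tail = lam[k:]                                   # the p - k smallest eigenvalues
        geo = np.exp(np.mean(np.log(tail)))              # geometric mean
        arith = np.mean(tail)                            # arithmetic mean
        aic[k] = 2 * N * (p - k) * np.log(arith / geo) + 2 * k * (2 * p - k)
    return int(np.argmin(aic))                           # k minimizing AIC = estimated sources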
A substantial concern in exchanging confidential messages over the Internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. The science of encryption can be defined as the technique of embedding data in image, audio, or video files in a manner that meets the security requirements. Steganography is a branch of data-concealment science that aims to reach a desired security level in the exchange of private, covert commercial and military data. This research offers a novel technique for steganography based on hiding data inside the clusters that result from fuzzy clustering. ...
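The clustering step can be illustrated with a minimal fuzzy c-means implementation; the embedding rule that actually hides the payload inside the resulting clusters is not reproduced here, and the parameters (number of clusters c, fuzzifier m) are illustrative assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and the membership
    matrix U (n points x c clusters) over which payload bits could later
    be distributed by an embedding rule."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)                    # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted centres
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U_new = np.zeros_like(U)
        for k in range(c):
            # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
            U_new[:, k] = 1.0 / ((d[:, k:k + 1] / d) ** (2.0 / (m - 1))).sum(axis=1)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return centres, U
```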
In low-latitude areas, with latitude angles of less than 10°, the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of water that condenses and then falls back toward the solar-still basin also increases in this case. Consequently, the yield of the solar still is significantly decreased, and the accuracy of the prediction method is affected. This reduction in the yield and in the accuracy of the prediction method is inversely proportional to the time during which the condensed water stays on the inner side of the condensing cover without being collected, because more drops will fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is...
Graphite coated electrodes (GCEs) based on molecularly imprinted polymers were fabricated for the selective potentiometric determination of Risperidone (Ris). The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized by bulk polymerization using Ris as a template, acrylic acid (AA) and acrylamide (AAm) as monomers, ethylene glycol dimethacrylate (EGDMA) as a cross-linker, and benzoyl peroxide (BPO) as an initiator. The imprinted and non-imprinted membranes were prepared using dioctyl phthalate (DOP) and dibutyl phthalate (DBP) as plasticizers in a PVC matrix. The membranes were coated on graphite electrodes. The MIP electrodes using...
In regression testing, test case prioritization (TCP) is a technique for arranging all the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that consider the history of past data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement...
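For reference, APFD is conventionally computed as 1 - (TF_1 + ... + TF_m)/(n*m) + 1/(2n), where n is the number of test cases, m the number of faults, and TF_i the position of the first test in the prioritized order that reveals fault i. The sketch below assumes every fault is detected by at least one test in the order; the data layout is an illustrative assumption.

```python
def apfd(order, faults_detected_by):
    """Average Percentage of Faults Detected for a prioritized test order.

    order              : list of test-case ids in prioritized order
    faults_detected_by : dict mapping fault id -> set of test-case ids that detect it
                         (assumes each fault is detected by at least one test in `order`)
    """
    n = len(order)
    m = len(faults_detected_by)
    position = {tc: i + 1 for i, tc in enumerate(order)}     # 1-based positions
    # TF_i: position of the first test in the order that reveals fault i
    tf_sum = sum(min(position[tc] for tc in tcs if tc in position)
                 for tcs in faults_detected_by.values())
    return 1 - tf_sum / (n * m) + 1 / (2 * n)
```

For example, apfd(["t3", "t1", "t2"], {"f1": {"t1"}, "f2": {"t2", "t3"}}) evaluates to about 0.67.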
A low-speed open-circuit wind tunnel has been designed, manufactured, and constructed at the Mechanical Engineering Department at Baghdad University - College of Engineering. The work is one of the pioneer projects adopted by the R&D Office at the Iraqi MOHESR. The present paper describes the first part of the work, namely the design calculations, simulation, and construction; it will be followed by a second part that describes the testing and calibration of the tunnel. The proposed wind tunnel has a test section with a cross-sectional area of 0.7 × 0.7 m² and a length of 1.5 m. The maximum speed is about 70 m/s with an empty test section. The contraction ratio is 8.16. Three screens are used to minimize flow disturbances in the test section.
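As a quick consistency check (an incompressible-continuity assumption, not a figure taken from the paper), the contraction ratio of 8.16 relates the settling-chamber and test-section velocities:

$$ A_1 V_1 = A_2 V_2 \;\Rightarrow\; V_1 = \frac{V_2}{CR} \approx \frac{70\ \text{m/s}}{8.16} \approx 8.6\ \text{m/s}, $$

with a test-section area $A_2 = 0.7 \times 0.7 = 0.49\ \text{m}^2$ and an inlet area $A_1 \approx 8.16 \times 0.49 \approx 4.0\ \text{m}^2$.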
A computerized theoretical study has been carried out in the field of electron optics to design an electrostatic unipotential lens. The inverse problem is an important method in the design of electrostatic lenses, in which an axial electrostatic potential distribution is suggested using a polynomial function. The paraxial-ray equation is solved to obtain the particle trajectories that satisfy the suggested potential function. In this research, a three-electrode electrostatic unipotential lens of length L = 5 mm, in accelerating and decelerating modes, was designed and operated under finite and infinite magnification conditions. The electrode shape of the electrostatic lens was then determined from the solution of Laplace's equation. The results showed low values of spherical...
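For context, the paraxial-ray equation referred to above has, for a rotationally symmetric electrostatic field with axial potential $V(z)$, the standard non-relativistic form (quoted from standard electron-optics texts rather than from the paper itself):

$$ r''(z) + \frac{V'(z)}{2V(z)}\, r'(z) + \frac{V''(z)}{4V(z)}\, r(z) = 0, $$

where $r(z)$ is the radial displacement of the trajectory from the optical axis; the suggested polynomial potential enters through $V(z)$ and its first and second derivatives.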