The exploitation of obsolete recyclable resources, including paper waste, offers the dual advantages of resource conservation and environmental protection. This study was conducted to investigate the use of paper waste to adsorb phenol, one of the harmful organic by-products deposited in the environment. The influence of the agitation method, solution pH (3-11), initial phenol concentration (30-120 ppm), adsorbent dose (0.5-2.5 g), and contact time (30-150 min) was studied. The highest phenol removal efficiency obtained was 86%, with an adsorption capacity of 5.1 mg/g, at the optimum conditions (pH 9, initial phenol concentration 30 mg/L, adsorbent dose 2 g, contact time 120 min, room temperature). The well-known Langmuir and Freundlich adsorption models were examined; the equilibrium data fitted the Freundlich model with R² = 0.9897 within the concentration range studied. The main objective of this study is to find the best mixing method and operating conditions for phenol removal by adsorption onto paper waste.
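Since only the Freundlich fit is reported above, here are the standard forms of the two isotherms for reference (q_e: equilibrium adsorption capacity, mg/g; C_e: equilibrium phenol concentration, mg/L; q_m, K_L, K_F, and n: fitted constants not given in this excerpt):

```latex
% Langmuir isotherm (monolayer adsorption on a uniform surface)
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}

% Freundlich isotherm (heterogeneous surface); the data above fit this form
q_e = K_F C_e^{1/n}
\qquad\Longleftrightarrow\qquad
\ln q_e = \ln K_F + \frac{1}{n}\ln C_e
```

The linearized (logarithmic) Freundlich form is what is typically regressed to obtain the reported R² = 0.9897.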
In this study, the optimum conditions for COD removal from petroleum refinery wastewater using a combined electrocoagulation/electro-oxidation system were attained by the Taguchi method. An L18 orthogonal array experimental design with four controllable parameters, namely NaCl concentration, current density (C.D.), pH, and electrolysis time, was employed. The chemical oxygen demand (COD) removal percentage was taken as the quality characteristic to be enhanced. The turbidity and total dissolved solids (TDS) were also estimated. The optimum levels of the studied parameters were determined by signal-to-noise (S/N) analysis and analysis of variance (ANOVA). The optimum conditions were found to be NaCl = 2.5
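Because higher COD removal is better, the Taguchi "larger-is-better" S/N ratio applies here. A minimal sketch of that calculation (the removal percentages below are placeholders, not the paper's data):

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better signal-to-noise ratio:
    S/N = -10 * log10(mean(1 / y^2)) over replicate responses y."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical COD-removal replicates (%) for one row of the L18 array
print(sn_larger_is_better([78.2, 81.5, 79.9]))
```

Computing this ratio for each of the 18 runs and averaging it per factor level is how the optimum level of each parameter is ranked.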
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set threshold from any cluster (extreme values). However, anomalies are not only cases that are unusual or far from a specific group; there is also a type of data that does not recur often yet is considered abnormal relative to the known groups. The analysis showed DBSCAN using the
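A minimal sketch of the baseline behavior described above, using scikit-learn's DBSCAN (illustrative parameters and synthetic data, not the paper's CFG extension):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(42)
# Two dense clusters plus a few scattered outliers
X = np.vstack([
    rng.normal(0.0, 0.3, size=(100, 2)),
    rng.normal(5.0, 0.3, size=(100, 2)),
    rng.uniform(-3, 8, size=(5, 2)),
])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]          # DBSCAN marks noise points with label -1
print(f"{len(anomalies)} points flagged as noise/anomalies")
```

Points labeled -1 are exactly the "far from any cluster" anomalies; the limitation the abstract raises is that anomalous points hiding inside or near a known group are not caught this way.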
A prosthesis is an artificial device that replaces a part of the human body that is absent because of disease, injury, or deformity. Current research activity in Iraq is drawing interest to the upper-limb field because of the growth in the number of amputees. Thus, it becomes necessary to expand research on this subject to help reduce patients' suffering. This paper describes the design and development of a prosthesis for persons with hand amputations. The design consists of a hand with five fingers driven by a gearbox mechanism. The artificial hand has 5 degrees of freedom and works based on the principle of
This work implements a face recognition system in two stages: a feature extraction stage and a classification stage. The feature extraction stage consists of Self-Organizing Maps (SOM) in a hierarchical format, in conjunction with Gabor filters and local image sampling. Different types of SOMs were used, and a comparison between their results is given.
The second stage is the classification stage and consists of a self-organizing map neural network; the goal of this stage is to find the image most similar to the input image. The proposed algorithm was implemented using C++ packages, and the work is a successful classifier for a face database consisting of 20
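The pipeline, Gabor filtering of locally sampled patches followed by SOM mapping, can be sketched in Python with scikit-image and the MiniSom library (the paper's own implementation is in C++; the filter frequency, map size, and stand-in image here are illustrative assumptions):

```python
import numpy as np
from skimage.filters import gabor
from skimage.data import camera          # stand-in image, not a face database
from minisom import MiniSom

image = camera() / 255.0
real, _ = gabor(image, frequency=0.3)    # real part of the Gabor response

# Local image sampling: flatten small patches into feature vectors
patches = np.array([
    real[i:i + 8, j:j + 8].ravel()
    for i in range(0, real.shape[0] - 8, 8)
    for j in range(0, real.shape[1] - 8, 8)
])

# Train a small SOM; each patch maps to its best-matching unit (BMU)
som = MiniSom(10, 10, patches.shape[1], sigma=1.0, learning_rate=0.5)
som.train_random(patches, 1000)
bmu = som.winner(patches[0])             # grid coordinates of the winning node
print("BMU of first patch:", bmu)
```

At classification time, matching would compare the BMU pattern of a probe face against the patterns stored for the enrolled images.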
Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of human eye
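A minimal Keras sketch of a CAE of this kind (layer sizes, input resolution, and latent depth are illustrative assumptions, not the paper's architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cae(input_shape=(128, 128, 1)):
    inp = layers.Input(shape=input_shape)
    # Encoder: strided convolutions shrink the image into a compact code
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(x)
    code = layers.Conv2D(8, 3, padding="same", activation="relu")(x)
    # Decoder: transposed convolutions reconstruct the original resolution
    x = layers.Conv2DTranspose(64, 3, strides=2, padding="same", activation="relu")(code)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2D(1, 3, padding="same", activation="sigmoid")(x)
    return models.Model(inp, out)

cae = build_cae()
cae.compile(optimizer="adam", loss="mse")
cae.summary()
```

Compression comes from storing (and entropy-coding) the small "code" tensor instead of the full image; the decoder reproduces an approximation on demand.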
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the variable selection process, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
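One common way a Gibbs sampler drives variable selection is the spike-and-slab scheme (George and McCulloch's SSVS); the excerpt does not give the paper's exact model, so the following is a minimal sketch under standard conjugate assumptions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def ssvs_gibbs(X, y, n_iter=2000, tau0=0.01, tau1=10.0, pi=0.5):
    """Spike-and-slab Gibbs sampler: beta_j ~ N(0, tau1^2) if included
    (gamma_j = 1), else N(0, tau0^2); returns posterior inclusion rates."""
    n, p = X.shape
    gamma = np.ones(p, dtype=int)
    sigma2, keep = 1.0, np.zeros(p)
    XtX, Xty = X.T @ X, X.T @ y
    for it in range(n_iter):
        # 1) beta | gamma, sigma2  ~  multivariate normal
        prior_sd = np.where(gamma == 1, tau1, tau0)
        cov = np.linalg.inv(XtX / sigma2 + np.diag(1.0 / prior_sd**2))
        beta = rng.multivariate_normal(cov @ (Xty / sigma2), cov)
        # 2) gamma_j | beta  ~  Bernoulli(slab density vs. spike density)
        w1 = pi * norm.pdf(beta, 0, tau1)
        w0 = (1 - pi) * norm.pdf(beta, 0, tau0)
        gamma = (rng.random(p) < w1 / (w1 + w0)).astype(int)
        # 3) sigma2 | beta  ~  inverse-gamma (sampled via its gamma reciprocal)
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / (rss + 1e-9))
        if it >= n_iter // 2:              # accumulate after burn-in
            keep += gamma
    return keep / (n_iter - n_iter // 2)
```

Variables whose posterior inclusion rate stays near 1 across the chain are the ones the sampler "selects".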
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement for managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
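A Hadoop Streaming map-reduce job of this kind boils down to two small scripts; the sketch below assumes a hypothetical EEG input format of channel,timestamp,amplitude CSV lines (the paper's actual record layout is not shown in the excerpt):

```python
#!/usr/bin/env python3
# mapper.py -- emit (channel, amplitude) pairs, one per EEG sample
import sys

for line in sys.stdin:
    parts = line.strip().split(",")
    if len(parts) == 3:                       # channel, timestamp, amplitude
        print(f"{parts[0]}\t{parts[2]}")
```

```python
#!/usr/bin/env python3
# reducer.py -- average amplitude per channel (input arrives sorted by key)
import sys

channel, total, count = None, 0.0, 0
for line in sys.stdin:
    key, value = line.strip().split("\t")
    if key != channel and channel is not None:
        print(f"{channel}\t{total / count}")
        total, count = 0.0, 0
    channel = key
    total += float(value)
    count += 1
if channel is not None:
    print(f"{channel}\t{total / count}")
```

These would typically be submitted with Hadoop's streaming jar, e.g. `hadoop jar hadoop-streaming.jar -input eeg_data -output eeg_out -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py` (paths illustrative).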
An optical fiber chemical sensor based on surface plasmon resonance (SPR) for sensing and measuring the refractive index and concentration of acetic acid was designed and implemented in this work. Optical-grade plastic optical fiber with a diameter of 1000 μm was used, with a core diameter of 980 μm and a cladding of 20 μm. The sensor was fabricated by embedding a small part (10 mm) of the middle of the fiber in a resin block and then polishing it; afterwards, a gold layer of about 40 nm thickness was deposited, and the acetic acid was placed on the sensing probe.
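For context, the standard SPR coupling condition that makes such a probe sensitive to the analyte's refractive index (a textbook relation, not stated in the excerpt) is:

```latex
% Resonance: the guided ray's evanescent wave matches the surface-plasmon wavevector
\frac{2\pi}{\lambda}\, n_{co} \sin\theta
  \;=\;
\operatorname{Re}\!\left[\frac{2\pi}{\lambda}
  \sqrt{\frac{\varepsilon_m\, n_s^{2}}{\varepsilon_m + n_s^{2}}}\right]
```

Here n_co is the fiber-core index, θ the ray angle at the core/gold interface, ε_m the (complex) permittivity of the gold film, and n_s the refractive index of the sensed medium; as the acetic acid concentration changes n_s, the resonance shifts, which is what the sensor reads out.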