Used automobile oils were first filtered to remove solid material and then dehydrated by vacuum distillation under moderate pressure to remove water, gasoline, and light components. The dehydrated waste oil was then subjected to solvent extraction. Two solvents, n-butanol and n-hexane, were used to recover the base oil from the used automobile oil so that this valuable fraction can be reused.
The base oil recovered with n-butanol showed an 88.67% reduction in carbon residue, a 75.93% reduction in ash content, 93.73% oil recovery, 95% solvent recovery, and a viscosity index of 100.62, at a 5:1 solvent-to-used-oil ratio and an extraction temperature of 40 °C. With n-hexane, the results were a 60.25% reduction in carbon residue, a 76.54% reduction in ash content, 89.06% oil recovery, 94.78% solvent recovery, and a viscosity index of 100.3, at a 6:1 solvent-to-used-oil ratio and an extraction temperature of 50 °C.
Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing its transmission speed over the Internet. It has been an important research topic for several decades, and with the great successes recently achieved by deep learning in many areas of image processing, its use in image compression is growing steadily. Deep neural networks have likewise achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye
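The core idea of a convolutional autoencoder for compression is that strided convolutions in the encoder shrink the spatial resolution to a compact latent map, which the decoder upsamples back to the original size. The following is a minimal NumPy sketch of that shape pipeline only: the 3×3 kernels are random (not learned), there is a single channel, and the real CAE in the paper would also include training and entropy coding. All names here are illustrative.

```python
import numpy as np

def conv2d_stride2(x, k):
    """Zero-padded 3x3 convolution with stride 2: halves each spatial dim."""
    h, w = x.shape
    xp = np.pad(x, 1)                      # pad so windows stay in bounds
    out = np.empty((h // 2, w // 2))
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            out[i // 2, j // 2] = np.sum(xp[i:i + 3, j:j + 3] * k)
    return out

def upsample2(x):
    """Nearest-neighbour 2x upsampling (decoder side)."""
    return np.kron(x, np.ones((2, 2)))

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))        # toy single-channel image
k1, k2 = rng.standard_normal((2, 3, 3))    # random (untrained) kernels

z = conv2d_stride2(conv2d_stride2(img, k1), k2)   # encoder: 64x64 -> 16x16 latent
recon = upsample2(upsample2(z))                   # decoder: 16x16 -> 64x64
```

With two stride-2 stages the latent map holds 1/16 of the original pixels, which is where the compression gain comes from before any quantization or entropy coding.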
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the variable-selection process, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
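The abstract does not give the paper's exact model, but the general pattern it describes — derive the posterior conditionals, then alternate draws with a Gibbs sampler on simulated data — can be sketched for a standard Bayesian linear model with a ridge-type Gaussian prior. This is a generic illustration, not the paper's method; the prior, hyperparameters, and dataset below are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated dataset: only the first two of five predictors matter.
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Conjugate model (assumed for illustration):
#   beta | sigma2 ~ N(0, sigma2 * tau2 * I),  sigma2 ~ Inv-Gamma(a0, b0)
tau2, a0, b0 = 100.0, 2.0, 1.0
XtX, Xty = X.T @ X, X.T @ y
sigma2 = 1.0
draws = []
for it in range(2000):
    # Draw beta | sigma2, y from its multivariate-normal conditional.
    V = np.linalg.inv(XtX + np.eye(p) / tau2)
    m = V @ Xty
    beta = rng.multivariate_normal(m, sigma2 * V)
    # Draw sigma2 | beta, y from its inverse-gamma conditional.
    resid = y - X @ beta
    a = a0 + (n + p) / 2
    b = b0 + 0.5 * (resid @ resid + beta @ beta / tau2)
    sigma2 = 1.0 / rng.gamma(a, 1.0 / b)
    if it >= 500:                       # discard burn-in
        draws.append(beta)

beta_hat = np.mean(draws, axis=0)       # posterior-mean coefficient estimates
```

Variable selection then amounts to inspecting the posterior draws: coefficients whose posterior mass concentrates near zero (here the last three) are candidates for exclusion.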
A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique on big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
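The Map-Reduce pattern the abstract relies on — a map phase emitting key/value pairs, a shuffle that groups values by key, and a reduce phase aggregating each group — can be sketched in-process in a few lines. The EEG-style records and the per-channel mean below are illustrative stand-ins, not the paper's actual Hadoop job.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process MapReduce: map, shuffle/group by key, reduce."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):   # map phase emits (key, value)
            groups[key].append(value)       # shuffle: group values by key
    return {key: reducer(key, values) for key, values in groups.items()}

# Toy EEG-like records (channel, sample); compute a per-channel mean.
eeg = [("Fp1", 1.0), ("Fp2", 3.0), ("Fp1", 3.0), ("Fp2", 5.0)]
means = map_reduce(eeg,
                   mapper=lambda r: [(r[0], r[1])],
                   reducer=lambda k, vs: sum(vs) / len(vs))
# means == {"Fp1": 2.0, "Fp2": 4.0}
```

On Hadoop the same mapper/reducer logic runs in parallel across cluster nodes, which is where the reported response-time reduction comes from.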
Molecular barcoding has been widely recognized as a powerful tool for identifying organisms over the past decade. The aim of this study is to use a molecular approach to identify diatoms from environmental DNA. The diatom specimens were taken from the Tigris River. Extraction of environmental DNA (eDNA) and sequence analysis using the Next Generation Sequencing (NGS) method showed that the most abundant epipelic diatom taxa included Achnanthidium minutissimum (Kützing) Czarnecki, 1994 (21.1%), Cocconeis placentula Ehrenberg, 1838 (21.3%) and Nitzschia palea (Kützing) W. Smith, 1856 (16.3%).
Five species of diatoms: Achnanthidium
Inverse kinematics is a very complex problem for redundant robot manipulators. This paper presents a solution of the inverse kinematics for one redundant manipulator (a three-link robot) by combining two intelligent algorithms: a Genetic Algorithm (GA) and a Neural Network (NN). The inputs are the position and orientation of the three-link robot, and they are fed to a Back-Propagation Neural Network (BPNN). The weights of the BPNN are optimized using a continuous GA, and the Mean Square Error (MSE) between the estimated and desired joint angles is computed. A fitness function for the GA is proposed in this paper. Sine-wave and circular desired trajectories for the three-link robot end effector are simulated b
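The hybrid scheme the abstract describes — a neural network mapping end-effector pose to joint angles, with its weights searched by a continuous GA instead of backpropagation — can be sketched compactly. To stay short, the sketch below uses a two-link planar arm rather than the paper's three-link robot, a simple elitism-plus-Gaussian-mutation GA rather than the paper's exact operators, and a fitness equal to the Cartesian MSE of the forward-kinematics output; all of these are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 1.0, 1.0                           # illustrative link lengths

def fk(th):
    """Forward kinematics of a 2-link planar arm; th has shape (N, 2)."""
    return np.stack([L1 * np.cos(th[:, 0]) + L2 * np.cos(th[:, 0] + th[:, 1]),
                     L1 * np.sin(th[:, 0]) + L2 * np.sin(th[:, 0] + th[:, 1])],
                    axis=1)

# Reachable training targets generated from random joint angles.
th_true = rng.uniform(0, np.pi / 2, size=(50, 2))
targets = fk(th_true)

H = 8                                       # hidden units
sizes = [(2, H), (H,), (H, 2), (2,)]        # W1, b1, W2, b2
dim = sum(int(np.prod(s)) for s in sizes)   # total weight count

def unpack(w):
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def nn(w, x):
    """One-hidden-layer network: pose (x, y) -> predicted joint angles."""
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(x @ W1 + b1) @ W2 + b2

def mse(w):
    """Fitness: Cartesian error of the poses reached by predicted angles."""
    return np.mean((fk(nn(w, targets)) - targets) ** 2)

# Continuous GA: keep the 10 best weight vectors, refill with mutated copies.
pop = rng.standard_normal((40, dim))
start_err = min(mse(w) for w in pop)
for gen in range(200):
    fit = np.array([mse(w) for w in pop])
    elite = pop[np.argsort(fit)[:10]]
    children = elite[rng.integers(0, 10, 30)] + 0.1 * rng.standard_normal((30, dim))
    pop = np.vstack([elite, children])

best = pop[np.argmin([mse(w) for w in pop])]
```

Because the elites are carried over unchanged each generation, the best fitness is monotonically non-increasing, which is the property that makes this kind of GA a safe drop-in replacement for gradient training of the BPNN weights.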
In this work, the elemental constituents of teeth samples from human smokers and nonsmokers were analyzed by the laser-induced breakdown spectroscopy (LIBS) method. Many elements were detected in the healthy teeth samples, the most important being Ca, P, Mg, Fe, Pb and Na. Several differences were found between female and male teeth in Ca, P, Mg, Na and Pb content. The concentrations of most toxic elements were significantly higher in the smoker group. The maximum concentrations of toxic elements such as Pb, Cd and Co were found in older males, above 60 years of age. It was also found that the minimum concentrations of trace elements such as Ca, P and Na occur in this age group. From these results it is clear that the
An optical-fiber chemical sensor based on surface plasmon resonance for sensing and measuring the refractive index and concentration of acetic acid was designed and implemented in this work. Optical-grade plastic optical fiber with a diameter of 1000 μm was used, with a 980 μm core and 20 μm cladding. The sensor was fabricated by embedding a small section (10 mm) in the middle of the fiber in a resin block and then polishing it; a gold layer about 40 nm thick was then deposited on the polished region, and the acetic acid was placed on the sensing probe.