Used automobile oils were first filtered to remove solid material and then dehydrated by vacuum distillation under moderate pressure to remove water, gasoline, and light components; the dehydrated waste oil was then subjected to solvent extraction. Two solvents, n-butanol and n-hexane, were used to extract base oil from the used automobile oil so that the expensive base oil can be reused.
The base oil recovered with the n-butanol solvent showed an 88.67% reduction in carbon residue, a 75.93% reduction in ash content, 93.73% oil recovery, 95% solvent recovery, and a viscosity index of 100.62 at a 5:1 solvent-to-used-oil ratio and an extraction temperature of 40 °C, while the n-hexane solvent gave a 60.25% reduction in carbon residue, a 76.54% reduction in ash content, 89.06% oil recovery, 94.78% solvent recovery, and a viscosity index of 100.3 at a 6:1 solvent-to-used-oil ratio and an extraction temperature of 50 °C.
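For clarity, the percentage figures quoted above follow the usual definitions; the expressions below are a generic illustration, with symbols of our own choosing rather than the paper's:

\[
\text{Reduction in carbon residue (\%)} = \frac{CR_{\text{used oil}} - CR_{\text{recovered oil}}}{CR_{\text{used oil}}} \times 100,
\qquad
\text{Oil recovery (\%)} = \frac{m_{\text{recovered base oil}}}{m_{\text{dehydrated used oil}}} \times 100.
\]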
As one type of resistance furnace, the electrical tube furnace (ETF) typically experiences input noise, measurement noise, system uncertainties, unmodeled dynamics, and external disturbances, which significantly degrade its temperature control performance. To provide precise and robust temperature tracking for the ETF, a robust composite control (RCC) method is proposed in this paper. The overall RCC method consists of four elements. First, the mathematical model of the ETF system is derived. Second, a state feedback control (SFC) law is constructed. Third, a novel disturbance observer (DO) with a single observer parameter is designed to estimate the lumped disturbance. Fourth, the stability of the closed-loop system including controller
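The abstract stops before giving the control law itself; as a hedged illustration only, a composite controller of this kind for an assumed first-order thermal model (our simplification, not necessarily the paper's ETF model) can be written as

\[
\dot{T} = -aT + bu + d, \qquad u = -k\,(T - T_{\mathrm{ref}}) - \frac{\hat{d}}{b},
\]

with a one-parameter disturbance observer

\[
\hat{d} = z + \lambda T, \qquad \dot{z} = -\lambda\left(-aT + bu + \hat{d}\right),
\]

so that the estimation error \(e = d - \hat{d}\) obeys \(\dot{e} \approx -\lambda e\) when the lumped disturbance varies slowly, i.e., the single observer gain \(\lambda\) sets the estimation bandwidth.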
Hyperpigmentation is an increase in the natural color of the skin. The purpose of this study is to evaluate the efficacy and safety of the Q-switched Nd:YAG (1064 & 532 nm) laser in the treatment of skin hyperpigmentation. The study was carried out in the research clinic of the Institute of Laser for Postgraduate Studies, University of Baghdad, from October 2008 to the end of January 2009. After clinical assessment of the skin hyperpigmentation color, twenty-six patients were grouped according to their lesions: eight patients with freckles, seven patients with melasma, and four patients with tattoos. The tattoo cases were subdivided into two amateur tattoos, one professional tattoo, and one traumatic tattoo. Four patients with post-inflammatory hyperpigment
With its rapid spread, the coronavirus infection shocked the world and had a huge effect on the lives of billions of people. The challenge is to find a safe method to diagnose the infection with fewer casualties. X-ray images have been shown to be an important means of identifying, quantifying, and monitoring the disease, and deep learning algorithms can help analyze potentially huge numbers of X-ray examinations. This research developed a retrospective multi-test analysis system to detect suspected COVID-19 cases and used chest X-ray features to assess the progress of the illness in each patient, producing a "corona score"; the results were satisfactory compared with the benchmarked techniques.
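The abstract does not specify the network architecture or the scoring rule. A minimal sketch of how a patient-level "corona score" could be aggregated from per-image CNN probabilities is shown below; the backbone, preprocessing, and averaging rule are all assumptions for illustration, not the paper's method.

```python
# Illustrative sketch only: assumes a binary CNN classifier whose per-image
# probabilities over a patient's serial chest X-rays are averaged into a score.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=None)          # hypothetical backbone choice
model.fc = nn.Linear(model.fc.in_features, 1)  # single logit: COVID-suspicious vs. not
model.eval()

preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # X-rays are single-channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def corona_score(image_paths):
    """Average per-image probability over a patient's X-ray series (assumed rule)."""
    probs = []
    with torch.no_grad():
        for path in image_paths:
            x = preprocess(Image.open(path)).unsqueeze(0)
            probs.append(torch.sigmoid(model(x)).item())
    return sum(probs) / len(probs)
```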
The purpose of this paper is to discriminate between the poems of individual poets based on the characteristics and attributes of the Arabic letters. The Arabic letters were grouped into four categories, the letter frequencies were tabulated in a multidimensional contingency table in which each dimension has two or more levels, and the contingency coefficient was then calculated.
The sample consists of six poets from different historical eras, with five poems per poet. The method was implemented in MATLAB; its efficiency is 53% for the whole sample and between 90% and 95% for each poet's own poems.
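As a minimal sketch of the contingency-coefficient step, assuming a two-dimensional table of letter-category frequencies per poem (the paper's exact table layout, categories, and MATLAB code are not given here):

```python
# Pearson's contingency coefficient C = sqrt(chi2 / (chi2 + n)) from a frequency table.
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical frequencies: rows = letter categories, columns = poems of one poet
table = np.array([
    [120,  95, 110, 101,  98],
    [ 60,  72,  55,  66,  70],
    [ 30,  25,  41,  28,  33],
    [ 15,  18,  12,  20,  17],
])

chi2, p, dof, expected = chi2_contingency(table)
n = table.sum()
contingency_coefficient = np.sqrt(chi2 / (chi2 + n))
print(f"chi2 = {chi2:.2f}, C = {contingency_coefficient:.3f}")
```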
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, used in this new approach, can make searching both efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), based on XML Schema technologies, neural network concepts, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform-information system.
In this study, a new technique is considered for solving linear fractional Volterra-Fredholm integro-differential equations (LFVFIDEs) with the fractional derivative defined in the Caputo sense. The method is based on three types of Lagrange polynomials (LPs): the original Lagrange polynomial (OLP), the barycentric Lagrange polynomial (BLP), and the modified Lagrange polynomial (MLP). A general algorithm is suggested, and examples are included to assess the effectiveness and implementation of these types. Also, as a special case, a fractional differential equation is solved to evaluate the validity of the proposed method. Finally, a comparison between the proposed method and other methods is presented to demonstrate the effectiveness of the proposed method.
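For reference, the standard barycentric form of the Lagrange interpolant on nodes \(x_0,\dots,x_n\) (a textbook identity, not the paper's specific discretization of the fractional operators) is

\[
p(x) = \frac{\displaystyle\sum_{j=0}^{n}\frac{w_j}{x-x_j}\, f(x_j)}{\displaystyle\sum_{j=0}^{n}\frac{w_j}{x-x_j}},
\qquad
w_j = \prod_{k \neq j}\frac{1}{x_j - x_k},
\]

which evaluates the same polynomial as the original Lagrange form but is numerically more stable and cheaper to update when nodes are added.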
Meloxicam (MLX) is a non-steroidal anti-inflammatory drug that is poorly water-soluble and highly permeable, and the rate of its oral absorption is often controlled by its dissolution rate in the gastrointestinal tract. Solid dispersion (SD) is an effective technique for enhancing the solubility and dissolution rate of such drugs.
The present study aims to enhance the solubility and dissolution rate of MLX by the SD technique, prepared by the solvent evaporation method, using sodium alginate (SA), hyaluronic acid (HA), collagen, and xyloglucan (XG) as gastro-protective hydrophilic natural polymers.
Twelve formulas were prepared at different drug:polymer ratios and evaluated for their percentage yield, drug content, water solubility
Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across the different workloads was carried out. Results from the experiment
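The abstract does not give the cost model used to compare the metaheuristics. As a hedged sketch only, a simple evaluation function over which GA, PSO, ACO, FA, or SA could search might combine finish time (makespan) with deadline misses as follows; the task, node, and penalty model here is assumed for illustration:

```python
# Illustrative scheduling fitness: tasks have a length and a deadline, nodes a speed,
# and a schedule maps each task index to a node index (hypothetical cost model).
from dataclasses import dataclass
import random

@dataclass
class Task:
    length: float    # work units
    deadline: float  # seconds

def evaluate(schedule, tasks, node_speeds, miss_penalty=100.0):
    """Return (makespan, missed_deadlines, fitness) for a task-to-node assignment."""
    node_time = [0.0] * len(node_speeds)
    missed = 0
    for task_idx, node_idx in enumerate(schedule):
        node_time[node_idx] += tasks[task_idx].length / node_speeds[node_idx]
        if node_time[node_idx] > tasks[task_idx].deadline:
            missed += 1
    makespan = max(node_time)
    return makespan, missed, makespan + miss_penalty * missed

# Example: score one random schedule of 20 tasks on 4 nodes
tasks = [Task(random.uniform(1, 10), random.uniform(5, 40)) for _ in range(20)]
node_speeds = [1.0, 1.5, 2.0, 0.8]
schedule = [random.randrange(len(node_speeds)) for _ in tasks]
print(evaluate(schedule, tasks, node_speeds))
```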