Removal of Vanadium and Nickel Ions from Iraqi Atmospheric Residue by Using Solvent Extraction Method

Iraqi crude atmospheric residue supplied by the al-Dura refinery was treated to remove metal contaminants by solvent extraction, using three hydrocarbon solvents (n-hexane, n-heptane, and light naphtha) at various concentrations. All three solvents were found to be effective in removing oil-soluble metals from the heavy atmospheric residual fraction. The variables studied were the solvent/oil ratio (4/1, 8/1, 10/1, 12/1, and 15/1), the mixing time (15, 30, 60, 90, and 120 min), and the temperature (30, 45, 60, and 90 °C). The metal removal percentage was found to depend on the asphaltene yield, and the solvent/oil ratio had an important effect on the amount of metal removed. Raising the temperature from 30 to 90 °C increased the amount of metal ions precipitated: the highest Ni precipitated was 79.23 ppm using heptane at 90 °C, while for V the highest value was 64.51 ppm, also using heptane at 90 °C. With increasing asphalt yield, the removal of metals became more selective. Among the solvents used, hexane at 150 ml of solvent gave the highest Ni precipitated (76 ppm) and showed the most promising results. Increasing the mixing time also increased metal removal: the highest Ni precipitated was 78 ppm using heptane at 120 min, while for V the highest value was 67 ppm using either heptane or light naphtha after 120 min.

Publication Date
Tue Aug 10 2021
Journal Name
Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On Kmean Clustering

Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing the speed of its transmission over the Internet. It has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, deep learning is being used increasingly in the field of image compression. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a deep-learning Convolutional AutoEncoder (CAE), inspired by the diversity of human eye

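The convolutional architecture itself is not detailed in this excerpt. As an illustrative sketch only (not the authors' CAE), the following minimal linear autoencoder in NumPy shows the encode-compress-decode structure and a training loop; the 64-to-16 dimensions, learning rate, and synthetic data are all assumptions.

```python
import numpy as np

# Minimal linear autoencoder sketch (stand-in for a convolutional autoencoder):
# dimensions, learning rate, and data are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.random((100, 64))              # 100 flattened 8x8 "image patches"

d, k = 64, 16                          # input dim, compressed (latent) dim
W_enc = rng.normal(0.0, 0.1, (d, k))   # encoder weights
W_dec = rng.normal(0.0, 0.1, (k, d))   # decoder weights
lr = 1.0

def recon_loss(X, W_enc, W_dec):
    """Mean squared reconstruction error over all entries."""
    return float(((X @ W_enc @ W_dec - X) ** 2).mean())

loss_before = recon_loss(X, W_enc, W_dec)
for _ in range(500):                   # plain gradient descent
    Z = X @ W_enc                      # encode: 64 -> 16 (compression)
    R = Z @ W_dec                      # decode: 16 -> 64 (reconstruction)
    E = 2.0 * (R - X) / X.size         # d(loss)/dR for the mean-squared loss
    g_dec = Z.T @ E
    g_enc = X.T @ (E @ W_dec.T)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc
loss_after = recon_loss(X, W_enc, W_dec)
```

A trained encoder of this kind stores only the 16-dimensional latent code per patch, which is where the compression comes from; a CAE replaces the matrix products with convolutions.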
Publication Date
Mon Aug 01 2016
Journal Name
Journal Of Engineering
Exploring the Factors Affecting the Elemental Cost Estimation with Relationship Analysis Using AHP

Cost estimation is considered one of the important tasks in construction project management. Precise estimation of the construction cost affects the success and quality of a construction project. Elemental estimation is a very important stage for the project team because it represents one of the key project elements; it helps formulate the basis for strategies and execution plans for construction and engineering. Elemental estimation, performed at the early stage, estimates construction costs from minimal project details, so it gives an indication for the initial design stage of a project. This paper studies the factors that affect elemental cost estimation as well as the rela

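The AHP step mentioned in the title typically derives factor weights from a pairwise-comparison matrix. As a sketch with made-up judgement values (the paper's actual factors and matrix are not given here), the geometric-mean method and Saaty's consistency check look like this:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix (geometric-mean method)."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
    return g / g.sum()

def consistency_ratio(A, w, random_index=0.58):    # RI = 0.58 for a 3x3 matrix
    """Saaty's consistency ratio; CR < 0.10 is conventionally acceptable."""
    n = A.shape[0]
    lam_max = float(np.mean((A @ w) / w))          # principal eigenvalue estimate
    ci = (lam_max - n) / (n - 1)
    return ci / random_index

# Hypothetical judgements for three cost-estimation factors
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
cr = consistency_ratio(A, w)
```

The resulting vector `w` ranks the factors by relative influence; a CR above 0.10 would signal that the pairwise judgements should be revised.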
Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points left outside any cluster are considered noise or anomalies. DBSCAN can thus detect abnormal points that lie farther than a set threshold from any dense group (extreme values). However, not all anomalies are of this kind, abnormal, unusual, or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed DBSCAN using the

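The graph-based conversion is not detailed in this excerpt, but the underlying idea, that DBSCAN labels points outside any dense cluster as noise and hence as anomalies, can be sketched with a plain implementation. The Euclidean data, `eps`, and `min_pts` below are made-up illustrations:

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Plain DBSCAN; label -1 marks noise points, i.e. anomalies."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue                       # not an unvisited core point
        queue = [i]                        # grow a new cluster from core point i
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # join the current cluster
            if visited[j]:
                continue
            visited[j] = True
            if len(neighbors[j]) >= min_pts:
                queue.extend(neighbors[j].tolist())  # j is core: keep expanding
        cluster += 1
    return labels

# Two dense groups plus one isolated point (the anomaly)
X = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5], [0.5, 0.5], [0.2, 0.2],
              [10.0, 10.0], [10.5, 10.0], [10.0, 10.5], [10.5, 10.5], [10.2, 10.2],
              [50.0, 50.0]])
labels = dbscan(X, eps=1.5, min_pts=3)
```

The isolated last point never reaches the core-point density, so it keeps label -1 and is reported as an anomaly.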
Publication Date
Sat Jun 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Probabilistic Model building using the Transformation Entropy for the Burr type –xii Distribution

Entropy, defined as an uncertainty measure, has been transformed by using the cumulative distribution function and the reliability function of the Burr type-XII distribution. For data that suffer from volatility, a model is built for the probability distribution of every failure in a sample after the constraints of a probability distribution are satisfied. A formula for the probability distribution of the new entropy transform has been derived by applying it to the continuous Burr type-XII distribution; the new function was tested and found to satisfy the conditions of a probability function. The mean and the cumulative distribution function have been derived in order to be adopted in generating data for the purpose of running the simulation

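The abstract's entropy transform is not reproduced here, but the ingredients it names, the Burr type-XII CDF and reliability function, are standard. A small numerical sketch (the shape parameters c and k are assumed values, not the paper's) checks that the density integrates to one and computes a differential entropy:

```python
import numpy as np

c, k = 2.0, 3.0                     # assumed Burr XII shape parameters

def burr_pdf(x):  return c * k * x**(c - 1) * (1 + x**c) ** (-k - 1)
def burr_cdf(x):  return 1.0 - (1 + x**c) ** (-k)       # F(x)
def burr_rel(x):  return (1 + x**c) ** (-k)             # reliability R(x) = 1 - F(x)

def trapz(y, x):
    """Trapezoidal rule, kept explicit to avoid NumPy version differences."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

x = np.linspace(1e-9, 50.0, 200001)
f = burr_pdf(x)
area = trapz(f, x)                                       # ~1: a valid density
entropy = -trapz(f * np.log(np.maximum(f, 1e-300)), x)   # differential entropy
```

The reliability function is simply the complement of the CDF, which is the identity the abstract's transformation relies on.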
Publication Date
Fri Jan 01 2016
Journal Name
Machine Learning And Data Mining In Pattern Recognition
A New Strategy for Case-Based Reasoning Retrieval Using Classification Based on Association

Publication Date
Mon Sep 01 2014
Journal Name
Al-khwarizmi Engineering Journal
Trajectory Tracking Control for a Wheeled Mobile Robot Using Fractional Order PIaDb Controller

Nowadays, Wheeled Mobile Robots (WMRs) have found many applications in industry, transportation, inspection, and other fields. Therefore, trajectory tracking control of nonholonomic wheeled mobile robots is an important problem. This work focuses on the application of a model-based Fractional Order PIaDb (FOPID) controller to the trajectory tracking problem. The control algorithm is based on the errors in the postures of the mobile robot, which are fed to the FOPID controller to generate correction signals that are converted into a torque for each driven wheel; by means of the dynamics model of the mobile robot, these torques are used to compute the linear and angular speeds needed to reach the desired pose. In this work a dynamics model of

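A fractional-order PID combines the usual proportional term with an integral of order λ and a derivative of order μ. One common discretization, shown here as a sketch rather than the paper's implementation, uses Grünwald-Letnikov binomial weights over the error history; gains, orders, and step size below are assumptions:

```python
import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov binomial weights w_j for fractional order alpha."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for j in range(1, n + 1):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    return w

def fopid(err, h, Kp, Ki, Kd, lam, mu):
    """u_n = Kp*e_n + Ki*D^(-lam)e + Kd*D^(mu)e, with GL approximations.

    err: error history [e_0, ..., e_n]; h: sampling step.
    """
    n = len(err) - 1
    e_rev = np.asarray(err)[::-1]                          # e_n, e_{n-1}, ..., e_0
    integ = h**lam * np.dot(gl_weights(-lam, n), e_rev)    # fractional integral
    deriv = h**(-mu) * np.dot(gl_weights(mu, n), e_rev)    # fractional derivative
    return Kp * err[-1] + Ki * integ + Kd * deriv

# Sanity check: lam = mu = 1 reduces to an ordinary PID step
u = fopid([1.0] * 11, 0.1, Kp=1.0, Ki=1.0, Kd=1.0, lam=1.0, mu=1.0)
```

With λ = μ = 1 the weights collapse to a Riemann sum and a first difference, recovering the classical PID; non-integer orders give the extra tuning freedom that motivates FOPID control.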
Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as a collection of data that is organized and distributed in a way that allows the client to access the stored data in a simple and convenient manner. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r

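The Map-Reduce pattern itself is simple to sketch in plain Python: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The "EEG" records, channel names, and per-channel mean below are toy illustrations, not the paper's actual pipeline:

```python
from collections import defaultdict

# Toy "EEG" records: (channel, sample_value) pairs
records = [("C3", 0.5), ("C4", 0.2), ("C3", 0.7), ("C4", 0.4), ("C3", 0.6)]

def map_phase(rec):
    ch, v = rec
    return (ch, (v, 1))              # emit partial (sum, count) per record

def shuffle(pairs):
    """Group mapped pairs by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reduce_phase(values):
    s = sum(v for v, _ in values)
    c = sum(n for _, n in values)
    return s / c                     # per-channel mean

grouped = shuffle(map(map_phase, records))
means = {ch: reduce_phase(vals) for ch, vals in grouped.items()}
```

On Hadoop the same map and reduce functions run in parallel across nodes, which is where the reported response-time reduction comes from.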
Publication Date
Thu Aug 30 2018
Journal Name
Journal Of Engineering
An Optimum Strategy for Producing Precise GPS Satellite Orbits using Double-Differenced Observations

Both double-differenced and zero-differenced GNSS positioning strategies have been widely used by geodesists for different geodetic applications that demand reliable and precise positions. On closer inspection of the requirements of these two GNSS positioning techniques, zero-differenced positioning, known as Precise Point Positioning (PPP), has gained special importance for three main reasons. Firstly, the effective application of PPP for geodetic purposes and precise applications depends entirely on the availability of precise satellite products, which consist of precise satellite orbital elements, precise satellite clock corrections, and Earth orientation parameters. Secondly, th

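The double-differencing the title refers to can be sketched numerically: differencing the same satellite's observation between two receivers cancels the satellite clock bias, and differencing those single differences between two satellites cancels the receiver clock biases as well. All numbers below are hypothetical, with clock biases scaled to metres:

```python
# Hypothetical geometric ranges (metres): receivers A, B; satellites i, j
geom = {("A", "i"): 21_000_000.0, ("B", "i"): 21_000_400.0,
        ("A", "j"): 23_500_000.0, ("B", "j"): 23_499_700.0}
clk_recv = {"A": 12.3, "B": -4.1}    # receiver clock biases (m)
clk_sat = {"i": 0.7, "j": -1.9}      # satellite clock biases (m)

def observe(recv, sat):
    """Simplified pseudorange: geometry + receiver clock - satellite clock."""
    return geom[(recv, sat)] + clk_recv[recv] - clk_sat[sat]

def single_diff(sat):
    """Between-receiver difference: the satellite clock bias cancels."""
    return observe("A", sat) - observe("B", sat)

def double_diff(sat_i, sat_j):
    """Between-satellite difference of single differences:
    the receiver clock biases cancel as well."""
    return single_diff(sat_i) - single_diff(sat_j)

dd = double_diff("i", "j")
# Equals the purely geometric double difference: all clock terms are gone
dd_geom = (geom[("A", "i")] - geom[("B", "i")]) - (geom[("A", "j")] - geom[("B", "j")])
```

Because the clock terms vanish, double-differenced observations constrain geometry directly, which is why they are attractive for precise orbit determination.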
Publication Date
Fri Feb 08 2019
Journal Name
Iraqi Journal Of Laser
Urinary Tract Stones Fragmentation using (2100 nm) Holmium: YAG Laser: (In vitro Analysis)

Urinary stones are one of the most common painful disorders of the urinary system. Four new technologies have transformed the treatment of urinary stones: electrohydraulic lithotripsy, ultrasonic lithotripsy, extracorporeal shock wave lithotripsy, and laser lithotripsy. The purpose of this study is to determine whether pulsed holmium laser energy is an effective method for fragmenting urinary tract stones in vitro, and whether stone composition affects the efficacy of holmium laser lithotripsy. Human urinary stones of known composition with different sizes, shapes, and colors were used for this study. The weight and size of each stone were measured. The surgical laser system used in our study is a Ho:YAG laser (2100 nm)

Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously for estimating the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study comprises four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was

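The model's parameters and its Akaike-based objective are not reproduced in this excerpt. As a sketch of the mutation-driven refinement step alone, the loop below greedily keeps any mutated parameter vector that lowers the objective; the three-parameter target and squared-distance objective are stand-ins, not the paper's model:

```python
import random

def mutate(params, sigma, rate, rng):
    """Gaussian mutation: each parameter is perturbed with probability `rate`."""
    return [p + rng.gauss(0.0, sigma) if rng.random() < rate else p
            for p in params]

def evolve(objective, init, generations=300, sigma=0.1, rate=0.5, seed=1):
    """Greedy mutation-only search: keep a candidate only if it improves."""
    rng = random.Random(seed)
    best, best_score = list(init), objective(init)
    for _ in range(generations):
        cand = mutate(best, sigma, rate, rng)
        score = objective(cand)
        if score < best_score:
            best, best_score = cand, score
    return best, best_score

# Stand-in for the Akaike-based objective: squared distance to known parameters
target = [0.3, -1.2, 2.0]
objective = lambda p: sum((a - b) ** 2 for a, b in zip(p, target))
best, score = evolve(objective, [0.0, 0.0, 0.0])
```

In the full model the initial parameters would come from the cross-variable and cross-site correlations, with mutation refining them against the Akaike criterion rather than a known target.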