Data Missing Solution Using Rough Set theory and Swarm Intelligence

This paper presents a hybrid approach to the null-values problem that combines rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to find the decision rule sets that are then used to solve the incomplete-data problem. The swarm algorithm performs feature selection, using the Bees algorithm as a heuristic search combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. The two approaches are compared on their performance in null-value estimation through rough set theory. The results obtained from most code sets show that the Bees algorithm is better than ID3 at decreasing the number of extracted rules without affecting accuracy, and at increasing the accuracy of null-value estimation, especially as the number of null values increases.
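The rule-based imputation idea can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: it skips the Bees-algorithm feature search and the rough-set reduct computation, and simply assumes a set of already-selected attributes, filling each null with the majority decision value among complete records that match on those attributes (all names here are hypothetical):

```python
from collections import Counter

def fill_nulls(records, selected, target):
    """Fill None entries of `target` using complete records that agree on the
    already-selected attributes (majority vote over matching decisions)."""
    complete = [r for r in records if r[target] is not None]
    filled = []
    for r in records:
        if r[target] is None:
            # decisions of complete records matching on the selected attributes
            matches = [c[target] for c in complete
                       if all(c[a] == r[a] for a in selected)]
            value = Counter(matches).most_common(1)[0][0] if matches else None
            r = {**r, target: value}
        filled.append(r)
    return filled
```

A record such as `{"a": 1, "b": 0, "d": None}` would receive the majority `"d"` value among complete records with `a == 1` and `b == 0`.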

Publication Date
Fri Aug 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Using Bayesian method to estimate the parameters of Exponential Growth Model with Autocorrelation problem and different values of parameter of correlation-using simulation

In this paper we study the Bayesian method using the modified exponential growth model, a model widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which leads to correlation between the observations; this problem, called autocorrelation, must be treated, and the Bayesian method has been used to do so.

The goal of this study is to determine the effect of autocorrelation on estimation by the Bayesian method. …
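As a rough illustration of the simulation side of such a study (not the paper's Bayesian estimator), one can generate an exponential growth series whose log-scale errors follow an AR(1) process, then recover the growth rate by least squares on the logs; all parameter names are illustrative:

```python
import math, random

def simulate_growth(a, b, rho, sigma, n, seed=0):
    # y_t = a * exp(b*t + u_t), with autocorrelated errors u_t = rho*u_{t-1} + e_t
    rng = random.Random(seed)
    u, ys = 0.0, []
    for t in range(1, n + 1):
        u = rho * u + rng.gauss(0.0, sigma)
        ys.append(a * math.exp(b * t + u))
    return ys

def ols_log_slope(ys):
    # least-squares slope of log(y_t) on t: a point estimate of the growth rate b
    n = len(ys)
    ts = list(range(1, n + 1))
    ly = [math.log(y) for y in ys]
    tbar, ybar = sum(ts) / n, sum(ly) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ly))
    den = sum((t - tbar) ** 2 for t in ts)
    return num / den
```

The point estimate stays close to the true rate even with autocorrelated errors; what autocorrelation mainly distorts is the uncertainty around it, which is what motivates treating it explicitly.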

Publication Date
Sun Jan 30 2022
Journal Name
Iraqi Journal Of Science
A Survey on Arabic Text Classification Using Deep and Machine Learning Algorithms

Text categorization is the process of grouping texts or documents into classes or categories according to their content. The process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been done to categorize and classify the Arabic language. Arabic text representation is a difficult task for applications such as text classification and clustering, because the language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy…
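The three-phase pipeline (preprocessing, feature extraction, classification) can be illustrated with a toy bag-of-words nearest-centroid classifier. This is a generic sketch, not any of the surveyed systems; it works the same for Arabic tokens as for the English ones used here:

```python
import math
from collections import Counter

def bow(text):
    # feature extraction: whitespace tokenization into a bag-of-words vector
    return Counter(text.split())

def cosine(c1, c2):
    # cosine similarity between two sparse count vectors
    dot = sum(c1[w] * c2[w] for w in c1)
    n1 = math.sqrt(sum(v * v for v in c1.values()))
    n2 = math.sqrt(sum(v * v for v in c2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def train_centroids(docs):
    # docs: list of (text, label); sum word counts per class
    cents = {}
    for text, label in docs:
        cents.setdefault(label, Counter()).update(bow(text))
    return cents

def classify(text, cents):
    # assign the label whose centroid is most similar to the document
    v = bow(text)
    return max(cents, key=lambda lab: cosine(v, cents[lab]))
```

Real Arabic pipelines add morphology-aware preprocessing (normalization, stemming) before the feature step, which is exactly where the language's complexity enters.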

Scopus (14)
Crossref (4)
Publication Date
Fri Dec 24 2021
Journal Name
Journal Of Engineering Science And Technology
Grey-Level Image Compression Using 1-D Polynomial and Hybrid Encoding Techniques

Scopus (5)
Publication Date
Sun Jun 12 2011
Journal Name
Baghdad Science Journal
Image Compression Using Tap 9/7 Wavelet Transform and Quadtree Coding Scheme

This paper is concerned with the design and implementation of an image compression method based on the biorthogonal tap-9/7 discrete wavelet transform (DWT) and quadtree coding. As a first step, color correlation is handled by using the YUV color representation instead of RGB. The chromatic sub-bands are then downsampled, and the data of each color band is transformed with the wavelet transform. The produced wavelet sub-bands are quantized using a hierarchical scalar quantization method. The quantized detail coefficients are coded using quadtree coding followed by Lempel-Ziv-Welch (LZW) encoding, while the approximation coefficients are coded using delta coding followed by LZW encoding. The test results indicated that the compression results are…
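The wavelet decomposition step can be illustrated with the much simpler one-level Haar transform (the paper uses the biorthogonal tap-9/7 filter, whose lifting steps are longer to write out). The sketch shows the approximation/detail split that the quantizer and coders then operate on:

```python
def haar_1d(signal):
    # one decomposition level: pairwise averages (approximation)
    # and pairwise differences (detail); length must be even
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def inv_haar_1d(approx, detail):
    # exact reconstruction: x0 = a + d, x1 = a - d
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out
```

In a scheme like the paper's, the detail coefficients (mostly near zero) are the ones that compress well under quadtree plus LZW coding, while the approximation band is delta-coded.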

Publication Date
Thu Feb 07 2019
Journal Name
Iraqi Journal Of Laser
Treatment of Skin Hyperpigmentation using Q-Switched (1064nm and 532nm) Nd:YAG Laser

Hyperpigmentation is an increase in the natural color of the skin. The purpose of this study is to evaluate the efficacy and safety of the Q-switched Nd:YAG (1064 and 532 nm) laser in the treatment of skin hyperpigmentation. The study was conducted in the research clinic of the Institute of Laser for Postgraduate Studies, University of Baghdad, from October 2008 to the end of January 2009. After clinical assessment of skin hyperpigmentation color, twenty-six patients were divided according to their lesions: eight patients with freckles, seven with melasma, and four with tattoos. The tattoo cases were subdivided into two amateur tattoos, one professional tattoo, and one traumatic tattoo. Four patients with post-inflammatory hyperpigmentation…

Publication Date
Mon Jan 01 2018
Journal Name
Journal Of Biotechnology Research Center
Treatment of Waste Paper Using Ultrasound and Sodium Hydroxide for Bioethanol Production

Bioethanol produced from lignocellulosic feedstock is a renewable substitute for declining fossil fuels. Ultrasound-assisted alkaline pretreatment was investigated to enhance the enzymatic digestibility of waste paper. The pretreatment was conducted over a wide range of conditions, including waste paper concentrations of 1-5%, reaction times of 10-30 min, and temperatures of 30-70°C. The optimum conditions were 4% substrate loading with a 25 min treatment time at 60°C, where the maximum reducing sugar obtained was 1.89 g/L. Hydrolysis was conducted with crude cellulolytic enzymes produced by Cellulomonas uda (PTCC 1259). The maximum amount of sugar released and the hydrolysis efficiency were 20.92 g/L and 78.4%, respectively. Sugars…

Crossref (1)
Publication Date
Mon Sep 01 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Evaluation of Hydrocarbon Saturation Using Carbon Oxygen (CO) Ratio and Sigma Tool

The main aim of this study is to evaluate the remaining oil in previously produced zones, locate the water-productive zones, and look for any bypassed oil behind casing in intervals not previously perforated. Initial water saturation was calculated from digitized open-hole logs using a cut-off value of 10% for irreducible water saturation. An integrated analysis of the thermal capture cross section (Sigma) and the carbon/oxygen (C/O) ratio was conducted and summarized under well shut-in and flowing conditions. The logging pass runs through the sandstone Zubair formation in the North Rumaila oil field. Zones where both the Sigma and C/O analyses show high remaining oil saturation, similar to the open-hole oil saturation, could be good oil zones that…

Publication Date
Mon Jun 05 2023
Journal Name
Al-khwarizmi Engineering Journal
Fabrication and Analysis of Denture Plate Using Single Point Incremental Sheet Forming

Incremental sheet forming (ISF) is a metal forming technology in which small incremental deformations determine the final shape. The sheet is deformed by a hemispherical tool that follows the required shape contour to form the sheet into the desired geometry. In this study, single point incremental sheet forming (SPIF) was applied in dentistry to manufacture a denture plate from two types of stainless steel, 304 and 316L, with initial thicknesses of 0.5 mm and 0.8 mm, respectively. Stainless steel was selected for its biocompatibility and reasonable cost. A three-dimensional (3D) analysis procedure was conducted to evaluate the manufactured part's geometrical accuracy and thickness distribution. The obtained results confirm…

Scopus (2)
Crossref (2)
Publication Date
Wed Sep 01 2010
Journal Name
Journal Of Economics And Administrative Sciences
Using simulation to estimate parameters and reliability function for extreme value distribution

This study estimates the scale parameter, the location parameter, and the reliability function of the Extreme Value (EXV) distribution by two methods:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).

Simulation was used to generate samples of different sizes (n = 10, 25, 50, 100), with given true values for the parameters, and each simulation experiment was replicated (RP = 1000) times…
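The PWM method has a convenient closed form for the Gumbel-type extreme value distribution, which makes the simulation loop easy to sketch. The constants below (Euler's constant, ln 2) are standard for the Gumbel PWM estimators, but the variable names are illustrative and this is not the paper's exact code:

```python
import math, random

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def sample_gumbel(mu, sigma, n, seed=1):
    # inverse-transform sampling: x = mu - sigma * ln(-ln(u)), u ~ Uniform(0,1)
    rng = random.Random(seed)
    return [mu - sigma * math.log(-math.log(rng.random())) for _ in range(n)]

def pwm_gumbel(sample):
    # probability-weighted-moments estimators of Gumbel location and scale
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
    sigma = (2 * b1 - b0) / math.log(2)
    mu = b0 - EULER_GAMMA * sigma
    return mu, sigma

def reliability(t, mu, sigma):
    # R(t) = 1 - F(t), with Gumbel CDF F(x) = exp(-exp(-(x - mu)/sigma))
    return 1.0 - math.exp(-math.exp(-(t - mu) / sigma))
```

MLE for this distribution has no closed form and needs a numerical solver, which is one reason simulation studies often compare it against the closed-form PWM estimators.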

Publication Date
Fri Jan 01 2021
Journal Name
Int. J. Nonlinear Anal. Appl.
Feeble regular and feeble normal spaces in α-topological spaces using graph

This paper introduces some properties of separation axioms called α-feeble regular and α-feeble normal spaces (which are weaker than the usual axioms), defined using elements of a graph, which are essential parts of the α-topological spaces we study. It also presents some related concepts, studies their properties, and establishes some relationships between them.

Scopus (1)