Optimum Median Filter Based on Crow Optimization Algorithm

A novel optimum median filter based on the crow optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB-colored and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noise pixels and then replaces them with an optimum median value chosen by maximizing a fitness function. The standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error are used to evaluate the performance of the original and improved median filters in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter and several recent methods: the proposed process is robust in reducing error and removing noise, achieving a mean square error of 1.38 or less, an absolute error of 0.22 or less, an SSIM of 0.9856, and a PSNR above 46 dB. Thus, the overall improvement is about 25%.
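A minimal sketch of the two-step idea described above (detect noise pixels, then replace them with a local median), together with the PSNR used as the quality measure. This is not the authors' crow-search implementation: the 3×3 window, the extreme-value noise test, and the grayscale-only handling are assumptions made for illustration.

```python
import numpy as np

def median_denoise(img, low=0, high=255, win=3):
    """Replace suspected salt-and-pepper pixels (extreme values) with the
    median of their local window. Simplified stand-in for the crow-search
    guided filter: window size and the noise test are assumptions."""
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    rows, cols = img.shape            # grayscale; apply per channel for RGB
    for r in range(rows):
        for c in range(cols):
            if img[r, c] == low or img[r, c] == high:   # suspected noise pixel
                window = padded[r:r + win, c:c + win]
                out[r, c] = int(np.median(window))      # local median replacement
    return out

def psnr(reference, restored, peak=255.0):
    """Peak signal-to-noise ratio, the fitness-style measure reported above."""
    mse = np.mean((reference.astype(float) - restored.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)
```

In the paper, the replacement value is selected by the crow optimization algorithm so as to maximize the fitness function, rather than taken directly as the plain window median shown here.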

Publication Date: Sun Apr 23 2017
Journal Name: International Conference of Reliable Information and Communication Technology
Classification of Arabic Writer Based on Clustering Techniques

Arabic text categorization for pattern recognition is challenging. We propose, for the first time, a novel holistic clustering-based method for classifying Arabic writers. The categorization is accomplished stage-wise. Firstly, the document images are segmented into lines, words, and characters. Secondly, structural and statistical features are extracted from the segmented portions. Thirdly, the F-measure is used to evaluate the performance of the extracted features and their combinations across different linkage methods, distance measures, and numbers of groups. Finally, experiments are conducted on the standard KHATT dataset of Arabic handwritten text, comprising varying samples from 1000 writers. The results in the generatio…
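A minimal sketch of the clustering-and-evaluation stage, assuming Euclidean features, four writer groups, SciPy's agglomerative linkage methods, and a majority-vote mapping before scoring the F-measure; the features and labels below are random placeholders, not the KHATT descriptors.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import f1_score

# Placeholder data: one feature vector per segmented word/character and the
# true writer label of each sample (the KHATT features would go here).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.integers(0, 4, size=200)

for method in ("single", "complete", "average", "ward"):
    Z = linkage(X, method=method, metric="euclidean")
    labels = fcluster(Z, t=4, criterion="maxclust")
    # map each cluster to its majority writer before computing the F-measure
    mapped = np.empty_like(labels)
    for c in np.unique(labels):
        mask = labels == c
        mapped[mask] = np.bincount(y[mask]).argmax()
    print(method, round(f1_score(y, mapped, average="macro"), 3))
```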

Publication Date: Mon May 15 2017
Journal Name: International Journal of Image and Data Fusion
Image edge detection operators based on orthogonal polynomials

Publication Date: Sun Sep 24 2023
Journal Name: Journal of Al-Qadisiyah for Computer Science and Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation that also lend themselves to compression. Most iris segmentation methods rely on complex modelling of traits and characteristics, which in turn reduces their effectiveness in real-time systems. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the origin…
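A small sketch of the first two steps named above, converting a grayscale eye image into bit planes and keeping the most significant ones; the number of planes kept is an assumption, and the later parameterization of the iris location is not shown.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale eye image into its 8 bit planes
    (plane 7 is the most significant)."""
    return [((gray >> b) & 1).astype(np.uint8) for b in range(8)]

def significant_planes(gray, keep=2):
    """OR together the `keep` most significant planes as a coarse
    representation for locating the iris; `keep` is an illustrative choice."""
    mask = np.zeros_like(gray, dtype=np.uint8)
    for plane in bit_planes(gray)[8 - keep:]:
        mask |= plane
    return mask
```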

Publication Date: Sat Jun 26 2021
Journal Name: 2021 IEEE International Conference on Automatic Control & Intelligent Systems (I2CACIS)
Vulnerability Assessment on Ethereum Based Smart Contract Applications

Publication Date: Sun Nov 01 2020
Journal Name: Journal of Physics: Conference Series
Improve topic modeling algorithms based on Twitter hashtags
Today, with the increasing use of social media, many researchers are interested in topic extraction from Twitter. Tweets are short, unstructured, and messy, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text like tweets. Fortunately, Twitter has many features that capture the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned…
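A minimal sketch of one way to exploit hashtags when fitting LDA on tweets: repeat each hashtag token so it carries extra weight in the term counts. The weighting scheme, the toy tweets, and the scikit-learn implementation are assumptions made for illustration, not necessarily the method used in the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy tweets; repeating each hashtag gives it extra weight in the term counts,
# one simple way to let hashtags steer the learned topics.
tweets = [
    "new phone camera review #tech #gadgets",
    "match tonight was great #football",
    "cheap flights to rome #travel #holiday",
]

def boost_hashtags(text, repeat=3):
    tokens = []
    for tok in text.split():
        tokens.extend([tok.lstrip("#")] * (repeat if tok.startswith("#") else 1))
    return " ".join(tokens)

docs = [boost_hashtags(t) for t in tweets]
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts))   # per-tweet topic distribution
```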
Publication Date: Fri Sep 23 2022
Journal Name: Specialusis Ugdymas
Text Cryptography based on Arabic Words Characters Number

Cryptography masks text using an encryption method so that only the authorized user can decrypt and read the message. An intruder may attack the communication channel in many ways, such as impersonation, non-repudiation, denial of service, modification of data, threatening confidentiality, and breaking the availability of services. The high volume of electronic communication between people requires that transactions remain confidential, and cryptographic methods offer the best solution to this problem. This paper proposes a new cryptography method based on Arabic words, carried out in two steps, where the first step is binary encoding generation used t…
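Because the abstract is truncated, the sketch below is only a hypothetical illustration of the general idea: generate a binary encoding of the message and then hide each bit in the character count (here, the parity of the letter count) of a chosen Arabic cover word. The word lists and the parity rule are invented for the example and are not the authors' scheme.

```python
# Hypothetical sketch only: binary encoding of the message followed by a
# mapping driven by the character counts of Arabic words. Not the paper's
# actual method, which is not fully visible in the truncated abstract.
EVEN_WORDS = ["كتاب", "سلام"]   # words with an even letter count -> bit 0
ODD_WORDS = ["قلم", "باب"]      # words with an odd letter count  -> bit 1

def encode(message: str) -> str:
    bits = "".join(f"{b:08b}" for b in message.encode("utf-8"))
    cover = [(EVEN_WORDS if bit == "0" else ODD_WORDS)[i % 2]
             for i, bit in enumerate(bits)]
    return " ".join(cover)

def decode(cover: str) -> str:
    bits = "".join("0" if len(w) % 2 == 0 else "1" for w in cover.split())
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

print(decode(encode("hi")) == "hi")   # True
```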

Publication Date: Thu Jan 01 2015
Journal Name: Journal of Engineering
GNSS Baseline Configuration Based on First Order Design

The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, to improve the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the configuration of GNSS network baselines, using the sequential adjustment method to solve its objective functions.

FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, whic…
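A minimal sketch, assuming a known design matrix for the candidate baselines, of how the two FOD-p criteria can be scored: A-optimality as the trace and E-optimality as the largest eigenvalue of the coordinate covariance matrix. The design-matrix values below are placeholders, and the sequential adjustment itself is not shown.

```python
import numpy as np

def optimality_scores(A, weight=None):
    """Score a candidate baseline configuration with the FOD-p criteria.
    A is the design matrix of the candidate baselines (placeholder values)."""
    W = np.eye(A.shape[0]) if weight is None else weight
    N = A.T @ W @ A                  # normal-equation matrix
    Qxx = np.linalg.inv(N)           # cofactor (covariance) matrix of coordinates
    a_opt = np.trace(Qxx)            # A-optimality: minimise the average variance
    e_opt = np.max(np.linalg.eigvalsh(Qxx))  # E-optimality: minimise the worst variance
    return a_opt, e_opt

A = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0],
              [1.0, 0.0, -1.0],
              [1.0, 0.0, 0.0]])
print(optimality_scores(A))
```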

Publication Date: Sun Jun 01 2014
Journal Name: Baghdad Science Journal
Classification of fetal abnormalities based on CTG signal

Fetal heart rate (FHR) signal processing based on Artificial Neural Networks (ANN), Fuzzy Logic (FL), and the frequency-domain Discrete Wavelet Transform (DWT) was analyzed in order to perform automatic analysis using personal computers. Cardiotocography (CTG) is a primary biophysical method of fetal monitoring. The assessment of printed CTG traces was based on visual analysis of patterns describing the variability of the fetal heart rate signal. Fetal heart rate data of pregnant women between 38 and 40 weeks of gestation were studied. The first stage of the system was to convert the cardiotocography (CTG) tracing into a digital series so that it could be analyzed, while in the second stage the FHR time series was t…
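A short sketch of the frequency-domain step, assuming the PyWavelets library, a db4 wavelet, and a 4-level decomposition (none of which are specified in the abstract): decompose a synthetic FHR series with the DWT and keep simple sub-band statistics as features for a later ANN or fuzzy classifier.

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic stand-in for a digitised FHR time series (beats per minute).
fhr = 140 + 5 * np.sin(np.linspace(0, 20, 1024)) + np.random.normal(0, 2, 1024)

# Multi-level discrete wavelet decomposition of the signal.
coeffs = pywt.wavedec(fhr, wavelet="db4", level=4)

# Simple per-sub-band statistics as candidate features for a classifier.
features = [stat(c) for c in coeffs for stat in (np.mean, np.std)]
print(len(features), "features per recording")
```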

Publication Date: Sun Oct 01 2017
Journal Name: International Journal of Scientific & Engineering Research
Horizontal Fragmentation for Most Frequency Frequent Pattern Growth Algorithm

Abstract: Data mining has become very important, especially as the amount of information has grown huge, making it necessary to use data mining to manage and exploit it. One data mining technique is association rule mining, here using the Pattern Growth method, an enhancement of the Apriori algorithm. The pattern growth method depends on the FP-tree structure; this paper presents a modified FP-tree algorithm called HFMFFP-Growth, which divides the dataset and, for each part, keeps only the most frequent items in the FP-tree, so the final conditional tree has fewer nodes than the original FP-tree and requires less memory space and time.
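A rough sketch of the fragmentation idea, assuming two horizontal fragments and a top-3 item cut-off (both illustrative choices): split the transactions, keep only each fragment's most frequent items, and mine the reduced transactions. The FP-tree construction itself is not shown; a plain frequency count stands in for it.

```python
from collections import Counter

# Toy transaction database; the real dataset would be much larger.
transactions = [
    {"bread", "milk"}, {"bread", "butter"}, {"milk", "butter", "bread"},
    {"tea", "sugar"}, {"tea", "milk"}, {"sugar", "bread", "tea"},
]

def fragment(data, parts=2):
    """Split the transaction list horizontally into roughly equal parts."""
    size = (len(data) + parts - 1) // parts
    return [data[i:i + size] for i in range(0, len(data), size)]

def top_items(part, k=3):
    """Return the k most frequent items within one fragment."""
    counts = Counter(item for t in part for item in t)
    return {item for item, _ in counts.most_common(k)}

reduced = []
for frag in fragment(transactions):
    keep = top_items(frag)                  # fragment's most frequent items
    reduced.extend(t & keep for t in frag)  # prune rare items before tree building
print(reduced)
```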

Publication Date: Tue Sep 30 2008
Journal Name: Iraqi Journal of Chemical and Petroleum Engineering
Optimization of Biochemical Treatment of Tannery Wastewater

The present work is concerned with finding the optimum conditions for biochemical wastewater treatment at a local tannery. Water samples were taken from the outlet areas (the wastewater of the chrome and vegetable tannery) in equal volumes and subjected to sedimentation, biological treatment, and chemical and natural sedimentation treatment.
The Box-Wilson method of experimental design was adopted to find useful relationships between three operating variables that affect the treatment process (temperature, aeration period, and phosphate concentration) and the Biochemical Oxygen Demand (BOD5).
The experimental data collected by this method were successfully fitted to a second-order polynomial mathematical model. The most fa…
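A minimal sketch, with synthetic placeholder data, of fitting the second-order polynomial response-surface model that Box-Wilson designs lead to: BOD5 as a quadratic function of temperature, aeration period, and phosphate concentration, here via scikit-learn.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Placeholder design points: temperature (°C), aeration period (h),
# phosphate concentration (mg/L) -- not the paper's measured data.
X = np.array([[20, 2, 0.5], [25, 4, 1.0], [30, 6, 1.5],
              [35, 8, 2.0], [25, 6, 0.5], [30, 4, 2.0]], dtype=float)
bod5 = np.array([180, 150, 120, 100, 140, 115], dtype=float)

# Second-order (quadratic) polynomial model fitted by least squares.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, bod5)
print(model.predict([[28, 5, 1.2]]))   # predicted BOD5 at a new operating point
```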
